Title: Seeing through Space and Time: Asynchronous Event Processing for Robots
Time, Date, Location: 11:00, Thursday, 2 May 2024, Building 145, Room 1.33, and Zoom
Abstract: Vision sensors are the eyes of robotic systems, facilitating perception, planning, and interaction with the world. Event cameras offer a fundamentally different sensing modality that enables real-time perception for autonomous systems. These sensors are well suited to robotics applications for several reasons: continuous output with microsecond temporal resolution, High Dynamic Range (HDR) imaging, reduced motion blur, real-time response without frame-rate limitations, and low power consumption.
In this PhD oral presentation, I will present case studies addressing four real-world challenges: (1) Hybrid Event-Frame Fusion for High-Speed HDR Video Reconstruction, (2) Event Data Pre-processing, (3) High-Speed Visual Tracking, and (4) Optical Communication using Smart Beacons. These algorithms process each event independently upon arrival, preserving rich temporal information and improving computational efficiency. Their asynchrony and efficiency make them well suited to direct implementation on low-level hardware such as Field-Programmable Gate Arrays (FPGAs) and Application-Specific Integrated Circuits (ASICs), which could further reduce latency and enhance real-time processing at the hardware level.
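To give a flavour of what per-event asynchronous processing means in practice, the sketch below maintains an exponentially decaying time surface, a common event-vision representation; it is an illustrative example only, not any of the algorithms presented in the talk. The sensor resolution, the decay constant tau, and the (x, y, t, polarity) event tuple format are assumptions for the sketch.

```python
import numpy as np

def make_time_surface(height, width, tau=0.05):
    """Per-pixel state for an exponentially decaying time surface (illustrative)."""
    state = np.zeros((height, width))   # last surface value per pixel
    last_t = np.zeros((height, width))  # timestamp of last update per pixel

    def update(x, y, t, polarity):
        """Process a single event (x, y, t, polarity) as it arrives.

        Each event touches only its own pixel, so no frame buffer or global
        synchronisation is needed: this is the asynchronous, per-event update
        pattern the abstract describes.
        """
        # Decay the stored value by the time elapsed since this pixel's last event
        state[y, x] *= np.exp(-(t - last_t[y, x]) / tau)
        # Inject the new event's contribution (+1 for ON, -1 for OFF polarity)
        state[y, x] += 1.0 if polarity else -1.0
        last_t[y, x] = t
        return state[y, x]

    return update

# Usage: feed events one by one, in arrival order
update = make_time_surface(180, 240)
for x, y, t, p in [(10, 20, 0.001, 1), (10, 20, 0.004, 0)]:
    value = update(x, y, t, p)
```

Because each update reads and writes only a single pixel's state, the same pattern maps naturally onto the FPGA and ASIC implementations mentioned above.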
Bio: Ziwei Wang is a final-year Ph.D. student in the Systems Theory and Robotics (STR) group at the Australian National University's College of Engineering and Computer Science, supervised by Prof. Robert Mahony, Dr. Timothy Molloy, and Dr. Yonhon Ng. She received her B.Eng. degree in Mechatronics from ANU in 2019. Her research interests include event-based vision, asynchronous image processing, and practical robotic vision applications.