In this keynote, held at the 2024 International Conference on Computational Photography, Prof. Davide Scaramuzza from the University of Zurich presents his vision for event cameras: bio-inspired vision sensors that outperform conventional cameras with ultra-low latency, high dynamic range, and minimal power consumption. He dives into the motivation behind event-based cameras, explains how these sensors work, and explores their mathematical modeling and processing frameworks. He highlights cutting-edge applications across computer vision, robotics, autonomous vehicles, virtual reality, and mobile devices, while also addressing the open challenges and future directions shaping this exciting field.
00:00 – Why event cameras matter to robotics and computer vision
07:24 – Bandwidth-latency tradeoff
08:24 – Working principle of the event camera
10:50 – Who sells event cameras
12:27 – Relation between event cameras and the biological eye
13:19 – Mathematical model of the event camera (sketched in code after this chapter list)
15:35 – Image reconstruction from events
18:32 – A simple optical-flow algorithm
20:20 – How to process events in general
21:28 – First-order approximation of the event-generation model
23:56 – Application 1: Event-based feature tracking
25:03 – Application 2: Ultimate SLAM
26:30 – Application 3: Autonomous navigation in low light
27:38 – Application 4: Keeping drones flying when a rotor fails
31:06 – Contrast maximization for event cameras (sketched in code after this chapter list)
34:14 – Application 1: Video stabilization
35:16 – Application 2: Motion segmentation
36:32 – Application 3: Dodging dynamic objects
38:57 – Application 4: Catching dynamic objects
39:41 – Application 5: High-speed inspection at Boeing and Strata
41:33 – Combining event and RGB cameras, and how to apply deep learning
45:18 – Application 1: Slow-motion video
48:34 – Application 2: Video deblurring
49:45 – Application 3: Advanced Driver Assistance Systems
56:34 – History and future of event cameras
58:42 – Reading material and Q&A
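For viewers who want the core idea in code before watching: the model discussed at 13:19 says a pixel fires an event whenever its log intensity changes by more than a contrast threshold C since the pixel's last event, and the 21:28 chapter linearizes that change as ΔL ≈ −∇L · v Δt. Below is a minimal NumPy sketch of this idealized model, not the speaker's code; the class name, threshold value, and frame-based update are illustrative simplifications of a truly asynchronous sensor.

```python
import numpy as np

C = 0.2  # contrast threshold (illustrative; real sensors expose it as a bias setting)

class EventPixels:
    """Idealized DVS pixel array: each pixel remembers the log intensity at its
    last event and fires when the current log intensity moves by at least C."""

    def __init__(self, first_log_frame):
        self.ref = first_log_frame.astype(float).copy()  # per-pixel reference level

    def update(self, log_frame, t):
        diff = log_frame - self.ref
        ys, xs = np.nonzero(np.abs(diff) >= C)
        pol = np.sign(diff[ys, xs]).astype(int)          # +1 ON event, -1 OFF event
        self.ref[ys, xs] = log_frame[ys, xs]             # reset the pixels that fired
        return list(zip(xs.tolist(), ys.tolist(), [t] * len(xs), pol.tolist()))

# Feed it log-intensity frames; it returns sparse (x, y, t, polarity) tuples
# instead of full frames.
cam = EventPixels(np.log(np.full((4, 4), 100.0)))
print(cam.update(np.log(np.full((4, 4), 140.0)), t=0.001))  # log(1.4) ~ 0.34 >= C -> ON events
```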
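In the same spirit, here is a toy sketch of the contrast-maximization framework from the 31:06 chapter: warp each event along a candidate motion to a common reference time, accumulate the warped events into an image, and score the candidate by the image's variance; the correct motion stacks events into sharp edges. The sensor resolution, synthetic event data, and search grid are made up for illustration.

```python
import numpy as np

H, W = 64, 64  # made-up sensor resolution

def iwe(events, v):
    """Image of warped events: transport each event (x, y, t, p) to t = 0 along
    a candidate flow v = (vx, vy) and accumulate its polarity."""
    img = np.zeros((H, W))
    for x, y, t, p in events:
        xw, yw = int(round(x - v[0] * t)), int(round(y - v[1] * t))
        if 0 <= xw < W and 0 <= yw < H:
            img[yw, xw] += p
    return img

def contrast(events, v):
    # The correct motion compensation sharpens the image, so the variance peaks there.
    return np.var(iwe(events, v))

# Toy data: an edge point moving at 20 px/s fires one ON event every 10 ms.
events = [(10 + 20.0 * t, 32, t, +1) for t in np.arange(0.0, 1.0, 0.01)]

# Maximize contrast over candidate horizontal flows (a grid search keeps the sketch
# simple; the literature typically uses gradient-based optimizers).
best_vx = max(np.linspace(0.0, 40.0, 81), key=lambda vx: contrast(events, (vx, 0.0)))
print(f"estimated flow: {best_vx:.1f} px/s")  # recovers (approximately) the true 20 px/s
```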
Other resources:
Our research page on event-based vision: http://rpg.ifi.uzh.ch/research_dvs.html
All worldwide resources on event cameras (publications, software, drivers, datasets, simulators, where to buy, etc.):
https://github.com/uzh-rpg/event-based_vision_resources
For a course on event-based robot vision:
1. Slides: https://sites.google.com/view/guillermogallego/teaching/event-based-robot-vision?authuser=0
2. Video recordings: https://www.youtube.com/watch?v=cXSdPdhH4RI&list=PL03Gm3nZjVgUFYUh3v5x8jVonjrGfcal8
For a tutorial on noise modeling of event cameras: https://www.youtube.com/watch?v=YY31GaiOkNM
Our key event-camera datasets:
1. https://dsec.ifi.uzh.ch/
2. https://rpg.ifi.uzh.ch/davis_data.html
3. https://github.com/uzh-rpg/event-based_vision_resources#datasets
Our key event-camera simulator: http://rpg.ifi.uzh.ch/esim
Survey paper on event cameras:
https://rpg.ifi.uzh.ch/docs/EventVisionSurvey.pdf
Affiliation:
Davide Scaramuzza is with the Robotics and Perception Group, Dept. of Informatics, University of Zurich, and Dept. of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland https://rpg.ifi.uzh.ch/