Low Latency Automotive Vision with Event Cameras (Nature, 2024)

The computer vision algorithms used in today’s advanced driver assistance systems rely on image-based RGB cameras, leading to a critical bandwidth-latency trade-off for delivering safe driving experiences. To address this, event cameras have emerged as alternative vision sensors. Event cameras measure changes in intensity asynchronously, offering high temporal resolution and sparsity, drastically reducing bandwidth and latency requirements. Despite these advantages, event camera-based algorithms are either highly efficient but lag behind image-based ones in terms of accuracy, or sacrifice the sparsity and efficiency of events to achieve comparable results. To overcome this, we propose a novel hybrid event- and frame-based object detector that preserves the advantages of each modality and thus does not suffer from this trade-off. Our method exploits the high temporal resolution and sparsity of events and the rich but low temporal resolution information in standard images to generate efficient, high-rate object detections, reducing perceptual and computational latency. We show that the use of a 20 Hz RGB camera plus an event camera can achieve the same latency as a 5,000 Hz camera with the bandwidth of a 45 Hz camera without compromising accuracy. Our approach paves the way for efficient and robust perception in edge-case scenarios by uncovering the potential of event cameras.
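As background for the sparsity claim in the abstract: an event camera pixel fires only when its log-intensity changes by more than a contrast threshold, rather than sampling frames at a fixed rate. A minimal sketch of this standard event-generation model follows; the function name and the threshold value are illustrative assumptions, not parameters from the paper.

```python
import math

def generate_events(timestamps, intensities, threshold=0.25):
    """Emit (t, polarity) events for one pixel whenever its log-intensity
    has changed by at least `threshold` since the last event -- the
    standard contrast-threshold model of event-camera sensing.
    NOTE: illustrative sketch, not the paper's implementation."""
    events = []
    log_ref = math.log(intensities[0])  # reference level at the last event
    for t, i in zip(timestamps[1:], intensities[1:]):
        log_i = math.log(i)
        # A large brightness jump can trigger several events at once.
        while abs(log_i - log_ref) >= threshold:
            polarity = 1 if log_i > log_ref else -1
            events.append((t, polarity))
            log_ref += polarity * threshold
        # If the change stays below threshold, nothing is emitted:
        # this is the source of the sensor's sparsity and low bandwidth.
    return events
```

A static scene produces no events at all, which is why pairing an event stream with a low-rate RGB camera can keep bandwidth near that of a modest frame rate while still reacting to fast changes.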

Daniel Gehrig, Davide Scaramuzza
Low Latency Automotive Vision with Event Cameras.
Nature, May 29th, 2024



Comments

Where can we order such Event Cameras?

The-Dark-Tower

ALARMING! This work can't be reproduced right now: none of the issues filed on GitHub have been processed. I wonder whether this program is real.

lightjarvis

Congrats to the team! Did Elon call already? 😄

JariVasell

This system is bound to fail because you can't predict what you haven't seen yet.

Graveness