Evaluating Event-Based Vision Sensing in Rain and Fog

Ethan Delaney, Tim Brophy, Enda Ward, Fiachra Collins, Edward Jones, Brian Deegan, Martin Glavin

Research output: Contribution to journal › Article › peer-review

Abstract

Event-based vision sensors have a higher temporal resolution, a wider dynamic range, and a lower latency than conventional frame-based cameras. For these reasons, event-based sensors are being considered for advanced driver assistance system (ADAS) applications. If these sensors are to be used for automotive sensing and perception, their performance under adverse conditions, such as rain and fog, must be characterized to ensure reliable performance. This study presents a suite of tests using an event-based sensor under controlled conditions across a range of rainfall rates, ambient light levels, fog visibility levels, and distances from the targets. To evaluate the performance of these sensors, the average event rate (number of events per second) was compared with rainfall rates and visibility. The results indicated that the diameter of the raindrops had a larger effect on the number of events than the rainfall rate. Furthermore, the investigation revealed that, by carefully configuring the camera settings, it is possible to mitigate the effects of rain on the sensor output.
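The study's primary metric, the average event rate, is the number of events emitted by the sensor per second of recording. As a minimal sketch (not the authors' actual pipeline), assuming an event stream with microsecond timestamps in the common DVS format, the metric could be computed as:

```python
def average_event_rate(timestamps_us):
    """Average event rate in events/second over the recording span.

    Assumes timestamps are in microseconds and sorted in ascending
    order, as is typical for event-camera recordings.
    """
    if len(timestamps_us) < 2:
        return 0.0
    duration_s = (timestamps_us[-1] - timestamps_us[0]) / 1e6
    if duration_s <= 0:
        return 0.0
    return len(timestamps_us) / duration_s

# Hypothetical example: 1001 events spread uniformly over one second.
timestamps = [i * 1000 for i in range(1001)]  # 0 .. 1_000_000 us
rate = average_event_rate(timestamps)  # 1001 events / 1.0 s
```

In practice this rate would be computed per test condition (rainfall rate, fog visibility, ambient light level) and compared across conditions, as described in the abstract.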

Original language: English
Pages (from-to): 31545-31562
Number of pages: 18
Journal: IEEE Sensors Journal
Volume: 25
Issue number: 16
DOIs
Publication status: Published - 2025
Externally published: Yes

Keywords

  • Autonomous vehicles
  • bias setting
  • camera
  • event rate
  • event-based sensing
  • fog
  • light intensity
  • neuromorphic
  • rain
  • visibility

