TY - JOUR
T1 - Optimizing Camera Exposure Time for Automotive Applications
AU - Lin, Hao
AU - Mullins, Darragh
AU - Molloy, Dara
AU - Ward, Enda
AU - Collins, Fiachra
AU - Denny, Patrick
AU - Glavin, Martin
AU - Deegan, Brian
AU - Jones, Edward
N1 - Publisher Copyright:
© 2024 by the authors.
PY - 2024/8
Y1 - 2024/8
N2 - Camera-based object detection is integral to advanced driver assistance systems (ADAS) and autonomous vehicle research, and RGB cameras remain indispensable for their spatial resolution and color information. This study investigates exposure time optimization for such cameras, considering image quality in dynamic ADAS scenarios. Exposure time, the period during which the camera sensor is exposed to light, directly influences the amount of information captured. In dynamic scenes, such as those encountered in typical driving scenarios, optimizing exposure time becomes challenging due to the inherent trade-off between Signal-to-Noise Ratio (SNR) and motion blur: extending exposure time to maximize information capture increases SNR, but it also increases the risk of motion blur and overexposure, particularly in low-light conditions where objects may not be fully illuminated. The study introduces a comprehensive methodology for exposure time optimization under various lighting conditions, examining its impact on image quality and computer vision performance. Traditional image quality metrics show a poor correlation with computer vision performance, highlighting the need for newer metrics that demonstrate improved correlation. The research presented in this paper offers guidance on enhancing single-exposure camera-based systems for automotive applications. By addressing the balance between exposure time, image quality, and computer vision performance, the findings provide a road map for optimizing camera settings for ADAS and autonomous driving technologies, contributing to safety and performance advancements in the automotive landscape.
AB - Camera-based object detection is integral to advanced driver assistance systems (ADAS) and autonomous vehicle research, and RGB cameras remain indispensable for their spatial resolution and color information. This study investigates exposure time optimization for such cameras, considering image quality in dynamic ADAS scenarios. Exposure time, the period during which the camera sensor is exposed to light, directly influences the amount of information captured. In dynamic scenes, such as those encountered in typical driving scenarios, optimizing exposure time becomes challenging due to the inherent trade-off between Signal-to-Noise Ratio (SNR) and motion blur: extending exposure time to maximize information capture increases SNR, but it also increases the risk of motion blur and overexposure, particularly in low-light conditions where objects may not be fully illuminated. The study introduces a comprehensive methodology for exposure time optimization under various lighting conditions, examining its impact on image quality and computer vision performance. Traditional image quality metrics show a poor correlation with computer vision performance, highlighting the need for newer metrics that demonstrate improved correlation. The research presented in this paper offers guidance on enhancing single-exposure camera-based systems for automotive applications. By addressing the balance between exposure time, image quality, and computer vision performance, the findings provide a road map for optimizing camera settings for ADAS and autonomous driving technologies, contributing to safety and performance advancements in the automotive landscape.
KW - ADAS
KW - autonomous vehicles
KW - computer vision
KW - image quality
KW - low light conditions
KW - object detection
UR - http://www.scopus.com/inward/record.url?scp=85202433943&partnerID=8YFLogxK
U2 - 10.3390/s24165135
DO - 10.3390/s24165135
M3 - Article
C2 - 39204832
AN - SCOPUS:85202433943
SN - 1424-8220
VL - 24
JO - Sensors
JF - Sensors
IS - 16
M1 - 5135
ER -