TY - JOUR
T1 - Exploring Sensor Impact and Architectural Robustness in Adverse Weather on BEV Perception
AU - Kumar, Sanjay
AU - Sharma, Sushil
AU - Asghar, Rabia
AU - Mohandas, Reenu
AU - Brophy, Tim
AU - Sistu, Ganesh
AU - Grua, Eoin Martino
AU - Donzella, Valentina
AU - Eising, Ciarán
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2025
Y1 - 2025
N2 - Reliable perception in automated vehicles under adverse conditions such as fog, rain, snow, and lens defocus is essential for the safety of road actors, particularly vulnerable road users. While prior work has focused primarily on camera occlusions, the impact on RADAR and LiDAR remains underexplored, particularly in a unified Bird's Eye View (BEV) space. To address this gap, we first apply occlusion to all three primary sensors (camera, RADAR, and LiDAR) and then systematically investigate its impact by projecting their outputs into the BEV space for unified analysis of vehicle and map segmentation. A parametrised occlusion pipeline is developed to apply occlusions to each sensor modality. We evaluate both geometry-based and transformer-based fusion architectures, revealing that transformer-based architectures consistently demonstrate greater robustness to sensor degradation. Notably, we demonstrate that BEVCar achieves 45.6% vehicle Intersection-over-Union (IoU) and 53.6% mean Intersection-over-Union (mIoU) under camera occlusion, surpassing other state-of-the-art (SOTA) models such as MMTraP (37.9% IoU / 47.9% mIoU) and CVT (36.0% IoU / 46.6% mIoU). These improvements are statistically significant (paired t-tests with bootstrap 95% confidence intervals, p < 0.001). Furthermore, projecting camera features into the BEV space using a backward projection strategy appears to offer greater resilience to occlusion than forward projection. These insights highlight the importance of architectural design, projection choice, and multi-sensor fusion in developing robust perception systems for automated driving under realistic multi-sensor occlusions.
AB - Reliable perception in automated vehicles under adverse conditions such as fog, rain, snow, and lens defocus is essential for the safety of road actors, particularly vulnerable road users. While prior work has focused primarily on camera occlusions, the impact on RADAR and LiDAR remains underexplored, particularly in a unified Bird's Eye View (BEV) space. To address this gap, we first apply occlusion to all three primary sensors (camera, RADAR, and LiDAR) and then systematically investigate its impact by projecting their outputs into the BEV space for unified analysis of vehicle and map segmentation. A parametrised occlusion pipeline is developed to apply occlusions to each sensor modality. We evaluate both geometry-based and transformer-based fusion architectures, revealing that transformer-based architectures consistently demonstrate greater robustness to sensor degradation. Notably, we demonstrate that BEVCar achieves 45.6% vehicle Intersection-over-Union (IoU) and 53.6% mean Intersection-over-Union (mIoU) under camera occlusion, surpassing other state-of-the-art (SOTA) models such as MMTraP (37.9% IoU / 47.9% mIoU) and CVT (36.0% IoU / 46.6% mIoU). These improvements are statistically significant (paired t-tests with bootstrap 95% confidence intervals, p < 0.001). Furthermore, projecting camera features into the BEV space using a backward projection strategy appears to offer greater resilience to occlusion than forward projection. These insights highlight the importance of architectural design, projection choice, and multi-sensor fusion in developing robust perception systems for automated driving under realistic multi-sensor occlusions.
KW - bird's eye view
KW - camera features projection
KW - geometry-based architectures
KW - map segmentation
KW - multi-sensor fusion
KW - sensor level occlusion
KW - transformer-based architectures
KW - vehicle segmentation
UR - https://www.scopus.com/pages/publications/105019548512
U2 - 10.1109/OJVT.2025.3621862
DO - 10.1109/OJVT.2025.3621862
M3 - Article
AN - SCOPUS:105019548512
SN - 2644-1330
VL - 6
SP - 2857
EP - 2875
JO - IEEE Open Journal of Vehicular Technology
JF - IEEE Open Journal of Vehicular Technology
ER -