TY - JOUR
T1 - Minimizing Occlusion Effect on Multi-View Camera Perception in BEV with Multi-Sensor Fusion
AU - Kumar, Sanjay
AU - Truong, Hiep
AU - Sharma, Sushil
AU - Sistu, Ganesh
AU - Scanlan, Tony
AU - Grua, Eoin
AU - Eising, Ciarán
N1 - Publisher Copyright:
© 2025 Society for Imaging Science and Technology.
PY - 2025
Y1 - 2025
N2 - Autonomous driving technology is rapidly evolving, offering the potential for safer and more efficient transportation. However, the performance of these systems can be significantly compromised by sensor occlusion caused by environmental factors such as dirt, dust, rain, and fog. These occlusions severely affect vision-based tasks such as object detection, vehicle segmentation, and lane recognition. In this paper, we investigate the impact of various kinds of occlusions on camera sensors by projecting their effects from multi-view camera images of the nuScenes dataset into the Bird’s-Eye View (BEV) domain. This approach allows us to analyze how occlusions are spatially distributed and how they influence vehicle segmentation accuracy within the BEV domain. Despite significant advances in sensor technology and multi-sensor fusion, a gap remains in the existing literature regarding the specific effects of camera occlusions on BEV-based perception systems. To address this gap, we use a multi-sensor fusion technique that integrates LiDAR and radar sensor data to mitigate the performance degradation caused by occluded cameras. Our findings demonstrate that this approach significantly enhances the accuracy and robustness of vehicle segmentation tasks, leading to more reliable autonomous driving systems. https: // youtu.
AB - Autonomous driving technology is rapidly evolving, offering the potential for safer and more efficient transportation. However, the performance of these systems can be significantly compromised by sensor occlusion caused by environmental factors such as dirt, dust, rain, and fog. These occlusions severely affect vision-based tasks such as object detection, vehicle segmentation, and lane recognition. In this paper, we investigate the impact of various kinds of occlusions on camera sensors by projecting their effects from multi-view camera images of the nuScenes dataset into the Bird’s-Eye View (BEV) domain. This approach allows us to analyze how occlusions are spatially distributed and how they influence vehicle segmentation accuracy within the BEV domain. Despite significant advances in sensor technology and multi-sensor fusion, a gap remains in the existing literature regarding the specific effects of camera occlusions on BEV-based perception systems. To address this gap, we use a multi-sensor fusion technique that integrates LiDAR and radar sensor data to mitigate the performance degradation caused by occluded cameras. Our findings demonstrate that this approach significantly enhances the accuracy and robustness of vehicle segmentation tasks, leading to more reliable autonomous driving systems. https: // youtu.
KW - Bird’s Eye View (BEV)
KW - Multi-Sensor Fusion
KW - Occluded Image Data
KW - Vehicle Segmentation
UR - https://www.scopus.com/pages/publications/105000826436
U2 - 10.2352/EI.2025.37.15.AVM-113
DO - 10.2352/EI.2025.37.15.AVM-113
M3 - Conference article
AN - SCOPUS:105000826436
SN - 2470-1173
VL - 37
JO - IS and T International Symposium on Electronic Imaging Science and Technology
JF - IS and T International Symposium on Electronic Imaging Science and Technology
IS - 15
M1 - AVM-113
T2 - IS and T International Symposium on Electronic Imaging 2025: Autonomous Vehicles and Machines, AVM 2025
Y2 - 2 February 2025 through 6 February 2025
ER -