TY - JOUR
T1 - Surround-View Fisheye Camera Perception for Automated Driving
T2 - Overview, Survey & Challenges
AU - Kumar, Varun Ravi
AU - Eising, Ciaran
AU - Witt, Christian
AU - Yogamani, Senthil Kumar
N1 - Publisher Copyright:
© 2000-2011 IEEE.
PY - 2023/4/1
Y1 - 2023/4/1
N2 - Surround-view fisheye cameras are commonly used for near-field sensing in automated driving. Four fisheye cameras, one on each side of the vehicle, are sufficient to cover 360° around the vehicle and capture the entire near-field region. Primary use cases include automated parking, traffic jam assist, and urban driving. There are limited datasets and very little work on near-field perception tasks, as the focus in automotive perception is on far-field perception. In contrast to far-field, surround-view perception poses additional challenges due to high-precision object detection requirements of 10 cm and partial visibility of objects. Due to the large radial distortion of fisheye cameras, standard algorithms cannot be extended easily to the surround-view use case. Thus, we are motivated to provide a self-contained reference on automotive fisheye camera perception for researchers and practitioners. Firstly, we provide a unified and taxonomic treatment of commonly used fisheye camera models. Secondly, we discuss various perception tasks and the existing literature. Finally, we discuss the challenges and future directions.
AB - Surround-view fisheye cameras are commonly used for near-field sensing in automated driving. Four fisheye cameras, one on each side of the vehicle, are sufficient to cover 360° around the vehicle and capture the entire near-field region. Primary use cases include automated parking, traffic jam assist, and urban driving. There are limited datasets and very little work on near-field perception tasks, as the focus in automotive perception is on far-field perception. In contrast to far-field, surround-view perception poses additional challenges due to high-precision object detection requirements of 10 cm and partial visibility of objects. Due to the large radial distortion of fisheye cameras, standard algorithms cannot be extended easily to the surround-view use case. Thus, we are motivated to provide a self-contained reference on automotive fisheye camera perception for researchers and practitioners. Firstly, we provide a unified and taxonomic treatment of commonly used fisheye camera models. Secondly, we discuss various perception tasks and the existing literature. Finally, we discuss the challenges and future directions.
KW - Automated driving
KW - bird's-eye view perception
KW - fisheye camera
KW - multi-task learning
KW - omnidirectional camera
KW - surround view perception
UR - http://www.scopus.com/inward/record.url?scp=85146573925&partnerID=8YFLogxK
U2 - 10.1109/TITS.2023.3235057
DO - 10.1109/TITS.2023.3235057
M3 - Article
AN - SCOPUS:85146573925
SN - 1524-9050
VL - 24
SP - 3638
EP - 3659
JO - IEEE Transactions on Intelligent Transportation Systems
JF - IEEE Transactions on Intelligent Transportation Systems
IS - 4
ER -