TY - JOUR
T1 - Near-Field Perception for Low-Speed Vehicle Automation Using Surround-View Fisheye Cameras
AU - Eising, Ciaran
AU - Horgan, Jonathan
AU - Yogamani, Senthil
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022/9/1
Y1 - 2022/9/1
N2 - Cameras are the primary sensor in automated driving systems. They provide high information density and are optimal for detecting road infrastructure cues laid out for human vision. Surround-view camera systems typically comprise four fisheye cameras with a 190°+ field of view, together covering the entire 360° around the vehicle and focused on near-field sensing. They are the principal sensors for low-speed, high-accuracy, close-range sensing applications, such as automated parking, traffic jam assistance, and low-speed emergency braking. In this work, we provide a detailed survey of such vision systems, setting up the survey in the context of an architecture that can be decomposed into four modular components, namely Recognition, Reconstruction, Relocalization, and Reorganization. We jointly call this the 4R Architecture. We discuss how each component accomplishes a specific aspect and provide a positional argument that they can be synergized to form a complete perception system for low-speed automation. We support this argument with results from previous works and with architecture proposals for such a system. Qualitative results are presented in the video at https://youtu.be/ae8bCOF77uY.
AB - Cameras are the primary sensor in automated driving systems. They provide high information density and are optimal for detecting road infrastructure cues laid out for human vision. Surround-view camera systems typically comprise four fisheye cameras with a 190°+ field of view, together covering the entire 360° around the vehicle and focused on near-field sensing. They are the principal sensors for low-speed, high-accuracy, close-range sensing applications, such as automated parking, traffic jam assistance, and low-speed emergency braking. In this work, we provide a detailed survey of such vision systems, setting up the survey in the context of an architecture that can be decomposed into four modular components, namely Recognition, Reconstruction, Relocalization, and Reorganization. We jointly call this the 4R Architecture. We discuss how each component accomplishes a specific aspect and provide a positional argument that they can be synergized to form a complete perception system for low-speed automation. We support this argument with results from previous works and with architecture proposals for such a system. Qualitative results are presented in the video at https://youtu.be/ae8bCOF77uY.
KW - 4Rs
KW - Autonomous vehicles
KW - computer vision
KW - fisheye camera
KW - surround-view systems
UR - http://www.scopus.com/inward/record.url?scp=85108330121&partnerID=8YFLogxK
U2 - 10.1109/TITS.2021.3127646
DO - 10.1109/TITS.2021.3127646
M3 - Article
AN - SCOPUS:85108330121
SN - 1524-9050
VL - 23
SP - 13976
EP - 13993
JO - IEEE Transactions on Intelligent Transportation Systems
JF - IEEE Transactions on Intelligent Transportation Systems
IS - 9
ER -