Near-Field Perception for Low-Speed Vehicle Automation Using Surround-View Fisheye Cameras

Ciaran Eising, Jonathan Horgan, Senthil Yogamani

Research output: Contribution to journal › Article › peer-review

Abstract

Cameras are the primary sensor in automated driving systems. They provide high information density and are optimal for detecting the road infrastructure cues laid out for human vision. Surround-view camera systems typically comprise four fisheye cameras with a 190°+ field of view, covering the entire 360° around the vehicle and focused on near-field sensing. They are the principal sensors for low-speed, high-accuracy, close-range sensing applications, such as automated parking, traffic jam assistance, and low-speed emergency braking. In this work, we provide a detailed survey of such vision systems, setting up the survey in the context of an architecture that can be decomposed into four modular components, namely Recognition, Reconstruction, Relocalization, and Reorganization. We jointly call this the 4R Architecture. We discuss how each component accomplishes a specific aspect and provide a positional argument that they can be synergized to form a complete perception system for low-speed automation. We support this argument by presenting results from previous works and by presenting architecture proposals for such a system. Qualitative results are presented in the video at https://youtu.be/ae8bCOF77uY.
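To make the 4R decomposition described above concrete, the following is a minimal sketch of how Recognition, Reconstruction, Relocalization, and Reorganization could be composed as independent modules writing into a shared near-field environment model. All class names, fields, and outputs here are illustrative assumptions for exposition, not the paper's implementation.

# Hypothetical sketch of the 4R decomposition (illustrative only).
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class FisheyeFrame:
    """One 190 degree+ fisheye image plus its camera identifier (front/rear/left/right)."""
    camera_id: str
    image: Any  # e.g. an HxWx3 array from the surround-view rig


@dataclass
class EnvironmentModel:
    """Shared near-field world model populated by the four modules."""
    objects: List[Dict] = field(default_factory=list)     # Recognition output
    geometry: List[Dict] = field(default_factory=list)    # Reconstruction output (depth/structure)
    pose: Dict = field(default_factory=dict)              # Relocalization output (map-relative pose)
    map_updates: List[Dict] = field(default_factory=list) # Reorganization output (map maintenance)


class Recognition:
    def process(self, frames: List[FisheyeFrame], model: EnvironmentModel) -> None:
        # Detect semantic entities such as pedestrians, vehicles, parking markings (placeholder).
        model.objects.append({"type": "pedestrian", "camera": frames[0].camera_id})


class Reconstruction:
    def process(self, frames: List[FisheyeFrame], model: EnvironmentModel) -> None:
        # Recover geometric structure around the vehicle, e.g. freespace or depth (placeholder).
        model.geometry.append({"freespace": "placeholder"})


class Relocalization:
    def process(self, frames: List[FisheyeFrame], model: EnvironmentModel) -> None:
        # Estimate the vehicle pose relative to a previously stored map (placeholder).
        model.pose = {"x": 0.0, "y": 0.0, "yaw": 0.0}


class Reorganization:
    def process(self, frames: List[FisheyeFrame], model: EnvironmentModel) -> None:
        # Keep the stored map consistent with the current scene over time (placeholder).
        model.map_updates.append({"revision": "placeholder"})


def run_4r_pipeline(frames: List[FisheyeFrame]) -> EnvironmentModel:
    """Run the four modules over one synchronized set of surround-view frames."""
    model = EnvironmentModel()
    for module in (Recognition(), Reconstruction(), Relocalization(), Reorganization()):
        module.process(frames, model)
    return model


if __name__ == "__main__":
    frames = [FisheyeFrame(camera_id=c, image=None) for c in ("front", "rear", "left", "right")]
    print(run_4r_pipeline(frames))

The point of the sketch is only the modular composition: each component consumes the same synchronized surround-view frames and contributes one aspect of the near-field perception output, which is the synergy the survey argues for.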

Original language: English
Pages (from-to): 13976-13993
Number of pages: 18
Journal: IEEE Transactions on Intelligent Transportation Systems
Volume: 23
Issue number: 9
DOIs
Publication status: Published - 1 Sep 2022

Keywords

  • 4Rs
  • Autonomous vehicles
  • computer vision
  • fisheye camera
  • surround-view systems

