G-MIND: Galway Multimodal Infrastructure Node Dataset for Intelligent Transportation Systems

Research output: Contribution to journal › Article › peer-review

Abstract

Autonomous and semi-autonomous vehicles require accurate perception of their surrounding environment to ensure safe operation, yet onboard sensors frequently encounter occlusion challenges that result in incomplete dynamic environmental maps. Infrastructure-to-vehicle cooperative perception addresses this by deploying infrastructure nodes that monitor scenes and share reliable environmental maps with nearby vehicles via technologies such as C-V2X. However, existing infrastructure-perspective datasets lack diverse multimodal data and aerial footage, which are crucial for effectively determining the necessary sensors for safety-critical infrastructure node applications. This paper introduces G-MIND, a multimodal infrastructure node dataset supporting research into sensor suitability for infrastructure-assisted safety-critical applications. G-MIND is the first dataset to incorporate this comprehensive range of sensing modalities for infrastructure-based perception: RGB, FIR, and neuromorphic cameras, LiDARs, RADAR, and aerial drone footage. With 91,500 annotated frames, G-MIND offers a larger scale than existing infrastructure perception datasets such as Ko-PER (10k frames), CoopScenes (40k frames), and DAIR-V2X (71k frames), enabling more comprehensive training and evaluation. The dataset captures day and night recordings featuring cars, pedestrians, and cyclists across diverse traffic scenarios. Beyond standard perception benchmarking, G-MIND includes specialized collections designed to test perception system boundaries: maximum detection distance scenarios, far and occluded object scenarios, and pedestrian action prediction scenarios that challenge current algorithms. Additionally, this paper analyzes what constitutes an effective ITS infrastructure node sensor from a practical perspective, comparing modalities against technical criteria (field of view, spatial resolution, low-light performance, adverse weather resilience) and pragmatic criteria (cost, durability).
The dataset is available for non-commercial research at https://ieee-dataport.org/documents/galway-multimodal-infrastructure-node-dataset, with the SDK at https://github.com/daramolloy/GMIND-sdk.

Original language: English
Journal: IEEE Open Journal of Vehicular Technology
Publication status: Accepted/In press - 2025

Keywords

  • Automated Mobility
  • Collaborative Driving Automation
  • Cooperative Intelligent Transportation Systems (C-ITS)
  • Infrastructure Sensing
  • Roadside Units
  • V2I
  • V2X
