TY - JOUR
T1 - G-MIND
T2 - Galway Multimodal Infrastructure Node Dataset for Intelligent Transportation Systems
AU - Molloy, Dara
AU - George, Roshan
AU - Brophy, Tim
AU - Deegan, Brian
AU - Mullins, Darragh
AU - Ward, Enda
AU - Horgan, Jonathan
AU - Eising, Ciaran
AU - Denny, Patrick
AU - Jones, Edward
AU - Glavin, Martin
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Autonomous and semi-autonomous vehicles require accurate perception of their surrounding environment to ensure safe operation, yet onboard sensors frequently encounter occlusions that result in incomplete dynamic environmental maps. Infrastructure-to-vehicle cooperative perception addresses this by deploying infrastructure nodes that monitor scenes and share reliable environmental maps with nearby vehicles via technologies such as C-V2X. However, existing infrastructure-perspective datasets lack diverse multimodal data and aerial footage, which are crucial for effectively determining the sensors necessary for safety-critical infrastructure node applications. This paper introduces G-MIND, a multimodal infrastructure node dataset supporting research into sensor suitability for infrastructure-assisted safety-critical applications. G-MIND is the first dataset to incorporate such a comprehensive range of sensing modalities for infrastructure-based perception: RGB, FIR, and neuromorphic cameras, LiDARs, RADAR, and aerial drone footage. With 91,500 annotated frames, G-MIND is larger than existing infrastructure perception datasets such as Ko-PER (10k frames), CoopScenes (40k frames), and DAIR-V2X (71k frames), enabling more comprehensive training and evaluation. The dataset captures day and night recordings featuring cars, pedestrians, and cyclists across diverse traffic scenarios. Beyond standard perception benchmarking, G-MIND includes specialized collections designed to test the boundaries of perception systems: maximum detection distance scenarios, far and occluded object scenarios, and pedestrian action prediction scenarios that challenge current algorithms. Additionally, this paper analyzes what constitutes an effective ITS infrastructure node sensor from a practical perspective, comparing modalities against technical criteria (field of view, spatial resolution, low-light performance, adverse-weather resilience) and pragmatic criteria (cost, durability). The dataset is available for non-commercial research at https://ieee-dataport.org/documents/galway-multimodal-infrastructure-node-dataset, with the SDK at https://github.com/daramolloy/GMIND-sdk.
AB - Autonomous and semi-autonomous vehicles require accurate perception of their surrounding environment to ensure safe operation, yet onboard sensors frequently encounter occlusions that result in incomplete dynamic environmental maps. Infrastructure-to-vehicle cooperative perception addresses this by deploying infrastructure nodes that monitor scenes and share reliable environmental maps with nearby vehicles via technologies such as C-V2X. However, existing infrastructure-perspective datasets lack diverse multimodal data and aerial footage, which are crucial for effectively determining the sensors necessary for safety-critical infrastructure node applications. This paper introduces G-MIND, a multimodal infrastructure node dataset supporting research into sensor suitability for infrastructure-assisted safety-critical applications. G-MIND is the first dataset to incorporate such a comprehensive range of sensing modalities for infrastructure-based perception: RGB, FIR, and neuromorphic cameras, LiDARs, RADAR, and aerial drone footage. With 91,500 annotated frames, G-MIND is larger than existing infrastructure perception datasets such as Ko-PER (10k frames), CoopScenes (40k frames), and DAIR-V2X (71k frames), enabling more comprehensive training and evaluation. The dataset captures day and night recordings featuring cars, pedestrians, and cyclists across diverse traffic scenarios. Beyond standard perception benchmarking, G-MIND includes specialized collections designed to test the boundaries of perception systems: maximum detection distance scenarios, far and occluded object scenarios, and pedestrian action prediction scenarios that challenge current algorithms. Additionally, this paper analyzes what constitutes an effective ITS infrastructure node sensor from a practical perspective, comparing modalities against technical criteria (field of view, spatial resolution, low-light performance, adverse-weather resilience) and pragmatic criteria (cost, durability). The dataset is available for non-commercial research at https://ieee-dataport.org/documents/galway-multimodal-infrastructure-node-dataset, with the SDK at https://github.com/daramolloy/GMIND-sdk.
KW - Automated Mobility
KW - Collaborative Driving Automation
KW - Cooperative Intelligent Transportation Systems (C-ITS)
KW - Infrastructure Sensing
KW - Roadside Units
KW - V2I
KW - V2X
UR - https://www.scopus.com/pages/publications/105025968253
U2 - 10.1109/OJVT.2025.3648251
DO - 10.1109/OJVT.2025.3648251
M3 - Article
AN - SCOPUS:105025968253
SN - 2644-1330
JO - IEEE Open Journal of Vehicular Technology
JF - IEEE Open Journal of Vehicular Technology
ER -