de-Mas-Giménez G, Subirana A, Bernal Pérez E, Riu J, Casas J, García-Gómez P, et al. Preview of a multimodal data set for robust and safe autonomous driving under adverse weather conditions. In: IMEKO PhotoMet - International Symposium on Modern Photonic Metrology. Modena, Italy: IMEKO PhotoMet; 2025.

Abstract

Autonomous driving has made remarkable strides in recent years, yet adverse weather conditions such as fog, heavy rain, snow, and low-light environments remain major obstacles for perception systems. These conditions reduce visibility and introduce complex artifacts such as scattering, reflection, and sensor saturation, which degrade the performance of traditional camera- and LiDAR-based perception algorithms. Current state-of-the-art autonomous driving data sets avoid these edge cases. To address this, we introduce a new multimodal data set specifically curated to support research in robust perception under challenging environmental conditions. The data set was collected using a multimodal platform installed on a custom prototype that includes a diverse suite of sensors: visible RGB cameras, a Short-Wave Infrared (SWIR) camera, a long-wave infrared thermal camera, and a polarimetric camera. In addition to these optical sensors, the system incorporates a high-resolution solid-state LiDAR and two automotive-grade radars. These modalities were selected for their complementary responses to different environmental challenges. The prototype has a [60°, 40°] field of view overlapped by all modalities, ensuring enough redundancy to provide a good understanding of the vehicle's surroundings. Additionally, a GNSS/INS system is mounted to provide localization and vehicle odometry. All sensors are temporally paired using a best-effort synchronization algorithm and spatially registered using multimodal calibration boards, enabling accurate data fusion even at long distances. The data set covers a wide range of real-world driving scenarios, including urban, road, and high-velocity highway scenes, across various lighting and weather conditions, from clear skies to heavy rain, snow, and even fog. Each frame includes 2D and 3D annotations for object detection tasks. The data set is structured as a relational database with an architecture similar to that of the reference data set, nuScenes. To ease visualization of and navigation through the data set, an extended Python developer kit will be published. This work offers a preview of a multimodal data set that provides a rich and unique resource to the research community, enabling the development and evaluation of algorithms designed to operate reliably under degraded visibility. The data set is expected to be open-sourced later this year.
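
Since the data set is organized as a nuScenes-style relational database of token-linked tables, a minimal sketch of how such data is typically traversed may help readers unfamiliar with that layout. The sketch below uses the public nuscenes-devkit API on the reference nuScenes data set; the version string and dataroot path are placeholders, and the extended developer kit announced here may expose a similar but not identical interface.

```python
# Sketch of nuScenes-style relational access; the announced devkit is expected
# to follow these conventions but its actual API is not yet published.
from nuscenes.nuscenes import NuScenes  # reference devkit the data set mirrors

# Placeholder version and dataroot: adapt to the local installation.
nusc = NuScenes(version='v1.0-mini', dataroot='/data/sets/nuscenes', verbose=True)

# Each scene is a sequence of samples (keyframes); each sample links to
# per-sensor data records (cameras, LiDAR, radars) through tokens.
scene = nusc.scene[0]
sample = nusc.get('sample', scene['first_sample_token'])
for channel, sd_token in sample['data'].items():
    sample_data = nusc.get('sample_data', sd_token)
    print(channel, sample_data['filename'])
```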