de-Mas-Giménez G, Subirana A, García-Gómez P, Bernal Pérez E, Casas J, Royo S. Multimodal sensing prototype for robust autonomous driving under adverse weather conditions. In: SPIE Optical Metrology: Optical Measurement Systems for Industrial Inspection XIV. Munich, Germany: SPIE; 2025.

Abstract

Current autonomous driving datasets face significant limitations in the diversity of adverse weather conditions they cover and in sensor generalization. Moreover, commonly used sensors such as visible cameras and rotating LiDARs struggle under harsh weather. To address these challenges, this work introduces a multimodal data acquisition system that integrates a high-resolution solid-state LiDAR, automotive RADARs, a combination of visible, thermal, SWIR, and polarimetric cameras, and a GNSS/INS system for odometry and localization. This diverse sensor suite ensures robust performance in low-visibility environments, such as fog or heavy rain, by providing complementary and redundant information. Controlled by an autonomous-safe Nvidia DRIVE AGX with a ROS-based architecture, the system enables precise spatial calibration, temporal synchronization, and real-time data fusion and perception algorithms across an overlapping field of view of 60° × 20°. With this system, this work aims to publish an open-source, labeled multimodal dataset for autonomous driving in the coming months, complemented by synthetic data generated through a digital twin in Nvidia's Omniverse platform. The dataset will follow a structure similar to that of the well-known nuScenes and will ship with a comparable developer kit for easy navigation. It will support a wide range of autonomous driving applications, such as 3D multimodal object detection and tracking, SLAM, depth completion, and perception enhancement in challenging scenarios.
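The abstract mentions temporal synchronization across the sensor suite within a ROS-based architecture but does not detail the mechanism. As a minimal sketch of how multimodal messages can be grouped by timestamp in ROS 1 using the standard message_filters package, consider the following; the topic names, message types, and time window are assumptions for illustration, not the prototype's actual configuration.

```python
# Minimal sketch: approximate-time grouping of three sensor streams in ROS 1.
# Assumptions: rospy stack, hypothetical topic names and message types.
import rospy
import message_filters
from sensor_msgs.msg import Image, PointCloud2

def fused_callback(rgb_msg, thermal_msg, lidar_msg):
    # All three messages fall within the synchronizer's time window,
    # so they can be treated as one multimodal sample.
    rospy.loginfo("Synchronized sample at t=%.3f", rgb_msg.header.stamp.to_sec())

rospy.init_node("multimodal_sync_sketch")

rgb_sub = message_filters.Subscriber("/camera/rgb/image_raw", Image)
thermal_sub = message_filters.Subscriber("/camera/thermal/image_raw", Image)
lidar_sub = message_filters.Subscriber("/lidar/points", PointCloud2)

# ApproximateTimeSynchronizer groups messages whose header stamps differ by
# less than `slop` seconds (here 50 ms), queueing up to 10 candidate sets.
sync = message_filters.ApproximateTimeSynchronizer(
    [rgb_sub, thermal_sub, lidar_sub], queue_size=10, slop=0.05)
sync.registerCallback(fused_callback)

rospy.spin()
```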
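Since the planned dataset is said to mirror the structure and developer kit of nuScenes, the sketch below shows the access pattern the forthcoming kit would presumably replicate, using the real nuscenes-devkit API against the public v1.0-mini split; the dataroot path is a placeholder.

```python
# Minimal sketch of the nuScenes navigation pattern (real nuscenes-devkit API;
# the dataroot path is a placeholder for a local copy of the mini split).
from nuscenes.nuscenes import NuScenes

nusc = NuScenes(version="v1.0-mini", dataroot="/data/sets/nuscenes", verbose=True)

scene = nusc.scene[0]                        # scenes -> samples -> sensor data
sample = nusc.get("sample", scene["first_sample_token"])

for channel, sd_token in sample["data"].items():
    sd = nusc.get("sample_data", sd_token)   # one record per sensor channel
    print(channel, sd["filename"])
```

A devkit with this scene/sample/sample_data layout would let users iterate over synchronized multimodal frames and their annotations with the same few calls regardless of sensor modality, which is presumably the motivation for mirroring it.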