Multimodal Sensing and Perception for Autonomous Vehicles

Type: National
Start: Jan 2023
End: Dec 2024

Reference

This research is funded by the Spanish Ministerio de Ciencia, Innovación y Universidades (MICINN), project number TED2021-132338B-I00.

Description

This project is the result of an ongoing collaboration with S. Royo, G. de-Mas-Giménez, A. Subirana, P. García-Gómez, E. Bernal Pérez, and all the team from CD6-UPC / Beamagine.

Autonomous driving is the future of urban mobility, especially in an increasingly ageing society. In the medium term, journeys will be made in vehicles capable of making real-time decisions without human intervention in all types of situations, and the first trials have already begun in cities like San Francisco, Phoenix, and Singapore. Autonomous driving will help optimise energy consumption and traffic management, reduce accidents, and improve mobility for groups such as the elderly or disabled.

Achieving this transformation requires large volumes of precise data from detectors with various operational modes, enabling autonomous systems to interpret their surroundings and act safely, even in complex environments. This in turn requires training advanced artificial intelligence (AI) algorithms; to be reliable in all circumstances, these algorithms need a broad and diverse data set from which to learn generalisable representations and make robust decisions.

USEFUL will gather millions of images from multiple sensors, some of them disruptive in the field of autonomous vehicles, and convert them into usable data for designing reliable autonomous driving systems.

The data will be anonymised (not identifying pedestrians or vehicles) and incorporated into a public dataset that will allow other researchers to develop related algorithms. The goal is for the dataset to be as diverse as possible: capturing rural, urban, motorway, and other scenarios; including all types of vehicles, especially the most vulnerable such as bicycles or scooters; and collecting data in real weather conditions and realistic environments where sensor performance may vary. These data will then be used to develop more reliable applications for object detection on roads, environment mapping, movement prediction of road users, and digital twin creation, among others.
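As a hypothetical illustration of the anonymisation step mentioned above (the project's actual pipeline is not described here), regions flagged by a pedestrian or licence-plate detector could be blurred before public release. A minimal single-channel sketch with NumPy, where the bounding boxes are assumed to come from an upstream detector:

```python
import numpy as np

def anonymise(image: np.ndarray, boxes, k: int = 15) -> np.ndarray:
    """Blur each (x0, y0, x1, y1) region of a 2-D image with a k x k box filter."""
    out = image.copy()
    for x0, y0, x1, y1 in boxes:
        region = out[y0:y1, x0:x1].astype(float)
        pad = k // 2
        # Edge-replicate padding so the filter is defined at the region border.
        padded = np.pad(region, ((pad, pad), (pad, pad)), mode="edge")
        blurred = np.zeros_like(region)
        # Sum the k*k shifted copies, then divide: a plain box blur.
        for dy in range(k):
            for dx in range(k):
                blurred += padded[dy:dy + region.shape[0], dx:dx + region.shape[1]]
        out[y0:y1, x0:x1] = (blurred / (k * k)).astype(image.dtype)
    return out
```

In practice one would use an optimised filter (e.g. OpenCV's `GaussianBlur`) and per-channel processing; the point is only that anonymisation can be a local, irreversible transform applied before the dataset is published.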

Additionally, USEFUL will allow testing different AI training methodologies to identify the most effective way of achieving safe perception systems. The vehicle will also enable future testing and validation of new types of autonomous driving sensors, new sensor distributions, and the development of detailed mapping applications for all types of environments.

Publications

de-Mas-Giménez G, Subirana A, Bernal Pérez E, Riu J, Casas JR, García-Gómez P, Royo S. Preview of a multimodal data set for robust and safe autonomous driving under adverse weather conditions. In: IMEKO PhotoMet - International Symposium on Modern Photonic Metrology. Modena, Italy: IMEKO; 2025.
de-Mas-Giménez G, Subirana A, García-Gómez P, Bernal Pérez E, Riu J, Casas JR, Royo S. Multimodal data acquisition prototype for autonomous driving in adverse weather conditions. In: XIV Reunión OptoElectrónica (OPTOEL 2025). Terrassa, Spain: OPTOEL; 2025.
de-Mas-Giménez G, Subirana A, García-Gómez P, Bernal Pérez E, Casas JR, Royo S. Multimodal sensing prototype for robust autonomous driving under adverse weather conditions. In: SPIE Optical Metrology: Optical Measurement Systems for Industrial Inspection XIV. Munich, Germany: SPIE; 2025.
Chávez Plasencia A, García-Gómez P, Bernal Pérez E, de-Mas-Giménez G, Casas JR, Royo S. A Preliminary Study of Deep Learning Sensor Fusion for Pedestrian Detection. Sensors. 2023;23(8).

Demos and Resources

USEFUL Dataset

Collaborators