VALISENS: A Validated Innovative Multi-Sensor System for Cooperative Automated Driving

Perception is a core capability of automated vehicles and has advanced significantly through modern sensor technologies and artificial intelligence. However, perception systems still face challenges in complex real-world scenarios. To improve robustness against various external factors, multi-sensor fusion techniques are essential, combining the strengths of different sensor modalities. With recent developments in Vehicle-to-Everything (V2X) communication, sensor fusion can now extend beyond a single vehicle to a cooperative multi-agent system involving Connected Automated Vehicles (CAVs) and intelligent infrastructure. This paper presents VALISENS, an innovative multi-sensor system distributed across multiple agents. It integrates onboard and roadside LiDARs, radars, thermal cameras, and RGB cameras to enhance situational awareness and support cooperative automated driving. The thermal cameras add critical redundancy for perceiving Vulnerable Road Users (VRUs), while fusion with roadside sensors mitigates visual occlusions and extends the perception range beyond the limits of individual vehicles. We introduce the corresponding perception module built on this sensor system, which includes object detection, tracking, motion forecasting, and high-level data fusion. The proposed system demonstrates the potential of cooperative perception in real-world test environments and lays the groundwork for future Cooperative Intelligent Transport Systems (C-ITS) applications.
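
As a rough illustration of the high-level (object-level) fusion the abstract describes, the sketch below merges detection lists from a vehicle and a roadside unit in a shared world frame, associating objects with the Hungarian algorithm and passing unmatched detections through so that roadside-only objects (e.g., occluded VRUs) survive. This is a minimal sketch under stated assumptions: the Detection format, the fuse_detections name, and the 2 m gating threshold are illustrative, not the paper's actual interface.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Hypothetical detection record: 2D position in a shared world frame
    # plus a class label and source agent. The real VALISENS message
    # format is not specified in the abstract.
    def make_det(x, y, cls, agent):
        return {"xy": np.array([x, y]), "cls": cls, "agent": agent}

    def fuse_detections(vehicle_dets, roadside_dets, gate=2.0):
        """Object-level (late) fusion of two agents' detection lists.

        Pairs are associated via the Hungarian algorithm on Euclidean
        distance; matches within `gate` metres and with the same class
        are merged by averaging positions. Unmatched detections are
        kept, so roadside-only objects extend the vehicle's view.
        """
        if not vehicle_dets or not roadside_dets:
            return list(vehicle_dets) + list(roadside_dets)

        cost = np.array([[np.linalg.norm(v["xy"] - r["xy"])
                          for r in roadside_dets] for v in vehicle_dets])
        rows, cols = linear_sum_assignment(cost)

        fused, used_v, used_r = [], set(), set()
        for i, j in zip(rows, cols):
            if cost[i, j] <= gate and vehicle_dets[i]["cls"] == roadside_dets[j]["cls"]:
                fused.append(make_det(*(vehicle_dets[i]["xy"] + roadside_dets[j]["xy"]) / 2,
                                      vehicle_dets[i]["cls"], "fused"))
                used_v.add(i); used_r.add(j)
        fused += [d for k, d in enumerate(vehicle_dets) if k not in used_v]
        fused += [d for k, d in enumerate(roadside_dets) if k not in used_r]
        return fused

    # Example: a pedestrian seen by both agents is merged into one track;
    # a cyclist visible only from the roadside unit is retained.
    veh = [make_det(10.0, 4.0, "pedestrian", "cav")]
    rsu = [make_det(10.5, 4.2, "pedestrian", "rsu"),
           make_det(25.0, -3.0, "cyclist", "rsu")]
    print(fuse_detections(veh, rsu))

In a real cooperative pipeline the association would also weigh sensor covariances and track history rather than raw Euclidean distance, but the pass-through of unmatched roadside detections is the mechanism by which infrastructure sensing mitigates occlusion.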
@article{wan2025_2505.06980,
  title   = {VALISENS: A Validated Innovative Multi-Sensor System for Cooperative Automated Driving},
  author  = {Lei Wan and Prabesh Gupta and Andreas Eich and Marcel Kettelgerdes and Hannan Ejaz Keen and Michael Klöppel-Gersdorf and Alexey Vinel},
  journal = {arXiv preprint arXiv:2505.06980},
  year    = {2025}
}