Omnidirectional Depth-Aided Occupancy Prediction based on Cylindrical Voxel for Autonomous Driving

Accurate 3D perception is essential for autonomous driving. Traditional methods often struggle with geometric ambiguity due to a lack of geometric priors. To address this challenge, we use omnidirectional depth estimation to introduce geometric priors. Building on the depth information, we propose a Sketch-Coloring framework, OmniDepth-Occ. Additionally, our approach introduces a cylindrical voxel representation based on polar coordinates to better align with the radial nature of panoramic camera views. To address the lack of fisheye camera datasets for autonomous driving tasks, we also build a virtual scene dataset with six fisheye cameras, with a data volume twice that of SemanticKITTI. Experimental results demonstrate that our Sketch-Coloring network significantly enhances 3D perception performance.
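
For illustration, the sketch below shows one plausible way to bin 3D points into a cylindrical (polar) voxel grid of the kind the abstract describes. The function name, ranges, and bin counts are assumptions chosen for this example, not the paper's actual configuration.

import numpy as np

def cylindrical_voxelize(points, r_range=(2.0, 50.0), z_range=(-3.0, 3.0),
                         n_r=64, n_theta=360, n_z=16):
    """Map Cartesian points (N, 3) to cylindrical voxel indices (r_bin, theta_bin, z_bin).

    All ranges and bin counts are illustrative defaults, not the paper's settings.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x ** 2 + y ** 2)      # radial distance from the ego vehicle
    theta = np.arctan2(y, x)          # azimuth angle in [-pi, pi)

    r_idx = ((r - r_range[0]) / (r_range[1] - r_range[0]) * n_r).astype(np.int64)
    theta_idx = ((theta + np.pi) / (2 * np.pi) * n_theta).astype(np.int64)
    z_idx = ((z - z_range[0]) / (z_range[1] - z_range[0]) * n_z).astype(np.int64)

    # Keep only points that fall inside the cylindrical grid.
    valid = ((r_idx >= 0) & (r_idx < n_r) &
             (theta_idx >= 0) & (theta_idx < n_theta) &
             (z_idx >= 0) & (z_idx < n_z))
    return np.stack([r_idx, theta_idx, z_idx], axis=1)[valid]

Compared with a regular Cartesian grid, such a layout keeps angular resolution constant across all viewing directions, which is the property the paper motivates for panoramic camera rigs.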
@article{wu2025_2504.01023,
  title   = {Omnidirectional Depth-Aided Occupancy Prediction based on Cylindrical Voxel for Autonomous Driving},
  author  = {Chaofan Wu and Jiaheng Li and Jinghao Cao and Ming Li and Yongkang Feng and Jiayu Wu and Shuwen Xu and Zihang Gao and Sidan Du and Yang Li},
  journal = {arXiv preprint arXiv:2504.01023},
  year    = {2025}
}