Driving scene reconstruction and rendering have advanced significantly with 3D Gaussian Splatting. However, most prior research focuses on rendering quality along a pre-recorded vehicle path and struggles to generalize to out-of-path viewpoints, owing to the lack of high-quality supervision for those views. To address this issue, we introduce an Inverse View Warping technique that creates compact, high-quality images as supervision for reconstructing out-of-path views, enabling high-quality rendering results for those views. For accurate and robust inverse view warping, a depth bootstrap strategy is proposed to obtain on-the-fly dense depth maps during optimization, overcoming the sparsity and incompleteness of LiDAR depth data. Our method achieves superior in-path and out-of-path reconstruction and rendering performance on the widely used Waymo Open Dataset. In addition, we propose a simulator-based benchmark that provides out-of-path ground truth and quantitatively evaluates out-of-path rendering, where our method outperforms previous methods by a significant margin.
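To make the idea concrete, below is a minimal sketch of depth-based inverse view warping: each pixel of an out-of-path (target) view is back-projected with a dense depth map, transformed into a recorded (source) camera, and its color is sampled from the recorded image. This is an illustrative assumption-based sketch, not the paper's implementation; the function name inverse_warp, the shared intrinsics K, and the transform T_src_from_tgt are hypothetical.

import numpy as np
from scipy.ndimage import map_coordinates

def inverse_warp(img_src, depth_tgt, K, T_src_from_tgt):
    """Warp a recorded (source) image into an out-of-path (target) view.

    img_src        : (H, W, 3) recorded image along the driving path
    depth_tgt      : (H, W) dense depth rendered at the target viewpoint
    K              : (3, 3) shared camera intrinsics (simplifying assumption)
    T_src_from_tgt : (4, 4) rigid transform from target- to source-camera frame
    """
    H, W = depth_tgt.shape
    # Pixel grid of the target view in homogeneous coordinates.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=0).reshape(3, -1).astype(np.float64)

    # Back-project target pixels to 3D using the dense depth map.
    cam_tgt = np.linalg.inv(K) @ pix * depth_tgt.reshape(1, -1)

    # Move the points into the source camera frame and project them.
    R, t = T_src_from_tgt[:3, :3], T_src_from_tgt[:3, 3:4]
    cam_src = R @ cam_tgt + t
    proj = K @ cam_src
    u_src = proj[0] / proj[2]
    v_src = proj[1] / proj[2]

    # Bilinearly sample the recorded image at the reprojected locations.
    coords = np.stack([v_src, u_src])  # (row, col) order for map_coordinates
    warped = np.stack(
        [map_coordinates(img_src[..., c], coords, order=1, mode='constant', cval=0.0)
         for c in range(3)], axis=-1).reshape(H, W, 3)

    # Pixels behind the source camera or outside its image plane are invalid.
    valid = (cam_src[2] > 0) & (u_src >= 0) & (u_src < W) & (v_src >= 0) & (v_src < H)
    return warped, valid.reshape(H, W)

In this sketch the dense depth at the target view plays the role the paper assigns to the depth bootstrap strategy: without it, sparse LiDAR depth would leave most target pixels without a valid reprojection.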
@article{zhou2025_2502.21093,
  title   = {FlexDrive: Toward Trajectory Flexibility in Driving Scene Reconstruction and Rendering},
  author  = {Jingqiu Zhou and Lue Fan and Linjiang Huang and Xiaoyu Shi and Si Liu and Zhaoxiang Zhang and Hongsheng Li},
  journal = {arXiv preprint arXiv:2502.21093},
  year    = {2025}
}