
Pose-Free Omnidirectional Gaussian Splatting for 360-Degree Videos with Consistent Depth Priors

Chuanqing Zhuang
Xin Lu
Zehui Deng
Zhengda Lu
Yiqun Wang
Junqi Diao
Jun Xiao
Main: 8 pages · Bibliography: 2 pages · 8 figures · 5 tables
Abstract

Omnidirectional 3D Gaussian Splatting with panoramas is a key technique for 3D scene representation, yet existing methods typically rely on slow SfM to provide camera poses and sparse point priors. In this work, we propose a pose-free omnidirectional 3DGS method, named PFGS360, that reconstructs 3D Gaussians from unposed omnidirectional videos. To achieve accurate camera pose estimation, we first construct a spherical consistency-aware pose estimation module, which recovers poses by establishing consistent 2D-3D correspondences between the reconstructed Gaussians and the unposed images using the Gaussians' internal depth priors. In addition, to enhance the fidelity of novel view synthesis, we introduce a depth-inlier-aware densification module that extracts depth inliers and Gaussian outliers with consistent monocular depth priors, enabling efficient Gaussian densification and photorealistic novel view synthesis. Experiments show that our method significantly outperforms existing pose-free and pose-aware 3DGS methods on both real-world and synthetic 360-degree videos. Code is available at this https URL.
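To make the 2D-3D correspondence idea concrete, the sketch below lifts equirectangular panorama pixels with depth values to 3D points, the kind of back-projection that pose estimation from depth priors builds on. This is a generic illustration under assumed conventions (pixel-to-longitude/latitude mapping, y-up axes); the function name and layout are not from the paper.

```python
import numpy as np

def equirect_to_points(uv, depth, width, height):
    """Back-project equirectangular pixels to 3D points.

    uv:    (N, 2) array of pixel coordinates (u right, v down).
    depth: (N,) array of radial depths along each viewing ray.
    Assumes image center looks down +z, y points up (illustrative convention).
    """
    u, v = uv[:, 0], uv[:, 1]
    lon = (u / width - 0.5) * 2.0 * np.pi   # longitude in [-pi, pi)
    lat = (0.5 - v / height) * np.pi        # latitude in [-pi/2, pi/2]
    # Unit ray direction on the sphere, scaled by depth.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)
    return depth[:, None] * np.stack([x, y, z], axis=1)
```

With such 3D points rendered from the current Gaussians and matched to 2D features in an unposed frame, a pose can then be solved with a standard PnP-style optimization adapted to the spherical camera model.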
