Event-boosted Deformable 3D Gaussians for Dynamic Scene Reconstruction

Abstract

Deformable 3D Gaussian Splatting (3D-GS) is limited by missing intermediate motion information due to the low temporal resolution of RGB cameras. To address this, we introduce the first approach combining event cameras, which capture high-temporal-resolution, continuous motion data, with deformable 3D-GS for dynamic scene reconstruction. We observe that threshold modeling for events plays a crucial role in achieving high-quality reconstruction. We therefore propose a GS-Threshold Joint Modeling strategy, creating a mutually reinforcing process that greatly improves both 3D reconstruction and threshold modeling. Moreover, we introduce a Dynamic-Static Decomposition strategy that first identifies dynamic areas by exploiting the inability of static Gaussians to represent motions, then applies a buffer-based soft decomposition to separate dynamic and static areas. This strategy accelerates rendering by avoiding unnecessary deformation in static areas, while focusing deformation on dynamic areas to enhance fidelity. Additionally, we contribute the first event-inclusive 4D benchmark with synthetic and real-world dynamic scenes, on which our method achieves state-of-the-art performance.
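For context on the threshold modeling the abstract refers to: event cameras fire an event at a pixel when the log-intensity change since the last event exceeds a contrast threshold, with polarity given by the sign of the change. The sketch below is a generic, frame-based simplification of this standard event generation model, not the paper's method; the function name, the threshold value, and the `eps` stabilizer are illustrative assumptions.

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.2, eps=1e-3):
    """Generate simplified events between two grayscale frames.

    An event fires at a pixel when the log-intensity change exceeds
    the contrast threshold; polarity is the sign of the change.
    Returns arrays of pixel coordinates (xs, ys) and polarities.
    (Generic event-camera model for illustration, not the paper's code.)
    """
    dlog = np.log(curr + eps) - np.log(prev + eps)
    pos_y, pos_x = np.nonzero(dlog >= threshold)   # brightening pixels
    neg_y, neg_x = np.nonzero(dlog <= -threshold)  # darkening pixels
    xs = np.concatenate([pos_x, neg_x])
    ys = np.concatenate([pos_y, neg_y])
    pol = np.concatenate([np.ones(len(pos_x), dtype=int),
                          -np.ones(len(neg_x), dtype=int)])
    return xs, ys, pol

# Toy example: one pixel brightens, one darkens.
prev = np.full((2, 2), 0.5)
curr = prev.copy()
curr[0, 0] = 0.9   # brightening -> positive event
curr[1, 1] = 0.2   # darkening  -> negative event
xs, ys, pol = events_from_frames(prev, curr)
```

Because the threshold is generally unknown and varies in practice, treating it as a quantity to be estimated jointly with the scene, as the abstract proposes, is what makes event supervision usable for reconstruction.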

@article{xu2025_2411.16180,
  title={Event-boosted Deformable 3D Gaussians for Dynamic Scene Reconstruction},
  author={Wenhao Xu and Wenming Weng and Yueyi Zhang and Ruikang Xu and Zhiwei Xiong},
  journal={arXiv preprint arXiv:2411.16180},
  year={2025}
}