1000+ FPS 4D Gaussian Splatting for Dynamic Scene Rendering

20 March 2025
Yuheng Yuan, Qiuhong Shen, Xingyi Yang, Xinchao Wang
3DGS
Abstract

4D Gaussian Splatting (4DGS) has recently gained considerable attention as a method for reconstructing dynamic scenes. Despite achieving superior quality, 4DGS typically requires substantial storage and suffers from slow rendering speed. In this work, we delve into these issues and identify two key sources of temporal redundancy. (Q1) Short-Lifespan Gaussians: 4DGS uses a large portion of Gaussians with short temporal spans to represent scene dynamics, leading to an excessive number of Gaussians. (Q2) Inactive Gaussians: when rendering, only a small subset of Gaussians contributes to each frame; nevertheless, all Gaussians are processed during rasterization, resulting in redundant computation. To address these redundancies, we present 4DGS-1K, which runs at over 1000 FPS on modern GPUs. For Q1, we introduce the Spatial-Temporal Variation Score, a new pruning criterion that effectively removes short-lifespan Gaussians while encouraging 4DGS to capture scene dynamics with Gaussians of longer temporal spans. For Q2, we store a mask for active Gaussians across consecutive frames, significantly reducing redundant computation during rendering. Compared to vanilla 4DGS, our method achieves a 41× reduction in storage and 9× faster rasterization on complex dynamic scenes, while maintaining comparable visual quality. Please see our project page at this https URL.
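The abstract names two mechanisms, score-based pruning (Q1) and a per-frame activity mask (Q2), without giving their formulas. The NumPy sketch below is a toy illustration of that general pattern only: the scoring heuristic (contribution weighted by temporal span), the 3-sigma activity window, the percentile threshold, and all variable names are assumptions for illustration, not the paper's actual Spatial-Temporal Variation Score or mask construction.

import numpy as np

rng = np.random.default_rng(0)
N, T = 100_000, 300                      # number of Gaussians, number of frames

# Toy per-Gaussian temporal parameters and a proxy for rendering contribution.
mu_t = rng.uniform(0.0, 1.0, N)          # temporal center of each 4D Gaussian
sigma_t = rng.uniform(0.005, 0.3, N)     # temporal extent (std. dev.)
contribution = rng.uniform(0.0, 1.0, N)  # assumed per-Gaussian importance proxy

# (Q1) Prune short-lifespan Gaussians: score each Gaussian so that short
# temporal spans are penalized, then keep only high-scoring ones.
score = contribution * sigma_t           # hypothetical stand-in for the paper's score
keep = score > np.percentile(score, 90)  # illustrative threshold: keep top 10%
print(f"pruning keeps {keep.sum()} of {N} Gaussians")

# (Q2) Precompute a per-frame active mask so rasterization only touches
# Gaussians whose temporal support overlaps the frame being rendered.
frames = np.linspace(0.0, 1.0, T)
# Assumed activity rule: a Gaussian is active at time t if t lies within
# 3 sigma of its temporal center.
active = np.abs(frames[None, :] - mu_t[keep, None]) < 3.0 * sigma_t[keep, None]

t = 42
idx = np.nonzero(active[:, t])[0]        # indices handed to the rasterizer
print(f"frame {t}: rasterize {idx.size} of {keep.sum()} surviving Gaussians")

With figures in the ballpark the abstract describes (a small subset of Gaussians active per frame), the mask confines rasterization to that subset instead of iterating over every Gaussian, which is the source of the claimed rasterization speedup.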

@article{yuan2025_2503.16422,
  title={1000+ FPS 4D Gaussian Splatting for Dynamic Scene Rendering},
  author={Yuheng Yuan and Qiuhong Shen and Xingyi Yang and Xinchao Wang},
  journal={arXiv preprint arXiv:2503.16422},
  year={2025}
}