Second-order Optimization of Gaussian Splats with Importance Sampling

17 April 2025
Hamza Pehlivan
Andrea Boscolo Camiletto
Lin Geng Foo
Marc Habermann
Christian Theobalt
    3DGS
Abstract

3D Gaussian Splatting (3DGS) is widely used for novel view synthesis due to its high rendering quality and fast inference time. However, 3DGS predominantly relies on first-order optimizers such as Adam, which leads to long training times. To address this limitation, we propose a novel second-order optimization strategy based on Levenberg-Marquardt (LM) and Conjugate Gradient (CG), which we specifically tailor towards Gaussian Splatting. Our key insight is that the Jacobian in 3DGS exhibits significant sparsity, since each Gaussian affects only a limited number of pixels. We exploit this sparsity by proposing a matrix-free and GPU-parallelized LM optimization. To further improve its efficiency, we propose sampling strategies for both the camera views and the loss function and, consequently, the normal equation, significantly reducing the computational complexity. In addition, we increase the convergence rate of the second-order approximation by introducing an effective heuristic to determine the learning rate that avoids the expensive cost of line search methods. As a result, our method achieves a 3× speedup over standard LM and outperforms Adam by ~6× when the Gaussian count is low, while remaining competitive for moderate counts. Project Page: this https URL
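To make the general recipe in the abstract concrete, below is a minimal Python sketch (not the authors' implementation) of one matrix-free LM step: it solves the damped normal equations (JᵀJ + λI)p = −Jᵀr with CG, touching the Jacobian only through black-box Jacobian-vector products, and uses uniform residual subsampling with inverse-probability weighting as a crude stand-in for the paper's importance sampling. The names lm_step, residual, jvp, vjp, and all parameter values are hypothetical; in 3DGS the products would come from the differentiable rasterizer.

import numpy as np

def lm_step(params, residual, jvp, vjp, lam=1e-3, cg_iters=20, sample_frac=1.0):
    # One damped Gauss-Newton (LM) step: solve (J^T J + lam I) p = -J^T r via CG,
    # accessing J only through jvp (v -> J v) and vjp (u -> J^T u).
    r = residual(params)
    if sample_frac < 1.0:
        # Uniform subsampling of residual terms, reweighted by 1/sample_frac;
        # a stand-in for the paper's importance sampling of the normal equation.
        mask = np.random.rand(r.size) < sample_frac
        S = lambda u: np.where(mask, u, 0.0) / sample_frac
    else:
        S = lambda u: u

    def matvec(v):
        # Matrix-free product (J^T S J + lam I) v; J is never materialized.
        return vjp(params, S(jvp(params, v))) + lam * v

    b = -vjp(params, S(r))            # right-hand side -J^T S r
    p = np.zeros_like(params)
    res = b - matvec(p)
    d = res.copy()
    rs = res @ res
    for _ in range(cg_iters):         # plain CG on the SPD damped system
        Ad = matvec(d)
        alpha = rs / (d @ Ad + 1e-30)
        p += alpha * d
        res -= alpha * Ad
        rs_new = res @ res
        if rs_new < 1e-20:
            break
        d = res + (rs_new / rs) * d
        rs = rs_new
    return params + p

# Toy usage: fit y = a x + b. Here jvp/vjp are written by hand because the
# Jacobian of this problem is a constant 50x2 matrix.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0
residual = lambda p: p[0] * x + p[1] - y
J = np.stack([x, np.ones_like(x)], axis=1)
jvp = lambda p, v: J @ v
vjp = lambda p, u: J.T @ u
theta = np.zeros(2)
for _ in range(5):
    theta = lm_step(theta, residual, jvp, vjp, lam=1e-2, sample_frac=0.5)
print(theta)  # converges toward [2.0, 1.0]

Note the design point the abstract emphasizes: because each Gaussian touches few pixels, J is sparse, so jvp/vjp products are cheap and the full JᵀJ never needs to be formed or stored.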

@article{pehlivan2025_2504.12905,
  title={Second-order Optimization of Gaussian Splats with Importance Sampling},
  author={Hamza Pehlivan and Andrea Boscolo Camiletto and Lin Geng Foo and Marc Habermann and Christian Theobalt},
  journal={arXiv preprint arXiv:2504.12905},
  year={2025}
}