
Motion Blender Gaussian Splatting for Dynamic Scene Reconstruction

12 March 2025
Xinyu Zhang
Haonan Chang
Yuhan Liu
Abdeslam Boularias
    3DGS
Abstract

Gaussian splatting has emerged as a powerful tool for high-fidelity reconstruction of dynamic scenes. However, existing methods primarily rely on implicit motion representations, such as encoding motions into neural networks or per-Gaussian parameters, which makes it difficult to further manipulate the reconstructed motions. This lack of explicit controllability limits existing methods to replaying recorded motions only, which hinders wider application in robotics. To address this, we propose Motion Blender Gaussian Splatting (MBGS), a novel framework that uses motion graphs as an explicit and sparse motion representation. The motion of a graph's links is propagated to individual Gaussians via dual quaternion skinning, with learnable weight painting functions that determine the influence of each link. The motion graphs and 3D Gaussians are jointly optimized from input videos via differentiable rendering. Experiments show that MBGS achieves state-of-the-art performance on the highly challenging iPhone dataset while being competitive on HyperNeRF. We demonstrate the application potential of our method in animating novel object poses, synthesizing real robot demonstrations, and predicting robot actions through visual planning. The source code, models, and video demonstrations can be found at this http URL.
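The core mechanism the abstract names — propagating the motion of the graph's links to individual Gaussians via dual quaternion skinning with per-Gaussian blend weights — can be illustrated with a minimal sketch. This is textbook dual quaternion skinning, not the authors' implementation; all function names below are hypothetical, and the blend weights stand in for the paper's learnable weight painting functions:

```python
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def pose_to_dq(rot_q, t):
    """Encode a link's rigid pose (rotation quaternion, translation)
    as a dual quaternion: real part = rot_q, dual part = 0.5 * t * rot_q."""
    t_q = np.array([0.0, t[0], t[1], t[2]])
    return rot_q, 0.5 * quat_mul(t_q, rot_q)

def dqs_blend(weights, dqs):
    """Blend link transforms: weighted sum of dual quaternions,
    normalized by the magnitude of the real part."""
    real = sum(w * dq[0] for w, dq in zip(weights, dqs))
    dual = sum(w * dq[1] for w, dq in zip(weights, dqs))
    n = np.linalg.norm(real)
    return real / n, dual / n

def dq_transform(dq, p):
    """Apply a unit dual quaternion to a 3D point."""
    real, dual = dq
    w, x, y, z = real
    # rotation matrix from the real part
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    # translation recovered as 2 * dual * conj(real)
    conj = real * np.array([1.0, -1.0, -1.0, -1.0])
    t = 2.0 * quat_mul(dual, conj)[1:]
    return R @ p + t

# A Gaussian halfway between a static link and a link translated by
# (2, 0, 0) moves by (1, 0, 0):
dq_static = pose_to_dq(np.array([1.0, 0, 0, 0]), np.array([0.0, 0, 0]))
dq_moved  = pose_to_dq(np.array([1.0, 0, 0, 0]), np.array([2.0, 0, 0]))
blended = dqs_blend([0.5, 0.5], [dq_static, dq_moved])
print(dq_transform(blended, np.array([0.0, 0, 0])))  # ≈ [1. 0. 0.]
```

Unlike linear blend skinning, which averages transformation matrices and can collapse geometry near joints, blending in dual quaternion space always yields a valid rigid transform, which is why it is the standard choice for skinning deformations of this kind.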

@article{zhang2025_2503.09040,
  title={Motion Blender Gaussian Splatting for Dynamic Scene Reconstruction},
  author={Xinyu Zhang and Haonan Chang and Yuhan Liu and Abdeslam Boularias},
  journal={arXiv preprint arXiv:2503.09040},
  year={2025}
}