2DGS-Avatar: Animatable High-fidelity Clothed Avatar via 2D Gaussian Splatting

4 March 2025
Qipeng Yan
Mingyang Sun
Lihua Zhang
Abstract

Real-time rendering of high-fidelity, animatable avatars from monocular video remains a challenging problem in computer vision and graphics. Over the past few years, Neural Radiance Fields (NeRF) have substantially improved rendering quality, but their runtime performance is poor due to the low efficiency of volumetric rendering. More recently, methods based on 3D Gaussian Splatting (3DGS) have shown great potential for fast training and real-time rendering, yet they still suffer from artifacts caused by inaccurate geometry. To address these problems, we propose 2DGS-Avatar, a novel approach based on 2D Gaussian Splatting (2DGS) for modeling animatable clothed avatars with high fidelity and fast training. Given a monocular RGB video as input, our method produces an avatar that can be driven by poses and rendered in real time. Compared to 3DGS-based methods, 2DGS-Avatar retains the advantages of fast training and rendering while capturing detailed, dynamic, and photo-realistic appearance. We conduct extensive experiments on popular datasets such as AvatarRex and THuman4.0, demonstrating strong performance in both qualitative and quantitative evaluations.
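The core idea, in brief: where 3DGS represents a scene with 3D Gaussian ellipsoids, 2DGS collapses each primitive to an oriented planar disk, which carries a well-defined surface normal and so can adhere to the body surface more faithfully. The following is a minimal illustrative sketch of such a primitive and of driving it with a pose via linear blend skinning; it is not the authors' implementation, and all names here (Splat2D, lbs_transform) and the skinning model itself are assumptions for exposition only.

import numpy as np

class Splat2D:
    """One planar (2D) Gaussian embedded in 3D: a center plus two scaled
    tangent axes spanning an oriented disk. Unlike a 3DGS ellipsoid, the
    disk has a well-defined surface normal (illustrative sketch, not the
    paper's code)."""

    def __init__(self, center, tangent_u, tangent_v, scale, opacity):
        self.center = np.asarray(center, float)        # (3,) position
        self.tangent_u = np.asarray(tangent_u, float)  # (3,) unit axis 1
        self.tangent_v = np.asarray(tangent_v, float)  # (3,) unit axis 2
        self.scale = np.asarray(scale, float)          # (2,) per-axis std
        self.opacity = float(opacity)

    def normal(self):
        # Surface normal of the disk: cross product of the tangent axes.
        n = np.cross(self.tangent_u, self.tangent_v)
        return n / np.linalg.norm(n)

    def density(self, x):
        # Gaussian falloff in the disk's local (u, v) frame. A real 2DGS
        # rasterizer first intersects each camera ray with the disk's
        # plane; here we only evaluate the in-plane falloff at a point.
        d = np.asarray(x, float) - self.center
        u = (d @ self.tangent_u) / self.scale[0]
        v = (d @ self.tangent_v) / self.scale[1]
        return self.opacity * np.exp(-0.5 * (u * u + v * v))

def lbs_transform(splat, bone_transforms, skin_weights):
    """Pose a canonical splat by linear blend skinning: blend the (J, 4, 4)
    bone transforms with per-splat weights, then apply the result to the
    center and (rotation only) to the tangent axes. This mirrors how
    Gaussian avatars are commonly animated; the paper's exact deformation
    model may differ."""
    T = np.tensordot(skin_weights, bone_transforms, axes=1)  # (4, 4)
    R, t = T[:3, :3], T[:3, 3]
    return Splat2D(
        R @ splat.center + t,
        R @ splat.tangent_u,
        R @ splat.tangent_v,
        splat.scale,
        splat.opacity,
    )

# Example: one splat skinned by two bones (identity and a small translation).
splat = Splat2D([0, 0, 0], [1, 0, 0], [0, 1, 0], [0.01, 0.01], 0.9)
bones = np.stack([np.eye(4), np.eye(4)])
bones[1, :3, 3] = [0.0, 0.1, 0.0]
posed = lbs_transform(splat, bones, np.array([0.5, 0.5]))
print(posed.center)  # -> [0.   0.05 0.  ]

Because the tangent axes rotate rigidly with the blended bone transform, the disk's normal stays consistent with the posed surface, which is one intuition for why a disk-based primitive produces fewer geometry artifacts under animation than a free 3D ellipsoid.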

View on arXiv: https://arxiv.org/abs/2503.02452
@article{yan2025_2503.02452,
  title={2DGS-Avatar: Animatable High-fidelity Clothed Avatar via 2D Gaussian Splatting},
  author={Qipeng Yan and Mingyang Sun and Lihua Zhang},
  journal={arXiv preprint arXiv:2503.02452},
  year={2025}
}