BG-Triangle: Bézier Gaussian Triangle for 3D Vectorization and Rendering

18 March 2025
Minye Wu, Haizhao Dai, Kaixin Yao, Tinne Tuytelaars, Jingyi Yu
Abstract

Differentiable rendering enables efficient optimization by allowing gradients to be computed through the rendering process, facilitating 3D reconstruction, inverse rendering, and neural scene representation learning. To ensure differentiability, existing solutions approximate or reformulate traditional rendering operations using smooth, probabilistic proxies such as volumes or Gaussian primitives. Consequently, they struggle to preserve sharp edges due to the lack of explicit boundary definitions. We present a novel hybrid representation, Bézier Gaussian Triangle (BG-Triangle), that combines Bézier triangle-based vector graphics primitives with Gaussian-based probabilistic models to maintain accurate shape modeling while performing resolution-independent differentiable rendering. We further introduce a robust and effective discontinuity-aware rendering technique to reduce uncertainties at object boundaries, and we employ an adaptive densification and pruning scheme for efficient training while reliably handling level-of-detail (LoD) variations. Experiments show that BG-Triangle achieves rendering quality comparable to 3D Gaussian Splatting (3DGS) with superior boundary preservation. More importantly, BG-Triangle uses far fewer primitives than its alternatives, showcasing the benefits of vectorized graphics primitives and the potential to bridge the gap between classic and emerging representations.
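The Bézier triangle underlying this representation has a standard closed form: a degree-n patch is defined by control points P_ijk with i + j + k = n and evaluated as a Bernstein-weighted blend over barycentric coordinates (u, v, w). The sketch below evaluates a cubic patch in NumPy; the function name and layout are illustrative assumptions, not the paper's implementation, which additionally attaches Gaussian-based probabilistic blending to these primitives.

import numpy as np
from math import factorial

def bezier_triangle_point(control_points, u, v, degree=3):
    """Evaluate a degree-n Bézier triangle at barycentric (u, v, w), w = 1 - u - v.

    control_points maps multi-indices (i, j, k) with i + j + k = degree
    to 3D control points; the surface point is their Bernstein-weighted sum.
    """
    w = 1.0 - u - v
    point = np.zeros(3)
    for i in range(degree + 1):
        for j in range(degree + 1 - i):
            k = degree - i - j
            # Bivariate Bernstein basis: n! / (i! j! k!) * u^i * v^j * w^k
            coeff = factorial(degree) / (factorial(i) * factorial(j) * factorial(k))
            point += coeff * (u ** i) * (v ** j) * (w ** k) * control_points[(i, j, k)]
    return point

# Toy example: a flat cubic patch with control points on the z = 0 plane.
ctrl = {(i, j, 3 - i - j): np.array([float(i), float(j), 0.0])
        for i in range(4) for j in range(4 - i)}
print(bezier_triangle_point(ctrl, u=0.2, v=0.3))  # a point inside the patch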

@article{wu2025_2503.13961,
  title={BG-Triangle: Bézier Gaussian Triangle for 3D Vectorization and Rendering},
  author={Minye Wu and Haizhao Dai and Kaixin Yao and Tinne Tuytelaars and Jingyi Yu},
  journal={arXiv preprint arXiv:2503.13961},
  year={2025}
}