Rendering Anywhere You See: Renderability Field-guided Gaussian Splatting

27 April 2025
Xiaofeng Jin, Yan Fang, Matteo Frosi, Jianfei Ge, Jiangjian Xiao, Matteo Matteucci
3DGS
Abstract

Scene view synthesis, which generates novel views from limited perspectives, is increasingly vital for applications like virtual reality, augmented reality, and robotics. Unlike object-based tasks, such as generating 360° views of a car, scene view synthesis handles entire environments, where non-uniform observations pose unique challenges for stable rendering quality. To address this issue, we propose a novel approach: renderability field-guided Gaussian splatting (RF-GS). This method quantifies input inhomogeneity through a renderability field, guiding pseudo-view sampling to enhance visual consistency. To ensure the quality of wide-baseline pseudo-views, we train an image restoration model to map point projections to visible-light styles. Additionally, our validated hybrid data optimization strategy effectively fuses information from pseudo-view angles and source-view textures. Comparative experiments on simulated and real-world data show that our method outperforms existing approaches in rendering stability.
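The abstract describes three stages: a renderability field that quantifies how unevenly the scene is observed by the source views, renderability-guided pseudo-view sampling, and a hybrid optimization that fuses restored pseudo-views with source-view textures. Below is a minimal Python sketch of the first two stages, assuming a coarse voxel grid and a Gaussian-falloff coverage proxy; every function name, parameter, and heuristic here is an illustrative assumption, not the authors' released implementation.

# Hypothetical sketch of renderability-field-guided pseudo-view sampling.
import numpy as np

def renderability_field(cam_positions, grid_res=16, extent=10.0, sigma=2.0):
    """Score each cell of a coarse 3D grid by aggregate proximity to the
    source cameras; low scores mark sparsely observed regions (a crude
    stand-in for the paper's renderability quantification)."""
    axes = np.linspace(-extent, extent, grid_res)
    gx, gy, gz = np.meshgrid(axes, axes, axes, indexing="ij")
    cells = np.stack([gx, gy, gz], axis=-1).reshape(-1, 3)
    # Squared distance from every grid cell to every camera center.
    d2 = ((cells[:, None, :] - cam_positions[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2)).sum(axis=1), cells

def sample_pseudo_views(scores, cells, n_views, rng):
    """Sample pseudo-view positions preferentially where renderability is
    low, so under-observed regions receive extra supervision."""
    weights = 1.0 / (scores + 1e-6)
    idx = rng.choice(len(cells), size=n_views, p=weights / weights.sum(),
                     replace=False)
    return cells[idx]

rng = np.random.default_rng(0)
cams = rng.uniform(-10.0, 10.0, size=(20, 3))      # source camera centers
scores, cells = renderability_field(cams)
pseudo = sample_pseudo_views(scores, cells, n_views=8, rng=rng)
print(pseudo.shape)                                # (8, 3)

In the full pipeline, these sampled pseudo-views would be rendered from point projections, mapped to a visible-light style by the trained image restoration model, and then fused with source-view textures during Gaussian splatting optimization.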

View on arXiv: https://arxiv.org/abs/2504.19261
@article{jin2025_2504.19261,
  title={Rendering Anywhere You See: Renderability Field-guided Gaussian Splatting},
  author={Xiaofeng Jin and Yan Fang and Matteo Frosi and Jianfei Ge and Jiangjian Xiao and Matteo Matteucci},
  journal={arXiv preprint arXiv:2504.19261},
  year={2025}
}