Direct Sparse Odometry with Continuous 3D Gaussian Maps for Indoor Environments

5 March 2025
Jie Deng
Fengtian Lang
Zikang Yuan
Xin Yang
Topics: 3DGS, 3DV
Abstract

Accurate localization is essential for robotics and augmented reality applications such as autonomous navigation. Vision-based methods that incorporate prior maps aim to combine LiDAR-level accuracy with the cost efficiency of cameras for robust pose estimation. Existing approaches, however, often rely on unreliable interpolation when associating discrete point cloud maps with dense image pixels, which inevitably introduces depth errors and degrades pose estimation accuracy. We propose a monocular visual odometry framework built on a continuous 3D Gaussian map, which directly assigns geometrically consistent depth values to all extracted high-gradient points without interpolation. Evaluations on two public datasets demonstrate superior tracking accuracy compared to existing methods. We have released the source code of this work to support further development by the community.
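To give a rough intuition for how a continuous 3D Gaussian map can supply per-pixel depth without interpolating a discrete point cloud, here is a minimal sketch (not the authors' code) that alpha-composites the depths of 3D Gaussians along a camera ray, in the spirit of 3DGS-style depth rendering. The map representation as `(mean, cov_inv, opacity)` tuples and all function names are assumptions for illustration.

```python
import numpy as np

def gaussian_weight(mean, cov_inv, point):
    """Unnormalized density of a 3D Gaussian evaluated at `point`."""
    d = point - mean
    return float(np.exp(-0.5 * d @ cov_inv @ d))

def ray_depth(origin, direction, gaussians):
    """Alpha-composite a depth value along a ray through a 3D Gaussian map.

    `gaussians` is a list of (mean, cov_inv, opacity) tuples — an assumed,
    simplified map representation. Returns the composited depth, or None
    if the ray intersects nothing.
    """
    direction = direction / np.linalg.norm(direction)
    samples = []
    for mean, cov_inv, opacity in gaussians:
        t = float((mean - origin) @ direction)   # depth of the mean along the ray
        if t <= 0.0:
            continue                             # behind the camera
        closest = origin + t * direction         # nearest point on the ray
        alpha = opacity * gaussian_weight(mean, cov_inv, closest)
        samples.append((t, min(alpha, 0.999)))
    samples.sort()                               # front-to-back compositing order
    depth, transmittance = 0.0, 1.0
    for t, alpha in samples:
        depth += transmittance * alpha * t
        transmittance *= 1.0 - alpha
    return depth if transmittance < 1.0 else None
```

Because the Gaussians define a continuous density, every high-gradient pixel whose ray passes near the map receives a geometrically consistent depth directly, with no nearest-neighbor or bilinear interpolation over sparse points.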

@article{deng2025_2503.03373,
  title={Direct Sparse Odometry with Continuous 3D Gaussian Maps for Indoor Environments},
  author={Jie Deng and Fengtian Lang and Zikang Yuan and Xin Yang},
  journal={arXiv preprint arXiv:2503.03373},
  year={2025}
}