ResearchTrend.AI
LiDAR Remote Sensing Meets Weak Supervision: Concepts, Methods, and Perspectives

24 March 2025
Yuan Gao
Shaobo Xia
Pu Wang
Xiaohuan Xi
Sheng Nie
Cheng Wang
Abstract

LiDAR (Light Detection and Ranging) enables rapid and accurate acquisition of three-dimensional spatial data and is widely applied in remote sensing areas such as surface mapping, environmental monitoring, urban modeling, and forestry inventory. LiDAR remote sensing primarily comprises data interpretation and LiDAR-based inversion. However, LiDAR interpretation typically relies on dense, precise annotations, which are costly and time-consuming to obtain. Similarly, LiDAR inversion depends on scarce supervisory signals whose annotation requires expensive field surveys. To address these challenges, weakly supervised learning has gained significant attention in recent years, and many methods have emerged to tackle LiDAR remote sensing tasks using incomplete, inexact, and inaccurate annotations, as well as annotations transferred from other domains. Existing review articles treat LiDAR interpretation and inversion as separate tasks. This review, for the first time, adopts a unified weakly supervised learning perspective to systematically examine research on both LiDAR interpretation and LiDAR inversion. We summarize the latest advancements, provide a comprehensive review of the development and application of weakly supervised techniques in LiDAR remote sensing, and discuss potential future research directions in this field.
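To make one of the weak-supervision settings named above concrete: under incomplete annotation, only a small fraction of points in a LiDAR scene carry labels, and a common baseline is to compute the classification loss over labeled points only. The sketch below is illustrative, not a method from the paper; the function name and toy data are hypothetical.

```python
import numpy as np

def partial_cross_entropy(logits, labels, ignore_index=-1):
    """Cross-entropy averaged over labeled points only.

    Under incomplete supervision, most points carry no label
    (marked ignore_index); the loss simply masks them out.
    """
    mask = labels != ignore_index
    if not mask.any():
        return 0.0
    # numerically stable log-softmax over the class axis
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # pick the log-probability of each labeled point's true class
    picked = log_probs[mask, labels[mask]]
    return float(-picked.mean())

# Toy scene: 5 points, 3 classes; only 2 points are labeled.
logits = np.array([[2.0, 0.1, 0.1],
                   [0.1, 2.0, 0.1],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])
labels = np.array([0, 1, -1, -1, -1])
loss = partial_cross_entropy(logits, labels)
```

The unlabeled points contribute nothing to the gradient here; weakly supervised methods surveyed in the paper go further by propagating or refining such sparse signals.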

@article{gao2025_2503.18384,
  title={LiDAR Remote Sensing Meets Weak Supervision: Concepts, Methods, and Perspectives},
  author={Yuan Gao and Shaobo Xia and Pu Wang and Xiaohuan Xi and Sheng Nie and Cheng Wang},
  journal={arXiv preprint arXiv:2503.18384},
  year={2025}
}