
Fast LiDAR Informed Visual Search in Unseen Indoor Environments

25 September 2023
Ryan Gupta
Kyle Morgenstein
Steve Ortega
Luis Sentis
arXiv:2309.14150
Abstract

This paper explores the problem of planning for visual search without prior map information. We address the pixel-wise environment perception problem, in which one is given wide field-of-view 2D scan data and must perform LiDAR segmentation to contextually label points in the surroundings. These pixel classifications provide an informed prior on which to plan next-best viewpoints during visual search tasks. We present LIVES: LiDAR Informed Visual Search, a method aimed at finding objects of interest in unknown indoor environments. A robust map-free classifier is trained from expert data collected using a simple cart platform equipped with a map-based classifier. An autonomous exploration planner takes the contextual data from scans and uses this prior to plan viewpoints more likely to yield detection of the search target. We propose a utility function that accounts for traditional metrics such as information gain and path cost as well as the contextual information. LIVES is baselined against several existing exploration methods in simulation to verify its performance, and it is validated in real-world experiments with single and multiple search objects on a Spot robot in two unseen environments. Videos of experiments, implementation details, and open-source code can be found at https://sites.google.com/view/lives-2024/home.
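The abstract names the ingredients of the proposed utility function (information gain, path cost, and the contextual prior from scan classification) but not its exact form. A minimal sketch, assuming a simple weighted linear combination; the function names, weights, and linear form below are illustrative assumptions, not the paper's implementation:

```python
# Hypothetical sketch of a next-best-viewpoint utility in the spirit of
# LIVES: traditional exploration terms (information gain, path cost) plus
# a contextual term from the pixel-wise LiDAR classification prior.
# Function names, weights, and the linear form are illustrative assumptions.

def viewpoint_utility(info_gain, path_cost, context_prior,
                      w_info=1.0, w_cost=0.5, w_context=2.0):
    """Score a candidate viewpoint.

    info_gain     -- expected newly observed area from this viewpoint
    path_cost     -- travel cost from the robot's current pose
    context_prior -- probability, from the scan classifier, that the
                     viewpoint covers regions likely to hold the target
    """
    return w_info * info_gain - w_cost * path_cost + w_context * context_prior


def select_next_viewpoint(candidates):
    """Return the index of the highest-utility candidate.

    candidates -- list of (info_gain, path_cost, context_prior) tuples
    """
    scores = [viewpoint_utility(g, c, p) for g, c, p in candidates]
    return scores.index(max(scores))


# Toy example: the second viewpoint sees slightly less new area but lies
# in a contextually promising region, so it wins under these weights.
candidates = [(4.0, 2.0, 0.1), (3.0, 1.0, 0.8), (5.0, 4.0, 0.2)]
print(select_next_viewpoint(candidates))  # -> 1
```

In the actual method the contextual term would presumably be derived from the map-free classifier's per-point labels aggregated over each candidate's visible region; the fixed weights here are placeholders that would need tuning.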
