
A Mobile Manipulation System for One-Shot Teaching of Complex Tasks in Homes

IEEE International Conference on Robotics and Automation (ICRA), 2019
30 September 2019
M. Bajracharya, James Borders, D. Helmick, Thomas Kollar, Michael Laskey, John Leichty, Jeremy Ma, Umashankar Nagarajan, A. Ochiai, Josh Petersen, K. Shankar, Kevin Stone, Yutaka Takaoka
Abstract

We describe a mobile manipulation hardware and software system capable of autonomously performing complex human-level tasks in real homes, after being taught the task with a single demonstration from a person in virtual reality. This is enabled by a highly capable mobile manipulation robot, whole-body task space hybrid position/force control, teaching of parameterized primitives linked to a robust learned dense visual embeddings representation of the scene, and a task graph of the taught behaviors. We demonstrate the robustness of the approach by presenting results for performing a variety of tasks, under different environmental conditions, in multiple real homes. Our approach achieves 85% overall success rate on three tasks that consist of an average of 45 behaviors each.
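The abstract describes chaining taught, parameterized primitives into a "task graph of the taught behaviors". As a rough illustration of that idea only (the paper's actual data structures and primitive names are not given here, so everything below is hypothetical), a minimal sketch might represent each behavior as a node holding its demonstration-derived parameters, with edges to successor behaviors:

```python
# Hypothetical sketch of a task graph of taught behaviors.
# Node names and parameters are illustrative, not from the paper.

class Behavior:
    def __init__(self, name, params=None):
        self.name = name            # e.g. "approach", "grasp"
        self.params = params or {}  # primitive parameters taken from the demo
        self.successors = []        # behaviors that may run after this one

def execute_task(start):
    """Walk the graph from the start behavior, following the first
    successor at each node; return the sequence of behavior names."""
    order, node = [], start
    while node is not None:
        order.append(node.name)
        node = node.successors[0] if node.successors else None
    return order

# Toy three-behavior chain, as if taught from one VR demonstration.
approach = Behavior("approach", {"target": "fridge"})
grasp = Behavior("grasp", {"handle_pose": (0.4, 0.1, 1.2)})
pull = Behavior("pull", {"force_limit_n": 20.0})
approach.successors.append(grasp)
grasp.successors.append(pull)

print(execute_task(approach))  # → ['approach', 'grasp', 'pull']
```

In the real system each node would invoke a whole-body controller with its parameters and branch on success or failure; this sketch only shows the graph-of-behaviors structure.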
