
MRHaD: Mixed Reality-based Hand-Drawn Map Editing Interface for Mobile Robot Navigation

Abstract

Mobile robot navigation systems are increasingly relied upon in dynamic and complex environments, yet they often struggle with map inaccuracies and the resulting inefficient path planning. This paper presents MRHaD, a Mixed Reality-based Hand-drawn Map Editing Interface that enables intuitive, real-time map modification through natural hand gestures. By integrating an MR head-mounted display with the robotic navigation system, operators can directly draw hand-drawn restricted zones (HRZs) onto the map, thereby bridging the gap between 2D map representations and the real-world environment. Comparative experiments against conventional 2D editing methods demonstrate that MRHaD significantly improves editing efficiency, map accuracy, and overall usability, contributing to safer and more efficient mobile robot operation. The proposed approach provides a robust technical foundation for advancing human-robot collaboration and for establishing interaction models suited to a hybrid future of robotics and human society. For additional material, please check: this https URL
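To make the map-editing idea concrete, the sketch below shows how a hand-drawn restricted zone might be applied to a 2D occupancy grid once the drawn polygon has been projected into map coordinates. This is an illustrative approximation only, not the paper's implementation: the function names (`mark_restricted_zone`, `point_in_polygon`), the grid resolution, and the ROS-style cell values (0 = free, 100 = occupied) are all assumptions for the example.

```python
def point_in_polygon(x, y, poly):
    """Ray-casting test: is point (x, y) inside polygon [(px, py), ...]?"""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending right from (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def mark_restricted_zone(grid, polygon, resolution=0.05):
    """Mark every cell whose centre (in metres) falls inside `polygon` as occupied (100)."""
    out = [row[:] for row in grid]
    for r, row in enumerate(grid):
        for c, _ in enumerate(row):
            wx = (c + 0.5) * resolution  # cell centre, world x
            wy = (r + 0.5) * resolution  # cell centre, world y
            if point_in_polygon(wx, wy, polygon):
                out[r][c] = 100
    return out

# A 2 m x 2 m map at 5 cm/cell, with a 1 m x 1 m hand-drawn square zone.
grid = [[0] * 40 for _ in range(40)]
hrz = [(0.5, 0.5), (1.5, 0.5), (1.5, 1.5), (0.5, 1.5)]
updated = mark_restricted_zone(grid, hrz)
print(sum(row.count(100) for row in updated))  # -> 400 blocked cells
```

In a real system, the updated grid would be republished to the planner so that subsequent paths route around the drawn zone.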

@article{taki2025_2504.00580,
  title={MRHaD: Mixed Reality-based Hand-Drawn Map Editing Interface for Mobile Robot Navigation},
  author={Takumi Taki and Masato Kobayashi and Eduardo Iglesius and Naoya Chiba and Shizuka Shirai and Yuki Uranishi},
  journal={arXiv preprint arXiv:2504.00580},
  year={2025}
}