
E2Map: Experience-and-Emotion Map for Self-Reflective Robot Navigation with Language Models

IEEE International Conference on Robotics and Automation (ICRA), 2024
Main: 6 pages · Bibliography: 1 page · Appendix: 12 pages · 28 figures
Abstract

Large language models (LLMs) have shown significant potential in guiding embodied agents to execute language instructions across a range of tasks, including robotic manipulation and navigation. However, existing methods are primarily designed for static environments and do not leverage the agent's own experiences to refine its initial plans. Because real-world environments are inherently stochastic, initial plans based solely on the general knowledge of LLMs may fail to achieve their objectives, unlike in static settings. To address this limitation, this study introduces the Experience-and-Emotion Map (E2Map), which integrates not only LLM knowledge but also the agent's real-world experiences, drawing inspiration from human emotional responses. The proposed methodology enables one-shot behavior adjustments by updating the E2Map based on the agent's experiences. Our evaluation in stochastic navigation environments, including both simulations and real-world scenarios, demonstrates that the proposed method significantly outperforms existing LLM-based approaches. Code and supplementary materials are available at this https URL.
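To make the one-shot adjustment idea concrete, the sketch below illustrates one plausible reading of an experience-driven cost map: a 2D grid of "emotion" values that is bumped around the location of a negative experience (e.g., a near-collision) and then added to the planner's cost. This is not the authors' released code; the grid representation, Gaussian update, kernel width, and update magnitude are all hypothetical choices for demonstration.

```python
# Minimal sketch, assuming E2Map behaves like a 2D grid of scalar emotion costs
# that are raised around cells where the agent has a bad experience, so later
# planning avoids those regions. All names and parameters here are illustrative.
import numpy as np

class EmotionMap:
    def __init__(self, height=100, width=100, sigma=3.0):
        self.values = np.zeros((height, width))  # per-cell emotion cost
        self.sigma = sigma                        # spatial spread of each update

    def update_from_experience(self, row, col, intensity=1.0):
        """One-shot update: raise cost around the cell where a bad event occurred."""
        rows, cols = np.indices(self.values.shape)
        dist_sq = (rows - row) ** 2 + (cols - col) ** 2
        self.values += intensity * np.exp(-dist_sq / (2.0 * self.sigma ** 2))

    def planning_cost(self, base_cost):
        """Combine a geometric/base cost map with the accumulated emotion cost."""
        return base_cost + self.values

# Usage: after a near-collision at grid cell (40, 55), a single update makes the
# planner immediately see higher traversal cost around that cell.
e2map = EmotionMap()
e2map.update_from_experience(40, 55, intensity=2.0)
combined = e2map.planning_cost(np.zeros((100, 100)))
print(combined[40, 55], combined[0, 0])  # high near the event, near zero far away
```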
