
GSON: A Group-based Social Navigation Framework with Large Multimodal Model

Shangyi Luo
Ji Zhu
Peng Sun
Yuhong Deng
Cunjun Yu
Anxing Xiao
Xueqian Wang
Abstract

With the increasing presence of service robots and autonomous vehicles in human environments, navigation systems need to evolve beyond simply reaching a destination to incorporate social awareness. This paper introduces GSON, a novel group-based social navigation framework that leverages Large Multimodal Models (LMMs) to enhance robots' social perception capabilities. Our approach uses visual prompting to enable zero-shot extraction of social relationships among pedestrians and integrates these results with robust pedestrian detection and tracking pipelines to overcome the inherent inference speed limitations of LMMs. The planning system incorporates a mid-level planner that sits between global path planning and local motion planning, preserving both global context and reactive responsiveness while avoiding disruption of the predicted social groups. We validate GSON through extensive real-world mobile robot navigation experiments involving complex social scenarios such as queuing, conversations, and photo sessions. Comparative results show that our system significantly outperforms existing navigation approaches in minimizing social perturbations while maintaining comparable performance on traditional navigation metrics.
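To make the group-aware mid-level planning idea concrete, below is a minimal illustrative sketch, not the authors' implementation. It assumes that group labels for tracked pedestrians arrive from an upstream LMM/tracking pipeline, approximates each social group by a padded bounding circle, and scores candidate mid-level waypoints so that paths cutting through a group are penalized. The circular group region, the penalty weight, and all function names are assumptions made for illustration only.

# Hypothetical sketch of group-aware mid-level waypoint selection (Python).
# Assumes group_id labels are provided upstream (e.g., by LMM visual prompting
# over tracked pedestrians); geometry and costs here are illustrative.

import math
from dataclasses import dataclass
from itertools import groupby

@dataclass
class Pedestrian:
    track_id: int
    x: float
    y: float
    group_id: int  # zero-shot group label, assumed to come from the LMM stage

def group_regions(pedestrians, padding=0.7):
    """Approximate each social group by its centroid and a padded radius."""
    regions = []
    by_group = sorted(pedestrians, key=lambda p: p.group_id)
    for gid, members in groupby(by_group, key=lambda p: p.group_id):
        members = list(members)
        cx = sum(p.x for p in members) / len(members)
        cy = sum(p.y for p in members) / len(members)
        radius = max(math.hypot(p.x - cx, p.y - cy) for p in members) + padding
        regions.append((cx, cy, radius))
    return regions

def segment_hits_circle(p0, p1, center, radius):
    """True if the straight segment p0->p1 passes through the circle."""
    (x0, y0), (x1, y1), (cx, cy) = p0, p1, center
    dx, dy = x1 - x0, y1 - y0
    seg_len2 = dx * dx + dy * dy or 1e-9
    t = max(0.0, min(1.0, ((cx - x0) * dx + (cy - y0) * dy) / seg_len2))
    return math.hypot(x0 + t * dx - cx, y0 + t * dy - cy) < radius

def waypoint_cost(start, waypoint, goal, regions, group_penalty=100.0):
    """Path length plus a penalty for every group region the path crosses."""
    cost = math.dist(start, waypoint) + math.dist(waypoint, goal)
    for cx, cy, r in regions:
        for a, b in ((start, waypoint), (waypoint, goal)):
            if segment_hits_circle(a, b, (cx, cy), r):
                cost += group_penalty
    return cost

if __name__ == "__main__":
    # Two pedestrians in conversation (group 0) block the direct route.
    peds = [Pedestrian(1, 2.0, 0.2, 0), Pedestrian(2, 2.0, -0.2, 0)]
    regions = group_regions(peds)
    start, goal = (0.0, 0.0), (4.0, 0.0)
    candidates = [(2.0, 0.0), (2.0, 1.5), (2.0, -1.5)]
    best = min(candidates, key=lambda w: waypoint_cost(start, w, goal, regions))
    print("selected mid-level waypoint:", best)  # detours around the group

In this toy setup the straight-through waypoint is heavily penalized because both path segments cross the group's padded region, so the planner selects a detour waypoint; a local motion planner would then track the chosen waypoint reactively.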

@article{luo2025_2409.18084,
  title={GSON: A Group-based Social Navigation Framework with Large Multimodal Model},
  author={Shangyi Luo and Ji Zhu and Peng Sun and Yuhong Deng and Cunjun Yu and Anxing Xiao and Xueqian Wang},
  journal={arXiv preprint arXiv:2409.18084},
  year={2025}
}