[2511.21135] SocialNav: Training Human-Inspired Foundation Model for Socially-Aware Embodied Navigation
Computer Science > Robotics

arXiv:2511.21135 (cs)

[Submitted on 26 Nov 2025 (v1), last revised 27 Feb 2026 (this version, v2)]

Title: SocialNav: Training Human-Inspired Foundation Model for Socially-Aware Embodied Navigation

Authors: Ziyi Chen, Yingnan Guo, Zedong Chu, Minghua Luo, Yanfen Shen, Mingchao Sun, Junjun Hu, Shichao Xie, Kuan Yang, Pei Shi, Zhining Gu, Lu Liu, Honglin Han, Xiaolong Wu, Mu Xu, Yu Zhang, Ning Guo

Abstract: Embodied navigation that adheres to social norms remains an open research challenge. SocialNav is a foundation model for socially-aware navigation with a hierarchical "brain-action" architecture, capable of understanding high-level social norms and generating low-level, socially compliant trajectories. To enable these dual capabilities, we construct the SocNav Dataset, a large-scale collection of 7 million samples, comprising (1) a Cognitive Activation Dataset providing social reasoning signals such as chain-of-thought explanations and social traversability prediction, and (2) an Expert Trajectories Pyramid aggregating diverse navigation demonstrations from internet videos, simulated environments, and real-world robots. A multi-stage training pipeline is proposed to gradually inject and refine navigation intelligence: we first inject general navigation skills a...
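The hierarchical "brain-action" split described in the abstract can be pictured as two cooperating modules: a high-level reasoning component that produces social-norm signals (such as a chain-of-thought rationale and a traversability judgment), and a low-level component that turns those signals into a trajectory. The sketch below is a hypothetical Python interface under that reading; all class and method names (`Brain`, `ActionHead`, `SocialContext`, `navigate`) are our own illustration, not the paper's API, and the toy reasoning rule stands in for what would be a learned model.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SocialContext:
    """Signals the high-level module passes down (hypothetical schema)."""
    rationale: str      # chain-of-thought style explanation
    traversable: bool   # social traversability prediction

class Brain:
    """High-level module: reasons about social norms (stand-in for a learned model)."""
    def reason(self, observation: str) -> SocialContext:
        # Toy rule: treat a crowded scene as socially non-traversable.
        crowded = "crowd" in observation
        return SocialContext(
            rationale="yield to pedestrians" if crowded else "path is clear",
            traversable=not crowded,
        )

class ActionHead:
    """Low-level module: emits waypoints conditioned on the brain's output."""
    def plan(self, ctx: SocialContext) -> List[Tuple[float, float]]:
        if not ctx.traversable:
            return [(0.0, 0.0)]              # hold position: zero displacement
        return [(0.5, 0.0), (1.0, 0.0)]      # straight-line (x, y) waypoints

def navigate(observation: str) -> List[Tuple[float, float]]:
    """End-to-end call: high-level reasoning gates low-level trajectory generation."""
    ctx = Brain().reason(observation)
    return ActionHead().plan(ctx)
```

The point of the hierarchy is that the trajectory generator never sees raw social reasoning; it only consumes the compact structured signal, which is also where the paper's cognitive-activation supervision (explanations, traversability labels) would attach during training.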