[2605.05017] Position: Embodied AI Requires a Privacy-Utility Trade-off
Computer Science > Artificial Intelligence
arXiv:2605.05017 (cs)
[Submitted on 6 May 2026]

Title: Position: Embodied AI Requires a Privacy-Utility Trade-off
Authors: Xiaoliang Fan, Jiarui Chen, Zhuodong Liu, Ziqi Yang, Peixuan Xu, Ruimin Shen, Junhui Liu, Jianzhong Qi, Cheng Wang

Abstract: Embodied AI (EAI) systems are rapidly transitioning from simulation into real-world domestic and other sensitive environments. However, recent EAI solutions have largely demonstrated advances within isolated stages such as instruction, perception, planning, and interaction, without considering their coupled privacy implications in high-frequency deployments, where privacy leakage is often irreversible. This position paper argues that optimizing these components independently creates a systemic privacy crisis when they are deployed in sensitive settings, and therefore advances the position that privacy in EAI is a life-cycle-level architectural constraint rather than a stage-local feature. To address these challenges, we propose Secure Privacy Integration in Next-generation Embodied AI (SPINE), a unified privacy-aware framework that treats privacy as a dynamic control signal governing cross-stage coupling throughout the entire EAI life cycle. SPINE decomposes the EAI pipeline into various stages and establishes a multi-criterion privacy classificat...