[2509.24250] Interactive Program Synthesis for Modeling Collaborative Physical Activities from Narrated Demonstrations
Computer Science > Artificial Intelligence
arXiv:2509.24250 (cs)
[Submitted on 29 Sep 2025 (v1), last revised 10 Apr 2026 (this version, v3)]

Title: Interactive Program Synthesis for Modeling Collaborative Physical Activities from Narrated Demonstrations
Authors: Edward Kim, Daniel He, Jorge Chao, Wiktor Rajca, Mohammed Amin, Nishant Malpani, Ruta Desai, Antti Oulasvirta, Bjoern Hartmann, Sanjit Seshia

Abstract: Teaching systems physical tasks is a long-standing goal in HCI, yet most prior work has focused on non-collaborative physical activities. Collaborative tasks introduce added complexity, requiring systems to infer users' assumptions about their teammates' intent, an inherently ambiguous and dynamic process. This necessitates representations that are interpretable and correctable, enabling users to inspect and refine system behavior. We address this challenge by framing collaborative task learning as a program synthesis problem. Our system represents behavior as editable programs and uses narrated demonstrations, i.e., paired physical actions and natural language, as a unified modality for teaching, inspecting, and correcting system logic without requiring users to see or write code. The same modality is used by the system to communicate its learning to users. In a ...