[2603.24806] FODMP: Fast One-Step Diffusion of Movement Primitives Generation for Time-Dependent Robot Actions
Computer Science > Robotics
arXiv:2603.24806 (cs)
[Submitted on 25 Mar 2026]

Title: FODMP: Fast One-Step Diffusion of Movement Primitives Generation for Time-Dependent Robot Actions
Authors: Xirui Shi, Arya Ebrahimi, Yi Hu, Jun Jin

Abstract: Diffusion models are increasingly used for robot learning, but current designs face a clear trade-off. Action-chunking diffusion policies such as ManiCM are fast at inference, yet they predict only short segments of motion. This makes them reactive but unable to capture time-dependent motion primitives, such as spring-damper-like behaviors with built-in dynamic profiles of acceleration and deceleration. Movement Primitive Diffusion (MPD) partially addresses this limitation by parameterizing full trajectories with Probabilistic Dynamic Movement Primitives (ProDMPs), enabling the generation of temporally structured motions. However, MPD embeds the motion decoder directly in a multi-step diffusion process, resulting in prohibitively high inference latency that limits its applicability to real-time control. We propose FODMP (Fast One-step Diffusion of Movement Primitives), a new framework that distills diffusion models into the ProDMP trajectory parameter space and generates motion with a single-step decoder. F...
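The pipeline the abstract describes, mapping one noise draw to movement-primitive weights in a single network pass and decoding them through a fixed basis into a full trajectory, can be illustrated with a minimal sketch. This is not the authors' code: the decoder below uses simple normalized radial-basis functions in place of the full ProDMP basis (which additionally integrates spring-damper dynamics and enforces initial position and velocity), and all class names, dimensions, and network sizes are hypothetical.

```python
import torch
import torch.nn as nn


class RBFTrajectoryDecoder(nn.Module):
    """Decode primitive weights into a full trajectory via a fixed basis.

    Simplification: normalized RBFs over a [0, 1] phase variable stand in
    for the ProDMP basis, which also integrates spring-damper dynamics.
    """

    def __init__(self, num_basis: int = 10, num_timesteps: int = 100):
        super().__init__()
        t = torch.linspace(0.0, 1.0, num_timesteps)      # phase values
        centers = torch.linspace(0.0, 1.0, num_basis)    # RBF centers
        width = 0.5 * num_basis ** 2                     # shared RBF width
        phi = torch.exp(-width * (t[:, None] - centers[None, :]) ** 2)
        phi = phi / phi.sum(dim=1, keepdim=True)         # normalize per step
        self.register_buffer("phi", phi)                 # (T, num_basis)

    def forward(self, weights: torch.Tensor) -> torch.Tensor:
        # weights: (batch, dof, num_basis) -> trajectory: (batch, T, dof)
        return torch.einsum("tk,bdk->btd", self.phi, weights)


class OneStepGenerator(nn.Module):
    """Distilled generator: one forward pass from (obs, noise) to weights."""

    def __init__(self, obs_dim: int, dof: int, num_basis: int, hidden: int = 256):
        super().__init__()
        self.dof, self.num_basis = dof, num_basis
        self.net = nn.Sequential(
            nn.Linear(obs_dim + dof * num_basis, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, dof * num_basis),
        )

    def forward(self, obs: torch.Tensor, noise: torch.Tensor) -> torch.Tensor:
        w = self.net(torch.cat([obs, noise.flatten(1)], dim=-1))
        return w.view(-1, self.dof, self.num_basis)


# Hypothetical dimensions: 32-d observation, 7-DoF arm, 10 basis functions.
obs_dim, dof, num_basis, horizon = 32, 7, 10, 100
generator = OneStepGenerator(obs_dim, dof, num_basis)
decoder = RBFTrajectoryDecoder(num_basis, horizon)

obs = torch.randn(4, obs_dim)              # batch of observations
noise = torch.randn(4, dof, num_basis)     # single noise draw, no denoising loop
traj = decoder(generator(obs, noise))      # (4, 100, 7): whole trajectory at once
```

The property the abstract emphasizes is that `traj` comes from exactly one generator forward pass, in contrast to the multi-step denoising loop that makes MPD-style inference too slow for real-time control.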