[2503.14637] KINESIS: Motion Imitation for Human Musculoskeletal Locomotion


arXiv - Machine Learning

Summary

KINESIS is a model-free framework for motion imitation in human musculoskeletal locomotion. Trained with reinforcement learning on real locomotion data, it achieves strong imitation performance on unseen trajectories and transfers to several downstream control tasks.

Why It Matters

This research addresses the limitations of torque-controlled humanoid models, which fail to capture key aspects of human motor control, by introducing KINESIS, a framework that learns directly from real locomotion data. Its implications extend to robotics, rehabilitation, and AI-driven motion control, marking a significant step toward understanding and replicating human biomechanics.

Key Takeaways

  • KINESIS effectively imitates human locomotion using a model-free approach.
  • The framework learns robust locomotion priors from 1.8 hours of data.
  • It demonstrates physiological plausibility by correlating muscle activity with human EMG patterns.
  • KINESIS can be applied to various tasks, including text-to-control and sports simulations.
  • The model scales across different biomechanical complexities, controlling up to 290 muscles.
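The takeaways above center on imitating reference motion with reinforcement learning. As a minimal illustrative sketch (not the paper's actual reward, whose details are not given in this summary), motion-imitation policies are commonly trained with a reward that exponentiates pose- and velocity-tracking errors against a reference motion-capture frame; all function and parameter names below are hypothetical:

```python
import numpy as np

def imitation_reward(sim_pose, ref_pose, sim_vel, ref_vel,
                     w_pose=0.6, w_vel=0.4, k_pose=2.0, k_vel=0.1):
    """Exponentiated tracking error, a common motion-imitation reward shape.

    sim_*/ref_* are flat arrays of joint positions and velocities for the
    simulated character and the reference motion-capture frame.
    """
    pose_err = np.sum((sim_pose - ref_pose) ** 2)
    vel_err = np.sum((sim_vel - ref_vel) ** 2)
    # Each term peaks at 1.0 when the simulated state matches the reference,
    # so a perfect match yields w_pose + w_vel.
    return w_pose * np.exp(-k_pose * pose_err) + w_vel * np.exp(-k_vel * vel_err)
```

The exponential shape keeps the reward bounded and smooth, which tends to stabilize RL training compared with raw negative squared error.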

Computer Science > Robotics — arXiv:2503.14637 (cs)
[Submitted on 18 Mar 2025 (v1), last revised 23 Feb 2026 (this version, v2)]

Title: KINESIS: Motion Imitation for Human Musculoskeletal Locomotion
Authors: Merkourios Simos, Alberto Silvio Chiappa, Alexander Mathis

Abstract: How do humans move? Advances in reinforcement learning (RL) have produced impressive results in capturing human motion using physics-based humanoid control. However, torque-controlled humanoids fail to model key aspects of human motor control, such as biomechanical joint constraints and non-linear, overactuated musculotendon control. We present KINESIS, a model-free motion imitation framework that tackles these challenges. KINESIS is trained on 1.8 hours of locomotion data and achieves strong motion imitation performance on unseen trajectories. Through a negative mining approach, KINESIS learns robust locomotion priors that we leverage to deploy the policy on several downstream tasks such as text-to-control, target point reaching, and football penalty kicks. Importantly, KINESIS learns to generate muscle activity patterns that correlate well with human EMG activity. We show that these results scale seamlessly across biomechanical model complexity, demonstrating control of up to 290 muscles. Overall, the physiological...
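The abstract's "negative mining approach" is not specified further in this summary, but the general idea behind such curricula can be sketched as biasing training toward reference clips the policy currently fails on. The following is an illustrative sketch only, with hypothetical names and parameters, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_clip(failure_rates, eps=0.05):
    """Pick a reference-clip index, biased toward clips the policy failed on.

    failure_rates: per-clip failure statistics tracked during training.
    eps keeps every clip reachable even after its failure rate drops to zero.
    """
    weights = np.asarray(failure_rates, dtype=float) + eps
    probs = weights / weights.sum()
    return rng.choice(len(probs), p=probs)
```

In practice the failure statistics would be updated after each rollout, so the sampler continually focuses training on the hardest motions without starving the rest of the dataset.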
