[2602.12407] MiDAS: A Multimodal Data Acquisition System and Dataset for Robot-Assisted Minimally Invasive Surgery

arXiv - Machine Learning · 3 min read

Summary

The paper presents MiDAS, an open-source multimodal data acquisition system for robot-assisted minimally invasive surgery, enabling synchronized data collection across various platforms.

Why It Matters

MiDAS addresses the challenge of accessing proprietary robot telemetry in surgical research, facilitating reproducible data collection and advancing the field of robot-assisted surgery. By providing an open-source solution, it promotes collaboration and innovation in surgical robotics.

Key Takeaways

  • MiDAS allows for non-invasive data acquisition across surgical robots.
  • The system integrates multiple data sources, including hand tracking and video capture.
  • Validation studies show MiDAS can approximate internal robot kinematics effectively.
  • The dataset includes the first multimodal recordings of hernia repair suturing.
  • MiDAS enhances reproducibility in surgical research and training.

Computer Science > Robotics · arXiv:2602.12407 (cs) · [Submitted on 12 Feb 2026]

Title: MiDAS: A Multimodal Data Acquisition System and Dataset for Robot-Assisted Minimally Invasive Surgery

Authors: Keshara Weerasinghe, Seyed Hamid Reza Roodabeh, Andrew Hawkins (MD), Zhaomeng Zhang, Zachary Schrader, Homa Alemzadeh

Abstract: Background: Robot-assisted minimally invasive surgery (RMIS) research increasingly relies on multimodal data, yet access to proprietary robot telemetry remains a major barrier. We introduce MiDAS, an open-source, platform-agnostic system enabling time-synchronized, non-invasive multimodal data acquisition across surgical robotic platforms. Methods: MiDAS integrates electromagnetic and RGB-D hand tracking, foot pedal sensing, and surgical video capture without requiring proprietary robot interfaces. We validated MiDAS on the open-source Raven-II and the clinical da Vinci Xi by collecting multimodal datasets of peg transfer and hernia repair suturing tasks performed by surgical residents. Correlation analysis and downstream gesture recognition experiments were conducted. Results: External hand and foot sensing closely approximated internal robot kinematics and non-invasive motion signals achieved gesture recognition performance comparable to proprietary tel...
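The abstract describes time-synchronized acquisition of sensor streams that run at different rates (electromagnetic hand tracking, foot pedal events, video frames). The paper's actual synchronization code is not shown in this summary; as a minimal illustrative sketch under that assumption, the general idea of aligning independently timestamped streams onto a common clock by nearest-timestamp matching could look like this (all names and sample data here are hypothetical):

```python
# Hypothetical sketch, NOT the MiDAS implementation: align independently
# timestamped sensor streams onto a shared reference clock by picking the
# sample whose timestamp is nearest to each reference time.
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the value whose timestamp is closest to t (timestamps sorted)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if (after - t) < (t - before) else values[i - 1]

def align_streams(reference_ts, streams):
    """Resample each stream onto the reference timestamps.

    streams: dict mapping stream name -> (sorted timestamp list, value list).
    Returns one dict per reference timestamp with each stream's nearest value.
    """
    aligned = []
    for t in reference_ts:
        row = {"t": t}
        for name, (ts, vals) in streams.items():
            row[name] = nearest_sample(ts, vals, t)
        aligned.append(row)
    return aligned

# Example: a ~30 Hz video clock, a faster hand tracker, and sparse pedal events.
video_ts = [0.0, 0.033, 0.066]
hand = ([0.0, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06],
        ["h0", "h1", "h2", "h3", "h4", "h5", "h6"])
pedal = ([0.0, 0.05], ["up", "down"])
rows = align_streams(video_ts, {"hand": hand, "pedal": pedal})
```

Nearest-timestamp matching is the simplest alignment strategy; a real acquisition system would also need clock-offset calibration between devices and interpolation for continuous signals.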

