[2602.12407] MiDAS: A Multimodal Data Acquisition System and Dataset for Robot-Assisted Minimally Invasive Surgery
Summary
The paper presents MiDAS, an open-source multimodal data acquisition system for robot-assisted minimally invasive surgery that enables time-synchronized data collection across surgical robotic platforms without requiring access to proprietary robot interfaces.
Why It Matters
MiDAS addresses a key barrier in surgical robotics research: restricted access to proprietary robot telemetry. By offering an open-source, platform-agnostic alternative, it enables reproducible data collection and promotes collaboration and innovation in robot-assisted surgery.
Key Takeaways
- MiDAS allows for non-invasive data acquisition across surgical robots.
- The system integrates multiple data sources, including hand tracking and video capture.
- Validation studies on the Raven-II and da Vinci Xi show that MiDAS closely approximates internal robot kinematics.
- The dataset includes the first multimodal recordings of hernia repair suturing.
- MiDAS enhances reproducibility in surgical research and training.
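The core idea behind the takeaways above is fusing independently sampled sensor streams (hand tracking, foot pedals, video) onto a common timeline. The paper does not detail its synchronization mechanism here, so the following is only a minimal sketch of one common approach, nearest-timestamp alignment with a tolerance window; the function name, stream layout, and 20 ms tolerance are all hypothetical.

```python
import bisect

def sync_streams(reference, others, tolerance=0.02):
    """Align sensor streams to a reference stream's timestamps.

    reference: list of (timestamp_sec, sample) for the reference modality
    others:    dict of name -> time-sorted list of (timestamp_sec, sample)
    tolerance: maximum allowed time gap in seconds (hypothetical 20 ms)

    Returns one dict per reference sample containing the nearest-in-time
    sample from each other stream, or None if no sample falls within the
    tolerance window.
    """
    frames = []
    for t_ref, ref_sample in reference:
        frame = {"t": t_ref, "ref": ref_sample}
        for name, stream in others.items():
            times = [t for t, _ in stream]
            i = bisect.bisect_left(times, t_ref)
            best = None
            # Only the two neighbors around the insertion point can be nearest.
            for j in (i - 1, i):
                if 0 <= j < len(stream):
                    t, s = stream[j]
                    if abs(t - t_ref) <= tolerance and (
                        best is None or abs(t - t_ref) < abs(best[0] - t_ref)
                    ):
                        best = (t, s)
            frame[name] = best[1] if best else None
        frames.append(frame)
    return frames

# Toy usage: ~30 Hz video frames aligned with a faster hand tracker.
video = [(0.000, "frame0"), (0.033, "frame1"), (0.066, "frame2")]
tracker = {"hand": [(0.001, "pose_a"), (0.034, "pose_b"), (0.100, "pose_c")]}
frames = sync_streams(video, tracker)
```

With this toy data, the third video frame gets no hand sample because both tracker readings are more than 20 ms away, illustrating how dropped or delayed samples surface explicitly rather than being silently interpolated.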
Computer Science > Robotics
arXiv:2602.12407 (cs)
[Submitted on 12 Feb 2026]
Title: MiDAS: A Multimodal Data Acquisition System and Dataset for Robot-Assisted Minimally Invasive Surgery
Authors: Keshara Weerasinghe, Seyed Hamid Reza Roodabeh, Andrew Hawkins (MD), Zhaomeng Zhang, Zachary Schrader, Homa Alemzadeh
Abstract:
Background: Robot-assisted minimally invasive surgery (RMIS) research increasingly relies on multimodal data, yet access to proprietary robot telemetry remains a major barrier. We introduce MiDAS, an open-source, platform-agnostic system enabling time-synchronized, non-invasive multimodal data acquisition across surgical robotic platforms.
Methods: MiDAS integrates electromagnetic and RGB-D hand tracking, foot pedal sensing, and surgical video capture without requiring proprietary robot interfaces. We validated MiDAS on the open-source Raven-II and the clinical da Vinci Xi by collecting multimodal datasets of peg transfer and hernia repair suturing tasks performed by surgical residents. Correlation analysis and downstream gesture recognition experiments were conducted.
Results: External hand and foot sensing closely approximated internal robot kinematics, and non-invasive motion signals achieved gesture recognition performance comparable to proprietary tel...
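The correlation analysis mentioned in the abstract compares externally sensed motion against the robot's internal kinematics. The paper's exact metric and preprocessing are not given here, so the sketch below only illustrates the general idea with a per-axis Pearson correlation over synchronized position traces; the function name, array layout, and synthetic data are assumptions for illustration.

```python
import numpy as np

def pearson_per_axis(external, internal):
    """Per-axis Pearson correlation between two synchronized traces.

    external: array of shape (T, D), e.g. non-invasive hand-tracker positions
    internal: array of shape (T, D), e.g. robot end-effector kinematics
    Returns a list of D correlation coefficients; values near 1.0 indicate
    the external signal closely tracks the internal telemetry.
    """
    external = np.asarray(external, dtype=float)
    internal = np.asarray(internal, dtype=float)
    return [
        float(np.corrcoef(external[:, k], internal[:, k])[0, 1])
        for k in range(external.shape[1])
    ]

# Synthetic check: internal telemetry modeled as a scaled, lightly noisy
# copy of the external signal, so correlations should be close to 1.
rng = np.random.default_rng(0)
ext = rng.normal(size=(200, 3))
internal = 0.9 * ext + 0.05 * rng.normal(size=(200, 3))
coeffs = pearson_per_axis(ext, internal)
```

In practice the two traces would first be aligned onto a common timeline and possibly expressed in a common coordinate frame before computing any such statistic.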