[2406.04955] Experimental Evaluation of ROS-Causal in Real-World Human-Robot Spatial Interaction Scenarios

arXiv - AI · 4 min read

Summary

This article presents an experimental evaluation of ROS-Causal, a framework for causal discovery in human-robot spatial interactions, demonstrating its effectiveness in both simulation and real-world scenarios.

Why It Matters

Understanding human-robot interactions is crucial for deploying robots in shared environments. This research bridges a gap in causal inference methods within the ROS ecosystem, enhancing robotic systems' performance and safety in real-world applications.

Key Takeaways

  • ROS-Causal enables onboard data collection and causal discovery for robots.
  • The framework was evaluated in both simulated and real-world lab scenarios.
  • Causal models generated in simulations align with those from actual experiments.
  • This approach enhances understanding of human behavior for better robot interventions.
  • The findings support improved deployment of robots in human-shared environments.

Computer Science > Robotics
arXiv:2406.04955 (cs)
[Submitted on 7 Jun 2024 (v1), last revised 16 Feb 2026 (this version, v2)]

Title: Experimental Evaluation of ROS-Causal in Real-World Human-Robot Spatial Interaction Scenarios
Authors: Luca Castri, Gloria Beraldo, Sariah Mghames, Marc Hanheide, Nicola Bellotto

Abstract: Deploying robots in human-shared environments requires a deep understanding of how nearby agents and objects interact. Employing causal inference to model cause-and-effect relationships facilitates the prediction of human behaviours and enables the anticipation of robot interventions. However, a significant challenge arises because existing causal discovery methods lack implementations within the ROS ecosystem, the de-facto standard framework in robotics, hindering their effective use on real robots. To bridge this gap, in our previous work we proposed ROS-Causal, a ROS-based framework designed for onboard data collection and causal discovery in human-robot spatial interactions. In this work, we present an experimental evaluation of ROS-Causal both in simulation and on a new dataset of human-robot spatial interactions in a lab scenario, to assess its performance and effectiveness. Our analysis demonstrates the efficacy of this approach, showcasing how causal mo…
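To make the idea concrete, here is a minimal, self-contained sketch of lag-based causal discovery on synthetic human-robot interaction data. This is not ROS-Causal's actual pipeline (which runs established causal discovery methods on data logged from ROS topics); the variable names (`human_dist`, `robot_vel`), the generating model, the lagged-correlation test, and all thresholds are illustrative assumptions.

```python
# Illustrative sketch only: mimic causal discovery on human-robot
# spatial data with synthetic time series and a simple lagged-
# correlation test. Not the method used by ROS-Causal.
import numpy as np

rng = np.random.default_rng(0)
T = 500

# Assumed ground truth: the robot's velocity reacts to the human's
# distance with a lag of 2 time steps, plus small noise.
human_dist = rng.normal(1.5, 0.3, T)
robot_vel = np.zeros(T)
for t in range(2, T):
    robot_vel[t] = 0.8 * human_dist[t - 2] + 0.1 * rng.normal()

def lagged_corr(cause, effect, max_lag=5):
    """Find the lag at which cause[t - lag] correlates most with effect[t]."""
    corrs = []
    for lag in range(1, max_lag + 1):
        c = np.corrcoef(cause[:-lag], effect[lag:])[0, 1]
        corrs.append(abs(c))
    best = int(np.argmax(corrs)) + 1
    return best, corrs[best - 1]

lag, strength = lagged_corr(human_dist, robot_vel)
print(lag, round(strength, 2))  # recovers the 2-step lag
```

In a real deployment the input series would come from recorded ROS topics (e.g. agent poses and velocities), and a multivariate method with conditional-independence tests would replace this pairwise check to rule out confounders.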

