[2602.20055] To Move or Not to Move: Constraint-based Planning Enables Zero-Shot Generalization for Interactive Navigation

arXiv - AI 4 min read Article

Summary

This paper presents a novel constraint-based planning framework for mobile robots, enabling zero-shot generalization in interactive navigation by allowing robots to manipulate their environment to complete tasks.

Why It Matters

As robots increasingly operate in cluttered environments, such as homes and warehouses, this research addresses a critical challenge in robotics: enabling effective navigation and task completion despite obstacles. The proposed framework enhances robot autonomy and efficiency, making it relevant for advancements in robotics and AI applications.

Key Takeaways

  • Introduces Lifelong Interactive Navigation for mobile robots.
  • Proposes an LLM-driven, constraint-based planning framework.
  • Demonstrates improved task completion in cluttered environments.
  • Combines reasoning and active perception for efficient navigation.
  • Outperforms traditional and learning-based navigation methods.

Computer Science > Robotics · arXiv:2602.20055 (cs) · [Submitted on 23 Feb 2026]

Title: To Move or Not to Move: Constraint-based Planning Enables Zero-Shot Generalization for Interactive Navigation

Authors: Apoorva Vashisth (1), Manav Kulshrestha (1), Pranav Bakshi (2), Damon Conover (3), Guillaume Sartoretti (4), Aniket Bera (1) ((1) Purdue University, (2) IIT Kharagpur, (3) DEVCOM Army Research Lab, (4) National University of Singapore)

Abstract: Visual navigation typically assumes that at least one obstacle-free path exists between start and goal, which the robot must discover or plan. In real-world settings such as homes and warehouses, however, clutter can block all routes. Targeting such cases, we introduce the Lifelong Interactive Navigation problem, in which a mobile robot with manipulation abilities can move clutter to forge its own path while completing sequential object-placement tasks, each involving placing a given object (e.g., alarm clock, pillow) onto a target object (e.g., dining table, desk, bed). To address this lifelong setting, where environment changes accumulate and have long-term effects, we propose an LLM-driven, constraint-based planning framework with active perception. Our framework allows the LLM to reason over a structured scene...
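The paper's framework is not reproduced here, but its central "to move or not to move" decision can be illustrated with a minimal, hypothetical sketch: a planner first checks whether the goal is reachable, and only if no free path exists does it search for a movable obstacle whose relocation restores reachability. All names, the grid representation, and the obstacle-selection logic below are illustrative assumptions, not the authors' implementation (which uses LLM reasoning over a structured scene rather than exhaustive search).

```python
from collections import deque

def reachable(grid, start, goal):
    """BFS over a 4-connected grid; cells marked 1 are blocked."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

def plan(grid, start, goal, movable):
    """Decide whether to move an obstacle: navigate directly if a free
    path exists; otherwise pick the first movable obstacle whose
    removal restores reachability (a crude stand-in for the paper's
    constraint-based reasoning)."""
    if reachable(grid, start, goal):
        return ("navigate", None)
    for r, c in movable:
        grid[r][c] = 0              # hypothetically relocate the obstacle
        ok = reachable(grid, start, goal)
        grid[r][c] = 1              # restore the scene; report the choice
        if ok:
            return ("move_obstacle", (r, c))
    return ("infeasible", None)

# A corridor fully blocked at column 1: only moving the box at (1, 1) helps.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 1, 0]]
action, target = plan(grid, (0, 0), (0, 2), movable=[(1, 1)])
print(action, target)  # move_obstacle (1, 1)
```

The key design point mirrored here is that manipulation is treated as a last resort subject to feasibility constraints, rather than a default action.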

Related Articles

[2601.07855] RoAD Benchmark: How LiDAR Models Fail under Coupled Domain Shifts and Label Evolution
Machine Learning

Abstract page for arXiv paper 2601.07855: RoAD Benchmark: How LiDAR Models Fail under Coupled Domain Shifts and Label Evolution

arXiv - AI · 3 min ·
[2502.00262] INSIGHT: Enhancing Autonomous Driving Safety through Vision-Language Models on Context-Aware Hazard Detection and Edge Case Evaluation
LLMs

Abstract page for arXiv paper 2502.00262: INSIGHT: Enhancing Autonomous Driving Safety through Vision-Language Models on Context-Aware Ha...

arXiv - AI · 4 min ·
[2508.00500] ProbGuard: Probabilistic Runtime Monitoring for LLM Agent Safety
LLMs

Abstract page for arXiv paper 2508.00500: ProbGuard: Probabilistic Runtime Monitoring for LLM Agent Safety

arXiv - AI · 4 min ·
[2603.26660] Ruka-v2: Tendon Driven Open-Source Dexterous Hand with Wrist and Abduction for Robot Learning
Robotics

Abstract page for arXiv paper 2603.26660: Ruka-v2: Tendon Driven Open-Source Dexterous Hand with Wrist and Abduction for Robot Learning

arXiv - AI · 4 min ·