[2603.25498] EcoThink: A Green Adaptive Inference Framework for Sustainable and Accessible Agents
Computer Science > Artificial Intelligence
arXiv:2603.25498 (cs)
[Submitted on 26 Mar 2026]

Title: EcoThink: A Green Adaptive Inference Framework for Sustainable and Accessible Agents
Authors: Linxiao Li, Zhixiang Lu

Abstract: As the Web transitions from static retrieval to generative interaction, the escalating environmental footprint of Large Language Models (LLMs) presents a critical sustainability challenge. Current paradigms indiscriminately apply computation-intensive strategies such as Chain-of-Thought (CoT) to billions of daily queries, causing LLM "overthinking": a redundancy that amplifies carbon emissions and operational barriers. This inefficiency directly undermines UN Sustainable Development Goals 13 (Climate Action) and 10 (Reduced Inequalities) by hindering equitable AI access in resource-constrained regions. To address this, we introduce EcoThink, an energy-aware adaptive inference framework designed to reconcile high-performance AI intelligence with environmental responsibility. EcoThink employs a lightweight, distillation-based router to dynamically assess query complexity, skipping unnecessary reasoning for factoid retrieval while reserving deep computation for complex logic. Extensive evaluations across 9 diverse benchmarks demonstrate that EcoThink reduces inference energy by 40.4% on a...
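The routing idea described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the paper uses a learned, distillation-based router, whereas the scoring function below is a toy keyword heuristic, and all names (`complexity_score`, `route`, the threshold value) are assumptions for illustration only.

```python
def complexity_score(query: str) -> float:
    """Toy stand-in for a learned complexity router: count markers that
    hint at multi-step reasoning. The real router would be a small
    distilled model, not a keyword heuristic."""
    reasoning_markers = ("prove", "why", "derive", "step", "compare")
    hits = sum(marker in query.lower() for marker in reasoning_markers)
    return hits / len(reasoning_markers)


def route(query: str, threshold: float = 0.15) -> str:
    """Pick an inference strategy per query: cheap direct answering for
    simple factoid retrieval, costly chain-of-thought only when the
    complexity score crosses the (illustrative) threshold."""
    if complexity_score(query) >= threshold:
        return "chain_of_thought"
    return "direct_answer"


print(route("What is the capital of France?"))                  # direct_answer
print(route("Prove why the sum of two odd numbers is even."))   # chain_of_thought
```

The design point is that the router itself must be far cheaper than the reasoning it can skip; otherwise routing overhead would erode the energy savings the framework targets.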