[2602.22547] Towards Dynamic Dense Retrieval with Routing Strategy
Summary
The paper presents a novel approach to dense retrieval called Dynamic Dense Retrieval (DDR), which addresses limitations in adapting models to new domains and reduces training costs.
Why It Matters
As dense retrieval becomes increasingly important in information retrieval, DDR offers a flexible solution that minimizes resource expenditure while enhancing adaptability. This is crucial for applications requiring frequent model updates, making it relevant for both researchers and practitioners in machine learning and information retrieval.
Key Takeaways
- Dynamic Dense Retrieval (DDR) improves domain adaptation in retrieval tasks.
- DDR utilizes prefix tuning and a dynamic routing strategy for flexibility.
- The approach trains only about 2% of the model's parameters, improving efficiency.
- Extensive evaluations show DDR outperforms traditional dense retrieval methods.
- DDR paves the way for more adaptable applications in information retrieval.
Computer Science > Information Retrieval arXiv:2602.22547 (cs) [Submitted on 26 Feb 2026]
Title: Towards Dynamic Dense Retrieval with Routing Strategy
Authors: Zhan Su, Fengran Mo, Jinghan Zhang, Yuchen Hui, Jia Ao Sun, Bingbing Wen, Jian-Yun Nie
Abstract: The *de facto* paradigm for applying dense retrieval (DR) to new tasks is to fine-tune a pre-trained model for each specific task. This paradigm has two significant limitations: (1) it is difficult to adapt a DR model to a new domain when the training data is limited, and (2) outdated DR models are simply replaced by newer models trained from scratch. In scenarios where the model must be updated frequently, this paradigm is prohibitively expensive. To address these challenges, we propose a novel dense retrieval approach, termed *dynamic dense retrieval* (DDR). DDR uses *prefix tuning* to build modules, each specialized for a specific domain. These modules can then be compositionally combined through a dynamic routing strategy, enabling highly flexible domain adaptation in the retrieval component. Extensive evaluation on six zero-shot downstream tasks demonstrates that this approach can surpass DR while using only 2% of the training parameters, paving the way to more flexible dense retrieval in IR. ...
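The abstract describes composing per-domain prefix-tuning modules via a dynamic router, but does not spell out the architecture. A minimal NumPy sketch of the general idea, with all shapes, names, and the linear router purely hypothetical, might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
D_MODEL, PREFIX_LEN, N_DOMAINS = 16, 4, 3

# Hypothetical: each domain module is a learned prefix of shape (PREFIX_LEN, D_MODEL).
prefixes = rng.normal(size=(N_DOMAINS, PREFIX_LEN, D_MODEL))

# Hypothetical router: a linear layer that scores each domain from the query embedding.
router_w = rng.normal(size=(D_MODEL, N_DOMAINS))

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def route_prefix(query_emb):
    """Mix the domain prefixes with router weights derived from the query."""
    weights = softmax(query_emb @ router_w)             # (N_DOMAINS,)
    combined = np.tensordot(weights, prefixes, axes=1)  # (PREFIX_LEN, D_MODEL)
    return weights, combined

query_emb = rng.normal(size=D_MODEL)
weights, prefix = route_prefix(query_emb)

# Prepend the routed prefix to the (frozen) encoder's token embeddings,
# as in prefix tuning; only prefixes and the router would be trained.
tokens = rng.normal(size=(5, D_MODEL))
inputs = np.concatenate([prefix, tokens], axis=0)       # (PREFIX_LEN + 5, D_MODEL)
```

Only the prefix modules and the router carry trainable parameters here, which is consistent with the paper's claim of training a small fraction of the full model.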