[2602.13783] MEMTS: Internalizing Domain Knowledge via Parameterized Memory for Retrieval-Free Domain Adaptation of Time Series Foundation Models
Summary
The paper presents MEMTS, a method for domain adaptation of time series foundation models that internalizes domain knowledge in a parameterized Knowledge Persistence Module, improving adaptation efficiency and mitigating catastrophic forgetting.
Why It Matters
As time series forecasting becomes increasingly vital across various industries, the ability to adapt models to specific domains without significant overhead is crucial. MEMTS addresses the limitations of existing methods by providing a scalable, efficient solution that maintains performance in real-world applications.
Key Takeaways
- MEMTS offers a retrieval-free approach to domain adaptation in time series models.
- The Knowledge Persistence Module internalizes domain-specific dynamics, improving efficiency.
- The method achieves constant-time inference, addressing scalability issues in real-time processing.
- Extensive experiments demonstrate MEMTS's state-of-the-art performance across multiple datasets.
- The approach mitigates catastrophic forgetting while preserving general temporal patterns.
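The takeaways above hinge on replacing corpus-scale retrieval with a fixed-size parameterized memory. The paper's abstract is truncated and does not specify the module's architecture, so the following is only a hypothetical sketch of the general idea: domain knowledge is distilled into K learnable slots, and a "lookup" is one attention step over those slots, giving cost that is constant in the size of the original domain corpus (unlike RAG, whose retrieval cost grows with the external index). All names and dimensions here are illustrative assumptions, not MEMTS's actual design.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

class ParameterizedMemory:
    """Hypothetical fixed-size memory bank (illustrative, not the paper's module).

    K key/value slots would be trained to internalize domain-specific
    dynamics; at inference, reading the memory is a single attention
    step over K vectors, i.e. O(K) regardless of how large the original
    domain corpus was.
    """
    def __init__(self, num_slots=8, dim=16, seed=0):
        rng = np.random.default_rng(seed)
        # In a real system these would be learned parameters.
        self.keys = rng.normal(size=(num_slots, dim))
        self.values = rng.normal(size=(num_slots, dim))

    def read(self, query):
        # Attention over memory slots: softmax(K q) weights the values.
        weights = softmax(self.keys @ query)
        # Residual connection: inject domain knowledge while keeping
        # the backbone's general temporal representation intact.
        return query + weights @ self.values

mem = ParameterizedMemory()
q = np.ones(16)          # stand-in for a forecaster's hidden state
out = mem.read(q)
print(out.shape)         # (16,)
```

The residual read is one plausible way to reconcile "internalize domain dynamics" with "preserve general temporal patterns": the backbone's representation passes through unchanged and the memory only adds a learned correction.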
Computer Science > Machine Learning
arXiv:2602.13783 (cs) [Submitted on 14 Feb 2026]
Authors: Xiaoyun Yu, Li Fan, Xiangfei Qiu, Nanqing Dong, Yonggui Huang, Honggang Qi, Geguang Pu, Wanli Ouyang, Xi Chen, Jilin Hu
Abstract: While Time Series Foundation Models (TSFMs) have demonstrated exceptional performance in generalized forecasting, their performance often degrades significantly when deployed in real-world vertical domains characterized by temporal distribution shifts and domain-specific periodic structures. Current solutions are primarily constrained by two paradigms: Domain-Adaptive Pretraining (DAPT), which improves short-term domain fitting but frequently disrupts previously learned global temporal patterns due to catastrophic forgetting; and Retrieval-Augmented Generation (RAG), which incorporates external knowledge but introduces substantial retrieval overhead. This creates a severe scalability bottleneck that fails to meet the high-efficiency requirements of real-time stream processing. To break this impasse, we propose Memory for Time Series (MEMTS), a lightweight and plug-and-play method for retrieval-f...