[2604.04475] Discrete Prototypical Memories for Federated Time Series Foundation Models
Computer Science > Machine Learning
arXiv:2604.04475 (cs)
[Submitted on 6 Apr 2026]

Title: Discrete Prototypical Memories for Federated Time Series Foundation Models
Authors: Liwei Deng, Qingxiang Liu, Xinhe Niu, Shengchao Chen, Sheng Sun, Yuankai Wu, Guodong Long, Yuxuan Liang

Abstract: Leveraging Large Language Models (LLMs) as federated learning (FL)-based time series foundation models offers a promising way to transfer the generalization capabilities of LLMs to time series data while keeping private data local. However, the semantic misalignment between time-series data and the text-centric latent space of existing LLMs often degrades performance. Meanwhile, the parameter-sharing mechanism in existing FL methods models heterogeneous cross-domain time-series data in a unified continuous latent space, which contradicts the fact that time-series semantics frequently manifest as discrete, recurring regimes. To address these limitations, we propose FeDPM, a federated framework for time-series foundation models based on discrete prototypical memories. Specifically, we learn local prototypical memory priors for intra-domain time-series data. We then align cross-domain memories to promote a unified discrete latent space and introduce a domain-specific memory update mechanism to balanc...
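The abstract does not specify FeDPM's implementation, but the core idea of a discrete prototypical memory can be illustrated with a minimal sketch. The class name, the nearest-prototype lookup, and the exponential-moving-average update below are illustrative assumptions in the style of vector-quantized codebooks, not the paper's actual method:

```python
import numpy as np

class PrototypeMemory:
    """Hypothetical sketch: a discrete codebook of prototypes that snaps
    continuous time-series embeddings onto recurring discrete regimes."""

    def __init__(self, num_prototypes, dim, decay=0.99, seed=0):
        rng = np.random.default_rng(seed)
        # Each row is one prototype (one discrete "regime") in the latent space.
        self.prototypes = rng.normal(size=(num_prototypes, dim))
        self.decay = decay  # EMA decay for the (assumed) memory update rule

    def quantize(self, z):
        """Map each embedding in z (N, dim) to its nearest prototype."""
        d = ((z[:, None, :] - self.prototypes[None, :, :]) ** 2).sum(-1)
        idx = d.argmin(axis=1)          # discrete code per embedding
        return self.prototypes[idx], idx

    def update(self, z, idx):
        """Move each used prototype toward the mean of its assigned embeddings
        (an illustrative stand-in for a domain-specific memory update)."""
        for k in np.unique(idx):
            mean_k = z[idx == k].mean(axis=0)
            self.prototypes[k] = self.decay * self.prototypes[k] + (1 - self.decay) * mean_k

mem = PrototypeMemory(num_prototypes=8, dim=4)
z = np.random.default_rng(1).normal(size=(32, 4))
q, idx = mem.quantize(z)   # q: quantized embeddings, idx: discrete regime codes
mem.update(z, idx)
```

In a federated setting, each client would maintain such a memory locally; aligning the per-domain codebooks into one shared discrete space, as the abstract describes, is the part this sketch deliberately leaves out.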