[2505.13033] TSPulse: Tiny Pre-Trained Models with Disentangled Representations for Rapid Time-Series Analysis
Computer Science > Machine Learning
arXiv:2505.13033 (cs)
[Submitted on 19 May 2025 (v1), last revised 4 Mar 2026 (this version, v3)]

Title: TSPulse: Tiny Pre-Trained Models with Disentangled Representations for Rapid Time-Series Analysis
Authors: Vijay Ekambaram, Subodh Kumar, Arindam Jati, Sumanta Mukherjee, Tomoya Sakai, Pankaj Dayama, Wesley M. Gifford, Jayant Kalagnanam

Abstract: Time-series tasks often benefit from signals expressed across multiple representation spaces (e.g., time vs. frequency) and at varying abstraction levels (e.g., local patterns vs. global semantics). However, existing pre-trained time-series models entangle these heterogeneous signals into a single large embedding, limiting transferability and direct zero-shot usability. To address this, we propose TSPulse, a family of ultra-light pre-trained models (1M parameters) with disentanglement properties, specialized for various time-series diagnostic tasks. TSPulse introduces a novel pre-training framework that augments masked reconstruction with explicit disentanglement across spaces and abstractions, learning three complementary embedding views (temporal, spectral, and semantic) to effectively enable zero-shot transfer. In addition, we introduce various lightweight post-hoc fusers that selectively attend to and fuse these disentangled…
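The abstract describes augmenting masked reconstruction with complementary time and frequency views of a series. As a rough illustration of that idea (a minimal sketch only; the function name, masking scheme, and parameters below are illustrative assumptions, not TSPulse's actual API or training procedure), one could mask random patches of a series and derive a spectral view of the masked input:

```python
import numpy as np

def masked_views(series: np.ndarray, mask_ratio: float = 0.3,
                 patch_len: int = 8, seed: int = 0):
    """Hypothetical sketch: mask random patches of a 1-D series and
    return both its time-domain view and a frequency-domain view."""
    rng = np.random.default_rng(seed)
    n_patches = len(series) // patch_len
    n_masked = max(1, int(mask_ratio * n_patches))
    masked_idx = rng.choice(n_patches, size=n_masked, replace=False)

    time_view = series.copy()
    for i in masked_idx:
        # Zero-fill each masked patch (one simple masking choice).
        time_view[i * patch_len:(i + 1) * patch_len] = 0.0

    # Magnitude spectrum of the masked series as a spectral view.
    freq_view = np.abs(np.fft.rfft(time_view))
    return time_view, freq_view

t = np.linspace(0, 4 * np.pi, 128)
series = np.sin(t)
time_view, freq_view = masked_views(series)
print(time_view.shape, freq_view.shape)  # (128,) (65,)
```

In a masked-reconstruction setup, a model would then be trained to recover the original series from such views; the paper's actual objectives and embedding heads are described in the full text.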