[2505.14202] MSDformer: Multi-scale Discrete Transformer For Time Series Generation

arXiv - Machine Learning 4 min read

Computer Science > Machine Learning — arXiv:2505.14202 (cs)

[Submitted on 20 May 2025 (v1), last revised 6 Apr 2026 (this version, v3)]

Title: MSDformer: Multi-scale Discrete Transformer For Time Series Generation

Authors: Shibo Feng, Zhicheng Chen, Xi Xiao, Zhong Zhang, Qing Li, Xingyu Gao, Peilin Zhao

Abstract: Discrete Token Modeling (DTM), which employs vector quantization techniques, has demonstrated remarkable success in modeling non-natural-language modalities, particularly in time series generation. While our prior work SDformer established the first DTM-based framework to achieve state-of-the-art performance in this domain, two critical limitations persist in existing DTM approaches: 1) their inability to capture the multi-scale temporal patterns inherent to complex time series data, and 2) the absence of theoretical foundations to guide model optimization. To address these challenges, we propose a novel multi-scale DTM-based time series generation method, called Multi-Scale Discrete Transformer (MSDformer). MSDformer employs a multi-scale time series tokenizer to learn discrete token representations at multiple scales, which jointly characterize the complex nature of time series data. Subsequently, MSDformer applies a multi-scale autoregressive token modeling technique to capture the multi-scale patterns of ...
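The tokenizer stage described in the abstract can be illustrated with a generic vector-quantization sketch: split the series into non-overlapping windows at several temporal scales and map each window to the index of its nearest codebook vector. This is a minimal sketch of multi-scale discrete tokenization in general, not the paper's actual architecture; the codebook sizes, scale choices, and windowing scheme below are all hypothetical.

```python
import numpy as np

def tokenize_multiscale(series, codebooks, scales):
    """Quantize a 1-D series into discrete tokens at several temporal scales.

    At each scale s, the series is split into non-overlapping windows of
    length s; each window is assigned the index of its nearest codebook
    vector under L2 distance. Generic VQ illustration only.
    """
    tokens = {}
    for s in scales:
        n = len(series) // s
        windows = series[: n * s].reshape(n, s)   # (n_windows, s)
        cb = codebooks[s]                         # (K, s) codebook for scale s
        # Pairwise squared distances between windows and codebook vectors.
        d = ((windows[:, None, :] - cb[None, :, :]) ** 2).sum(axis=-1)
        tokens[s] = d.argmin(axis=1)              # token id per window
    return tokens

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 64))
scales = [4, 8]                                   # hypothetical scale set
codebooks = {s: rng.standard_normal((16, s)) for s in scales}
toks = tokenize_multiscale(series, codebooks, scales)
print({s: t.shape for s, t in toks.items()})
```

In a full DTM pipeline the codebooks would be learned (e.g. VQ-VAE style) rather than random, and the resulting token sequences at each scale would feed an autoregressive Transformer, as the abstract outlines.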

Originally published on April 07, 2026. Curated by AI News.
