[2602.21693] TiMi: Empower Time Series Transformers with Multimodal Mixture of Experts

arXiv - Machine Learning · 4 min read

Summary

The paper introduces TiMi, a novel approach that enhances time series forecasting by integrating multimodal data through a Mixture of Experts framework, demonstrating superior performance on various benchmarks.

Why It Matters

As industries increasingly rely on accurate forecasting, TiMi's ability to effectively incorporate multimodal data, especially textual information, addresses critical challenges in time series prediction. This advancement could lead to more informed decision-making in fields such as finance, healthcare, and disaster management.

Key Takeaways

  • TiMi leverages multimodal data for improved time series forecasting.
  • The Mixture of Experts module enhances model adaptability and interpretability.
  • TiMi outperforms existing models on multiple real-world forecasting benchmarks.
  • Incorporating textual information can significantly influence numerical predictions.
  • The approach eliminates the need for explicit representation-level alignment.

Computer Science > Machine Learning · arXiv:2602.21693 (cs)
[Submitted on 25 Feb 2026]

Title: TiMi: Empower Time Series Transformers with Multimodal Mixture of Experts
Authors: Jiafeng Lin, Yuxuan Wang, Huakun Luo, Zhongyi Pei, Jianmin Wang

Abstract: Multimodal time series forecasting has garnered significant attention for its potential to provide more accurate predictions than traditional single-modality models by leveraging rich information inherent in other modalities. However, due to fundamental challenges in modality alignment, existing methods often struggle to effectively incorporate multimodal data into predictions, particularly textual information that has a causal influence on time series fluctuations, such as emergency reports and policy announcements. In this paper, we reflect on the role of textual information in numerical forecasting and propose Time series transformers with Multimodal Mixture-of-Experts, TiMi, to unleash the causal reasoning capabilities of LLMs. Concretely, TiMi utilizes LLMs to generate inferences on future developments, which serve as guidance for time series forecasting. To seamlessly integrate both exogenous factors and time series into predictions, we introduce a Multimodal Mixture-of-Experts (MMoE) module as a lightweight plug-in to empower Transformer-based time series mo...
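The abstract describes the MMoE module as a lightweight plug-in that routes between experts so that both the raw time series and text-derived guidance shape the forecast. As a rough illustration only (the paper's actual architecture is not detailed in this summary; the function names, the gating scheme, and the two toy experts below are all hypothetical), a softmax-gated mixture of experts can be sketched in plain Python:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of gate logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forecast(ts_features, text_features, experts, gate_weights):
    """Blend expert forecasts with a softmax gate.

    ts_features / text_features: flat lists of floats (toy stand-ins
    for time-series and text embeddings). experts: callables mapping
    both feature lists to a scalar forecast. gate_weights: one weight
    vector per expert, applied to the concatenated features to produce
    that expert's gate logit. All of this is a hypothetical sketch,
    not TiMi's published design.
    """
    combined = ts_features + text_features
    logits = [sum(w * x for w, x in zip(ws, combined)) for ws in gate_weights]
    gates = softmax(logits)
    preds = [expert(ts_features, text_features) for expert in experts]
    return sum(g * p for g, p in zip(gates, preds))

# Toy usage: one expert trusts the series alone, the other adds a
# text-driven adjustment (e.g. from an LLM-generated inference).
if __name__ == "__main__":
    experts = [
        lambda ts, txt: sum(ts),           # series-only expert
        lambda ts, txt: sum(ts) + txt[0],  # text-adjusted expert
    ]
    gate_w = [[0.1] * 3, [0.2] * 3]
    print(moe_forecast([1.0, 2.0], [0.5], experts, gate_w))
```

Because the gate is a convex combination, the output always lies between the individual expert forecasts; the "plug-in" quality comes from the fact that such a module can sit on top of any backbone's features without changing the backbone itself.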

Related Articles

Machine Learning

Yupp shuts down after raising $33M from a16z crypto's Chris Dixon | TechCrunch

Less than a year after launching, with checks from some of the biggest names in Silicon Valley, crowdsourced AI model feedback startup Yu...

TechCrunch - AI · 4 min ·
Machine Learning

[R] Fine-tuning services report

If you have some data and want to train or run a small custom model but don't have powerful enough hardware for training, fine-tuning ser...

Reddit - Machine Learning · 1 min ·
Machine Learning

[D] Does ML have a "bible"/reference textbook at the Intermediate/Advanced level?

Hello, everyone! This is my first time posting here and I apologise if the question is, perhaps, a bit too basic for this sub-reddit. A b...

Reddit - Machine Learning · 1 min ·
Machine Learning

[D] ICML 2026 review policy debate: 100 responses suggest Policy B may score higher, while Policy A shows higher confidence

A week ago I made a thread asking whether ICML 2026’s review policy might have affected review outcomes, especially whether Policy A pape...

Reddit - Machine Learning · 1 min ·
