[2510.02410] OpenTSLM: Time-Series Language Models for Reasoning over Multivariate Medical Text- and Time-Series Data
Summary
OpenTSLM introduces a family of Time Series Language Models (TSLMs) that integrate time series as a native modality into pretrained LLMs, enabling reasoning over multivariate medical text and time-series data and outperforming text-only baselines on clinical benchmarks.
Why It Matters
The integration of time series data into language models addresses a significant gap in medical AI applications, enabling more accurate insights from complex datasets. This advancement can improve clinical decision-making and digital health solutions.
Key Takeaways
- OpenTSLM models effectively integrate time series data with text for improved reasoning.
- Two architectures, OpenTSLM-SoftPrompt and OpenTSLM-Flamingo, show significant performance improvements over traditional models.
- OpenTSLM models achieved high F1 scores in sleep staging and human activity recognition tasks.
- The models are open-source, promoting further research and development in the field.
- Expert reviews indicate strong reasoning capabilities in clinical applications.
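The first architecture named above, OpenTSLM-SoftPrompt, concatenates learnable time-series tokens with text tokens before the LLM. A minimal numpy sketch of that idea follows; the encoder, sizes, and names here are hypothetical stand-ins, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

D_MODEL = 16   # LLM embedding width (assumed for illustration)
N_SOFT = 4     # number of learnable time-series tokens (assumed)

def encode_time_series(series: np.ndarray, proj: np.ndarray) -> np.ndarray:
    """Map a (T, C) multivariate series to (N_SOFT, D_MODEL) soft tokens.

    Chunked mean-pooling plus a learnable linear projection -- a toy
    stand-in for a real time-series encoder.
    """
    chunks = np.array_split(series, N_SOFT, axis=0)        # N_SOFT chunks
    pooled = np.stack([c.mean(axis=0) for c in chunks])    # (N_SOFT, C)
    return pooled @ proj                                   # (N_SOFT, D_MODEL)

# Toy inputs: a 2-channel series of length 100 and 5 text token embeddings.
series = rng.normal(size=(100, 2))
proj = rng.normal(size=(2, D_MODEL)) * 0.1    # "learnable" projection
text_embeddings = rng.normal(size=(5, D_MODEL))

soft_tokens = encode_time_series(series, proj)
# Soft prompting: concatenate along the sequence axis and feed to the LLM.
llm_input = np.concatenate([soft_tokens, text_embeddings], axis=0)
print(llm_input.shape)  # (9, 16)
```

Because the soft tokens live in the same embedding space as text tokens, the pretrained LLM processes the combined sequence without architectural changes; only the encoder and projection are trained.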
Computer Science > Machine Learning
arXiv:2510.02410 (cs)
[Submitted on 2 Oct 2025 (v1), last revised 14 Feb 2026 (this version, v3)]
Title: OpenTSLM: Time-Series Language Models for Reasoning over Multivariate Medical Text- and Time-Series Data
Authors: Patrick Langer, Thomas Kaar, Max Rosenblattl, Maxwell A. Xu, Winnie Chow, Martin Maritsch, Robert Jakob, Ning Wang, Juncheng Liu, Aradhana Verma, Brian Han, Daniel Seung Kim, Henry Chubb, Scott Ceresnak, Aydin Zahedivash, Alexander Tarlochan Singh Sandhu, Fatima Rodriguez, Daniel McDuff, Elgar Fleisch, Oliver Aalami, Filipe Barata, Paul Schmiedmayer
Abstract: LLMs have emerged as powerful tools for interpreting multimodal data. In medicine, they hold particular promise for synthesizing large volumes of clinical information into actionable insights and digital health applications. Yet, a major limitation remains their inability to handle time series. To overcome this gap, we present OpenTSLM, a family of Time Series Language Models (TSLMs) created by integrating time series as a native modality to pretrained LLMs, enabling reasoning over multiple time series of any length. We investigate two architectures for OpenTSLM. The first, OpenTSLM-SoftPrompt, models time series implicitly by concatenating learnable time series tokens with text tokens via...
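The abstract is cut off before describing the second architecture, OpenTSLM-Flamingo. Assuming it follows the original Flamingo recipe (this is an inference from the name, not stated in the excerpt), text tokens would attend to time-series features through gated cross-attention layers rather than in-sequence concatenation. A single-head numpy sketch of that mechanism:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 16  # shared embedding width (assumed)

def softmax(x: np.ndarray, axis: int = -1) -> np.ndarray:
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(text, ts, Wq, Wk, Wv):
    """Single-head cross-attention: text queries, time-series keys/values."""
    Q, K, V = text @ Wq, ts @ Wk, ts @ Wv
    weights = softmax(Q @ K.T / np.sqrt(D))
    return weights @ V

text = rng.normal(size=(5, D))       # 5 text token states
ts_feats = rng.normal(size=(20, D))  # 20 time-series feature vectors
Wq, Wk, Wv = (rng.normal(size=(D, D)) * 0.1 for _ in range(3))

# Flamingo-style tanh gate, initialized at zero so the pretrained LLM's
# behavior is unchanged at the start of training.
gate = 0.0
out = text + np.tanh(gate) * cross_attention(text, ts_feats, Wq, Wk, Wv)
print(np.allclose(out, text))  # True: the gate starts closed
```

The zero-initialized gate is the key design choice in Flamingo: new cross-modal layers contribute nothing initially and are blended in gradually during training, preserving the frozen LLM's language ability.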