[2604.04271] A Family of Open Time-Series Foundation Models for the Radio Access Network



Computer Science > Networking and Internet Architecture

arXiv:2604.04271 (cs) [Submitted on 5 Apr 2026]

Title: A Family of Open Time-Series Foundation Models for the Radio Access Network

Authors: Ioannis Panitsas, Leandros Tassiulas

Abstract: The Radio Access Network (RAN) is evolving into a programmable and disaggregated infrastructure that increasingly relies on AI-native algorithms for optimization and closed-loop control. However, current RAN intelligence is still largely built from task-specific models tailored to individual functions, resulting in model fragmentation, limited knowledge sharing across tasks, poor generalization, and increased system complexity. To address these limitations, we introduce TimeRAN, a unified multi-task learning framework for time-series modeling in the RAN. TimeRAN leverages a lightweight time-series foundation model with few task-specific heads to learn transferable representations that can be efficiently adapted across diverse tasks with limited supervision. To enable large-scale pretraining, we further curate and open-source TimeRAN DataPile, the largest time-series corpus for RAN analytics to date, comprising over 355K time series and 0.56B measurements across diverse telemetry sources, protocol layers, and deployment scenarios. We evaluate TimeRAN across a compr...
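The abstract describes a shared foundation-model backbone feeding a few lightweight task-specific heads. The paper's actual architecture is not given here, so the following is only a minimal sketch of that general pattern: a single hypothetical encoder maps telemetry windows to a shared representation, and per-task heads (the names `forecast` and `anomaly` are illustrative, not from the paper) adapt it to different outputs.

```python
import numpy as np

rng = np.random.default_rng(0)


class SharedBackbone:
    """Hypothetical lightweight encoder: projects a time-series window
    into a shared representation (a stand-in for the pretrained model)."""

    def __init__(self, window: int, dim: int):
        self.W = rng.standard_normal((window, dim)) * 0.1

    def encode(self, x: np.ndarray) -> np.ndarray:
        # x: (batch, window) -> shared features (batch, dim)
        return np.tanh(x @ self.W)


class TaskHead:
    """One small head per downstream task, sharing the backbone's features."""

    def __init__(self, dim: int, out: int):
        self.W = rng.standard_normal((dim, out)) * 0.1

    def predict(self, z: np.ndarray) -> np.ndarray:
        return z @ self.W


backbone = SharedBackbone(window=32, dim=16)
heads = {
    "forecast": TaskHead(16, 4),  # e.g. predict the next 4 steps
    "anomaly": TaskHead(16, 1),   # e.g. a scalar anomaly score
}

batch = rng.standard_normal((8, 32))  # 8 telemetry windows of length 32
z = backbone.encode(batch)            # encode once, reuse across tasks
outputs = {name: head.predict(z) for name, head in heads.items()}
print(outputs["forecast"].shape, outputs["anomaly"].shape)
# (8, 4) (8, 1)
```

The point of the pattern is that the expensive representation is computed once and shared, so adding a new task only means training a small head, which matches the abstract's claim of efficient adaptation with limited supervision.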

Originally published on April 07, 2026. Curated by AI News.

