[2512.20885] From GNNs to Symbolic Surrogates via Kolmogorov-Arnold Networks for Delay Prediction

arXiv - Machine Learning · 3 min read

Summary

This paper explores the use of Kolmogorov-Arnold Networks (KANs) for predicting flow delays in communication networks, improving both the efficiency and the transparency of the resulting models.

Why It Matters

Accurate flow delay prediction is crucial for optimizing communication networks. This research introduces innovative modeling techniques that balance performance and efficiency, potentially transforming how predictive models are deployed in real-world applications.

Key Takeaways

  • The study implements a heterogeneous GNN with attention for strong baseline performance.
  • FlowKANet utilizes KAN layers to reduce parameters while maintaining accuracy.
  • Symbolic surrogates derived from the model enhance deployment efficiency and transparency.
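To make the KAN takeaway concrete, here is a minimal, illustrative sketch of the core idea: instead of a shared MLP nonlinearity, each input-output connection learns its own univariate function. The class name and the polynomial parameterization below are my own assumptions for illustration; FlowKANet itself uses spline-based KAN operators inside message passing and attention, which this toy layer does not reproduce.

```python
import numpy as np

class KANLayer:
    """Toy KAN-style layer (illustrative sketch, not the paper's FlowKANet).

    Each edge i -> o applies its own learnable univariate function, here a
    small polynomial; per the Kolmogorov-Arnold representation idea, the
    output is the sum of these per-edge functions over all inputs.
    """

    def __init__(self, in_dim, out_dim, degree=3, seed=0):
        rng = np.random.default_rng(seed)
        # coeffs[o, i, d] is the coefficient of x^d on the edge i -> o
        self.coeffs = rng.normal(scale=0.1, size=(out_dim, in_dim, degree + 1))

    def __call__(self, x):
        # x: (batch, in_dim); build powers x^0..x^degree -> (batch, in_dim, D)
        powers = np.stack([x ** d for d in range(self.coeffs.shape[-1])], axis=-1)
        # evaluate each per-edge polynomial, then sum over the input index i
        return np.einsum("bid,oid->bo", powers, self.coeffs)

layer = KANLayer(in_dim=4, out_dim=2)
y = layer(np.random.default_rng(1).normal(size=(8, 4)))
print(y.shape)  # (8, 2)
```

The parameter-reduction claim comes from this structure: the learnable budget is `out_dim * in_dim * (degree + 1)` coefficients for the edge functions, with no separate hidden-layer weight matrices.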

Computer Science > Machine Learning
arXiv:2512.20885 (cs)
[Submitted on 24 Dec 2025 (v1), last revised 16 Feb 2026 (this version, v2)]

Title: From GNNs to Symbolic Surrogates via Kolmogorov-Arnold Networks for Delay Prediction
Authors: Sami Marouani, Kamal Singh, Baptiste Jeudy, Amaury Habrard

Abstract: Accurate prediction of flow delay is essential for optimizing and managing modern communication networks. We investigate three levels of modeling for this task. First, we implement a heterogeneous GNN with attention-based message passing, establishing a strong neural baseline. Second, we propose FlowKANet, in which Kolmogorov-Arnold Networks replace standard MLP layers, reducing trainable parameters while maintaining competitive predictive performance. FlowKANet integrates KAMP-Attn (Kolmogorov-Arnold Message Passing with Attention), embedding KAN operators directly into message-passing and attention computation. Finally, we distill the model into symbolic surrogate models using block-wise regression, producing closed-form equations that eliminate trainable weights while preserving graph-structured dependencies. The results show that KAN layers provide a favorable trade-off between efficiency and accuracy, and that symbolic surrogates emphasize the potential for lightweight deployment and enhanced transparency.
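The block-wise distillation step described in the abstract can be pictured with a toy regression: sample a trained block's input-output behavior and fit a closed-form expression to it by least squares. Everything below (the stand-in block, the polynomial basis, the degree) is an illustrative assumption, not the paper's actual procedure or equations.

```python
import numpy as np

# Stand-in for one trained network block whose behavior we want to distill.
def target_block(x):
    return np.tanh(1.5 * x) + 0.3 * x ** 2

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=500)
y = target_block(x)

# Block-wise regression: least-squares fit of a degree-5 polynomial,
# yielding a closed-form surrogate with no trainable weights at run time.
degree = 5
X = np.vander(x, degree + 1, increasing=True)  # columns [1, x, x^2, ...]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def surrogate(x):
    # np.polyval expects highest-degree coefficient first
    return np.polyval(coef[::-1], x)

err = np.max(np.abs(surrogate(x) - y))
print(round(err, 3))
```

Once each block is replaced this way, inference reduces to evaluating fixed closed-form equations, which is the lightweight-deployment and transparency benefit the paper highlights.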
