[2512.20885] From GNNs to Symbolic Surrogates via Kolmogorov-Arnold Networks for Delay Prediction
Summary
This paper explores the use of Kolmogorov-Arnold Networks (KANs) for predicting flow delays in communication networks, improving both the efficiency and the transparency of the learned models.
Why It Matters
Accurate flow delay prediction is crucial for optimizing communication networks. This research introduces modeling techniques that balance predictive performance against model size, enabling lightweight and more interpretable deployment of predictive models in real-world networks.
Key Takeaways
- The study implements a heterogeneous GNN with attention for strong baseline performance.
- FlowKANet utilizes KAN layers to reduce parameters while maintaining accuracy.
- Symbolic surrogates derived from the model enhance deployment efficiency and transparency.
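The core building block behind FlowKANet is the KAN layer: instead of a weight matrix followed by a fixed nonlinearity, each edge (i, j) carries its own learnable univariate function phi_ij, and outputs are sums of these functions. The sketch below illustrates the idea with a Gaussian radial-basis parameterization; the basis choice, grid, and sizes are illustrative assumptions, not the paper's implementation (which uses KAN layers inside message passing and attention).

```python
import numpy as np

class KANLayer:
    """Minimal Kolmogorov-Arnold layer sketch.

    Each edge (i, j) carries a learnable univariate function phi_ij,
    here parameterized by coefficients over a shared Gaussian RBF grid
    (KAN implementations typically use B-splines; RBFs keep this sketch
    short). Output j is sum_i phi_ij(x_i) -- no weight matrix, no fixed
    activation.
    """

    def __init__(self, in_dim, out_dim, n_basis=8, rng=None):
        rng = rng or np.random.default_rng(0)
        self.centers = np.linspace(-1.0, 1.0, n_basis)  # shared 1-D grid
        self.width = self.centers[1] - self.centers[0]
        # One coefficient vector per edge: shape (in_dim, out_dim, n_basis).
        self.coef = rng.normal(scale=0.1, size=(in_dim, out_dim, n_basis))

    def forward(self, x):
        # x: (batch, in_dim). Evaluate the RBF basis per input coordinate.
        b = np.exp(-(((x[..., None] - self.centers) / self.width) ** 2))
        # phi_ij(x_i) = sum_k coef[i, j, k] * b[batch, i, k]; then sum over i.
        return np.einsum("bik,ijk->bj", b, self.coef)

layer = KANLayer(in_dim=4, out_dim=2)
y = layer.forward(np.zeros((3, 4)))
print(y.shape)  # (3, 2)
```

Because the nonlinearity lives in the per-edge coefficients rather than in dense weight matrices plus MLP activations, a KAN layer of comparable expressivity can need fewer trainable parameters, which is the trade-off the paper reports.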
Computer Science > Machine Learning
arXiv:2512.20885 (cs)
[Submitted on 24 Dec 2025 (v1), last revised 16 Feb 2026 (this version, v2)]
Title: From GNNs to Symbolic Surrogates via Kolmogorov-Arnold Networks for Delay Prediction
Authors: Sami Marouani, Kamal Singh, Baptiste Jeudy, Amaury Habrard
Abstract: Accurate prediction of flow delay is essential for optimizing and managing modern communication networks. We investigate three levels of modeling for this task. First, we implement a heterogeneous GNN with attention-based message passing, establishing a strong neural baseline. Second, we propose FlowKANet, in which Kolmogorov-Arnold Networks replace standard MLP layers, reducing trainable parameters while maintaining competitive predictive performance. FlowKANet integrates KAMP-Attn (Kolmogorov-Arnold Message Passing with Attention), embedding KAN operators directly into message-passing and attention computation. Finally, we distill the model into symbolic surrogate models using block-wise regression, producing closed-form equations that eliminate trainable weights while preserving graph-structured dependencies. The results show that KAN layers provide a favorable trade-off between efficiency and accuracy, and that symbolic surrogates emphasize the potential for lightweight deployment and enhanced transparency.
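The final distillation step described in the abstract fits each trained block with a closed-form expression. A hedged sketch of block-wise regression, assuming a small hand-picked term library and a plain least-squares fit (the paper's actual library and fitting procedure are not specified here):

```python
import numpy as np

def blockwise_symbolic_fit(block, x_samples):
    """Sketch of block-wise symbolic distillation.

    Mimic one trained block's input-output map with a closed-form
    expression: evaluate a small library of candidate terms on sample
    inputs and solve for coefficients by least squares. The library
    (constant, x, x^2, sin x) is an illustrative assumption.
    """
    y = block(x_samples)  # outputs of the block we want to replace
    library = np.column_stack([
        np.ones_like(x_samples),  # 1
        x_samples,                # x
        x_samples ** 2,           # x^2
        np.sin(x_samples),        # sin(x)
    ])
    coef, *_ = np.linalg.lstsq(library, y, rcond=None)
    names = ["1", "x", "x^2", "sin(x)"]
    expr = " ".join(f"{c:+.3f}*{n}" for c, n in zip(coef, names)
                    if abs(c) > 1e-6)
    return coef, expr

# Hypothetical trained block; its true form lies in the library,
# so the fit recovers it exactly.
x = np.linspace(-2.0, 2.0, 200)
coef, expr = blockwise_symbolic_fit(lambda x: 0.5 * x**2 - 1.0, x)
print(expr)
```

Applied block by block, this replaces every set of trainable weights with a fixed equation, which is what makes the surrogate cheap to deploy and easy to inspect.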