[2602.12613] Coden: Efficient Temporal Graph Neural Networks for Continuous Prediction
Summary
The paper introduces Coden, an efficient Temporal Graph Neural Network (TGNN) model designed for continuous prediction, overcoming the efficiency and accuracy limitations that existing TGNNs face in dynamic graph scenarios.
Why It Matters
Coden addresses the growing need for continuous predictions in dynamic graph applications, which are critical in fields like social network analysis and real-time recommendation systems. By improving efficiency and effectiveness, it enhances the applicability of TGNNs in practical scenarios.
Key Takeaways
- Coden is designed for continuous predictions in dynamic graphs.
- It overcomes complexity bottlenecks found in existing TGNNs.
- The model maintains predictive accuracy while improving efficiency.
- Theoretical analyses support Coden's effectiveness and efficiency.
- Coden outperforms existing benchmarks across five dynamic datasets.
Abstract
Computer Science > Machine Learning · arXiv:2602.12613 (cs) · Submitted on 13 Feb 2026
Title: Coden: Efficient Temporal Graph Neural Networks for Continuous Prediction
Authors: Zulun Zhu, Siqiang Luo
Temporal Graph Neural Networks (TGNNs) are pivotal in processing dynamic graphs. However, existing TGNNs primarily target one-time predictions for a given temporal span, whereas many practical applications require continuous predictions, where predictions are issued frequently over time. Directly adapting existing TGNNs to continuous-prediction scenarios introduces either significant computational overhead or prediction quality issues, especially for large graphs. This paper revisits the challenge of continuous predictions in TGNNs and introduces Coden, a TGNN model designed for efficient and effective learning on dynamic graphs. Coden innovatively overcomes the key complexity bottleneck in existing TGNNs while preserving comparable predictive accuracy. Moreover, we further provide theoretical analyses that substantiate the effectiveness and efficiency of Coden, and clarify its duality relationship with both RNN-based and attention-based models. Our evaluations across five dynamic datasets show that Coden surpasses existing performance benchmarks in both efficiency and effectiveness.
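To make the problem setting concrete, the sketch below contrasts one-time prediction over a temporal span with continuous prediction on a stream of timestamped edges. This illustrates only the scenario the abstract describes, not Coden's method; the toy model and all names (`events`, `predict_degree`) are hypothetical.

```python
# Illustrative sketch of the continuous-prediction setting (not Coden's
# algorithm): a dynamic graph arrives as a stream of timestamped edges.
# A one-time scenario predicts once at the end of a temporal span; a
# continuous scenario issues a prediction after every event.
from collections import defaultdict

# Timestamped edge stream: (source, destination, time)
events = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 3.0), (2, 3, 4.0)]

def predict_degree(adj, node):
    """Toy stand-in for a model: predict a node's activity from its degree."""
    return len(adj[node])

# One-time prediction: consume the whole span, predict once at the end.
adj = defaultdict(set)
for u, v, t in events:
    adj[u].add(v)
    adj[v].add(u)
one_time = predict_degree(adj, 2)

# Continuous prediction: a prediction is issued after every event.
# Naively re-running inference per event multiplies the cost by the
# number of events -- the overhead that motivates efficient TGNN
# designs for this setting.
adj = defaultdict(set)
continuous = []
for u, v, t in events:
    adj[u].add(v)
    adj[v].add(u)
    continuous.append(predict_degree(adj, 2))

print(one_time)    # single prediction after the full span -> 3
print(continuous)  # one prediction per event -> [0, 1, 2, 3]
```

The point of the contrast is that continuous prediction repeats inference at every event, so per-update cost, rather than per-span cost, dominates on large graphs.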