[2602.12613] Coden: Efficient Temporal Graph Neural Networks for Continuous Prediction

arXiv - Machine Learning · 3 min read

Summary

The paper introduces Coden, an efficient Temporal Graph Neural Network (TGNN) model designed for continuous predictions, overcoming limitations of existing TGNNs in dynamic graph scenarios.

Why It Matters

Coden addresses the growing need for continuous predictions in dynamic graph applications, which are critical in fields like social network analysis and real-time recommendation systems. By improving efficiency and effectiveness, it enhances the applicability of TGNNs in practical scenarios.

Key Takeaways

  • Coden is designed for continuous predictions in dynamic graphs.
  • It overcomes complexity bottlenecks found in existing TGNNs.
  • The model maintains predictive accuracy while improving efficiency.
  • Theoretical analyses support Coden's effectiveness and efficiency.
  • Coden outperforms existing benchmarks across five dynamic datasets.

Computer Science > Machine Learning · arXiv:2602.12613 (cs) · Submitted on 13 Feb 2026

Title: Coden: Efficient Temporal Graph Neural Networks for Continuous Prediction

Authors: Zulun Zhu, Siqiang Luo

Abstract: Temporal Graph Neural Networks (TGNNs) are pivotal in processing dynamic graphs. However, existing TGNNs primarily target one-time predictions for a given temporal span, whereas many practical applications require continuous predictions, i.e., predictions issued frequently over time. Directly adapting existing TGNNs to continuous-prediction scenarios introduces either significant computational overhead or prediction-quality issues, especially on large graphs. This paper revisits the challenge of continuous prediction in TGNNs and introduces Coden, a TGNN model designed for efficient and effective learning on dynamic graphs. Coden overcomes the key complexity bottleneck in existing TGNNs while preserving comparable predictive accuracy. Moreover, we provide theoretical analyses that substantiate the effectiveness and efficiency of Coden and clarify its duality relationship with both RNN-based and attention-based models. Our evaluations across five dynamic datasets show that Coden surpasses existing performance benchmarks in both efficiency and effectiveness.
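The abstract contrasts one-time prediction over a temporal span with continuous prediction, where a score must be issued after every incoming event. The article does not describe Coden's architecture, so the following is only a toy sketch of the *setting*: a stream of timestamped edges, a per-node scalar "memory" with exponential time decay (an assumption for illustration, loosely inspired by memory-based TGNNs), and a prediction emitted at every event rather than once at the end.

```python
from collections import defaultdict
import math

def continuous_predictions(events, decay=0.1):
    """Toy continuous-prediction loop over a stream of timestamped edges.

    Each node keeps one scalar 'memory' (an exponentially decayed
    interaction count). After every event a score for the interacting
    pair is issued immediately, illustrating continuous prediction.
    This is an illustration of the problem setting, not Coden's method.
    """
    memory = defaultdict(float)     # node -> decayed activity level
    last_seen = defaultdict(float)  # node -> timestamp of last update
    scores = []
    for (u, v, t) in events:        # events assumed sorted by timestamp t
        for node in (u, v):
            # decay the stored state forward to the current time, then bump it
            memory[node] *= math.exp(-decay * (t - last_seen[node]))
            memory[node] += 1.0
            last_seen[node] = t
        # a prediction is issued right after each event (continuous),
        # instead of once at the end of the temporal span
        scores.append((t, memory[u] * memory[v]))
    return scores

events = [(0, 1, 0.0), (1, 2, 1.0), (0, 2, 2.0), (0, 1, 3.0)]
for t, s in continuous_predictions(events):
    print(f"t={t:.1f} score={s:.3f}")
```

The key point of the sketch is that state is updated incrementally per event; a naive adaptation of a one-time TGNN would instead recompute over the whole history at every prediction step, which is the kind of overhead the paper says it avoids.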

