[2601.22454] Temporal Graph Pattern Machine

arXiv - AI · 4 min read

Summary

The Temporal Graph Pattern Machine (TGPM) proposes a novel framework for temporal graph learning, focusing on generalized evolving patterns to enhance link prediction across dynamic systems.

Why It Matters

Understanding temporal graph patterns is crucial for various applications, including social networks and dynamic systems. TGPM addresses limitations of existing methods by capturing long-range dependencies and enabling better transferability across domains, potentially advancing research and practical applications in machine learning.

Key Takeaways

  • TGPM shifts focus from task-centric methods to learning generalized evolving patterns.
  • It utilizes temporally-biased random walks to capture multi-scale structural semantics.
  • The model incorporates a Transformer backbone for global temporal regularities.
  • Self-supervised pre-training tasks enhance the model's understanding of network evolution.
  • TGPM demonstrates state-of-the-art performance in link prediction tasks.

Computer Science > Machine Learning
arXiv:2601.22454 (cs)
[Submitted on 30 Jan 2026 (v1), last revised 18 Feb 2026 (this version, v2)]

Title: Temporal Graph Pattern Machine
Authors: Yijun Ma, Zehong Wang, Weixiang Sun, Yanfang Ye

Abstract: Temporal graph learning is pivotal for deciphering dynamic systems, where the core challenge lies in explicitly modeling the underlying evolving patterns that govern network transformation. However, prevailing methods are predominantly task-centric and rely on restrictive assumptions -- such as short-term dependency modeling, static neighborhood semantics, and retrospective time usage. These constraints hinder the discovery of transferable temporal evolution mechanisms. To address this, we propose the Temporal Graph Pattern Machine (TGPM), a foundation framework that shifts the focus toward directly learning generalized evolving patterns. TGPM conceptualizes each interaction as an interaction patch synthesized via temporally-biased random walks, thereby capturing multi-scale structural semantics and long-range dependencies that extend beyond immediate neighborhoods. These patches are processed by a Transformer-based backbone designed to capture global temporal regularities while adapting to context-specific interaction dynamics. To further empower the model, we introduce a suite of self-supervised pre-training tasks --...
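To make the "temporally-biased random walk" idea concrete, the following is a minimal sketch of one plausible realization: from the current node, the walker samples a past interaction with probability weighted by recency (here an exponential decay with a hypothetical scale parameter `tau`). This is an illustrative assumption, not TGPM's actual sampling scheme, which the abstract does not specify.

```python
import math
import random

def temporally_biased_walk(edges, start, start_time, length, tau=1.0, seed=0):
    """Sketch of a temporally-biased random walk on a temporal graph.

    edges: dict mapping node -> list of (neighbor, timestamp) interactions.
    Each step moves to a neighbor via an interaction no later than the
    current time, preferring recent interactions via exp(-(dt) / tau).
    Returns the walk as a list of (node, timestamp) pairs.
    """
    rng = random.Random(seed)
    walk = [(start, start_time)]
    node, t_cur = start, start_time
    for _ in range(length):
        # Only interactions that happened at or before the current time
        candidates = [(v, t) for v, t in edges.get(node, []) if t <= t_cur]
        if not candidates:
            break
        # Temporal bias: more recent interactions get larger weights
        weights = [math.exp(-(t_cur - t) / tau) for _, t in candidates]
        node, t_cur = rng.choices(candidates, weights=weights, k=1)[0]
        walk.append((node, t_cur))
    return walk

# Toy temporal graph: node -> [(neighbor, time), ...]
edges = {
    "a": [("b", 1.0), ("c", 3.0)],
    "b": [("a", 1.0), ("c", 2.0)],
    "c": [("a", 3.0), ("b", 2.0)],
}
walk = temporally_biased_walk(edges, "a", start_time=4.0, length=3)
```

Walks sampled this way reach multi-hop neighbors while keeping timestamps non-increasing along the walk, which is one way such "interaction patches" could encode structure beyond the immediate neighborhood.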
