[2602.17893] COMBA: Cross Batch Aggregation for Learning Large Graphs with Context Gating State Space Models

arXiv - Machine Learning · 4 min read

Summary

The paper presents COMBA, a novel approach for learning large graphs using state space models, emphasizing cross batch aggregation and graph context gating to enhance performance.

Why It Matters

As large graph data becomes increasingly prevalent in various applications, efficient learning methods are essential. COMBA addresses the challenges of applying state space models to graph structures, potentially improving computational efficiency and accuracy in machine learning tasks involving large graphs.

Key Takeaways

  • COMBA introduces graph context gating to optimize neighbor aggregation in large graphs.
  • Cross batch aggregation allows for scalable training of graph neural networks (GNNs).
  • Theoretical analysis shows that COMBA reduces error compared to traditional GNN training methods.
  • Experiments demonstrate significant performance improvements over baseline approaches.
  • Public access to code and benchmark datasets will facilitate further research and application.
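To make the "graph context gating" idea above concrete, here is a minimal numpy sketch under stated assumptions: each "context" is one hop of row-normalized neighbor averaging, and a softmax gate (here with uniform logits for illustration) learns how much each hop contributes. The function names and the gating form are illustrative, not the paper's actual formulation.

```python
import numpy as np

def normalize_adj(A):
    # Row-normalize the adjacency so each hop is a mean over neighbors.
    deg = A.sum(axis=1, keepdims=True)
    return A / np.maximum(deg, 1)

def context_gated_aggregation(A, X, gate_logits):
    """Combine multi-hop neighborhood summaries ("graph contexts")
    with softmax gate weights, one weight per hop."""
    A_norm = normalize_adj(A)
    hops = [X]                             # hop 0: the node's own features
    for _ in range(len(gate_logits) - 1):
        hops.append(A_norm @ hops[-1])     # next hop: propagate once more
    gates = np.exp(gate_logits) / np.exp(gate_logits).sum()  # softmax
    return sum(g * h for g, h in zip(gates, hops))

# Tiny 3-node path graph: 0 - 1 - 2, identity features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)
out = context_gated_aggregation(A, X, gate_logits=np.zeros(3))
```

With zero logits the gates are uniform, so the output is the average of the 0-, 1-, and 2-hop summaries; in training, the logits would instead be learned per context.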

Computer Science > Machine Learning — arXiv:2602.17893 (cs) [Submitted on 19 Feb 2026]

Title: COMBA: Cross Batch Aggregation for Learning Large Graphs with Context Gating State Space Models

Authors: Jiajun Shen, Yufei Jin, Yi He, Xingquan Zhu

Abstract: State space models (SSMs) have recently emerged for modeling long-range dependencies in sequence data, at a substantially lower computational cost than modern alternatives such as transformers. Advancing SSMs to graph-structured data, especially large graphs, is a significant challenge because SSMs are sequence models, and the sheer volume of a large graph makes it very expensive to convert it into sequences for effective learning. In this paper, we propose COMBA to tackle large-graph learning using state space models, with two key innovations: graph context gating and cross-batch aggregation. Graph context refers to the different hops of neighborhood around each node, and graph context gating allows COMBA to use this context to learn the best control of neighbor aggregation. For each graph context, COMBA samples nodes as batches and trains a graph neural network (GNN), with information aggregated across batches, allowing COMBA to scale to large graphs. Our theoretical study asserts that cross-batch aggregation guarantees lower error than training a GNN witho...
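The abstract's cross-batch aggregation can be sketched with a simple memory-bank pattern, assuming (this is an assumption, not the paper's stated mechanism) that a running embedding table lets a mini-batch aggregate from neighbors that were processed in earlier batches. `CrossBatchMemory` and its `momentum` parameter are hypothetical names for illustration.

```python
import numpy as np

class CrossBatchMemory:
    """Running embedding table so a mini-batch can aggregate from
    neighbors embedded in earlier batches, not just in-batch ones."""
    def __init__(self, num_nodes, dim, momentum=0.5):
        self.table = np.zeros((num_nodes, dim))
        self.momentum = momentum

    def aggregate(self, batch_nodes, A_norm, X):
        # Aggregation reads the full table, so out-of-batch
        # neighbors still contribute their stored state.
        h = A_norm[batch_nodes] @ self.table + X[batch_nodes]
        # Write back with momentum so the table drifts smoothly.
        self.table[batch_nodes] = (self.momentum * self.table[batch_nodes]
                                   + (1 - self.momentum) * h)
        return h

# Path graph 0 - 1 - 2 with row-normalized adjacency, identity features.
A_norm = np.array([[0.0, 1.0, 0.0],
                   [0.5, 0.0, 0.5],
                   [0.0, 1.0, 0.0]])
X = np.eye(3)
mem = CrossBatchMemory(num_nodes=3, dim=3)
h01 = mem.aggregate(np.array([0, 1]), A_norm, X)  # first batch: nodes 0 and 1
h2 = mem.aggregate(np.array([2]), A_norm, X)      # second batch: node 2
```

In the second batch, node 2's aggregation picks up node 1's stored embedding from the first batch, which is the scalability point: the whole graph never has to fit in one batch.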
