[2602.17934] Causal Neighbourhood Learning for Invariant Graph Representations
Summary
The paper presents Causal Neighbourhood Learning (CNL-GNN), a framework that improves Graph Neural Networks (GNNs) by suppressing spurious correlations in graph data through causal interventions on graph structure, enhancing robustness and generalization across graphs drawn from different distributions.
Why It Matters
This research is significant because traditional GNNs struggle with noisy data and spurious connections, which limit how well they generalize. By focusing on causal relationships instead, the proposed CNL-GNN framework could yield more accurate predictions and better performance in real-world applications where the test-time graph distribution differs from training.
Key Takeaways
- CNL-GNN improves GNNs by focusing on causal relationships.
- The framework reduces spurious influences through counterfactual neighbourhoods.
- It enhances model robustness under distribution shifts.
- CNL-GNN outperforms state-of-the-art GNN models in experiments.
- The approach combines structural interventions with causal feature disentanglement.
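The core mechanism named in the takeaways, a learnable importance mask that down-weights spurious edges during aggregation, can be illustrated with a small sketch. This is not the authors' implementation: the function name `masked_aggregate`, the sigmoid masking, and the mean-style normalisation are all illustrative assumptions about how such a mask could gate message passing.

```python
import math

def masked_aggregate(x, edges, edge_scores):
    """One message-passing step where each edge's contribution is scaled
    by a sigmoid 'importance mask' (hypothetical sketch, not CNL-GNN itself).

    x           : list of node feature vectors (list of lists of float)
    edges       : list of (src, dst) directed edges
    edge_scores : learnable logits, one per edge; sigmoid maps them to [0, 1]
    """
    n, d = len(x), len(x[0])
    mask = [1.0 / (1.0 + math.exp(-s)) for s in edge_scores]
    out = [[0.0] * d for _ in range(n)]
    weight_sum = [0.0] * n
    for (u, v), m in zip(edges, mask):
        for k in range(d):
            out[v][k] += m * x[u][k]   # spurious edges (m -> 0) barely contribute
        weight_sum[v] += m
    for v in range(n):
        if weight_sum[v] > 0:          # normalise by total mask weight
            out[v] = [c / weight_sum[v] for c in out[v]]
    return out
```

With a strongly positive score on one edge and a strongly negative score on another, the aggregated representation is dominated by the "causal" neighbour, which is the intuition behind masking out spurious influences before aggregation amplifies them.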
Computer Science > Machine Learning · arXiv:2602.17934 (cs)
Submitted on 20 Feb 2026
Title: Causal Neighbourhood Learning for Invariant Graph Representations
Authors: Simi Job, Xiaohui Tao, Taotao Cai, Haoran Xie, Jianming Yong
Abstract: Graph data often contain noisy and spurious correlations that mask the true causal relationships, which are essential for enabling graph models to make predictions based on the underlying causal structure of the data. Dependence on spurious connections makes it challenging for traditional Graph Neural Networks (GNNs) to generalize effectively across different graphs. Furthermore, traditional aggregation methods tend to amplify these spurious patterns, limiting model robustness under distribution shifts. To address these issues, we propose Causal Neighbourhood Learning with Graph Neural Networks (CNL-GNN), a novel framework that performs causal interventions on graph structure. CNL-GNN effectively identifies and preserves causally relevant connections and reduces spurious influences through the generation of counterfactual neighbourhoods and adaptive edge perturbation guided by learnable importance masking and an attention-based mechanism. In addition, by combining structural-level interventions with the disentanglement of causal features from confounding factors, the model learns invari...
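The abstract's "counterfactual neighbourhoods" can be pictured as asking: what would a node's representation look like if its low-importance (presumed spurious) edges were removed? The sketch below is a loose illustration of that idea, not the paper's method: the threshold, the sigmoid importance scores, and the plain mean aggregation are all assumptions made for the example.

```python
import math

def counterfactual_edges(edges, scores, thresh=0.5):
    """Keep only edges whose sigmoid importance clears a threshold —
    a crude stand-in for counterfactual neighbourhood generation."""
    sig = lambda s: 1.0 / (1.0 + math.exp(-s))
    return [e for e, s in zip(edges, scores) if sig(s) >= thresh]

def mean_neighbours(x, edges, n):
    """Plain mean aggregation over incoming neighbours (illustrative)."""
    d = len(x[0])
    out = [[0.0] * d for _ in range(n)]
    deg = [0] * n
    for u, v in edges:
        for k in range(d):
            out[v][k] += x[u][k]
        deg[v] += 1
    return [[c / deg[v] for c in row] if deg[v] else row
            for v, row in enumerate(out)]
```

Comparing a node's representation on the full graph versus the counterfactual one reveals how much it depended on the dropped edges; an invariant representation, in the paper's sense, is one that changes little when only spurious edges are perturbed.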