[2602.13075] Unified Multi-Domain Graph Pre-training for Homogeneous and Heterogeneous Graphs via Domain-Specific Expert Encoding
Summary
This paper presents a unified approach to graph pre-training that integrates both homogeneous and heterogeneous graphs, addressing the limitation that existing methods are designed for only one of the two graph types.
Why It Matters
Graph pre-training provides transferable representations that boost downstream task performance. Current methodologies, however, treat homogeneous and heterogeneous graphs separately, while real-world applications routinely mix both graph types. By proposing a unified framework, the study improves the adaptability and effectiveness of graph-based models across diverse domains.
Key Takeaways
- Introduces a unified multi-domain graph pre-training method ($GPH^{2}$) for both homogeneous and heterogeneous graphs.
- Demonstrates that a balanced mixture of graph types improves performance on downstream tasks.
- Proposes domain-specific expert encoding to mitigate cross-domain discrepancies.
- Implements a Task-oriented Expert Fusion Strategy to leverage the strengths of multiple experts.
- Shows significant performance improvements over existing graph pre-training methods in mixed graph scenarios.
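The expert encoding and fusion ideas in the takeaways above can be illustrated with a minimal mixture-of-experts sketch: each graph domain gets its own expert encoder, and a gating network produces task-dependent weights for fusing their outputs. All names and shapes here are illustrative assumptions, not the paper's actual architecture; simple linear maps stand in for the domain-specific expert encoders.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical stand-ins: each "expert" is a linear encoder for one graph
# domain (e.g. one for homogeneous graphs, one for heterogeneous graphs).
num_experts, in_dim, out_dim = 3, 8, 4
experts = [rng.normal(size=(in_dim, out_dim)) for _ in range(num_experts)]

# Task-oriented gating (illustrative): a linear scorer maps input features
# to per-expert logits; softmax turns them into fusion weights.
gate_w = rng.normal(size=(in_dim, num_experts))

def fuse(x):
    weights = softmax(x @ gate_w)                 # (num_experts,)
    outputs = np.stack([x @ W for W in experts])  # (num_experts, out_dim)
    return weights @ outputs                      # weighted expert mixture

x = rng.normal(size=in_dim)
z = fuse(x)
print(z.shape)  # fused representation: (4,)
```

In this sketch the gate is input-conditioned, so different inputs (or downstream tasks) can lean on different experts, which is the intuition behind fusing domain-specific encoders rather than forcing one shared encoder across domains.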
Computer Science > Machine Learning
arXiv:2602.13075 (cs) [Submitted on 13 Feb 2026]
Title: Unified Multi-Domain Graph Pre-training for Homogeneous and Heterogeneous Graphs via Domain-Specific Expert Encoding
Authors: Chundong Liang, Yongqi Huang, Dongxiao He, Peiyuan Li, Yawen Li, Di Jin, Weixiong Zhang
Abstract: Graph pre-training has achieved remarkable success in recent years, delivering transferable representations for downstream adaptation. However, most existing methods are designed for either homogeneous or heterogeneous graphs, thereby hindering unified graph modeling across diverse graph types. This separation contradicts real-world applications, where mixed homogeneous and heterogeneous graphs are ubiquitous, and distribution shifts between upstream pre-training and downstream deployment are common. In this paper, we empirically demonstrate that a balanced mixture of homogeneous and heterogeneous graph pre-training benefits downstream tasks and propose a unified multi-domain \textbf{G}raph \textbf{P}re-training method across \textbf{H}omogeneous and \textbf{H}eterogeneous graphs ($\mathbf{GPH^{2}}$). To address the lack of a unified encoder for homogeneous and heterogeneous graphs, we propose a Unified Multi-View Graph Construction that simultaneously e...