[2604.03154] DSBD: Dual-Aligned Structural Basis Distillation for Graph Domain Adaptation
Computer Science > Machine Learning
arXiv:2604.03154 (cs)
[Submitted on 3 Apr 2026]

Title: DSBD: Dual-Aligned Structural Basis Distillation for Graph Domain Adaptation
Authors: Yingxu Wang, Kunyu Zhang, Jiaxin Huang, Mengzhu Wang, Mingyan Xiao, Siyang Gao, Nan Yin

Abstract: Graph domain adaptation (GDA) aims to transfer knowledge from a labeled source graph to an unlabeled target graph under distribution shifts. However, existing methods are largely feature-centric and overlook structural discrepancies, which become particularly detrimental under significant topology shifts. Such discrepancies alter both geometric relationships and spectral properties, leading to unreliable transfer of graph neural networks (GNNs). To address this limitation, we propose Dual-Aligned Structural Basis Distillation (DSBD) for GDA, a novel framework that explicitly models and adapts cross-domain structural variation. DSBD constructs a differentiable structural basis by synthesizing continuous probabilistic prototype graphs, enabling gradient-based optimization over graph topology. The basis is learned under source-domain supervision to preserve semantic discriminability, while being explicitly aligned to the target domain through a dual-alignment objective. Specifically, geometric consistency is enforced via permutation-invari...
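The abstract's core idea of "continuous probabilistic prototype graphs, enabling gradient-based optimization over graph topology" can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): edge logits Z parameterize a probabilistic adjacency matrix A = sigmoid(Z), so a structural objective becomes differentiable with respect to the topology itself. Here a toy objective (matching a target expected edge count) stands in for DSBD's actual source-supervised and dual-alignment losses, which are not specified in the abstract.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n = 5
Z = rng.normal(size=(n, n))          # learnable edge logits (the "prototype graph")
Z = (Z + Z.T) / 2.0                  # symmetrize: undirected graph
np.fill_diagonal(Z, -10.0)           # suppress self-loops

target_edges = 6.0                   # toy structural statistic to match
lr = 0.5
losses = []
for _ in range(50):
    A = sigmoid(Z)                   # continuous probabilistic adjacency
    m = A.sum() / 2.0                # expected edge count (undirected)
    loss = (m - target_edges) ** 2
    losses.append(loss)
    # analytic gradient: dL/dZ_ij = (m - target) * A_ij * (1 - A_ij)
    grad = (m - target_edges) * A * (1.0 - A)
    Z = Z - lr * grad
    np.fill_diagonal(Z, -10.0)

print(f"topology loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Gradient descent drives the relaxed adjacency toward the target structural statistic; in the paper's setting the same mechanism would presumably be driven by source-label supervision and the geometric/spectral alignment terms instead of this toy edge-count loss.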