[2604.00399] A Cross-graph Tuning-free GNN Prompting Framework
Computer Science > Machine Learning
arXiv:2604.00399 (cs) [Submitted on 1 Apr 2026]

Title: A Cross-graph Tuning-free GNN Prompting Framework
Authors: Yaqi Chen, Shixun Huang, Ryan Twemlow, Lei Wang, John Le, Sheng Wang, Willy Susilo, Jun Yan, Jun Shen

Abstract: GNN prompting aims to adapt models across tasks and graphs without extensive retraining. However, most existing graph prompt methods still require task-specific parameter updates and struggle to generalize across graphs, which limits their performance and undermines the core promise of prompting. In this work, we introduce the Cross-graph Tuning-free Prompting framework (CTP), which supports both homogeneous and heterogeneous graphs and can be deployed directly on unseen graphs without further parameter tuning, yielding a plug-and-play GNN inference engine. Extensive experiments on few-shot prediction tasks show that, compared to state-of-the-art methods, CTP achieves an average accuracy gain of 30.8% and a maximum gain of 54%, confirming its effectiveness and offering a new perspective on graph prompt learning.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2604.00399 [cs.LG] (or arXiv:2604.00399v1 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2604.00399
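The abstract does not describe CTP's actual algorithm, so the following is only a minimal sketch of what "tuning-free" graph prompting can mean in general: a frozen, parameter-free embedding step (here, plain mean-neighborhood aggregation standing in for a pretrained GNN) combined with few-shot class prototypes that act as prompts, so that prediction on a new graph requires no gradient updates. All names (`embed`, `prototype_predict`) and the prototype-based design are illustrative assumptions, not the paper's method.

```python
import math

def embed(adj, feats, hops=2):
    """Frozen stand-in for a pretrained GNN: mean aggregation over the
    node itself and its neighbors, repeated for `hops` rounds.
    No learnable parameters, so nothing is ever tuned."""
    h = [list(f) for f in feats]
    for _ in range(hops):
        nxt = []
        for i, nbrs in enumerate(adj):
            group = [h[i]] + [h[j] for j in nbrs]
            nxt.append([sum(col) / len(group) for col in zip(*group)])
        h = nxt
    return h

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def prototype_predict(adj, feats, support, query):
    """Few-shot, tuning-free node classification.
    support: {label: [node ids]} — the few labeled examples.
    Class prototypes (mean embeddings of support nodes) play the role
    of prompts; the query is assigned to the nearest prototype."""
    h = embed(adj, feats)
    protos = {
        lbl: [sum(col) / len(ids) for col in zip(*(h[i] for i in ids))]
        for lbl, ids in support.items()
    }
    return max(protos, key=lambda lbl: cosine(h[query], protos[lbl]))
```

Because both steps are gradient-free, the same code runs unchanged on any unseen graph, which is the plug-and-play property the abstract highlights; CTP's reported gains come from its own (undisclosed here) prompting mechanism, not from this toy aggregation.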