[2602.14919] BHyGNN+: Unsupervised Representation Learning for Heterophilic Hypergraphs
Summary
BHyGNN+ is a self-supervised learning framework for heterophilic hypergraphs that learns representations without labeled data and outperforms existing methods on benchmark datasets.
Why It Matters
This research addresses the limitations of existing hypergraph neural networks that rely on labeled data, which is often scarce. By proposing a self-supervised approach, it opens new avenues for effective learning in complex networks, making it relevant for real-world applications where annotations are costly or unavailable.
Key Takeaways
- BHyGNN+ leverages hypergraph duality for unsupervised learning.
- It eliminates the need for negative samples in contrastive learning.
- Extensive experiments show superior performance on benchmark datasets.
- The framework is applicable to both heterophilic and homophilic hypergraphs.
- This research establishes a new paradigm for representation learning in unlabeled hypergraphs.
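The duality the takeaways refer to, interchanging the roles of nodes and hyperedges, has a simple concrete form: in incidence-matrix notation, the dual hypergraph is the transpose of the incidence matrix. A minimal sketch (the toy matrix below is illustrative, not from the paper):

```python
import numpy as np

# Incidence matrix H: rows are nodes, columns are hyperedges.
# H[v, e] = 1 if node v belongs to hyperedge e.
H = np.array([
    [1, 0],  # node 0 lies in hyperedge 0
    [1, 1],  # node 1 lies in hyperedges 0 and 1
    [0, 1],  # node 2 lies in hyperedge 1
])

# The dual hypergraph swaps the two roles: each hyperedge becomes a
# node and each node becomes a hyperedge, i.e. the transpose of H.
H_dual = H.T

print(H_dual.shape)  # (2, 3): 2 dual nodes, 3 dual hyperedges
```

Because transposition is an involution, taking the dual twice recovers the original hypergraph, which is what makes the primal and dual views natural partners for contrastive training.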
Computer Science > Machine Learning
arXiv:2602.14919 (cs) [Submitted on 16 Feb 2026]
Title: BHyGNN+: Unsupervised Representation Learning for Heterophilic Hypergraphs
Authors: Tianyi Ma, Yiyue Qian, Zehong Wang, Zheyuan Zhang, Chuxu Zhang, Yanfang Ye
Abstract: Hypergraph Neural Networks (HyGNNs) have demonstrated remarkable success in modeling higher-order relationships among entities. However, their performance often degrades on heterophilic hypergraphs, where nodes connected by the same hyperedge tend to have dissimilar semantic representations or belong to different classes. While several HyGNNs, including our prior work BHyGNN, have been proposed to address heterophily, their reliance on labeled data significantly limits their applicability in real-world scenarios where annotations are scarce or costly. To overcome this limitation, we introduce BHyGNN+, a self-supervised learning framework that extends BHyGNN for representation learning on heterophilic hypergraphs without requiring ground-truth labels. The core idea of BHyGNN+ is hypergraph duality, a structural transformation where the roles of nodes and hyperedges are interchanged. By contrasting augmented views of a hypergraph against its dual using cosine similarity, our framework captures essential structural patterns in a fully unsupervised manner. Notably, this duality...
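The abstract says the framework contrasts augmented views against the dual using cosine similarity, and the takeaways note that no negative samples are needed. A hypothetical sketch of such a negative-free alignment objective follows; the function name, shapes, and toy inputs are illustrative assumptions, not the paper's actual API:

```python
import numpy as np

def cosine_alignment_loss(z1, z2, eps=1e-8):
    """Negative mean cosine similarity between paired embeddings.

    z1, z2: (n, d) arrays of embeddings for the same n entities under
    two views (e.g. an augmented hypergraph and its dual). The loss is
    minimized by aligning each pair, so no negative samples are used.
    """
    # L2-normalize each row; eps guards against zero-norm rows.
    z1 = z1 / (np.linalg.norm(z1, axis=1, keepdims=True) + eps)
    z2 = z2 / (np.linalg.norm(z2, axis=1, keepdims=True) + eps)
    # Row-wise dot products of unit vectors = cosine similarities.
    return -np.mean(np.sum(z1 * z2, axis=1))

# Identical views reach the minimum loss of -1 (up to floating point).
z = np.random.default_rng(0).normal(size=(4, 8))
print(cosine_alignment_loss(z, z))  # approximately -1.0
```

Purely alignment-based objectives like this normally need an extra mechanism (e.g. an asymmetric predictor or stop-gradient) to avoid representation collapse; the sketch shows only the similarity term the abstract describes.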