[2506.07198] GGBall: Graph Generative Model on Poincaré Ball
Summary
The paper introduces GGBall, a graph generative model built on the Poincaré ball of hyperbolic geometry, designed to better capture and preserve the hierarchical structure of graphs during generation.
Why It Matters
This research addresses the limitations of traditional Euclidean geometry in modeling complex data structures. By leveraging hyperbolic geometry, GGBall offers a new framework that can better capture the intricacies of hierarchical data, which is crucial for advancements in machine learning and data science applications.
Key Takeaways
- GGBall integrates hyperbolic geometry with generative models to improve graph generation.
- The model reduces degree MMD by over 75% on Community-Small and over 40% on Ego-Small, outperforming state-of-the-art baselines.
- Hyperbolic Vector-Quantized Autoencoder (HVQVAE) is a key component of GGBall.
- The research highlights the potential of hyperbolic geometry in complex data modeling.
- Empirical results demonstrate improved preservation of topological hierarchies.
Computer Science > Machine Learning
arXiv:2506.07198 (cs)
[Submitted on 8 Jun 2025 (v1), last revised 19 Feb 2026 (this version, v2)]
Title: GGBall: Graph Generative Model on Poincaré Ball
Authors: Tianci Bu, Chuanrui Wang, Hao Ma, Haoren Zheng, Xin Lu, Tailin Wu
Abstract: Generating graphs with hierarchical structures remains a fundamental challenge due to the limitations of Euclidean geometry in capturing exponential complexity. Here we introduce GGBall, a novel hyperbolic framework for graph generation that integrates geometric inductive biases with modern generative paradigms. GGBall combines a Hyperbolic Vector-Quantized Autoencoder (HVQVAE) with a Riemannian flow matching prior defined via closed-form geodesics. This design enables flow-based priors to model complex latent distributions, while vector quantization helps preserve the curvature-aware structure of the hyperbolic space. We further develop a suite of hyperbolic GNN and Transformer layers that operate entirely within the manifold, ensuring stability and scalability. Empirically, our model reduces degree MMD by over 75% on Community-Small and over 40% on Ego-Small compared to state-of-the-art baselines, demonstrating an improved ability to preserve topological hierarchies. These results highlight the potential of hyperbolic geometry as a powerful foundation f...
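The closed-form geodesics the abstract relies on for the flow matching prior can be written with Möbius operations on the Poincaré ball. Below is a minimal numpy sketch, assuming curvature fixed at -1; the helper names are illustrative and not taken from the paper's code:

```python
import numpy as np

def mobius_add(x, y):
    """Möbius addition x ⊕ y on the Poincaré ball (curvature -1)."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    den = 1 + 2 * xy + x2 * y2
    return num / den

def mobius_scalar(t, v):
    """Möbius scalar multiplication t ⊗ v."""
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return np.zeros_like(v)
    return np.tanh(t * np.arctanh(norm)) * v / norm

def geodesic(x, y, t):
    """Closed-form geodesic from x to y, evaluated at t in [0, 1]."""
    return mobius_add(x, mobius_scalar(t, mobius_add(-x, y)))
```

By the left-cancellation law of Möbius addition, the curve satisfies γ(0) = x and γ(1) = y exactly, which is what makes conditional flow matching targets cheap to compute: interpolation points along the prior's probability path come from this closed form rather than from a numerical ODE solve.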