[2602.18084] Balancing Symmetry and Efficiency in Graph Flow Matching
Summary
This paper studies the trade-off between permutation symmetry and training efficiency in graph flow matching, proposing a controllable symmetry-modulation scheme that relaxes equivariance during training to accelerate convergence while delaying overfitting.
Why It Matters
Understanding the balance between symmetry and efficiency in graph generative models is crucial for optimizing their performance. These findings can guide how equivariant generative models are trained in practice, trading strict architectural symmetry for faster, cheaper convergence without sacrificing generation quality.
Key Takeaways
- Equivariance in graph models ensures permutation symmetry but can increase computational costs.
- Relaxing equivariance during training can accelerate convergence but may lead to overfitting.
- A controllable symmetry modulation scheme can balance efficiency and performance.
- Proper modulation can reduce training epochs while maintaining model integrity.
- With proper modulation, the model reaches stronger performance using only 19% of the baseline training epochs.
Computer Science > Machine Learning
arXiv:2602.18084 (cs)
[Submitted on 20 Feb 2026]
Title: Balancing Symmetry and Efficiency in Graph Flow Matching
Authors: Benjamin Honoré, Alba Carballo-Castro, Yiming Qin, Pascal Frossard
Abstract: Equivariance is central to graph generative models, as it ensures the model respects the permutation symmetry of graphs. However, strict equivariance can increase computational cost due to added architectural constraints, and can slow down convergence because the model must be consistent across a large space of possible node permutations. We study this trade-off for graph generative models. Specifically, we start from an equivariant discrete flow-matching model, and relax its equivariance during training via a controllable symmetry modulation scheme based on sinusoidal positional encodings and node permutations. Experiments first show that symmetry-breaking can accelerate early training by providing an easier learning signal, but at the expense of encouraging shortcut solutions that can cause overfitting, where the model repeatedly generates graphs that are duplicates of the training set. On the contrary, properly modulating the symmetry signal can delay overfitting while accelerating convergence, allowing the model to reach stronger performance with 19% of the baseline training epochs....
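The core mechanism in the abstract — breaking permutation symmetry by a controllable amount via sinusoidal positional encodings — can be illustrated with a minimal sketch. The function names, the feature layout, and the single scalar modulation factor `lam` below are assumptions for illustration, not the authors' implementation; the paper's actual scheme also involves node permutations, which this sketch omits.

```python
import math


def sinusoidal_pe(pos, dim):
    """Standard transformer-style sinusoidal positional encoding for one index."""
    return [
        math.sin(pos / 10000 ** (2 * (i // 2) / dim)) if i % 2 == 0
        else math.cos(pos / 10000 ** (2 * (i // 2) / dim))
        for i in range(dim)
    ]


def modulated_features(node_feats, lam):
    """Add lam-scaled positional encodings to node features.

    lam = 0: no node-identity signal is injected, so the input (and hence a
             permutation-equivariant network applied to it) stays symmetric.
    lam = 1: node indices are fully visible, breaking permutation symmetry.
    Intermediate lam values interpolate, modulating how much symmetry is broken.
    (Hypothetical sketch; not the paper's actual training pipeline.)
    """
    dim = len(node_feats[0])
    return [
        [x + lam * p for x, p in zip(feat, sinusoidal_pe(i, dim))]
        for i, feat in enumerate(node_feats)
    ]
```

Annealing `lam` over training would be one way to realize the paper's idea of an easier early learning signal that is withdrawn before the model can exploit node-identity shortcuts and overfit.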