[2603.00454] Rooted Absorbed Prefix Trajectory Balance with Submodular Replay for GFlowNet Training
Computer Science > Machine Learning

arXiv:2603.00454 (cs) [Submitted on 28 Feb 2026]

Title: Rooted Absorbed Prefix Trajectory Balance with Submodular Replay for GFlowNet Training

Authors: Xi Wang, Wenbo Lu, Shengjie Wang

Abstract: Generative Flow Networks (GFlowNets) enable fine-tuning large language models to approximate reward-proportional posteriors, but they remain prone to mode collapse, manifesting as prefix collapse and length bias. We attribute this to two factors: (i) weak credit assignment to early prefixes, and (ii) biased replay that induces a shifted, non-representative training flow distribution. We propose Rooted absorbed prefix Trajectory Balance (RapTB), an objective that anchors subtrajectory supervision at the root and propagates terminal rewards to intermediate prefixes via absorbed suffix-based backups, providing dense prefix-level learning signals. To mitigate replay-induced distribution shift, we further introduce SubM, a submodular replay refresh strategy that promotes both high reward and diversity. Empirically, on tasks such as molecule generation with LLMs using SMILES strings, RapTB combined with SubM consistently improves optimization performance and molecular diversity while preserving high validity.

Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI)
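The abstract does not give the RapTB objective itself, but it is described as a variant of the standard GFlowNet trajectory balance (TB) loss, which may help ground the terminology. A minimal sketch of vanilla TB for one complete trajectory is below; `log_Z`, the per-step forward/backward log-probabilities, and the log-reward are assumed inputs, and the root-anchored, absorbed-suffix backups of RapTB are not modeled here:

```python
def tb_loss(log_Z, log_pf, log_pb, log_reward):
    """Squared trajectory-balance residual for one complete trajectory.

    log_pf / log_pb: lists of per-step forward/backward log-probabilities
    along the trajectory. The residual is zero when the sampled flow
    matches the reward-proportional target:
        log Z + sum log P_F = log R + sum log P_B.
    """
    residual = log_Z + sum(log_pf) - log_reward - sum(log_pb)
    return residual ** 2
```

A trajectory sampled with total forward probability equal to its normalized reward yields a zero residual, e.g. `tb_loss(0.0, [math.log(0.5)] * 2, [0.0] * 2, math.log(0.25))` is 0.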
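The abstract characterizes SubM only as a "submodular replay refresh strategy that promotes both high reward and diversity"; one common way to realize such an objective is greedy maximization of reward plus a facility-location coverage term, which is monotone submodular. The sketch below is an illustrative instance of that general recipe, not the paper's algorithm; `sim` (a pairwise similarity matrix) and the trade-off weight `lam` are assumed quantities:

```python
def greedy_refresh(candidates, rewards, sim, k, lam=1.0):
    """Greedily select k replay items balancing reward and diversity.

    The diversity term is a facility-location function: the sum over all
    candidates of their max similarity to any selected item. Each greedy
    step adds the item with the largest marginal gain
        reward[j] + lam * (increase in total coverage).
    """
    n = len(candidates)
    selected = []
    coverage = [0.0] * n  # best similarity of each item to the selection so far
    for _ in range(min(k, n)):
        best_gain, best_j = None, None
        for j in range(n):
            if j in selected:
                continue
            gain = rewards[j] + lam * sum(
                max(0.0, sim[i][j] - coverage[i]) for i in range(n)
            )
            if best_gain is None or gain > best_gain:
                best_gain, best_j = gain, j
        selected.append(best_j)
        for i in range(n):
            coverage[i] = max(coverage[i], sim[i][best_j])
    return [candidates[j] for j in selected]
```

With two near-duplicate high-reward items and one distinct low-reward item, the greedy rule picks one of the duplicates and then the distinct item, trading a little reward for coverage, which matches the stated goal of keeping the replay buffer both high-reward and diverse.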