[2603.23562] Synthetic Mixed Training: Scaling Parametric Knowledge Acquisition Beyond RAG
Computer Science > Machine Learning

arXiv:2603.23562 (cs) [Submitted on 24 Mar 2026]

Title: Synthetic Mixed Training: Scaling Parametric Knowledge Acquisition Beyond RAG

Authors: Seungju Han, Konwoo Kim, Chanwoo Park, Benjamin Newman, Suhas Kotha, Jaehun Jung, James Zou, Yejin Choi

Abstract: Synthetic data augmentation helps language models learn new knowledge in data-constrained domains. However, naively scaling existing synthetic data methods, by training on more synthetic tokens or using stronger generators, yields diminishing returns that fall below the performance of RAG. To break this RAG ceiling, we introduce Synthetic Mixed Training, which combines synthetic QAs with synthetic documents. This combination leverages their complementary training signals and enables log-linear improvements as both synthetic data volume and generator strength increase, allowing the model to outperform RAG by a 2.6% relative gain on QuALITY, a long-document reading comprehension benchmark. In addition, we introduce Focal Rewriting, a simple technique for synthetic document generation that explicitly conditions document generation on specific questions, improving the diversity of synthetic documents and yielding a steeper log-linear scaling curve. On...
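The two ideas in the abstract, mixing synthetic QAs with synthetic documents into one training corpus, and conditioning document generation on a specific question, can be illustrated with a minimal sketch. All function names, prompt wording, and the record format below are illustrative assumptions, not taken from the paper, which may use different mixing ratios and formatting:

```python
import random


def focal_rewrite_prompt(source_text: str, question: str) -> str:
    """Sketch of a Focal Rewriting prompt: the generator is asked to
    rewrite a source passage conditioned on a specific question.
    The actual prompt wording in the paper may differ."""
    return (
        "Rewrite the passage below so that it clearly covers the "
        f"information needed to answer this question: {question}\n\n"
        f"Passage:\n{source_text}"
    )


def build_mixed_corpus(synthetic_docs, synthetic_qas, seed=0):
    """Interleave synthetic documents and synthetic QA pairs into a
    single shuffled training corpus (a sketch of the 'mixed' part of
    Synthetic Mixed Training; the paper's exact mixture is not
    specified in the abstract)."""
    examples = []
    for doc in synthetic_docs:
        examples.append({"type": "doc", "text": doc})
    for question, answer in synthetic_qas:
        examples.append(
            {"type": "qa", "text": f"Question: {question}\nAnswer: {answer}"}
        )
    # Seeded shuffle so document- and QA-style examples are interleaved
    # deterministically within one fine-tuning dataset.
    random.Random(seed).shuffle(examples)
    return examples


# Toy usage with hypothetical data.
docs = ["A rewritten passage summarizing the plot of the story."]
qas = [("Who is the narrator?", "The ship's doctor.")]
corpus = build_mixed_corpus(docs, qas)
prompt = focal_rewrite_prompt(docs[0], qas[0][0])
```

The point of the sketch is only that both data types end up in one training set, which is what the abstract credits with the complementary training signals and the log-linear scaling behavior.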