[2603.27987] Beyond Dataset Distillation: Lossless Dataset Concentration via Diffusion-Assisted Distribution Alignment
Computer Science > Computer Vision and Pattern Recognition
arXiv:2603.27987 (cs) [Submitted on 30 Mar 2026]
Title: Beyond Dataset Distillation: Lossless Dataset Concentration via Diffusion-Assisted Distribution Alignment
Authors: Tongfei Liu, Yufan Liu, Bing Li, Weiming Hu
Abstract: The high cost of and limited access to large datasets hinder the development of large-scale visual recognition systems. Dataset Distillation addresses these problems by synthesizing compact surrogate datasets that enable efficient training, storage, transfer, and privacy preservation. Existing state-of-the-art diffusion-based dataset distillation methods face three issues: a lack of theoretical justification, poor efficiency when scaling to high data volumes, and failure in data-free scenarios. To address these issues, we establish a theoretical framework that justifies the use of diffusion models by proving the equivalence between dataset distillation and distribution matching, and that reveals an inherent efficiency limit in the dataset distillation paradigm. We then propose a Dataset Concentration (DsCo) framework that uses a diffusion-based Noise-Optimization (NOpt) method to synthesize a small yet representative set of samples, and optionally augments the synthetic data via "Doping", which ...
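The abstract describes optimizing diffusion noise inputs so that the synthesized samples match the real data distribution. The paper's NOpt method is not detailed here, so the following is only a minimal, hedged sketch of the general idea: treat the noise vectors fed into a frozen generator as learnable, and descend on a distribution-matching loss (here, the simplest possible one, matching feature means). The linear generator `W`, the dimensions, and the mean-matching loss are all illustrative assumptions, not the paper's actual components.

```python
import numpy as np

# Sketch (assumed, not the paper's implementation): noise optimization against
# a frozen generator via distribution matching. A linear map G(z) = W @ z
# stands in for a frozen diffusion model; the loss matches the mean of the
# generated features to the real data's feature mean.

rng = np.random.default_rng(0)
d_noise, d_feat, n_syn = 8, 16, 4            # noise dim, feature dim, synthetic set size

W = rng.standard_normal((d_feat, d_noise))   # frozen "generator" weights (illustrative)
mu_real = rng.standard_normal(d_feat)        # target feature mean of the real dataset

Z = rng.standard_normal((n_syn, d_noise))    # learnable noise vectors

def loss(Z):
    # distribution-matching loss: squared distance between feature means
    diff = (Z @ W.T).mean(axis=0) - mu_real
    return float(diff @ diff)

lr = 0.01
history = [loss(Z)]
for _ in range(200):
    diff = (Z @ W.T).mean(axis=0) - mu_real  # residual of the feature means
    grad = (2.0 / n_syn) * (diff @ W)        # dL/dz_i, identical for every row of Z
    Z = Z - lr * grad                        # gradient step on the noise, not on W
    history.append(loss(Z))

print(f"loss: {history[0]:.3f} -> {history[-1]:.3f}")
```

Only the noise `Z` is updated while the generator stays frozen, which mirrors why such methods scale poorly with data volume: the optimization cost grows with the number of synthetic samples being refined.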