[2603.00233] Scaling Quantum Machine Learning without Tricks: High-Resolution and Diverse Image Generation
Quantum Physics (quant-ph), arXiv:2603.00233
Submitted on 27 Feb 2026

Authors: Jonas Jäger, Florian J. Kiwit, Carlos A. Riofrío

Abstract: Quantum generative modeling is a rapidly evolving discipline at the intersection of quantum computing and machine learning. Contemporary quantum machine learning is generally limited to toy examples or heavily restricted datasets with few elements. This is due not only to the limitations of current quantum hardware but also to the absence of inductive biases arising from application-agnostic designs. Existing quantum solutions must resort to tricks to scale down high-resolution images, such as relying heavily on dimensionality reduction or splitting images into low-resolution patches handled by multiple quantum models. Building on recent developments in classical image loading to quantum computers, we circumvent these limitations and train quantum Wasserstein GANs on the established classical MNIST and Fashion-MNIST datasets. Using the complete datasets, our system generates full-resolution images across all ten classes and establishes new state-of-the-art performance with a single end-to-end quantum generator without tricks. As a proof-of-principle,...
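The Wasserstein GAN objective mentioned in the abstract can be illustrated with a minimal classical sketch. This is not the paper's code: the critic below is a hypothetical stand-in linear scorer with crude Lipschitz control via weight normalization, and the "images" are random 784-dimensional vectors standing in for flattened 28x28 MNIST samples. It only shows the critic loss E[f(real)] - E[f(fake)] that such a generator would be trained against.

```python
# Illustrative sketch (assumed, not from the paper): the Wasserstein GAN
# critic objective a quantum generator could be trained against.
import numpy as np

rng = np.random.default_rng(0)

def critic(x, w):
    """Toy 1-Lipschitz critic: linear score with a norm-clipped weight."""
    w = w / max(1.0, np.linalg.norm(w))  # crude Lipschitz control
    return x @ w

def wasserstein_critic_loss(real, fake, w):
    """Estimate of E[f(real)] - E[f(fake)]; the critic maximizes this,
    while the generator minimizes -E[f(fake)]."""
    return critic(real, w).mean() - critic(fake, w).mean()

# Hypothetical stand-in data: flattened 28x28 images, as in MNIST.
real = rng.normal(0.5, 0.1, size=(64, 784))
fake = rng.normal(0.0, 0.1, size=(64, 784))
w = rng.normal(size=784)

gap = wasserstein_critic_loss(real, fake, w)
print(f"critic loss estimate: {gap:.3f}")
```

In an actual WGAN, the critic weights are optimized to maximize this gap (under a Lipschitz constraint, e.g. gradient penalty), and the generator's parameters, here a quantum circuit's, are updated to shrink it.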