[2604.03420] Zero-Shot Quantization via Weight-Space Arithmetic
Computer Science > Computer Vision and Pattern Recognition

arXiv:2604.03420 (cs) [Submitted on 3 Apr 2026]

Title: Zero-Shot Quantization via Weight-Space Arithmetic
Authors: Daniele Solombrino, Antonio Andrea Gargiulo, Adrian Robert Minut, Luca Zhou, Alessandro Zirilli, Emanuele Rodolà

Abstract: We show that robustness to post-training quantization (PTQ) is a transferable direction in weight space. We call this direction the quantization vector: extracted from a donor task by simple weight-space arithmetic, it can be used to patch a receiver model and improve robustness to PTQ-induced noise by as much as 60%, without receiver-side quantization-aware training (QAT). Because the method requires no receiver training data, it provides a zero-shot, low-cost alternative to QAT for extremely low-bit deployment. We demonstrate this on Vision Transformer (ViT) models. More broadly, our results suggest that quantization robustness is not merely a byproduct of task-specific training, but a reusable feature of weight-space geometry that can be transferred rather than retrained.

Subjects: Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI); Machine Learning (cs.LG)
Cite as: arXiv:2604.03420 [cs.CV] (or arXiv:2604.03420v1 [cs.CV] for this version)
DOI: https://doi.org/10.48550/arXiv.2604.03420
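The abstract's core operation, extracting a direction by weight-space arithmetic and adding it to another model, can be sketched as follows. This is a minimal illustration of the general task-vector-style recipe the abstract describes, not the authors' exact procedure: the function names, the dictionary-of-arrays weight representation, and the scaling coefficient `lam` are all assumptions for the sketch (the paper itself defines how the donor's robust and base checkpoints are obtained).

```python
import numpy as np

def quantization_vector(donor_robust, donor_base):
    """Extract the direction tau = theta_robust - theta_base, per parameter tensor.

    Both arguments are dicts mapping parameter names to arrays with matching
    shapes (e.g. flattened state dicts of the same architecture).
    """
    return {name: donor_robust[name] - donor_base[name] for name in donor_base}

def patch_receiver(receiver, tau, lam=1.0):
    """Patch a receiver model by adding the scaled quantization vector.

    `lam` is a hypothetical scaling coefficient; lam=1.0 applies the
    donor's direction unchanged.
    """
    return {name: receiver[name] + lam * tau[name] for name in receiver}

# Toy example: two-parameter "models" represented as dicts of arrays.
donor_base = {"w": np.array([1.0, 2.0])}
donor_robust = {"w": np.array([1.5, 2.5])}
receiver = {"w": np.array([0.0, 0.0])}

tau = quantization_vector(donor_robust, donor_base)
patched = patch_receiver(receiver, tau, lam=1.0)
```

The point of the sketch is that the receiver is modified with pure arithmetic on weights: no receiver training data, gradients, or quantization-aware fine-tuning are involved, which is what makes the transfer zero-shot.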