[2603.04956] WaterSIC: information-theoretically (near) optimal linear layer quantization
Computer Science > Machine Learning
arXiv:2603.04956 (cs)
[Submitted on 5 Mar 2026]

Title: WaterSIC: information-theoretically (near) optimal linear layer quantization
Authors: Egor Lifar, Semyon Savkin, Or Ordentlich, Yury Polyanskiy

Abstract: This paper considers the problem of converting a given dense linear layer to low precision. The tradeoff between compressed length and output discrepancy is analyzed information-theoretically (IT). It is shown that the popular GPTQ algorithm may have an arbitrarily large gap to the IT limit. To alleviate this problem, a novel algorithm, termed "WaterSIC", is proposed and is shown to be within a rate gap of 0.255 bits of the IT limit, uniformly over all possible covariance matrices of input activations. The key innovation of WaterSIC is to allocate different quantization rates to different columns (in-features) of the weight matrix, mimicking the classical IT solution known as "waterfilling". Applying WaterSIC to the Llama and Qwen families of LLMs establishes new state-of-the-art performance for all quantization rates from 1 to 4 bits.

Subjects: Machine Learning (cs.LG); Information Theory (cs.IT)
Cite as: arXiv:2603.04956 [cs.LG] (or arXiv:2603.04956v1 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2603.04956
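To make the "waterfilling" idea concrete, the sketch below illustrates classical reverse water-filling for parallel Gaussian sources: components with larger variance receive more rate, and components whose variance falls below the water level get zero bits. This is a generic textbook illustration, not the paper's WaterSIC algorithm; the function name `reverse_waterfill` and the bisection tolerance are choices made here for the example.

```python
import numpy as np

def reverse_waterfill(variances, total_rate_bits):
    """Classical reverse water-filling rate allocation.

    Given per-component variances sigma_i^2 and a total rate budget R,
    find a water level theta so that the allocation
        R_i = max(0, 0.5 * log2(sigma_i^2 / theta))
    satisfies sum_i R_i = R.  Since the total rate is strictly
    decreasing in theta, bisection on theta suffices.
    """
    variances = np.asarray(variances, dtype=float)

    def total_rate(theta):
        return np.sum(np.maximum(0.0, 0.5 * np.log2(variances / theta)))

    # theta -> 0 gives unbounded rate; theta = max variance gives rate 0.
    lo, hi = 1e-12 * variances.max(), variances.max()
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if total_rate(mid) > total_rate_bits:
            lo = mid  # water level too low: rates too large
        else:
            hi = mid
    theta = 0.5 * (lo + hi)
    rates = np.maximum(0.0, 0.5 * np.log2(variances / theta))
    return theta, rates

# Example: two components with variances 4 and 1, total budget of 1 bit.
# The high-variance component takes the whole bit; the other gets none.
theta, rates = reverse_waterfill([4.0, 1.0], 1.0)
```

The analogy to the abstract: WaterSIC mimics this allocation across the columns (in-features) of the weight matrix, using the covariance of the input activations in place of per-component source variances.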