[2308.05629] Inhibitor Transformers and Gated RNNs for Torus Efficient Fully Homomorphic Encryption
Computer Science > Machine Learning — arXiv:2308.05629 (cs)
[Submitted on 10 Aug 2023 (v1), last revised 23 Mar 2026 (this version, v2)]

Title: Inhibitor Transformers and Gated RNNs for Torus Efficient Fully Homomorphic Encryption
Authors: Rickard Brännvall, Tony Zhang, Henrik Forsgren, Andrei Stoian, Fredrik Sandin, Marcus Liwicki

Abstract: This paper introduces efficient modifications to neural network-based sequence processing approaches, laying new ground for scalable privacy-preserving machine learning under Fully Homomorphic Encryption (FHE). Transformers are now ubiquitous in AI applications and have largely supplanted Gated Recurrent Neural Networks (RNNs) as the standard architecture for sequence modeling. Both architectures rely on costly multiplications and complex activations that hinder encrypted inference. We focus on TFHE, which supports deep circuit evaluation and efficient univariate function evaluation but makes variable-to-variable multiplication particularly expensive. To address this, we propose inhibitor designs for Transformers and gated RNNs that replace multiplications and Softmax/Sigmoid activations with additive and ReLU-based operations. These changes enable integer-only computation, reduce circuit depth, and improve the efficiency of encrypted execution while p...
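To illustrate the general idea of the additive, ReLU-based substitution described in the abstract, the sketch below shows one plausible "inhibitor-style" attention step in plaintext NumPy. The specific formulation (Manhattan-distance scores, a constant inhibition strength gamma, ReLU suppression, and unnormalized summation over keys) is an assumption for illustration only, not the paper's exact construction; the point is that only additions, subtractions, comparisons, and a plaintext-constant scaling appear, with no variable-to-variable multiplication and no Softmax.

```python
import numpy as np

def inhibitor_attention(Q, K, V, gamma=1.0):
    """Illustrative inhibitor-style attention (a sketch under assumptions,
    not the paper's exact formulation).

    Scores use Manhattan distance instead of dot products, and a shifted
    ReLU replaces Softmax, so the whole step reduces to additions,
    subtractions, and ReLU -- operations that map well onto TFHE's cheap
    additive homomorphism and univariate function evaluation.
    """
    # Z[i, j]: Manhattan distance between query i and key j
    # (computed with subtractions and additions only)
    Z = np.abs(Q[:, None, :] - K[None, :, :]).sum(axis=-1)   # (n_q, n_k)

    # Inhibit each value by gamma * distance, then clip at zero with ReLU;
    # keys far from the query are fully suppressed. gamma is a plaintext
    # constant, so gamma * Z is not a variable-to-variable multiplication.
    H = np.maximum(V[None, :, :] - gamma * Z[:, :, None], 0.0)  # (n_q, n_k, d)

    # Aggregate over keys by plain summation -- no normalizing division
    # as in Softmax attention.
    return H.sum(axis=1)                                      # (n_q, d)

# Usage: a query at the origin attends to a near key and a far key;
# the far key's value is inhibited to zero and drops out of the sum.
Q = np.array([[0.0, 0.0]])
K = np.array([[0.0, 0.0], [10.0, 10.0]])
V = np.array([[1.0, 2.0], [5.0, 5.0]])
out = inhibitor_attention(Q, K, V, gamma=1.0)  # -> [[1., 2.]]
```

Because every operation here is integer-friendly and univariate (apart from additions), each step can in principle be evaluated under TFHE with programmable bootstrapping for the ReLU, which is the efficiency argument the abstract makes.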