[2510.05261] ECLipsE-Gen-Local: Efficient Compositional Local Lipschitz Estimates for Deep Neural Networks
Computer Science > Machine Learning
arXiv:2510.05261 (cs)
[Submitted on 6 Oct 2025 (v1), last revised 7 Apr 2026 (this version, v2)]

Title: ECLipsE-Gen-Local: Efficient Compositional Local Lipschitz Estimates for Deep Neural Networks
Authors: Yuezhu Xu, S. Sivaranjani

Abstract: The Lipschitz constant is a key measure for certifying the robustness of neural networks to input perturbations. However, computing the exact constant is NP-hard, and standard approaches to estimating the Lipschitz constant involve solving a large matrix semidefinite program (SDP) that scales poorly with network size. Further, there is a potential to efficiently leverage local information on the input region to provide tighter Lipschitz estimates. We address this problem here by proposing a compositional framework that yields tight yet scalable Lipschitz estimates for deep feedforward neural networks. Specifically, we begin by developing a generalized SDP framework that is highly flexible, accommodating heterogeneous activation function slopes, and allowing Lipschitz estimates with respect to arbitrary input-output pairs and arbitrary choices of sub-networks of consecutive layers. We then decompose this generalized SDP into a sequence of small sub-problems, with computational complexity that scales linearly with respect to the network depth...
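For context on what a compositional Lipschitz estimate means, the following sketch computes the classical naive compositional bound for a feedforward network with 1-Lipschitz activations (e.g. ReLU): the product of the spectral norms of the layer weight matrices. This is not the paper's SDP-based method, only the simple baseline that such methods tighten; the function name and toy weights are illustrative assumptions.

```python
import numpy as np

def layerwise_lipschitz_bound(weights):
    """Naive compositional upper bound on the Lipschitz constant of a
    feedforward network with 1-Lipschitz activations: the product of
    the spectral norms (largest singular values) of the weight matrices.
    This is the standard baseline, not the paper's SDP-based estimate."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, 2)  # spectral norm of this layer
    return bound

# Toy 2-layer linear network (illustrative weights).
W1 = np.array([[2.0, 0.0],
               [0.0, 1.0]])
W2 = np.array([[1.0, 1.0],
               [0.0, 1.0]])
print(layerwise_lipschitz_bound([W1, W2]))
```

By submultiplicativity of the spectral norm, this product always upper-bounds the norm of the composed linear map, but the bound grows loose with depth because it ignores how directions align across layers; that looseness is exactly what compositional SDP approaches like the one above aim to reduce.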