[2405.06727] Approximation Error and Complexity Bounds for ReLU Networks on Low-Regular Function Spaces
Summary
This paper explores the approximation capabilities of ReLU neural networks on low-regularity function spaces, establishing approximation-error bounds in terms of network width and depth.
Why It Matters
Understanding the approximation error and complexity bounds of ReLU networks is crucial for improving neural network design, particularly in applications requiring minimal regularity assumptions. This research contributes to the theoretical foundation of machine learning, enhancing the performance and reliability of neural networks in various domains.
Key Takeaways
- The paper presents a method to bound approximation error for ReLU networks based on network width and depth.
- It inherits the bound from Fourier features residual networks, which use complex exponential activations, via a constructive proof.
- The findings can inform the design of neural networks in low-regularity contexts, enhancing their practical applications.
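Schematically, and with constants and the precise norm choices suppressed (the paper should be consulted for the exact statement), the abstract's bound relates the best ReLU approximation error to the uniform norm of the target $f$ and the network's width $W$ and depth $D$:

```latex
\inf_{f_{\mathrm{ReLU}} \in \mathcal{N}_{W,D}}
  \bigl\| f - f_{\mathrm{ReLU}} \bigr\|
\;\lesssim\;
\frac{\| f \|_{L^\infty}}{W \cdot D},
```

where $\mathcal{N}_{W,D}$ denotes ReLU networks of width $W$ and depth $D$. The notable feature is the dependence on the *product* $W \cdot D$, so error decays whether capacity is added via width or via depth.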
Statistics > Machine Learning
arXiv:2405.06727 (stat)

This paper has been withdrawn by Owen Davis.
[Submitted on 10 May 2024 (v1), last revised 25 Feb 2026 (this version, v2)]

Title: Approximation Error and Complexity Bounds for ReLU Networks on Low-Regular Function Spaces
Authors: Owen Davis, Gianluca Geraci, Mohammad Motamed

Abstract: In this work, we consider the approximation of a large class of bounded functions, with minimal regularity assumptions, by ReLU neural networks. We show that the approximation error can be bounded from above by a quantity proportional to the uniform norm of the target function and inversely proportional to the product of network width and depth. We inherit this approximation error bound from Fourier features residual networks, a type of neural network that uses complex exponential activation functions. Our proof is constructive and proceeds by conducting a careful complexity analysis associated with the approximation of a Fourier features residual network by a ReLU network.

Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG)
MSC classes: 41A25, 41A30, 41A46, 68T07
Cite as: arXiv:2405.06727 [stat.ML] (or arXiv:2405.06727v2 [stat.ML] for this version)
DOI: https://doi.org/10.48550/arXiv.2405.06727
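The abstract's key object, a Fourier features residual network, applies complex exponential activations whose real and imaginary parts are cosines and sines of learned frequencies. A minimal NumPy sketch of one such residual block is below; all names and shapes here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def fourier_features_block(x, W, b, V, c):
    """One residual block with Fourier (complex-exponential) features.

    Computes x + V @ [cos(Wx + b); sin(Wx + b)] + c, where cos/sin are
    the real/imaginary parts of exp(i(Wx + b)). Illustrative only.
    """
    z = W @ x + b
    feats = np.concatenate([np.cos(z), np.sin(z)])  # Re/Im of exp(i z)
    return x + V @ feats + c

rng = np.random.default_rng(0)
d, m = 4, 8                          # input dimension, number of frequencies
x = rng.standard_normal(d)
W = rng.standard_normal((m, d))      # frequency matrix
b = rng.standard_normal(m)           # phase shifts
V = rng.standard_normal((d, 2 * m)) / m  # output projection
c = np.zeros(d)

y = fourier_features_block(x, W, b, V, c)
print(y.shape)  # (4,)
```

The paper's constructive argument then analyzes how accurately a ReLU network of a given width and depth can emulate such cosine and sine features, which is how the ReLU bound is inherited.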