[2502.01247] Polynomial, trigonometric, and tropical activations
Computer Science > Machine Learning
arXiv:2502.01247 (cs)
[Submitted on 3 Feb 2025 (v1), last revised 2 Mar 2026 (this version, v3)]

Title: Polynomial, trigonometric, and tropical activations
Authors: Ismail Khalfaoui-Hassani, Stefan Kesselheim

Abstract: Which functions can be used as activations in deep neural networks? This article explores families of functions based on orthonormal bases, including the Hermite polynomial basis and the Fourier trigonometric basis, as well as a basis resulting from the tropicalization of a polynomial basis. Our study shows that, through simple variance-preserving initialization and without additional clamping mechanisms, these activations can successfully be used to train deep models, such as GPT-2 for next-token prediction on OpenWebText and ConvNeXt for image classification on ImageNet. Our work addresses the issue of exploding and vanishing activations and gradients, particularly prevalent with polynomial activations, and opens the door to improving the efficiency of large-scale learning tasks. Furthermore, our approach provides insight into the structure of neural networks, revealing that networks with polynomial activations can be interpreted as multivariate polynomial mappings. Finally, using Hermite interpolation, we show that our activations can closely approximate classical ones in pre-train...
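The two ingredients named in the abstract can be illustrated concretely. Below is a minimal NumPy sketch, not the authors' implementation: an activation built as a linear combination of orthonormal (probabilists') Hermite polynomials, where normalizing the coefficient vector to unit norm gives the variance-preserving property for standard-normal inputs (since E[f(x)^2] = Σ c_k^2 under an orthonormal basis), plus a tropicalized polynomial, where the monomial c_n x^n becomes the affine map c_n + n·x and summation becomes max. All names and the choice of basis size are illustrative assumptions.

```python
import math
import numpy as np

def hermite_basis(x, K):
    # Probabilists' Hermite polynomials via the recurrence
    # He_{n+1}(x) = x*He_n(x) - n*He_{n-1}(x), then divided by
    # sqrt(n!) so they are orthonormal under the N(0,1) weight.
    H = [np.ones_like(x), x]
    for n in range(1, K - 1):
        H.append(x * H[n] - n * H[n - 1])
    H = np.stack(H[:K])                                   # shape (K, N)
    norms = np.sqrt([math.factorial(n) for n in range(K)])
    return H / norms[:, None]

class HermiteActivation:
    """Learnable activation f(x) = sum_k c_k * he_k(x) (illustrative)."""
    def __init__(self, K, seed=None):
        rng = np.random.default_rng(seed)
        c = rng.normal(size=K)
        # Unit-norm init => E[f(x)^2] = 1 for x ~ N(0,1):
        # this is one simple variance-preserving choice.
        self.c = c / np.linalg.norm(c)

    def __call__(self, x):
        return np.einsum("k,kn->n", self.c, hermite_basis(x, self.c.size))

def tropical_activation(x, c):
    # Tropicalization of p(x) = sum_n c_n x^n in the max-plus semiring:
    # products become sums (x^n -> n*x) and sums become max.
    n = np.arange(len(c))
    return np.max(c[:, None] + n[:, None] * x[None, :], axis=0)
```

For instance, feeding a large batch of standard-normal samples through `HermiteActivation(4)` yields a second moment close to 1, while `tropical_activation` returns a convex piecewise-linear function of its input.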