[2604.02201] On the Role of Depth in the Expressivity of RNNs
Computer Science > Machine Learning

arXiv:2604.02201 (cs) [Submitted on 2 Apr 2026]

Title: On the Role of Depth in the Expressivity of RNNs
Authors: Maude Lizaire, Michael Rizvi-Martel, Éric Dupuis, Guillaume Rabusseau

Abstract: The benefits of depth in feedforward neural networks are well known: composing multiple layers of linear transformations with nonlinear activations enables complex computations. While similar effects are expected in recurrent neural networks (RNNs), it remains unclear how depth interacts with recurrence to shape expressive power. Here, we formally show that depth increases RNNs' memory capacity efficiently with respect to the number of parameters, thus enhancing expressivity both by enabling more complex input transformations and improving the retention of past information. We broaden our analysis to 2RNNs, a generalization of RNNs with multiplicative interactions between inputs and hidden states. Unlike RNNs, which remain linear without nonlinear activations, 2RNNs perform polynomial transformations whose maximal degree grows with depth. We further show that multiplicative interactions cannot, in general, be replaced by layerwise nonlinearities. Finally, we validate these insights empirically on synthetic and real-world tasks.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2604.02201 [cs.LG]
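To make the contrast in the abstract concrete, here is a minimal NumPy sketch of the two update rules. All dimensions, names, and the specific bilinear parameterization (a third-order tensor contracted against the input and the previous hidden state) are illustrative assumptions, not the paper's construction. Without a nonlinearity, the plain RNN's output is a linear function of the input sequence, while the 2RNN's multiplicative update makes the output multilinear in the inputs, i.e. a polynomial map whose degree grows as the recurrence is iterated:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 4  # hypothetical small dimensions

# Plain RNN without activation: h_t = W h_{t-1} + U x_t  (linear in the inputs)
W = rng.normal(size=(d_h, d_h))
U = rng.normal(size=(d_h, d_in))

def linear_rnn(xs):
    h = np.zeros(d_h)
    for x in xs:
        h = W @ h + U @ x
    return h

# 2RNN-style bilinear update: h_t contracts a 3rd-order tensor A
# against the current input and the previous hidden state.
A = rng.normal(size=(d_in, d_h, d_h))
h0 = rng.normal(size=d_h)

def two_rnn(xs):
    h = h0
    for x in xs:
        h = np.einsum("ijk,i,j->k", A, x, h)  # multiplicative interaction
    return h

xs = [rng.normal(size=d_in) for _ in range(3)]
c = 2.0

# Linearity check: scaling every input by c scales the linear RNN's output by c.
out_lin = linear_rnn([c * x for x in xs])
assert np.allclose(out_lin, c * linear_rnn(xs))

# Multilinearity check: for the 2RNN over 3 steps, scaling every input by c
# scales the output by c**3 -- a polynomial, not linear, dependence.
out_2rnn = two_rnn([c * x for x in xs])
assert np.allclose(out_2rnn, (c ** 3) * two_rnn(xs))
```

The two assertions are the point of the sketch: the same rescaling of the input sequence perturbs the linear RNN homogeneously of degree 1, but the bilinear recurrence of degree equal to the number of multiplicative steps, mirroring the abstract's claim that multiplicative interactions yield polynomial transformations.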