[2604.02474] Time-Warping Recurrent Neural Networks for Transfer Learning
Computer Science > Machine Learning
arXiv:2604.02474 (cs)
[Submitted on 2 Apr 2026]

Title: Time-Warping Recurrent Neural Networks for Transfer Learning
Authors: Jonathon Hirschi

Abstract: Dynamical systems describe how a physical system evolves over time. Physical processes can evolve faster or slower under different environmental conditions. We define time-warping as rescaling time in a model of a physical system. This thesis proposes a new method of transfer learning for Recurrent Neural Networks (RNNs) based on time-warping. We prove that for a class of linear, first-order differential equations known as time lag models, an LSTM can approximate these systems to any desired accuracy, and that the model can be time-warped while maintaining that approximation accuracy. The Time-Warping method of transfer learning is then evaluated on an applied problem: predicting fuel moisture content (FMC), an important quantity in wildfire modeling. An RNN with LSTM recurrent layers is pretrained on fuels with a characteristic time scale of 10 hours, for which large quantities of training data are available. The RNN is then adapted with transfer learning to generate predictions for fuels with characteristic time scales of 1 hour, 100 hours, and 1000 hours. The Time-Warping method is evaluated against several known methods of transfer learning. The ...
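The abstract does not show the exact time lag model used in the thesis, but such models are commonly written as the first-order linear equation dm/dt = (E - m)/T, where E is an equilibrium value and T the characteristic time scale. Under that assumed form, the following sketch illustrates the core idea behind time-warping: rescaling the time axis of a 10-hour model reproduces the trajectory of a 100-hour model exactly.

```python
import numpy as np

def time_lag(t, E, m0, T):
    """Closed-form solution of the assumed time lag model dm/dt = (E - m)/T."""
    return E + (m0 - E) * np.exp(-t / T)

# Hypothetical equilibrium and initial moisture values, chosen for illustration.
E, m0 = 0.3, 0.1
t = np.linspace(0.0, 50.0, 200)  # time in hours

# Model with a 10-hour characteristic time scale (the "pretraining" scale).
m_10h = time_lag(t, E, m0, T=10.0)

# Time-warping: evaluate the 10-hour model on time rescaled by 10/100.
m_warped = time_lag(t * (10.0 / 100.0), E, m0, T=10.0)

# This matches the 100-hour model evaluated on the original time axis,
# since exp(-(t * 10/100) / 10) == exp(-t / 100).
m_100h = time_lag(t, E, m0, T=100.0)
print(np.allclose(m_warped, m_100h))
```

In the thesis the analogous rescaling is applied to a trained LSTM rather than to a closed-form solution, but the identity above is what makes transfer across characteristic time scales plausible.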