[2602.23050] Latent Matters: Learning Deep State-Space Models

arXiv - Machine Learning · 3 min read

Summary

The paper presents a novel constrained optimization framework for training deep state-space models (DSSMs), introducing the extended Kalman VAE (EKVAE) to enhance prediction accuracy and system identification in dynamic systems.

Why It Matters

This research addresses a key limitation of existing DSSMs: maximizing the evidence lower bound does not guarantee that the model actually learns the underlying dynamics. The proposed training approach improves the accuracy of temporal predictions, which matters for fields that rely on accurate models of dynamical systems, such as robotics and data science.

Key Takeaways

  • Introduces a constrained optimization framework for training DSSMs (see the sketch after this list).
  • Presents the extended Kalman VAE (EKVAE) for improved accuracy.
  • Demonstrates significant enhancements in system identification.
  • EKVAE outperforms traditional RNN-based DSSMs in prediction tasks.
  • Successfully disentangles static and dynamic features in state-space representations.
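
For context on the training objective behind these takeaways, the sketch below writes out the standard sequential evidence lower bound (ELBO) for a Markovian state-space model, followed by one common way to recast ELBO training as a constrained problem (a GECO-style reconstruction constraint with multiplier λ and tolerance κ). The paper's exact constraint is not reproduced in this summary, so the second formulation is an illustrative assumption rather than the authors' formulation.

```latex
% Standard sequential ELBO for a Markovian DSSM with a factorised posterior
% (generic textbook form; notation is ours, not necessarily the paper's;
% p_\theta(z_1 \mid z_0) denotes the initial prior p_\theta(z_1)):
\log p_\theta(x_{1:T}) \;\ge\;
\mathbb{E}_{q_\phi(z_{1:T}\mid x_{1:T})}\!\Big[
  \sum_{t=1}^{T} \big(
    \log p_\theta(x_t \mid z_t)
    + \log p_\theta(z_t \mid z_{t-1})
    - \log q_\phi(z_t \mid z_{t-1}, x_{1:T})
  \big)
\Big]

% One common constrained reformulation (GECO-style; illustrative assumption):
% keep the approximate posterior close to the prior dynamics while constraining
% the expected reconstruction error to stay below a tolerance \kappa,
% optimised as a min-max problem over a Lagrange multiplier \lambda \ge 0:
\min_{\theta,\phi}\;\max_{\lambda \ge 0}\;
\mathrm{KL}\big(q_\phi(z_{1:T}\mid x_{1:T}) \,\big\|\, p_\theta(z_{1:T})\big)
\;+\; \lambda \Big( \mathbb{E}_{q_\phi}\Big[\textstyle\sum_{t=1}^{T}
\lVert x_t - \hat{x}_t \rVert_2^2\Big] - \kappa \Big)
```

The constrained view makes explicit what plain ELBO maximization leaves implicit: how much reconstruction quality may be traded against how well the latent trajectory follows the learned dynamics.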

Computer Science > Machine Learning

arXiv:2602.23050 (cs) [Submitted on 26 Feb 2026]

Title: Latent Matters: Learning Deep State-Space Models
Authors: Alexej Klushyn, Richard Kurle, Maximilian Soelch, Botond Cseke, Patrick van der Smagt

Abstract: Deep state-space models (DSSMs) enable temporal predictions by learning the underlying dynamics of observed sequence data. They are often trained by maximising the evidence lower bound. However, as we show, this does not ensure the model actually learns the underlying dynamics. We therefore propose a constrained optimisation framework as a general approach for training DSSMs. Building upon this, we introduce the extended Kalman VAE (EKVAE), which combines amortised variational inference with classic Bayesian filtering/smoothing to model dynamics more accurately than RNN-based DSSMs. Our results show that the constrained optimisation framework significantly improves system identification and prediction accuracy on the example of established state-of-the-art DSSMs. The EKVAE outperforms previous models w.r.t. prediction accuracy, achieves remarkable results in identifying dynamical systems, and can furthermore successfully learn state-space representations where static and dynamic features are disentangled.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2602.23050 [cs.LG]
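
The abstract states that the EKVAE combines amortised variational inference with classic Bayesian filtering/smoothing. As a rough illustration of the filtering half, here is a minimal extended Kalman filter predict/update step in NumPy; the transition function, its Jacobian, and the linear-Gaussian emission below are toy assumptions for the sketch, not the architecture used in the paper.

```python
# Minimal sketch of one extended Kalman filter (EKF) step, the kind of
# classic Bayesian filtering the EKVAE reportedly combines with amortised
# inference. All model choices here are illustrative assumptions.
import numpy as np

def ekf_step(mu, P, y, f, F_jac, H, Q, R):
    """One EKF predict/update step.

    mu, P : previous posterior mean / covariance of the latent state
    y     : current observation (or encoded observation)
    f     : nonlinear transition function z_t = f(z_{t-1})
    F_jac : Jacobian of f, evaluated at mu
    H     : observation matrix (emission assumed linear-Gaussian here)
    Q, R  : process / observation noise covariances
    """
    # Predict: propagate the state estimate through the nonlinear dynamics,
    # linearising around the current mean for the covariance.
    mu_pred = f(mu)
    F = F_jac(mu)
    P_pred = F @ P @ F.T + Q

    # Update: correct the prediction with the new observation.
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    mu_new = mu_pred + K @ (y - H @ mu_pred)
    P_new = (np.eye(len(mu)) - K @ H) @ P_pred
    return mu_new, P_new

# Toy usage: a 2-D latent state with a damped-oscillator-like transition.
f = lambda z: np.array([z[0] + 0.1 * z[1], 0.99 * z[1] - 0.1 * z[0]])
F_jac = lambda z: np.array([[1.0, 0.1], [-0.1, 0.99]])
H = np.eye(2)
mu, P = np.zeros(2), np.eye(2)
mu, P = ekf_step(mu, P, y=np.array([0.5, -0.2]), f=f, F_jac=F_jac,
                 H=H, Q=0.01 * np.eye(2), R=0.1 * np.eye(2))
```

In an EKVAE-style setup one would expect the transition and emission models to be learned networks and the per-step observation to come from an amortised encoder, but those details are beyond this sketch.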

