[2602.21307] SymTorch: A Framework for Symbolic Distillation of Deep Neural Networks

arXiv - Machine Learning 3 min read Article

Summary

SymTorch is a new library that automates the symbolic distillation of deep neural networks, converting trained models (or their components) into interpretable, closed-form mathematical expressions and integrating symbolic regression directly into deep learning workflows.

Why It Matters

This framework addresses the engineering challenges that have limited the adoption of symbolic regression in deep learning, potentially enabling more interpretable AI models. By improving model transparency, it can aid in discovering physical laws and enhance the efficiency of large language models (LLMs).

Key Takeaways

  • SymTorch automates the symbolic distillation process for deep learning models.
  • It addresses key engineering challenges like data transfer and model serialization.
  • The library has been tested across various architectures, including GNNs and transformers.
  • A proof-of-concept replacing MLP layers with symbolic surrogates shows an 8.3% throughput improvement in LLM inference, with moderate performance degradation.
  • This approach can lead to more interpretable AI models and facilitate the discovery of mathematical relationships.

Computer Science > Machine Learning · arXiv:2602.21307 (cs) · Submitted on 24 Feb 2026

Title: SymTorch: A Framework for Symbolic Distillation of Deep Neural Networks
Authors: Elizabeth S.Z. Tan, Adil Soubki, Miles Cranmer

Abstract: Symbolic distillation replaces neural networks, or components thereof, with interpretable, closed-form mathematical expressions. This approach has shown promise in discovering physical laws and mathematical relationships directly from trained deep learning models, yet adoption remains limited due to the engineering barrier of integrating symbolic regression into deep learning workflows. We introduce SymTorch, a library that automates this distillation by wrapping neural network components, collecting their input-output behavior, and approximating them with human-readable equations via PySR. SymTorch handles the engineering challenges that have hindered adoption: GPU-CPU data transfer, input-output caching, model serialization, and seamless switching between neural and symbolic forward passes. We demonstrate SymTorch across diverse architectures including GNNs, PINNs, and transformer models. Finally, we present a proof-of-concept for accelerating LLM inference by replacing MLP layers with symbolic surrogates, achieving an 8.3% throughput improvement with moderate performance degradation.
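The workflow the abstract describes — wrap a component, record its input-output behavior, fit a closed-form surrogate, then swap between neural and symbolic forward passes — can be sketched in plain Python. This is a minimal illustration of the pattern, not SymTorch's actual API: the class and method names below are hypothetical, and `fit_fn` stands in for the symbolic regression search that SymTorch delegates to PySR.

```python
import math

class SymbolicWrapper:
    """Hypothetical sketch of the distillation pattern; SymTorch's real API may differ."""

    def __init__(self, component):
        self.component = component      # the original "neural" callable
        self.cache = []                 # recorded (input, output) pairs
        self.surrogate = None           # closed-form replacement, once fit
        self.use_symbolic = False

    def __call__(self, x):
        if self.use_symbolic and self.surrogate is not None:
            return self.surrogate(x)    # symbolic forward pass
        y = self.component(x)
        self.cache.append((x, y))       # collect behavior for later regression
        return y

    def distill(self, fit_fn):
        # In SymTorch this step would invoke PySR on the cached pairs;
        # here fit_fn is a stand-in for that search.
        xs = [x for x, _ in self.cache]
        ys = [y for _, y in self.cache]
        self.surrogate = fit_fn(xs, ys)
        self.use_symbolic = True        # switch to the symbolic forward pass

# Toy component behaving like sin(x); the "discovered" expression matches it.
wrapped = SymbolicWrapper(math.sin)
for x in [0.0, 0.5, 1.0, 1.5]:
    wrapped(x)                          # record input-output behavior
wrapped.distill(lambda xs, ys: math.sin)  # stand-in for PySR's equation search
print(wrapped(0.7) == math.sin(0.7))      # → True
```

In the real library, the engineering work the abstract lists (GPU-CPU transfer, caching, serialization) lives inside this wrapping layer, which is what makes the neural-to-symbolic switch seamless for the user.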

