[2507.01781] Symbolic Branch Networks: Tree-Inherited Neural Models for Interpretable Multiclass Classification

arXiv - AI · 4 min read · Article

Summary

This article presents Symbolic Branch Networks (SBNs), a novel neural model that integrates decision tree structures for enhanced interpretability in multiclass classification tasks.

Why It Matters

The development of SBNs is significant as it combines the strengths of neural networks with the transparency of decision trees, addressing the growing demand for interpretable AI models. This is crucial for applications where understanding model decisions is as important as accuracy.

Key Takeaways

  • SBNs leverage tree structures to maintain interpretability while enhancing predictive performance.
  • The model allows for gradient-based learning, refining feature relevance without losing symbolic integrity.
  • SBNs consistently match or surpass XGBoost across 28 multiclass tabular datasets from the OpenML CC-18 benchmark.
  • The research highlights the potential of combining symbolic structures with neural optimization.
  • The SBN* variant remains competitive even with both symbolic weight matrices held fixed, underscoring the robustness of the inherited tree structure.

Computer Science > Machine Learning

arXiv:2507.01781 (cs) [Submitted on 2 Jul 2025 (v1), last revised 22 Feb 2026 (this version, v2)]

Title: Symbolic Branch Networks: Tree-Inherited Neural Models for Interpretable Multiclass Classification

Authors: Dalia Rodríguez-Salas

Abstract: Symbolic Branch Networks (SBNs) are neural models whose architecture is inherited directly from an ensemble of decision trees. Each root-to-parent-of-leaf decision path is mapped to a hidden neuron, and the matrices $W_{1}$ (feature-to-branch) and $W_{2}$ (branch-to-class) encode the symbolic structure of the ensemble. Because these matrices originate from the trees, SBNs preserve transparent feature relevance and branch-level semantics while enabling gradient-based learning. The primary contribution of this work is SBN, a semi-symbolic variant that preserves branch semantics by keeping $W_{2}$ fixed, while allowing $W_{1}$ to be refined through learning. This controlled relaxation improves predictive accuracy without altering the underlying symbolic structure. Across 28 multiclass tabular datasets from the OpenML CC-18 benchmark, SBN consistently matches or surpasses XGBoost while retaining human-interpretable branch attributions. We also analyze SBN*, a fully symbolic variant in which both $W_{1}$ and $W_{2}$ are f...
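The abstract's core mechanism — a two-layer network where $W_{1}$ maps features to branch neurons and a frozen $W_{2}$ maps branches to classes — can be illustrated with a minimal numpy sketch. This is not the authors' implementation; the weight values, activation choice (tanh), and toy dimensions are all illustrative assumptions. It only shows the semi-symbolic training regime: gradients flow into $W_{1}$ while $W_{2}$ stays untouched.

```python
import numpy as np

# Toy SBN-style setup (illustrative, not from the paper):
# 2 input features, 3 branch neurons, 2 classes.
rng = np.random.default_rng(0)

# W1 (feature-to-branch): one row per decision path, initialized from
# the features that path tests (hypothetical values here).
W1 = np.array([[ 1.0, 0.0],
               [-1.0, 0.5],
               [ 0.0, 1.0]])
b1 = np.array([-0.5, 0.2, -0.1])

# W2 (branch-to-class): symbolic mapping from each branch to the class
# its leaves predict. Kept frozen, as in the semi-symbolic SBN variant.
W2 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [1.0, 0.0]])

def forward(X):
    H = np.tanh(X @ W1.T + b1)   # branch activations
    logits = H @ W2              # class scores via the fixed W2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return H, e / e.sum(axis=1, keepdims=True)  # softmax probabilities

# One gradient step on W1 only (softmax cross-entropy); W2 is never updated,
# so the branch-to-class semantics are preserved.
X = rng.normal(size=(8, 2))
y = rng.integers(0, 2, size=8)
lr = 0.1
H, P = forward(X)
G = P.copy()
G[np.arange(len(y)), y] -= 1.0        # dL/dlogits for cross-entropy
dH = (G @ W2.T) * (1.0 - H**2)        # backprop through the fixed W2 and tanh
W1 -= lr * (dH.T @ X) / len(y)
b1 -= lr * dH.mean(axis=0)
```

After the update, each row of $W_{1}$ still corresponds to one root-to-parent-of-leaf path, so feature relevance per branch stays inspectable, while the frozen $W_{2}$ guarantees each branch keeps voting for the same classes it did in the original ensemble.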
