[2506.15715] NeuronSeek: On Stability and Expressivity of Task-driven Neurons
Summary
The paper introduces NeuronSeek, a framework that enhances the stability and expressivity of task-driven neurons in deep learning through tensor decomposition, outperforming existing models on various benchmarks.
Why It Matters
This research is significant because it addresses the instability and slow convergence of symbolic-regression-based neuron search by proposing a tensor-decomposition alternative that improves both. The theoretical guarantees it provides deepen the understanding of neural network capabilities, making it relevant for researchers and practitioners in AI and machine learning.
Key Takeaways
- NeuronSeek utilizes tensor decomposition for optimal neuron formulation.
- The framework offers enhanced stability and faster convergence compared to traditional methods.
- Theoretical guarantees show that a network with a fixed number of parameters can approximate any continuous function to arbitrarily small error.
- Empirical evaluations show competitive performance against state-of-the-art models.
- The code for NeuronSeek is publicly available for further research.
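To make the tensor-decomposition idea concrete, here is a minimal, hypothetical sketch of a quadratic "task-driven" neuron whose pairwise-interaction matrix is stored in low-rank factored form. This is an illustration of the general low-rank decomposition technique, not the paper's actual NeuronSeek-TD formulation, and all names (`quadratic_neuron`, the rank `r`, the factors `A`, `B`) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # input dimension and decomposition rank (illustrative values)

# Rank-r factors of the quadratic interaction matrix W ~= A @ B.T
A = rng.normal(size=(d, r))
B = rng.normal(size=(d, r))
w = rng.normal(size=d)  # linear term
b = 0.1                 # bias

def quadratic_neuron(x):
    # Computes x^T (A B^T) x + w^T x + b via the factors:
    # (x^T A)(B^T x) costs O(d * r) instead of the O(d^2) dense product.
    return (x @ A) @ (B.T @ x) + w @ x + b

x = rng.normal(size=d)
dense = x @ (A @ B.T) @ x + w @ x + b  # dense reference computation
assert np.isclose(quadratic_neuron(x), dense)
```

The design point the sketch illustrates: factoring the interaction tensor both regularizes the neuron formulation (only `2*d*r` interaction parameters instead of `d*d`) and keeps evaluation cheap, which is the kind of stability/efficiency trade-off the paper attributes to its tensor-decomposition step.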
arXiv:2506.15715 (cs) — Computer Science > Machine Learning
Submitted on 1 Jun 2025 (v1); last revised 15 Feb 2026 (this version, v2)
Title: NeuronSeek: On Stability and Expressivity of Task-driven Neurons
Authors: Hanyu Pei, Jing-Xiao Liao, Qibin Zhao, Ting Gao, Shijun Zhang, Xiaoge Zhang, Feng-Lei Fan
Abstract: Drawing inspiration from our human brain that designs different neurons for different tasks, recent advances in deep learning have explored modifying a network's neurons to develop so-called task-driven neurons. Prototyping task-driven neurons (referred to as NeuronSeek) employs symbolic regression (SR) to discover the optimal neuron formulation and construct a network from these optimized neurons. Along this direction, this work replaces symbolic regression with tensor decomposition (TD) to discover optimal neuronal formulations, offering enhanced stability and faster convergence. Furthermore, we establish theoretical guarantees that modifying the aggregation functions with common activation functions can empower a network with a fixed number of parameters to approximate any continuous function with an arbitrarily small error, providing a rigorous mathematical foundation for the NeuronSeek framework. Extensive empirical evaluations demonstrate that our NeuronSeek-TD framework not only achieves superior stability…