[2603.22429] Neural Structure Embedding for Symbolic Regression via Continuous Structure Search and Coefficient Optimization


arXiv - Machine Learning

About this article


Computer Science > Machine Learning

arXiv:2603.22429 (cs) · Submitted on 23 Mar 2026

Title: Neural Structure Embedding for Symbolic Regression via Continuous Structure Search and Coefficient Optimization

Authors: Fateme Memar, Tao Zhe, Dongjie Wang

Abstract: Symbolic regression aims to discover human-interpretable equations that explain observational data. However, existing approaches rely heavily on discrete structure search (e.g., genetic programming), which often leads to high computational cost, unstable performance, and limited scalability to large equation spaces. To address these challenges, we propose SRCO, a unified embedding-driven framework for symbolic regression that transforms symbolic structures into a continuous, optimizable representation space. The framework consists of three key components: (1) structure embedding: we first generate a large pool of exploratory equations using traditional symbolic regression algorithms and train a Transformer model to compress symbolic structures into a continuous embedding space; (2) continuous structure search: the embedding space enables efficient exploration using gradient-based or sampling-based optimization, significantly reducing the cost of navigating the combinatorial structure space; and (3) coefficient optimi...
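To make the coefficient-optimization step concrete: once a candidate equation structure has been decoded from the embedding space, its numeric coefficients can be fitted against the data separately from the structure search. The sketch below is a minimal, hypothetical illustration (not the paper's implementation): it assumes a fixed candidate structure f(x) = a·sin(x) + b·x, which is linear in its coefficients, so (a, b) can be recovered by ordinary least squares.

```python
import numpy as np

def fit_coefficients(x, y):
    """Fit coefficients (a, b) of the fixed candidate structure
    f(x) = a*sin(x) + b*x by linear least squares.

    This is a toy stand-in for SRCO's coefficient-optimization
    stage; the actual paper may use a different optimizer.
    """
    # Design matrix: one column per basis term of the structure.
    A = np.column_stack([np.sin(x), x])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

# Synthetic, noise-free data generated with a = 2.0, b = -0.5.
x = np.linspace(0.0, 10.0, 200)
y = 2.0 * np.sin(x) - 0.5 * x

a, b = fit_coefficients(x, y)
# With noise-free data, (a, b) recovers (2.0, -0.5).
```

For structures that are nonlinear in their coefficients (e.g. a·sin(b·x)), the same stage would use an iterative nonlinear least-squares or gradient-based solver instead of a single linear solve.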

Originally published on March 25, 2026. Curated by AI News.

