[2602.22422] Revisiting Chebyshev Polynomial and Anisotropic RBF Models for Tabular Regression

arXiv - AI · 4 min read

Summary

This paper explores the effectiveness of Chebyshev polynomial regressors and anisotropic RBF networks in tabular regression, benchmarking them against tree ensembles across 55 datasets.

Why It Matters

The study addresses the underutilization of smooth-basis models in tabular regression, an area dominated by tree ensembles. By showing that these models perform competitively, especially in CPU-constrained environments, it encourages their broader adoption in practical applications.

Key Takeaways

  • Smooth-basis models like Chebyshev and anisotropic RBF networks can compete with tree ensembles in tabular regression.
  • The study benchmarks these models against tree ensembles across 55 regression datasets, revealing their potential for tighter generalization gaps.
  • A pre-trained transformer achieves higher accuracy on most datasets, but its deployment constraints (GPU dependence, inference latency, and dataset-size limits) make smooth models a viable alternative.
  • The authors recommend including smooth-basis models in regression tasks, particularly where predictions should vary gradually with the inputs.
  • All proposed models are available as scikit-learn-compatible packages, enhancing accessibility for practitioners.
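To make the "scikit-learn-compatible" point concrete, here is a minimal sketch of a ridge-regularised Chebyshev polynomial regressor in the scikit-learn estimator style. The class name `ChebyshevRidge` and all hyperparameter defaults are illustrative assumptions, not the authors' released package; the sketch expands each feature independently (no cross terms), which is one simple design choice among several.

```python
import numpy as np
from numpy.polynomial.chebyshev import chebvander
from sklearn.base import BaseEstimator, RegressorMixin
from sklearn.linear_model import Ridge


class ChebyshevRidge(BaseEstimator, RegressorMixin):
    """Illustrative sketch (not the authors' code): each feature is
    rescaled to [-1, 1] (the Chebyshev domain), expanded in Chebyshev
    polynomials T_1..T_degree, and fit with ridge regression."""

    def __init__(self, degree=4, alpha=1.0):
        self.degree = degree
        self.alpha = alpha

    def _expand(self, X):
        # Rescale each column to [-1, 1] using the ranges seen in fit().
        Xs = 2.0 * (X - self.lo_) / (self.hi_ - self.lo_ + 1e-12) - 1.0
        Xs = np.clip(Xs, -1.0, 1.0)
        # chebvander yields columns T_0..T_degree; drop T_0, since
        # ridge regression fits the intercept itself.
        return np.hstack([chebvander(Xs[:, j], self.degree)[:, 1:]
                          for j in range(Xs.shape[1])])

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        self.lo_, self.hi_ = X.min(axis=0), X.max(axis=0)
        self.ridge_ = Ridge(alpha=self.alpha).fit(self._expand(X), y)
        return self

    def predict(self, X):
        return self.ridge_.predict(self._expand(np.asarray(X, dtype=float)))
```

Because the class follows the `fit`/`predict` convention and inherits from `BaseEstimator`, it drops into scikit-learn pipelines and cross-validation unchanged, which is presumably what makes such packages convenient for practitioners.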

Computer Science > Machine Learning
arXiv:2602.22422 (cs) [Submitted on 25 Feb 2026]

Title: Revisiting Chebyshev Polynomial and Anisotropic RBF Models for Tabular Regression
Authors: Luciano Gerber, Huw Lloyd

Abstract: Smooth-basis models such as Chebyshev polynomial regressors and radial basis function (RBF) networks are well established in numerical analysis. Their continuously differentiable prediction surfaces suit surrogate optimisation, sensitivity analysis, and other settings where the response varies gradually with inputs. Despite these properties, smooth models seldom appear in tabular regression, where tree ensembles dominate. We ask whether they can compete, benchmarking models across 55 regression datasets organised by application domain. We develop an anisotropic RBF network with data-driven centre placement and gradient-based width optimisation, a ridge-regularised Chebyshev polynomial regressor, and a smooth-tree hybrid (Chebyshev model tree); all three are released as scikit-learn-compatible packages. We benchmark these against tree ensembles, a pre-trained transformer, and standard baselines, evaluating accuracy alongside generalisation behaviour. The transformer ranks first on accuracy across a majority of datasets, but its GPU dependence, inference latency, and dataset-size limits cons...
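The abstract's "anisotropic RBF network with data-driven centre placement" can be sketched in a few lines. This is an assumption-laden toy, not the authors' implementation: centres come from k-means, and the per-dimension widths are set heuristically from cluster spread rather than by the gradient-based width optimisation the paper describes; the class name `AnisotropicRBF` and all hyperparameters are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge


class AnisotropicRBF:
    """Toy sketch: k-means centres, one Gaussian width per centre per
    input dimension (the "anisotropic" part), and a ridge-regularised
    linear readout. The paper's model instead tunes widths by gradient
    descent; here they are estimated from cluster spread for brevity."""

    def __init__(self, n_centres=20, width_scale=2.0, alpha=1e-3, random_state=0):
        self.n_centres = n_centres
        self.width_scale = width_scale
        self.alpha = alpha
        self.random_state = random_state

    def _phi(self, X):
        # Anisotropic Gaussian activations: distances are scaled by a
        # separate width for each centre and each input dimension.
        d = (X[:, None, :] - self.centres_[None, :, :]) / self.widths_[None, :, :]
        return np.exp(-0.5 * (d ** 2).sum(axis=2))

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        km = KMeans(n_clusters=self.n_centres, n_init=10,
                    random_state=self.random_state).fit(X)
        self.centres_ = km.cluster_centers_
        # Per-dimension width = scaled std of each cluster, floored so
        # tiny clusters do not collapse to zero width.
        spread = np.array([X[km.labels_ == k].std(axis=0)
                           for k in range(self.n_centres)])
        floor = 0.1 * X.std(axis=0) + 1e-9
        self.widths_ = self.width_scale * np.maximum(spread, floor)
        self.readout_ = Ridge(alpha=self.alpha).fit(self._phi(X), y)
        return self

    def predict(self, X):
        return self.readout_.predict(self._phi(np.asarray(X, dtype=float)))
```

Because every feature map here is a smooth function of the inputs, the prediction surface is continuously differentiable, which is precisely the property the abstract highlights for surrogate optimisation and sensitivity analysis.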

Related Articles

Machine Learning

[D] Does ML have a "bible"/reference textbook at the Intermediate/Advanced level?

Hello, everyone! This is my first time posting here and I apologise if the question is, perhaps, a bit too basic for this sub-reddit. A b...

Reddit - Machine Learning · 1 min ·
Machine Learning

[D] ICML 2026 review policy debate: 100 responses suggest Policy B may score higher, while Policy A shows higher confidence

A week ago I made a thread asking whether ICML 2026’s review policy might have affected review outcomes, especially whether Policy A pape...

Reddit - Machine Learning · 1 min ·
Machine Learning

Nomadic raises $8.4 million to wrangle the data pouring off autonomous vehicles | TechCrunch

The company turns footage from robots into structured, searchable datasets with a deep learning model.

TechCrunch - AI · 6 min ·
Machine Learning

[D] Applied AI/Machine learning course by Srikanth Varma

I have all 10 modules of this course, along with all the notes, assignments, and solutions. If anyone needs this course, DM me. submitted b...

Reddit - Machine Learning · 1 min ·
