[2404.17768] Changing the Training Data Distribution to Reduce Simplicity Bias Improves In-distribution Generalization


About this article

Computer Science > Machine Learning
arXiv:2404.17768 (cs)
[Submitted on 27 Apr 2024 (v1), last revised 2 Mar 2026 (this version, v3)]

Title: Changing the Training Data Distribution to Reduce Simplicity Bias Improves In-distribution Generalization

Authors: Dang Nguyen, Paymon Haddad, Eric Gan, Baharan Mirzasoleiman

Abstract: Can we modify the training data distribution to encourage the underlying optimization method to find solutions with superior in-distribution generalization? In this work, we approach this question for the first time by comparing the inductive bias of gradient descent (GD) with that of sharpness-aware minimization (SAM). Studying a two-layer CNN, we rigorously prove that SAM learns different features more uniformly, particularly in early epochs; that is, SAM is less susceptible to simplicity bias than GD. We also show that examples containing features that are learned early are separable from the rest based on the model's output. Based on this observation, we propose a method that (i) clusters examples based on the network output early in training, (ii) identifies a cluster of examples with similar network output, and (iii) upsamples the remaining examples only once to alleviate the simplicity bias. We show empirically that ...
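The three-step recipe in the abstract can be sketched in a few lines. Note the assumptions: the excerpt does not say which clustering algorithm the authors use, how the "early-learned" cluster is selected, or what the upsampling factor is, so the simple k-means with farthest-point initialization, the largest-cluster rule, and the `factor` parameter below are illustrative choices, not the paper's exact method.

```python
import numpy as np

def kmeans(X, k=2, iters=20):
    """Tiny k-means with deterministic farthest-point initialization
    (illustrative stand-in for whatever clustering the paper uses)."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])          # next center: farthest point
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)              # assign to nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def upsample_late_learned(outputs, k=2, factor=2):
    """outputs: (n_examples, n_classes) per-example network outputs recorded
    early in training. Returns indices defining the reweighted training set.
    `factor` (hypothetical parameter) controls how many copies of each
    late-learned example end up in the new distribution."""
    labels = kmeans(np.asarray(outputs, dtype=float), k)
    early = np.bincount(labels).argmax()       # largest cluster of similar
    rest = np.flatnonzero(labels != early)     # outputs ~ early-learned set
    extra = np.repeat(rest, factor - 1)        # duplicate the rest once
    return np.concatenate([np.arange(len(outputs)), extra])
```

Usage: run a few early epochs, record per-example outputs, call `upsample_late_learned`, and continue training on the returned index set. Because the upsampling happens only once, the method adds negligible overhead compared to reweighting every epoch.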

Originally published on March 03, 2026. Curated by AI News.

