[2509.00955] ART: Adaptive Resampling-based Training for Imbalanced Classification

arXiv - AI · 4 min read

Summary

The paper presents ART, a novel Adaptive Resampling-based Training method for imbalanced classification that dynamically adjusts training data distribution based on class performance, leading to improved model accuracy.

Why It Matters

Imbalanced classification is a common challenge in machine learning, often leading to suboptimal model performance. ART addresses this issue by adapting the training process based on real-time performance metrics, making it a significant advancement over traditional static methods. This approach can enhance the reliability of models in critical applications across various domains.

Key Takeaways

  • ART adapts training data distribution based on class performance metrics.
  • The method outperforms traditional resampling techniques like SMOTE and NearMiss.
  • Improvements in macro F1 score are statistically significant across multiple datasets.
  • ART provides a consistent performance boost in both binary and multi-class classification tasks.
  • The approach is robust against varying degrees of class imbalance.

Computer Science > Machine Learning

arXiv:2509.00955 (cs) [Submitted on 31 Aug 2025 (v1), last revised 15 Feb 2026 (this version, v2)]

Title: ART: Adaptive Resampling-based Training for Imbalanced Classification

Authors: Arjun Basandrai, Shourya Jain, K. Ilanthenral

Abstract: Traditional resampling methods for handling class imbalance typically use fixed distributions, undersampling the majority class or oversampling the minority class. These static strategies ignore changes in class-wise learning difficulty, which can limit overall model performance. This paper proposes Adaptive Resampling-based Training (ART), a method that periodically updates the distribution of the training data based on the class-wise performance of the model. Specifically, ART uses class-wise macro F1 scores, computed at fixed intervals, to determine the degree of resampling to perform. Unlike instance-level difficulty modeling, which is noisy and outlier-sensitive, ART adapts at the class level. This allows the model to incrementally shift its attention towards underperforming classes in a way that better aligns with the optimization objective. Results on diverse benchmarks, including the Pima Indians Diabetes and Yeast datasets, demonstrate that ART consistently outperforms both resampling-based and algorithm-level methods, including ...
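The core idea described above, periodically remapping per-class F1 scores to a sampling distribution so that underperforming classes are drawn more often, can be sketched as follows. This is a minimal illustration of the general technique, not the paper's exact update rule: the `temperature` parameter and the exponential weighting are assumptions for the sketch.

```python
import numpy as np

def class_sampling_weights(f1_per_class, temperature=1.0):
    """Map per-class F1 scores to sampling weights: lower F1 -> higher weight.

    Hypothetical weighting; ART's actual degree-of-resampling rule may differ.
    """
    f1 = np.asarray(f1_per_class, dtype=float)
    # Classes the model struggles with (low F1) get proportionally more mass.
    deficit = (1.0 - f1) / temperature
    weights = np.exp(deficit)
    return weights / weights.sum()

def resample_indices(y, weights, n_samples, rng=None):
    """Draw a training set of roughly n_samples indices, allocating each
    class a share proportional to its weight (sampling with replacement
    within a class, i.e. oversampling hard classes, undersampling easy ones).
    """
    rng = np.random.default_rng(rng)
    classes = np.unique(y)
    counts = np.floor(weights * n_samples).astype(int)
    idx = []
    for c, n in zip(classes, counts):
        pool = np.flatnonzero(y == c)
        idx.append(rng.choice(pool, size=n, replace=True))
    return np.concatenate(idx)
```

In a training loop, `class_sampling_weights` would be re-evaluated at the fixed interval the paper mentions (using per-class F1 on a validation split), and the next interval's batches would be drawn from `resample_indices`, so the effective class distribution tracks learning difficulty instead of staying static.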
