[2602.13506] $γ$-weakly $θ$-up-concavity: Linearizable Non-Convex Optimization with Applications to DR-Submodular and OSS Functions

arXiv - AI · 3 min read

Summary

The paper introduces $γ$-weakly $θ$-up-concavity, a first-order condition on monotone non-convex functions that strictly generalizes DR-submodularity and one-sided smoothness and yields unified approximation guarantees across offline and online optimization problems.

Why It Matters

This research addresses a significant challenge in machine learning and combinatorial optimization by providing a unified framework for non-convex functions. The findings can improve algorithms for DR-submodular and OSS functions, with implications for both offline and online optimization strategies.

Key Takeaways

  • Introduces a novel first-order condition for non-convex optimization.
  • Establishes upper-linearizability for $γ$-weakly $θ$-up-concave functions.
  • Provides unified approximation guarantees for a range of optimization problems.
  • Improves existing approximation coefficients for OSS optimization.
  • Demonstrates applications in both offline and online optimization settings.
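The paper's general $γ$-weakly $θ$-up-concave condition is not reproduced in this summary. As an illustrative anchor only, the standard first-order inequality for the unweighted up-concave special case (concavity along non-negative directions, which differentiable monotone DR-submodular functions are known to satisfy) reads:

```latex
% Illustrative special case (\gamma = 1); the paper's condition
% presumably relaxes this by factors depending on \gamma and \theta.
% For differentiable F and coordinate-wise comparable points x \le y:
F(y) \;\le\; F(x) + \langle \nabla F(x),\, y - x \rangle .
```

This is the kind of gradient-based upper bound that makes a linear surrogate informative about the non-linear objective.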

Computer Science > Machine Learning — arXiv:2602.13506 (cs) — Submitted on 13 Feb 2026

Title: $γ$-weakly $θ$-up-concavity: Linearizable Non-Convex Optimization with Applications to DR-Submodular and OSS Functions
Authors: Mohammad Pedramfar, Vaneet Aggarwal

Abstract: Optimizing monotone non-convex functions is a fundamental challenge across machine learning and combinatorial optimization. We introduce and study $\gamma$-weakly $\theta$-up-concavity, a novel first-order condition that characterizes a broad class of such functions. This condition provides a powerful unifying framework, strictly generalizing both DR-submodular functions and One-Sided Smooth (OSS) functions. Our central theoretical contribution demonstrates that $\gamma$-weakly $\theta$-up-concave functions are upper-linearizable: for any feasible point, we can construct a linear surrogate whose gains provably approximate the original non-linear objective. This approximation holds up to a constant factor, namely the approximation coefficient, dependent solely on $\gamma$, $\theta$, and the geometry of the feasible set. This linearizability yields immediate and unified approximation guarantees for a wide range of problems. Specifically, we obtain unified approximation guarantees for offline optimization as w...
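Upper-linearizability is what makes Frank-Wolfe-style methods applicable: each step maximizes the linear surrogate $\langle \nabla F(x), \cdot \rangle$ over the feasible set. The sketch below is a generic continuous-greedy-style loop of this kind, not the paper's algorithm; the function names `grad` and `lmo` (linear maximization oracle) are placeholders supplied by the caller.

```python
import numpy as np

def frank_wolfe_up_concave(grad, lmo, d, T=100):
    """Generic Frank-Wolfe / continuous-greedy sketch for monotone
    maximization over a down-closed feasible set.

    grad : callable returning the gradient of F at a point x
    lmo  : linear maximization oracle, v = argmax_{v in K} <g, v>
    d    : dimension of the decision variable
    T    : number of steps (step size 1/T)
    """
    x = np.zeros(d)
    for _ in range(T):
        g = grad(x)            # gradient defines the linear surrogate
        v = lmo(g)             # maximize the surrogate over K
        x = x + (1.0 / T) * v  # move a 1/T fraction toward v
    return x
```

For a monotone objective over the box $[0,1]^d$, e.g. $F(x) = \sum_i \log(1+x_i)$ with `lmo(g) = (g > 0)`, every surrogate step selects the all-ones vertex, so the iterate moves to the corner of the box.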
