[2604.00060] Scaled Gradient Descent for Ill-Conditioned Low-Rank Matrix Recovery with Optimal Sampling Complexity

arXiv - Machine Learning 4 min read

About this article

Statistics > Machine Learning — arXiv:2604.00060 (stat)
[Submitted on 31 Mar 2026]

Title: Scaled Gradient Descent for Ill-Conditioned Low-Rank Matrix Recovery with Optimal Sampling Complexity
Authors: Zhenxuan Li, Meng Huang

Abstract: The low-rank matrix recovery problem seeks to reconstruct an unknown $n_1 \times n_2$ rank-$r$ matrix from $m$ linear measurements, where $m \ll n_1 n_2$. This problem has been extensively studied over the past few decades, leading to a variety of algorithms with solid theoretical guarantees. Among these, gradient-descent-based non-convex methods have become particularly popular due to their computational efficiency. However, these methods typically suffer from two key limitations: a sub-optimal sample complexity of $O((n_1 + n_2)r^2)$ and an iteration complexity of $O(\kappa \log(1/\epsilon))$ to achieve $\epsilon$-accuracy, resulting in slow convergence when the target matrix is ill-conditioned. Here, $\kappa$ denotes the condition number of the unknown matrix. Recent studies show that a preconditioned variant of GD, known as scaled gradient descent (ScaledGD), can significantly reduce the iteration complexity to $O(\log(1/\epsilon))$. Nonetheless, its sample complexity remains sub-optimal at $O((n_1 + n_2)r^2)$. In contrast, a delicate virtual sequence tech...
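The ScaledGD update the abstract refers to can be sketched in a few lines of NumPy. The sketch below uses Gaussian matrix sensing as one concrete instance of the low-rank recovery problem; the problem sizes, condition number, step size, and iteration count are illustrative assumptions, not the paper's experimental setup. The update follows the standard ScaledGD form, in which the factor gradients are right-preconditioned by $(R^\top R)^{-1}$ and $(L^\top L)^{-1}$, which is what removes the $\kappa$ dependence from the convergence rate.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, r, m = 30, 30, 2, 700  # m linear measurements, m < n1*n2 = 900

# Ground-truth ill-conditioned rank-r matrix (condition number kappa = 20)
U, _ = np.linalg.qr(rng.standard_normal((n1, r)))
V, _ = np.linalg.qr(rng.standard_normal((n2, r)))
X_star = U @ np.diag([10.0, 0.5]) @ V.T

# Gaussian measurement ensemble, scaled so the adjoint map is roughly unbiased
A = rng.standard_normal((m, n1, n2)) / np.sqrt(m)
y = np.einsum('kij,ij->k', A, X_star)  # y_k = <A_k, X_star>

# Spectral initialization: top-r SVD of the backprojection A^*(y)
Uh, s, Vh = np.linalg.svd(np.einsum('k,kij->ij', y, A))
L = Uh[:, :r] * np.sqrt(s[:r])
R = Vh[:r, :].T * np.sqrt(s[:r])

eta = 0.5  # constant step size, chosen independently of kappa
for _ in range(500):
    resid = np.einsum('kij,ij->k', A, L @ R.T) - y  # A(L R^T) - y
    G = np.einsum('k,kij->ij', resid, A)            # gradient w.r.t. X = L R^T
    # Scaled (preconditioned) updates on each factor
    L_next = L - eta * (G @ R) @ np.linalg.inv(R.T @ R)
    R = R - eta * (G.T @ L) @ np.linalg.inv(L.T @ L)
    L = L_next

rel_err = np.linalg.norm(L @ R.T - X_star) / np.linalg.norm(X_star)
```

Dropping the two inverse factors recovers plain factored gradient descent, whose step size and iteration count would have to scale with $\kappa$; with the preconditioners in place the same constant step size works regardless of how unbalanced the singular values of the target matrix are.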

Originally published on April 02, 2026. Curated by AI News.

