[2510.24215] What Can Be Recovered Under Sparse Adversarial Corruption? Assumption-Free Theory for Linear Measurements
Summary
This paper characterizes exactly which sets of signals can be recovered from linear measurements corrupted by a sparse adversarial vector, without imposing structural assumptions on the measurement matrix or the signal, and presents a new theoretical framework for this setting.
Why It Matters
Understanding how to recover signals under adversarial conditions is crucial in fields like machine learning and signal processing. This research provides a foundation for developing robust algorithms that handle real-world data corruption without strict assumptions, clarifying exactly how much reliability such applications can guarantee.
Key Takeaways
- Introduces an assumption-free theory for recovering signals from linear measurements under sparse adversarial corruption.
- Identifies the smallest robust solution set as $x^\star + \ker(U)$, where $U$ is the projection matrix onto the intersection of the rowspaces of all submatrices of $A$ obtained by deleting $2q$ rows.
- Demonstrates that every minimizer of the $\ell_0$-norm of the residual $y - Ax$ lies in this robust set, yielding a constructive approach to recover it.
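The set characterization above can be sketched numerically. The following is a hypothetical illustration (not the authors' code, and only practical for small $m$): it builds the projector $U$ by taking the orthogonal complement of the sum of the complements of the submatrix rowspaces, since the intersection of subspaces is the orthogonal complement of the sum of their orthogonal complements.

```python
import itertools
import numpy as np

def robust_projection(A, q, tol=1e-10):
    """Orthogonal projector onto the intersection of the rowspaces of all
    submatrices of A with 2q rows deleted (assumes m - 2q >= 1)."""
    m, n = A.shape
    comp_cols = []
    for rows in itertools.combinations(range(m), m - 2 * q):
        sub = A[list(rows), :]
        # Orthonormal basis of rowspace(sub)^perp from the trailing
        # right-singular vectors of the submatrix.
        _, s, Vt = np.linalg.svd(sub)
        rank = int(np.sum(s > tol))
        comp_cols.append(Vt[rank:].T)
    C = np.hstack(comp_cols)
    if C.shape[1] == 0:
        return np.eye(n)  # all complements trivial: intersection is R^n
    # Projector onto the sum of the complements, then complement it.
    W, s, _ = np.linalg.svd(C, full_matrices=False)
    r = int(np.sum(s > tol))
    return np.eye(n) - W[:, :r] @ W[:, :r].T

# Hypothetical example: all rows of A lie on one line, so the second
# coordinate of x can never be protected against corruption.
A = np.array([[1., 0.], [2., 0.], [3., 0.], [4., 0.]])
U = robust_projection(A, q=1)
# U projects onto span{(1,0)}; ker(U) = span{(0,1)} is the ambiguity.
```

For a generic matrix with enough rows, every deleted-row submatrix still has full rowspace, so $U = I$ and the robust set collapses to the single point $x^\star$.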
Computer Science > Information Theory — arXiv:2510.24215 (cs)
[Submitted on 28 Oct 2025 (v1), last revised 16 Feb 2026 (this version, v3)]
Title: What Can Be Recovered Under Sparse Adversarial Corruption? Assumption-Free Theory for Linear Measurements
Authors: Vishal Halder, Alexandre Reiffers-Masson, Abdeldjalil Aïssa-El-Bey, Gugan Thoppe
Abstract: Let $A \in \mathbb{R}^{m \times n}$ be an arbitrary, known matrix and $e$ a $q$-sparse adversarial vector. Given $y = A x^\star + e$ and $q$, we seek the smallest robust solution set containing $x^\star$ that is uniformly recoverable from $y$ without knowing $e$. While exact recovery of $x^\star$ via strong (and often impractical) structural assumptions on $A$ or $x^\star$ (e.g., restricted isometry, sparsity) is well studied, recoverability for arbitrary $A$ and $x^\star$ remains open. Our main result shows that the smallest robust solution set is $x^\star + \ker(U)$, where $U$ is the unique projection matrix onto the intersection of rowspaces of all possible submatrices of $A$ obtained by deleting $2q$ rows. Moreover, we prove that every $x$ that minimizes the $\ell_0$-norm of $y - A x$ lies in $x^\star + \ker(U)$, which then gives a constructive approach to recover this set.
Subjects: Information Theory (cs.IT); Machine...
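The abstract's constructive route can be illustrated with a brute-force sketch of $\ell_0$-residual minimization (a hypothetical demonstration, not the paper's algorithm): any minimizer of $\|y - Ax\|_0$ must agree with $y$ exactly on some subset of rows, so for small $m$ one can enumerate row subsets, fit each exact subsystem, and keep the candidates whose full residual is sparsest. Here $A$ is generic, so $\ker(U)$ is trivial and every minimizer coincides with $x^\star$.

```python
import itertools
import numpy as np

def l0_minimizers(A, y, tol=1e-8):
    """All candidate minimizers of ||y - Ax||_0 found by enumerating row
    subsets and solving each subsystem by least squares (small m only)."""
    m, n = A.shape
    best, best_count = [], m + 1
    for k in range(n, m + 1):  # need at least n rows to pin down x
        for S in itertools.combinations(range(m), k):
            x, *_ = np.linalg.lstsq(A[list(S)], y[list(S)], rcond=None)
            count = int(np.sum(np.abs(y - A @ x) > tol))
            if count < best_count:
                best, best_count = [x], count
            elif count == best_count:
                best.append(x)
    return best, best_count

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 2))          # generic A: robust set is {x*}
x_star = np.array([1.0, -2.0])
e = np.zeros(6)
e[3] = 5.0                               # 1-sparse adversarial corruption
y = A @ x_star + e
sols, count = l0_minimizers(A, y)
# Every l0 minimizer here coincides with x_star, matching the theorem.
```

Note this enumeration is exponential in $m$; it is meant only to make the set characterization concrete, not as a practical recovery method.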