[2602.22536] Persistent Nonnegative Matrix Factorization via Multi-Scale Graph Regularization

arXiv - Machine Learning 3 min read Article

Summary

The paper introduces Persistent Nonnegative Matrix Factorization (pNMF), which uses multi-scale graph regularization to produce a family of persistence-aligned embeddings, addressing the single-scale limitation of traditional NMF methods.

Why It Matters

This research matters because it captures multi-scale connectivity structure in data, which single-scale methods miss — a property that is crucial for analyzing and interpreting complex datasets such as single-cell RNA sequencing.

Key Takeaways

  • pNMF offers a sequence of persistence-aligned embeddings rather than a single output.
  • The method incorporates persistent homology to identify critical scales of connectivity changes.
  • A new coupled NMF formulation is introduced, enhancing cross-scale consistency.
  • The proposed algorithm ensures guaranteed convergence, addressing computational challenges.
  • Numerical experiments validate the effectiveness of pNMF in multi-scale low-rank embeddings.
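The coupled formulation described above can be sketched in code. The following is an illustrative implementation, not the authors' algorithm: it combines a standard graph-regularized NMF multiplicative update (in the style of Cai et al.'s GNMF) with a soft quadratic coupling to the previous scale's embedding. The function names, the regularization weights `lam` and `mu`, and the iteration count are all assumptions for the sketch.

```python
import numpy as np

def graph_regularized_nmf(X, A, k, H_prev=None, lam=0.1, mu=0.1,
                          iters=200, seed=0):
    """One scale of a (hypothetical) coupled NMF:
    min ||X - W H||_F^2 + lam * tr(H L H^T) + mu * ||H - H_prev||_F^2,
    with L = D - A the graph Laplacian at this scale."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + 0.1   # nonnegative init
    H = rng.random((k, n)) + 0.1
    deg = A.sum(axis=1)            # degree vector; H @ D == H * deg
    eps = 1e-10
    for _ in range(iters):
        # Multiplicative update for W (plain NMF part).
        W *= (X @ H.T) / (W @ (H @ H.T) + eps)
        # Multiplicative update for H with graph regularization:
        # the Laplacian splits as L = D - A, so -A goes to the
        # numerator and D to the denominator.
        num = W.T @ X + lam * (H @ A)
        den = W.T @ W @ H + lam * (H * deg)
        if H_prev is not None:
            # Cross-scale consistency term mu * ||H - H_prev||^2.
            num = num + mu * H_prev
            den = den + mu * H
        H *= num / (den + eps)
    return W, H

def pnmf_sketch(X, adjacencies, k, lam=0.1, mu=0.1):
    """Solve the scale-parameterized family sequentially, coupling
    each scale's embedding to the previous one."""
    results, H_prev = [], None
    for A in adjacencies:
        W, H = graph_regularized_nmf(X, A, k, H_prev=H_prev,
                                     lam=lam, mu=mu)
        results.append((W, H))
        H_prev = H
    return results
```

Because each update multiplies nonnegative factors by nonnegative ratios, all embeddings stay nonnegative, and the coupling term `mu * H_prev` biases each scale's solution toward its predecessor, giving the cross-scale consistency the paper describes.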

Computer Science > Machine Learning — arXiv:2602.22536 (cs) [Submitted on 26 Feb 2026]

Title: Persistent Nonnegative Matrix Factorization via Multi-Scale Graph Regularization
Authors: Jichao Zhang, Ran Miao, Limin Li

Abstract: Matrix factorization techniques, especially Nonnegative Matrix Factorization (NMF), have been widely used for dimensionality reduction and interpretable data representation. However, existing NMF-based methods are inherently single-scale and fail to capture the evolution of connectivity structures across resolutions. In this work, we propose persistent nonnegative matrix factorization (pNMF), a scale-parameterized family of NMF problems that produces a sequence of persistence-aligned embeddings rather than a single one. By leveraging persistent homology, we identify a canonical minimal sufficient scale set at which the underlying connectivity undergoes qualitative changes. These canonical scales induce a sequence of graph Laplacians, leading to a coupled NMF formulation with scale-wise geometric regularization and an explicit cross-scale consistency constraint. We analyze the structural properties of the embeddings along the scale parameter and establish bounds on their increments between consecutive scales. The resulting model defines a nontrivial solution path across scales, rathe...
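The abstract's "canonical scales at which connectivity undergoes qualitative changes" correspond, in the 0-dimensional case, to the distance thresholds where connected components of the neighborhood graph merge. The sketch below illustrates that idea only; it is not the paper's scale-selection procedure, and the function name is hypothetical.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from scipy.spatial.distance import pdist, squareform

def critical_scales(points):
    """Return the distance thresholds at which connected components
    of the epsilon-neighborhood graph merge — the 0-dimensional
    persistence events of a Rips-style filtration."""
    D = squareform(pdist(np.asarray(points, dtype=float)))
    n = D.shape[0]
    scales = []
    prev = n  # every point starts as its own component
    for eps in np.unique(D[D > 0]):
        # Adjacency at scale eps: connect points within distance eps.
        graph = csr_matrix((D <= eps) & (D > 0))
        ncomp, _ = connected_components(graph, directed=False)
        if ncomp < prev:          # components merged: a critical scale
            scales.append(float(eps))
            prev = ncomp
        if ncomp == 1:            # fully connected: filtration saturates
            break
    return scales
```

Each returned scale `eps` yields an adjacency matrix `(D <= eps) & (D > 0)` and hence a graph Laplacian, giving the sequence of Laplacians that drives the coupled factorization.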

Related Articles

Machine Learning

[D] Does ML have a "bible"/reference textbook at the Intermediate/Advanced level?

Hello, everyone! This is my first time posting here and I apologise if the question is, perhaps, a bit too basic for this sub-reddit. A b...

Reddit - Machine Learning · 1 min ·
Machine Learning

[D] ICML 2026 review policy debate: 100 responses suggest Policy B may score higher, while Policy A shows higher confidence

A week ago I made a thread asking whether ICML 2026’s review policy might have affected review outcomes, especially whether Policy A pape...

Reddit - Machine Learning · 1 min ·
Machine Learning

Nomadic raises $8.4 million to wrangle the data pouring off autonomous vehicles | TechCrunch

The company turns footage from robots into structured, searchable datasets with a deep learning model.

TechCrunch - AI · 6 min ·
Machine Learning

[D] Applied AI/Machine learning course by Srikanth Varma

I have all 10 modules of this course, along with all the notes, assignments, and solutions. If anyone need this course DM me. submitted b...

Reddit - Machine Learning · 1 min ·