[2510.17734] Efficient Tensor Completion Algorithms for Highly Oscillatory Operators

arXiv - Machine Learning · 4 min read

Summary

This paper introduces efficient tensor completion algorithms designed for reconstructing highly oscillatory operators, demonstrating significant computational advantages.

Why It Matters

The development of low-complexity tensor completion algorithms is crucial for applications in numerical analysis and machine learning, particularly in fields like seismic data processing where accurate reconstruction of oscillatory operators is essential. The proposed methods promise improved efficiency and accuracy, potentially transforming computational practices in these areas.

Key Takeaways

  • Introduces low-complexity tensor completion algorithms for oscillatory operators.
  • Utilizes butterfly decomposition for efficient matrix representation.
  • Demonstrates significant speedup in computational cost compared to existing methods.
  • Achieves lower reconstruction errors, enhancing accuracy in applications.
  • Proposes a novel initial guess generation strategy for algorithm optimization.
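The paper's initial-guess strategy relies on low-rank matrix completion, which is commonly solved with alternating least squares (ALS) — the same optimization style as the first of the paper's two butterfly completion algorithms. The sketch below is a generic ALS matrix-completion routine for intuition only, not the paper's implementation; the function name, regularization, and iteration count are illustrative assumptions.

```python
import numpy as np

def als_matrix_completion(M, mask, rank, n_iters=50, reg=1e-6):
    """Low-rank matrix completion via alternating least squares.

    Illustrative sketch only, not the paper's algorithm. `M` holds the
    observed entries (anything unobserved is ignored), `mask` is a boolean
    array marking observed entries, and the result approximates M on the
    observed set by the rank-`rank` product U @ V.T.
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    I = reg * np.eye(rank)  # small ridge term keeps the solves well-posed
    for _ in range(n_iters):
        # Fix V, solve a small regularized least-squares problem per row of U,
        # restricted to the observed columns of that row.
        for i in range(m):
            cols = mask[i]
            Vi = V[cols]
            U[i] = np.linalg.solve(Vi.T @ Vi + I, Vi.T @ M[i, cols])
        # Symmetrically, fix U and solve per row of V.
        for j in range(n):
            rows = mask[:, j]
            Uj = U[rows]
            V[j] = np.linalg.solve(Uj.T @ Uj + I, Uj.T @ M[rows, j])
    return U, V
```

Each subproblem is a tiny `rank × rank` linear solve, so the per-iteration cost scales with the number of observed entries rather than the full matrix size.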

Mathematics > Numerical Analysis
arXiv:2510.17734 (math)
[Submitted on 20 Oct 2025 (v1), last revised 16 Feb 2026 (this version, v3)]

Title: Efficient Tensor Completion Algorithms for Highly Oscillatory Operators
Authors: Navjot Singh, Edgar Solomonik, Xiaoye Sherry Li, Yang Liu

Abstract: This paper presents low-complexity tensor completion algorithms and their efficient implementation to reconstruct highly oscillatory operators discretized as $n\times n$ matrices. The underlying tensor decomposition is based on the reshaping of the input matrix and its butterfly decomposition into an order-$O(\log n)$ tensor. Reshaping the input matrix into a tensor allows the butterfly decomposition to be represented as a tensor decomposition with dense tensors, which enables efficient reuse of existing software infrastructure for dense and sparse tensor computations. We propose two tensor completion algorithms in the butterfly format, using alternating least squares and gradient-based optimization, as well as a novel strategy that uses low-rank matrix completion to efficiently generate an initial guess for the proposed algorithms. To demonstrate the efficiency and applicability of our proposed algorithms, we perform three numerical experiments using simulated oscillatory operators in seismic applicatio...
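The abstract's first step, reshaping an $n\times n$ matrix into an order-$O(\log n)$ tensor, can be sketched as below. This is a hypothetical illustration assuming $n$ is a power of two and modes of size 2; the paper's butterfly decomposition may order and group the modes differently.

```python
import numpy as np

def matrix_to_tensor(A):
    """Reshape an n x n matrix (n a power of two) into an order 2*log2(n)
    tensor with all mode sizes equal to 2.

    Illustrative only: the mode ordering used by an actual butterfly
    decomposition may differ from a plain row-major reshape.
    """
    n = A.shape[0]
    L = int(np.log2(n))
    assert A.shape == (n, n) and 2 ** L == n, "expect square, power-of-two size"
    # An n x n matrix has n^2 = 2^(2L) entries, so it reshapes exactly
    # into a tensor with 2L binary modes -- order O(log n).
    return A.reshape((2,) * (2 * L))
```

The reshape is lossless (`T.reshape(n, n)` recovers `A`), which is what lets butterfly factors of the matrix be expressed as a decomposition of this tensor into dense factor tensors.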
