[2602.21466] Asymptotically Fast Clebsch-Gordan Tensor Products with Vector Spherical Harmonics

arXiv - Machine Learning

Summary

This article presents a novel algorithm for computing Clebsch-Gordan tensor products using vector spherical harmonics, reducing the runtime for $E(3)$-equivariant neural networks from the naive $O(L^6)$ to $O(L^4 \log^2 L)$.

Why It Matters

The advancement in computing Clebsch-Gordan tensor products is crucial for enhancing the performance of $E(3)$-equivariant neural networks, which are widely used in 3D modeling tasks. This research addresses the limitations of existing methods and offers a more efficient approach, potentially impacting various applications in machine learning and computational physics.

Key Takeaways

  • Introduces the first complete algorithm for Clebsch-Gordan tensor products that improves runtime complexity from the naive $O(L^6)$ to $O(L^4 \log^2 L)$, close to the $O(L^4)$ lower bound.
  • Addresses earlier accelerated methods' loss of expressivity and missed interactions in tensor products, enhancing the performance of neural networks.
  • Generalizes fast Fourier-based convolution to derive a new tensor-product approach based on vector spherical harmonics.
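To make the $O(L^6)$ baseline concrete, here is a minimal sketch (not the paper's algorithm) of a single coupling path of the naive Clebsch-Gordan tensor product, using SymPy's `CG` coefficients; the feature layout (one array per irrep, components indexed $m = -l, \dots, l$) is an assumption for illustration.

```python
import numpy as np
from sympy.physics.quantum.cg import CG

def naive_cg_tensor_product(x, y, l1, l2, l3):
    """Couple irrep features x (degree l1) and y (degree l2) into degree l3.

    x has 2*l1 + 1 components indexed by m1 = -l1, ..., l1 (same for y).
    Summed over all O(L^3) valid (l1, l2, l3) triples up to degree L, this
    dense triple loop over (m1, m2, m3) is the O(L^6) naive baseline.
    """
    out = np.zeros(2 * l3 + 1)
    for i1, m1 in enumerate(range(-l1, l1 + 1)):
        for i2, m2 in enumerate(range(-l2, l2 + 1)):
            for i3, m3 in enumerate(range(-l3, l3 + 1)):
                # CG coefficients vanish unless m3 = m1 + m2.
                c = float(CG(l1, m1, l2, m2, l3, m3).doit())
                out[i3] += c * x[i1] * y[i2]
    return out

# Couple two l = 1 features into an l = 2 output.
x = np.array([1.0, 0.0, 0.0])  # only the m = -1 component is set
y = np.array([0.0, 1.0, 0.0])  # only the m = 0 component is set
z = naive_cg_tensor_product(x, y, 1, 1, 2)
```

The algorithm in the paper avoids materializing these coefficient tensors altogether, which is where the asymptotic gain comes from.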

Computer Science > Machine Learning
arXiv:2602.21466 (cs)
[Submitted on 25 Feb 2026]

Title: Asymptotically Fast Clebsch-Gordan Tensor Products with Vector Spherical Harmonics
Authors: YuQing Xie, Ameya Daigavane, Mit Kotak, Tess Smidt

Abstract: $E(3)$-equivariant neural networks have proven to be effective in a wide range of 3D modeling tasks. A fundamental operation of such networks is the tensor product, which allows interaction between different feature types. Because this operation scales poorly, there has been considerable work towards accelerating it. However, \citet{xieprice} have recently pointed out that most speedups come from a reduction in expressivity rather than true algorithmic improvements in computing Clebsch-Gordan tensor products. A modification of the Gaunt tensor product \citep{gaunt} can give a true asymptotic speedup but is incomplete and misses many interactions. In this work, we provide the first complete algorithm which truly provides asymptotic benefits for Clebsch-Gordan tensor products. For the full CGTP, our algorithm brings the runtime complexity from the naive $O(L^6)$ to $O(L^4\log^2 L)$, close to the lower bound of $O(L^4)$. We first show how generalizing fast Fourier based convolution naturally leads to the previously proposed Gaunt tensor product \citep{gaunt}. To rem...
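The "fast Fourier based convolution" the abstract generalizes has a familiar 1D analogue: circular convolution, which the FFT reduces from $O(n^2)$ to $O(n \log n)$ by turning convolution into a pointwise product in frequency space. The sketch below illustrates only that 1D analogy, not the paper's spherical generalization.

```python
import numpy as np

def circular_convolution_naive(f, g):
    """Direct O(n^2) circular convolution of two length-n signals."""
    n = len(f)
    out = np.zeros(n)
    for k in range(n):
        for j in range(n):
            out[k] += f[j] * g[(k - j) % n]
    return out

def circular_convolution_fft(f, g):
    """Same convolution via the FFT: pointwise product in frequency, O(n log n)."""
    return np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)))

f = np.random.default_rng(0).normal(size=8)
g = np.random.default_rng(1).normal(size=8)
assert np.allclose(circular_convolution_naive(f, g), circular_convolution_fft(f, g))
```

On the sphere, the analogous "pointwise product" of spherical-harmonic expansions yields the Gaunt tensor product, which is what the paper then extends to recover the interactions that construction misses.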
