[2506.06858] FA-INR: Adaptive Implicit Neural Representations for Interpretable Exploration of Simulation Ensembles


arXiv - AI


Computer Science > Machine Learning

arXiv:2506.06858 (cs) [Submitted on 7 Jun 2025 (v1), last revised 31 Mar 2026 (this version, v3)]

Title: FA-INR: Adaptive Implicit Neural Representations for Interpretable Exploration of Simulation Ensembles

Authors: Ziwei Li, Yuhan Duan, Tianyu Xiong, Yi-Tang Chen, Wei-Lun Chao, Han-Wei Shen

Abstract: Surrogate models are essential for efficient exploration of large-scale ensemble simulations. Implicit neural representations (INRs) provide a compact and continuous framework for modeling spatially structured data, but they often struggle to learn complex localized structures within scientific fields. Recent INR-based surrogates address this by augmenting INRs with explicit feature structures, but at the cost of flexibility and substantial memory overhead. In this paper, we present Feature-Adaptive INR (FA-INR), an adaptive INR-based surrogate model for high-fidelity and interpretable exploration of ensemble simulations. Instead of relying on structured feature representations, FA-INR leverages cross-attention over a learnable key-value memory bank to allocate model capacity adaptively based on the data characteristics. To further improve scalability, we introduce a coordinate-guided mixture of experts (MoE) framework that enhances both eff...
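The abstract's central mechanism, cross-attention from a coordinate embedding into a learnable key-value memory bank, can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: all sizes are assumptions, and the "learnable" keys and values are random stand-ins here rather than trained parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d, M = 16, 32                      # feature dim and memory-bank size (hypothetical)
keys = rng.normal(size=(M, d))     # learnable keys (random stand-ins)
values = rng.normal(size=(M, d))   # learnable values (random stand-ins)

def attend(coord_feat):
    """Cross-attention: coordinate embeddings query the memory bank.

    coord_feat: (N, d) embeddings of N query coordinates.
    Returns (N, d) features assembled adaptively from the memory values.
    """
    scores = coord_feat @ keys.T / np.sqrt(d)  # (N, M) scaled dot-product scores
    weights = softmax(scores, axis=-1)         # (N, M) attention over memory slots
    return weights @ values                    # (N, d) weighted value readout

q = rng.normal(size=(4, d))  # embeddings of 4 query coordinates
feat = attend(q)
print(feat.shape)  # (4, 16)
```

In this reading, capacity adapts to the data because the attention weights decide, per coordinate, which memory slots contribute to the feature, instead of binding features to a fixed spatial grid; a decoder MLP would then map `feat` to the field value.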

Originally published on April 01, 2026. Curated by AI News.


