[2506.08916] Enhancing generalizability of model discovery across parameter space with multi-experiment equation learning (ME-EQL)


arXiv - Machine Learning

About this article


Computer Science > Machine Learning
arXiv:2506.08916 (cs)
[Submitted on 10 Jun 2025 (v1), last revised 24 Mar 2026 (this version, v2)]

Title: Enhancing generalizability of model discovery across parameter space with multi-experiment equation learning (ME-EQL)

Authors: Maria-Veronica Ciocanel, John T. Nardini, Kevin B. Flores, Erica M. Rutter, Suzanne S. Sindi, Alexandria Volkening

Abstract: Agent-based modeling (ABM) is a powerful tool for understanding self-organizing biological systems, but it is computationally intensive and often not analytically tractable. Equation learning (EQL) methods can derive continuum models from ABM data, but they typically require extensive simulations for each parameter set, raising concerns about generalizability. In this work, we extend EQL to multi-experiment equation learning (ME-EQL) by introducing two methods: one-at-a-time ME-EQL (OAT ME-EQL), which learns individual models for each parameter set and connects them via interpolation, and embedded structure ME-EQL (ES ME-EQL), which builds a unified model library across parameters. We demonstrate these methods using a birth-death mean-field model and an on-lattice agent-based model of birth, death, and migration with spatial structure. Our results show that both meth...
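To make the OAT ME-EQL idea from the abstract concrete, the following is a minimal sketch, not the paper's implementation. It assumes a logistic-type birth-death mean-field ODE du/dt = b·u·(1 − u) − d·u as a stand-in for the paper's model (the exact form and parameter ranges here are illustrative), uses SINDy-style sparse regression against a small polynomial library as the equation-learning step, and then interpolates the learned coefficients across the birth-rate parameter b:

```python
import numpy as np

def simulate(b, d, u0=0.05, T=20.0, n=2000):
    # Forward-Euler simulation of a stand-in birth-death mean-field ODE:
    # du/dt = b*u*(1 - u) - d*u  (logistic birth with crowding, linear death).
    t = np.linspace(0.0, T, n)
    dt = t[1] - t[0]
    u = np.empty(n)
    u[0] = u0
    for i in range(n - 1):
        u[i + 1] = u[i] + dt * (b * u[i] * (1 - u[i]) - d * u[i])
    return t, u

def learn_coeffs(t, u, threshold=1e-3):
    # EQL step: regress the numerical derivative du/dt onto a polynomial
    # library [u, u^2] with hard thresholding (SINDy-style sparse regression).
    dudt = np.gradient(u, t)
    library = np.column_stack([u, u ** 2])
    coeffs, *_ = np.linalg.lstsq(library, dudt, rcond=None)
    coeffs[np.abs(coeffs) < threshold] = 0.0
    return coeffs  # [c1, c2] in du/dt ~ c1*u + c2*u^2

# OAT ME-EQL idea: learn one model per sampled parameter value,
# then connect the learned coefficients via interpolation.
b_train = np.array([0.4, 0.6, 0.8, 1.0])  # illustrative birth rates
d = 0.1                                    # fixed death rate (assumed)
learned = np.array([learn_coeffs(*simulate(b, d)) for b in b_train])

# Interpolate each coefficient as a function of b at an unseen value.
b_query = 0.7
c_interp = np.array([np.interp(b_query, b_train, learned[:, k])
                     for k in range(learned.shape[1])])
# For this stand-in ODE the true coefficients are c1 = b - d, c2 = -b,
# so at b = 0.7 the interpolated model should land roughly at [0.6, -0.7].
```

The ES ME-EQL alternative described in the abstract would instead fit a single library whose coefficients are explicit functions of the parameters (e.g. regressing on terms like b·u directly), rather than interpolating between separately learned models as done here.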

Originally published on March 25, 2026. Curated by AI News.

