[2509.22353] Context and Diversity Matter: The Emergence of In-Context Learning in World Models
Computer Science > Machine Learning

arXiv:2509.22353 (cs)

[Submitted on 26 Sep 2025 (v1), last revised 27 Feb 2026 (this version, v2)]

Title: Context and Diversity Matter: The Emergence of In-Context Learning in World Models

Authors: Fan Wang, Zhiyuan Chen, Yuxuan Zhong, Sunjian Zheng, Pengtao Shao, Bo Yu, Shaoshan Liu, Jianan Wang, Ning Ding, Yang Cao, Yu Kang

Abstract: The capability of predicting environmental dynamics underpins both biological neural systems and general embodied AI in adapting to their surroundings. Yet prevailing approaches rest on static world models that falter when confronted with novel or rare configurations. We investigate in-context learning (ICL) of world models, shifting attention from zero-shot performance to the growth and asymptotic limits of the world model. Our contributions are three-fold: (1) we formalize ICL of a world model and identify two core mechanisms: environment recognition (ER) and environment learning (EL); (2) we derive error upper bounds for both mechanisms that expose how they emerge; and (3) we empirically confirm that distinct ICL mechanisms exist in the world model, and we further investigate how data distribution and model architecture affect ICL in a manner consistent with theory. These findings demonstrate the potential of self-adapting ...