[2601.17094] The Mouth is Not the Brain: Bridging Energy-Based World Models and Language Generation
Computer Science > Machine Learning

arXiv:2601.17094 (cs)

[Submitted on 23 Jan 2026 (v1), last revised 31 Mar 2026 (this version, v2)]

Title: The Mouth is Not the Brain: Bridging Energy-Based World Models and Language Generation

Authors: Junichiro Niimi

Abstract: Large Language Models (LLMs) generate fluent text, yet whether they truly understand the world or merely produce plausible text about it remains contested. We propose an architectural principle, "the mouth is not the brain," that explicitly separates world models from language models. Our architecture comprises three components: a Deep Boltzmann Machine (DBM) that captures domain structure as an energy-based world model, an adapter that projects latent belief states into embedding space, and a frozen GPT-2 that provides linguistic competence without domain knowledge. We instantiate this framework in the consumer-review domain using Amazon smartphone reviews. Experiments demonstrate that (1) world-model conditioning achieves lower cross-entropy loss and higher semantic similarity than architectural baselines, including direct projection and full fine-tuning, while qualitative analysis reveals that soft-prompt conditioning resolves a trade-off that prompt-based approaches cannot: simple prompts lack expressiveness, while detailed prompts cause output collapse in small LLMs; (2) the ...
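The adapter-conditioning idea described in the abstract can be sketched minimally as follows. All dimensions, variable names, and the choice of a single linear map for the adapter are illustrative assumptions, not details taken from the paper: the sketch only shows the general pattern of projecting a world-model belief state into a block of soft-prompt vectors that is prepended to a frozen language model's token embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): DBM belief size,
# number of soft-prompt tokens, and the LM embedding width.
D_BELIEF, K_PROMPT, D_MODEL = 32, 8, 768

# Adapter: a single linear map (illustrative choice) that projects the
# DBM's latent belief state into K_PROMPT vectors in LM embedding space.
W = rng.normal(scale=0.02, size=(D_BELIEF, K_PROMPT * D_MODEL))

def adapt(belief: np.ndarray) -> np.ndarray:
    """Project a belief vector to (K_PROMPT, D_MODEL) soft prompts."""
    return (belief @ W).reshape(K_PROMPT, D_MODEL)

# Conditioning: prepend the soft prompts to the frozen LM's token
# embeddings; only the adapter (W) would be trained, the LM stays fixed.
belief = rng.normal(size=D_BELIEF)
token_embeds = rng.normal(size=(10, D_MODEL))  # stand-in for GPT-2 embeddings
conditioned = np.concatenate([adapt(belief), token_embeds], axis=0)
print(conditioned.shape)  # (18, 768)
```

In a real implementation the conditioned embedding sequence would be fed to the frozen GPT-2 in place of plain token embeddings, so the world model influences generation without any prompt text and without updating the language model's weights.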