[2602.07075] LatentChem: From Textual CoT to Latent Thinking in Chemical Reasoning
Physics > Chemical Physics
arXiv:2602.07075 (physics)
[Submitted on 6 Feb 2026 (v1), last revised 5 Mar 2026 (this version, v2)]

Title: LatentChem: From Textual CoT to Latent Thinking in Chemical Reasoning
Authors: Xinwu Ye, Yicheng Mao, Jia Zhang, Yimeng Liu, Li Hao, Fang Wu, Zhiwei Li, Yuxuan Liao, Zehong Wang, Zhiyuan Liu, Zhenfei Yin, Li Yuan, Philip Torr, Huan Sun, Xiangxiang Zeng, Mengdi Wang, Le Cong, Shenghua Gao, Xiangru Tang

Abstract: Chemical large language models (LLMs) predominantly rely on explicit Chain-of-Thought (CoT) in natural language to perform complex reasoning. However, chemical reasoning is inherently continuous and structural, and forcing it into discrete linguistic tokens introduces a fundamental representation mismatch that constrains both efficiency and performance. We introduce LatentChem, a latent reasoning interface that decouples chemical computation from textual generation, enabling models to perform multi-step reasoning directly in continuous latent space while emitting language only for final outputs. Remarkably, we observe a consistent emergent behavior: when optimized solely for task success, models spontaneously internalize reasoning, progressively abandoning verbose textual derivations in favor of implicit latent computation. This shift is not merely stylistic but computationa...
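To make the abstract's core idea concrete, below is a minimal sketch of one common way to implement latent-space reasoning of the kind described: instead of decoding intermediate reasoning into vocabulary tokens, the model's last hidden state is fed back as a continuous "thought" embedding for a fixed number of silent steps, and text is decoded only afterward. This is an assumption-laden illustration, not LatentChem's published interface; the class name LatentReasoner, the feedback projection, the number of latent steps, and the HuggingFace-style backbone call (inputs_embeds in, last_hidden_state out) are all hypothetical.

import torch
import torch.nn as nn

class LatentReasoner(nn.Module):
    """Hypothetical latent-reasoning wrapper: reason in hidden states, decode text last."""

    def __init__(self, backbone: nn.Module, embed_dim: int, latent_steps: int = 4):
        super().__init__()
        self.backbone = backbone          # any causal LM exposing a HF-style forward (assumed)
        self.latent_steps = latent_steps  # number of silent latent steps (assumed hyperparameter)
        self.proj = nn.Linear(embed_dim, embed_dim)  # maps a hidden state to the next input embedding

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # input_embeds: (batch, seq_len, embed_dim) embeddings of the prompt tokens
        seq = input_embeds
        for _ in range(self.latent_steps):
            hidden = self.backbone(inputs_embeds=seq).last_hidden_state
            # Reuse the final position's hidden state as a continuous "thought",
            # skipping the discretization into vocabulary tokens entirely.
            thought = self.proj(hidden[:, -1:, :])
            seq = torch.cat([seq, thought], dim=1)
        # Only after the latent loop would a caller decode natural-language output from `seq`.
        return seq

Under this reading, the "emergent internalization" the abstract reports would correspond to the model shifting work from decoded CoT tokens into these continuous steps when training rewards only the final answer.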