[2602.19128] K-Search: LLM Kernel Generation via Co-Evolving Intrinsic World Model

arXiv - AI · 4 min read · Article

Summary

The paper presents K-Search, a framework that optimizes GPU kernels using a co-evolving intrinsic world model, achieving an average 2.10x speedup over state-of-the-art evolutionary search methods.

Why It Matters

As machine learning systems grow more complex, GPU kernel optimization is crucial for efficiency. K-Search addresses a key limitation of current automated approaches, which treat LLMs as mere stochastic code generators, by instead using them for explicit algorithmic planning.

Key Takeaways

  • K-Search improves GPU kernel optimization by using a co-evolving world model.
  • The framework outperforms state-of-the-art evolutionary search methods by an average of 2.10x.
  • K-Search enables better navigation of complex optimization paths, enhancing resilience to implementation defects.
  • The method achieves notable performance on diverse kernels, including MoE kernels.
  • This research highlights the potential of LLMs in optimizing complex computational tasks.

Computer Science > Artificial Intelligence
arXiv:2602.19128 (cs) · Submitted on 22 Feb 2026

Title: K-Search: LLM Kernel Generation via Co-Evolving Intrinsic World Model
Authors: Shiyi Cao, Ziming Mao, Joseph E. Gonzalez, Ion Stoica

Abstract: Optimizing GPU kernels is critical for efficient modern machine learning systems yet remains challenging due to the complex interplay of design factors and rapid hardware evolution. Existing automated approaches typically treat Large Language Models (LLMs) merely as stochastic code generators within heuristic-guided evolutionary loops. These methods often struggle with complex kernels requiring coordinated, multi-step structural transformations, as they lack explicit planning capabilities and frequently discard promising strategies due to inefficient or incorrect intermediate implementations. To address this, we propose Search via Co-Evolving World Model and build K-Search based on this method. By replacing static search heuristics with a co-evolving world model, our framework leverages LLMs' prior domain knowledge to guide the search, actively exploring the optimization space. This approach explicitly decouples high-level algorithmic planning from low-level program instantiation, enabling the system to navigate non-monotonic optimization paths while remaining resilient to temp...
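To make the abstract's core idea concrete, here is a minimal, purely illustrative sketch of a search loop that decouples high-level planning from low-level instantiation and updates its beliefs rather than discarding a strategy when an implementation fails. Every name, data structure, and number below is an assumption for illustration; this is not the paper's actual algorithm or API.

```python
import random

# Hypothetical sketch (not the paper's method): a "world model" is just a
# dict of belief scores over high-level strategies. Planning picks the most
# promising strategy; instantiation (standing in for LLM codegen) may fail,
# in which case the belief is only slightly lowered instead of the strategy
# being thrown away, mirroring the "resilient to defects" idea.

def propose_strategy(world_model, history):
    """Planning step: pick the strategy with the highest current belief,
    lightly penalizing ones already tried to encourage exploration."""
    return max(world_model, key=lambda s: world_model[s] - 0.1 * history.count(s))

def instantiate(strategy):
    """Instantiation step: stand-in for code generation plus benchmarking.
    Returns (kernel_name, speedup) or None on a failed implementation."""
    simulated_speedups = {"tile": 1.4, "fuse": 2.1, "vectorize": 1.1}
    if random.random() < 0.3:  # imperfect intermediate implementation
        return None
    return f"kernel_with_{strategy}", simulated_speedups[strategy]

def search(steps=20, seed=0):
    random.seed(seed)
    world_model = {"tile": 1.0, "fuse": 1.0, "vectorize": 1.0}  # prior beliefs
    history, best = [], ("baseline", 1.0)
    for _ in range(steps):
        strategy = propose_strategy(world_model, history)
        history.append(strategy)
        result = instantiate(strategy)
        if result is None:
            # Keep the strategy alive: a failed instantiation only nudges
            # the belief down, it does not discard the plan.
            world_model[strategy] *= 0.95
            continue
        kernel, speedup = result
        # Co-evolve the model: blend prior belief with observed performance.
        world_model[strategy] = 0.5 * world_model[strategy] + 0.5 * speedup
        if speedup > best[1]:
            best = (kernel, speedup)
    return best

print(search())
```

The design point the sketch tries to show is the separation of concerns: `propose_strategy` never sees code, and `instantiate` never sees the belief state, so a bad implementation can lower a belief without erasing the plan that produced it.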
