[2602.22698] Tokenization, Fusion and Decoupling: Bridging the Granularity Mismatch Between Large Language Models and Knowledge Graphs

arXiv - AI · 4 min read

Summary

This paper presents KGT, a novel framework addressing the granularity mismatch between large language models (LLMs) and knowledge graphs (KGs) by introducing dedicated entity tokens for improved knowledge graph completion.

Why It Matters

As LLMs become increasingly integrated into various AI applications, bridging the gap between their token-based processing and the entity-centric nature of knowledge graphs is crucial for enhancing knowledge graph completion tasks. This research offers a promising solution that could improve the performance of AI systems reliant on knowledge representation.

Key Takeaways

  • KGT framework introduces dedicated entity tokens for better feature representation.
  • The method fuses structural and textual features using a relation-guided gating mechanism.
  • Decoupled prediction allows for independent semantic and structural reasoning.
  • Experimental results show KGT outperforms existing state-of-the-art methods.
  • This approach addresses the fundamental granularity mismatch between token-based LLMs and entity-centric KGs.
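The relation-guided gating described in the takeaways can be sketched as a per-dimension convex combination of structural and textual features, with the mixing weights produced from the relation embedding. This is a minimal NumPy illustration of the general idea only; the projection `W`, the dimensionality, and all values are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (hypothetical)

# Pre-trained features for one entity (random stand-ins)
e_struct = rng.normal(size=d)  # structural embedding, e.g. from a KG embedding model
e_text = rng.normal(size=d)    # textual embedding, e.g. from a language-model encoder
r = rng.normal(size=d)         # relation embedding that guides the gate

# Relation-guided gate: project the relation, squash elementwise to (0, 1)
W = rng.normal(size=(d, d)) * 0.1  # hypothetical learned projection
gate = 1.0 / (1.0 + np.exp(-(W @ r)))  # sigmoid

# Fuse: each dimension mixes structural and textual features by the gate
e_fused = gate * e_struct + (1.0 - gate) * e_text
```

Because the gate lies strictly in (0, 1), each fused dimension stays between its structural and textual inputs; relations that lean on graph structure can push the gate toward 1, text-heavy relations toward 0.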

Computer Science > Computation and Language
arXiv:2602.22698 (cs) [Submitted on 26 Feb 2026]
Title: Tokenization, Fusion and Decoupling: Bridging the Granularity Mismatch Between Large Language Models and Knowledge Graphs
Authors: Siyue Su, Jian Yang, Bo Li, Guanglin Niu

Abstract: Leveraging Large Language Models (LLMs) for Knowledge Graph Completion (KGC) is promising but hindered by a fundamental granularity mismatch. LLMs operate on fragmented token sequences, whereas entities are the fundamental units in knowledge graph (KG) scenarios. Existing approaches typically constrain predictions to limited candidate sets, or align entities with the LLM's vocabulary by pooling multiple tokens or decomposing entities into fixed-length token sequences; these approaches fail to capture both the semantic meaning of the text and the structural integrity of the graph. To address this, we propose KGT, a novel framework that uses dedicated entity tokens to enable efficient, full-space prediction. Specifically, we first introduce specialized tokenization to construct feature representations at the level of dedicated entity tokens. We then fuse pre-trained structural and textual features into these unified embeddings via a relation-guided gating mechanism, avoiding trai...

Related Articles

I Asked ChatGPT 500 Questions. Here Are the Ads I Saw Most Often | WIRED

Ads are rolling out across the US on ChatGPT’s free tier. I asked OpenAI's bot 500 questions to see what these ads were like and how they...

Wired - AI · 9 min

Abacus.Ai Claw LLM consumes an incredible amount of credit without any usage :(

Three days ago, I clicked the "Deploy OpenClaw In Seconds" button to get an overview of the new service, but I didn't build any automatio...

Reddit - Artificial Intelligence · 1 min

Google’s Gemini AI app debuts in Hong Kong

Tech giant’s chatbot service tops Apple’s app store chart in the city.

AI Tools & Products · 2 min

Google Launches Gemini Import Tools to Poach Users From Rival AI Apps

Anyone looking to switch their AI assistant will find it surprisingly easy, as it only takes a few steps to move from A to B. This is not...

AI Tools & Products · 4 min
