[2508.01067] Expressive Power of Graph Transformers via Logic

arXiv - AI · 3 min read

Summary

This paper characterizes the expressive power of graph transformers by matching them to logical frameworks, focusing in particular on vertex properties definable in first-order logic and on graded modal logic with global modalities.

Why It Matters

Understanding the expressive power of graph transformers is crucial for advancing AI applications that rely on graph structures, such as social network analysis and recommendation systems. This research provides foundational insights that can enhance the design of more effective models in these areas.

Key Takeaways

  • Graph transformers exhibit different expressive power depending on the numerical setting (reals vs. floats) and the logical framework used for comparison.
  • With real numbers, GPS-networks have the same expressive power as graded modal logic with the global modality, in restriction to vertex properties definable in first-order logic.
  • With floats, GPS-networks match graded modal logic with the counting global modality, and this characterization is absolute, not relative to a background logic.
  • Graph transformers are characterized analogously by propositional logic with the global modality (reals) and with the counting global modality (floats).
  • The study distinguishes the theoretical setting (reals) from the practical one (floats), which can guide future work on graph-based AI models.

Computer Science > Logic in Computer Science

arXiv:2508.01067 (cs) [Submitted on 1 Aug 2025 (v1), last revised 18 Feb 2026 (this version, v2)]

Title: Expressive Power of Graph Transformers via Logic
Authors: Veeti Ahvonen, Maurice Funk, Damian Heiman, Antti Kuusisto, Carsten Lutz

Abstract: Transformers are the basis of modern large language models, but relatively little is known about their precise expressive power on graphs. We study the expressive power of graph transformers (GTs) by Dwivedi and Bresson (2020) and GPS-networks by Rampásek et al. (2022), both under soft-attention and average hard-attention. Our study covers two scenarios: the theoretical setting with real numbers and the more practical case with floats. With reals, we show that in restriction to vertex properties definable in first-order logic (FO), GPS-networks have the same expressive power as graded modal logic (GML) with the global modality. With floats, GPS-networks turn out to be equally expressive as GML with the counting global modality. The latter result is absolute, not restricting to properties definable in a background logic. We also obtain similar characterizations for GTs in terms of propositional logic with the global modality (for reals) and the counting global modality (for floats).

Subjects: Logic in Computer Science (cs.LO); Artificial Intelligence …
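To make the logical side of these characterizations concrete, here is a minimal sketch (not from the paper, and independent of any transformer architecture) of how a graded modal logic formula with the global modality can be evaluated on a graph: a graded diamond ⟨≥k⟩p holds at a vertex with at least k neighbours satisfying p, and the global modality lifts a property that holds somewhere to every vertex. The graph, the predicate `p`, and the function names are illustrative assumptions.

```python
# Illustrative sketch: evaluating graded modal logic (GML) with the
# global modality on a small graph. Not code from the paper.

def diamond_geq(graph, sat, k):
    """Graded diamond: vertices with at least k neighbours in `sat`."""
    return {v for v, nbrs in graph.items() if sum(n in sat for n in nbrs) >= k}

def global_modality(graph, sat):
    """Global (existential) modality: if any vertex satisfies the
    formula, it holds at every vertex; otherwise at none."""
    return set(graph) if sat else set()

# Toy graph as adjacency lists; the atomic proposition p holds at 1 and 2.
graph = {0: [1, 2], 1: [2], 2: [], 3: [0]}
p = {1, 2}

# <>=2 p : "at least two neighbours satisfy p" -- holds only at vertex 0.
at_least_two_p = diamond_geq(graph, p, 2)
print(at_least_two_p)                          # {0}
# E(<>=2 p) : holds globally, since some vertex satisfies <>=2 p.
print(global_modality(graph, at_least_two_p))  # {0, 1, 2, 3}
```

The counting global modality from the float setting would additionally threshold on *how many* vertices satisfy the formula (e.g. `len(sat) >= k`), rather than merely asking whether one exists.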
