[2602.18227] Parameter-Efficient Domain Adaptation of Physics-Informed Self-Attention based GNNs for AC Power Flow Prediction

arXiv - Machine Learning 4 min read Article

Summary

This paper presents a novel approach to domain adaptation for AC power flow prediction using physics-informed self-attention based graph neural networks, achieving high accuracy with significantly fewer trainable parameters.

Why It Matters

As energy systems increasingly rely on accurate predictions for efficient operation, this research addresses the challenge of adapting models trained on medium-voltage grids to high-voltage networks. The proposed method enhances efficiency in model training while maintaining accuracy, which is crucial for real-world applications in power systems.

Key Takeaways

  • Introduces a parameter-efficient domain adaptation method for AC power flow prediction.
  • Utilizes physics-informed self-attention based GNNs to ensure Kirchhoff-consistent behavior.
  • Achieves near-full fine-tuning accuracy with an 85.46% reduction in trainable parameters.
  • Demonstrates a controllable efficiency-accuracy trade-off for voltage-regime shifts.
  • Maintains physics-based residuals comparable to those of full fine-tuning, with source-domain retention as part of the controllable trade-off.
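The physics-based loss mentioned in the takeaways enforces Kirchhoff-consistent predictions. The paper's exact formulation is not reproduced here, but as a minimal NumPy sketch (all function and variable names are illustrative), such a loss can penalize the standard AC power-flow mismatch at every bus:

```python
import numpy as np

def power_mismatch(V, theta, G, B, P_inj, Q_inj):
    """AC power-flow residuals at each bus.

    V, theta     : predicted voltage magnitudes and angles per bus
    G, B         : real and imaginary parts of the bus admittance matrix
    P_inj, Q_inj : specified active and reactive power injections
    """
    dth = theta[:, None] - theta[None, :]           # theta_i - theta_j
    P = V * ((G * np.cos(dth) + B * np.sin(dth)) @ V)
    Q = V * ((G * np.sin(dth) - B * np.cos(dth)) @ V)
    return P - P_inj, Q - Q_inj

def physics_loss(V, theta, G, B, P_inj, Q_inj):
    """Mean squared Kirchhoff-consistency penalty over all buses."""
    dP, dQ = power_mismatch(V, theta, G, B, P_inj, Q_inj)
    return np.mean(dP**2 + dQ**2)
```

Adding a term like this to the training objective pushes the network's voltage predictions toward solutions that satisfy the grid's power-balance equations, rather than merely fitting the training labels.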

Computer Science > Machine Learning
arXiv:2602.18227 (cs) [Submitted on 20 Feb 2026]

Title: Parameter-Efficient Domain Adaptation of Physics-Informed Self-Attention based GNNs for AC Power Flow Prediction

Authors: Redwanul Karim (1), Changhun Kim (1), Timon Conrad (2), Nora Gourmelon (1), Julian Oelhaf (1), David Riebesel (2), Tomás Arias-Vergara (1), Andreas Maier (1), Johann Jäger (2), Siming Bayer (1) ((1) Pattern Recognition Lab, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany; (2) Institute of Electrical Energy Systems, Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany)

Abstract: Accurate AC-PF prediction under domain shift is critical when models trained on medium-voltage (MV) grids are deployed on high-voltage (HV) networks. Existing physics-informed graph neural solvers typically rely on full fine-tuning for cross-regime transfer, incurring high retraining cost and offering limited control over the stability-plasticity trade-off between target-domain adaptation and source-domain retention. We study parameter-efficient domain adaptation for a physics-informed self-attention based GNN, encouraging Kirchhoff-consistent behavior via a physics-based loss while restricting adaptation to low-rank updates. Specifically, we apply LoRA t...
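The abstract's low-rank restriction can be illustrated with a minimal sketch, assuming adaptation is applied to a single linear projection inside the network; the rank, layer sizes, and names below are illustrative, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 256, 256, 8

# Frozen pretrained weight (e.g. one attention or message-passing projection).
W = rng.standard_normal((d_in, d_out))

# LoRA factors: only A and B are trained during domain adaptation.
A = np.zeros((rank, d_out))                 # zero init: the update starts at W
B = rng.standard_normal((d_in, rank)) * 0.01

def lora_forward(x):
    """Adapted projection x @ (W + B @ A), with W kept frozen."""
    return x @ W + (x @ B) @ A

full = d_in * d_out                         # trainable params, full fine-tuning
lora = rank * (d_in + d_out)                # trainable params, LoRA
reduction = 100 * (1 - lora / full)
print(f"trainable-parameter reduction: {reduction:.2f}%")  # 93.75% at rank 8
```

A different rank, and a different choice of which layers to adapt, yields a different reduction; the paper reports an 85.46% reduction in trainable parameters overall.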
