[2508.10587] Self-Supervised Temporal Super-Resolution of Energy Data using Generative Adversarial Transformer

arXiv - Machine Learning · 4 min read

Summary

This paper presents a novel self-supervised method for temporal super-resolution of energy data using Generative Adversarial Transformers, addressing challenges in time series upsampling.

Why It Matters

The research addresses a significant challenge in energy data analysis: energy system applications often require high-resolution time series that are not available. By using self-supervised learning, the method removes the need for high-resolution training data and improves accuracy in downstream model predictive control applications.

Key Takeaways

  • Introduces a self-supervised approach to temporal super-resolution.
  • Generative Adversarial Transformers (GATs) outperform conventional interpolation methods.
  • Achieves a 10% reduction in RMSE for upsampling tasks.
  • Improves model predictive control accuracy by 13%.
  • Addresses the paradox of training generative models without high-resolution data.
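To make the reported metrics concrete, the sketch below shows how an upsampling baseline is typically scored: reconstruct a fine-grained series from coarse samples via linear interpolation (one of the "conventional interpolation methods" the paper compares against) and compute RMSE against a reference. The signal and grid here are illustrative, not the paper's data.

```python
import numpy as np

# Hypothetical illustration (not the paper's dataset): upsample a coarse
# series by 4x with linear interpolation, a conventional baseline, and
# measure RMSE against a known fine-grained reference signal.
t_fine = np.linspace(0.0, 1.0, 97)        # high-resolution time grid
y_fine = np.sin(2 * np.pi * 3 * t_fine)   # stand-in "ground-truth" signal

t_coarse = t_fine[::4]                    # every 4th sample = low resolution
y_coarse = y_fine[::4]

# Conventional upsampling: linear interpolation back onto the fine grid.
y_up = np.interp(t_fine, t_coarse, y_coarse)

rmse = np.sqrt(np.mean((y_up - y_fine) ** 2))
print(f"RMSE of linear upsampling: {rmse:.4f}")
```

The paper's claimed 10% RMSE reduction is relative to baselines of this kind; a learned model would replace the `np.interp` step while the scoring stays the same.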

Computer Science > Machine Learning — arXiv:2508.10587 (cs)

Note: This paper has been withdrawn by Xuanhao Mu. [Submitted on 14 Aug 2025 (v1), last revised 12 Feb 2026 (this version, v4)]

Title: Self-Supervised Temporal Super-Resolution of Energy Data using Generative Adversarial Transformer

Authors: Xuanhao Mu, Gökhan Demirel, Yuzhe Zhang, Jianlei Liu, Thorsten Schlachter, Veit Hagenmeyer

Abstract: To bridge the temporal granularity gap in energy network design and operation based on Energy System Models, resampling of time series is required. While conventional upsampling methods are computationally efficient, they often result in significant information loss or increased noise. Advanced models such as time series generation models, super-resolution models, and imputation models show potential, but also face fundamental challenges. The goal of time series generative models is to learn the distribution of the original data to generate high-resolution series with similar statistical characteristics. This is not entirely consistent with the definition of upsampling. Time series super-resolution models or imputation models can degrade the accuracy of upsampling because the input low-resolution time series are sparse and may have insufficient context. Moreover, such models usually r...
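The paradox noted above (training a super-resolution model when no high-resolution data exists) is commonly sidestepped in self-supervised setups by a downsample-again trick: treat the available series as the target resolution and coarsen it further to manufacture input/target pairs. The sketch below shows that general idea; the function name and procedure are assumptions for illustration, not the paper's exact pipeline.

```python
import numpy as np

def make_selfsupervised_pairs(series: np.ndarray, factor: int):
    """Build a (low-res input, higher-res target) training pair from a
    single series by further downsampling, so no external high-resolution
    data is needed. Illustrative sketch, not the paper's exact procedure."""
    # Trim so the length is divisible by the downsampling factor.
    n = (len(series) // factor) * factor
    target = series[:n]          # available resolution acts as the target
    inputs = target[::factor]    # artificially coarsened model input
    return inputs, target

series = np.arange(16, dtype=float)   # toy stand-in for an energy series
x, y = make_selfsupervised_pairs(series, factor=4)
print(x.shape, y.shape)  # (4,) (16,)
```

A model trained on such pairs learns to map coarse inputs to the available resolution, and at inference time is applied one resolution level up.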

