[2602.12742] Synthetic Craquelure Generation for Unsupervised Painting Restoration

arXiv - Machine Learning · 3 min read

Summary

This article presents a novel framework for unsupervised painting restoration by generating synthetic craquelure patterns, enhancing the preservation of cultural heritage.

Why It Matters

As cultural heritage preservation becomes increasingly reliant on digital methods, this research addresses the challenge of restoring fine craquelure patterns without extensive pixel-level annotations. The proposed framework offers a significant advancement in non-invasive restoration techniques, which is crucial for art conservationists and historians.

Key Takeaways

  • Introduces a synthetic craquelure generator for painting restoration.
  • Utilizes a fully annotation-free framework, enhancing efficiency.
  • Combines classical detection with learning-based refinement for better results.
  • Demonstrates superior performance compared to existing photographic restoration models.
  • Preserves original brushwork while reconstructing missing content.
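
The paper's central idea, simulating branching and tapered fissures along Bézier trajectories, can be illustrated with a minimal sketch. The function names, the linear width taper, and the branching heuristic below are our own assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def quadratic_bezier(p0, p1, p2, n=200):
    """Sample n points along a quadratic Bezier curve; returns shape (n, 2)."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def draw_crack(mask, rng, start, width, depth=0, max_depth=2):
    """Rasterize one tapered Bezier fissure into `mask`, recursing for branches."""
    size = mask.shape[0]
    end = start + rng.normal(0.0, size * 0.3, 2)               # random heading/length
    ctrl = (start + end) / 2 + rng.normal(0.0, size * 0.1, 2)  # bend the midpoint
    pts = quadratic_bezier(start, ctrl, end)
    widths = width * (1.0 - np.linspace(0.0, 1.0, len(pts)))   # taper toward the tip
    for (x, y), w in zip(pts, widths):
        r = max(int(round(w)), 0)
        yy, xx = int(round(y)), int(round(x))
        if 0 <= yy < size and 0 <= xx < size:
            mask[max(yy - r, 0):yy + r + 1, max(xx - r, 0):xx + r + 1] = 1
    if depth < max_depth and rng.random() < 0.6:               # occasional side branch
        i = int(rng.integers(len(pts) // 4, 3 * len(pts) // 4))
        draw_crack(mask, rng, pts[i], widths[i], depth + 1, max_depth)
    return mask

rng = np.random.default_rng(0)
mask = np.zeros((256, 256), dtype=np.uint8)
draw_crack(mask, rng, start=np.array([128.0, 128.0]), width=2.5)
```

Masks like this can be overlaid on clean paintings to create unlimited (damaged, clean) training pairs, which is what makes the pipeline annotation-free.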

Computer Science > Computer Vision and Pattern Recognition

arXiv:2602.12742 (cs) · [Submitted on 13 Feb 2026]

Title: Synthetic Craquelure Generation for Unsupervised Painting Restoration
Authors: Jana Cuch-Guillén, Antonio Agudo, Raül Pérez-Gonzalo

Abstract: Cultural heritage preservation increasingly demands non-invasive digital methods for painting restoration, yet identifying and restoring fine craquelure patterns from complex brushstrokes remains challenging due to scarce pixel-level annotations. We propose a fully annotation-free framework driven by a domain-specific synthetic craquelure generator, which simulates realistic branching and tapered fissure geometry using Bézier trajectories. Our approach couples a classical morphological detector with a learning-based refinement module: a SegFormer backbone adapted via Low-Rank Adaptation (LoRA). Uniquely, we employ a detector-guided strategy, injecting the morphological map as an input spatial prior, while a masked hybrid loss and logit adjustment constrain the training to focus specifically on refining candidate crack regions. The refined masks subsequently guide an anisotropic diffusion inpainting stage to reconstruct missing content. Experimental results demonstrate that our pipeline significantly outperforms state-of-the-art photographic restoration ...
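The abstract's final stage guides anisotropic diffusion inpainting with the refined crack masks. As a hedged illustration of that idea, here is the classical Perona-Malik diffusion scheme with updates restricted to masked pixels; the parameter values (`kappa`, `dt`, `n_iter`) and the masked-update strategy are our own choices for a toy demo, not the paper's implementation:

```python
import numpy as np

def masked_perona_malik(img, crack, n_iter=200, kappa=2.0, dt=0.2):
    """Perona-Malik anisotropic diffusion, applied only at crack pixels so
    that intact brushwork outside the mask is never modified."""
    u = img.astype(np.float64).copy()

    def g(d):
        # Edge-stopping conductance: diffuse less across strong gradients.
        return np.exp(-(d / kappa) ** 2)

    for _ in range(n_iter):
        # Finite differences to the four neighbours.
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        update = dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
        u[crack] += update[crack]  # only candidate crack pixels change
    return u

# Toy demo: a smooth horizontal gradient "painting" with a dark fissure.
img = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
crack = np.zeros((64, 64), dtype=bool)
crack[30:34, 10:54] = True
damaged = img.copy()
damaged[crack] = 0.0
restored = masked_perona_malik(damaged, crack)
```

Restricting the update to the mask mirrors the abstract's design goal: surrounding pixel values diffuse into the fissure while every pixel of original brushwork is left byte-for-byte untouched.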

Related Articles

[2511.21428] From Observation to Action: Latent Action-based Primitive Segmentation for VLA Pre-training in Industrial Settings
Machine Learning

Abstract page for arXiv paper 2511.21428: From Observation to Action: Latent Action-based Primitive Segmentation for VLA Pre-training in ...

arXiv - AI · 4 min ·
[2511.16719] SAM 3: Segment Anything with Concepts
Machine Learning

Abstract page for arXiv paper 2511.16719: SAM 3: Segment Anything with Concepts

arXiv - AI · 4 min ·
[2603.28594] Detection of Adversarial Attacks in Robotic Perception
Machine Learning

Abstract page for arXiv paper 2603.28594: Detection of Adversarial Attacks in Robotic Perception

arXiv - AI · 3 min ·
[2603.28555] Domain-Invariant Prompt Learning for Vision-Language Models
Llms

Abstract page for arXiv paper 2603.28555: Domain-Invariant Prompt Learning for Vision-Language Models

arXiv - AI · 3 min ·