[2602.19156] Artefact-Aware Fungal Detection in Dermatophytosis: A Real-Time Transformer-Based Approach for KOH Microscopy
Summary
This study presents a transformer-based framework for detecting fungal elements in dermatophytosis using KOH microscopy, achieving high accuracy and sensitivity.
Why It Matters
Accurate detection of fungal infections is crucial for effective treatment. This research leverages advanced AI techniques to enhance diagnostic capabilities in dermatology, potentially improving patient outcomes and streamlining clinical workflows.
Key Takeaways
- Introduces a transformer-based model for detecting fungi in KOH microscopy images.
- Achieved 100% sensitivity and 98.8% accuracy for image-level diagnosis of dermatophytosis.
- Utilizes a dataset of 2,540 annotated images to train the model effectively.
- Demonstrates robust performance even in artefact-rich environments.
- Highlights the potential of AI in enhancing clinical decision-making in dermatology.
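The object-level detections have to be aggregated into a single per-image diagnosis. The paper does not spell out its aggregation rule; below is a minimal sketch assuming a simple any-confident-fungal-detection rule, with hypothetical class names and threshold:

```python
# Hypothetical sketch: aggregate object-level detections into an
# image-level diagnosis. The rule (any fungal-class detection whose
# confidence exceeds a threshold marks the image positive) and the
# class names are assumptions, not the paper's published procedure.

def image_level_diagnosis(detections,
                          fungus_classes=frozenset({"hypha", "spore"}),
                          conf_threshold=0.5):
    """detections: list of (class_name, confidence) pairs for one image.
    Returns True if the image is diagnosed positive for fungal elements."""
    return any(cls in fungus_classes and conf >= conf_threshold
               for cls, conf in detections)

# Artefact detections alone do not trigger a positive diagnosis:
dets = [("artefact", 0.91), ("hypha", 0.62)]
print(image_level_diagnosis(dets))  # True: one confident hypha detection
```

Separating artefacts into their own classes, as the multi-class annotation strategy does, lets a rule like this ignore confident artefact detections instead of mistaking them for fungi.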
Computer Science > Computer Vision and Pattern Recognition
arXiv:2602.19156 (cs) [Submitted on 22 Feb 2026]
Authors: Rana Gursoy, Abdurrahim Yilmaz, Baris Kizilyaprak, Esmahan Caglar, Burak Temelkuran, Huseyin Uvet, Ayse Esra Koku Aksu, Gulsum Gencoglan
Abstract: Dermatophytosis is commonly assessed using potassium hydroxide (KOH) microscopy, yet accurate recognition of fungal hyphae is hindered by artefacts, heterogeneous keratin clearance, and notable inter-observer variability. This study presents a transformer-based detection framework using the RT-DETR model architecture to achieve precise, query-driven localization of fungal structures in high-resolution KOH images. A dataset of 2,540 routinely acquired microscopy images was manually annotated using a multi-class strategy to explicitly distinguish fungal elements from confounding artefacts. The model was trained with morphology-preserving augmentations to maintain the structural integrity of thin hyphae. Evaluation on an independent test set demonstrated robust object-level performance, with a recall of 0.9737, precision of 0.8043, and an AP@0.50 of 93.56%. When aggregated for image-level diagnosis, the...
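The object-level numbers quoted in the abstract (recall, precision, AP@0.50) are standard detection metrics built on IoU matching at a 0.50 overlap threshold. A minimal sketch of how precision and recall are computed for one image under greedy confidence-ordered matching (helper names are illustrative, not from the paper):

```python
# Illustrative sketch of object-level precision/recall at IoU >= 0.50,
# the matching criterion underlying metrics such as AP@0.50.
# Boxes are (x1, y1, x2, y2); function names are ours, not the paper's.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def precision_recall(preds, gts, iou_thr=0.50):
    """Greedy matching: each prediction (highest confidence first) claims
    the first unmatched ground-truth box with IoU >= iou_thr.
    preds: list of ((x1, y1, x2, y2), confidence); gts: list of boxes."""
    matched, tp = set(), 0
    for box, _conf in sorted(preds, key=lambda p: -p[1]):
        for j, gt in enumerate(gts):
            if j not in matched and iou(box, gt) >= iou_thr:
                matched.add(j)
                tp += 1
                break
    fp = len(preds) - tp   # unmatched predictions
    fn = len(gts) - tp     # missed ground-truth objects
    precision = tp / (tp + fp) if preds else 0.0
    recall = tp / (tp + fn) if gts else 0.0
    return precision, recall
```

AP@0.50 extends this by sweeping the confidence threshold and averaging precision over recall levels; the high recall (0.9737) at moderate precision (0.8043) reported in the abstract is consistent with a screening setting that prefers missing as few fungal elements as possible.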