IlluC-Net: An attention-guided U-Net for illumination correction in dermatological macro-photographs

Document Type

Article

Publication Title

Franklin Open

Abstract

Uneven illumination in dermatological macro-photographs presents a significant challenge for accurate analysis of skin lesions, affecting both clinical diagnosis and the performance of automated skin cancer diagnosis tools. To address this issue, we propose IlluC-Net, a novel deep learning framework designed for illumination correction in dermatological macro-photographs. IlluC-Net builds on a U-Net architecture integrated with attention mechanisms to effectively capture both local and global features from the macro-photographs. This approach equalizes the background illumination while preserving the structure of the lesion. The IlluC-Net model is trained using the mean squared error (MSE) loss function, with dynamic learning rate scheduling and early stopping employed to prevent overfitting. The performance of IlluC-Net is evaluated using a curated set of dermatological macro-photographs from the University of Waterloo skin cancer dataset and the MED-NODE dataset under five-fold cross-validation. Quantitative results show that IlluC-Net outperforms existing state-of-the-art (SOTA) illumination correction methods, including U-Net, TransUNet, GAN, EnlightenGAN, CSWin-P, and IECET, achieving the highest PSNR of 35.99 ± 3.15 dB and SSIM of 0.98 ± 0.01, while obtaining the lowest BRISQUE, PIQE, and NIQE scores of 27.09 ± 10.43, 44.39 ± 9.77, and 0.27 ± 0.02, respectively, across all test images. Visual evaluations further confirm that the illumination-corrected images produced by IlluC-Net closely resemble the ground-truth images, exhibiting minimal visual artifacts and enhanced contrast. Owing to its superior performance and computational efficiency, IlluC-Net is well-suited for integration into computer-aided diagnosis (CAD) systems and can be effectively deployed on edge devices for real-time diagnosis.
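The abstract reports PSNR as its primary full-reference quality metric. As a minimal illustrative sketch (not the authors' evaluation code), the standard PSNR computation between a ground-truth and an illumination-corrected image can be written as:

```python
import numpy as np

def psnr(reference: np.ndarray, corrected: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between a reference and a corrected image.

    PSNR = 10 * log10(max_val^2 / MSE), where MSE is the mean squared error
    over all pixels. Higher values indicate closer agreement with the reference.
    """
    mse = np.mean((reference.astype(np.float64) - corrected.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example with a single perturbed pixel (hypothetical data, not from the paper).
gt = np.full((8, 8), 128, dtype=np.uint8)
out = gt.copy()
out[0, 0] = 129  # one pixel off by 1 gray level
print(round(psnr(gt, out), 2))
```

A higher PSNR (such as the reported 35.99 dB average) corresponds to a lower mean squared error against the ground truth; in practice, library implementations such as `skimage.metrics.peak_signal_noise_ratio` compute the same quantity.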

DOI

10.1016/j.fraope.2025.100439

Publication Date

12-1-2025
