Systematic Hyperparameter Optimisation of a U-Net Architecture for Robust Skin Lesion Segmentation
DOI: https://doi.org/10.63313/JCSFT.9057

Keywords: skin lesion segmentation, Bayesian optimisation, U-Net, dermoscopic image analysis

Abstract
Accurate skin lesion segmentation is essential for reliable computer-aided dermoscopic analysis, as segmentation errors propagate to downstream feature extraction and diagnostic decisions. Many existing approaches rely on manually selected hyperparameters, resulting in time-consuming, suboptimal, and poorly reproducible outcomes. This study investigates whether structured, task-aware Bayesian optimisation can systematically enhance baseline segmentation performance. A controlled U-Net architectural template is coupled with a hierarchically defined Bayesian search space encompassing training dynamics, regularisation, loss formulation, and data augmentation. The optimisation process is designed to be sample-efficient and aligned with segmentation-specific evaluation metrics. The framework is evaluated on the ISIC 2016, 2017, and 2018 benchmark datasets, including cross-dataset generalisation analysis. The proposed approach achieves Jaccard indices of 91.1%, 85.0%, and 89.4% on ISIC 2016, 2017, and 2018, respectively, consistently outperforming standard U-Net baselines. Cross-dataset evaluation demonstrates stable performance, with Jaccard scores remaining within 1–2% of in-dataset results. These findings indicate that structured, task-aware Bayesian optimisation substantially improves both performance and robustness, highlighting that principled optimisation of training dynamics can unlock the full potential of standard architectures without requiring architectural modifications.
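The abstract reports segmentation quality as the Jaccard index (intersection over union). For illustration only, a minimal computation of this metric over binary masks could look as follows; the mask representation (nested lists of 0/1 values) and the convention of returning 1.0 for two empty masks are assumptions for this sketch, not details taken from the paper:

```python
def jaccard_index(pred, target):
    """Jaccard index (IoU) between two binary segmentation masks.

    Both masks are iterables of rows of 0/1 values with identical shape.
    Returns |pred AND target| / |pred OR target|; by convention here,
    two empty masks score 1.0 (perfect agreement on "no lesion").
    """
    intersection = 0
    union = 0
    for row_p, row_t in zip(pred, target):
        for p, t in zip(row_p, row_t):
            intersection += 1 if (p and t) else 0
            union += 1 if (p or t) else 0
    return intersection / union if union else 1.0


# Example: masks agree on one of two foreground pixels.
pred = [[1, 1], [0, 0]]
target = [[1, 0], [0, 0]]
print(jaccard_index(pred, target))  # 0.5
```

In an optimisation loop such as the one described above, a score like this (averaged over a validation set) would serve as the objective that the Bayesian search maximises.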
License
Copyright (c) 2026 by author(s) and Erytis Publishing Limited.

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
