Global Sensitivity Analysis for Robust XAI: Quantifying Clinical Risk and Prediction Instability in Dermoscopic Image Classification / Vannucci, Giulia; Patrik Williame Coppolecchia, Renato; Siciliano, Roberta. - In: RISK ANALYSIS. - ISSN 1539-6924. - 46:4(2026). [10.1111/risa.70237]

Global Sensitivity Analysis for Robust XAI: Quantifying Clinical Risk and Prediction Instability in Dermoscopic Image Classification

Giulia Vannucci (first author; conceptualization); Renato Patrik Williame Coppolecchia; Roberta Siciliano
2026

Abstract

The high nominal accuracy of deep learning models in detecting malignant skin lesions is frequently undermined by their susceptibility to operational uncertainty. Image acquisition conditions such as lighting, device settings, and skin characteristics introduce variations in optical parameters that compromise a model's reliability in real-world clinical settings. This instability produces an unquantified diagnostic risk that hinders the safe clinical deployment of otherwise powerful systems. To address this gap, this paper proposes and applies global sensitivity analysis to rigorously quantify the robustness of a convolutional neural network architecture with respect to five critical optical image parameters. The analysis characterizes the model's prediction instability, moving beyond simple accuracy toward a risk-quantified assessment. Such an assessment is essential for establishing the level of confidence required for the accreditation and safe clinical deployment of AI-based diagnostic systems.
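The core technique named in the abstract, variance-based global sensitivity analysis, can be sketched in a few lines. The snippet below estimates first-order Sobol indices for a scalar model output driven by five input parameters; the scoring function, the parameter names, and the uniform [0, 1] ranges are illustrative stand-ins, not the paper's actual CNN or perturbation model.

```python
import numpy as np

# Illustrative stand-in for a CNN's malignancy score as a function of five
# optical acquisition parameters (brightness, contrast, saturation, blur,
# noise), each rescaled to [0, 1]. The functional form is invented for the demo.
def lesion_score(x):
    b, c, s, g, n = x.T
    return 0.8 * b + 0.4 * c**2 + 0.1 * s + 0.05 * g + 0.02 * n

def sobol_first_order(model, d, n, rng):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices."""
    A = rng.random((n, d))          # two independent input samples
    B = rng.random((n, d))
    fA, fB = model(A), model(B)
    total_var = np.var(np.concatenate([fA, fB]), ddof=1)
    S = np.empty(d)
    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]          # vary only the i-th input
        # Saltelli (2010) first-order estimator: V_i = E[f(B) * (f(AB_i) - f(A))]
        S[i] = np.mean(fB * (model(AB) - fA)) / total_var
    return S

rng = np.random.default_rng(0)
S = sobol_first_order(lesion_score, d=5, n=50_000, rng=rng)
print(np.round(S, 3))  # first index dominates; sum is ~1 for this additive model
```

In the paper's setting, `lesion_score` would be replaced by a forward pass of the trained network on an image perturbed according to the sampled parameter vector; libraries such as SALib package the same sampling schemes and estimators.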
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11588/1042354
Citations
  • PubMed Central: not available
  • Scopus: 0
  • Web of Science: not available