Evaluating eXplainable artificial intelligence tools for hard disk drive predictive maintenance / Ferraro, A.; Galli, A.; Moscato, V.; Sperli', G.. - In: ARTIFICIAL INTELLIGENCE REVIEW. - ISSN 0269-2821. - 56:7(2023), pp. 7279-7314. [10.1007/s10462-022-10354-7]
Evaluating eXplainable artificial intelligence tools for hard disk drive predictive maintenance
Ferraro A.;Galli A.;Moscato V.;Sperli' G.
2023
Abstract
In recent years, one of the main challenges in Industry 4.0 has been the optimization of maintenance operations, which has been widely addressed by predictive maintenance frameworks aiming to jointly reduce maintenance costs and downtime intervals. Nevertheless, the most recent and effective frameworks rely mainly on deep learning models, whose internal representations (black boxes) are too complex for humans to understand, making it difficult to explain their predictions. This issue can be tackled with eXplainable artificial intelligence (XAI) methodologies, whose aim is to explain the decisions of data-driven AI models, characterizing the strengths and weaknesses of the decision-making process and making results more understandable to humans. In this paper, we focus on explaining the predictions made by a recurrent neural network based model, which requires a three-dimensional dataset because it exploits spatial and temporal features to estimate the remaining useful life (RUL) of hard disk drives (HDDs). In particular, we analyze in depth how explanations of RUL predictions provided by different XAI tools, compared using several metrics and presented through the generated dashboards, can effectively support predictive maintenance tasks by means of both global and local explanations. To this end, we have built an explanation framework that investigates the local interpretable model-agnostic explanations (LIME) and SHapley Additive exPlanations (SHAP) tools with respect to the Backblaze dataset and a long short-term memory (LSTM) prediction model. The results show that SHAP outperforms LIME on almost all the considered metrics, making it a suitable and effective solution for HDD predictive maintenance applications.
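To give a flavor of the attribution idea the abstract refers to, the sketch below computes exact Shapley values by enumerating feature coalitions, the quantity that SHAP approximates efficiently for large models. This is a minimal illustration on a toy linear model with made-up feature values, not the paper's LSTM or the Backblaze data; the feature names and weights are purely hypothetical.

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values of f at x, enumerating all feature coalitions.
    Features absent from a coalition are replaced by their baseline value."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for S in combinations(others, size):
                # Shapley kernel weight for a coalition of this size
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                with_i = [x[j] if j in S or j == i else baseline[j] for j in range(n)]
                without_i = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += weight * (f(with_i) - f(without_i))
    return phi

# Toy "RUL" model: linear in three hypothetical SMART-derived features.
w = [2.0, -1.0, 0.5]
f = lambda v: sum(wi * vi for wi, vi in zip(w, v))

x = [1.0, 3.0, 4.0]         # instance to explain
baseline = [0.0, 0.0, 0.0]  # reference input

phi = shapley_values(f, x, baseline)
# For a linear model, phi_i = w_i * (x_i - baseline_i) -> [2.0, -3.0, 2.0]
# Additivity (the "additive explanation" property): f(x) = f(baseline) + sum(phi)
```

For a linear model the attributions reduce to `w_i * (x_i - baseline_i)`, which makes the additivity property easy to verify by hand; SHAP's practical explainers recover the same values without the exponential enumeration used here.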