TY - JOUR
T1 - An uncertainty-informed framework for trustworthy fault diagnosis in safety-critical applications
AU - Zhou, Taotao
AU - Zhang, Laibin
AU - Han, Te
AU - Droguett, Enrique Lopez
AU - Mosleh, Ali
AU - Chan, Felix T.S.
N1 - Publisher Copyright:
© 2022
PY - 2023/1
Y1 - 2023/1
N2 - Deep learning-based models, while highly effective for prognostics and health management, fail to reliably detect data unseen during training, referred to as out-of-distribution (OOD) data. This restricts their use on safety-critical assets, where such unknowns may pose significant risks and cause serious consequences. To address this issue, we propose leveraging predictive uncertainty as an indicator of trustworthiness that helps decision-makers interpret fault diagnostic results. A novel probabilistic Bayesian convolutional neural network (PBCNN) is presented that quantifies predictive uncertainty, in contrast to deterministic deep learning, and forms the basis of a trustworthy fault diagnosis framework. A predictive risk-aware strategy is then proposed that guides the fault diagnosis model to make predictions only within tolerable risk limits and otherwise to request the assistance of human experts. The proposed method not only achieves accurate results but also improves the trustworthiness of deep learning-based fault diagnosis in safety-critical applications. The framework is demonstrated on bearing fault diagnosis using three types of OOD data. The results show that it achieves high accuracy when handling a mix of irrelevant data and maintains good performance when dealing with mixes of sensor faults and unknown faults.
AB - Deep learning-based models, while highly effective for prognostics and health management, fail to reliably detect data unseen during training, referred to as out-of-distribution (OOD) data. This restricts their use on safety-critical assets, where such unknowns may pose significant risks and cause serious consequences. To address this issue, we propose leveraging predictive uncertainty as an indicator of trustworthiness that helps decision-makers interpret fault diagnostic results. A novel probabilistic Bayesian convolutional neural network (PBCNN) is presented that quantifies predictive uncertainty, in contrast to deterministic deep learning, and forms the basis of a trustworthy fault diagnosis framework. A predictive risk-aware strategy is then proposed that guides the fault diagnosis model to make predictions only within tolerable risk limits and otherwise to request the assistance of human experts. The proposed method not only achieves accurate results but also improves the trustworthiness of deep learning-based fault diagnosis in safety-critical applications. The framework is demonstrated on bearing fault diagnosis using three types of OOD data. The results show that it achieves high accuracy when handling a mix of irrelevant data and maintains good performance when dealing with mixes of sensor faults and unknown faults.
KW - Bayesian deep learning
KW - Out-of-distribution
KW - Probabilistic
KW - Trustworthy fault diagnosis
KW - Uncertainty-informed
UR - http://www.scopus.com/inward/record.url?scp=85140136291&partnerID=8YFLogxK
U2 - 10.1016/j.ress.2022.108865
DO - 10.1016/j.ress.2022.108865
M3 - Article
AN - SCOPUS:85140136291
SN - 0951-8320
VL - 229
JO - Reliability Engineering & System Safety
JF - Reliability Engineering & System Safety
M1 - 108865
ER -