TY - JOUR
T1 - Towards trustworthy rotating machinery fault diagnosis via attention uncertainty in transformer
AU - Xiao, Yiming
AU - Shao, Haidong
AU - Feng, Minjie
AU - Han, Te
AU - Wan, Jiafu
AU - Liu, Bin
N1 - Publisher Copyright:
© 2023 The Society of Manufacturing Engineers
PY - 2023/10
Y1 - 2023/10
N2 - To enable researchers to fully trust the decisions made by deep diagnostic models, interpretable rotating machinery fault diagnosis (RMFD) research has emerged. Existing interpretable RMFD research focuses on developing interpretable modules embedded in deep models to assign physical meaning to results, or on inferring from the results the logic by which the model makes decisions. However, there is limited work on how to quantify uncertainty in the results and explain its sources and composition. Uncertainty quantification and decomposition not only provide confidence in the results but also identify the sources of unknown factors in the data, thereby guiding improvements to the interpretability and trustworthiness of models. Therefore, this paper proposes to use Bayesian variational learning to introduce uncertainty into the attention weights of the Transformer, constructing a probabilistic Bayesian Transformer for trustworthy RMFD. A probabilistic attention mechanism is designed and the corresponding optimization objective is defined, which can infer the prior and variational posterior distributions of the attention weights, thus empowering the model to perceive uncertainty. An uncertainty quantification and decomposition scheme is developed to characterize the confidence of the results and separate epistemic from aleatoric uncertainty. The effectiveness of the proposed method is thoroughly verified in three out-of-distribution generalization scenarios.
AB - To enable researchers to fully trust the decisions made by deep diagnostic models, interpretable rotating machinery fault diagnosis (RMFD) research has emerged. Existing interpretable RMFD research focuses on developing interpretable modules embedded in deep models to assign physical meaning to results, or on inferring from the results the logic by which the model makes decisions. However, there is limited work on how to quantify uncertainty in the results and explain its sources and composition. Uncertainty quantification and decomposition not only provide confidence in the results but also identify the sources of unknown factors in the data, thereby guiding improvements to the interpretability and trustworthiness of models. Therefore, this paper proposes to use Bayesian variational learning to introduce uncertainty into the attention weights of the Transformer, constructing a probabilistic Bayesian Transformer for trustworthy RMFD. A probabilistic attention mechanism is designed and the corresponding optimization objective is defined, which can infer the prior and variational posterior distributions of the attention weights, thus empowering the model to perceive uncertainty. An uncertainty quantification and decomposition scheme is developed to characterize the confidence of the results and separate epistemic from aleatoric uncertainty. The effectiveness of the proposed method is thoroughly verified in three out-of-distribution generalization scenarios.
KW - Bayesian deep learning
KW - Probabilistic attention
KW - Transformer
KW - Trustworthy rotating machinery fault diagnosis
KW - Uncertainty quantification and decomposition
UR - http://www.scopus.com/inward/record.url?scp=85166187801&partnerID=8YFLogxK
U2 - 10.1016/j.jmsy.2023.07.012
DO - 10.1016/j.jmsy.2023.07.012
M3 - Article
AN - SCOPUS:85166187801
SN - 0278-6125
VL - 70
SP - 186
EP - 201
JO - Journal of Manufacturing Systems
JF - Journal of Manufacturing Systems
ER -