A Bayesian adversarial probsparse Transformer model for long-term remaining useful life prediction

Yongbo Cheng, Junheng Qv, Ke Feng, Te Han*

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    4 Citations (Scopus)

    Abstract

    Long-term remaining useful life (RUL) prediction is essential for the maintenance of safety-critical engineering assets. Deep learning (DL) models, especially Transformer-based models, have achieved outstanding performance in long-term RUL prediction. However, existing Transformer models neglect the impact of discrepancy loss during model training. The accumulation of discrepancy loss during inference hampers the generalization of the prediction model, resulting in overfitting. To address this problem, this paper proposes a Bayesian Adversarial Probsparse Transformer (BAPT) model for long-term RUL prediction. Firstly, adversarial learning is leveraged to mitigate the impact of the accumulated discrepancy loss caused by varying working conditions in long-term prediction, thereby diminishing error accumulation. Secondly, Probsparse multi-head attention is adopted to enhance the efficiency of feature extraction: it focuses on the most significant degradation features in long time series, reducing the computational complexity. Lastly, a Bayesian neural network is introduced to quantify the uncertainty in RUL prediction. The effectiveness of the proposed model is verified on two commercial aircraft turbofan engine datasets. The results indicate that the BAPT model demonstrates better long-term RUL prediction performance than existing state-of-the-art models.
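    To make the Probsparse attention idea mentioned in the abstract concrete, the sketch below is a rough, illustrative PyTorch implementation (not the paper's code): every query is scored with a max-minus-mean sparsity measure, only the top-u "active" queries attend over all keys, and the remaining "lazy" queries fall back to the mean of the values. The class name ProbSparseAttention and the factor parameter are assumptions made for this illustration.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class ProbSparseAttention(nn.Module):
    """Illustrative Probsparse-style attention (not the paper's implementation).

    Only the top-u queries, ranked by a max-minus-mean sparsity measure,
    attend over all keys; the remaining "lazy" queries are replaced by the
    mean of the values. For clarity the full score matrix is computed here,
    so this shows the selection idea but not the reduced cost of the
    key-sampling version.
    """

    def __init__(self, d_model: int, factor: int = 5):
        super().__init__()
        self.factor = factor          # c in u = c * ln(L_q)
        self.scale = d_model ** -0.5

    def forward(self, q, k, v):
        B, L_q, D = q.shape
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale   # (B, L_q, L_k)

        # Sparsity measure M(q_i) = max_j s_ij - mean_j s_ij
        sparsity = scores.max(dim=-1).values - scores.mean(dim=-1)   # (B, L_q)
        u = min(L_q, max(1, int(self.factor * math.ceil(math.log(L_q)))))
        top_idx = sparsity.topk(u, dim=-1).indices                   # (B, u)

        # Lazy queries: use the mean of the values as their output.
        out = v.mean(dim=1, keepdim=True).expand(B, L_q, D).clone()

        # Active queries: full softmax attention over all keys.
        batch_idx = torch.arange(B, device=q.device).unsqueeze(-1)   # (B, 1)
        attn = F.softmax(scores[batch_idx, top_idx], dim=-1)         # (B, u, L_k)
        out[batch_idx, top_idx] = torch.matmul(attn, v)              # (B, u, D)
        return out


# Toy usage: batch of 2 degradation sequences, length 64, 32 features.
x = torch.randn(2, 64, 32)
y = ProbSparseAttention(d_model=32)(x, x, x)   # (2, 64, 32)
```

    Likewise, the Bayesian uncertainty quantification mentioned in the abstract can be pictured (again as an assumption, not the paper's exact design) with a variational output layer whose weights are re-sampled on every forward pass; repeated Monte Carlo passes then give a mean RUL and a predictive standard deviation. The snippet reuses the imports above.

```python
class BayesianLinear(nn.Module):
    """Variational linear layer with a Gaussian posterior over the weights
    (reparameterization trick). Each forward pass samples fresh weights."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_logstd = nn.Parameter(torch.full((out_features, in_features), -3.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        w = self.w_mu + torch.exp(self.w_logstd) * torch.randn_like(self.w_mu)
        return F.linear(x, w, self.bias)


# Toy usage: degradation features -> distribution over RUL via MC sampling.
features = torch.randn(8, 32)                                # 8 engines, 32-dim features
head = BayesianLinear(32, 1)
samples = torch.stack([head(features) for _ in range(100)])  # (100, 8, 1)
rul_mean, rul_std = samples.mean(dim=0), samples.std(dim=0)  # point estimate + uncertainty
```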

    Original language: English
    Article number: 110188
    Journal: Reliability Engineering and System Safety
    Volume: 248
    DOI
    Publication status: Published - Aug 2024
