InFocus: Amplifying Critical Feature Influence on Non-Intrusive Load Monitoring Through Self-Attention Mechanisms

Jialing He, Zijian Zhang*, Liran Ma, Zhouyu Zhang, Meng Li, Bakh Khoussainov, Jiamou Liu, Liehuang Zhu*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)

Abstract

Non-intrusive load monitoring (NILM) enables the extraction of individual appliances' power consumption data from an aggregated power signal in a cost-effective way. The extracted appliance-level power data can greatly facilitate tasks such as malfunction diagnosis and load forecasting, which are of significant importance for efficient energy use. Various neural networks, including convolutional neural networks (CNNs), recurrent neural networks (RNNs), and transformers (self-attention-based neural networks), have been employed in the design of NILM solutions since 2015. In particular, CNNs have been shown to extract certain critical features, such as power level and typical usage duration, and thus achieve superior performance. However, they cannot properly capture global features, especially the dependency correlations between different positions in a sequence. Accordingly, we devise a novel model incorporating an added attention layer to overcome this limitation. The added self-attention mechanism automatically assigns attention scores/weights to the features output by the convolutional layers, which amplifies the positive influence of critical knowledge while providing a global view of the sequence. Moreover, the model can explicitly extract an appliance's multi-state information, which makes it more interpretable. We further improve our model by substituting the added self-attention mechanism with a lightweight one, which decreases the number of model parameters while maintaining disaggregation accuracy. Experimental results on two real-world datasets, REDD and UK-DALE, demonstrate that our models outperform the state of the art, achieving a 6.5%-52% improvement on average across three standard evaluation metrics. Moreover, we present a general NILM system architecture incorporating neural networks, aiming to establish a framework that supports future research.
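The sketch below illustrates the general idea described in the abstract, not the authors' released implementation: convolutional layers extract local features from the aggregate power sequence, and a self-attention layer then re-weights those features across all positions so that critical features gain influence while long-range dependencies are captured. Layer sizes, kernel widths, and the class name are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CNNSelfAttentionNILM(nn.Module):
    """Hypothetical CNN + self-attention disaggregator (assumed architecture)."""

    def __init__(self, channels=64, heads=4):
        super().__init__()
        # Convolutional feature extractor over the aggregate power signal.
        self.conv = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=9, padding=4), nn.ReLU(),
        )
        # Self-attention assigns scores/weights across all sequence positions,
        # amplifying critical features and providing a global view.
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=heads,
                                          batch_first=True)
        # Regression head mapping attended features to the target appliance power.
        self.head = nn.Linear(channels, 1)

    def forward(self, x):                      # x: (batch, seq_len) aggregate power
        h = self.conv(x.unsqueeze(1))          # (batch, channels, seq_len)
        h = h.transpose(1, 2)                  # (batch, seq_len, channels)
        attended, _weights = self.attn(h, h, h)  # attention over positions
        return self.head(attended).squeeze(-1)  # (batch, seq_len) appliance power

# Example usage on random data:
model = CNNSelfAttentionNILM()
y = model(torch.randn(8, 599))
print(y.shape)  # torch.Size([8, 599])
```

The lightweight variant mentioned in the abstract would replace the standard multi-head attention with a cheaper attention module to cut parameters; the specific mechanism used by the authors is not detailed here.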

Original language: English
Pages (from-to): 3828-3840
Number of pages: 13
Journal: IEEE Transactions on Smart Grid
Volume: 14
Issue number: 5
DOIs
Publication status: Published - 1 Sept 2023

Keywords

  • Non-intrusive load monitoring
  • dual-convolutional neural network
  • energy disaggregation
  • self-attention
