TY - JOUR
T1 - A Quantitative Analysis of Non-Profiled Side-Channel Attacks Based on Attention Mechanism
AU - Pu, Kangran
AU - Dang, Hua
AU - Kong, Fancong
AU - Zhang, Jingqi
AU - Wang, Weijiang
N1 - Publisher Copyright:
© 2023 by the authors.
PY - 2023/8
Y1 - 2023/8
N2 - In recent years, deep learning has emerged as a mainstream approach to non-profiled side-channel attacks. However, most existing deep learning-based non-profiled side-channel attack methods rely on traditional metrics such as loss and accuracy, which often yield ambiguous results in practical scenarios. Furthermore, most previous studies have not fully considered the properties of power traces as long time-series data. In this paper, a novel non-profiled side-channel attack architecture is proposed, which incorporates the attention mechanism and derives a corresponding attention metric. By attaching the attention mechanism after the network layers, the architecture provides a quantitative prediction of the correct key. Moreover, it can effectively extract and analyze features from long power traces. The success rate on different datasets is at least 86%, which demonstrates the superior reliability of this architecture compared to other works when facing various countermeasures and noise. Notably, even in scenarios where traditional loss and accuracy metrics fail to provide reliable results, the proposed attention metric remains capable of accurately distinguishing the correct key.
AB - In recent years, deep learning has emerged as a mainstream approach to non-profiled side-channel attacks. However, most existing deep learning-based non-profiled side-channel attack methods rely on traditional metrics such as loss and accuracy, which often yield ambiguous results in practical scenarios. Furthermore, most previous studies have not fully considered the properties of power traces as long time-series data. In this paper, a novel non-profiled side-channel attack architecture is proposed, which incorporates the attention mechanism and derives a corresponding attention metric. By attaching the attention mechanism after the network layers, the architecture provides a quantitative prediction of the correct key. Moreover, it can effectively extract and analyze features from long power traces. The success rate on different datasets is at least 86%, which demonstrates the superior reliability of this architecture compared to other works when facing various countermeasures and noise. Notably, even in scenarios where traditional loss and accuracy metrics fail to provide reliable results, the proposed attention metric remains capable of accurately distinguishing the correct key.
KW - attention mechanism
KW - deep learning
KW - non-profiled attacks
KW - quantitative analysis
UR - http://www.scopus.com/inward/record.url?scp=85167788706&partnerID=8YFLogxK
U2 - 10.3390/electronics12153279
DO - 10.3390/electronics12153279
M3 - Article
AN - SCOPUS:85167788706
SN - 2079-9292
VL - 12
JO - Electronics (Switzerland)
JF - Electronics (Switzerland)
IS - 15
M1 - 3279
ER -