TY - GEN
T1 - Visualizing One Pixel Attack Using Adversarial Maps
AU - Wang, Wanyi
AU - Sun, Jian
AU - Wang, Gang
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/11/6
Y1 - 2020/11/6
N2 - One pixel attack is one of the most puzzling adversarial attacks, in which the position of the attack plays an important role. However, little research has been conducted on the distributions of one pixel attack. In this context, a technique called adversarial maps is proposed, which helps visualize the distributions of one pixel attack for the first time. Adversarial maps consist of pixel adversarial maps and probability adversarial maps, which record the pixel changes and the confidence of the target class in successful attack cases, respectively. Leveraging this technique, the distributions of one pixel attack and the reason why the attack position impacts the success rate are explored. Adversarial maps reveal that successful attacks tend to cluster into regions and that high-saliency areas of saliency maps are more likely to be attacked successfully. Moreover, these observations are further corroborated by a mathematical analysis, demonstrating that adversarial attacks are disturbances in the saliency maps.
AB - One pixel attack is one of the most puzzling adversarial attacks, in which the position of the attack plays an important role. However, little research has been conducted on the distributions of one pixel attack. In this context, a technique called adversarial maps is proposed, which helps visualize the distributions of one pixel attack for the first time. Adversarial maps consist of pixel adversarial maps and probability adversarial maps, which record the pixel changes and the confidence of the target class in successful attack cases, respectively. Leveraging this technique, the distributions of one pixel attack and the reason why the attack position impacts the success rate are explored. Adversarial maps reveal that successful attacks tend to cluster into regions and that high-saliency areas of saliency maps are more likely to be attacked successfully. Moreover, these observations are further corroborated by a mathematical analysis, demonstrating that adversarial attacks are disturbances in the saliency maps.
KW - adversarial maps
KW - attack distributions
KW - one pixel attack
KW - saliency maps
UR - http://www.scopus.com/inward/record.url?scp=85100932494&partnerID=8YFLogxK
U2 - 10.1109/CAC51589.2020.9327603
DO - 10.1109/CAC51589.2020.9327603
M3 - Conference contribution
AN - SCOPUS:85100932494
T3 - Proceedings - 2020 Chinese Automation Congress, CAC 2020
SP - 924
EP - 929
BT - Proceedings - 2020 Chinese Automation Congress, CAC 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 Chinese Automation Congress, CAC 2020
Y2 - 6 November 2020 through 8 November 2020
ER -