ATTACK-COSM: attacking the camouflaged object segmentation model through digital world adversarial examples

Qiaoyi Li, Zhengjie Wang*, Xiaoning Zhang, Yang Li

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The camouflaged object segmentation model (COSM) has recently gained substantial attention due to its remarkable ability to detect camouflaged objects. Nevertheless, deep vision models are widely acknowledged to be susceptible to adversarial examples, which can mislead models into making incorrect predictions through imperceptible perturbations. This vulnerability raises significant concerns when deploying COSM in security-sensitive applications, so it is crucial to determine whether this foundational vision model is also susceptible to such attacks. To our knowledge, our work is the first to explore strategies for attacking COSM with adversarial examples in the digital world. With the primary objective of reversing the predictions for both masked objects and backgrounds, we study the adversarial robustness of COSM in both white-box and black-box settings. Beyond this primary objective, our investigation reveals the potential to generate any desired mask through adversarial attacks. The experimental results indicate that COSM exhibits weak robustness, leaving it vulnerable to adversarial example attacks. For camouflaged object segmentation (COS), the projected gradient descent (PGD) attack shows stronger attack capability than the fast gradient sign method (FGSM) in both white-box and black-box settings. These findings can help reduce security risks in the application of COSM and pave the way for its broader use.
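As a rough illustration of the targeted objective described in the abstract, the sketch below shows how a PGD-style perturbation could push a segmentation model's per-pixel predictions toward the reversed mask. It is a minimal PyTorch sketch, not the authors' implementation; `model`, `image`, `pred_mask`, and the hyperparameters `eps`, `alpha`, and `steps` are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F

def pgd_reverse_mask(model, image, pred_mask, eps=8/255, alpha=2/255, steps=10):
    """Targeted PGD sketch: nudge `image` so the model's per-pixel output
    flips toward the reversed mask (object <-> background).

    All names and hyperparameters are illustrative assumptions, not the
    paper's exact setup.
    """
    target = 1.0 - pred_mask          # reversed prediction as the attack target
    adv = image.clone().detach()
    for _ in range(steps):
        adv.requires_grad_(True)
        logits = model(adv)           # assumed: per-pixel logits, same shape as pred_mask
        loss = F.binary_cross_entropy_with_logits(logits, target)
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            adv = adv - alpha * grad.sign()               # step toward the reversed mask
            adv = image + (adv - image).clamp(-eps, eps)  # project into the L-inf ball
            adv = adv.clamp(0.0, 1.0)                     # keep a valid image
    return adv.detach()
```

Setting `steps=1` with `alpha=eps` reduces this to a single-step, FGSM-style attack, which is the baseline the abstract compares PGD against.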

Original language: English
Journal: Complex and Intelligent Systems
DOIs
Publication status: Accepted/In press - 2024

Keywords

  • Adversarial robustness
  • Black-box setting
  • COSM
  • White-box setting

