AdvLensPolluter: Object-oriented Adversarial Camera Contamination Attack

  • Feiyang Xu
  • Shihao Wang
  • Genghua Kou
  • Ying Li (corresponding author)

Research output: Contribution to journal › Conference article › peer-review

Abstract

The cameras equipped on autonomous vehicles are frequently contaminated by natural occurrences such as mud spots, stains, and raindrops. Such natural contamination is difficult to detect as an anomaly. Inspired by this phenomenon, we propose a novel adversarial camera contamination attack that exploits the inconspicuous nature of lens contamination to achieve object-oriented attacks. Our method specifically targets certain objects (e.g., stop signs) while minimizing the impact on untargeted objects (e.g., cars). Through a contamination generation module, we model various types of contamination and generate adversarial camera patches that are both realistic and effective. Our experiments, conducted in both the digital and physical domains, demonstrate that our method degrades the detection accuracy of target objects by over 60% while having minimal impact on untargeted objects.
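The object-oriented behavior described above (suppressing targeted objects while preserving untargeted ones) can be sketched as a two-term adversarial objective. This is a minimal illustrative sketch, not the authors' implementation; the function names, detection representation, and the balancing weight `alpha` are all assumptions.

```python
# Hypothetical sketch of an object-oriented adversarial loss: minimize the
# detector's confidence on targeted classes (e.g. stop signs) while keeping
# untargeted detections (e.g. cars) close to their clean-image confidences.
# Names and the weighting factor are illustrative, not from the paper.

def object_oriented_loss(detections, clean_detections, target_classes, alpha=1.0):
    """detections / clean_detections: lists of (class_name, confidence),
    aligned so the i-th entries refer to the same scene object.

    Returns a scalar to minimize when optimizing the contamination pattern.
    """
    # Attack term: total confidence remaining on targeted objects.
    attack_term = sum(conf for cls, conf in detections if cls in target_classes)

    # Preservation term: confidence drift on untargeted objects relative to
    # the clean (uncontaminated) image.
    preserve_term = sum(
        abs(conf - clean_conf)
        for (cls, conf), (_, clean_conf) in zip(detections, clean_detections)
        if cls not in target_classes
    )
    return attack_term + alpha * preserve_term


# Example: the contamination suppresses the stop sign (0.95 -> 0.20)
# while leaving the car detection untouched.
loss = object_oriented_loss(
    detections=[("stop_sign", 0.20), ("car", 0.90)],
    clean_detections=[("stop_sign", 0.95), ("car", 0.90)],
    target_classes={"stop_sign"},
)
```

A lower loss corresponds to a more successful attack under this formulation: the targeted confidence term shrinks while the untargeted deviation term stays near zero.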

Original language: English
Journal: Proceedings of the International Joint Conference on Neural Networks
DOIs
Publication status: Published - 2025
Externally published: Yes
Event: 2025 International Joint Conference on Neural Networks, IJCNN 2025 - Rome, Italy
Duration: 30 Jun 2025 – 5 Jul 2025

Keywords

  • adversarial attacks
  • camera attacks
  • object detection
