Abstract
The cameras mounted on autonomous vehicles are frequently contaminated by natural occurrences such as mud spots, stains, and raindrops. Because such contamination looks natural, it is difficult to flag as an anomaly. Inspired by this phenomenon, we propose a novel adversarial camera contamination attack that exploits the inconspicuous nature of lens contamination to mount object-oriented attacks. Our method specifically targets certain objects (e.g., stop signs) while minimizing the impact on untargeted objects (e.g., cars). With a contamination generation module, we model various types of contamination and generate adversarial camera patches that are both realistic and effective. Experiments conducted in both the digital and physical domains demonstrate that our method degrades the detection accuracy of target objects by over 60% while having minimal impact on untargeted objects.
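The paper's contamination generation module is not detailed in this abstract. As a minimal illustration of the general idea, lens contamination (e.g., a mud spot) is often modeled as a semi-transparent blob alpha-blended onto the camera image; the sketch below shows such a compositing step. All function names and parameters here are hypothetical and not taken from the paper.

```python
import numpy as np

def apply_contamination(image, mask, color, alpha):
    """Alpha-blend a semi-transparent contamination blob onto an image.

    image: HxWx3 float array in [0, 1]
    mask:  HxW float array in [0, 1], soft support of the contaminant
    color: length-3 RGB contaminant colour (e.g., brownish for mud)
    alpha: global opacity of the contaminant
    """
    a = (alpha * mask)[..., None]               # per-pixel opacity, HxWx1
    return (1 - a) * image + a * np.asarray(color, dtype=float)

# Toy example: a Gaussian "mud spot" in the centre of a grey image.
h = w = 64
yy, xx = np.mgrid[0:h, 0:w]
mask = np.exp(-(((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (2 * 10.0 ** 2)))
img = np.full((h, w, 3), 0.5)
out = apply_contamination(img, mask, color=[0.4, 0.3, 0.2], alpha=0.8)
```

In an attack pipeline such a mask and colour would be optimized against a detector's loss (raising it for target classes, keeping it low for others), but that optimization loop is beyond what this abstract specifies.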
| Original language | English |
|---|---|
| Journal | Proceedings of the International Joint Conference on Neural Networks |
| DOIs | |
| Publication status | Published - 2025 |
| Externally published | Yes |
| Event | 2025 International Joint Conference on Neural Networks, IJCNN 2025 - Rome, Italy Duration: 30 Jun 2025 → 5 Jul 2025 |
Keywords
- adversarial attacks
- camera attacks
- object detection
Title: AdvLensPolluter: Object-oriented Adversarial Camera Contamination Attack