Invisible CMOS Camera Dazzling for Conducting Adversarial Attacks on Deep Neural Networks

Zvi Stein, Adir Hazan, Adrian Stern

Research output: Contribution to journal › Article › peer-review

Abstract

Despite the outstanding performance of deep neural networks (DNNs), they remain vulnerable to adversarial attacks. While digital-domain adversarial attacks are well documented, most physical-world attacks are visible to the human eye. Here, we present a novel invisible optical physical adversarial attack that dazzles a CMOS camera. The attack uses a designed light pulse sequence that the camera's shutter mechanism spatially transforms within the acquired image. We provide a detailed analysis of the photopic conditions required to keep the attacking light source invisible to human observers while effectively disrupting the image, thereby deceiving the DNN. The results indicate that the light source's duty cycle controls the tradeoff between the attack's success rate and the degree of concealment achieved.
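The mechanism described above can be illustrated with a minimal simulation sketch. A rolling shutter exposes each sensor row at a slightly later time, so a light source pulsing faster than the frame rate maps its on/off pattern onto horizontal stripes in the image, and the duty cycle sets the fraction of corrupted rows. All parameter values below (row readout time, pulse period) are hypothetical and not taken from the paper:

```python
def rolling_shutter_banding(n_rows=480, row_time_us=30.0,
                            pulse_period_us=600.0, duty_cycle=0.2):
    """Sketch of rolling-shutter banding from a pulsed light source.

    Each row r begins exposure at time r * row_time_us (hypothetical
    timing). A row is 'dazzled' if the pulsed source is on during that
    instant, i.e. if the pulse phase at that time falls within the
    duty cycle. Returns a per-row True/False stripe pattern.
    """
    dazzled = []
    for r in range(n_rows):
        t = r * row_time_us                      # row exposure start time
        phase = (t % pulse_period_us) / pulse_period_us
        dazzled.append(phase < duty_cycle)       # True = corrupted row
    return dazzled

stripes = rolling_shutter_banding()
print(f"fraction of dazzled rows: {sum(stripes) / len(stripes):.2f}")
```

Because the pulse period here spans an integer number of rows, the fraction of dazzled rows equals the duty cycle exactly; raising the duty cycle corrupts more rows (stronger attack) but also makes the source emit longer and thus easier to perceive, which is the tradeoff the abstract describes.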

Original language: American English
Article number: 2301
Journal: Sensors
Volume: 25
Issue number: 7
DOIs
State: Published - 1 Apr 2025

Keywords

  • CMOS
  • PSF
  • adversarial attack
  • rolling shutter

All Science Journal Classification (ASJC) codes

  • Analytical Chemistry
  • Information Systems
  • Atomic and Molecular Physics, and Optics
  • Biochemistry
  • Instrumentation
  • Electrical and Electronic Engineering
