Patch of Invisibility: Naturalistic Physical Black-Box Adversarial Attacks on Object Detectors

By R. Lapid et al.
Published on Oct. 17, 2023

Table of Contents

1. Introduction
2. Previous work
3. Method
3.1. Generating adversarial patches
3.2. Adversarial gradient estimation

Summary

This paper introduces a black-box method for crafting physical adversarial patches against object detectors using the latent space of pretrained generative adversarial networks (GANs). The attack targets real-world scenarios and requires no access to the victim model's gradients: by searching the GAN's latent space, the authors efficiently generate diverse, naturalistic adversarial patches. In their experiments, the method outperforms other black-box attacks in both the digital and physical domains. The study underscores the importance of addressing physical adversarial attacks, which can have serious consequences in real-world applications, and overall contributes a model-agnostic approach to black-box attacks on object detectors.
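The paper is only summarized here, so the following is a minimal, self-contained sketch of the general idea rather than the authors' implementation: a frozen pretrained generator maps a low-dimensional latent vector to a naturalistic patch, the victim detector is queried as a black box for a confidence score only, and the latent vector is updated with a sampled (NES-style) gradient estimate instead of true gradients. The names `toy_generator` and `detector_confidence`, as well as all dimensions and hyperparameters, are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 16   # illustrative latent size; a real GAN would use e.g. 128
PATCH_SIZE = 8    # illustrative patch resolution

# Stand-in for a pretrained GAN generator G: latent z -> RGB patch in [0, 1].
# In the setting described above this would be a frozen, pretrained generator.
W = rng.normal(size=(PATCH_SIZE * PATCH_SIZE * 3, LATENT_DIM))

def toy_generator(z: np.ndarray) -> np.ndarray:
    """Map a latent vector to a patch; a sigmoid keeps pixel values in [0, 1]."""
    flat = 1.0 / (1.0 + np.exp(-W @ z))
    return flat.reshape(PATCH_SIZE, PATCH_SIZE, 3)

def detector_confidence(patch: np.ndarray) -> float:
    """Black-box stand-in for the victim detector's confidence on a scene
    containing the patch. Only this scalar output is ever used."""
    return float(np.tanh(patch.mean()))  # placeholder score

def nes_gradient(z: np.ndarray, sigma: float = 0.1, n_samples: int = 50) -> np.ndarray:
    """Estimate d(confidence)/dz with antithetic Gaussian sampling (NES-style),
    using only forward queries to the black-box detector."""
    grad = np.zeros_like(z)
    for _ in range(n_samples):
        u = rng.normal(size=z.shape)
        f_plus = detector_confidence(toy_generator(z + sigma * u))
        f_minus = detector_confidence(toy_generator(z - sigma * u))
        grad += (f_plus - f_minus) * u
    return grad / (2.0 * sigma * n_samples)

# Gradient-free attack loop: descend the estimated gradient so the detector's
# confidence on the patched scene drops, while optimizing only in latent space.
z = rng.normal(size=LATENT_DIM)
lr = 0.5
for step in range(100):
    z -= lr * nes_gradient(z)
    if step % 20 == 0:
        print(f"step {step:3d}  confidence = {detector_confidence(toy_generator(z)):.4f}")

adversarial_patch = toy_generator(z)  # naturalistic patch decoded from the optimized latent
```

In practice the confidence would come from querying a real detector (e.g., a YOLO-family model) on images with the patch rendered onto the target object; optimizing over the generator's latent space rather than raw pixels is what keeps the resulting patch naturalistic.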