
Kazuki Koga

Simple black-box universal adversarial attacks on medical image classification based on deep neural networks

Aug 11, 2021

Vulnerability of deep neural networks for detecting COVID-19 cases from chest X-ray images to universal adversarial attacks

May 22, 2020