Events Calendar

Rahul Paul, PhD (University of South Florida): Ensemble Adversarial Attack and Defense Against Medical Deep Learning System
Tuesday 10 March 2020, 12:00pm – 1:00pm

Lung cancer has one of the lowest survival rates of any cancer worldwide. Convolutional Neural Networks (CNNs) are now frequently deployed in the study of lung cancer. Nonetheless, recent work has shown that adding carefully crafted noise to an image can deceive a trained CNN model; this is known as an adversarial attack. We evaluated the Fast Gradient Sign Method (FGSM) adversarial attack for lung nodule malignancy analysis, using the National Lung Screening Trial (NLST) dataset. The three CNNs we proposed achieved 75.1%, 75.5%, and 76% accuracy on the original images (no adversarial attack), but showed a 28%–36% reduction in accuracy after the FGSM attack. We also observed that classification accuracy improved when adversarial images were incorporated into the training set. To counteract the adversarial attack, we proposed a multi-initialization ensemble. We also proposed an ensemble adversarial attack that caused a greater accuracy loss than a single FGSM attack.
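For context, FGSM perturbs an input by one small step in the direction of the sign of the loss gradient with respect to that input. The following is a minimal sketch in PyTorch, not the speaker's implementation; the function name fgsm_attack, the epsilon value, and the assumption of [0, 1] pixel intensities are illustrative:

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.01):
    """Fast Gradient Sign Method: move `image` one step of size
    `epsilon` along the sign of the loss gradient w.r.t. the input."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Shift each pixel by +/- epsilon along the gradient sign, then
    # clamp back to a valid [0, 1] intensity range.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```

A multi-initialization ensemble defense of the kind described above would then combine (e.g., average) the predictions of several such models trained from different random initializations, so that a perturbation fooling one model is less likely to fool them all.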

Location: Goitein Room