Bui, N.-M., Dinh, V.-N., Pham, V.-H. and Trinh, Q.-K. (2023) Uncovering the Resilience of Binarized Spiking Neural Networks under Adversarial Attacks. In: 12th IEEE International Conference on Control, Automation and Information Sciences (ICCAIS 2023), 27–29 November 2023, Hanoi.
Full text not available from this repository.

Abstract
The Binarized Spiking Neural Network (BSNN), a Spiking Neural Network with binary weights, is particularly suitable for Edge-AI hardware architectures thanks to its simplicity in data format and computing functions. However, like other SNNs, BSNNs can be trained directly or converted from Artificial Neural Networks using gradient-based methods, and are hence highly susceptible to adversarial attacks. This study investigates the resilience of BSNNs against adversarial attacks. We assess the robustness of BSNNs under FGSM and PGD attacks on the Fashion-MNIST dataset. This work marks the first implementation of adversarial attack and defense tailored to BSNNs. Our results show that adversarial training significantly enhances the robustness of BSNNs against adversarial attacks compared to the original model. Improving the resilience of BSNNs opens the door to their application in real-world scenarios. © 2023 IEEE.
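For context, the two attacks named in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration on a toy logistic model, not the paper's BSNN implementation; all function names, weights, and parameter values here are illustrative assumptions. FGSM perturbs the input once along the sign of the loss gradient; PGD iterates that step and projects back onto an ε-ball around the original input.

```python
import numpy as np

# Illustrative FGSM/PGD on a toy binary logistic model (not the paper's BSNN).

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def input_grad(x, y, w, b):
    # Gradient of binary cross-entropy w.r.t. the input x:
    # dL/dx = (sigmoid(w.x + b) - y) * w
    return (sigmoid(w @ x + b) - y) * w

def fgsm(x, y, w, b, eps):
    # Single-step attack: move x along the sign of the loss gradient.
    return x + eps * np.sign(input_grad(x, y, w, b))

def pgd(x, y, w, b, eps, alpha, steps):
    # Iterated FGSM with projection back onto the eps-ball around x.
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(input_grad(x_adv, y, w, b))
        x_adv = x + np.clip(x_adv - x, -eps, eps)  # L-infinity projection
    return x_adv

# Toy example: a correctly classified point (true label y = 1).
w = np.array([1.0, -2.0]); b = 0.0
x = np.array([0.5, -0.5]); y = 1
x_fgsm = fgsm(x, y, w, b, eps=0.3)
x_pgd = pgd(x, y, w, b, eps=0.3, alpha=0.1, steps=10)
# Both attacks lower the model's confidence in the true label,
# while PGD stays within the eps-ball by construction.
```

Adversarial training, the defense evaluated in the paper, amounts to generating such perturbed inputs during training and including them in the loss, so the model learns to classify them correctly.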
| Item Type: | Conference or Workshop Item (Paper) |
| --- | --- |
| Divisions: | Faculties > Faculty of Radio-Electronic Engineering |
| Identification Number: | 10.1109/ICCAIS59597.2023.10382270 |
| Uncontrolled Keywords: | Adversarial attack; Adversarial training; Binarized spiking neural network; Computing functions; Hardware architecture; Neural networks; Original model; Real-world scenarios |
| Additional Information: | 12th IEEE International Conference on Control, Automation and Information Sciences (ICCAIS 2023); Conference Date: 27–29 November 2023; Conference Code: 196337 |
| URI: | http://eprints.lqdtu.edu.vn/id/eprint/11130 |