Dinh, V.-N. and Bui, N.-M. and Nguyen, V.-T. and John, D. and Lin, L.-Y. and Trinh, Q.-K. (2023) NUTS-BSNN: A non-uniform time-step binarized spiking neural network with energy-efficient in-memory computing macro. Neurocomputing, 560: 126838. ISSN 0925-2312
Full text not available from this repository.

Abstract
This work introduces NUTS-BSNN, a Non-Uniform Time-Step Binarized Spiking Neural Network. NUTS-BSNN is a fully binarized spiking neural network: all weights, including those of the input and output layers, are binary. In the input and output layers, the weights are represented as stochastic series of numbers, while in the hidden layers they are approximated to binary values so that simple XNOR-based computations can be used. To compensate for the information loss due to binarization, the convolutions at the input layer are computed sequentially over multiple time-steps, and their results are accumulated before spikes are generated for the subsequent layers, which increases overall performance. We chose 14 time-steps for accumulation as a good trade-off between accuracy and inference latency. The proposed network was trained directly with a surrogate gradient algorithm and evaluated on three datasets, achieving classification accuracies of 93.25%, 88.71%, and 70.31% on Fashion-MNIST, CIFAR-10, and CIFAR-100, respectively. Further, we present an in-memory computing architecture for NUTS-BSNN that limits resource and power consumption in hardware implementations. © 2023 Elsevier B.V.
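The core arithmetic the abstract describes can be sketched in a few lines: with ±1 weights and bipolar spike values each encoded as a single bit, a dot product reduces to an XNOR followed by a popcount, and pre-activations are accumulated over multiple time-steps before the neuron decides whether to fire. The sketch below is illustrative only, not the paper's implementation; the fan-in, threshold, and random inputs are arbitrary assumptions, and only the 14 time-steps come from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 14           # time-steps accumulated before spiking (from the abstract)
N = 64           # fan-in of the illustrative neuron (arbitrary assumption)
THRESHOLD = 8.0  # firing threshold (arbitrary assumption)

# Binary weights stored as bits: bit 1 encodes +1, bit 0 encodes -1
w_bits = rng.integers(0, 2, size=N)

def xnor_dot(in_bits, weight_bits):
    """Dot product of {-1, +1} vectors via XNOR + popcount.

    With bit b encoding the value 2*b - 1, each product term equals
    2*(in XNOR weight) - 1, so the whole dot product is
    2*popcount(xnor) - n.
    """
    xnor = ~(in_bits ^ weight_bits) & 1
    return 2 * int(xnor.sum()) - len(in_bits)

# Accumulate pre-activations over T time-steps, then threshold once
membrane = 0.0
for _ in range(T):
    spikes = rng.integers(0, 2, size=N)  # incoming binary activity
    membrane += xnor_dot(spikes, w_bits)

fires = membrane >= THRESHOLD
```

Replacing multiply-accumulate with XNOR/popcount is what makes such networks attractive for in-memory computing macros: the operation maps directly onto bitwise hardware.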
| Item Type | Article |
|---|---|
| Divisions | Faculties > Faculty of Radio-Electronic Engineering |
| Identification Number (DOI) | 10.1016/j.neucom.2023.126838 |
| Uncontrolled Keywords | Classification (of information); Energy efficiency; Memory architecture; Multilayer neural networks; Stochastic systems, AI applications; Binary spiking neural network; Edge-AI application; In-memory computing; Input and outputs; Input layers; Neural-networks; Neuromorphic computing; Non-uniform; Time step, Network architecture, accuracy; algorithm; Article; binary classification; computer analysis; computer simulation; convolution algorithm; data processing; energy consumption; latent period; memory; non uniform time step binarized spiking neural network; spiking neural network; stochastic model; surrogate gradient algorithm |
| URI | http://eprints.lqdtu.edu.vn/id/eprint/10975 |