AutoSNN: Towards Energy-Efficient Spiking Neural Networks

Authors
Na, Byunggook; Mok, Jisoo; Park, Seongsik; Lee, Dongjin; Choe, Hyeokjun; Yoon, Sungroh
Issue Date
2022-07
Publisher
JMLR - Journal of Machine Learning Research
Citation
39th International Conference on Machine Learning (ICML)
Abstract
Spiking neural networks (SNNs), which mimic information transmission in the brain, can process spatio-temporal information energy-efficiently through discrete, sparse spikes and have therefore received considerable attention. To improve the accuracy and energy efficiency of SNNs, most previous studies have focused solely on training methods, and the effect of architecture has rarely been studied. We investigate the design choices used in previous studies in terms of accuracy and the number of spikes and find that they are not well-suited for SNNs. To further improve accuracy and reduce the number of spikes generated by SNNs, we propose a spike-aware neural architecture search framework called AutoSNN. We define a search space consisting of architectures without undesirable design choices. To enable spike-aware architecture search, we introduce a fitness function that considers both the accuracy and the number of spikes. AutoSNN successfully searches for SNN architectures that outperform hand-crafted SNNs in both accuracy and energy efficiency. We thoroughly demonstrate the effectiveness of AutoSNN on various datasets, including neuromorphic datasets.
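The abstract's key idea, a fitness that rewards accuracy while penalizing spike count, can be sketched as follows. This is a hypothetical illustration, not the paper's exact formula: the function name `spike_aware_fitness`, the normalization against a reference architecture, and the trade-off exponent `lam` are all assumptions made for clarity.

```python
def spike_aware_fitness(accuracy, num_spikes, ref_spikes, lam=0.1):
    """Hypothetical spike-aware fitness for ranking candidate SNN architectures.

    accuracy:   validation accuracy of the candidate, in [0, 1]
    num_spikes: average number of spikes the candidate emits per inference
    ref_spikes: spike count of a reference architecture, used for normalization
    lam:        trade-off exponent between accuracy and spike efficiency
                (illustrative parameter, not taken from the paper)
    """
    # Fewer spikes than the reference -> normalized ratio < 1 -> fitness boost;
    # more spikes -> fitness penalty. lam controls how strong the effect is.
    return accuracy * (num_spikes / ref_spikes) ** (-lam)


# Two candidates with equal accuracy: the one emitting fewer spikes ranks higher.
efficient = spike_aware_fitness(0.90, 100_000, 200_000)
baseline = spike_aware_fitness(0.90, 200_000, 200_000)
assert efficient > baseline
```

In an evolutionary or sampling-based search loop, candidates would be ranked by this score instead of accuracy alone, which is how a search can be steered toward architectures that are both accurate and energy-efficient.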
URI
https://pubs.kist.re.kr/handle/201004/77153
DOI
10.48550/arXiv.2201.12738
Appears in Collections:
KIST Conference Paper > 2022