Late Breaking Results: Improving Deep SNNs with Gradient Clipping and Noise Exploitation in Neuromorphic Devices
- Authors
- Park, Seongsik; Park, Jongkil; Jang, Hyun Jae; Kim, Jaewook; Jeong, YeonJoo; Hwang, Gyu Weon; Kim, Inho; Park, Jong-Keuk; Lee, Kyeong Seok; Lee, Suyoun
- Issue Date
- 2025-05
- Publisher
- IEEE
- Citation
- 2025 Design, Automation & Test in Europe Conference (DATE)
- Abstract
- Deep spiking neural networks (SNNs) have shown remarkable progress thanks to improvements such as training algorithms. However, most of these advances do not account for the characteristics of neuromorphic devices: they rely on soft resets, which are computationally expensive and ill-suited to such hardware. To address this, this paper proposes gradient clipping for hard reset-based deep SNNs and explores how device noise can enhance learning performance. In experiments on various datasets and models, the proposed approach improved the training performance of deep SNNs with hard resets. These findings bridge the gap between SNN algorithms and hardware constraints, paving the way for efficient neuromorphic computing.
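As a rough illustration of the two ideas named in the abstract, the sketch below shows a hard-reset leaky integrate-and-fire (LIF) neuron trained with a surrogate gradient, with gradient clipping applied before the optimizer step and optional additive membrane noise standing in for device noise. This is not the authors' implementation: the rectangular surrogate, the toy network (`TinySNN`), and every hyperparameter (`tau`, `threshold`, `noise_std`, `max_norm`) are illustrative assumptions written in PyTorch.

```python
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate in backward."""

    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the threshold (one common surrogate choice).
        window = ((v - ctx.threshold).abs() < 0.5).float()
        return grad_output * window, None


class HardResetLIF(nn.Module):
    """LIF neuron with a hard reset: the membrane potential is forced back
    to zero after a spike, instead of subtracting the threshold (soft reset)."""

    def __init__(self, tau=2.0, threshold=1.0, noise_std=0.0):
        super().__init__()
        self.tau, self.threshold, self.noise_std = tau, threshold, noise_std

    def forward(self, x_seq):  # x_seq: (T, batch, features)
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x_t in x_seq:
            v = v + (x_t - v) / self.tau  # leaky integration
            if self.noise_std > 0:  # additive noise standing in for device noise
                v = v + torch.randn_like(v) * self.noise_std
            s = SurrogateSpike.apply(v, self.threshold)
            v = v * (1.0 - s)  # hard reset to zero
            spikes.append(s)
        return torch.stack(spikes)


class TinySNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 10)
        self.lif = HardResetLIF(noise_std=0.05)  # noise level is an assumption

    def forward(self, x_seq):  # x_seq: (T, batch, 16)
        return self.lif(self.fc(x_seq))


model = TinySNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(8, 4, 16)  # dummy input currents, T=8 time steps
target = torch.randint(0, 10, (4,))

logits = model(x).mean(dim=0)  # rate decoding: average spikes over time
loss = nn.functional.cross_entropy(logits, target)
opt.zero_grad()
loss.backward()
# Clip gradients before the update; hard resets can otherwise produce
# large, unstable surrogate gradients.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
opt.step()
```

Replacing the reset line with `v = v - s * self.threshold` would recover the soft reset that, per the abstract, is computationally expensive and unsuitable for neuromorphic devices.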
- Keywords
- hard reset; gradient clipping; spiking neural networks; neuromorphic device
- ISSN
- 1530-1591
- URI
- https://pubs.kist.re.kr/handle/201004/152996
- DOI
- 10.23919/DATE64628.2025.10993187
- Appears in Collections:
- KIST Article > Others
- Files in This Item:
There are no files associated with this item.