STAR-SNN: A spatio-temporal adaptive recurrent spiking neural network with separated propagation surrogate gradient for hardware-efficient real-time learning

Authors
Choi, Hojae; Kim, Jaewook; Park, Jongkil; Park, Seongsik; Jang, Hyun Jae; Lee, Seung Hwan; Ju, Byeong-Kwon; Jeong, YeonJoo
Issue Date
2026-04
Publisher
Elsevier BV
Citation
Neurocomputing, v.674
Abstract
Backpropagation Through Time (BPTT) trains Recurrent Spiking Neural Networks (R-SNNs) effectively but incurs high computational and memory costs, limiting real-time applications. To mitigate resource demands, we adopt truncated BPTT (K=1), reducing memory cost by three orders of magnitude. However, this truncation weakens sequence learning by limiting gradient propagation. To compensate, we introduce the Spatio-temporal Adaptive Recurrent Spiking Neural Network (STAR-SNN), which incorporates adaptive parameters to enhance high-dimensional representations and effectively retain sequence information despite truncation. Additionally, R-SNNs suffer from unstable training due to the entanglement of spike generation and suppression in weight updates. To resolve this, we develop Separated Propagation Surrogate Gradient (SPSG), which decouples these processes by selectively propagating error signals, stabilizing learning and improving convergence. Our approach achieves a 393-fold reduction in MSE loss for chaotic system forecasting and delivers high performance in event-driven DVS-Gesture recognition, establishing a scalable, hardware-efficient framework for real-time neuromorphic computing.
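The abstract's central idea — truncating BPTT to K=1 so that no activation history is stored, while a surrogate gradient stands in for the non-differentiable spike — can be sketched in a toy NumPy example. This is an illustrative sketch only, not the paper's STAR-SNN or SPSG: the fast-sigmoid surrogate, the soft-reset LIF dynamics, and all parameter names (`alpha`, `theta`, `k`) are assumptions.

```python
import numpy as np

def surrogate_grad(v, theta=1.0, k=10.0):
    """Fast-sigmoid surrogate derivative of the Heaviside spike function
    (an assumed stand-in for the paper's SPSG)."""
    return 1.0 / (1.0 + k * np.abs(v - theta)) ** 2

def lif_step(v_prev, s_prev, x, w_in, w_rec, alpha=0.9, theta=1.0):
    """One step of a recurrent leaky integrate-and-fire layer with soft reset."""
    v = alpha * v_prev + w_in @ x + w_rec @ s_prev - theta * s_prev
    s = (v >= theta).astype(float)  # spikes: non-differentiable threshold
    return v, s

def grad_k1(v, x, target_s, theta=1.0):
    """K=1 truncated BPTT: v_prev and s_prev are treated as constants, so the
    only surviving gradient path is dL/ds * ds/dv * dv/dw_in at this step."""
    s = (v >= theta).astype(float)
    dL_ds = s - target_s                      # grad of 0.5*||s - target||^2
    dL_dv = dL_ds * surrogate_grad(v, theta)  # surrogate replaces dHeaviside/dv
    return np.outer(dL_dv, x)                 # dv/dw_in = x

# Toy online loop: one weight update per time step, no stored activation
# history -- the memory saving that motivates the K=1 truncation.
rng = np.random.default_rng(0)
n, m = 4, 3
w_in = 0.5 * rng.standard_normal((n, m))
w_rec = 0.1 * rng.standard_normal((n, n))
v, s = np.zeros(n), np.zeros(n)
for _ in range(5):
    x = rng.random(m)
    v, s = lif_step(v, s, x, w_in, w_rec)
    w_in -= 0.1 * grad_k1(v, x, target_s=np.zeros(n))
```

Because gradients never flow across time steps, per-step memory is constant in sequence length — which is why the paper must compensate with adaptive neuron parameters to retain sequence information.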
Keywords
MODEL; BACKPROPAGATION; Backpropagation through time; Energy efficiency; Neuromorphic computing; Real-time learning; Spiking neural networks; Surrogate gradient
ISSN
0925-2312
URI
https://pubs.kist.re.kr/handle/201004/154368
DOI
10.1016/j.neucom.2026.132968
Appears in Collections:
KIST Article > 2026
