Full metadata record

DC Field | Value | Language
dc.contributor.author | Choi, Hojae | -
dc.contributor.author | Kim, Jaewook | -
dc.contributor.author | Park, Jongkil | -
dc.contributor.author | Park, Seongsik | -
dc.contributor.author | Jang, Hyun Jae | -
dc.contributor.author | Lee, Seung Hwan | -
dc.contributor.author | Ju, Byeong-Kwon | -
dc.contributor.author | Jeong, YeonJoo | -
dc.date.accessioned | 2026-02-26T05:00:05Z | -
dc.date.available | 2026-02-26T05:00:05Z | -
dc.date.created | 2026-02-26 | -
dc.date.issued | 2026-04 | -
dc.identifier.issn | 0925-2312 | -
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/154368 | -
dc.description.abstract | Backpropagation Through Time (BPTT) trains Recurrent Spiking Neural Networks (R-SNNs) effectively but incurs high computational and memory costs, limiting real-time applications. To mitigate resource demands, we adopt truncated BPTT (K=1), reducing memory cost by three orders of magnitude. However, this truncation weakens sequence learning by limiting gradient propagation. To compensate, we introduce the Spatio-temporal Adaptive Recurrent Spiking Neural Network (STAR-SNN), which incorporates adaptive parameters to enhance high-dimensional representations and effectively retain sequence information despite truncation. Additionally, R-SNNs suffer from unstable training due to the entanglement of spike generation and suppression in weight updates. To resolve this, we develop Separated Propagation Surrogate Gradient (SPSG), which decouples these processes by selectively propagating error signals, stabilizing learning and improving convergence. Our approach achieves a 393-fold reduction in MSE loss for chaotic system forecasting and delivers high performance in event-driven DVS-Gesture recognition, establishing a scalable, hardware-efficient framework for real-time neuromorphic computing. | -
dc.language | English | -
dc.publisher | Elsevier BV | -
dc.title | STAR-SNN: A spatio-temporal adaptive recurrent spiking neural network with separated propagation surrogate gradient for hardware efficient real-time learning | -
dc.type | Article | -
dc.identifier.doi | 10.1016/j.neucom.2026.132968 | -
dc.description.journalClass | 1 | -
dc.identifier.bibliographicCitation | Neurocomputing, v.674 | -
dc.citation.title | Neurocomputing | -
dc.citation.volume | 674 | -
dc.description.isOpenAccess | N | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
dc.identifier.wosid | 001688702900001 | -
dc.identifier.scopusid | 2-s2.0-105029521411 | -
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | -
dc.relation.journalResearchArea | Computer Science | -
dc.type.docType | Article | -
dc.subject.keywordPlus | MODEL | -
dc.subject.keywordPlus | BACKPROPAGATION | -
dc.subject.keywordAuthor | Backpropagation through time | -
dc.subject.keywordAuthor | Energy efficiency | -
dc.subject.keywordAuthor | Neuromorphic computing | -
dc.subject.keywordAuthor | Real-time learning | -
dc.subject.keywordAuthor | Spiking neural networks | -
dc.subject.keywordAuthor | Surrogate gradient | -
Appears in Collections:
KIST Article > 2026
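The abstract describes training with truncated BPTT at K=1 (so each step's gradient is cut off from earlier state) combined with a surrogate gradient through the non-differentiable spike. As a rough illustration of those two generic ideas only — not the paper's STAR-SNN or SPSG method, whose details are not in this record — the following minimal sketch trains a single leaky integrate-and-fire neuron; all names (`train_step_k1`, `surrogate_grad`) and parameter values are hypothetical choices for this example.

```python
import numpy as np

def surrogate_grad(v, thresh=1.0, alpha=2.0):
    """Rectangular surrogate for d(spike)/d(membrane): nonzero only in a
    window of width 2/alpha around the firing threshold (an illustrative
    choice; many surrogate shapes exist)."""
    return (np.abs(v - thresh) < 1.0 / alpha) * (alpha / 2.0)

def train_step_k1(w, x_seq, y_seq, lr=0.1, decay=0.9, thresh=1.0):
    """One pass of truncated BPTT with K=1 over a single LIF neuron.

    The membrane potential carried between steps is treated as a constant
    (the K=1 truncation), so each step's weight gradient flows only
    through the current input term w * x[t], not through the recurrence.
    """
    v = 0.0
    loss = 0.0
    for x, y in zip(x_seq, y_seq):
        v = decay * v + w * x              # leaky membrane update
        s = float(v >= thresh)             # binary spike (forward pass)
        err = s - y
        loss += 0.5 * err ** 2
        # K=1: dL/dw = err * ds/dv * dv/dw, with dv/dw truncated to x
        grad = err * surrogate_grad(v, thresh) * x
        w -= lr * grad
        v *= 1.0 - s                       # hard reset after a spike
    return w, loss
```

A real implementation would vectorize over neurons and batches; per the abstract, the paper additionally adds adaptive parameters (STAR-SNN) and separates the error paths for spike generation versus suppression (SPSG), neither of which is modeled in this sketch.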