Full metadata record

DC Field: Value

dc.contributor.author: Kim, Guhyun
dc.contributor.author: Kornijcuk, Vladimir
dc.contributor.author: Kim, Dohun
dc.contributor.author: Kim, Inho
dc.contributor.author: Kim, Jaewook
dc.contributor.author: Woo, Hyo Cheon
dc.contributor.author: Kim, Jihun
dc.contributor.author: Hwang, Cheol Seong
dc.contributor.author: Jeong, Doo Seok
dc.date.accessioned: 2024-01-19T21:03:00Z
dc.date.available: 2024-01-19T21:03:00Z
dc.date.created: 2021-09-05
dc.date.issued: 2019-01
dc.identifier.issn: 2169-3536
dc.identifier.uri: https://pubs.kist.re.kr/handle/201004/120512
dc.description.abstract: In spite of remarkable progress in machine learning techniques, state-of-the-art algorithms often prevent machines from learning in real time (online learning), due in part to the computational complexity of parameter optimization. As an alternative, a learning algorithm that trains a memory in real time, named the Markov chain Hebbian learning algorithm, is proposed. The algorithm uses memory efficiently during training in that: 1) the weight matrix has ternary elements (-1, 0, 1) and 2) each update follows a Markov chain, so the upcoming update does not need past weight values. The algorithm was verified on two proof-of-concept tasks: image recognition (MNIST and CIFAR-10 datasets) and multiplication-table memorization. In particular, the latter bases multiplication arithmetic on memory, which may be analogous to human mental arithmetic. This memory-based multiplication arithmetic also offers a feasible basis for factorization, providing novel insight into memory-based arithmetic. (An illustrative sketch of such a ternary, memoryless update follows the metadata fields below.)
dc.language: English
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject: SINGLE NEURONS
dc.subject: DEEP
dc.subject: MEMORY
dc.subject: NETWORKS
dc.title: Markov Chain Hebbian Learning Algorithm With Ternary Synaptic Units
dc.type: Article
dc.identifier.doi: 10.1109/ACCESS.2018.2890543
dc.description.journalClass: 1
dc.identifier.bibliographicCitation: IEEE ACCESS, v.7, pp.10208 - 10223
dc.citation.title: IEEE ACCESS
dc.citation.volume: 7
dc.citation.startPage: 10208
dc.citation.endPage: 10223
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.identifier.wosid: 000457963100001
dc.identifier.scopusid: 2-s2.0-85061094324
dc.relation.journalWebOfScienceCategory: Computer Science, Information Systems
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory: Telecommunications
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Telecommunications
dc.type.docType: Article
dc.subject.keywordPlus: SINGLE NEURONS
dc.subject.keywordPlus: DEEP
dc.subject.keywordPlus: MEMORY
dc.subject.keywordPlus: NETWORKS
dc.subject.keywordAuthor: Greedy edge-wise training
dc.subject.keywordAuthor: Hebbian learning
dc.subject.keywordAuthor: Markov chain
dc.subject.keywordAuthor: mental arithmetic
dc.subject.keywordAuthor: prime factorization
dc.subject.keywordAuthor: supervised learning
dc.subject.keywordAuthor: ternary unit
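The abstract names two defining properties of the algorithm: synaptic weights restricted to the ternary values (-1, 0, 1), and updates that depend only on the current weight state rather than on any stored history (a Markov chain). The Python sketch below is only a minimal illustration of those two properties; the winner-take-all readout, the switching probability p, and the function name ternary_hebbian_step are assumptions made for illustration, not the paper's exact update rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def ternary_hebbian_step(W, x, label, p=0.1):
    """One hypothetical Hebbian-style update with ternary weights.

    W     : (n_classes, n_inputs) integer weight matrix, elements in {-1, 0, 1}
    x     : binary input vector
    label : index of the correct output unit
    p     : probability that an eligible synapse actually switches state

    The next W depends only on the current W and the present example
    (Markov property); no past weights or gradients are retained.
    """
    winner = int(np.argmax(W @ x))
    if winner != label:
        # Potentiate the target unit and depress the wrongly winning unit,
        # applied stochastically and only on active input lines.
        flip = (rng.random(x.shape) < p) & (x > 0)
        W[label, flip] += 1
        W[winner, flip] -= 1
        np.clip(W, -1, 1, out=W)  # keep every synaptic element ternary
    return W

# Toy usage: 3 output classes, 8 binary input lines.
W = np.zeros((3, 8), dtype=int)
x = np.array([1, 0, 1, 1, 0, 0, 1, 0])
W = ternary_hebbian_step(W, x, label=1)
```

Because each step only reads and writes the current ternary matrix, such a rule could in principle run in place on a memory array during training, which is the efficiency argument the abstract makes.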
Appears in Collections:
KIST Article > 2019
Files in This Item:
There are no files associated with this item.