Full metadata record

DC Field                                   Value
dc.contributor.author                      Ha, Jihyeon
dc.contributor.author                      Park, Sangin
dc.contributor.author                      Im, Chang-Hwan
dc.contributor.author                      Kim, Laehyun
dc.date.accessioned                        2024-01-19T14:30:30Z
dc.date.available                          2024-01-19T14:30:30Z
dc.date.created                            2021-10-21
dc.date.issued                             2021-07
dc.identifier.issn                         1424-8220
dc.identifier.uri                          https://pubs.kist.re.kr/handle/201004/116794
dc.description.abstract                    Assistive devices such as meal-assist robots aid individuals with disabilities and support the elderly in performing daily activities. However, existing meal-assist robots are inconvenient to operate because of non-intuitive user interfaces that demand additional time and effort. We therefore developed a meal-assist robot system driven by a hybrid brain-computer interface based on three signals that can be measured with scalp electroencephalography electrodes. A single meal cycle comprises three steps: (1) a triple eye-blink (EB) detected on the prefrontal channel activates the system and initiates the cycle; (2) steady-state visual evoked potentials (SSVEPs) from occipital channels select the food according to the user's intention; and (3) electromyograms (EMGs) recorded from temporal channels while the user chews mark the end of the cycle and signal readiness for the next one. In experiments with five subjects, accuracy (EBs/SSVEPs/EMGs) was 94.67/83.33/97.33%, the false positive rate (EBs/EMGs) was 0.11/0.08 times/min, and the information transfer rate (SSVEPs) was 20.41 bits/min. These results demonstrate the feasibility of the assistive system. The proposed system allows users to eat on their own more naturally; it can also increase the self-esteem of disabled and elderly people and enhance their quality of life.
dc.language                                English
dc.publisher                               MDPI
dc.title                                   A Hybrid Brain-Computer Interface for Real-Life Meal-Assist Robot Control
dc.type                                    Article
dc.identifier.doi                          10.3390/s21134578
dc.description.journalClass                1
dc.identifier.bibliographicCitation        SENSORS, v.21, no.13
dc.citation.title                          SENSORS
dc.citation.volume                         21
dc.citation.number                         13
dc.description.journalRegisteredClass      scie
dc.description.journalRegisteredClass      scopus
dc.identifier.wosid                        000670916500001
dc.identifier.scopusid                     2-s2.0-85108945848
dc.relation.journalWebOfScienceCategory    Chemistry, Analytical
dc.relation.journalWebOfScienceCategory    Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory    Instruments & Instrumentation
dc.relation.journalResearchArea            Chemistry
dc.relation.journalResearchArea            Engineering
dc.relation.journalResearchArea            Instruments & Instrumentation
dc.type.docType                            Article
dc.subject.keywordAuthor                   meal-assist robot
dc.subject.keywordAuthor                   brain-computer interface
dc.subject.keywordAuthor                   electroencephalogram
dc.subject.keywordAuthor                   steady-state visual evoked potential
dc.subject.keywordAuthor                   eye-blink
dc.subject.keywordAuthor                   electromyogram
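
The abstract describes a three-stage control cycle (eye-blink activation, SSVEP food selection, EMG chewing confirmation) and reports an information transfer rate. The Python sketch below only illustrates how such a pipeline could be wired together: the sampling rate, thresholds, and detector rules (threshold-crossing blink counting, FFT peak picking for SSVEP, EMG RMS gating) are placeholder assumptions, not the authors' implementation, and itr_bits_per_min is the standard Wolpaw formula commonly used to report ITR, not necessarily the paper's exact computation.

```python
import numpy as np

FS = 250  # assumed EEG/EMG sampling rate (Hz); not given in this record


def detect_triple_blink(prefrontal, fs=FS, thresh=80e-6, window_s=2.0):
    # Count upward threshold crossings of the prefrontal signal; three
    # crossings within `window_s` seconds activate the system.
    # Threshold and window are illustrative placeholders.
    above = prefrontal > thresh
    onsets = np.flatnonzero(above[1:] & ~above[:-1])
    return any(onsets[i + 2] - onsets[i] <= window_s * fs
               for i in range(len(onsets) - 2))


def classify_ssvep(occipital, stim_freqs, fs=FS):
    # Select the stimulus whose flicker frequency carries the most
    # spectral power (a naive FFT rule; the authors' classifier may
    # differ, e.g. a CCA-based method).
    spectrum = np.abs(np.fft.rfft(occipital))
    freqs = np.fft.rfftfreq(len(occipital), 1.0 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    return int(np.argmax(powers))  # index of the selected food item


def chewing_detected(temporal_emg, rms_thresh=30e-6):
    # Flag chewing when the temporal-channel EMG RMS exceeds a
    # placeholder threshold, closing the meal cycle.
    return float(np.sqrt(np.mean(np.square(temporal_emg)))) > rms_thresh


def itr_bits_per_min(n_classes, accuracy, trial_s):
    # Wolpaw ITR in bits/min for an n-class selection with the given
    # per-trial accuracy and trial duration (seconds).
    p, n = accuracy, n_classes
    bits = (np.log2(n) + p * np.log2(p)
            + (1 - p) * np.log2((1 - p) / (n - 1)))
    return bits * 60.0 / trial_s


def meal_cycle(prefrontal, occipital, temporal_emg, stim_freqs):
    # One pass through the three-stage cycle described in the abstract:
    # activation -> food selection -> chewing confirmation.
    if not detect_triple_blink(prefrontal):
        return None  # no activation; remain idle
    food = classify_ssvep(occipital, stim_freqs)
    # (the robot would deliver food item `food` here)
    return food if chewing_detected(temporal_emg) else None
```

The paper reports 20.41 bits/min for the SSVEP stage; plugging a candidate class count, the 83.33% accuracy, and a trial length into itr_bits_per_min shows how those quantities trade off, though the actual stimulus count and trial duration are not stated in this record.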
Appears in Collections:
KIST Article > 2021
Files in This Item:
There are no files associated with this item.
