Full metadata record

DC Field | Value
dc.contributor.author | Kwon, Jangho
dc.contributor.author | Ha, Jihyeon
dc.contributor.author | Kim, Da-Hye
dc.contributor.author | Choi, Jun Won
dc.contributor.author | Kim, Laehyun
dc.date.accessioned | 2024-01-19T13:34:02Z
dc.date.available | 2024-01-19T13:34:02Z
dc.date.created | 2022-01-10
dc.date.issued | 2021-10
dc.identifier.issn | 2169-3536
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/116340
dc.description.abstract | We present a glasses-type wearable device that detects emotions from a human face in an unobtrusive manner. The device is designed to gather multi-channel responses from the user's face naturally and continuously while he/she is wearing it. The multi-channel facial responses consist of local facial images and biosignals, including electrodermal activity (EDA) and photoplethysmogram (PPG). We conducted experiments to determine the optimal positions of the EDA sensors on the wearable device, because EDA signal quality is very sensitive to the sensing position. In addition to the physiological data, the device can capture the image region representing local facial expressions around the left eye via a built-in camera. In this study, we developed and validated an algorithm that recognizes emotions using the multi-channel responses obtained from the device. The results show that the emotion recognition algorithm using only local facial images classifies emotions with an accuracy of 76.09%. Using multi-channel data including EDA and PPG, the accuracy increased by 8.46% compared to using the local facial expressions alone. This glasses-type wearable system, which measures multi-channel facial responses in a natural manner, is well suited to monitoring a user's emotions in daily life and has great potential for use in the healthcare industry.
dc.language | English
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject | HEART-RATE-VARIABILITY
dc.subject | SENSOR
dc.title | Emotion Recognition Using a Glasses-Type Wearable Device via Multi-Channel Facial Responses
dc.type | Article
dc.identifier.doi | 10.1109/ACCESS.2021.3121543
dc.description.journalClass | 1
dc.identifier.bibliographicCitation | IEEE ACCESS, v.9, pp.146392 - 146403
dc.citation.title | IEEE ACCESS
dc.citation.volume | 9
dc.citation.startPage | 146392
dc.citation.endPage | 146403
dc.description.journalRegisteredClass | scie
dc.description.journalRegisteredClass | scopus
dc.identifier.wosid | 000714706800001
dc.identifier.scopusid | 2-s2.0-85118246850
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory | Telecommunications
dc.relation.journalResearchArea | Computer Science
dc.relation.journalResearchArea | Engineering
dc.relation.journalResearchArea | Telecommunications
dc.type.docType | Article
dc.subject.keywordPlus | HEART-RATE-VARIABILITY
dc.subject.keywordPlus | SENSOR
dc.subject.keywordAuthor | Wearable computers
dc.subject.keywordAuthor | Emotion recognition
dc.subject.keywordAuthor | Sensors
dc.subject.keywordAuthor | Cameras
dc.subject.keywordAuthor | Biomedical monitoring
dc.subject.keywordAuthor | Glass
dc.subject.keywordAuthor | Motion pictures
dc.subject.keywordAuthor | Wearable device
dc.subject.keywordAuthor | emotion recognition
dc.subject.keywordAuthor | affective computing
dc.subject.keywordAuthor | facial expression
dc.subject.keywordAuthor | biosignal
dc.subject.keywordAuthor | physiological responses
Appears in Collections: KIST Article > 2021
Files in This Item: There are no files associated with this item.

