Full metadata record

DC Field	Value	Language
dc.contributor.author	Yun, S.-S.	-
dc.contributor.author	Kim, M.	-
dc.contributor.author	Choi, M.-T.	-
dc.contributor.author	Song, J.-B.	-
dc.date.accessioned	2024-01-20T12:01:27Z	-
dc.date.available	2024-01-20T12:01:27Z	-
dc.date.created	2021-09-02	-
dc.date.issued	2013-08	-
dc.identifier.issn	1976-5622	-
dc.identifier.uri	https://pubs.kist.re.kr/handle/201004/127846	-
dc.description.abstract	According to cognitive science research, the interaction intent of humans can be estimated by analyzing their expressed behaviors. This paper proposes a novel methodology for reliable analysis of human intention based on this approach. To identify intention, eight behavioral features are extracted from four characteristics of human-human interaction, and a set of core components of human nonverbal behavior is outlined. These nonverbal behaviors are associated with recognition modules built on multimodal sensors, one per modality: localizing the speaker's sound source in the audition part, recognizing the frontal face and facial expression in the vision part, and estimating human trajectories, body pose and leaning, and hand gestures in the spatial part. As a post-processing step, temporal confidential reasoning is used to improve recognition performance, and an integrated human model quantitatively classifies intention from the multi-dimensional cues by applying weight factors. Interactive robots can thus make informed engagement decisions to interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction. © ICROS 2013.	-
dc.language	Korean	-
dc.subject	Behavioral features	-
dc.subject	Cognitive science	-
dc.subject	Confidential reasoning	-
dc.subject	Facial Expressions	-
dc.subject	Human intentions	-
dc.subject	Human-human interactions	-
dc.subject	Nonverbal behavior	-
dc.subject	Novel methodology	-
dc.subject	Gesture recognition	-
dc.subject	Human computer interaction	-
dc.subject	Human robot interaction	-
dc.subject	Man machine systems	-
dc.subject	Face recognition	-
dc.title	Interaction intent analysis of multiple persons using nonverbal behavior features	-
dc.type	Article	-
dc.identifier.doi	10.5302/J.ICROS.2013.13.1893	-
dc.description.journalClass	1	-
dc.identifier.bibliographicCitation	Journal of Institute of Control, Robotics and Systems, v.19, no.8, pp.738 - 744	-
dc.citation.title	Journal of Institute of Control, Robotics and Systems	-
dc.citation.volume	19	-
dc.citation.number	8	-
dc.citation.startPage	738	-
dc.citation.endPage	744	-
dc.description.journalRegisteredClass	scopus	-
dc.description.journalRegisteredClass	kci	-
dc.identifier.kciid	ART001791669	-
dc.identifier.scopusid	2-s2.0-84887181651	-
dc.type.docType	Article	-
dc.subject.keywordPlus	Behavioral features	-
dc.subject.keywordPlus	Cognitive science	-
dc.subject.keywordPlus	Confidential reasoning	-
dc.subject.keywordPlus	Facial Expressions	-
dc.subject.keywordPlus	Human intentions	-
dc.subject.keywordPlus	Human-human interactions	-
dc.subject.keywordPlus	Nonverbal behavior	-
dc.subject.keywordPlus	Novel methodology	-
dc.subject.keywordPlus	Gesture recognition	-
dc.subject.keywordPlus	Human computer interaction	-
dc.subject.keywordPlus	Human robot interaction	-
dc.subject.keywordPlus	Man machine systems	-
dc.subject.keywordPlus	Face recognition	-
dc.subject.keywordAuthor	Confidential reasoning	-
dc.subject.keywordAuthor	Human intention analysis	-
dc.subject.keywordAuthor	Human-robot interaction	-
dc.subject.keywordAuthor	Multiple-person interactions	-
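
The abstract above describes fusing multimodal nonverbal cues (audition, vision, spatial) with weight factors and applying temporal confidence reasoning before deciding whether a person intends to interact. The Python sketch below only illustrates that general idea under assumed cue names, weights, window size, and threshold; it is not the authors' implementation, and every identifier in it (CUE_WEIGHTS, IntentEstimator, the example values) is hypothetical.

# Hypothetical sketch -- not the authors' implementation. It fuses per-modality
# cue confidences with weight factors and smooths the fused score over a short
# time window before deciding whether a person intends to interact. All names,
# weights, and the threshold are assumptions made for illustration.

from collections import deque

# Assumed nonverbal cues, grouped by the modalities named in the abstract.
CUE_WEIGHTS = {
    "sound_source_toward_robot": 0.20,   # audition: speaker localization
    "frontal_face": 0.20,                # vision: frontal face detected
    "positive_facial_expression": 0.10,  # vision: facial expression
    "approaching_trajectory": 0.20,      # spatial: trajectory toward the robot
    "body_leaning_forward": 0.15,        # spatial: body pose and leaning
    "hand_gesture": 0.15,                # spatial: hand gesture
}


class IntentEstimator:
    """Weighted fusion of cue confidences with simple temporal smoothing."""

    def __init__(self, window=10, threshold=0.5):
        self.history = deque(maxlen=window)  # recent fused scores
        self.threshold = threshold

    def update(self, cue_confidences):
        """Fuse one frame of cue confidences (each in 0..1) into a smoothed score."""
        fused = sum(
            weight * cue_confidences.get(name, 0.0)
            for name, weight in CUE_WEIGHTS.items()
        )
        self.history.append(fused)
        # Temporal smoothing: moving average over the recent window.
        return sum(self.history) / len(self.history)

    def wants_to_interact(self, cue_confidences):
        """True if the smoothed intent score crosses the engagement threshold."""
        return self.update(cue_confidences) >= self.threshold


# Example: a person facing the robot, speaking toward it, and waving.
estimator = IntentEstimator()
engaged = estimator.wants_to_interact({
    "sound_source_toward_robot": 0.9,
    "frontal_face": 0.8,
    "approaching_trajectory": 0.6,
    "hand_gesture": 0.7,
})
print(engaged)

For the multiple-person setting in the title, one such estimator could be kept per tracked person, with the running scores deciding whom the robot engages; that per-person bookkeeping is likewise an assumption here, not something specified by this record.
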
Appears in Collections:
KIST Article > 2013
Files in This Item:
There are no files associated with this item.
