Full metadata record

DC Field: Value
dc.contributor.author: CHAE, YU JUNG
dc.contributor.author: Choi, JongSuk
dc.contributor.author: Kim, ChangHwan
dc.contributor.author: Yun, Sang-Seok
dc.date.accessioned: 2024-01-19T11:09:59Z
dc.date.available: 2024-01-19T11:09:59Z
dc.date.created: 2022-03-01
dc.date.issued: 2016-08
dc.identifier.issn: 2325-033X
dc.identifier.uri: https://pubs.kist.re.kr/handle/201004/114716
dc.description.abstract: In the field of human-robot interaction, it is important to attract and hold the user's interest. To this end, gaze and pointing gestures are useful for reading and guiding user intentions. However, studies that generate robot behaviors according to recognized user intentions are insufficient, and it is difficult to select a suitable behavior as the user's state changes. In this paper, we propose a method that guides the user's decision using the robot's gaze and pointing gestures.
dc.language: English
dc.publisher: IEEE
dc.title: Guide system based users' intentions for a humanoid robot
dc.type: Conference
dc.description.journalClass: 1
dc.identifier.bibliographicCitation: 13th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), pp. 67-70
dc.citation.title: 13th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI)
dc.citation.startPage: 67
dc.citation.endPage: 70
dc.citation.conferencePlace: US
dc.citation.conferencePlace: Xian, PEOPLES R CHINA
dc.citation.conferenceDate: 2016-08-19
dc.relation.isPartOf: 2016 13TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI)
dc.identifier.wosid: 000387249900015
dc.identifier.scopusid: 2-s2.0-85000473282
Appears in Collections:
KIST Conference Paper > 2016
Files in This Item:
There are no files associated with this item.
