Full metadata record

DC Field | Value | Language
dc.contributor.author | 김형오 | -
dc.contributor.author | 김수환 | -
dc.contributor.author | 김동환 | -
dc.contributor.author | 박성기 | -
dc.date.accessioned | 2024-01-20T21:34:00Z | -
dc.date.available | 2024-01-20T21:34:00Z | -
dc.date.created | 2021-09-06 | -
dc.date.issued | 2009-04 | -
dc.identifier.issn | 1975-6291 | -
dc.identifier.uri | https://pubs.kist.re.kr/handle/201004/132597 | -
dc.description.abstract | This paper describes how an unknown object indicated by a person's pointing gesture is extracted while the person interacts with a robot. Our proposed method, which uses a stereo vision sensor, consists of three stages: detecting the operator's face, estimating the pointing direction, and extracting the pointed object. The operator's face is detected using Haar-like features. We then estimate the 3D pointing direction from the shoulder-to-hand line. Finally, we segment the unknown object from the 3D point cloud within the estimated region of interest. Based on the proposed method, we implemented an object registration system on our mobile robot and obtained reliable experimental results. | -
dc.language | English | -
dc.publisher | 한국로봇학회 | -
dc.title | 서비스 로봇을 위한 지시 물체 분할 방법 | -
dc.title.alternative | Segmentation of Pointed Objects for Service Robots | -
dc.type | Article | -
dc.description.journalClass | 2 | -
dc.identifier.bibliographicCitation | 로봇학회 논문지, v.4, no.2, pp.139 - 146 | -
dc.citation.title | 로봇학회 논문지 | -
dc.citation.volume | 4 | -
dc.citation.number | 2 | -
dc.citation.startPage | 139 | -
dc.citation.endPage | 146 | -
dc.description.journalRegisteredClass | kci | -
dc.description.journalRegisteredClass | other | -
dc.identifier.kciid | ART001412521 | -
dc.subject.keywordAuthor | Arm-pointing Gestures | -
dc.subject.keywordAuthor | Object Segmentation | -
dc.subject.keywordAuthor | Object Recognition | -
dc.subject.keywordAuthor | Stereo vision | -
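
The abstract above outlines a three-stage pipeline: Haar-based face detection, pointing-direction estimation from the shoulder-to-hand line, and segmentation of the pointed object from the stereo point cloud within a region of interest. The following Python sketch is a minimal, hedged illustration of those stages, not the authors' implementation; the function names, the cylinder-shaped region of interest, and the numeric thresholds are assumptions introduced for illustration only.

```python
import cv2
import numpy as np

# Stage 1 (assumed setup): detect the operator's face with OpenCV's stock Haar cascade.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(gray_image):
    """Return face bounding boxes (x, y, w, h) from the Haar detector."""
    return face_cascade.detectMultiScale(gray_image, scaleFactor=1.1, minNeighbors=5)

# Stage 2: estimate the 3D pointing direction as the shoulder-to-hand line.
def pointing_direction(shoulder_xyz, hand_xyz):
    """Unit vector from the shoulder to the hand in camera coordinates."""
    d = np.asarray(hand_xyz, dtype=float) - np.asarray(shoulder_xyz, dtype=float)
    return d / np.linalg.norm(d)

# Stage 3: segment the pointed object from the stereo point cloud. Here the
# region of interest is a simple cylinder around the pointing ray; the radius
# and range values are illustrative assumptions, not the paper's parameters.
def segment_pointed_object(cloud_xyz, hand_xyz, direction, radius=0.15, max_range=3.0):
    """Keep points within `radius` metres of the ray, up to `max_range` metres ahead."""
    offsets = np.asarray(cloud_xyz, dtype=float) - np.asarray(hand_xyz, dtype=float)
    along = offsets @ direction                     # signed distance along the ray
    perp = offsets - np.outer(along, direction)     # component perpendicular to the ray
    mask = (along > 0.0) & (along < max_range) & (np.linalg.norm(perp, axis=1) < radius)
    return np.asarray(cloud_xyz)[mask]
```

In practice the shoulder and hand positions would come from the stereo reconstruction around the detected face and arm; how those 3D points are obtained is outside the scope of this sketch.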
Appears in Collections:
KIST Article > 2009
Files in This Item:
There are no files associated with this item.