Full metadata record

dc.contributor.author: Cheong, Howon
dc.contributor.author: Kim, Euntai
dc.contributor.author: Park, Sung-Kee
dc.date.accessioned: 2024-01-19T18:34:31Z
dc.date.available: 2024-01-19T18:34:31Z
dc.date.created: 2021-09-05
dc.date.issued: 2019-11-15
dc.identifier.issn: 1530-437X
dc.identifier.uri: https://pubs.kist.re.kr/handle/201004/119318
dc.description.abstract: This paper proposes a new landmark descriptor for indoor mobile robot navigation with sensor fusion, together with a global localization method that uses it. Previous research on robot pose estimation has utilized various landmarks such as geometric features, visual local-invariant features, or objects. In real-world indoor environments, however, distinctive landmarks may be scarce, or many similar landmarks may be repeated, which makes accurate pose estimation difficult. In this work, we suggest a new landmark descriptor, called the depth-guided photometric edge descriptor (DPED), which is composed of superpixels and approximated 3D depth information of photometric vertical edges. With this descriptor, we propose a global localization method based on a coarse-to-fine strategy. In the coarse step, candidate nodes are found by place recognition using our pairwise constraint-based spectral matching technique; in the fine step, the robot pose is estimated with probabilistic scan matching. Experimental results show that our method successfully estimates the robot pose in real-world tests even when distinctive features and objects are lacking. (See the spectral-matching sketch after this record.)
dc.language: English
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.subject: VISION
dc.title: Indoor Global Localization Using Depth-Guided Photometric Edge Descriptor for Mobile Robot Navigation
dc.type: Article
dc.identifier.doi: 10.1109/JSEN.2019.2932131
dc.description.journalClass: 1
dc.identifier.bibliographicCitation: IEEE SENSORS JOURNAL, v.19, no.22, pp.10837 - 10847
dc.citation.title: IEEE SENSORS JOURNAL
dc.citation.volume: 19
dc.citation.number: 22
dc.citation.startPage: 10837
dc.citation.endPage: 10847
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.identifier.wosid: 000503399200078
dc.identifier.scopusid: 2-s2.0-85073880260
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory: Instruments & Instrumentation
dc.relation.journalWebOfScienceCategory: Physics, Applied
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Instruments & Instrumentation
dc.relation.journalResearchArea: Physics
dc.type.docType: Article
dc.subject.keywordPlus: VISION
dc.subject.keywordAuthor: Mobile robot
dc.subject.keywordAuthor: global localization
dc.subject.keywordAuthor: sensor fusion
dc.subject.keywordAuthor: spectral matching
Appears in Collections:
KIST Article > 2019
Files in This Item:
There are no files associated with this item.
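
The coarse step described in the abstract scores candidate place matches with pairwise constraint-based spectral matching. As an illustration only, the following is a minimal Python sketch of the generic spectral matching procedure (in the spirit of Leordeanu and Hebert's eigenvector method) that such a step typically builds on. The function name spectral_match, the (query_id, map_id) assignment encoding, and the affinity matrix M are assumptions made for this sketch, not the paper's formulation; the DPED-specific pairwise constraints are abstracted into M.

```python
import numpy as np

def spectral_match(assignments, M, min_score=1e-6):
    """Select a mutually consistent subset of candidate assignments.

    assignments: list of (query_id, map_id) candidate correspondences
                 (hypothetical encoding, for illustration).
    M: (n, n) symmetric affinity matrix; M[i, j] scores the pairwise
       consistency of assignments i and j, and M[i, i] the individual
       match quality.
    Returns the indices of the accepted assignments.
    """
    # The principal eigenvector of M approximates a soft indicator of
    # the largest mutually consistent cluster of assignments.
    vals, vecs = np.linalg.eigh(M)       # eigenvalues in ascending order
    x = np.abs(vecs[:, -1])              # eigenvector of the largest one

    accepted = []
    active = np.ones(len(assignments), dtype=bool)
    while active.any():
        # Greedily take the strongest still-active assignment.
        i = int(np.argmax(np.where(active, x, -np.inf)))
        if x[i] < min_score:
            break
        accepted.append(i)
        qi, mi = assignments[i]
        # Enforce one-to-one matching: drop assignments that reuse
        # either the query landmark or the map landmark.
        for j, (qj, mj) in enumerate(assignments):
            if qj == qi or mj == mi:
                active[j] = False
    return accepted

# Example: three candidates; the first two are mutually consistent,
# the third conflicts with the first (same query landmark).
assignments = [(0, 0), (1, 1), (0, 1)]
M = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 0.8]])
print(spectral_match(assignments, M))  # -> [0, 1]
```

In this formulation the eigenvector supplies a relaxed, continuous solution to the consistency problem, and the greedy loop discretizes it while enforcing the one-to-one constraint.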