Indoor Global Localization Using Depth-Guided Photometric Edge Descriptor for Mobile Robot Navigation

Authors
Cheong, Howon; Kim, Euntai; Park, Sung-Kee
Issue Date
2019-11-15
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Citation
IEEE SENSORS JOURNAL, v.19, no.22, pp. 10837-10847
Abstract
This paper proposes a new landmark descriptor for indoor mobile robot navigation with sensor fusion, together with a global localization method that uses it. In previous research on robot pose estimation, various landmarks such as geometric features, visual local-invariant features, or objects have been utilized. In real-world situations, however, distinctive landmarks may be insufficient, or many similar landmarks may be repeated throughout an indoor environment, which makes accurate pose estimation difficult. In this work, we suggest a new landmark descriptor, called the depth-guided photometric edge descriptor (DPED), which is composed of superpixels and approximated 3D depth information of photometric vertical edges. With this descriptor, we propose a global localization method based on a coarse-to-fine strategy. In the coarse step, candidate nodes are found by place recognition using our pairwise constraint-based spectral matching technique; in the fine step, the robot pose is estimated with probabilistic scan matching. Experimental results show that our method successfully estimates the robot pose in real-world tests, even when distinctive features and objects are lacking.
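
The coarse step described above rests on spectral matching over pairwise constraints between descriptors. Below is a minimal sketch of that general idea (principal-eigenvector matching on a pairwise affinity matrix, in the style of Leordeanu-Hebert spectral matching), not the authors' implementation: the descriptor contents, the pairwise_affinity function, and the min_score threshold are illustrative assumptions.

import numpy as np

def spectral_match(query_feats, node_feats, pairwise_affinity, min_score=0.1):
    """Select mutually consistent correspondences via the principal
    eigenvector of a pairwise affinity matrix (spectral matching sketch)."""
    # Enumerate all candidate correspondences (query index, node index).
    cands = [(i, j) for i in range(len(query_feats))
                    for j in range(len(node_feats))]
    n = len(cands)
    M = np.zeros((n, n))
    for a, (i, j) in enumerate(cands):
        for b, (k, l) in enumerate(cands):
            # Pairwise term: compatibility of two candidate correspondences.
            M[a, b] = pairwise_affinity(query_feats[i], node_feats[j],
                                        query_feats[k], node_feats[l])
    M = 0.5 * (M + M.T)  # enforce symmetry before eigendecomposition
    # The principal eigenvector assigns a confidence score to each candidate.
    w, v = np.linalg.eigh(M)
    x = np.abs(v[:, np.argmax(w)])
    # Greedy selection under one-to-one constraints, highest score first.
    matches, used_q, used_n = [], set(), set()
    for a in np.argsort(-x):
        i, j = cands[a]
        if x[a] < min_score:
            break
        if i not in used_q and j not in used_n:
            matches.append((i, j))
            used_q.add(i)
            used_n.add(j)
    return matches

In the paper's setting, query_feats would plausibly be DPEDs extracted from the current view and node_feats the descriptors stored at a map node; the strength of the selected correspondences would then rank candidate nodes before the probabilistic scan matching of the fine step.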
Keywords
VISION; Mobile robot; global localization; sensor fusion; spectral matching
ISSN
1530-437X
URI
https://pubs.kist.re.kr/handle/201004/119318
DOI
10.1109/JSEN.2019.2932131
Appears in Collections:
KIST Article > 2019
Files in This Item:
There are no files associated with this item.