Diverse Humanoid Robot Pose Estimation from Images Using Only Sparse Datasets

Authors
Heo, Seokhyeon; Cho, Youngdae; Park, Jeongwoo; Cho, Seokhyun; Tsoy, Ziya; Lim, Hwasup; Cha, Youngwoon
Issue Date
2024-10
Publisher
MDPI
Citation
Applied Sciences-Basel, v.14, no.19
Abstract
We present a novel dataset for humanoid robot pose estimation from images, addressing the critical need for accurate pose estimation to enhance human-robot interaction in extended reality (XR) applications. Despite the importance of this task, large-scale pose datasets for diverse humanoid robots remain scarce. To overcome this limitation, we collected sparse pose datasets for commercially available humanoid robots and augmented them through various synthetic data generation techniques, including AI-assisted image synthesis, foreground removal, and 3D character simulations. Our dataset is the first to provide full-body pose annotations for a wide range of humanoid robots exhibiting diverse motions, including side and back movements, in real-world scenarios. Furthermore, we introduce a new benchmark method for real-time full-body 2D keypoint estimation from a single image. Extensive experiments demonstrate that pose estimation trained on our extended dataset achieves over a 33.9% improvement in accuracy compared to using only the sparse datasets. Additionally, our method runs in real time at 42 frames per second (FPS) and maintains consistent full-body pose estimation in side and back motions across 11 differently shaped humanoid robots, using approximately 350 training images per robot.
Keywords
MARKERS; computer vision; robotics; deep learning
URI
https://pubs.kist.re.kr/handle/201004/150881
DOI
10.3390/app14199042
Appears in Collections:
KIST Article > 2024

