A Generative Neural Network for Learning Coordinated Reach-Grasp Motions

Authors
Chong, Eunsuk; Park, Jinhyuk; Kim, Hyungmin; Park, Frank C.
Issue Date
2019-07
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Citation
IEEE ROBOTICS AND AUTOMATION LETTERS, v.4, no.3, pp.2769 - 2776
Abstract
A neural network for generating coordinated reach-grasp motions is proposed, based on a type of generative neural network called the conditional restricted Boltzmann machine (CRBM). Given demonstrations of humans reaching and grasping various target objects of different shapes and poses, a mixture-type CRBM model is first used to learn and cluster the reach-grasp motions into different movement types. A novel variant of CRBM, called CRBM-l, is then proposed, in which the CRBM network is augmented with a control variable that is adjustable for different target objects. A CRBM-l trained with the previously obtained movement-specific data is then used to generate real-time reach-grasp motions for new target objects, by appropriately adjusting the control variable. The generated reach-grasp motions are then fine-tuned, taking into account the contact states between the object and the hand/fingers. The versatility and efficiency of our reach-grasp motion generation method are validated through systematic experiments involving a diverse set of target objects.
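To make the generative mechanism concrete, the sketch below shows one way a conditional RBM can generate a motion frame from conditioning inputs (past frames plus a control vector for the target object). It is a minimal illustration only: the dimensions, the Gaussian-style mean-field visible update, the sample_frame helper, and the placeholder control vector are assumptions for exposition, not the authors' implementation.

    # Minimal CRBM generation sketch (illustrative assumptions, not the paper's code):
    # real-valued visible units updated by mean-field, binary hidden units,
    # and dynamic biases computed from the conditioning (context) vector.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class CRBM:
        def __init__(self, n_visible, n_hidden, n_context):
            # W couples visible and hidden units; A and B map the conditioning
            # input (motion history and/or a control variable) to dynamic biases.
            self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
            self.A = 0.01 * rng.standard_normal((n_context, n_visible))
            self.B = 0.01 * rng.standard_normal((n_context, n_hidden))
            self.a = np.zeros(n_visible)   # static visible bias
            self.b = np.zeros(n_hidden)    # static hidden bias

        def sample_frame(self, context, n_gibbs=10):
            """Generate one motion frame given the conditioning vector."""
            a_dyn = self.a + context @ self.A   # context-dependent visible bias
            b_dyn = self.b + context @ self.B   # context-dependent hidden bias
            v = a_dyn.copy()                    # start at the dynamic mean
            for _ in range(n_gibbs):
                h_prob = sigmoid(v @ self.W + b_dyn)
                h = (rng.random(h_prob.shape) < h_prob).astype(float)
                v = a_dyn + h @ self.W.T        # mean-field visible update
            return v

    # Usage: roll the model forward in time, feeding generated frames back into
    # the context along with a hypothetical control vector for the target object.
    model = CRBM(n_visible=20, n_hidden=64, n_context=20 * 3 + 4)
    history = [np.zeros(20)] * 3
    control = np.array([0.5, 0.1, 0.0, 1.0])    # placeholder object descriptor
    for t in range(5):
        ctx = np.concatenate(history + [control])
        frame = model.sample_frame(ctx)
        history = history[1:] + [frame]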
Keywords
Grasping; reaching; learning from demonstration; conditional restricted Boltzmann machine; neural network
ISSN
2377-3766
URI
https://pubs.kist.re.kr/handle/201004/119833
DOI
10.1109/LRA.2019.2917381
Appears in Collections:
KIST Article > 2019
Files in This Item:
There are no files associated with this item.
