A generative human-robot motion retargeting approach using a single RGBD sensor

Sen Wang, Xinxin Zuo, Runxiao Wang, Ruigang Yang

Research output: Contribution to journal, Article, peer-review

6 Scopus citations

Abstract

The goal of human-robot motion retargeting is to have a robot follow the movements performed by a human subject. In previous approaches, human poses are typically precomputed by a human pose tracking system, after which explicit joint mapping strategies are specified to transfer the estimated poses to a target robot. However, no generic mapping strategy exists for mapping human joints to robots with different kinds of configurations. In this paper, we present a novel motion retargeting approach that combines human pose estimation and motion retargeting in a unified generative framework without relying on any explicit mapping. First, a 3D parametric human-robot (HUMROB) model is proposed that has the same joint and stability configurations as the target robot, while its shape conforms to the source human subject. The robot's configuration, including its skeleton proportions, joint limits, and degrees of freedom (DoFs), is enforced in the HUMROB model and preserved during tracking. Using a single RGBD camera to monitor the human pose, we take the raw RGB and depth sequence as input. The HUMROB model is deformed to fit the input point cloud, from which the model's joint angles are calculated and applied to the target robot for retargeting. In this way, instead of being fitted individually for each joint, the robot's joint angles are fitted globally so that the surface of the deformed model is as consistent as possible with the input point cloud, and no explicit or predefined joint mapping strategies are needed. To demonstrate its effectiveness for human-robot motion retargeting, the approach is tested both in simulation and on real robots whose skeleton configurations and joint DoFs differ considerably from those of the source human subjects.
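The core idea in the abstract — fitting all joint angles globally, under the robot's own joint limits, so that the posed model best matches the observed point cloud — can be illustrated with a minimal sketch. Everything below is hypothetical and for illustration only: a toy 2-DoF planar arm stands in for the HUMROB model, the link endpoints stand in for its surface, and the function names (`forward_points`, `fit_joint_angles`) are invented, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the HUMROB model (hypothetical): a 2-DoF planar chain.
# The robot's skeleton proportions and joint limits are baked into the model,
# mirroring how the paper enforces the target robot's configuration.
LINK_LENGTHS = np.array([1.0, 0.8])           # skeleton proportions
JOINT_LIMITS = [(-np.pi / 2, np.pi / 2)] * 2  # joint limits, one per DoF

def forward_points(angles):
    """Surface sample points (here, just the two link endpoints)."""
    cum = np.cumsum(angles)  # cumulative joint angles along the chain
    link_vecs = np.stack(
        [LINK_LENGTHS * np.cos(cum), LINK_LENGTHS * np.sin(cum)], axis=1
    )
    return np.cumsum(link_vecs, axis=0)  # endpoint of each link

def fit_joint_angles(point_cloud, init=None):
    """Fit ALL joint angles jointly (not per joint) to match the cloud."""
    if init is None:
        init = np.zeros(len(LINK_LENGTHS))

    def cost(angles):
        # Sum of squared distances from each model point to its nearest
        # observed point -- a crude stand-in for the model-to-cloud term.
        d = forward_points(angles)[:, None, :] - point_cloud[None, :, :]
        return np.sum(np.min(np.sum(d ** 2, axis=2), axis=1))

    # Bounds keep the solution inside the robot's joint limits.
    return minimize(cost, init, bounds=JOINT_LIMITS).x
```

The design point this sketch mirrors is that the optimizer searches the robot's own joint space directly, so the recovered angles are valid for the robot by construction and no human-to-robot joint mapping table is ever specified.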

Original language: English
Article number: 8693808
Pages (from-to): 51499-51512
Number of pages: 14
Journal: IEEE Access
Volume: 7
DOIs
State: Published - 2019

Bibliographical note

Publisher Copyright:
© 2019 IEEE.

Funding

This work was supported in part by the USDA under Grant 2018-67021-27416, in part by the NSF under Grant IIP-1543172, in part by the NSFC under Grant 51475373 and Grant 61603302, and in part by the Key Industrial Innovation Chain of Shaanxi Province Industrial Area under Grant 2016KTZDGY06-01.

Funders and funder numbers:

• Key Industrial Innovation Chain of Shaanxi Province Industrial Area: 2016KTZDGY06-01
• National Science Foundation (NSF): IIP-1543172
• U.S. Department of Agriculture: 2018-67021-27416
• National Natural Science Foundation of China (NSFC): 51475373, 61603302

Keywords

• Motion retargeting
• RGBD sensor
• Human-robot interaction

ASJC Scopus subject areas

• General Computer Science
• General Materials Science
• General Engineering
