Developing a Demonstration-Based Motion Planner for Space Telerobots

Grants and Contracts Details

Description

PI: Biyun Xie, Assistant Professor, Electrical and Computer Engineering Department, University of Kentucky
Program: NASA Kentucky EPSCoR Research Infrastructure Development Grants (RIDG)
Proposed Funds: $35,000
Project Duration: 12 months

Abstract

Space telerobots play a crucial role in space station assembly, on-orbit servicing, and space exploration missions. Space robot arms are commonly teleoperated by a human operator from a space shuttle or a ground control station using a joystick, slider, or other devices. Precisely teleoperating a space robot is challenging because of the virtual working environment, eye-hand coordination issues, user tension, and frustration. The scientific objective of this project is to develop a demonstration-based motion planner for space telerobots, in which the robot first learns human arm motion from demonstration and then adapts the learned motion to its specific working environments. Three research goals are proposed: (1) transforming human arm motion to telerobot motion in real time using vision-based and IMU-based motion capture systems to teach telerobots to perform various tasks; (2) developing a motion planning algorithm that modifies the learned human arm motion to avoid obstacles in the robot workspace; (3) building both virtual and physical testbeds using the Robot Operating System (ROS) and a Kinova Gen3 robot to test the developed algorithms. This project will enable telerobots to learn human motion skills for performing various tasks and give them the ability to adapt to new tasks and new working environments. The outcomes of this project will significantly improve the accuracy and efficiency of space robot arm teleoperation.
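The abstract does not specify how the learned motion will be adapted to obstacles, so as a minimal sketch of research goal (2), the following illustrative Python snippet shifts a demonstrated 2-D trajectory away from a workspace obstacle using a simple repulsive displacement. All function names, parameters, and values here are hypothetical and are not the project's actual method.

```python
import numpy as np

def adapt_demonstration(demo, obstacle, influence=0.3, gain=0.5):
    """Shift a demonstrated 2-D trajectory away from an obstacle.

    demo:      (N, 2) array of waypoints recorded from a human demonstration.
    obstacle:  (2,) obstacle position in the robot workspace.
    influence: radius within which the obstacle repels waypoints.
    gain:      strength of the repulsive displacement.
    """
    adapted = demo.copy()
    for i, p in enumerate(demo):
        diff = p - obstacle
        dist = np.linalg.norm(diff)
        if 1e-9 < dist < influence:
            # Push the waypoint radially away; the effect fades with distance.
            adapted[i] = p + gain * (influence - dist) * diff / dist
    return adapted

# A straight-line "demonstration" passing close to an obstacle.
demo = np.stack([np.linspace(-1.0, 1.0, 50), np.zeros(50)], axis=1)
obstacle = np.array([0.0, 0.05])
adapted = adapt_demonstration(demo, obstacle)

# Adapted waypoints are never closer to the obstacle than the originals.
d_before = np.linalg.norm(demo - obstacle, axis=1)
d_after = np.linalg.norm(adapted - obstacle, axis=1)
```

In a real system this adaptation would act on the full learned arm motion rather than a planar path, but the sketch captures the core idea of locally deforming a demonstration to clear obstacles while preserving its overall shape.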
Status: Active
Effective start/end date: 4/1/23 to 3/31/24

Funding

  • National Aeronautics and Space Administration
