Human-Like Motion Planning of Collaborative Robots based on Human Arm Motion Analysis

Grants and Contracts Details

Description

Human-Like Motion Planning of Collaborative Robots based on Human Arm Motion Analysis
PI: Biyun Xie, Assistant Professor, Electrical and Computer Engineering Department, University of Kentucky
SCEEE Development Fund Grants
Proposed Funds: SCEEE: $27,983; Cost Share: $27,983; Total: $55,966
Project Duration: 12 months

Abstract

Collaborative robots, or cobots, are robots that are intelligent and safe enough to work with humans in a shared workspace, or even to collaborate with humans to fulfill a shared task. In human-robot interaction, efficiency and safety are the two most significant factors to consider. The overall objective of this project is to generate human-like motions for collaborative robots to improve efficiency and safety in human-robot interaction. First, human-like motions of collaborative robots help humans understand and predict the robots' intentions and motions. Based on such information, humans can prepare for the next task in advance to improve efficiency, and can also take emergency response actions in dangerous situations to protect both human and robot safety. Furthermore, in daily human-human interaction, we usually have strong expectations of other people's behavior and motion, and unexpected or unnatural motion causes anxiety. Therefore, in human-robot interaction, human-like motions of collaborative robots give humans a strong sense of safety, which makes the interaction more human-friendly.

The human arm can be modeled as a kinematic chain with seven degrees of freedom (DOF): three rotational joints at the shoulder, one rotational joint at the elbow, and three rotational joints at the wrist. Human arm motions can be roughly divided into two categories: reaching movements and grasping movements. In reaching movements, such as pushing a button, there is no specific requirement on the hand orientation, so the motion of the wrist joints can be ignored and the human arm can be simplified into a kinematic chain with only four DOFs. In grasping movements, such as rotating a switch, the shoulder, elbow, and wrist joints are all involved in reaching a desired position and orientation. Therefore, in both reaching and grasping movements, the human arm has one degree of redundancy (the difference between the number of DOFs and the workspace dimension), which can be physically interpreted as there being infinitely many arm postures for a single target location.

The specific scientific objectives of this research are: (1) analyze natural human arm postures in reaching movements, and develop a method to predict the corresponding natural human arm postures; (2) analyze natural human arm postures in grasping movements, and develop a method to predict the corresponding natural human arm postures; (3) based on these synthesized human arm postures, develop an inverse kinematics algorithm to generate human-like motions for collaborative robots in different tasks.
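The one degree of redundancy described above means the task constraints do not determine the joint angles uniquely, so the extra freedom can be used to bias the arm toward a preferred posture. Below is a minimal illustrative sketch of that idea for a planar 3-DOF arm reaching a 2-D target (one redundant DOF, analogous to the 4-DOF reaching case): a resolved-rate inverse kinematics loop that uses the Jacobian pseudoinverse for the task and a null-space term pulling the joints toward an assumed "natural" rest posture. The link lengths, rest posture, and gains are illustrative assumptions, not values or methods from the proposal.

    # Sketch of redundancy resolution for a planar 3-DOF arm reaching a 2-D
    # target. All numeric values are illustrative assumptions.
    import numpy as np

    LINKS = np.array([0.3, 0.25, 0.15])   # link lengths [m] (assumed)

    def forward_kinematics(q):
        """End-effector (x, y) of the planar chain for joint angles q."""
        angles = np.cumsum(q)
        return np.array([np.sum(LINKS * np.cos(angles)),
                         np.sum(LINKS * np.sin(angles))])

    def jacobian(q):
        """2x3 position Jacobian of the planar chain."""
        angles = np.cumsum(q)
        J = np.zeros((2, len(q)))
        for i in range(len(q)):
            # joint i moves all links from i onward
            J[0, i] = -np.sum(LINKS[i:] * np.sin(angles[i:]))
            J[1, i] =  np.sum(LINKS[i:] * np.cos(angles[i:]))
        return J

    def ik_step(q, target, q_rest, k_task=1.0, k_posture=0.5):
        """One resolved-rate step: pseudoinverse for the task-space error,
        plus a null-space term biasing the arm toward a preferred posture."""
        J = jacobian(q)
        J_pinv = np.linalg.pinv(J)
        err = target - forward_kinematics(q)
        null_proj = np.eye(len(q)) - J_pinv @ J      # null-space projector
        dq = k_task * J_pinv @ err + k_posture * null_proj @ (q_rest - q)
        return q + dq

    if __name__ == "__main__":
        q = np.array([0.1, 0.2, 0.1])        # initial joint angles [rad]
        q_rest = np.array([0.4, 0.8, 0.3])   # assumed "natural" posture bias
        target = np.array([0.45, 0.30])      # reachable target position [m]
        for _ in range(200):
            q = ik_step(q, target, q_rest)
        print("final posture:", np.round(q, 3))
        print("position error:", np.round(target - forward_kinematics(q), 4))

Because the null-space term only acts in directions that do not disturb the end-effector position, the target is still reached while the remaining freedom selects one posture out of the infinitely many possible ones; the project's objectives (1)-(3) concern learning what that preferred posture should be from human arm motion data.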
Status: Finished
Effective start/end date: 7/1/20 – 6/30/21

Funding

  • Southeastern Center for Electrical Engineering Education: $32,500.00
