Co-Speech_GESTURES_Dual_ARM_Robot

IMITATION LEARNING OF HAND GESTURES FOR A DUAL ARM ROBOT MANIPULATOR

The primary objective of this project is to feed custom speech/text into a deep learning-based gesture generation model and obtain the joint angles that let a dual arm robot perform the corresponding gestures. To accomplish this, the following steps are carried out:

- Generate joint coordinates with the deep learning model by feeding it customised speech/text as input, then convert those joint coordinates of the simulated skeleton in 3D space into joint angles (a rough conversion sketch follows this list).
- Evaluate and map the relevant joints of the dual arm robot for performing the gestures, and define each joint angle with respect to the joint angle range of the corresponding robot joint.

By accomplishing these steps, the project aims to make a dual arm robot perform the relevant gestures for any customised speech/text input.
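As a rough illustration of the coordinate-to-angle step, the sketch below converts one arm's 3D skeleton keypoints (shoulder, elbow, wrist) into two simple joint angles and clips them to the robot's joint ranges. The keypoint layout, joint names, axis convention, and limit values here are assumptions for illustration only; they do not reflect the actual model output or the specific dual arm robot used in this project.

```python
import numpy as np

# Hypothetical per-joint limits for the dual arm robot (radians);
# real values must come from the robot's specification.
JOINT_LIMITS = {
    "shoulder_pitch": (-1.57, 1.57),
    "elbow_flex": (0.0, 2.36),
}

def angle_between(v1, v2):
    """Angle in radians between two 3D vectors."""
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def skeleton_to_joint_angles(shoulder, elbow, wrist):
    """Convert one arm's 3D keypoints into simple joint angles."""
    upper_arm = elbow - shoulder
    forearm = wrist - elbow
    # Elbow flexion: angle between the upper arm and forearm segments.
    elbow_flex = angle_between(upper_arm, forearm)
    # Shoulder pitch relative to the downward vertical (assumes -Z is "down").
    shoulder_pitch = angle_between(upper_arm, np.array([0.0, 0.0, -1.0]))
    return {"shoulder_pitch": shoulder_pitch, "elbow_flex": elbow_flex}

def clamp_to_robot_limits(angles):
    """Clip each angle to the robot's joint range before sending commands."""
    return {name: float(np.clip(a, *JOINT_LIMITS[name]))
            for name, a in angles.items()}

if __name__ == "__main__":
    # One frame of (x, y, z) keypoints as the gesture model might produce them.
    shoulder = np.array([0.0, 0.0, 1.4])
    elbow = np.array([0.1, 0.0, 1.1])
    wrist = np.array([0.3, 0.1, 1.0])
    print(clamp_to_robot_limits(skeleton_to_joint_angles(shoulder, elbow, wrist)))
```

In practice this conversion would run per frame of the generated skeleton sequence, and the resulting angle trajectories would be retargeted onto the robot's own joint naming and limits before execution.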
