Gesture
Gesture recognition is done by looking up coordinates of different detected landmarks.
The entire implementation is in src/gesture.ts and can be further extended with additional rules.
There are four pre-defined methods:
- face(): implements:
  - "facing <left|center|right>": if depth of left and right face border matches
  - "blink <left|right> eye"
  - "mouth <percentage> open"
  - "head <up|down>"
- iris(): implements:
  - "facing center": if iris area sizes match
  - "looking <left|right>": if iris center is far from eye outside corner
  - "looking <up|down>": if iris center is far from eye bottom eyelid
  - "looking center": if neither up/down nor left/right
- body(): implements:
  - "leaning <left|right>"
  - "raise <left|right> hand"
  - "i give up"
- hand(): implements:
  - "<finger> forward <finger> up"
Example output of result.gesture:
```js
gesture = [
  { face: '0', gesture: 'facing camera' },
  { face: '0', gesture: 'head up' },
  { iris: '0', gesture: 'looking center' },
  { body: '0', gesture: 'i give up' },
  { body: '0', gesture: 'leaning left' },
  { hand: '0', gesture: 'thumb forward middlefinger up' },
];
```
The number after each gesture type refers to the person that the detection belongs to in scenes with multiple people.
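In such scenes the gestures can be grouped per person using that index. A small sketch with a hypothetical helper (the String() comparison covers both the numeric and the quoted form shown above; hand entries are indexed per detected hand rather than per person):

```ts
type GestureEntry = { gesture: string } & Partial<Record<'face' | 'iris' | 'body' | 'hand', number | string>>;

// Hypothetical helper: collect all gesture strings reported for one person index
function gesturesForPerson(gestures: GestureEntry[], person: number): string[] {
  return gestures
    .filter((g) => [g.face, g.iris, g.body].some((idx) => idx !== undefined && String(idx) === String(person)))
    .map((g) => g.gesture);
}

// e.g. gesturesForPerson(result.gesture, 0) -> ['facing camera', 'head up', 'looking center', 'i give up', 'leaning left']
```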
If you wish to modify or add new gesture rules, modify src/gesture.ts as needed and rebuild Human using npm run build.
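A new rule follows the same pattern as the pre-defined ones: compare coordinates of the relevant landmarks and emit a descriptive string when a condition holds. The sketch below is only illustrative; the landmark indices, threshold, and shapes used here are assumptions, not the library's actual code in src/gesture.ts:

```ts
type Point = [number, number, number];

// Illustrative coordinate-comparison rule in the style described above.
// `mesh` stands for an array of face landmark points ([x, y, z]); the indices below are placeholders.
function customFaceGestures(mesh: Point[]): string[] {
  const gestures: string[] = [];
  const leftCheek = mesh[50];   // placeholder landmark index
  const rightCheek = mesh[280]; // placeholder landmark index
  if (leftCheek && rightCheek) {
    const tilt = leftCheek[1] - rightCheek[1]; // vertical difference between the two landmarks
    if (Math.abs(tilt) > 10) gestures.push(`head tilted ${tilt > 0 ? 'right' : 'left'}`); // assumed pixel threshold
  }
  return gestures;
}
```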