Motion Capture to BAP values
The virtual agents can reproduce data recorded with a motion capture system. The BVHFileReader module (Add/Inputs/) has to be connected to the MPEG4Animatable module. The main function of this module is to convert the stream of BVH data into BAP data, since the body motion of the Greta agents is animated through the BAP representation.
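At its core, this conversion maps each BVH joint rotation to the corresponding BAP and changes units. The snippet below is only an illustrative sketch, not the Greta code; it assumes the MPEG-4 FBA convention in which BAP rotation values are expressed in units of 1e-5 radians, while BVH channels store Euler angles in degrees.

```java
// Illustrative sketch only (not the BVHFileReader implementation):
// convert one BVH rotation channel value (degrees) into an MPEG-4 BAP
// rotation value, assuming BAP units of 1e-5 radians.
public final class BvhToBapSketch {

    private static final double BAP_UNITS_PER_RADIAN = 1e5;

    public static int degreesToBap(double degrees) {
        return (int) Math.round(Math.toRadians(degrees) * BAP_UNITS_PER_RADIAN);
    }

    public static void main(String[] args) {
        // Example: a 45° hip flexion read from a BVH "Xrotation" channel.
        System.out.println(degreesToBap(45.0)); // prints 78540
    }
}
```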
Examples of BVH files are stored in the folder <GRETA_DIR>/bin/Examples/BvhMocap/
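For reference, a BVH file contains a HIERARCHY section that describes the skeleton and a MOTION section with one line of channel values per frame. A minimal, generic excerpt (not copied from the shipped examples) looks like this:

```
HIERARCHY
ROOT Hips
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Spine
    {
        OFFSET 0.0 10.0 0.0
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.0 10.0 0.0
        }
    }
}
MOTION
Frames: 2
Frame Time: 0.0333333
0.0 90.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 90.2 0.0 0.0 5.0 0.0 0.0 0.0 0.0
```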
A BVH file can place the agent's root anywhere in space, so when you run different BVH files the agent may shift or even disappear from the current camera view. To overcome this problem, the BVHFileReader interface provides three checkboxes (a code sketch of the offset-removal idea follows the list). For example, https://github.com/gretaproject/greta/blob/master/bin/Examples/BvhMocap/c-ladder.bvh and https://github.com/gretaproject/greta/blob/master/bin/Examples/BvhMocap/Dance.bvh run with none of the checkboxes enabled: https://youtu.be/Vxqv0W5gKTg
- "Delete initial offset of Root Position" --> the agent root (hip) will move around the initial position (example: https://youtu.be/avfjlQVA94E);
- "Delete initial offset of Root Orientation" --> the agent will not rotate the root around the y-axis;
- "No shift in Root Position" --> the agent root will not undergo any shift along the three axes (example: https://youtu.be/W3-_oVEkmiY).
Advanced
- Generating New Facial expressions
- Generating New Gestures
- Generating new Hand configurations
- Torso Editor Interface
- Creating an Instance for Interaction
- Create a new virtual character
- Creating a Greta Module in Java
- Modular Application
- Basic Configuration
- Signal
- Feedbacks
- From text to FML
- Expressivity Parameters
- Text-to-speech, TTS
AUs from external sources
Large language model (LLM)
Automatic speech recognition (ASR)
Extensions
Integration examples