This repository contains the code for the phase-based motion in-betweening technology. Feel free to ask any questions. If you find any issues, please send me a message: [email protected].
- Clone this repository.
- Download the learned phases and processed assets of the LaFAN1 motion capture dataset.
- Extract `MotionCapture.zip` to the `Assets/Demo/Authoring` folder.
- Download the trained models.
- Extract `Model.zip` to the `Assets/Demo/Authoring` folder.
We provide two demo scenes in `Assets/Demo/Authoring`. Open them in Unity and hit Play - the system should run automatically.
If not, ensure that Unity version 2020.3.18f1 and the Barracuda 2.0.0 package (required for ONNX inference) are installed.
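If Barracuda is missing, it can be installed through the Unity Package Manager; equivalently, the dependency appears in `Packages/manifest.json` roughly as follows (a sketch - the exact version string in your project may differ):

```json
{
  "dependencies": {
    "com.unity.barracuda": "2.0.0"
  }
}
```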
Two runtime controllers are implemented for the task. `InBetweeningController.cs` samples the control parameters from a processed animation clip asset in Unity. `AuthoringInBetweeningController.cs` uses the control parameters from the linked Authoring tool.
For both controllers, select whether the system should run with no phases, local motion phases, or learned (deep) phases by adjusting the corresponding parameters in the Inspector UI. The trained models for each option are available in `Demo/Authoring/Model`.
Visualization options can be turned on/off in the Inspector.
To create sparse keyframes for the character, add the `Authoring.cs` script to any GameObject in your scene.

- Add/insert/delete control points: `<Shift> + Left Mouse Click` in the Scene View
- Select mode: `Left Mouse Click` on a control point in the Scene View; the properties of this control point show up in the Inspector
- Unselect: `<Esc>` in the Scene View
- Undo: `<Ctrl> + Z`
- Redo: `<Ctrl> + Y`

Drag and drop existing control points in the Scene View to change their position. Translate or rotate the GameObject to move the whole spline path correspondingly.
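The control points define a smooth spline path through the scene. The repository's exact spline implementation is not shown here, but the general idea of interpolating a path through control points can be sketched with a generic uniform Catmull-Rom segment (illustrative only - the function name and the choice of spline type are my assumptions, not the repo's code):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a uniform Catmull-Rom spline segment between p1 and p2, t in [0, 1].

    The neighboring control points p0 and p3 shape the tangents, so the
    path passes smoothly through every control point.
    """
    return tuple(
        0.5 * (2 * b
               + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t ** 2
               + (-a + 3 * b - 3 * c + d) * t ** 3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

# Four control points on the ground plane (x, z); the segment runs from p1 to p2.
p0, p1, p2, p3 = (0.0, 0.0), (1.0, 0.0), (2.0, 1.0), (3.0, 3.0)
midpoint = catmull_rom(p0, p1, p2, p3, 0.5)
```

At `t = 0` the segment returns `p1` and at `t = 1` it returns `p2`, which is why dragging a single control point only reshapes the path locally.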
Each control point must have a target pose. To load poses of the character from the motion capture data, import the processed assets in the `<Motion Import Options>` menu in the Inspector.
To change the target pose of a control point, press `<Sync Assets>` in the `<Betweening Module>` Inspector menu and select a desired frame. Rotate the target pose by dragging the circle near the control point with your mouse.
Once the Authoring is set up, make sure to link it to the `AuthoringInBetweeningController.cs` script of the character.
The complete code that was used for processing, training, and generating the in-between movements is provided in this repository. To reproduce the model, complete the following steps:

- Open `Assets/Demo/Authoring/MotionCapture/Mocap_LaFan.unity`.
- Click on the MotionEditor game object in the scene hierarchy window.
- Open the Motion Exporter (`Header -> AI4Animation -> Exporter -> Motion Exporter`). Set "Frame Shifts" to 0 and "Frame Buffer" to 30, set "Phases" to "Deep Phases", and check the "Write Mirror" box.
- Click the "Export" button, which will generate the training data and save it.
- Navigate to `DeepLearningONNX/Models/GNN`.
- Run `InBetweeningNetwork.py`, which will start the training.
- Wait for a few hours.
- You will find the trained .onnx model in the training folder.
- Import the model into Unity and link it to the controller.
- Hit Play.
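For intuition about what the network is trained to do: with a "Frame Buffer" of 30 frames at a 30 Hz framerate, each sample spans roughly one second of motion to be filled in between keyframes. The trivial baseline that the learned model improves upon is plain linear interpolation between the two keyframe poses. A minimal stdlib sketch of that baseline (not the repository's method, which generates the in-between poses with the trained network):

```python
def lerp_pose(pose_a, pose_b, t):
    """Linearly interpolate two poses stored as flat lists of joint values."""
    return [a + (b - a) * t for a, b in zip(pose_a, pose_b)]

def naive_in_between(pose_a, pose_b, frame_buffer=30):
    """Fill frame_buffer frames between two keyframes (one second at 30 Hz)."""
    return [lerp_pose(pose_a, pose_b, i / (frame_buffer + 1))
            for i in range(1, frame_buffer + 1)]

# Two toy "poses" (e.g. root x, root z, hip height); real poses have many more values.
frames = naive_in_between([0.0, 0.0, 0.9], [1.5, 0.0, 0.9])
print(len(frames))  # -> 30
```

Linear blending like this slides the character between keyframes without any footstep or body dynamics, which is exactly the artifact the phase-conditioned network is designed to avoid.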
If you decide to start from the raw motion capture data and not use the already processed assets in Unity, you will need to download the LaFAN1 dataset and complete the following steps:

- Open `MocapExample.unity` in the Demo folder.
- Import the motion data into Unity by opening the BVH Importer (`Header -> AI4Animation -> Importer -> BVH Importer`). Define the path where the original .bvh data is saved on your hard disk, and where the Unity assets should be saved inside the project.
- Set Scale to 0.01, check `Flip` on the `X Positive` axis, and press "Load Directory" and "Import Motion Data". Wait until the motion data is imported.
- In the scene `MocapExample.unity`, go to the MotionEditor component.
- Input the path where the imported motion data assets have been saved and click "Import".
- In the "Editor Settings" at the bottom, make sure that "Target Framerate" is set to 30Hz and "Character" is linked with the LaFAN prefab's `Actor.cs` component.
- Open the MotionProcessor window (`Header -> AI4Animation -> Tools -> MotionProcessor`), make sure that "LaFAN Pipeline" is selected, and click "Process".
- Wait for a few hours.

At this point, the raw motion capture data has been successfully processed and is at the same stage as the motion assets provided in this repository. You are ready to continue with the steps above to export the data, train the network, and control the character movements.
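The importer settings above (Scale = 0.01, Flip on the X positive axis) amount to converting centimeter-scale BVH coordinates into Unity's meter-scale, left-handed coordinate system. A rough stdlib illustration of what that per-sample transform does (my reading of the settings, not the importer's actual code):

```python
def bvh_to_unity(position, scale=0.01, flip_x=True):
    """Map an (x, y, z) BVH position in centimeters to Unity meters."""
    x, y, z = position
    if flip_x:
        x = -x  # mirror across the YZ plane to fix coordinate handedness
    return (x * scale, y * scale, z * scale)

print(bvh_to_unity((100.0, 175.0, -50.0)))  # -> (-1.0, 1.75, -0.5)
```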
The code to train the Periodic Autoencoder and extract the phase parameters for the motion capture data is available here.
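For context, the Periodic Autoencoder describes each latent channel of motion by a phase, frequency, and amplitude obtained from a Fourier transform over a sliding temporal window. A heavily simplified stdlib-only sketch of that idea on a single channel window (a naive DFT for illustration; the actual, differentiable implementation lives in the linked code):

```python
import cmath
import math

def phase_params(signal, fps=30.0):
    """Extract dominant frequency, amplitude, and phase from one channel window."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]
    # Naive O(n^2) discrete Fourier transform over the positive bins.
    spectrum = [sum(centered[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
                for k in range(1, n // 2)]
    k = max(range(len(spectrum)), key=lambda i: abs(spectrum[i])) + 1  # dominant bin
    coeff = spectrum[k - 1]
    amplitude = 2 * abs(coeff) / n
    frequency = k * fps / n                      # in Hz
    phase = math.atan2(coeff.imag, coeff.real)   # in radians
    return frequency, amplitude, phase

# A pure 2 Hz cosine sampled for one second at 30 fps:
window = [math.cos(2 * math.pi * 2 * t / 30) for t in range(30)]
f, a, p = phase_params(window)  # f ~ 2.0 Hz, a ~ 1.0, p ~ 0.0
```

On real mocap latents the window contains a mixture of frequencies, and the phase of the dominant component is what drives the "Deep Phases" option used by the exporter above.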