LLM module
Lucie Galland edited this page Aug 2, 2024
Greta can be used in conjunction with an LLM to create a full system able to sustain a dialogue.
Currently, only Mistral and Mistral incremental are implemented.
To create a new LLM module:
- Create a class in the auxiliary project LLM that extends LLMFrame
- Create a new folder in Common/Data/LLM/{Yourmodelname}, copying the files from Common/Data/LLM/Mistral
- Modify the copy of Mistral.py with the code to access your LLM model; the final answer should be printed by your Python module
- Modify requirements.txt with your requirements
- Modify check_env.py and init_env.bat with a new conda environment name
- In the Java class, initialize the strings LLM_python_env_checker_path, LLM_python_env_installer_path, and python_path_llm with the appropriate relative paths
- Add your new module in Modular.xml
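As an illustration of the Mistral.py step, the adapted script might look like the sketch below. The `query_model` helper and the command-line prompt convention are hypothetical placeholders, not part of Greta; the only requirement stated above is that the final answer is printed by the Python module (presumably so the Java side can read it from the process output).

```python
import sys


def query_model(prompt: str) -> str:
    """Hypothetical placeholder: replace the body with the actual call to
    your LLM (a local model, an HTTP endpoint, a client library, etc.)."""
    return f"Echo: {prompt}"


def main() -> None:
    # One possible convention: take the prompt from the command line.
    prompt = " ".join(sys.argv[1:])
    answer = query_model(prompt)
    # The module contract requires the final answer to be printed.
    print(answer)


if __name__ == "__main__":
    main()
```

Keep the script's input/output behavior consistent with the original Mistral.py you copied, since the Java `LLMFrame` subclass drives it through the conda environment configured in check_env.py and init_env.bat.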