
LLM module

Greta can be used in conjunction with an LLM to create a full system capable of sustaining a dialogue.

Integrated LLMs

Currently, only Mistral and Mistral incremental are implemented.

Integrating a new LLM in Greta

To create a new LLM module:

  • Create a class in the auxiliary project LLM that extends LLMFrame (see the sketch after this list)
  • Create a new folder Common/Data/LLM/{Yourmodelname}, copying the files from Common/Data/LLM/Mistral
  • Modify the copy of Mistral.py with the code needed to access your LLM model; the final answer must be printed by your Python module
  • Modify requirements.txt with your requirements
  • Modify check_env.py and init_env.bat with a new conda environment name
  • In the Java class, initialize the strings LLM_python_env_checker_path, LLM_python_env_installer_path, and python_path_llm with the appropriate relative paths
  • Add your new module in Modular.xml
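
The following is a minimal sketch of such a module class. It assumes that LLMFrame provides a no-argument constructor and that the three path strings listed above are fields accessible from the subclass; the package name, class name, and the MyModel folder name are illustrative only and should be adapted to your model.

```java
package greta.auxiliary.llm; // illustrative package; place the class next to the existing LLM modules

/**
 * Sketch of a new LLM module, assuming LLMFrame exposes the three path strings
 * below as inherited fields; the exact base-class API may differ.
 */
public class MyModelFrame extends LLMFrame {

    public MyModelFrame() {
        super();
        // Relative paths to the folder copied from Common/Data/LLM/Mistral and
        // renamed to MyModel (hypothetical name).
        LLM_python_env_checker_path = "Common/Data/LLM/MyModel/check_env.py";
        LLM_python_env_installer_path = "Common/Data/LLM/MyModel/init_env.bat";
        python_path_llm = "Common/Data/LLM/MyModel/MyModel.py";
    }
}
```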
