
GenAIInfra

GenAIInfra is the containerization and cloud native suite for OPEA. It includes artifacts for deploying GenAIExamples in a cloud native way, which enterprise users can use to deploy to their own cloud.

Overview

The GenAIInfra repository is organized under the following main directories, which contain the artifacts for deploying OPEA:

Directory | Purpose
microservices-connector | GenAI Microservices Connector (GMC) supports launching, monitoring, and updating GenAI microservice chains, such as those in GenAIExamples, on Kubernetes. It provides a Kubernetes Custom Resource Definition for GenAI chains/pipelines that may be composed of sequential, conditional, and parallel steps.
helm-charts | Helm charts for deploying GenAIComponents on Kubernetes.
kubernetes-addons | Kubernetes add-ons to deploy for OPEA.
manifests | Manifests for deploying GenAIComponents on Kubernetes and on Docker Compose.
scripts | Scripts for testing, tools to facilitate OPEA deployment, and so on.

Prerequisite

GenAIInfra uses Kubernetes as the cloud native infrastructure. Please follow the steps below to prepare the Kubernetes environment.

Setup Kubernetes cluster

Please follow the Kubernetes official setup guide to set up Kubernetes. We recommend using Kubernetes version >= 1.27.

There are different methods to set up a Kubernetes production cluster, such as kubeadm, kubespray, and more.

NOTE: We recommend using containerd as the container runtime during Kubernetes setup. Docker Engine is also verified on Ubuntu 22.04 and above.
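
As a rough reference only, a kubeadm-based setup on the control-plane node might look like the sketch below; the pod network CIDR and the choice of CNI plugin are assumptions and should be adapted to your environment:

# Minimal kubeadm sketch (assumes containerd is already installed and configured).
sudo kubeadm init --pod-network-cidr=10.244.0.0/16   # example CIDR, adjust to your CNI
mkdir -p $HOME/.kube
sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
sudo chown $(id -u):$(id -g) $HOME/.kube/config
# Install a CNI plugin of your choice (e.g. Flannel, Calico) before scheduling workloads.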

(Optional) To run GenAIInfra on Intel Gaudi products:

The following steps are optional. They're only required if you want to run the workloads on Intel Gaudi products.

  1. Please check the support matrix to make sure that your environment meets the requirements.

  2. Install the Intel Gaudi software stack.

  3. Install and set up the container runtime, based on the container runtime used by Kubernetes.

NOTE: Please make sure you configure the appropriate Intel Gaudi container runtime for the container runtime you installed during Kubernetes setup.

  4. Install the Intel Gaudi device plugin for Kubernetes. A quick verification sketch follows this list.
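
As a rough check that Gaudi devices are visible to Kubernetes, you can inspect the node resources and the device plugin pods; the habana.ai/gaudi resource name and the grep patterns below are assumptions based on the Habana device plugin and may differ in your installation:

# Check whether nodes advertise Gaudi devices (resource name is an assumption).
kubectl describe nodes | grep -i gaudi
# List device plugin pods; the namespace depends on how the plugin was installed.
kubectl get pods -A | grep -i habana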

Usages

Use GenAI Microservices Connector (GMC) to deploy and adjust GenAIExamples

Follow the GMC README to install GMC into your Kubernetes cluster. GenAIExamples contains several sample GenAI example use case pipelines such as ChatQnA, DocSum, etc. Once you have deployed GMC in your Kubernetes cluster, you can deploy any of the example pipelines by following its README file (e.g. DocSum).
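
Before applying a pipeline custom resource, it can help to confirm that the GMC controller is running; the grep pattern below is an assumption, since the controller's namespace and pod names depend on how GMC was installed:

# Confirm the GMC controller pods are running (names and namespace may differ in your install).
kubectl get pods -A | grep -i gmc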

Use helm charts to deploy

To deploy GenAIExamples to Kubernetes using helm charts, you need Helm installed on your machine.

Clone the GenAIInfra repository and change into the helm-charts directory:

git clone https://github.com/opea-project/GenAIInfra.git
cd GenAIInfra/helm-charts

Select the example workflow you want to deploy, set any customized values in its values.yaml, and deploy the example (e.g. codegen) using helm:

helm install codegen ./codegen
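
Instead of editing values.yaml in place, you can also override settings at install time with standard helm flags; the key name below is a hypothetical example, so check the chart's values.yaml for the settings it actually exposes:

# Override an individual value on the command line (image.tag is a hypothetical key).
helm install codegen ./codegen --set image.tag=latest
# Or supply a custom values file.
helm install codegen ./codegen -f my-values.yaml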

Use manifests to deploy

GenAIInfra also supports deploying GenAIExamples using manifests. You need kubectl installed on your machine.

Clone the GenAIInfra repository and change into the manifests directory:

git clone https://github.com/opea-project/GenAIInfra.git
cd GenAIInfra/manifests

Select the example workflow you want to deploy and deploy the example (e.g. DocSum) using kubectl:

kubectl apply -f ./DocSum/manifests/
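
After applying the manifests, you can watch the rollout until the pods are ready; the label selector below is an assumption, since the actual labels and namespace depend on the example's manifests:

# Watch the pods come up (add -n <namespace> if the manifests use a dedicated namespace).
kubectl get pods -w
# Optionally wait for readiness; the app=docsum label is a hypothetical example.
kubectl wait --for=condition=Ready pod -l app=docsum --timeout=300s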

