From 377dd2fa9eac012b6927abee3ef5f6339549a4eb Mon Sep 17 00:00:00 2001
From: ctao456
Date: Sun, 7 Jul 2024 23:47:03 -0700
Subject: [PATCH] [CodeGen] Add codegen flowchart (#369)

* Add codegen flowchart

Signed-off-by: Chun Tao

* update flowchart to markdown format

Signed-off-by: Chun Tao

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update markdown diagram

Signed-off-by: Chun Tao

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update

Signed-off-by: Chun Tao

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* delete last line

Signed-off-by: Chun Tao

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Add flowchart for CodeGen, update readme

Signed-off-by: Chun Tao

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* updates

Signed-off-by: Chun Tao

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Signed-off-by: Chun Tao
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
---
 CodeGen/docker/gaudi/README.md | 24 ++++++++++++++++++++++++
 CodeGen/docker/xeon/README.md  | 24 ++++++++++++++++++++++++
 2 files changed, 48 insertions(+)

diff --git a/CodeGen/docker/gaudi/README.md b/CodeGen/docker/gaudi/README.md
index 4a2030d56..05f0d2056 100644
--- a/CodeGen/docker/gaudi/README.md
+++ b/CodeGen/docker/gaudi/README.md
@@ -46,6 +46,30 @@ Then run the command `docker images`, you will have the following 3 Docker image
 
 ## 🚀 Start MicroServices and MegaService
 
+The CodeGen megaservice manages a single microservice, LLM, within a Directed Acyclic Graph (DAG). In the diagram below, the LLM microservice is a language-model service that generates code snippets from the user's input query. The TGI service provides a RESTful text-generation API that backs the LLM microservice. The CodeGen Gateway is the entry point of the CodeGen application: it invokes the megaservice to generate code snippets in response to the user's query.
+
+The mega flow of the CodeGen application, from the user's input query to the application's output response, is as follows:
+
+```mermaid
+flowchart LR
+    subgraph CodeGen
+        direction LR
+        A[User] --> |Input query| B[CodeGen Gateway]
+        B --> |Invoke| Megaservice
+        subgraph Megaservice["Megaservice"]
+            direction TB
+            C((LLM<br>9000)) -. Post .-> D{{TGI Service<br>8028}}
+        end
+        Megaservice --> |Output| E[Response]
+    end
+
+    subgraph Legend
+        direction LR
+        G([Microservice]) ==> H([Microservice])
+        I([Microservice]) -.-> J{{Server API}}
+    end
+```
+
 ### Setup Environment Variables
 
 Since the `docker_compose.yaml` will consume some environment variables, you need to set them up in advance as below.
diff --git a/CodeGen/docker/xeon/README.md b/CodeGen/docker/xeon/README.md
index 81961e47a..7ed57c671 100644
--- a/CodeGen/docker/xeon/README.md
+++ b/CodeGen/docker/xeon/README.md
@@ -54,6 +54,30 @@ Then run the command `docker images`, you will have the following 3 Docker Images
 
 ## 🚀 Start Microservices and MegaService
 
+The CodeGen megaservice manages a single microservice, LLM, within a Directed Acyclic Graph (DAG). In the diagram below, the LLM microservice is a language-model service that generates code snippets from the user's input query. The TGI service provides a RESTful text-generation API that backs the LLM microservice. The CodeGen Gateway is the entry point of the CodeGen application: it invokes the megaservice to generate code snippets in response to the user's query.
+
+The mega flow of the CodeGen application, from the user's input query to the application's output response, is as follows:
+
+```mermaid
+flowchart LR
+    subgraph CodeGen
+        direction LR
+        A[User] --> |Input query| B[CodeGen Gateway]
+        B --> |Invoke| Megaservice
+        subgraph Megaservice["Megaservice"]
+            direction TB
+            C((LLM<br>9000)) -. Post .-> D{{TGI Service<br>8028}}
+        end
+        Megaservice --> |Output| E[Response]
+    end
+
+    subgraph Legend
+        direction LR
+        G([Microservice]) ==> H([Microservice])
+        I([Microservice]) -.-> J{{Server API}}
+    end
+```
+
 ### Setup Environment Variables
 
 Since the `docker_compose.yaml` will consume some environment variables, you need to set them up in advance as below.
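Once either compose stack is up, the LLM-to-TGI edge in the flowchart can be sanity-checked by hand. The sketch below builds a request for TGI's `/generate` API on port 8028 (the port shown in the diagram) and prints the `curl` command instead of executing it, so it is safe to run offline; `host_ip` is assumed to be exported as in the "Setup Environment Variables" step, and the prompt text is an arbitrary example, not part of the patch:

```shell
# Sketch of a manual smoke test for the TGI endpoint in the flowchart (port 8028).
# host_ip is assumed to be set by the README's environment-variable step; the
# payload fields (inputs, parameters.max_new_tokens) follow TGI's /generate schema.
payload='{"inputs":"def quicksort(arr):","parameters":{"max_new_tokens":64}}'
cmd="curl http://${host_ip:-localhost}:8028/generate -X POST -d '${payload}' -H 'Content-Type: application/json'"
# Print the command for inspection; paste it into a terminal on the host to run it.
echo "$cmd"
```

A response from this endpoint confirms the server API side of the diagram; the LLM microservice on port 9000 wraps this endpoint and is exercised end-to-end through the CodeGen Gateway instead.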