diff --git a/CodeGen/docker/gaudi/README.md b/CodeGen/docker/gaudi/README.md
index 4a2030d561..05f0d2056b 100644
--- a/CodeGen/docker/gaudi/README.md
+++ b/CodeGen/docker/gaudi/README.md
@@ -46,6 +46,30 @@ Then run the command `docker images`, you will have the following 3 Docker image
## 🚀 Start MicroServices and MegaService
+The CodeGen megaservice manages a single microservice, the LLM microservice, within a Directed Acyclic Graph (DAG). In the diagram below, the LLM microservice is a language-model microservice that generates code snippets from the user's input query. The TGI service provides the text-generation backend, exposing a RESTful API that the LLM microservice calls. The CodeGen Gateway is the entry point of the CodeGen application; it invokes the megaservice to generate code snippets in response to the user's query.
+
+The mega flow of the CodeGen application, from the user's input query to the application's output response, is as follows:
+
+```mermaid
+flowchart LR
+ subgraph CodeGen
+ direction LR
+ A[User] --> |Input query| B[CodeGen Gateway]
+ B --> |Invoke| Megaservice
+ subgraph Megaservice["Megaservice"]
+ direction TB
+            C((LLM<br>9000)) -. Post .-> D{{TGI Service<br>8028}}
+ end
+ Megaservice --> |Output| E[Response]
+ end
+
+ subgraph Legend
+ direction LR
+ G([Microservice]) ==> H([Microservice])
+ I([Microservice]) -.-> J{{Server API}}
+ end
+```
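+
+Once the services in the diagram are running, the mega flow can be exercised from the command line. The sketch below only assembles and prints a request payload; the commented `curl` line shows how it would be posted, assuming `host_ip` is exported as in the next section and the `/v1/chat/completions` route on the LLM microservice port (9000) shown in the diagram — see the curl examples later in this guide for the authoritative routes and payloads:
+
+```shell
+# Build a request body like the one the gateway forwards through the megaservice.
+# NOTE: the route and payload shape are assumptions for illustration only.
+payload='{"query":"Implement a function that checks whether a string is a palindrome."}'
+echo "$payload"
+
+# With the stack up, post it directly to the LLM microservice (port 9000):
+#   curl http://${host_ip}:9000/v1/chat/completions \
+#     -X POST -H 'Content-Type: application/json' -d "$payload"
+```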
+
### Setup Environment Variables
Since the `docker_compose.yaml` file consumes some environment variables, you need to set them up in advance as shown below.
diff --git a/CodeGen/docker/xeon/README.md b/CodeGen/docker/xeon/README.md
index 81961e47a7..7ed57c6713 100644
--- a/CodeGen/docker/xeon/README.md
+++ b/CodeGen/docker/xeon/README.md
@@ -54,6 +54,30 @@ Then run the command `docker images`, you will have the following 3 Docker Image
## 🚀 Start Microservices and MegaService
+The CodeGen megaservice manages a single microservice, the LLM microservice, within a Directed Acyclic Graph (DAG). In the diagram below, the LLM microservice is a language-model microservice that generates code snippets from the user's input query. The TGI service provides the text-generation backend, exposing a RESTful API that the LLM microservice calls. The CodeGen Gateway is the entry point of the CodeGen application; it invokes the megaservice to generate code snippets in response to the user's query.
+
+The mega flow of the CodeGen application, from the user's input query to the application's output response, is as follows:
+
+```mermaid
+flowchart LR
+ subgraph CodeGen
+ direction LR
+ A[User] --> |Input query| B[CodeGen Gateway]
+ B --> |Invoke| Megaservice
+ subgraph Megaservice["Megaservice"]
+ direction TB
+            C((LLM<br>9000)) -. Post .-> D{{TGI Service<br>8028}}
+ end
+ Megaservice --> |Output| E[Response]
+ end
+
+ subgraph Legend
+ direction LR
+ G([Microservice]) ==> H([Microservice])
+ I([Microservice]) -.-> J{{Server API}}
+ end
+```
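+
+Once the services in the diagram are running, the mega flow can be exercised from the command line. The sketch below only assembles and prints a request payload; the commented `curl` line shows how it would be posted, assuming `host_ip` is exported as in the next section and the `/v1/chat/completions` route on the LLM microservice port (9000) shown in the diagram — see the curl examples later in this guide for the authoritative routes and payloads:
+
+```shell
+# Build a request body like the one the gateway forwards through the megaservice.
+# NOTE: the route and payload shape are assumptions for illustration only.
+payload='{"query":"Implement a function that checks whether a string is a palindrome."}'
+echo "$payload"
+
+# With the stack up, post it directly to the LLM microservice (port 9000):
+#   curl http://${host_ip}:9000/v1/chat/completions \
+#     -X POST -H 'Content-Type: application/json' -d "$payload"
+```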
+
### Setup Environment Variables
Since the `docker_compose.yaml` file consumes some environment variables, you need to set them up in advance as shown below.