Format opea docker images name (opea-project#103)
Signed-off-by: Wang, Xigui <[email protected]>
Signed-off-by: V, Ganesan <[email protected]>
xiguiw authored and ganesanintel committed Jun 3, 2024
1 parent 8cf37cf commit 011c7d1
Showing 9 changed files with 19 additions and 19 deletions.
4 changes: 2 additions & 2 deletions comps/dataprep/redis/README.md
@@ -36,7 +36,7 @@ python prepare_doc_redis.py

```bash
cd ../../../../
-docker build -t opea/gen-ai-comps:dataprep-redis-xeon-server --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/dataprep/redis/docker/Dockerfile .
+docker build -t opea/dataprep-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/dataprep/redis/docker/Dockerfile .
```

## Run Docker with CLI
@@ -48,7 +48,7 @@ export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=${your_langchain_api_key}
export LANGCHAIN_PROJECT="opea/gen-ai-comps:dataprep"

-docker run -d --name="dataprep-redis-server" -p 6007:6007 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME opea/gen-ai-comps:dataprep-redis-xeon-server
+docker run -d --name="dataprep-redis-server" -p 6007:6007 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME opea/dataprep-redis:latest
```

## Run Docker with Docker Compose
@@ -21,7 +21,7 @@ services:
- "6379:6379"
- "8001:8001"
dataprep-redis:
-    image: opea/gen-ai-comps:dataprep-redis-xeon-server
+    image: opea/dataprep-redis:latest
container_name: dataprep-redis-server
ports:
- "6007:6007"
4 changes: 2 additions & 2 deletions comps/embeddings/README.md
@@ -73,13 +73,13 @@ python embedding_tei_gaudi.py

```bash
cd ../../
-docker build -t opea/gen-ai-comps:embedding-tei-server --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/langchain/docker/Dockerfile .
+docker build -t opea/embedding-tei:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/embeddings/langchain/docker/Dockerfile .
```

## Run Docker with CLI

```bash
-docker run -d --name="embedding-tei-server" -p 6000:6000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e TEI_EMBEDDING_ENDPOINT=$TEI_EMBEDDING_ENDPOINT opea/gen-ai-comps:embedding-tei-server
+docker run -d --name="embedding-tei-server" -p 6000:6000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e TEI_EMBEDDING_ENDPOINT=$TEI_EMBEDDING_ENDPOINT opea/embedding-tei:latest
```

## Run Docker with Docker Compose
@@ -16,7 +16,7 @@ version: "3.8"

services:
embedding:
-    image: opea/gen-ai-comps:embedding-tei-server
+    image: opea/embedding-tei:latest
container_name: embedding-tei-server
ports:
- "6000:6000"
6 changes: 3 additions & 3 deletions comps/llms/README.md
@@ -69,7 +69,7 @@ docker build -t opea/gen-ai-comps:llm-tgi-server --build-arg https_proxy=$https_
## Run Docker with CLI

```bash
-docker run -d --name="llm-tgi-server" -p 9000:9000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e TGI_LLM_ENDPOINT=$TGI_LLM_ENDPOINT -e HUGGINGFACEHUB_API_TOKEN=$HUGGINGFACEHUB_API_TOKEN opea/gen-ai-comps:llm-tgi-server
+docker run -d --name="llm-tgi-server" -p 9000:9000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e TGI_LLM_ENDPOINT=$TGI_LLM_ENDPOINT -e HUGGINGFACEHUB_API_TOKEN=$HUGGINGFACEHUB_API_TOKEN opea/llm-tgi:latest
```

## Run Docker with Docker Compose
@@ -97,13 +97,13 @@ The `streaming` parameter determines the format of the data returned by the API.

```bash
# non-streaming mode
-curl http://${your_ip}:9000/v1/chat/completions\
+curl http://${your_ip}:9000/v1/chat/completions \
-X POST \
-d '{"query":"What is Deep Learning?","max_new_tokens":17,"top_k":10,"top_p":0.95,"typical_p":0.95,"temperature":0.01,"repetition_penalty":1.03,"streaming":false}' \
-H 'Content-Type: application/json'

# streaming mode
-curl http://${your_ip}:9000/v1/chat/completions\
+curl http://${your_ip}:9000/v1/chat/completions \
-X POST \
-d '{"query":"What is Deep Learning?","max_new_tokens":17,"top_k":10,"top_p":0.95,"typical_p":0.95,"temperature":0.01,"repetition_penalty":1.03,"streaming":true}' \
-H 'Content-Type: application/json'
6 changes: 3 additions & 3 deletions comps/reranks/README.md
@@ -57,7 +57,7 @@ docker build -t opea/gen-ai-comps:reranking-tei-xeon-server --build-arg https_pr
## Run Docker with CLI

```bash
-docker run -d --name="reranking-tei-server" -p 8000:8000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e TEI_RERANKING_ENDPOINT=$TEI_RERANKING_ENDPOINT -e HUGGINGFACEHUB_API_TOKEN=$HUGGINGFACEHUB_API_TOKEN opea/gen-ai-comps:reranking-tei-xeon-server
+docker run -d --name="reranking-tei-server" -p 8000:8000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e TEI_RERANKING_ENDPOINT=$TEI_RERANKING_ENDPOINT -e HUGGINGFACEHUB_API_TOKEN=$HUGGINGFACEHUB_API_TOKEN opea/reranking-tei:latest
```

## Run Docker with Docker Compose
@@ -72,15 +72,15 @@ docker compose -f docker_compose_reranking.yaml up -d
## Check Service Status

```bash
-curl http://localhost:8000/v1/health_check\
+curl http://localhost:8000/v1/health_check \
-X GET \
-H 'Content-Type: application/json'
```

## Consume Reranking Service

```bash
-curl http://localhost:8000/v1/reranking\
+curl http://localhost:8000/v1/reranking \
-X POST \
-d '{"initial_query":"What is Deep Learning?", "retrieved_docs": [{"text":"Deep Learning is not..."}, {"text":"Deep learning is..."}]}' \
-H 'Content-Type: application/json'
@@ -25,7 +25,7 @@ services:
shm_size: 1g
command: --model-id ${RERANK_MODEL_ID}
reranking:
-    image: opea/gen-ai-comps:reranking-tei-xeon-server
+    image: opea/reranking-tei:latest
container_name: reranking-tei-xeon-server
ports:
- "8000:8000"
10 changes: 5 additions & 5 deletions comps/retrievers/README.md
@@ -49,13 +49,13 @@ python langchain/retriever_redis.py

```bash
cd ../../
-docker build -t opea/gen-ai-comps:retriever-redis-server --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/langchain/docker/Dockerfile .
+docker build -t opea/retriever-redis:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/langchain/docker/Dockerfile .
```

## Run Docker with CLI

```bash
-docker run -d --name="retriever-redis-server" -p 7000:7000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME opea/gen-ai-comps:retriever-redis-server
+docker run -d --name="retriever-redis-server" -p 7000:7000 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e REDIS_URL=$REDIS_URL -e INDEX_NAME=$INDEX_NAME opea/retriever-redis:latest
```

## Run Docker with Docker Compose
@@ -70,7 +70,7 @@ docker compose -f docker_compose_retriever.yaml up -d
## Check Service Status

```bash
-curl http://localhost:7000/v1/health_check\
+curl http://localhost:7000/v1/health_check \
-X GET \
-H 'Content-Type: application/json'
```
@@ -85,10 +85,10 @@ embedding = [random.uniform(-1, 1) for _ in range(768)]
print(embedding)
```
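If you are scripting the request from the shell rather than Python, the same mock vector can be captured into a variable. This is a hypothetical one-liner, not part of the repository, and assumes `python3` is on `PATH`:

```shell
# Generate a 768-dimension mock embedding as a JSON array and store it in
# a variable that can stand in for ${your_embedding} in the curl request.
your_embedding=$(python3 -c "import json, random; print(json.dumps([random.uniform(-1, 1) for _ in range(768)]))")
```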

-Then substitute your mock embedding vector for the `${your_embedding}` in the following cURL command:
+Then substitute your mock embedding vector for the `${your_embedding}` in the following `curl` command:

```bash
-curl http://${your_ip}:7000/v1/retrieval\
+curl http://${your_ip}:7000/v1/retrieval \
-X POST \
-d '{"text":"What is the revenue of Nike in 2023?","embedding":${your_embedding}}' \
-H 'Content-Type: application/json'
@@ -16,7 +16,7 @@ version: "3.8"

services:
retriever:
-    image: opea/gen-ai-comps:retriever-redis-server
+    image: opea/retriever-redis:latest
container_name: retriever-redis-server
ports:
- "7000:7000"
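Taken together, the hunks above rename five images from the shared `opea/gen-ai-comps:*` tag scheme to per-component names. For anyone with images already built under the old tags, a sketch like the following prints the `docker tag` commands that would migrate them; it is illustrative only (the loop echoes rather than runs the commands, and it assumes bash 4+ for associative arrays):

```shell
# Mapping of old opea/gen-ai-comps tags to the new per-component
# image names introduced by this commit.
declare -A renames=(
  ["opea/gen-ai-comps:dataprep-redis-xeon-server"]="opea/dataprep-redis:latest"
  ["opea/gen-ai-comps:embedding-tei-server"]="opea/embedding-tei:latest"
  ["opea/gen-ai-comps:llm-tgi-server"]="opea/llm-tgi:latest"
  ["opea/gen-ai-comps:reranking-tei-xeon-server"]="opea/reranking-tei:latest"
  ["opea/gen-ai-comps:retriever-redis-server"]="opea/retriever-redis:latest"
)

# Print (not execute) the retag commands; pipe to `bash` to apply them.
for old in "${!renames[@]}"; do
  echo "docker tag ${old} ${renames[$old]}"
done
```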