From 0784b9180dce2d27807c89df0911598a4e465864 Mon Sep 17 00:00:00 2001
From: "chen, suyue"
Date: Wed, 11 Sep 2024 09:26:55 +0800
Subject: [PATCH] fix image build issue on push (#780)

Signed-off-by: chensuyue
---
 .github/workflows/push-image-build.yml | 2 ++
 docker_images_list.md                  | 4 ++--
 2 files changed, 4 insertions(+), 2 deletions(-)

diff --git a/.github/workflows/push-image-build.yml b/.github/workflows/push-image-build.yml
index 0576f854c7..b2a7dea5ae 100644
--- a/.github/workflows/push-image-build.yml
+++ b/.github/workflows/push-image-build.yml
@@ -18,6 +18,8 @@ concurrency:
 jobs:
   job1:
     uses: ./.github/workflows/_get-test-matrix.yml
+    with:
+      test_mode: "docker_image_build/build.yaml"

   image-build:
     needs: job1

diff --git a/docker_images_list.md b/docker_images_list.md
index 092ce1f795..965b21127b 100644
--- a/docker_images_list.md
+++ b/docker_images_list.md
@@ -46,9 +46,9 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
 | [opea/gmcrouter](https://hub.docker.com/r/opea/gmcrouter) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.manager) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to route the traffic among the microservices defined in GMC |
 | [opea/gmcmanager](https://hub.docker.com/r/opea/gmcmanager) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.router) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to be controller manager to handle GMC CRD |
 | [opea/guardrails-tgi](https://hub.docker.com/r/opea/guardrails-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/llama_guard/langchain/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
-| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/pii_detection/docker/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
+| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
 | [opea/habanalabs](https://hub.docker.com/r/opea/habanalabs) | | |
-| [opea/knowledge_graphs](https://hub.docker.com/r/opea/knowledge_graphs) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/knowledgegraphs/langchain/docker/Dockerfile) | The docker image served as knowledge graph gateway to enhance question answering with graph knowledge searching. |
+| [opea/knowledge_graphs](https://hub.docker.com/r/opea/knowledge_graphs) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/knowledgegraphs/langchain/Dockerfile) | The docker image served as knowledge graph gateway to enhance question answering with graph knowledge searching. |
 | [opea/llm-docsum-tgi](https://hub.docker.com/r/opea/llm-docsum-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/summarization/tgi/langchain/Dockerfile) | This docker image is designed to build a document summarization microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a document summary. |
 | [opea/llm-faqgen-tgi](https://hub.docker.com/r/opea/llm-faqgen-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/faq-generation/tgi/langchain/Dockerfile) | This docker image is designed to build a frequently asked questions microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a FAQ. |
 | [opea/llm-ollama](https://hub.docker.com/r/opea/llm-ollama) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/text-generation/ollama/langchain/Dockerfile) | The docker image exposed the OPEA LLM microservice based on ollama for GenAI application use |