* initial code and readme for hierarchical agent example
* agent test with openai llm passed
* update readme and add test
* update test
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* change example name and update docker yaml (Signed-off-by: minmin-intel <[email protected]>)
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* change diagram name and test script name (Signed-off-by: minmin-intel <[email protected]>)
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* update test

---------

Signed-off-by: minmin-intel <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent 46af6f3, commit 67df280. Showing 9 changed files with 703 additions and 0 deletions.
@@ -0,0 +1,106 @@
# Agents for Question Answering

## Overview

This example showcases a hierarchical multi-agent system for question-answering applications. The architecture diagram is shown below. The supervisor agent interfaces with the user and dispatches tasks to the worker agent and other tools to gather information and formulate answers. The worker agent uses the retrieval tool to generate answers to the queries posed by the supervisor agent. Other tools used by the supervisor agent may include APIs to interface with knowledge graphs, SQL databases, external knowledge bases, etc.

![Architecture Overview](assets/agent_qna_arch.png)

### Why agents for question answering?

1. Improve the relevancy of retrieved context.
   An agent can rephrase user queries, decompose user queries, and iterate to get the most relevant context for answering a user's question. Compared to conventional RAG, a RAG agent can significantly improve the correctness and relevancy of the answer.
2. Use tools to get additional knowledge.
   For example, knowledge graphs and SQL databases can be exposed as APIs for agents to gather knowledge that may be missing from the retrieval vector database.
3. A hierarchical agent design can further improve performance.
   Expert worker agents, such as a retrieval agent, a knowledge graph agent, or a SQL agent, can provide high-quality output for different aspects of a complex query, and the supervisor agent can aggregate that information to provide a comprehensive answer.

### Roadmap

- v0.9: Worker agent uses an open-source web search tool (DuckDuckGo); agents use OpenAI GPT-4o-mini as the LLM backend.
- v1.0: Worker agent uses the OPEA retrieval megaservice as a tool.
- v1.0 or later: Agents use an open-source LLM backend.
- v1.1 or later: Add safeguards.

## Getting started

1. Build the agent Docker image <br/>
First, clone the OPEA GenAIComps repo:

```
export WORKDIR=<your-work-directory>
cd $WORKDIR
git clone https://github.com/opea-project/GenAIComps.git
```

Then build the agent Docker image. Both the supervisor agent and the worker agent use the same Docker image, but when we launch the two agents we will specify different strategies and register different tools.

```
cd GenAIComps
docker build -t opea/comps-agent-langchain:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/agent/langchain/docker/Dockerfile .
```

2. Launch tool services <br/>
In this example, we will use some of the mock APIs provided in the Meta CRAG KDD Challenge to demonstrate the benefits of gaining additional context from mock knowledge graphs. (An optional check that the mock API container is up is shown right after the command below.)

```
docker run -d -p=8080:8000 docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0
```

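Since the `docker run` command above does not pass `--name`, the container gets an auto-generated name; the sketch below therefore filters by image, using the same filter this example's test script uses when stopping the container. This is an optional sanity check, not part of the original instructions.

```
# Optional: confirm the CRAG mock API container is running.
# No --name was given above, so filter by image instead of by container name.
docker ps --filter "ancestor=docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0"
```
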
3. Set up the environment for this example <br/>
First, clone this repo:

```
cd $WORKDIR
git clone https://github.com/opea-project/GenAIExamples.git
```

Second, set up the environment variables:

```
export TOOLSET_PATH=$WORKDIR/GenAIExamples/AgentQnA/tools/
# OPENAI_API_KEY is needed since this example uses an OpenAI LLM backend
export OPENAI_API_KEY=<your-openai-key>
# host IP, used by the validation commands below
export ip_address=$(hostname -I | awk '{print $1}')
```

4. Launch agent services <br/>
The configurations of the supervisor agent and the worker agent are defined in the docker-compose yaml file: the worker agent runs with the `rag_agent` strategy and `worker_agent_tools.yaml`, while the supervisor agent runs with the `react_langgraph` strategy and `supervisor_agent_tools.yaml`. We currently use OpenAI GPT-4o-mini as the LLM, and we plan to add support for llama3.1-70B-instruct (served by TGI-Gaudi) in a subsequent release.
To use the OpenAI LLM, run the commands below. (A quick way to confirm both agent containers are up is shown right after.)

```
cd $WORKDIR/GenAIExamples/AgentQnA/docker/openai/
bash launch_agent_service_openai.sh
```

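The container names below come from the docker-compose file included in this commit; this is only an optional sketch for confirming that both agent endpoints started.

```
# Optional: confirm both agent containers are running.
# Container names are defined in docker-compose-agent-openai.yaml.
docker ps --format '{{.Names}}\t{{.Status}}' | grep -E 'docgrader-agent-endpoint|react-agent-endpoint'
```
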
## Validate services

First, look at the logs of the agent Docker containers:

```
docker logs docgrader-agent-endpoint
```

```
docker logs react-agent-endpoint
```

You should see something like "HTTP server setup successful" if the Docker containers started successfully.

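If you prefer a one-liner, the sketch below greps each container's logs for that phrase; the exact wording of the startup message may vary between releases, so adjust the pattern if your logs differ.

```
# Optional: grep the logs for the startup message (exact wording may vary by release).
docker logs docgrader-agent-endpoint 2>&1 | grep -i "HTTP server setup successful"
docker logs react-agent-endpoint 2>&1 | grep -i "HTTP server setup successful"
```
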
Second, validate the worker agent:

```
curl http://${ip_address}:9095/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{
    "query": "Most recent album by Taylor Swift"
}'
```

Third, validate the supervisor agent:

```
curl http://${ip_address}:9090/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{
    "query": "Most recent album by Taylor Swift"
}'
```

## How to register your own tools with the agent

You can take a look at the tools YAML and Python files in this example. For more details, please refer to the "Provide your own tools" section in the instructions [here](https://github.com/minmin-intel/GenAIComps/tree/agent-comp-dev/comps/agent/langchain#-4-provide-your-own-tools). A rough sketch of what a tool definition can look like is shown below.
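For orientation only, the snippet below sketches what a tool entry might look like: a tool name mapped to a description, a Python callable, and an argument schema. The field names and the `search_knowledge_base` function here are hypothetical illustrations, not the exact schema; the authoritative format is the one in the tools files shipped with this example and in the linked instructions.

```
# Illustrative sketch only -- field names and the callable are hypothetical,
# not necessarily the exact schema used by this example.
search_knowledge_base:
  description: Search the knowledge base and return context relevant to the query.
  callable_api: my_tools.py:search_knowledge_base   # hypothetical module:function pair
  args_schema:
    query:
      type: str
      description: The search query string.
  return_output: retrieved_data
```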
@@ -0,0 +1,63 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

services:
  worker-docgrader-agent:
    image: opea/comps-agent-langchain:latest
    container_name: docgrader-agent-endpoint
    volumes:
      - ${WORKDIR}/GenAIComps/comps/agent/langchain/:/home/user/comps/agent/langchain/
      - ${TOOLSET_PATH}:/home/user/tools/
    ports:
      - "9095:9095"
    ipc: host
    environment:
      ip_address: ${ip_address}
      strategy: rag_agent
      recursion_limit: ${recursion_limit}
      llm_engine: openai
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      model: ${model}
      temperature: ${temperature}
      max_new_tokens: ${max_new_tokens}
      streaming: false
      tools: /home/user/tools/worker_agent_tools.yaml
      require_human_feedback: false
      no_proxy: ${no_proxy}
      http_proxy: ${http_proxy}
      https_proxy: ${https_proxy}
      LANGCHAIN_API_KEY: ${LANGCHAIN_API_KEY}
      LANGCHAIN_TRACING_V2: ${LANGCHAIN_TRACING_V2}
      LANGCHAIN_PROJECT: "opea-worker-agent-service"
      port: 9095

  supervisor-react-agent:
    image: opea/comps-agent-langchain:latest
    container_name: react-agent-endpoint
    volumes:
      - ${WORKDIR}/GenAIComps/comps/agent/langchain/:/home/user/comps/agent/langchain/
      - ${TOOLSET_PATH}:/home/user/tools/
    ports:
      - "9090:9090"
    ipc: host
    environment:
      ip_address: ${ip_address}
      strategy: react_langgraph
      recursion_limit: ${recursion_limit}
      llm_engine: openai
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      model: ${model}
      temperature: ${temperature}
      max_new_tokens: ${max_new_tokens}
      streaming: ${streaming}
      tools: /home/user/tools/supervisor_agent_tools.yaml
      require_human_feedback: false
      no_proxy: ${no_proxy}
      http_proxy: ${http_proxy}
      https_proxy: ${https_proxy}
      LANGCHAIN_API_KEY: ${LANGCHAIN_API_KEY}
      LANGCHAIN_TRACING_V2: ${LANGCHAIN_TRACING_V2}
      LANGCHAIN_PROJECT: "opea-supervisor-agent-service"
      CRAG_SERVER: $CRAG_SERVER
      WORKER_AGENT_URL: $WORKER_AGENT_URL
      port: 9090
@@ -0,0 +1,13 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

export ip_address=$(hostname -I | awk '{print $1}')
export recursion_limit=12
export model="gpt-4o-mini-2024-07-18"
export temperature=0
export max_new_tokens=512
export OPENAI_API_KEY=${OPENAI_API_KEY}
export WORKER_AGENT_URL="http://${ip_address}:9095/v1/chat/completions"
export CRAG_SERVER=http://${ip_address}:8080

docker compose -f docker-compose-agent-openai.yaml up -d
@@ -0,0 +1,75 @@
#!/bin/bash
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

set -e
echo "IMAGE_REPO=${IMAGE_REPO}"
echo "OPENAI_API_KEY=${OPENAI_API_KEY}"

WORKPATH=$(dirname "$PWD")
export WORKDIR=$WORKPATH/../../
echo "WORKDIR=${WORKDIR}"
export ip_address=$(hostname -I | awk '{print $1}')
export TOOLSET_PATH=$WORKDIR/GenAIExamples/AgentQnA/tools/

function build_agent_docker_image() {
    cd $WORKDIR
    if [ ! -d "GenAIComps" ] ; then
        git clone https://github.com/opea-project/GenAIComps.git
    fi
    cd GenAIComps
    echo PWD: $(pwd)
    docker build -t opea/comps-agent-langchain:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/agent/langchain/docker/Dockerfile .
}

function start_services() {
    echo "Starting CRAG server"
    docker run -d -p=8080:8000 docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0
    echo "Starting Agent services"
    cd $WORKDIR/GenAIExamples/AgentQnA/docker/openai
    bash launch_agent_service_openai.sh
}

function validate() {
    local CONTENT="$1"
    local EXPECTED_RESULT="$2"
    local SERVICE_NAME="$3"

    # Log messages go to stderr so that the caller's $(validate ...) capture
    # contains only the 0/1 result used for the pass/fail check.
    if echo "$CONTENT" | grep -q "$EXPECTED_RESULT"; then
        echo "[ $SERVICE_NAME ] Content is as expected: $CONTENT" >&2
        echo 0
    else
        echo "[ $SERVICE_NAME ] Content does not match the expected result: $CONTENT" >&2
        echo 1
    fi
}


function run_tests() {
    echo "----------------Test supervisor agent ----------------"
    local CONTENT=$(http_proxy="" curl http://${ip_address}:9090/v1/chat/completions -X POST -H "Content-Type: application/json" -d '{
    "query": "Most recent album by Taylor Swift"
    }')
    local EXIT_CODE=$(validate "$CONTENT" "Taylor" "react-agent-endpoint")
    docker logs react-agent-endpoint
    if [ "$EXIT_CODE" == "1" ]; then
        exit 1
    fi

}

function stop_services() {
    echo "Stopping CRAG server"
    docker stop $(docker ps -q --filter ancestor=docker.io/aicrowd/kdd-cup-24-crag-mock-api:v0)
    echo "Stopping Agent services"
    docker stop $(docker ps -q --filter ancestor=opea/comps-agent-langchain:latest)
}

function main() {
    build_agent_docker_image
    start_services
    run_tests
    stop_services
}

main
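# End-to-end, main() builds the agent image, starts the CRAG mock server and both agents,
# queries the supervisor endpoint, and stops the containers. A rough sketch of a manual run
# (the script's actual filename and path are not shown in this diff, so treat them as placeholders):
#   export OPENAI_API_KEY=<your-openai-key>
#   bash <path-to-this-test-script>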