This project uses the Azure OpenAI Assistants API to create a chatbot that interacts with users, processes their messages, and performs actions based on their content.
The project implements the same functionality as the contoso-chat project, but uses the Assistants API instead of Prompt Flow.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
- Python 3.9 or higher
- Azure OpenAI resource and a GPT model deployment
- Azure Functions Core Tools
- Azure Cosmos DB resource
- Visual Studio Code with Azure Functions extension
- Clone the repo

  ```shell
  git clone https://github.com/dfmera/contoso-chat-assistant.git
  ```

- Open the project in Visual Studio Code

  ```shell
  cd contoso-chat-assistant
  code .
  ```
- Create a Python virtual environment
  - 3.1. For macOS / Linux

    ```shell
    python3 -m venv .venv
    source .venv/bin/activate
    ```

  - 3.2. For Windows

    ```shell
    python -m venv .venv
    .venv\Scripts\Activate.ps1
    ```
- Install Python packages

  ```shell
  pip install -r requirements.txt
  ```
- Create an assistant in Azure OpenAI Studio
  - Open Azure OpenAI Studio and go to Assistants (preview)
  - Give your assistant a name
  - Instructions: copy the instructions in `assistant/customer_prompt.txt`
  - Deployment: select a GPT model deployment
  - Functions: copy the function definition in `assistant/GetCustomerInfo_definition.json`
  - Activate the code interpreter
  - Add the file `data/product_info/products.csv`
  - Save the assistant and copy the assistant id
- Create a `.env` file and fill it with the following values:

  ```
  COSMOS_ENDPOINT=
  COSMOS_KEY=
  AZURE_OPENAI_ENDPOINT=
  AZURE_OPENAI_API_KEY=
  OPENAI_API_VERSION=2024-02-15-preview
  OPENAI_GPT_DEPLOYMENT=
  OPENAI_ASSISTANT_ID=
  ```
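After filling in `.env`, a quick sanity check can catch blank values before they surface as confusing runtime errors. The helper below is a hypothetical stdlib-only sketch (not part of this repo): it parses simple `KEY=VALUE` lines and reports any required setting that is missing or left empty.

```python
import os

# The settings this project's .env is expected to define (from the list above).
REQUIRED_KEYS = [
    "COSMOS_ENDPOINT", "COSMOS_KEY",
    "AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_API_KEY",
    "OPENAI_API_VERSION", "OPENAI_GPT_DEPLOYMENT", "OPENAI_ASSISTANT_ID",
]

def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines; blank lines and # comments are skipped."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def missing_keys(env: dict) -> list:
    """Names of required settings that are absent or left empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__" and os.path.exists(".env"):
    with open(".env") as f:
        problems = missing_keys(parse_env(f.read()))
    if problems:
        raise SystemExit("Missing settings in .env: " + ", ".join(problems))
    print("All required .env settings are present.")
```

Note that `parse_env` only handles plain `KEY=VALUE` lines; for anything fancier (quoting, interpolation) a library such as python-dotenv is the usual choice.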
- Create a Cosmos DB database and container by running the notebook in `/data/customer_info/create-cosmos-db.ipynb`. Make sure you run the notebook in the `.venv` you created.
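For orientation, the core of what such a notebook does can be sketched as below. This is a hedged, illustrative sketch, not the notebook's actual code: the field names are invented, and the `container` argument is assumed to be an `azure-cosmos` `ContainerProxy`. The one rule worth remembering is that Cosmos DB requires a string `id` on every item.

```python
def to_cosmos_item(record: dict) -> dict:
    """Copy a customer record and coerce its id to a string,
    since Cosmos DB requires every item to have a string 'id'."""
    item = dict(record)
    item["id"] = str(item["id"])
    return item

def upsert_customers(container, records):
    """Write records into a Cosmos DB container.

    `container` is assumed to be an azure.cosmos ContainerProxy, e.g. from
    CosmosClient(url, key).create_database_if_not_exists(...)
                          .create_container_if_not_exists(...).
    """
    for record in records:
        container.upsert_item(to_cosmos_item(record))
```

Because the Azure calls are isolated behind `container`, the transformation can be tested without a live Cosmos DB account.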
- Open the `api` folder in VS Code and initiate the Azure Functions project, or create a Python `.venv` virtual environment
- Create a `local.settings.json` file and fill it with the following values:
  ```json
  {
    "IsEncrypted": false,
    "Values": {
      "AzureWebJobsStorage": "",
      "FUNCTIONS_WORKER_RUNTIME": "python",
      "CosmosDB": "<YOUR COSMOSDB CONNECTION STRING>",
      "OPENAI_ASSISTANT_ID": "<YOUR OPENAI_ASSISTANT_ID>",
      "AZURE_OPENAI_ENDPOINT": "<YOUR AZURE_OPENAI_ENDPOINT>",
      "AZURE_OPENAI_API_KEY": "<YOUR AZURE_OPENAI_API_KEY>",
      "OPENAI_API_VERSION": "2024-02-15-preview",
      "OPENAI_GPT_DEPLOYMENT": "<YOUR OPENAI_GPT_DEPLOYMENT>",
      "CUSTOMER_INFO_API": "<YOUR LOCAL GetCustomerInfo FUNCTION URL>"
    }
  }
  ```
- Run the Azure Functions API in VS Code or in a terminal

  ```shell
  cd api
  func start
  ```
- Test the `ContosoChatAssistant` function as in this example (replace `{port}` with your local port):

  ```
  POST http://localhost:{port}/api/ContosoChatAssistant
  Content-Type: application/json

  {
    "customerId": 1,
    "question": "Can you remind me of my last orders?",
    "chat_history": []
  }
  ```
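The same request can be issued from Python. The helper below is a hypothetical stdlib-only sketch (not part of the repo) that builds the POST request shown above; the port defaults to 7071 in the usage note only because that is the Azure Functions Core Tools default, so adjust it to whatever `func start` reports.

```python
import json
import urllib.request

def build_request(port: int, customer_id: int, question: str,
                  chat_history=None) -> urllib.request.Request:
    """Build the POST request for the local ContosoChatAssistant function."""
    body = {
        "customerId": customer_id,
        "question": question,
        "chat_history": chat_history or [],
    }
    return urllib.request.Request(
        f"http://localhost:{port}/api/ContosoChatAssistant",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

With `func start` running, send it with, for example, `urllib.request.urlopen(build_request(7071, 1, "Can you remind me of my last orders?"))` and read the response body.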
- Clone the contoso-web repository
- Follow the instructions to run the project locally
- Replace the value of the PROMPTFLOW_ENDPOINT key in the `.env` file with your local ContosoChatAssistant URL:

  ```
  PROMPTFLOW_ENDPOINT=http://{local_url}/api/ContosoChatAssistant
  ```

- Run the project and test the chat
- Automate resource creation with `azd up`.
- Test the reading of the `products.csv` file for product queries.
Contributions are welcome!