Merge pull request #267 from vrothberg/chat-readme
chatbot README: a couple of refinements
rhatdan authored Apr 15, 2024
2 parents f42a8fc + a4ac409 commit a85350c
Showing 1 changed file with 9 additions and 13 deletions: recipes/natural_language_processing/chatbot/README.md
@@ -1,43 +1,39 @@
# Chat Application

This recipe helps developers start building their own custom LLM enabled chat applications. It consists of two main components: the Model Service and the AI Application.

There are a few options today for local Model Serving, but this recipe uses [`llama-cpp-python`](https://github.com/abetlen/llama-cpp-python) and its OpenAI-compatible Model Service. A Containerfile for building this Model Service is provided in the repo at [`model_servers/llamacpp_python/base/Containerfile`](/model_servers/llamacpp_python/base/Containerfile).


The AI Application connects to the Model Service via its OpenAI-compatible API. The recipe relies on [Langchain's](https://python.langchain.com/docs/get_started/introduction) Python package to simplify communication with the Model Service and uses [Streamlit](https://streamlit.io/) for the UI layer. You can find an example of the chat application below.

![](/assets/chatbot_ui.png)
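Because the Model Service speaks the OpenAI-compatible chat-completions format, the request body the application ultimately sends can be sketched with the standard library alone. The server URL, port, and model name below are illustrative assumptions, not values taken from this recipe:

```python
import json

# Placeholder endpoint for an OpenAI-compatible server such as the one
# llama-cpp-python provides; the host, port, and path are assumptions.
SERVER_URL = "http://localhost:8001/v1/chat/completions"

def build_chat_request(user_message, history=None):
    """Return the JSON body for an OpenAI-compatible chat-completion call."""
    messages = list(history or [])
    messages.append({"role": "user", "content": user_message})
    return {
        "model": "llama-2-7b-chat",  # placeholder model name, not from the recipe
        "messages": messages,
        "temperature": 0.7,
    }

body = build_chat_request("Hello, chatbot!")
print(json.dumps(body, indent=2))
```

In the actual recipe, Langchain wraps this request/response cycle, so application code works with message objects rather than raw JSON.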


## Try the chat application
## Try the Chat Application

The [Podman Desktop](https://podman-desktop.io) [AI Lab Extension](https://github.com/containers/podman-desktop-extension-ai-lab) includes this recipe among others. To try it out, open `Recipes Catalog` -> `Chatbot` and follow the instructions to start the application.

If you prefer building and running the application from the terminal, run the following commands from this directory.

First, build the application's metadata and run the generated Kubernetes YAML, which will spin up a Pod along with a number of containers:
```
make quadlet
podman kube play build/chatbot.yaml
```
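The generated `build/chatbot.yaml` is a standard Kubernetes Pod manifest, which is the format `podman kube play` consumes. A minimal sketch of its shape follows; the container names, images, and ports here are illustrative assumptions, not the recipe's actual generated values (only the Pod name `chatbot` is stated by this README):

```
apiVersion: v1
kind: Pod
metadata:
  name: chatbot            # the Pod name used by the podman commands in this README
spec:
  containers:
  - name: model-service    # illustrative container name
    image: <model-service-image>
    ports:
    - containerPort: 8001  # assumed Model Service port
  - name: chatbot-app      # illustrative container name
    image: <app-image>
    ports:
    - containerPort: 8501  # Streamlit's default port
```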

The Pod is named `chatbot`, so you may use [Podman](https://podman.io) to manage the Pod and its containers:
```
podman pod list
podman ps
```

To stop and remove the Pod, run:
```
podman pod stop chatbot
podman pod rm chatbot
```

Once the Pod is running, please refer to the section below to [interact with the chatbot application](#interact-with-the-ai-application).

# Build the Application

