Local Chat Bot is a simple Python Flask application that lets you chat with the models you have pulled into Ollama.
You will need:
- Python
- A few Python libraries: json, os, time, flask, langchain, langchain_ollama
- Ollama: https://ollama.com/download/linux
- Your preferred Ollama model: https://ollama.com/library
Copy the script and the template wherever you want, launch the script, and go to http://127.0.0.1:5000. That's all. Start chatting with your model without needing a ChatGPT account, handing all your data over to GAFAM, and so on.
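To give an idea of what is going on under the hood, here is a minimal sketch of a Flask app wired to Ollama through langchain_ollama. It is illustrative only: the llama3 model name, the template.html file and the question form field are assumptions, not necessarily what the actual script uses.

```python
# Minimal sketch of a Flask + langchain_ollama chatbot (illustrative, not the actual script).
from flask import Flask, render_template, request
from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama import OllamaLLM

app = Flask(__name__)

# Prompt template that feeds the running conversation back to the model as context.
template = """Answer the question below.

Here is the conversation history: {context}

Question: {question}

Answer:"""

model = OllamaLLM(model="llama3")  # any model already pulled into Ollama works here
chain = ChatPromptTemplate.from_template(template) | model

context = ""  # in-memory conversation history


@app.route("/", methods=["GET", "POST"])
def chat():
    global context
    answer = ""
    if request.method == "POST":
        question = request.form.get("question", "")
        answer = chain.invoke({"context": context, "question": question})
        context += f"\nUser: {question}\nAI: {answer}"
    return render_template("template.html", answer=answer)


if __name__ == "__main__":
    app.run(debug=True)  # serves on http://127.0.0.1:5000
```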
P.S.: You can tweak the template to suit your preferences, and you can certainly do better than this one.
P.P.S.: The template was partly generated by llama3 using this very script (ASCII art, CSS, etc.).
Ask your question, get your answer, and if you want to change the subject, check the "Forget previous conversations and start a new one" checkbox. Again, that's all.
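Under the hood, that checkbox only needs to wipe the stored conversation history before the next exchange. Here is an isolated, hypothetical sketch of the idea; the update_context helper and its arguments are mine, not the script's.

```python
# Hypothetical sketch: conversation memory is just a string, and "forgetting"
# means clearing it before appending the next exchange.
def update_context(context: str, question: str, answer: str, reset: bool) -> str:
    """Return the conversation history after one question/answer exchange."""
    if reset:
        # The "Forget previous conversations and start a new one" box was checked.
        context = ""
    return context + f"\nUser: {question}\nAI: {answer}"


# Example: with reset=True, everything said before this exchange is dropped.
history = "User: Hi\nAI: Hello!"
history = update_context(history, "What is Flask?", "A Python web framework.", reset=True)
print(history)  # only the latest exchange remains
```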
Maybe. Who knows?
This script is based on the wonderful work published here: https://blog.devops.dev/local-gen-ai-chatbot-with-memory-using-ollama-llama3-using-python-3e07f4057cad
Basically, I just added a web interface on top of it (probably in a clumsy way), but it works! :D