Vampeyer/docker-genai-sample
GENERATIVE AI PDF SCANNER AND AI ANALYZER / RESPONDER

Using Neo4j, Ollama, Streamlit, and Docker.

This is a fork of the repository https://github.com/craig-osterhout/docker-genai-sample

From the OFFICIAL DOCKER DOCS, located here: https://docs.docker.com/guides/use-case/genai-pdf-bot/

  • This version is working as of 09/16/2024.

Running the application.

  • To run the application properly, your device must meet the following prerequisites.

  1. Docker must be installed and running on the machine.
  2. A Neo4j container must be running: pull the official Neo4j image from Docker Hub and run it with the following ports published:

     • "7474:7474"
     • "7687:7687"

     To check that it is connected properly, visit localhost:7474 while the container is running; you should see a page displaying the Neo4j Browser.
  3. An Ollama container must also be running on port 11434. To check that it is connected properly, visit localhost:11434 while the container is running; you should see a page confirming that Ollama is running.
  4. With Neo4j and Ollama hooked up and configured properly, you should be able to run

docker compose up --build

and build and run the project from there.
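As a sketch, the two prerequisite containers can be started like this. The image tags, container names, and the NEO4J_AUTH password below are assumptions for illustration, not values taken from this repo:

```shell
# Neo4j: publish the HTTP (7474) and Bolt (7687) ports.
# NEO4J_AUTH sets the initial username/password (hypothetical value here).
docker run -d --name neo4j \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/your_password \
  neo4j:latest

# Ollama: publish its API port 11434.
docker run -d --name ollama \
  -p 11434:11434 \
  ollama/ollama
```

With both containers up, localhost:7474 should show the Neo4j Browser and localhost:11434 should respond with "Ollama is running".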







Updates


  • Updated the requirements.txt file, verified through a .venv and retesting.

  • Initialized a Dockerfile using `docker init`, selecting the following options:
    • Version: Python 3.12
    • Listening on port: 8000
    • What is the command to run your app?: streamlit run app.py --server.address=0.0.0.0 --server.port=8000
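For reference, a minimal sketch of the kind of Dockerfile `docker init` produces for those answers. This is an assumed reconstruction, not the repo's actual Dockerfile, and the generated file varies by `docker init` version:

```dockerfile
# Assumed sketch of a docker init-style Dockerfile for Python 3.12.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application source.
COPY . .

EXPOSE 8000

CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0", "--server.port=8000"]
```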

  • Updated the compose.yaml to hold the proper settings to connect to a locally containerized Neo4j database and Ollama server.
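A hedged sketch of what such a compose.yaml can look like. The service name, password, and environment variable names (NEO4J_URI, OLLAMA_BASE_URL, etc.) are assumptions for illustration, not copied from this repo:

```yaml
# Assumed sketch: the app container reaches the separately running
# Neo4j and Ollama containers through the Docker host.
services:
  server:
    build:
      context: .
    ports:
      - "8000:8000"
    environment:
      - NEO4J_URI=bolt://host.docker.internal:7687
      - NEO4J_USERNAME=neo4j
      - NEO4J_PASSWORD=your_password
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      # Lets the container resolve host.docker.internal on Linux.
      - "host.docker.internal:host-gateway"
```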

  • Ran and tested the application using a local, Docker-containerized Neo4j container and an Ollama LLM model container, both running concurrently.


--- You should be running both containers, as shown in this screenshot:

Screenshot (543)

The other screenshots, which show Neo4j Aura cloud services and Ollama running locally OUTSIDE of Docker, are NOT what this repo is configured for.

This repo is configured to run Ollama inside Docker, from the Docker Hub image. But enjoy the screenshots provided below!

Screenshot (542) Screenshot (541) Screenshot (540) Screenshot (539) Screenshot (538) Screenshot (537) Screenshot (545) Screenshot (544)

NOTICE: after running docker compose up --build, your container should be running something similar to the screenshots above.


docker-genai-sample

A simple GenAI app for Docker's Docs based on the GenAI Stack PDF Reader application.
