# Ollama Self-Hosting Setup

A basic setup for running self-hosted Ollama and using LLMs locally (works with Coolify).

## Requirements

- Ubuntu 22.04
- Nvidia GPU
- Docker

## Steps

1. Run `setup.bash` (installs the Nvidia drivers, the Nvidia CUDA toolkit, and the Nvidia Container Toolkit).
2. If you are using Coolify, paste the contents of `docker-compose.yaml` into the Docker Compose section of your project. Otherwise, run `docker compose up -d` inside the project folder on your machine (see the sketch after this list for what such a compose file looks like).
3. Open `localhost:3000`, or the corresponding Coolify link for your service.
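The repository's actual `docker-compose.yaml` is not reproduced here, but a minimal sketch of the kind of file step 2 refers to is shown below. It assumes an `ollama/ollama` service with an NVIDIA GPU reservation plus an Open WebUI front end published on port 3000 (matching step 3); the service names, images, and ports are illustrative assumptions, not the repo's actual configuration.

```yaml
# Minimal sketch of an Ollama + web UI compose file (assumed layout,
# not the repository's actual docker-compose.yaml).
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama          # persist downloaded models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia          # needs the NVIDIA Container Toolkit from setup.bash
              count: all
              capabilities: [gpu]

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                   # exposes the UI on localhost:3000 (step 3)
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
    volumes:
      - open-webui:/app/backend/data

volumes:
  ollama:
  open-webui:
```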

## Installing models
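Models are not bundled with the Ollama image; they are pulled once the stack is running. As a hedged example, assuming the Ollama service is named `ollama` as in the sketch above (the model name `llama3` is a placeholder):

```bash
# Pull a model inside the running Ollama container
# (service name "ollama" is assumed from the compose sketch above)
docker compose exec ollama ollama pull llama3

# Or pull via Ollama's HTTP API on its default port 11434
curl http://localhost:11434/api/pull -d '{"name": "llama3"}'
```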

For more information: