Provides a bash interface for locally and remotely hosted Ollama APIs, powered by cowsay.
```
 _____________     ____________________________________
< What's up? >    ( Time to Update all Local models .. )
 -------------     ------------------------------------
    \      o
     \  /\⌒⌒⌒⌒⌒/\   o   /\⌒⌒⌒⌒⌒/\
       {         }     {         }
       (°('◞◟') °)     (°('◞◟') °)
       (         )     (         )
       (         )     (         )
```
- (optional) Create an alias for convenience: `echo 'alias ollama="/path/to/cowllama.sh"' >> ~/.bashrc`
- Set execute permissions: `sudo chmod +x /path/to/cowllama.sh`
- Install `cowsay` on your system:
  - Debian: `apt install cowsay`
  - Arch: `yay -S cowsay`
  - ...you'll figure it out.
- Copy `cows/llama.cow` from this repository to `/usr/share/cows/`.
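Taken together, the steps above amount to a short setup script. A sketch, assuming a Debian-based system and keeping the `/path/to` placeholder from the steps above:

```shell
#!/usr/bin/env bash
# Setup sketch for cowllama: install cowsay, make the wrapper
# executable, install the custom cow file, and add the alias.
# /path/to is a placeholder; adjust it for your checkout.
set -euo pipefail

SCRIPT=/path/to/cowllama.sh

sudo apt install -y cowsay                # Debian; use your package manager
sudo chmod +x "$SCRIPT"
sudo cp cows/llama.cow /usr/share/cows/
echo "alias ollama=\"$SCRIPT\"" >> ~/.bashrc
```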
-
Run it as you would normally run `ollama`, but with additional options:

`ollama [[OPTIONS] [MODEL] [PROMPT]]`
| Option | Description |
| --- | --- |
| `-L`, `--local` | run ollama commands via localhost / the ollama binary |
| `-D`, `--docker` | run ollama commands through a local Docker container |
| `-R`, `--remote` | run ollama commands via the remote API |
| `--health` | check the availability of all APIs |
| `--update-all` | pull all publicly available models via the preconfigured API |

If no Docker container is provided, the default API is `localhost`.
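The `--update-all` option boils down to listing the installed models and re-pulling each one. A hedged sketch (it assumes the output format of `ollama list`: a header row, then one model per line with the name in the first column):

```shell
#!/usr/bin/env bash
# Sketch of an update-all loop: re-pull every model that
# `ollama list` reports. Assumes the list output has a header
# row and the model name in the first column.

models_from_list() {
  tail -n +2 | awk '{print $1}'   # drop the header, keep model names
}

update_all() {
  ollama list | models_from_list | while read -r model; do
    ollama pull "$model"
  done
}
```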
| Command | Description |
| --- | --- |
| `run` | equivalent to `ollama run` |
| `list` | equivalent to `ollama list` |
| `pull` | equivalent to `ollama pull` |
| `health` | check availability of the chosen API |
| `update-all` | pull all publicly available model blobs on the chosen API |
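A health check can also be reproduced by hand: an Ollama API is reachable when its HTTP endpoint answers. A minimal sketch, assuming the stock Ollama server on port 11434 with its `/api/tags` endpoint:

```shell
#!/usr/bin/env bash
# Minimal availability probe, similar in spirit to `ollama --health`.
# Assumes the standard Ollama HTTP API on port 11434 (/api/tags).

check_api() {
  local url="$1"
  # -f: fail on HTTP errors; --max-time bounds the wait for dead hosts
  if curl -fsS --max-time 2 "$url/api/tags" > /dev/null 2>&1; then
    echo "$url UP"
  else
    echo "$url DOWN"
  fi
}

check_api "http://localhost:11434"
```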
```shell
# check health of all APIs
ollama --health

# pull "phi3:medium" on the remote machine
ollama -R pull phi3:medium

# update all models on the remote machine
ollama -R update-all

# run codellama on localhost with an optional prompt
ollama -L run codellama [prompt]
```