
cowllama

Provides a bash interface for local and remotely hosted ollama APIs, powered by cowsay

 _____________            ____________________________________ 
<  What's up? >          ( Time to Update all Local models .. )
 -------------            ------------------------------------ 
      \                            o
       \   /\⌒⌒⌒⌒⌒/\                 o    /\⌒⌒⌒⌒⌒/\
        \ {         }                  o {         }
          (°('◞◟') °)                    (°('◞◟') °)
          (         )                    (         )
          (         )                    (         )

Usage:

  1. (optional) create an alias for convenience:
echo 'alias ollama="/path/to/cowllama.sh"' >> ~/.bashrc
  2. set execute permissions:
sudo chmod +x /path/to/cowllama.sh
  3. install cowsay on your system:
  • Debian: apt install cowsay

  • Arch: yay -S cowsay

    ..you'll figure it out..

  4. copy cows/llama.cow from this repository to /usr/share/cows/:
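# assuming the repository was cloned to /path/to/cowllama; the cows directory may differ by distro
sudo cp /path/to/cowllama/cows/llama.cow /usr/share/cows/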

  5. run it as you would normally run ollama, but with additional options:

ollama [[OPTIONS] [MODEL] [PROMPT]]

Option:           Description
-L | --local      run ollama commands via the local ollama binary (localhost)
-D | --docker     run ollama commands through the local Docker container
-R | --remote     run ollama commands via the remote API
--health          check the availability of all APIs
--update-all      pull all publicly available models via the preconfigured API;
                  if no Docker container is provided, the default API is 'localhost'

run               equivalent to 'ollama run'
list              equivalent to 'ollama list'
pull              equivalent to 'ollama pull'

health            check the availability of the chosen API
update-all        pull all publicly available model blobs on the chosen API
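
Conceptually, update-all just means "re-pull every model that is already installed". A minimal sketch of that idea with the plain ollama CLI (not necessarily how cowllama implements it):

# skip the header row of 'ollama list', then re-pull each installed model by name
ollama list | awk 'NR>1 {print $1}' | xargs -n1 ollama pull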

Examples:

# check health of all APIs
ollama --health

# pull "phi3:medium" on the remote machine
ollama -R pull phi3:medium

# update all models on the remote machine
ollama -R update-all

# run codellama on localhost with optional prompt
ollama -L run codellama [prompt]
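
Under the hood, checking an ollama API boils down to an HTTP probe. A minimal sketch of such a check (not necessarily what cowllama's --health does), assuming the default ollama port 11434:

# curl's exit status tells us whether the API answered
if curl -fsS http://localhost:11434/api/tags > /dev/null; then
  echo "ollama API reachable"
else
  echo "ollama API unreachable"
fi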
