
Ollama with ROCm support #1488

Open
Gutifarra opened this issue Aug 11, 2024 · 6 comments
Labels
enhancement New feature or request

Comments

@Gutifarra

Describe the bug

First off, thanks for your support and for the work done with bazzite.

I've been looking through various posts on the Discourse which seem to indicate that:

  • There was a ujust install script included in Bazzite to install ollama.
  • This script was later removed as it was to be upstreamed to bluefin.

This should mean that the ujust ollama command mentioned in, e.g., this post should work, but for some reason it is not present on my system:

username@bazzite:/usr/share/ublue-os/just$ grep ollama ./*
username@bazzite:/usr/share/ublue-os/just$ 

Not sure if this is expected behavior, if there is some other way of installing ollama (the install script from the ollama site does not support ROCm), or if this is some bug in my setup.

Thanks for any support you can give, and apologies if I've made any mistakes.

What did you expect to happen?

Ollama to be installed with ROCm support.

Output of rpm-ostree status

[username]:/usr/share/ublue-os$ rpm-ostree status
State: idle
Deployments:
● ostree-image-signed:docker://ghcr.io/ublue-os/bazzite:stable
                   Digest: sha256:21573bb1aea958466ad2f0065d8a898a2c29485aa57a68324aa39e4dc847bb89
                  Version: v3.6-40.20240809.0 (2024-08-09T21:13:31Z)
          LayeredPackages: sunshine
            LocalPackages: lact-0.5.5-0.x86_64

  ostree-image-signed:docker://ghcr.io/ublue-os/bazzite:stable
                   Digest: sha256:21573bb1aea958466ad2f0065d8a898a2c29485aa57a68324aa39e4dc847bb89
                  Version: v3.6-40.20240809.0 (2024-08-09T21:13:31Z)
          LayeredPackages: sunshine
            LocalPackages: lact-0.5.5-0.x86_64

Hardware

AMD 5600X
32GB RAM
AMD 6700XT

Extra information or context

No response

@dosubot dosubot bot added the bug label Aug 11, 2024
@muddyfeather

I'm just a random internet stranger, but I've recently opened a PR to update the ollama and open-webui ujust commands in bluefin, and once that's in, I'm planning a PR to bring the just file over to bazzite too: ublue-os/bluefin#1588

But FYI for now: if you use the just file from bluefin/just/bluefin-tools.just on bazzite (just --justfile bluefin-tools.just ollama), it will work if you want to get it up and running right now on your machine.
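For reference, a minimal fetch-and-run sketch (assuming the file still lives at just/bluefin-tools.just on the bluefin main branch; adjust the path if the PR moves it):

# Grab the justfile from the bluefin repo (raw URL assumed from the path above)
curl -fLO https://raw.githubusercontent.com/ublue-os/bluefin/main/just/bluefin-tools.just

# Run the ollama recipe against the downloaded justfile
just --justfile bluefin-tools.just ollama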

@castrojo
Member

When we're done in bluefin let's just move it to the config repo, that way it's in both and centralized.

@muddyfeather

@Gutifarra would you be willing to do a little testing on this PR? ublue-os/bluefin#1588

Maybe just try out the recipes from the justfile, as you have an AMD GPU?

@castrojo added the enhancement label and removed the bug label Aug 16, 2024
@austonpramodh

austonpramodh commented Aug 20, 2024

I was able to run it in a Docker container using podman. I'll share the scripts I used in a bit.

[Update]

sudo podman run -it --name ollama --replace -p 127.0.0.1:11434:11434 -v ollama:/root/.ollama --device /dev/kfd --device /dev/dri docker.io/ollama/ollama:rocm

You would need to run it with sudo; not sure why, but even after adding the user to the render group it doesn't work.
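Once the container is up, a quick sanity check (a sketch; /api/tags is Ollama's model-listing endpoint, and llama3 is just an example model name):

# Confirm the server is answering on the published port
curl http://127.0.0.1:11434/api/tags

# Pull and chat with a model inside the container (container name from --name above)
sudo podman exec -it ollama ollama run llama3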

@austonpramodh

Looks like it was SELinux:

sudo setsebool -P container_use_devices=true

Running this allows podman to access the GPU devices without root.
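With that boolean set, the earlier command should work rootless (same flags as above, just without sudo):

# Rootless variant of the command from the previous comment
podman run -it --name ollama --replace -p 127.0.0.1:11434:11434 \
  -v ollama:/root/.ollama --device /dev/kfd --device /dev/dri \
  docker.io/ollama/ollama:rocm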

@Dark-Thoughts

I tried to get ROCm, and various tools that use ROCm, running on Bazzite and in Distrobox instances, without success. Namely Krita's new LLM implementation, and koboldcpp, which I could get to run via Distrobox but could not compile with ROCm support to enable Vulkan text generation.
ROCm is already a huge pain to get running on regular distros, but this seems like a complete roadblock for me, leaving me stuck with pure CPU generation, which is incredibly slow. It would be great to see some ujust script or something to get this going for AMD users.
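One thing worth trying on RDNA2 cards like the 6700XT (gfx1031), which ROCm doesn't officially support: a commonly reported workaround is overriding the GFX version so the gfx1030 binaries are used. A sketch building on the podman command above (the override value is an assumption; verify it against your card):

# HSA_OVERRIDE_GFX_VERSION=10.3.0 makes ROCm treat the GPU as gfx1030 (assumed workaround)
sudo podman run -it --name ollama --replace -p 127.0.0.1:11434:11434 \
  -e HSA_OVERRIDE_GFX_VERSION=10.3.0 \
  -v ollama:/root/.ollama --device /dev/kfd --device /dev/dri \
  docker.io/ollama/ollama:rocm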
