
Document NixOS system requirements? #2

Open
epigramengineer opened this issue Feb 23, 2023 · 3 comments

Comments

@epigramengineer

Hi, thanks so much for your work! I'm very excited to see this come together.

I'm getting started with this (and Stable Diffusion in general) and I'm running into some problems. When I run with invokeai-amd, I get the following error:

"hipErrorNoBinaryForGpu: Unable to find code object for all current devices!"

I've gone through the steps in https://nixos.wiki/wiki/AMD_GPU for both the HIP section and OpenCL and I get output for my graphics card when I run rocminfo.

I'm wondering whether I need something like https://github.com/nixos-rocm/nixos-rocm to make this work, or what other steps are necessary to get this running on a NixOS system?

Thanks again!

@MatthewCroughan
Member

@max-privatevoid and I don't own a wide range of hardware to test this on, so it's likely you've run into a legitimate issue that we need to fix and document. You may need to help us out here!

@max-privatevoid
Member

Hi there. The AMD version has some quirks right now, as ROCm sadly isn't as well supported as CUDA in the ML world in general. The error you're seeing should be resolved by setting an environment variable:

export HSA_OVERRIDE_GFX_VERSION=10.3.0

This tells ROCm to also consider "unsupported" GPUs as acceptable. This makes the project work on my RX 5500 XT.
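To keep the override scoped to a single run rather than your whole shell session, you can set it inline on the command line. This sketch assumes the same `invokeai-amd` entry point the reporter used:

```shell
# Tell ROCm to treat the GPU as gfx10.3.0 (RDNA2) even if it is not on
# the official support list. The variable only applies to this command:
HSA_OVERRIDE_GFX_VERSION=10.3.0 invokeai-amd
```

If the override works, you can make it permanent with a plain `export` in your shell profile, as shown above.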

The other thing you'll need to do is put libdrm somewhere the torch binary can pick it up; it should be located at /opt/amdgpu. You can set this up in your NixOS configuration through regular means such as systemd.tmpfiles.rules. For testing purposes, what I've used so far is:

$ sudo mkdir /opt
$ sudo chown max:max /opt
$ nix build nixpkgs#libdrm^out --out-link /opt/amdgpu
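The same thing can be expressed declaratively in a NixOS configuration. This is a sketch, not a tested module: it assumes `pkgs.libdrm` provides the files torch expects under /opt/amdgpu, mirroring the manual `nix build --out-link` above:

```nix
{ pkgs, ... }:
{
  # "L+" creates (and replaces if necessary) a symlink at boot, so the
  # prebuilt torch binary can find libdrm at the path it was built for.
  systemd.tmpfiles.rules = [
    "L+ /opt/amdgpu - - - - ${pkgs.libdrm}"
  ];
}
```

Unlike the manual commands, this survives garbage collection, since the symlink target is a store path referenced by your system configuration.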

In the future, we'll hopefully have the package patched properly so these hacks aren't necessary anymore.

@Mynacol

Mynacol commented Mar 31, 2023

I can confirm: with HSA_OVERRIDE_GFX_VERSION=10.3.0 it runs on my AMD RX 6600 XT with Arch Linux and nix 😲

I ran into another error, Error: unsupported locale setting, when it tries to run the invokeai-configure script.
After running nix run with LC_ALL=C it finally worked. The InvokeAI web interface even stayed in my normal locale (probably from the browser's language preference); the terminal log is of course in English.
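For anyone hitting the same locale error, the workaround above looks something like this. The flake attribute name here is an assumption based on the `invokeai-amd` command mentioned earlier; substitute whatever flake reference you actually use:

```shell
# Force the C locale only for this invocation to avoid
# "Error: unsupported locale setting" from invokeai-configure.
LC_ALL=C nix run .#invokeai-amd
```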
