GPU support #65
Comments
Hi. To use the GPU you need an Nvidia card, because the app relies on CUDA. You can find this information in the wiki or on wunjo.online. However, for AMD cards you can try the alternative ZLUDA. I haven't used ZLUDA before, but maybe it will help you. The switch between GPU and CPU is done by checking whether CUDA exists in torch, so with ZLUDA on an AMD card you need torch.cuda.is_available() to return true.
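For context, a minimal sketch of the kind of check described above (an illustration of the CUDA-availability switch, not the exact Wunjo code):

```python
import torch

# Wunjo-style switch between GPU and CPU: the GPU path is only taken
# when torch reports CUDA as available. With ZLUDA on an AMD card this
# same check must return True, otherwise the app falls back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Selected device: {device}")
```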
Thank you @wladradchenko, I had never heard of ZLUDA... I do have ROCm working on a Debian 12 install. It was quite a challenge, to be honest. Thanks again.
Just wanted to follow up. Any ideas on how to launch Wunjo so it sees the ZLUDA/AMD/torch setup? Worth noting: utilizing ZLUDA really speeds up ComfyUI under the hood somehow. Thanks.
I don't have an AMD card to test this. I think it may be related to the pip install requirements process and the specific torch version, because the briefcase package imports a CUDA build of torch into the lib folder, and that may differ from what ZLUDA needs. I would recommend running Wunjo CE with briefcase dev from the source code. You can use the video on how to run the app, or the wiki chapter on how to build the GPU version. Also, you can open the ComfyUI requirements and check which torch version and additional libraries it uses; there may be specific versions there. You would need to update those versions and add any new ones in pyproject.toml before running. I think it would be amazing for other users if there were a YouTube video showing how to run Wunjo on AMD with ZLUDA.
In the ComfyUI README I see this text:
You can add https://download.pytorch.org/whl/nightly/rocm6.2 to the Wunjo requirements and pyproject.toml and try to run. If this works, please tell me and I will add the logic to the code.
@wladradchenko I did do manual install for the pip command for ROCM6.2 and that went ok. in my wunjo venv TIA |
@brcisna pyproject.toml only needs to change if you want to use briefcase build; a simple pip install is enough for briefcase dev. Anyway, if you want to do a build on Windows, you need to change lines 149-151. After you change pyproject_gpu.toml, rename it to pyproject.toml, and then you can use briefcase build to run the app from the exe.
Two things:
2. I have a fairly new AMD card. In the documentation for my particular card I have to add the following to launch ComfyUI, otherwise I get errors: HSA_OVERRIDE_GFX_VERSION=10.3.0 python main.py I am guessing I need something similar added to launch Wunjo. Barry
@brcisna I suggest you add the following at line 11 of app.py: How do you run ComfyUI-Zluda? If you run the bat file, I see an interesting line:
Also, I think you can try running the file from ComfyUI-Zluda in the Wunjo folder next to the venv. This file. As an experiment, if the above didn't help you, you can run the installer bat from ComfyUI-Zluda in the Wunjo project folder next to the venv folder, because I see these lines:
This will also install and change the specific parameters for torch with ZLUDA in Wunjo for Windows, the same as in ComfyUI.
I am very sorry, I didn't mention that I ONLY do Linux... no Windows involved here. To run ComfyUI I created the following script (normally I would just run 'python main.py'): HSA_OVERRIDE_GFX_VERSION=10.3.0 python main.py This works without errors for my Pro W6600 card. These cards have some of the latest ray tracing features, etc. I don't have a clue how this all ties together.
I apologize... In ComfyUI-Zluda I do not even see an app.py? I do see the main.py.
@brcisna app.py is a file of Wunjo. You can use os.environ there, or set HSA_OVERRIDE_GFX_VERSION=10.3.0 in a shell script; those are equivalent. If, after everything that has been done for Linux, the application still does not start on AMD, then I do not know what is missing. Perhaps you can open a discussion with https://github.com/patientx/ComfyUI-Zluda or https://github.com/comfyanonymous/ComfyUI and ask how they have solved similar trouble with ZLUDA.
os.environ['HSA_OVERRIDE_GFX_VERSION']=10.3.0
Problem running the Wunjo app.
OK, my mistake. The GL errors were due to the remote desktop client I was using to remote into my Linux machine in the basement. I am certain ZLUDA and my GPU are being used now when launching with HSA_OVERRIDE_GFX_VERSION=10.3.0. I would like to try and streamline this; I put the line in app.py as you suggested, but I get syntax errors. Keep in mind my GPU is an AMD Radeon Pro W6600 with only 8GB of GDDR6 VRAM, just for completeness.
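The syntax error is most likely because the value is unquoted: environment variable values must be strings in Python, and a bare 10.3.0 is not a valid literal. A minimal sketch of what the override could look like near the top of app.py (assuming it runs before torch initializes the GPU):

```python
import os

# Must be set before torch/ROCm initializes the device, so keep this
# near the top of app.py. The value is a string, not a number, which is
# why the unquoted 10.3.0 raises a SyntaxError.
os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"
```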
@brcisna If you want to increase the resolution that Wunjo produces in output, you can go to the settings page. By default, it is HD.
Hi,
I am not clear whether GPU support exists in the CE version of Wunjo AI V2 (on Linux Debian)?
Maybe it is only for Nvidia cards?
I have an AMD Pro W6600 GPU card and it is never being utilized in Wunjo AI V2.
TIA