"cgroups v2" Error #14
Comments
Hmm, can you try adding that argument? Another possibility is that unRAID doesn't support cgroups v2, in which case that's a bit surprising, but I could add a workaround for it.
Same result when adding that argument to the docker run command.
I experienced the same error with WSL Ubuntu. I resolved it by adding a line to my WSL configuration file, saved the file, and re-ran my docker run command. Hope this helps.
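For reference, one commonly suggested way to force cgroups v2 under WSL2 (which may or may not be the exact change described above) is to add the following to %UserProfile%\.wslconfig on the Windows side and then restart WSL with wsl --shutdown:

[wsl2]
kernelCommandLine = cgroup_no_v1=all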
Thank you. This means the problem is that the Linux kernel used in this case doesn't have cgroups v2 and is likely still using cgroups v1. I don't think it's worth making the function support both types of cgroups (cgroups v1 is antiquated and much harder to work with), but it could be set up to ignore cgroups when only cgroups v1 is in use. That means it couldn't enforce RAM limits, but that's still better than not working at all.
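As an illustration of that fallback idea, here is a minimal sketch (not the project's actual code; the function names are hypothetical and it is simplified, e.g. it assumes root and skips enabling the memory controller via cgroup.subtree_control):

import os

CGROUP_ROOT = "/sys/fs/cgroup"

def cgroups_v2_available() -> bool:
    # On a unified (v2) hierarchy the root exposes cgroup.controllers;
    # on cgroups v1 this file does not exist.
    return os.path.exists(os.path.join(CGROUP_ROOT, "cgroup.controllers"))

def apply_memory_limit(name: str, limit_bytes: int) -> bool:
    # Returns True if a memory.max limit was applied, or False if cgroups v2
    # is unavailable and enforcement was skipped instead of erroring out.
    if not cgroups_v2_available():
        return False  # cgroups v1 only: run without a RAM limit
    path = os.path.join(CGROUP_ROOT, name)
    os.makedirs(path, exist_ok=True)
    with open(os.path.join(path, "memory.max"), "w") as f:
        f.write(str(limit_bytes))
    return True

# Example: try to cap a hypothetical "sandbox" cgroup at 512 MiB.
# applied = apply_memory_limit("sandbox", 512 * 1024 * 1024)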
I believe this is fixed as of v0.8.0, if you enable the relevant setting.
Description
I'm running open-webui as a Docker container on an unRAID server.
Whenever I try to use your tool or function, I get the following error and it fails to work:
Environment needs setup work: /sys/fs/cgroup/cgroup.subtree_control not found; make sure you are using cgroups v2
General information
Linux 57091785507d 5.19.17-Unraid #2 SMP PREEMPT_DYNAMIC Wed Nov 2 11:54:15 PDT 2022 x86_64 GNU/Linux
docker run command:
docker run -d --name='open-webui' --net='bridge' --privileged=true -e TZ="America/New_York" -e HOST_OS="Unraid" -e HOST_HOSTNAME="CENSORED" -e HOST_CONTAINERNAME="open-webui" -e 'OLLAMA_BASE_URL'='http://192.168.1.115:11434' -e 'OPENAI_API_KEY'='' -l net.unraid.docker.managed=dockerman -l net.unraid.docker.webui='http://[IP]:[PORT:8080]/' -l net.unraid.docker.icon='https://raw.githubusercontent.com/open-webui/open-webui/main/static/favicon.png' -p '8086:8080/tcp' -v '/mnt/user/AnythingLLM/ollama-webui':'/app/backend/data':'rw' --runtime=nvidia --gpus all 'ghcr.io/open-webui/open-webui:cuda'
Debug logs
To get debug logs, please follow these steps:
1. Change self._debug = False to self._debug = True in the code.
2. Download the chat (Download → Export chat (json)).
3. Attach the .json file to this bug report.
chat-export-1727105479309.json
Additional context
[Add any other context about the problem here.]