"cgroups v2" Error #14

Closed
Timboman opened this issue Sep 23, 2024 · 5 comments

@Timboman

Timboman commented Sep 23, 2024

Description

I'm running Open WebUI as a Docker container on an unRAID server.
Whenever I try to use your tool or function, it fails with the following error:

Environment needs setup work: /sys/fs/cgroup/cgroup.subtree_control not found; make sure you are using cgroups v2

General information

  • Open WebUI version: v0.3.23
  • Tool/function version: 0.6.0
  • Open WebUI setup:
    • Kernel information: Linux 57091785507d 5.19.17-Unraid #2 SMP PREEMPT_DYNAMIC Wed Nov 2 11:54:15 PDT 2022 x86_64 GNU/Linux
    • Runtime: Docker
    • If running in Docker:
      • Docker version: Docker version 20.10.21, build baeda1f
      • docker run command: docker run -d --name='open-webui' --net='bridge' --privileged=true -e TZ="America/New_York" -e HOST_OS="Unraid" -e HOST_HOSTNAME="CENSORED" -e HOST_CONTAINERNAME="open-webui" -e 'OLLAMA_BASE_URL'='http://192.168.1.115:11434' -e 'OPENAI_API_KEY'='' -l net.unraid.docker.managed=dockerman -l net.unraid.docker.webui='http://[IP]:[PORT:8080]/' -l net.unraid.docker.icon='https://raw.githubusercontent.com/open-webui/open-webui/main/static/favicon.png' -p '8086:8080/tcp' -v '/mnt/user/AnythingLLM/ollama-webui':'/app/backend/data':'rw' --runtime=nvidia --gpus all 'ghcr.io/open-webui/open-webui:cuda'
      • Docker container info:
[
    {
        "Id": "21fd7532bfd7d499b030a1e915149299f81ff583625d0bbefc85f6f45710610b",
        "Created": "2024-09-23T15:21:41.803461556Z",
        "Path": "bash",
        "Args": [
            "start.sh"
        ],
        "State": {
            "Status": "running",
            "Running": true,
            "Paused": false,
            "Restarting": false,
            "OOMKilled": false,
            "Dead": false,
            "Pid": 461,
            "ExitCode": 0,
            "Error": "",
            "StartedAt": "2024-09-23T15:21:43.173669061Z",
            "FinishedAt": "0001-01-01T00:00:00Z",
            "Health": {
                "Status": "healthy",
                "FailingStreak": 0,
                "Log": [
                    {
                        "Start": "2024-09-23T11:22:13.174124939-04:00",
                        "End": "2024-09-23T11:22:13.601104099-04:00",
                        "ExitCode": 0,
                        "Output": "true\n"
                    }
                ]
            }
        },
        "Image": "sha256:d0422a3fe0127514e7d571effa6d6f5cade3c66bb9dda8c81782b870119dc9cf",
        "ResolvConfPath": "/var/lib/docker/containers/21fd7532bfd7d499b030a1e915149299f81ff583625d0bbefc85f6f45710610b/resolv.conf",
        "HostnamePath": "/var/lib/docker/containers/21fd7532bfd7d499b030a1e915149299f81ff583625d0bbefc85f6f45710610b/hostname",
        "HostsPath": "/var/lib/docker/containers/21fd7532bfd7d499b030a1e915149299f81ff583625d0bbefc85f6f45710610b/hosts",
        "LogPath": "/var/lib/docker/containers/21fd7532bfd7d499b030a1e915149299f81ff583625d0bbefc85f6f45710610b/21fd7532bfd7d499b030a1e915149299f81ff583625d0bbefc85f6f45710610b-json.log",
        "Name": "/open-webui",
        "RestartCount": 0,
        "Driver": "btrfs",
        "Platform": "linux",
        "MountLabel": "",
        "ProcessLabel": "",
        "AppArmorProfile": "",
        "ExecIDs": null,
        "HostConfig": {
            "Binds": [
                "/mnt/user/AnythingLLM/ollama-webui:/app/backend/data:rw"
            ],
            "ContainerIDFile": "",
            "LogConfig": {
                "Type": "json-file",
                "Config": {}
            },
            "NetworkMode": "bridge",
            "PortBindings": {
                "8080/tcp": [
                    {
                        "HostIp": "",
                        "HostPort": "8086"
                    }
                ]
            },
            "RestartPolicy": {
                "Name": "no",
                "MaximumRetryCount": 0
            },
            "AutoRemove": false,
            "VolumeDriver": "",
            "VolumesFrom": null,
            "CapAdd": null,
            "CapDrop": null,
            "CgroupnsMode": "host",
            "Dns": [],
            "DnsOptions": [],
            "DnsSearch": [],
            "ExtraHosts": null,
            "GroupAdd": null,
            "IpcMode": "private",
            "Cgroup": "",
            "Links": null,
            "OomScoreAdj": 0,
            "PidMode": "",
            "Privileged": true,
            "PublishAllPorts": false,
            "ReadonlyRootfs": false,
            "SecurityOpt": [
                "label=disable"
            ],
            "UTSMode": "",
            "UsernsMode": "",
            "ShmSize": 67108864,
            "Runtime": "nvidia",
            "ConsoleSize": [
                0,
                0
            ],
            "Isolation": "",
            "CpuShares": 0,
            "Memory": 0,
            "NanoCpus": 0,
            "CgroupParent": "",
            "BlkioWeight": 0,
            "BlkioWeightDevice": [],
            "BlkioDeviceReadBps": null,
            "BlkioDeviceWriteBps": null,
            "BlkioDeviceReadIOps": null,
            "BlkioDeviceWriteIOps": null,
            "CpuPeriod": 0,
            "CpuQuota": 0,
            "CpuRealtimePeriod": 0,
            "CpuRealtimeRuntime": 0,
            "CpusetCpus": "",
            "CpusetMems": "",
            "Devices": [],
            "DeviceCgroupRules": null,
            "DeviceRequests": [
                {
                    "Driver": "",
                    "Count": -1,
                    "DeviceIDs": null,
                    "Capabilities": [
                        [
                            "gpu"
                        ]
                    ],
                    "Options": {}
                }
            ],
            "KernelMemory": 0,
            "KernelMemoryTCP": 0,
            "MemoryReservation": 0,
            "MemorySwap": 0,
            "MemorySwappiness": null,
            "OomKillDisable": false,
            "PidsLimit": null,
            "Ulimits": null,
            "CpuCount": 0,
            "CpuPercent": 0,
            "IOMaximumIOps": 0,
            "IOMaximumBandwidth": 0,
            "MaskedPaths": null,
            "ReadonlyPaths": null
        },
        "GraphDriver": {
            "Data": null,
            "Name": "btrfs"
        },
        "Mounts": [
            {
                "Type": "bind",
                "Source": "/mnt/user/AnythingLLM/ollama-webui",
                "Destination": "/app/backend/data",
                "Mode": "rw",
                "RW": true,
                "Propagation": "rprivate"
            }
        ],
        "Config": {
            "Hostname": "21fd7532bfd7",
            "Domainname": "",
            "User": "0:0",
            "AttachStdin": false,
            "AttachStdout": false,
            "AttachStderr": false,
            "ExposedPorts": {
                "8080/tcp": {}
            },
            "Tty": false,
            "OpenStdin": false,
            "StdinOnce": false,
            "Env": [
                "HOST_HOSTNAME=CENSORED",
                "HOST_CONTAINERNAME=open-webui",
                "OLLAMA_BASE_URL=http://192.168.1.115:11434",
                "OPENAI_API_KEY=",
                "TZ=America/New_York",
                "HOST_OS=Unraid",
                "PATH=/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                "LANG=C.UTF-8",
                "GPG_KEY=A035C8C19219BA821ECEA86B64E628F8D684696D",
                "PYTHON_VERSION=3.11.10",
                "ENV=prod",
                "PORT=8080",
                "USE_OLLAMA_DOCKER=false",
                "USE_CUDA_DOCKER=true",
                "USE_CUDA_DOCKER_VER=cu121",
                "USE_EMBEDDING_MODEL_DOCKER=sentence-transformers/all-MiniLM-L6-v2",
                "USE_RERANKING_MODEL_DOCKER=",
                "OPENAI_API_BASE_URL=",
                "WEBUI_SECRET_KEY=",
                "SCARF_NO_ANALYTICS=true",
                "DO_NOT_TRACK=true",
                "ANONYMIZED_TELEMETRY=false",
                "WHISPER_MODEL=base",
                "WHISPER_MODEL_DIR=/app/backend/data/cache/whisper/models",
                "RAG_EMBEDDING_MODEL=sentence-transformers/all-MiniLM-L6-v2",
                "RAG_RERANKING_MODEL=",
                "SENTENCE_TRANSFORMERS_HOME=/app/backend/data/cache/embedding/models",
                "HF_HOME=/app/backend/data/cache/embedding/models",
                "HOME=/root",
                "WEBUI_BUILD_VERSION=f47dffe6e11c6fc63f2dc6029c4cb5e458d22fe7",
                "DOCKER=true"
            ],
            "Cmd": [
                "bash",
                "start.sh"
            ],
            "Healthcheck": {
                "Test": [
                    "CMD-SHELL",
                    "curl --silent --fail http://localhost:${PORT:-8080}/health | jq -ne 'input.status == true' || exit 1"
                ]
            },
            "Image": "ghcr.io/open-webui/open-webui:cuda",
            "Volumes": null,
            "WorkingDir": "/app/backend",
            "Entrypoint": null,
            "OnBuild": null,
            "Labels": {
                "net.unraid.docker.icon": "https://raw.githubusercontent.com/open-webui/open-webui/main/static/favicon.png",
                "net.unraid.docker.managed": "dockerman",
                "net.unraid.docker.webui": "http://[IP]:[PORT:8080]/",
                "org.opencontainers.image.created": "2024-09-21T13:37:24.774Z",
                "org.opencontainers.image.description": "User-friendly WebUI for LLMs (Formerly Ollama WebUI)",
                "org.opencontainers.image.licenses": "MIT",
                "org.opencontainers.image.revision": "f47dffe6e11c6fc63f2dc6029c4cb5e458d22fe7",
                "org.opencontainers.image.source": "https://github.com/open-webui/open-webui",
                "org.opencontainers.image.title": "open-webui",
                "org.opencontainers.image.url": "https://github.com/open-webui/open-webui",
                "org.opencontainers.image.version": "main-cuda"
            }
        },
        "NetworkSettings": {
            "Bridge": "",
            "SandboxID": "ad841821d133bfe355d8de1e3bfc95e2c40e66685f92fea3f6e9ce26e736d673",
            "HairpinMode": false,
            "LinkLocalIPv6Address": "",
            "LinkLocalIPv6PrefixLen": 0,
            "Ports": {
                "8080/tcp": [
                    {
                        "HostIp": "0.0.0.0",
                        "HostPort": "8086"
                    },
                    {
                        "HostIp": "::",
                        "HostPort": "8086"
                    }
                ]
            },
            "SandboxKey": "/var/run/docker/netns/ad841821d133",
            "SecondaryIPAddresses": null,
            "SecondaryIPv6Addresses": null,
            "EndpointID": "e0d39081b8b0419c9833e2f0a86786ea5e7f26fd1dcd3f9167765a4b71430193",
            "Gateway": "172.17.0.1",
            "GlobalIPv6Address": "",
            "GlobalIPv6PrefixLen": 0,
            "IPAddress": "172.17.0.17",
            "IPPrefixLen": 16,
            "IPv6Gateway": "",
            "MacAddress": "02:42:ac:11:00:11",
            "Networks": {
                "bridge": {
                    "IPAMConfig": null,
                    "Links": null,
                    "Aliases": null,
                    "NetworkID": "846849425f2807bd5411bdebe7a083ebb2811d6a4f9d152d16292fc93a8620ea",
                    "EndpointID": "e0d39081b8b0419c9833e2f0a86786ea5e7f26fd1dcd3f9167765a4b71430193",
                    "Gateway": "172.17.0.1",
                    "IPAddress": "172.17.0.17",
                    "IPPrefixLen": 16,
                    "IPv6Gateway": "",
                    "GlobalIPv6Address": "",
                    "GlobalIPv6PrefixLen": 0,
                    "MacAddress": "02:42:ac:11:00:11",
                    "DriverOpts": null
                }
            }
        }
    }
]

Debug logs

To get debug logs, please follow these steps:

  1. Enable debug logging in the tool or function by changing self._debug = False to self._debug = True in the code (see the sketch after these steps).
  2. Reproduce the issue in a new chat session.
  3. Download the chat session (triple-dot menu → Download → Export chat (json)).
  4. Attach the resulting .json file to this bug report.
    chat-export-1727105479309.json
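
A minimal sketch of what step 1 refers to, assuming the debug flag sits in the tool's constructor (the exact location in the real code may differ):

class Tools:
    def __init__(self):
        # Flip this to True to enable debug logging (step 1 above).
        # Illustrative only: the attribute may live elsewhere in the tool.
        self._debug = True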


@EtiennePerot
Owner

Hmm, can you add --mount=type=bind,source=/sys/fs/cgroup,target=/sys/fs/cgroup,readonly=false (as listed in "the hard way" section of the setup docs)? This shouldn't be necessary in --privileged=true mode, but perhaps I missed something.

Another possibility is that unRAID doesn't support cgroups v2, which would be a bit surprising, but I could add a workaround for it.

@Timboman
Author

Same result when adding that argument to the docker run command.
chat-export-1727116805808.json

@smuotoe

smuotoe commented Sep 23, 2024

I experienced the same error with WSL Ubuntu. I resolved this by adding this line to the .wslconfig file:

kernelCommandLine = cgroup_no_v1=all systemd.unified_cgroup_hierarchy=1

So my .wslconfig file now looks like this:

#.wslconfig
# Settings apply across all Linux distros running on WSL 2
[wsl2]

# Limits VM memory to use no more than 64 GB; this can be set as whole numbers using GB or MB
memory=64GB

# Sets the VM to use two virtual processors
# processors=8

# Switch from tmpfs to cgroup2
kernelCommandLine = cgroup_no_v1=all systemd.unified_cgroup_hierarchy=1

[experimental]
sparseVhd=true

I saved the .wslconfig file and restarted WSL (wsl --shutdown), and it worked fine afterwards.

My docker run command:

docker run --privileged=true --cgroupns=host -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Hope this helps.

@EtiennePerot
Owner

Thank you. This means the problem is that the Linux kernel used in this case doesn't have cgroups v2 and is likely still using cgroups v1. I don't think it's worth making the function support both types of cgroups (cgroups v1 is antiquated and much harder to work with), but it could be set up to ignore cgroups entirely when only cgroups v1 is available. That means it couldn't enforce RAM limits, but it's still better than not working at all.
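
Roughly, the idea would be something like this (just a sketch, not the actual implementation; the names are illustrative):

import os

def cgroups_v2_available() -> bool:
    # cgroups v2 exposes cgroup.subtree_control at the root of the unified
    # hierarchy; on a cgroups-v1-only host this file does not exist.
    return os.path.exists("/sys/fs/cgroup/cgroup.subtree_control")

# Fall back gracefully instead of erroring out on cgroups v1 hosts.
enforce_resource_limits = cgroups_v2_available()
if not enforce_resource_limits:
    print("cgroups v2 not available; running without RAM limits")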

@EtiennePerot
Owner

EtiennePerot commented Oct 3, 2024

I believe this is fixed as of v0.8.0, if you set the REQUIRE_RESOURCE_LIMITING valve to false. This disables resource limiting, so only do this in situations where doing so is OK.
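
For reference, the valve can be sketched roughly like this (assuming the standard Open WebUI Valves pattern; the actual default and description may differ):

from pydantic import BaseModel, Field

class Valves(BaseModel):
    # Set to False on hosts without cgroups v2 to run without resource limiting.
    REQUIRE_RESOURCE_LIMITING: bool = Field(
        default=True,
        description="Whether to refuse to run when resource limiting is unavailable.",
    )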
