BE SMALLER FFS #7
Annotations
1 error
build-local-llm (CUDA, cuda.Dockerfile, linux/amd64, cuda-dev)
buildx failed with: ERROR: failed to solve: process "/bin/sh -c python3 -m pip install --upgrade pip cmake scikit-build setuptools wheel --no-cache-dir && CMAKE_ARGS=\"-DLLAMA_CUBLAS=on\" FORCE_CMAKE=1 pip install llama-cpp-python --no-cache-dir && pip install --no-cache-dir -r cuda-requirements.txt" did not complete successfully: exit code: 1
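The log line above only shows that the `pip install llama-cpp-python` step exited non-zero, not why. One common cause (an assumption here, not confirmed by this log): recent versions of llama.cpp removed the `LLAMA_CUBLAS` CMake option in favor of `GGML_CUDA`, so newer `llama-cpp-python` source builds fail when passed the old flag. A minimal sketch of the corrected step, assuming this `RUN` line lives in `cuda.Dockerfile`:

```dockerfile
# Sketch only — assumes the failure is the removed LLAMA_CUBLAS flag.
# Newer llama-cpp-python source builds expect -DGGML_CUDA=on instead.
RUN python3 -m pip install --upgrade pip cmake scikit-build setuptools wheel --no-cache-dir && \
    CMAKE_ARGS="-DGGML_CUDA=on" FORCE_CMAKE=1 \
        pip install llama-cpp-python --no-cache-dir && \
    pip install --no-cache-dir -r cuda-requirements.txt
```

If the build still fails, re-running buildx with `--progress=plain` surfaces the full CMake output that this annotation truncates, which would confirm or rule out the flag as the cause.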