
BE SMALLER FFS #7

Triggered via push February 5, 2024 18:58
Status Failure
Total duration 4m 2s
Matrix: build-local-llm

Annotations

1 error
build-local-llm (CUDA, cuda.Dockerfile, linux/amd64, cuda-dev)
buildx failed with: ERROR: failed to solve: process "/bin/sh -c python3 -m pip install --upgrade pip cmake scikit-build setuptools wheel --no-cache-dir && CMAKE_ARGS=\"-DLLAMA_CUBLAS=on\" FORCE_CMAKE=1 pip install llama-cpp-python --no-cache-dir && pip install --no-cache-dir -r cuda-requirements.txt" did not complete successfully: exit code: 1
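The failing step corresponds to a single RUN instruction chaining three pip commands, so the exit code alone does not say which one broke. Below is a sketch of that stage reconstructed from the log line, split into separate RUN steps so buildx reports exactly which command fails. The base image tag and surrounding instructions are assumptions, not taken from the actual cuda.Dockerfile; only the three pip commands come from the log.

```dockerfile
# Reconstruction of the failing stage — base image and apt setup are assumed,
# only the three pip commands are taken from the buildx error above.
FROM nvidia/cuda:12.1.1-devel-ubuntu22.04

RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip git && \
    rm -rf /var/lib/apt/lists/*

# Step 1: build tooling (original log, first pip command)
RUN python3 -m pip install --upgrade pip cmake scikit-build setuptools wheel --no-cache-dir

# Step 2: compile llama-cpp-python with the cuBLAS backend.
# This needs nvcc from the CUDA toolkit, so it must run in a
# -devel image, not a -runtime one.
RUN CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 \
    pip install llama-cpp-python --no-cache-dir

# Step 3: remaining dependencies (cuda-requirements.txt, per the log)
COPY cuda-requirements.txt .
RUN pip install --no-cache-dir -r cuda-requirements.txt
```

The middle step is the usual suspect for exit code 1 here: the from-source CUDA compile fails if nvcc is missing from the build image or the runner exhausts memory during compilation. Rebuilding with `--progress=plain` surfaces the underlying CMake error above the exit-code line.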