Update llamacpp
Josh-XT committed Aug 10, 2024
1 parent c325801 commit c3f0789
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion cuda.Dockerfile
@@ -13,7 +13,7 @@ ENV HOST=0.0.0.0 \
     CUDA_DOCKER_ARCH=all \
     LLAMA_CUBLAS=1 \
     GGML_CUDA=1
-RUN CMAKE_ARGS="-DGGML_CUDA=on" FORCE_CMAKE=1 pip install llama-cpp-python==0.2.85 --no-cache-dir
+RUN CMAKE_ARGS="-DGGML_CUDA=on" FORCE_CMAKE=1 pip install llama-cpp-python==0.2.87 --no-cache-dir
 RUN git clone https://github.com/Josh-XT/DeepSeek-VL deepseek
 RUN pip install torch==2.3.1+cu121 torchaudio==2.3.1+cu121 --index-url https://download.pytorch.org/whl/cu121
 COPY cuda-requirements.txt .
2 changes: 1 addition & 1 deletion requirements.txt
@@ -25,4 +25,4 @@ optimum
 onnx
 diffusers[torch]
 torchaudio==2.3.1
-llama-cpp-python==0.2.83
+llama-cpp-python==0.2.87