[BUG] Quickstart Ollama External API - GET http://ollama:11434/api/tags "HTTP/1.1 403 Forbidden" #2066
Comments
Same here on Windows 10. A temporary fix could be to change
Can you try disabling auto-pull images in
I am using Ollama 0.3.8 and getting the same issue. I also tried disabling auto-pull images, with no luck.
Same issue here. =/tmp/ollama2036586951/runners
Even when cloning the repo with the fix, I still get the same 403 error. @jaluma, is that to be expected? |
@MandarUkrulkar, check that you changed both. You might also need to run
You have run
@jaluma, thanks for the reply. Indeed, I did not have. In this thread there is also a 503, which seems to occur because Traefik is not yet ready. I added a simple healthcheck and a depends_on condition, and PrivateGPT works. My docker-compose modifications are below.
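The compose snippet itself did not survive extraction. As a rough sketch of the change described above, assuming the reverse-proxy service is named traefik, Traefik's ping endpoint is enabled, and the application service is named private-gpt (all assumptions, not taken from the thread):

```yaml
# Hypothetical docker-compose fragment; service names are assumptions,
# adjust them to match your actual compose file.
services:
  traefik:
    # ... existing traefik configuration ...
    healthcheck:
      # Uses Traefik's built-in healthcheck command, which probes the
      # ping endpoint; requires ping to be enabled in the Traefik config.
      test: ["CMD", "traefik", "healthcheck", "--ping"]
      interval: 5s
      timeout: 3s
      retries: 5

  private-gpt:
    # ... existing private-gpt configuration ...
    depends_on:
      traefik:
        # Wait until the healthcheck above passes before starting,
        # avoiding the 503s seen while Traefik is still coming up.
        condition: service_healthy
```

Note that condition: service_healthy only has an effect when the depended-on service actually defines a healthcheck, as above.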
@meng-hui
Pre-check
Description
Following the Quickstart documentation provided here for Ollama External API on macOS results in a 403 error in the PrivateGPT container when attempting to communicate with Ollama.
I've verified that Ollama is running locally by visiting http://localhost:11434/ and receiving the customary "Ollama is running".
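For context, Ollama binds to 127.0.0.1 by default and rejects requests from unexpected hosts/origins, which can surface as a 403 when a containerized client calls the host's Ollama. A sketch of the host-side environment settings commonly suggested for this situation (variable names are from Ollama's documented configuration; whether they resolve this particular 403 is an assumption, so verify against your Ollama version):

```
OLLAMA_HOST=0.0.0.0   # listen on all interfaces so containers can reach the API
OLLAMA_ORIGINS=*      # relax origin checking for requests from other hosts
```

On macOS, Ollama's FAQ suggests setting such variables with launchctl setenv and then restarting the Ollama app so they take effect.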
Let me know if there's any additional info I can provide that would be helpful, thanks!
Steps to Reproduce
Expected Behavior
Successful access to Ollama locally installed on host from PrivateGPT
Actual Behavior
HTTP 403 error after running docker-compose --profile ollama-api up, followed by the container exiting
Environment
macOS 14.6.1, Ollama 0.3.6, ollama-api profile
Additional Information
No response
Version
0.6.2
Setup Checklist
NVIDIA GPU Setup Checklist
- Run nvidia-smi to verify the NVIDIA driver is working.
- Run sudo docker run --rm --gpus all nvidia/cuda:11.0.3-base-ubuntu20.04 nvidia-smi to verify that Docker can access the GPU.