diff --git a/comps/llms/text-generation/ollama/README.md b/comps/llms/text-generation/ollama/README.md
index 1ad636098..0ff46cc82 100644
--- a/comps/llms/text-generation/ollama/README.md
+++ b/comps/llms/text-generation/ollama/README.md
@@ -40,7 +40,7 @@ All of your local models are automatically served on localhost:11434. Run ollama
 Send an application/json request to the API endpoint of Ollama to interact.
 
 ```bash
-curl http://localhost:11434/api/generate -d '{
+curl --noproxy "*" http://localhost:11434/api/generate -d '{
   "model": "llama3",
   "prompt":"Why is the sky blue?"
 }'
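
The added `--noproxy "*"` flag tells curl to bypass any proxy configured via the `http_proxy`/`https_proxy` environment variables for all hosts, so the request reaches the local Ollama server directly instead of being routed through the proxy. As a sketch of an alternative (not part of this diff), the same effect can be achieved with the standard `no_proxy` environment variable, which curl also honors:

```bash
# Assumes a proxy is configured via http_proxy/https_proxy; exempt local
# addresses so requests to the Ollama server at localhost:11434 skip it.
export no_proxy="localhost,127.0.0.1"

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?"
}'
```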