Commit
Add cmds to restart ollama service and add proxy settings while launching docker (#438)

Signed-off-by: Wang, Kai Lawrence <[email protected]>
wangkl2 authored Aug 17, 2024
1 parent 5bd8bda commit 8eb8b6a
Showing 1 changed file with 18 additions and 10 deletions.
28 changes: 18 additions & 10 deletions comps/llms/text-generation/ollama/README.md
@@ -15,17 +15,25 @@ Follow [these instructions](https://github.com/ollama/ollama) to set up and run
Note:
Special settings are necessary to pull models behind the proxy.

- Step 1: Modify the ollama service configuration file.

```bash
sudo vim /etc/systemd/system/ollama.service
```

Add your proxy to the configuration file:

```ini
[Service]
Environment="http_proxy=${your_proxy}"
Environment="https_proxy=${your_proxy}"
```

- Step 2: Reload the systemd configuration and restart the ollama service.

```bash
sudo systemctl daemon-reload
sudo systemctl restart ollama
```
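Instead of hand-editing the unit file, the same proxy settings can be installed as a systemd drop-in via `sudo systemctl edit ollama`, which survives package upgrades. A sketch of the resulting drop-in file (the path is what `systemctl edit` creates by default; the proxy URL is a placeholder):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# Created by `sudo systemctl edit ollama`; the proxy URL below is a placeholder.
[Service]
Environment="http_proxy=http://proxy.example.com:3128"
Environment="https_proxy=http://proxy.example.com:3128"
```

After saving, reload and restart as in Step 2; `systemctl show ollama --property=Environment` can be used to confirm the variables took effect.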

## Usage

@@ -56,7 +64,7 @@ docker build --no-cache -t opea/llm-ollama:latest --build-arg https_proxy=$https
# Run the Ollama Microservice

```bash
docker run --network host -e http_proxy=$http_proxy -e https_proxy=$https_proxy opea/llm-ollama:latest
```
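The `-e` flags above forward the host's proxy variables into the container so it can reach external endpoints from behind the proxy. A minimal sketch composing the same command from an explicit proxy value (the URL is a placeholder; substitute your own):

```shell
# Compose the docker run command with proxy settings forwarded into the container.
# The proxy URL is a placeholder for illustration; substitute your real proxy.
http_proxy="http://proxy.example.com:3128"
https_proxy="http://proxy.example.com:3128"
proxy_args="-e http_proxy=$http_proxy -e https_proxy=$https_proxy"
cmd="docker run --network host $proxy_args opea/llm-ollama:latest"
echo "$cmd"
```

With `--network host` the container shares the host network namespace, so `localhost` inside the container reaches the Ollama service running on the host.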

# Consume the Ollama Microservice
