
Docker Compose can't find local Ollama on Linux #32

Closed · psschwei opened this issue Nov 6, 2024 · 4 comments · Fixed by #52
@psschwei (Contributor) commented Nov 6, 2024

Trying to use the bee-stack with local ollama on Linux runs into a few issues.

On Windows/Mac, the default value of OLLAMA_HOST (http://host.docker.internal:11434) works fine. However, this doesn't work on Linux without some additional config (using extra_hosts to map host.docker.internal to host-gateway), and even then there's still some network config needed (since I don't think we're using the default bridge network, the extra_hosts mapping doesn't help us).
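As a rough sketch of that workaround (the bee-api service name is borrowed from later in this thread, and whether host-gateway actually points at the host depends on the network the container is attached to):

```yaml
# Sketch only: add to the service(s) that need to reach Ollama on the host.
services:
  bee-api:                        # service name assumed from this thread
    extra_hosts:
      # Maps host.docker.internal to the gateway of the container's network;
      # on the default bridge that is the host, on a custom network it may not be.
      - "host.docker.internal:host-gateway"
```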

Ideally, we'd have a config that would work across Mac, Windows, and Linux. Though depending on how soon #27 is ready, it may make sense to focus there rather than making more tweaks to the compose.

psschwei self-assigned this Nov 20, 2024
@jezekra1 (Collaborator) commented

Did you change the Ollama server host address when starting Ollama, i.e. OLLAMA_HOST=0.0.0.0:11434?
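For anyone following along, that means starting the server bound to all interfaces rather than the default 127.0.0.1, roughly like this (the systemd unit name assumes the official Linux installer):

```sh
# Bind Ollama to all interfaces so containers can reach it
# (the default 127.0.0.1 is unreachable from inside a container).
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# If Ollama runs as a systemd service (official Linux install), set it there instead:
sudo systemctl edit ollama.service   # add under [Service]: Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl restart ollama
```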

Some networking issues are definitely possible; this has always been an issue across platforms, and I remember struggling with it a lot on Linux. It can also be the firewall: if you use ufw, you need to add a firewall rule for the Docker network, or try disabling it temporarily and see if that works.
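A minimal sketch of such a rule, assuming the default Docker bridge subnet 172.17.0.0/16 and Ollama listening on 11434 (the subnet would need adjusting for a custom compose network):

```sh
# Allow traffic from the Docker bridge subnet to Ollama on the host.
sudo ufw allow from 172.17.0.0/16 to any port 11434 proto tcp

# To find the subnet the stack's network actually uses:
docker network ls
docker network inspect <network-name> -f '{{ range .IPAM.Config }}{{ .Subnet }}{{ end }}'
```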

@psschwei (Contributor, Author) commented

I did, but still no luck.

The issue seems to stem from the fact that, on Linux, Docker Engine can run natively without the "magic" supplied by Docker Desktop (or Podman Desktop, Rancher, ...).

There seem to be three ways to move forward on Linux:

  • Add an ollama service to the compose file (see the sketch after this list). This works, but it "feels" slower when running via a container
  • Use Podman Desktop on Linux and see if it has the requisite "magic". Haven't tried this yet
  • Use some Linux-specific tweaks in the compose file, adding the extra host for host.docker.internal and setting host-gateway to the proper network via the command recommended in docker/for-linux#264 (comment). Also haven't tried this one
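For the first option, the compose addition would look roughly like this (a sketch assuming the official ollama/ollama image and default port; volume and GPU settings would need adapting to the actual stack):

```yaml
# Sketch of option 1: run Ollama as a compose service instead of on the host.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama   # persist downloaded models across restarts

volumes:
  ollama-data:
```

The other services would then point their Ollama URL at http://ollama:11434 instead of host.docker.internal.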

In any case, I don't think it makes sense to update the compose file for this, as the majority of users won't run into this. But maybe add a troubleshooting doc and include a solution there for the few who do.

@jezekra1 (Collaborator) commented Nov 26, 2024

> Add an ollama service to the compose. This works, but it "feels" slower when running via a container

I tried this on Mac and it was so slow it was unusable.

> Use some Linux-specific tweaks in the compose file, adding the extra host for host.docker.internal and setting host-gateway to the proper network via the command recommended in docker/for-linux#264 (comment). Also haven't tried this one

Could you try this with Podman, please? It should be enough to add it to the bee-api container.

@psschwei (Contributor, Author) commented

Can confirm that updating /etc/hosts so that host.docker.internal resolves to the bridge network gets Ollama working on Linux. Will update the troubleshooting doc with this info.
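For the troubleshooting doc, the fix would look roughly like this (a sketch; 172.17.0.1 is the usual default gateway of Docker's bridge network and may differ on a given machine):

```sh
# Find the gateway address of the default Docker bridge network.
docker network inspect bridge -f '{{ (index .IPAM.Config 0).Gateway }}'
# -> typically 172.17.0.1

# Map host.docker.internal to that address in the host's /etc/hosts.
echo "172.17.0.1 host.docker.internal" | sudo tee -a /etc/hosts
```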
