Failed to connect to the browser instance, will retry in 5 secs #674
I have a similar error with the latest Hoarder version. The app can be used, but when I add a bookmark, it can't retrieve any image or description.
Same question.
Is everyone using Docker Desktop? We have seen before that networking works differently on e.g. Windows and Linux.
My deployment system is Linux and this is my config file:

version: "3.8"

networks:
  traefiknet:
    external: true

services:
  web:
    image: ghcr.io/hoarder-app/hoarder:release
    restart: unless-stopped
    container_name: hoarder
    volumes:
      - /opt/mydocker/hoarder/data:/data
    ports:
      - 54110:3000
    env_file:
      - .env
    networks:
      - traefiknet
    labels:
      - traefik.docker.network=traefiknet
      - traefik.enable=true
      - traefik.http.routers.hoarder.rule=Host(`hoarder.my.domain`)
      - traefik.http.routers.hoarder.entrypoints=http,https
      - traefik.http.routers.hoarder.priority=10
      - traefik.http.routers.hoarder.tls=true
      - traefik.http.services.hoarder.loadbalancer.server.port=3000
      - traefik.http.routers.hoarder.tls.certresolver=mycloudflare

  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    container_name: chrome
    command:
      - --no-sandbox
      - --disable-gpu
      - --disable-dev-shm-usage
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
      - --hide-scrollbars
    networks:
      - traefiknet

  meilisearch:
    image: getmeili/meilisearch:v1.11.1
    restart: unless-stopped
    container_name: meilisearch
    env_file:
      - .env
    environment:
      MEILI_NO_ANALYTICS: "true"
    volumes:
      - /opt/mydocker/hoarder/meilisearch:/meili_data
    networks:
      - traefiknet
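The .env file this compose setup references is not shown in the comment. Since the error in this issue is about the web container reaching Chrome, the relevant part of it would presumably look something like the sketch below (variable names from the Hoarder docs; hostnames assume the service names above, not the reporter's actual file):

    # .env (sketch, not the reporter's actual file)
    # point the crawler at the chrome service defined in the compose file
    BROWSER_WEB_URL=http://chrome:9222
    # point the web app at the meilisearch service
    MEILI_ADDR=http://meilisearch:7700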
@Crush-RY can you share the logs from the web container?
Hmmm, it seems like there are multiple people hitting this now. So I'll label this as a bug until we figure out what's going on.
Was anyone running Hoarder before and facing this problem after an upgrade, or are these all new installations?
I've just pushed 393d097 to log more details about the connection failure. It'll take 15 minutes for the container to be built. Once it's built, can someone switch to the nightly build and capture the error for me?
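For anyone following along: switching to the nightly build should just mean changing the image tag on the web service and pulling again; a minimal sketch (the exact tag name is an assumption, not stated in this thread):

    # docker-compose.yml, web service only
    web:
      image: ghcr.io/hoarder-app/hoarder:nightly   # assumed nightly tag

After recreating the container (docker compose pull web && docker compose up -d web), the extra failure details should appear in docker compose logs -f web.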
Sure, here it is:
Hope that helps. And to answer your first question: I have only been running Hoarder for a few days, and I have had this error since the beginning.
Yeah, this is actually very helpful. I think I know how I can fix that!
So basically what's happening here is that, for one reason or another (it might be your network policies, GitHub being blocked, etc.), Hoarder is failing to download the adblock list used by the crawler. I've sent 378ad9b to ensure that this doesn't block worker startup. And in your case, you might also want to set
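The setting referenced at the end of that comment was cut off in this capture. Based on the crawler options in the Hoarder docs, it was presumably something along these lines (variable name is an assumption, not quoted from the thread):

    # .env — assumed knob to skip the adblock-list download in the crawler
    CRAWLER_ENABLE_ADBLOCKER=false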
Thanks again for your very quick answer. I tried the fix you pushed and I also added the line you suggested to the .env file, but unfortunately it does not work. Here are the logs:
OK, now it's clear that you have some DNS/internet problem inside the container :) Your container can't resolve DNS, and that is required for the crawler to work. This is not a Hoarder problem at this point.
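A quick way to confirm that diagnosis is to attempt a lookup from inside the web container; a minimal sketch, assuming the image ships BusyBox tools such as nslookup and wget (container name taken from the compose file above):

    # check DNS resolution from inside the hoarder container
    docker exec -it hoarder nslookup github.com

    # or, if nslookup is unavailable, test general outbound connectivity
    docker exec -it hoarder wget -q --spider https://github.com && echo "outbound OK"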
As you know, I am in China. Does Hoarder access any API that I cannot reach?
Alright, I will investigate on my side, thank you for your help! |
I'm also in China, and as you mentioned, it turns out to be a network issue. I tried deploying hoarder on a VPS without network restrictions, and it worked perfectly. Thanks a lot for your help! |
I finally managed to fix the error, and it was indeed caused by a bad default Docker configuration. For people facing the same issue, I followed the steps in https://stackoverflow.com/questions/39400886/docker-cannot-resolve-dns-on-private-network and then restarted my server.
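For reference, the fix described in that Stack Overflow thread typically amounts to giving the Docker daemon explicit upstream DNS servers; a minimal sketch (the addresses are examples, use resolvers reachable from your network):

    # /etc/docker/daemon.json
    {
      "dns": ["8.8.8.8", "1.1.1.1"]
    }

    # apply the change
    sudo systemctl restart docker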
In my situation, the web container was not attached to the network that chrome and meilisearch are on.
I manually added web to that network, and it works (see the sketch below).
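For anyone hitting the same thing, the change amounts to attaching the web service to the same compose network as chrome and meilisearch, so the BROWSER_WEB_URL hostname can resolve; a rough sketch using the network name from the compose file above (adapt to your own setup):

    services:
      web:
        networks:
          - traefiknet   # same user-defined network as the chrome and meilisearch services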
I had the same DNS issue with alpine-chrome. Adding the launch parameter
But that seems to be inconsequential (for now), so the workaround is fine.
Describe the Bug
https://docs.hoarder.app/Installation/docker
I tried to run Hoarder with Docker Compose, but it failed.
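For context, following the linked docs the failing step is just bringing the stack up and checking the web container's output; a minimal sketch of the commands involved (run from the directory containing docker-compose.yml and .env):

    # start the stack in the background
    docker compose up -d

    # watch the web container for the "Failed to connect to the browser instance" error
    docker compose logs -f web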
Steps to Reproduce
Expected Behaviour
http://localhost:3000/ loads correctly.
Screenshots or Additional Context
Device Details
Microsoft Edge Version 131.0.2903.48 (Official build) (x86_64) on macOS
Exact Hoarder Version
release