
[Bug] Failed to connect to Reaper from 9.0.1 #439

Closed
RareBodhi opened this issue Jan 19, 2023 · 4 comments · Fixed by #483
Labels
bug Something isn't working

Comments

@RareBodhi

Expected Behaviour

The Reaper connects as it did in 9.0.0 and earlier. I have stress-tested older versions with multiple test re-runs in the same environment and see no issues.

Actual Behaviour

The error log shows `Failed to connect to Reaper [id]`; internally it goes through multiple retries, each failing with `connect ECONNREFUSED ::1:32790`.

I believe this is a result of this patch, https://github.com/testcontainers/testcontainers-node/pull/412/files, which has introduced some kind of socket issue, though I am not sure what specifically within the code change is causing it.

Testcontainer Logs

02:16:37.256Z testcontainers DEBUG Found applicable Docker client strategy: UnixSocketStrategy
02:16:37.257Z testcontainers DEBUG Testing Docker client strategy URI: unix:///var/run/docker.sock
02:16:37.297Z testcontainers INFO  Using Docker client strategy: UnixSocketStrategy, Docker host: localhost
02:16:37.297Z testcontainers DEBUG Fetching system diagnostics
02:16:37.302Z testcontainers DEBUG Found applicable registry auth locator for registry "https://index.docker.io/v1/": Auths
02:16:37.303Z testcontainers DEBUG Checking if image exists: postgres:14.4-alpine
02:16:37.309Z testcontainers DEBUG Not pulling image as it already exists: postgres:14.4-alpine
02:16:37.310Z testcontainers DEBUG Creating new Reaper for session: 8f6318d4de4e0c719345b275ea7ba304
02:16:37.313Z testcontainers DEBUG Re-using cached auth for registry https://index.docker.io/v1/
02:16:37.313Z testcontainers DEBUG Checking if image exists: testcontainers/ryuk:0.3.2
02:16:37.427Z testcontainers DEBUG Not pulling image as it already exists: testcontainers/ryuk:0.3.2
02:16:37.428Z testcontainers INFO  Creating container for image: testcontainers/ryuk:0.3.2
02:16:37.489Z testcontainers INFO  Starting container testcontainers/ryuk:0.3.2 with ID: ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce
02:16:37.895Z testcontainers DEBUG Waiting for container to be ready: ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce
02:16:37.895Z testcontainers DEBUG Waiting for log message "/.+ Started!/"
02:16:37.900Z testcontainers INFO  Container is ready
02:16:37.901Z testcontainers DEBUG Connecting to Reaper (attempt 1) ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce on localhost:32790
02:16:37.904Z testcontainers ERROR Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce socket error: Error: connect ECONNREFUSED ::1:32790
02:16:37.904Z testcontainers ERROR Connection to Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce closed with error
02:16:38.203Z testcontainers DEBUG System diagnostics: {"node":{"version":"v18.13.0","architecture":"x64","platform":"linux"},"docker":{"serverVersion":"20.10.22+azure-1","operatingSystem":"Ubuntu 22.04.1 LTS","operatingSystemType":"linux","architecture":"x86_64","cpus":2,"memory":7281278976},"dockerCompose":{"version":"1.29.2"}}
02:16:38.906Z testcontainers DEBUG Connecting to Reaper (attempt 2) ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce on localhost:32790
02:16:38.908Z testcontainers ERROR Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce socket error: Error: connect ECONNREFUSED ::1:32790
02:16:38.908Z testcontainers ERROR Connection to Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce closed with error
02:16:39.909Z testcontainers DEBUG Connecting to Reaper (attempt 3) ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce on localhost:32790
02:16:39.911Z testcontainers ERROR Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce socket error: Error: connect ECONNREFUSED ::1:32790
02:16:39.911Z testcontainers ERROR Connection to Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce closed with error
02:16:40.912Z testcontainers DEBUG Connecting to Reaper (attempt 4) ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce on localhost:32790
02:16:40.913Z testcontainers ERROR Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce socket error: Error: connect ECONNREFUSED ::1:32790
02:16:40.913Z testcontainers ERROR Connection to Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce closed with error
02:16:41.914Z testcontainers DEBUG Connecting to Reaper (attempt 5) ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce on localhost:32790
02:16:41.916Z testcontainers ERROR Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce socket error: Error: connect ECONNREFUSED ::1:32790
02:16:41.916Z testcontainers ERROR Connection to Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce closed with error
[fatal]: Error: Failed to connect to Reaper ae242fdb1f5947148a8905de929dc1f80d5120cd93776bc4221a454ae43254ce
    at /home/runner/work/backend/backend/node_modules/testcontainers/dist/reaper.js:126:56
    at IntervalRetryStrategy.<anonymous> (/home/runner/work/backend/backend/node_modules/testcontainers/dist/retry-strategy.js:37:28)
    at Generator.next (<anonymous>)
    at fulfilled (/home/runner/work/backend/backend/node_modules/testcontainers/dist/retry-strategy.js:5:58)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)

Steps to Reproduce

  1. Ensure you are using testcontainers version 9.0.1 or later.

  2. Run testcontainers with simple Postgres and Redis startups defined by the following functions; we see infrequent failures during container startup that cause the test suite to fail.

  3. The failure is not consistent, occurring perhaps 20% of the time, and I have not been able to reproduce it locally.

import { GenericContainer, PostgreSqlContainer, Wait } from "testcontainers";

export function createPostgresContainer(): PostgreSqlContainer {
  return new PostgreSqlContainer("postgres:14.4-alpine");
}

export function createRedisContainer(): GenericContainer {
  return new GenericContainer("redis:7-alpine")
    .withExposedPorts(6379)
    .withWaitStrategy(Wait.forLogMessage("Ready to accept connections"));
}

Environment Information

The issue only appears after upgrading to 9.0.1, which contains this PR; it is resolved by downgrading to 9.0.0 or lower:

https://github.com/testcontainers/testcontainers-node/pull/412/files

  • Operating System: Ubuntu 22.04.1 ( x86_64 Linux - Github Actions )
  • Docker Version: 20.10.22
  • Node version: 18.13
  • Testcontainers version: 9.0.1
@cristianrgreco cristianrgreco added the triage Investigation required label Jan 19, 2023
@sebaplaza

sebaplaza commented Jan 19, 2023

Same problem here.

9.0.0 was just hiding the error, not resolving the underlying issue.

Locally, everything works fine.

I'm hitting this in a GitLab CI Docker runner (docker-in-docker, with the Docker socket mapped to the host); it seems to be related to concurrent instances, but I'm not sure.

2023-01-19T09:56:16.492Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: 2023/01/19 09:56:16 Pinging Docker...
2023-01-19T09:56:16.498Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: 2023/01/19 09:56:16 Docker daemon is available!
2023-01-19T09:56:16.499Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: 2023/01/19 09:56:16 Starting on port 8080...
2023-01-19T09:56:16.499Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: 2023/01/19 09:56:16 Started!
2023-01-19T09:56:16.506Z testcontainers DEBUG Waiting for container to be ready: 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17
2023-01-19T09:56:16.508Z testcontainers DEBUG Waiting for log message "/.+ Started!/"
2023-01-19T09:56:16.520Z testcontainers INFO  Container is ready
2023-01-19T09:56:16.520Z testcontainers DEBUG Connecting to Reaper (attempt 1) 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17 on 172.17.0.1:32790
2023-01-19T09:57:16.490Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: panic: Timed out waiting for the first connection
2023-01-19T09:57:16.491Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: 
2023-01-19T09:57:16.491Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: goroutine 1 [running]:
2023-01-19T09:57:16.491Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: main.main()
2023-01-19T09:57:16.491Z testcontainers:containers TRACE 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17: /go/src/github.com/testcontainers/moby-ryuk/main.go:50 +0x449
2023-01-19T09:58:27.315Z testcontainers ERROR Reaper 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17 socket error: Error: connect ETIMEDOUT 172.17.0.1:32790
2023-01-19T09:58:27.315Z testcontainers ERROR Connection to Reaper 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17 closed with error
reason: Failed to connect to Reaper 772ad226bab09bf7c9c1927c8a0245df09d9475f8e6c4b0430cb9e1460d65d17
    at .../.pnpm/[email protected]/node_modules/testcontainers/dist/reaper.js:126:56
    at IntervalRetryStrategy.<anonymous> (.../.pnpm/[email protected]/node_modules/testcontainers/dist/retry-strategy.js:37:28)
    at Generator.next (<anonymous>)
    at fulfilled (.../.pnpm/[email protected]/node_modules/testcontainers/dist/retry-strategy.js:5:58)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
  • Operating System: Ubuntu 20.04.5 ( Gitlab CI )
  • Docker Version: 20.10.22, build 3a2c30b
  • Node version: 16.9.0
  • Testcontainers version: 9.1.1

@henrik242

SEO: I had the same problem when using testcontainers-node 9.1.1 together with Colima. The cause was that Colima didn't support localhost over IPv6. I found two separate workarounds: either comment out `::1 localhost` in /etc/hosts, or start Testcontainers with `TESTCONTAINERS_HOST_OVERRIDE=127.0.0.1`.

Also commented here: abiosoft/colima#583
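The environment-variable workaround above can be applied wherever the test runner is launched. A minimal sketch (`TESTCONTAINERS_HOST_OVERRIDE` is a real Testcontainers variable; the surrounding shell session is illustrative):

```shell
# Force Testcontainers to connect via IPv4 instead of letting "localhost"
# resolve to ::1. Set this in the environment before running the test suite.
export TESTCONTAINERS_HOST_OVERRIDE=127.0.0.1
```

With this set, Testcontainers skips host detection and uses 127.0.0.1 directly, so the IPv6 port-binding mismatch never comes into play.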

@cristianrgreco
Collaborator

cristianrgreco commented Feb 28, 2023

It's possible I've reproduced the issue in the pipeline (https://github.com/testcontainers/testcontainers-node/actions/runs/4293412705/jobs/7481054866); I will try to look into it. If anyone is able to reproduce this consistently, or can point me in some direction, I'd be very grateful!

@cristianrgreco cristianrgreco added bug Something isn't working and removed triage Investigation required labels Mar 2, 2023
@cristianrgreco
Collaborator

cristianrgreco commented Mar 4, 2023

Wanted to give an update. The issue isn't with the Reaper. It is related to this issue: moby/moby#42442.

When IPv6 is enabled, Docker generates a port binding for both 0.0.0.0 and ::1. Testcontainers picks the first port binding, which appears to always be the IPv4 one.

When the Docker host is localhost, the OS usually prefers to resolve it to IPv6 (::1) first. So if Docker provides a different port binding for IPv6, we pick the first (IPv4) port but try to connect via IPv6, which does not work.
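The "does the OS prefer IPv6 for localhost" part can be observed from Node directly. A minimal sketch (the helper name is hypothetical; the result depends on the machine's resolver configuration):

```typescript
import { lookup } from "node:dns/promises";

// With verbatim: true, Node returns the address in the order the OS
// resolver reports it, i.e. the system preference. family is 4 or 6.
async function preferredFamily(host: string): Promise<number> {
  const { family } = await lookup(host, { verbatim: true });
  return family;
}
```

On a host whose /etc/hosts resolves localhost to ::1 first, `preferredFamily("localhost")` typically yields 6, which is exactly the case where the IPv4 port binding is the wrong one to use.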

There are a couple of things we can do here:

  1. Instead of setting the host to localhost, use 127.0.0.1, to ensure we always connect via IPv4 (similar to the workaround in #439 (comment)).
  2. Where the host is an IPv4 address, use the IPv4 port binding. Where the host is localhost, perform a DNS resolution to see whether IPv6 is preferred; if so, use the IPv6 port binding, otherwise use the IPv4 one.
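The second option could be sketched roughly like this (the `PortBinding` shape and function name are hypothetical illustrations, not the actual Testcontainers internals):

```typescript
// One exposed container port may have several host-side bindings when
// IPv6 is enabled, e.g. { "0.0.0.0": 32790 } and { "::": 32791 }.
type PortBinding = { hostIp: string; hostPort: number };

// Pick the binding matching the address family the host actually prefers,
// falling back to the first binding if no match is found.
function selectHostPort(bindings: PortBinding[], preferIpv6: boolean): number {
  // IPv6 host IPs ("::", "::1") contain a colon; IPv4 ones do not.
  const ipv6 = bindings.find((b) => b.hostIp.includes(":"));
  const ipv4 = bindings.find((b) => !b.hostIp.includes(":"));
  const chosen = (preferIpv6 ? ipv6 : ipv4) ?? bindings[0];
  return chosen.hostPort;
}
```

For the log above, a preference for ::1 would then select the IPv6 binding's port rather than blindly taking the first (IPv4) entry.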

I'm currently in discussion with other TC maintainers to come up with the best solution. Watch this space!
