Error response from daemon: Get "https://mcr.microsoft.com/v2/": context deadline exceeded (Client.Timeout exceeded while awaiting headers) #139

Open
viriatis opened this issue May 23, 2023 · 15 comments

@viriatis

I am trying to pull the mcr.microsoft.com/dotnet/framework/sdk:4.8.1 image with Docker from my terminal.

  1. I am using Windows 11 Pro.
  2. I switched Docker to Windows containers.
  3. I run "docker pull mcr.microsoft.com/dotnet/framework/sdk:4.8.1" in the terminal and get:

Output: "Error response from daemon: Get "https://mcr.microsoft.com/v2/": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"

@viriatis viriatis changed the title net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Error response from daemon: Get "https://mcr.microsoft.com/v2/": context deadline exceeded (Client.Timeout exceeded while awaiting headers) May 23, 2023
@rmjoia

rmjoia commented Jun 2, 2023

I'm getting similar errors at random in our pipelines; however, every time I try to run the same pull on my machine, it works fine. There are other issues open on other repos reporting the same problem.

@mahdibx

mahdibx commented Jun 21, 2023

I see similar issues when running docker pull; I've had a few different error messages:

Get "https://mcr.microsoft.com/v2/": dial tcp 204.79.197.219:443: i/o timeout

Get "https://mcr.microsoft.com/v2/": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)

@juncchen

Our team is seeing a similar issue when running docker build. There are various error messages, and they all indicate that something is wrong with network connectivity.

Step 1/4 : FROM mcr.microsoft.com/dotnet/aspnet:6.0
Head "https://mcr.microsoft.com/v2/dotnet/aspnet/manifests/6.0": dial tcp 204.79.197.219:443: i/o timeout
Step 1/4 : FROM mcr.microsoft.com/dotnet/aspnet:6.0
Get "https://mcr.microsoft.com/v2/": dial tcp 204.79.197.219:443: i/o timeout

It started on 6/19 and has been getting worse. Below is the output of docker version, in case that helps:

Client:
 Version:           20.10.18+azure-2
 API version:       1.41
 Go version:        go1.18.7
 Git commit:        b40c2f6b5deeb11ac6c485c940865ee40664f0f0
 Built:             Thu Sep  8 08:19:02 UTC 2022
 OS/Arch:           linux/amd64
 Context:           default
 Experimental:      true

Server:
 Engine:
  Version:          20.10.18+azure-2
  API version:      1.41 (minimum version 1.12)
  Go version:       go1.18.7
  Git commit:       e42327a6d3c55ceda3bd5475be7aae6036d02db3
  Built:            Thu Sep  8 22:50:10 2022
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.5.13+azure-2
  GitCommit:        a17ec496a95e55601607ca50828147e8ccaeebf1
 runc:
  Version:          1.1.4
  GitCommit:        5fd4c4d144137e991c4acebb2146ab1483a97925
 docker-init:
  Version:          0.19.0
  GitCommit:        
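
This is not a fix for the underlying connectivity issue, but as a stop-gap for flaky base-image pulls in CI, pre-pulling the image with a small retry loop before running docker build is a common workaround. A plain-shell sketch; the image name, tag "myapp", retry count and backoff are placeholders to adapt:

#!/bin/sh
# Pre-pull the base image with retries so one transient registry timeout
# does not fail the whole build.
IMAGE="mcr.microsoft.com/dotnet/aspnet:6.0"
n=0
until docker pull "$IMAGE"; do
    n=$((n + 1))
    if [ "$n" -ge 5 ]; then
        echo "giving up after $n failed pulls of $IMAGE" >&2
        exit 1
    fi
    echo "pull failed (attempt $n), retrying in $((n * 10))s..."
    sleep $((n * 10))
done
docker build -t myapp:latest .

With the classic builder shown in the logs above, a base image that is already present locally is used without contacting the registry again (unless --pull is passed), which is what makes the pre-pull useful.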

@ste-camp

Same here. Today and over the past few days we have been experiencing intermittent timeout issues while pulling images with docker build.

We are building Docker images using an Azure Container Registry agent pool:

Step 1/20 : FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
6.0: Pulling from dotnet/aspnet
759700526b78: Pulling fs layer
cafd06d60458: Pulling fs layer
9e65f86790b6: Pulling fs layer
217953d5b220: Pulling fs layer
9abf5ceb3cbb: Pulling fs layer
217953d5b220: Waiting
9abf5ceb3cbb: Waiting
cafd06d60458: Verifying Checksum
cafd06d60458: Download complete
217953d5b220: Verifying Checksum
217953d5b220: Download complete
9e65f86790b6: Verifying Checksum
9e65f86790b6: Download complete
9abf5ceb3cbb: Verifying Checksum
9abf5ceb3cbb: Download complete
759700526b78: Verifying Checksum
759700526b78: Download complete
759700526b78: Pull complete
cafd06d60458: Pull complete
9e65f86790b6: Pull complete
217953d5b220: Pull complete
9abf5ceb3cbb: Pull complete

error pulling image configuration: download failed after attempts=6: dial tcp 204.79.197.219:443: i/o timeout
2023/06/22 13:08:01 Container failed during run: build. No retries remaining.
failed to run step ID: build: exit status 1
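
The "download failed after attempts=6" part appears to come from the daemon's built-in download retry, which is configurable via max-download-attempts in /etc/docker/daemon.json. Where the daemon configuration is under your control (which it typically is not on hosted ACR agent pools), raising the attempt count and lowering download concurrency are sometimes suggested as mitigations for flaky links; they do not fix the underlying connectivity problem. A sketch, assuming a Linux host with root access; the values are illustrative:

# Example /etc/docker/daemon.json; this overwrites the file, so merge by
# hand if one already exists.
cat <<'EOF' | sudo tee /etc/docker/daemon.json
{
  "max-download-attempts": 10,
  "max-concurrent-downloads": 2
}
EOF
sudo systemctl restart docker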

@KurtCuckbain

I have the same problem with the exact same server (IP 204.79.197.219) while pulling mcr.microsoft.com/dotnet/aspnet:3.1 and some layers of mcr.microsoft.com/dotnet/aspnet:6.0. From my home PC, however, there is no problem pulling those images.

@mengzhiyua

[root@localhost ~]# docker pull images.houchangzao.com/****:*****
Error response from daemon: Get "https://images.houchangzao.com/v2/": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
I encountered this issue when using Docker to pull images from my company's private registry, but I can pull public images such as mysql and nginx without problems.

@jmcdade11

This has been happening frequently while leveraging Azure Container Registry's "Task Build" functionality. My registry is in Azure North Central US, but the issue has been reproduced in other regions as well. After a multi-month support ticket with the Containers team, the issue seemed to have subsided; now it's back, worse than ever.

@AndreHamilton-MSFT
Contributor

@jmcdade11 could you paste the error message you received when this occurred? Specifically, we are trying to understand:

  1. which registry this occurred with (was it mcr.microsoft.com or another ACR registry)
  2. what artifact was being pulled when this occurred
  3. whether you have timestamps for when this recently occurred

@jmcdade11

jmcdade11 commented Jan 17, 2024

@AndreHamilton-MSFT

I triggered a fresh one this morning:

##[debug]Agent running environment resource - Disk: available:470071.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 8.76
2024/01/17 13:23:16 Using acb_vol_b69d1d69-48a6-444a-95a6-f086bab209a9 as the home volume
2024/01/17 13:23:16 Setting up Docker configuration...
2024/01/17 13:23:16 Successfully set up Docker configuration
2024/01/17 13:23:16 Logging in to registry: redacted.azurecr.io
2024/01/17 13:23:17 Successfully logged into redacted.azurecr.io
2024/01/17 13:23:17 Executing step ID: build. Timeout(sec): 28800, Working directory: '', Network: ''
2024/01/17 13:23:17 Scanning for dependencies...
2024/01/17 13:23:17 Successfully scanned dependencies
2024/01/17 13:23:17 Launching container with name: build
Sending build context to Docker daemon 387.1kB

Step 1/22 : FROM mcr.microsoft.com/dotnet/aspnet:7.0-bullseye-slim AS base

##[debug]Agent running environment resource - Disk: available:470071.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 8.20
##[debug]Agent running environment resource - Disk: available:470071.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 7.70
##[debug]Agent running environment resource - Disk: available:470071.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 7.27
##[debug]Agent running environment resource - Disk: available:470071.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 6.87
##[debug]Agent running environment resource - Disk: available:470071.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 6.52
##[debug]Agent running environment resource - Disk: available:470071.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 6.21
##[debug]Agent running environment resource - Disk: available:474646.00MB out of 507934.00MB, Memory: used 32MB out of 1536MB, CPU: usage 5.93
Get "https://mcr.microsoft.com/v2/dotnet/aspnet/manifests/sha256:924ca4f007ac0c4583b2d99c93147626d7fdf388f1454e06be666482ad1c47f4": dial tcp: lookup mcr.microsoft.com: i/o timeout
2024/01/17 13:23:58 Container failed during run: build. No retries remaining.
failed to run step ID: build: exit status 1

I've also seen context deadline exceeded:

Step 4/19 : FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
Get "https://mcr.microsoft.com/v2/": context deadline exceeded
2024/01/17 00:16:42
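
Worth noting: the first failure above is a DNS lookup timeout (lookup mcr.microsoft.com: i/o timeout) rather than a TCP connect timeout, so name resolution itself timed out inside the build environment. On a self-hosted build machine the host resolver, which the daemon uses for its own registry lookups, can be checked directly; a small diagnostic sketch (not applicable on hosted ACR Tasks agents, where the underlying host is not accessible):

# Which resolvers does the build host use? The daemon's registry lookups
# go through the host resolver, not the per-container DNS settings.
cat /etc/resolv.conf

# Time the lookup; a result that takes several seconds points at an
# upstream DNS problem rather than a registry-side one.
time nslookup mcr.microsoft.com

# Compare against a well-known public resolver (address is just an example).
time nslookup mcr.microsoft.com 1.1.1.1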

@AndreHamilton-MSFT
Contributor

@jmcdade11 thanks for providing the details. One more question: is that time in UTC or a different timezone?

@jmcdade11

@AndreHamilton-MSFT It would be UTC.

@jkauppinen

We encounter this error randomly in our CI/CD pipeline. In the failing step we build the application and push the Docker image to ACR.

Step 26/30 : FROM mcr.microsoft.com/dotnet/aspnet:7.0
Get "https://mcr.microsoft.com/v2/": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
2024/01/21 10:26:05 Container failed during run: build. No retries remaining.
failed to run step ID: build: exit status 1

@ckchessmaster

ckchessmaster commented Apr 2, 2024

I am also getting a very similar error intermittently in CI/CD. The Docker build will randomly fail with:

INFO[0001] Retrieving image mcr.microsoft.com/dotnet/aspnet:8.0 from registry mcr.microsoft.com
error building image: Get "https://eastus.data.mcr.microsoft.com/aba285c624a04409823b708c7a50e7b9-jttfjm99vo//docker/registry/v2/blobs/sha256/af/af3fdf211f2d6148c5dbc566dd604fedee9d9a0ad47ac5a0d3c630a5fed98e58/data?regid=REDACTED&se=REDACTED&sig=REDACTED&sp=REDACTED&spr=REDACTED&sr=REDACTED&sv=REDACTED": dial tcp 204.79.197.219:443: connect: connection refused

Usually re-running the failed job gets it to succeed, but it's 50/50 whether it works on the first try.
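
One detail that stands out in this log: the failure is against the regional data endpoint (eastus.data.mcr.microsoft.com, which the registry redirects blob downloads to, as the URL above shows), and it is a connection refused rather than a timeout. A quick check from the failing environment can show whether both endpoints resolve and accept connections; what matters here is whether the TCP/TLS connection succeeds, not which HTTP status comes back. A sketch with standard tools:

# The manifest is served by mcr.microsoft.com; the config/layer blobs are
# redirected to a regional data endpoint, so check connectivity to both.
nslookup mcr.microsoft.com
nslookup eastus.data.mcr.microsoft.com

# -v shows the connect and TLS handshake; --max-time keeps it from hanging.
curl -sv --max-time 30 -o /dev/null https://mcr.microsoft.com/v2/
curl -sv --max-time 30 -o /dev/null https://eastus.data.mcr.microsoft.com/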

@AndreHamilton-MSFT
Contributor

@ckchessmaster has this improved recently?

@jonathan-fileread

Getting similar issues too:

"https://abcdefg.azurecr.io/v2/": context deadline exceeded
