Docker BuildKit caching w/ --cache-from fails (roughly 50% rate), even when using docker-container #2279
I did more investigation: it looks like the cache is corrupted before the upload stage. I found a difference between the good-cache and the bad-cache layer graphs:

```mermaid
graph
good --> g_38cc3[38cc3: FROM arm64]
good --> g_b71f9[b71f9: FROM amd64]
g_38cc3 --> g_48ffe[48ffe: RUN]
g_b71f9 --> g_52af5[52af5: RUN]
g_b71f9 --> g_7c10d[7c10d: COPY]
g_52af5 --> g_7c10d
g_38cc3 --> g_7c10d
g_48ffe --> g_7c10d
bad --> b_38cc3[38cc3: FROM arm64]
bad --> b_b71f9[b71f9: FROM amd64]
b_b71f9 --> b_2b352[2b352: COPY]
b_38cc3 --> b_2b352
b_b71f9 --> b_52af5[52af5: RUN]
b_52af5 --> b_ca41e[ca41e: COPY]
```
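For anyone wanting to derive such a graph themselves: the remote cache pushed by `--cache-to type=registry` is an ordinary image manifest whose layers are the cache blobs. A minimal sketch (using the placeholder registry from this thread) to dump it:

```bash
# Dump the raw cache manifest pushed by --cache-to and list its layer blobs.
REGISTRY=registry.example.net/test-docker/example
docker buildx imagetools inspect --raw "$REGISTRY:buildcache" > cache-manifest.json
jq -r '.layers[].digest' cache-manifest.json
# The parent/child links (the arrows in the graphs above) are described by
# the cache config blob (media type application/vnd.buildkit.cacheconfig.v0),
# which has to be fetched from the registry separately:
jq -r '.config.digest' cache-manifest.json
```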
I modified the script to check clean cache generation (without combining --cache-from and --cache-to in the same build):

```bash
#!/bin/bash
REGISTRY=registry.example.net/test-docker/example
# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap
# Create some changed file
date > changed.txt
# Run with cache-to only
docker buildx build \
--builder cachebug \
--cache-to type=registry,ref=$REGISTRY:buildcache,mode=max \
--platform linux/amd64 \
--platform linux/arm64 \
.
# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap
# Create some changed file
date > changed.txt
# Run with cache-from only
docker buildx build \
--builder cachebug \
--cache-from type=registry,ref=$REGISTRY:buildcache \
--platform linux/amd64 \
--platform linux/arm64 \
.
```

And this issue is still reproduced randomly.
The problem seems to occur when the builds for different platforms finish at different times, so I modified the scripts to dramatically increase the chances of reproducing the problem.

Dockerfile:

```dockerfile
FROM busybox AS stage-0
FROM stage-0 AS stage-1
RUN echo "Hello, world!!!"; if [ "$(arch)" = "x86_64" ]; then sleep 2; fi
COPY changed.txt /opt/changed.txt
FROM busybox
COPY --from=stage-1 /opt/changed.txt /opt/copy.txt
```

Test script:

```bash
#!/bin/bash
REGISTRY=registry.example.net/test-docker/example
# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap
# Create some changed file
date > changed.txt
# Build stage-0 to pull busybox image
docker buildx build \
--builder cachebug \
--platform linux/amd64 \
--platform linux/arm64 \
--target stage-0 \
.
# Run with cache-to only
docker buildx build \
--builder cachebug \
--cache-to type=registry,ref=$REGISTRY:buildcache,mode=max \
--platform linux/amd64 \
--platform linux/arm64 \
.
# Recreate builder for clear local cache
docker buildx rm cachebug || true
docker buildx create --name cachebug --driver docker-container
docker buildx inspect cachebug --bootstrap
# Create some changed file
date > changed.txt
# Run with cache-from only
docker buildx build \
--builder cachebug \
--cache-from type=registry,ref=$REGISTRY:buildcache \
--platform linux/amd64 \
--platform linux/arm64 \
.
```
I tried to save [...]. In this case I got a single layer for [...].
I independently encountered some bugs around the parallel creation of blobs for different platforms and have a fix out for PR here: #2273. I have not tried the reproducer here to see if it fixes the problem, but it may be worth a try if anyone has time.
I refactored the Dockerfile to produce a stable layer at the RUN stage:

```dockerfile
FROM alpine AS base
# The RUN command always produces a layer with digest `sha256:bcca0bb7...`
RUN echo "Hello, world!!!"; if [ "$(arch)" = "x86_64" ]; then sleep 2; fi; touch -d 2021-01-01 -t 00:00:00 /foo.txt
# The COPY command produces a different layer per platform because /opt has a different change time
COPY changed.txt /opt/changed.txt
FROM alpine
COPY --from=base /opt/changed.txt /opt/copy.txt
```

And removed [...]
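To check whether the RUN layer really does come out identical across platforms, one option (a sketch, not from this thread; file names are illustrative) is to export each platform's image as an OCI layout and diff the layer digests listed in the manifests:

```bash
# Export each platform as an OCI tarball and compare layer digests.
for p in amd64 arm64; do
  docker buildx build --builder cachebug --platform "linux/$p" \
    -o "type=oci,dest=$p.tar" .
  mkdir -p "oci-$p" && tar -xf "$p.tar" -C "oci-$p"
  # index.json points at the image manifest, which lists the layer digests.
  m=$(jq -r '.manifests[0].digest' "oci-$p/index.json" | cut -d: -f2)
  jq -r '.layers[].digest' "oci-$p/blobs/sha256/$m" > "layers-$p.txt"
done
# The RUN layer should match; only the COPY layer should differ.
diff layers-amd64.txt layers-arm64.txt
```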
I built #2273 locally and it looks like it does not fix this issue.
Ah okay, good to know, thanks for giving it a try.
I created a script to reproduce it in one click: https://github.com/bozaro/buildkit-2279
I rechecked this issue's state with the https://github.com/bozaro/buildkit-2279 script, and now it is not flaky.
Given the resolution of #1981, could you try upgrading your buildx plugin to version 0.8.2?
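For reference, checking the installed plugin version is a one-liner, and a manual upgrade just drops a release binary into the CLI plugins directory; the version and URL below are illustrative, not prescriptive:

```bash
docker buildx version
# Manual upgrade: install a standalone release as a CLI plugin (Linux/amd64).
mkdir -p ~/.docker/cli-plugins
curl -fsSL -o ~/.docker/cli-plugins/docker-buildx \
  "https://github.com/docker/buildx/releases/download/v0.8.2/buildx-v0.8.2.linux-amd64"
chmod +x ~/.docker/cli-plugins/docker-buildx
docker buildx version   # should now report v0.8.2
```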
Has this issue proceeded any further, or is there still no known fix? How are people working around this?
My fix was simply to not use multi-stage builds, but that of course is not a useful solution for everybody.
@n1ngu I tried upgrading and the issue persists; multi-stage is broken.
Ok @ghost-of-montevideo, just check that you are not actually under the influence of the confusingly similar #2274. I was under the impression that the present issue is fixed and only affects the outdated buildkit vendored out-of-the-box within docker, as I was unable to reproduce it when using an updated buildkit plugin and the docker-container build driver. Unlike the present issue, the reproduction of #2274 is consistent for me.
@n1ngu The reproduction in https://github.com/bozaro/buildkit-2279 shows that with buildx version 0.9.1 and [...]
@benlangfeld I can't access the build environment I used at the time, but I now think the culprit is multi-stage builds, as @ghost-of-montevideo pointed out. All my builds were single-stage, so they worked fine. I just feel this isn't highlighted enough in the issue description.
The initial issue message has a "Steps to reproduce" section. What additional highlighting do you expect?
Given the context of #1981 and #2274, I'd expect the "multi-stage build triggers the problem, single-stage build works fine" bit of information (not literally) to be in both the title and the description, not only in the reproduction source code.
Is there a solution to this yet? I am also using multi-stage builds and encountering this issue.
Is anybody working on this issue, and is there any way others can help? Either on this or the related #2274.
I'm also happy to help a bit if there is a way we can contribute; let us know.
You can debug the code, find the root of the issue, fix it, and submit a PR.
Spent a while on this issue and could not find a great resolution using buildkit, and there are lots of unsolved issues about this across the web. After seeing #1981 (comment) and another report on SO, I switched our GitLab Cloud CI/CD pipelines to use buildah instead of buildkit, and caching is working well now. It's (almost) a drop-in replacement, and using overlay2 as the storage driver, the build performance (caching disabled) seems to be the same as with buildkit. Here's a sample of what that could look like in a .gitlab-ci.yml file:
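The sample itself was not preserved in this copy of the thread. As a stand-in, here is roughly what the script section of such a buildah job might run; the cache repository name and storage driver value are assumptions, the $CI_* variables are GitLab built-ins, and layer caching via --cache-from/--cache-to needs a reasonably recent buildah:

```bash
# Sketch of a buildah-based GitLab CI build job (shell portion only).
export STORAGE_DRIVER=overlay   # buildah's overlay driver (docker calls it overlay2)
buildah login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
buildah build --layers \
  --cache-from "$CI_REGISTRY_IMAGE/cache" \
  --cache-to "$CI_REGISTRY_IMAGE/cache" \
  -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
buildah push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```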
I tried your script, and the only time I got 100-of-100 failures was when I exhausted the number of requests to Docker Hub. Were you logged in when you tried?
Hi guys, any update on this?
Experiencing the same. Tried the S3 and registry cache backends. Two builds run in subsequent CodeBuild projects, which use exactly the same environment deployed via IaC. The only difference is in some build flags, e.g. the stage target, and some ARGs. The first build re-uses all of the cache, and the build finishes very fast. The second build re-uses the layer cache of the first few layers, and then ignores the cache for the last command. I inspected the layers thoroughly using dive, and the layer preceding the one that is not getting cached has exactly the same digest across runs.
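One way to double-check the "same digest across runs" observation without dive is to list each image's RootFS diff IDs directly; if everything up to the divergent step matches, the miss is in cache key resolution rather than in layer contents. A sketch with placeholder image names:

```bash
# Compare the layer diff IDs of the images produced by the two builds.
docker pull example.com/app:run1
docker pull example.com/app:run2
docker inspect --format '{{json .RootFS.Layers}}' example.com/app:run1 | jq .
docker inspect --format '{{json .RootFS.Layers}}' example.com/app:run2 | jq .
```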
Hey @moltar, your experience sounds much like mine. I've been using docker bake with multi-stage Dockerfiles, and it's been frustrating that cache restoration is quite flaky, even when simply using either the local file or the remote S3 cache backend. Even when all the build inputs and contexts are deterministically held constant, caching seems to be hit or miss, and often only partial. While sifting through related tickets, I believe the main culprit is the issue discussed here:
I pulled https://github.com/bozaro/buildkit-2279 and ran it:

```
...
Cleanup docker images...
cachebug_registry
cachebug_proxy
Show statistics...
FAILED 54 OF 100
```
Hmm, I can't seem to replicate that failure rate on my workstation (although I commented out the arm64 bits):

```
$ time ./test.sh
4b2cb5a376919c6b26fe510568c83806ea4b8f89bce4c917f1d49390e439bc9c
Wait for service on endpoint: http://localhost:5001
229f5e7e5e20934dd27f448afdeb70e16397b12d6807c1841e71468b8df8a7a3
Wait for service on endpoint: http://localhost:5002
Check is build for multiple platform supported...
Run tests...
[1] OK
[2] OK
[3] OK
[4] OK
[5] OK
[6] OK
[7] OK
[8] OK
[9] OK
[10] OK
[11] OK
[12] OK
[13] OK
[14] OK
[15] OK
[16] OK
[17] OK
[18] OK
[19] OK
[20] OK
[21] OK
[22] OK
[23] OK
[24] OK
[25] OK
[26] OK
[27] OK
[28] OK
[29] OK
[30] OK
[31] OK
[32] OK
[33] OK
[34] OK
[35] OK
[36] OK
[37] OK
[38] OK
[39] OK
[40] OK
[41] OK
[42] OK
[43] OK
[44] OK
[45] OK
[46] OK
[47] OK
[48] OK
[49] OK
[50] OK
[51] OK
[52] OK
[53] OK
[54] OK
[55] OK
[56] OK
[57] OK
[58] OK
[59] OK
[60] OK
[61] OK
[62] OK
[63] OK
[64] OK
[65] OK
[66] OK
[67] OK
[68] OK
[69] OK
[70] OK
[71] OK
[72] OK
[73] OK
[74] OK
[75] OK
[76] OK
[77] OK
[78] OK
[79] OK
[80] OK
[81] OK
[82] OK
[83] OK
[84] OK
[85] OK
[86] OK
[87] OK
[88] OK
[89] OK
[90] OK
[91] OK
[92] OK
[93] OK
[94] OK
[95] OK
[96] OK
[97] OK
[98] OK
[99] OK
[100] OK
Cleanup docker images...
cachebug_registry
cachebug_proxy
Show statistics...
ALL SUCCESSES
real 12m36.002s
user 1m3.257s
sys 0m48.798s
```

Although for more complex multistage builds, like the ones I use internally, I certainly can replicate it. I'll have to make a similar replicator.
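Such a replicator could follow the same shape as test.sh: loop the cache-to/cache-from pair from the scripts above and flag runs where the RUN step re-executed instead of being restored from cache. A rough sketch (registry and iteration count are placeholders; it assumes the Dockerfile from earlier in this thread, whose RUN step only prints "Hello, world!!!" when it actually runs, since cached steps emit no step output under --progress=plain):

```bash
#!/bin/bash
REGISTRY=registry.example.net/test-docker/example   # placeholder registry
FAILS=0; N=20
for i in $(seq 1 "$N"); do
  # Fresh builder, then populate the remote cache.
  docker buildx rm cachebug >/dev/null 2>&1 || true
  docker buildx create --name cachebug --driver docker-container >/dev/null
  date > changed.txt
  docker buildx build --builder cachebug --progress=plain \
    --cache-to "type=registry,ref=$REGISTRY:buildcache,mode=max" \
    --platform linux/amd64 --platform linux/arm64 . >/dev/null 2>&1
  # Fresh builder again, then rebuild from the remote cache only.
  docker buildx rm cachebug >/dev/null 2>&1 || true
  docker buildx create --name cachebug --driver docker-container >/dev/null
  if docker buildx build --builder cachebug --progress=plain \
       --cache-from "type=registry,ref=$REGISTRY:buildcache" \
       --platform linux/amd64 --platform linux/arm64 . 2>&1 \
     | grep -q 'Hello, world'; then
    echo "[$i] FAILED"; FAILS=$((FAILS+1))   # RUN step re-executed
  else
    echo "[$i] OK"
  fi
done
echo "FAILED $FAILS OF $N"
```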
Original issue description:

Version information:
Steps to reproduce
Create Dockerfile: [...]

Run a script like the following (REGISTRY should be replaced by an actual value): [...]
What I see: when I run the above script multiple times, the step RUN echo "Hello, world!!!" misses the cache roughly every second time for one of the platforms (I have not seen the cache problem on all platforms at the same time). For example: [...]
Update (2021-08-18)
Repository to reproduce the issue: https://github.com/bozaro/buildkit-2279 (simply check out and run ./test.sh).