# Skaffold crashes when building Docker image using Minikube Docker daemon with buildkit #1749
```
$ docker version
Client: Docker Engine - Community
 Version:           18.09.2
 API version:       1.39
 Go version:        go1.11.5
 Git commit:        6247962
 Built:             Mon Feb 11 18:42:20 2019
 OS/Arch:           darwin/amd64
 Experimental:      false

Server: Docker Engine - Community
 Engine:
  Version:          18.09.3
  API version:      1.39 (minimum version 1.12)
  Go version:       go1.10.8
  Git commit:       774a1f4
  Built:            Thu Feb 28 06:40:51 2019
  OS/Arch:          linux/amd64
  Experimental:     true
```
This seems to happen consistently when the first line of a given Dockerfile is not:

```Dockerfile
# syntax = docker/dockerfile:experimental
FROM ...
```

Or simply:

```Dockerfile
#
FROM ...
```

This also triggers the crash:

```Dockerfile
ARG foo
FROM ...
```

Do note that this is not the only thing that seems to trigger the crash. This specific Dockerfile, in addition to the one I originally posted, also did:

```Dockerfile
FROM golang:alpine AS build
COPY bar.go .
RUN go build -o /app/bar
ENTRYPOINT ["/app/bar"]
```
@kasperisager I couldn't reproduce any of those with docker 18.09.2.
They build just fine when using ….
Interestingly enough, the issue sometimes persists even after removing everything before the initial `FROM`.
This may actually be related to my setup. I use a separate Docker machine to build images (created with …).
Can't reproduce with your sample either...
That seems to be it:

```
$ minikube start
$ skaffold build
<crash>
$ minikube stop
$ skaffold build
<success!>
```
Wild guess: Could it be that Skaffold is trying to pull the image ID from the Minikube Docker env rather than the CLI Docker env, which is used when Buildkit is enabled?
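(For illustration, a rough sketch of that guessed failure mode against the Docker Engine API; the client wiring and the image name `foo` are taken from this thread, everything else is an assumption rather than Skaffold's actual code. A daemon that never built the image returns an empty list, the ID stays `""`, and reference parsing then fails exactly like the reported crash.)

```go
package main

import (
	"context"
	"fmt"

	"github.com/docker/distribution/reference"
	"github.com/docker/docker/api/types"
	"github.com/docker/docker/api/types/filters"
	"github.com/docker/docker/client"
)

func main() {
	ctx := context.Background()

	// FromEnv honors DOCKER_HOST and friends, so with `minikube docker-env`
	// applied this client talks to the daemon inside the minikube VM.
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		panic(err)
	}

	// Look up the freshly built image by reference. If the build actually ran
	// against a different daemon (the local CLI one), this daemon has never
	// seen the image and the list comes back empty.
	imgs, err := cli.ImageList(ctx, types.ImageListOptions{
		Filters: filters.NewArgs(filters.Arg("reference", "foo")),
	})
	if err != nil {
		panic(err)
	}

	var imageID string
	if len(imgs) > 0 {
		imageID = imgs[0].ID
	}

	// With imageID still "", parsing it as a reference fails with
	// "invalid reference format", matching the FATA error in this issue.
	_, err = reference.Parse(imageID)
	fmt.Println(err)
}
```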
To sum up: #1749 (comment) was a red herring, and so was Buildkit for that matter. The steps to reproduce seem to be: …
The reason why #1749 (comment) was failing for me was that I had an otherwise clean repo, and modifying the Dockerfile caused the image tag to change from ….
I have the same issue running on Linux, trying to use Buildkit with Skaffold and Docker 18.09.3. Like @kasperisager said, it happens after the build has finished locally: when Skaffold is about to do a helm install, it throws the tag error.
Did a little digging and the offending lines seem to be: skaffold/pkg/skaffold/docker/client.go, lines 68 to 70 at 96ac37e.
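(Paraphrased, the cited region boils down to choosing the Docker client from the kubecontext alone. A hedged sketch with approximate names, not the verbatim source:)

```go
// Approximate paraphrase of the cited client.go lines, not the verbatim
// source: when the current kubecontext is minikube's default, Skaffold
// builds its Docker client from `minikube docker-env` instead of the
// local DOCKER_* environment.
func newAPIClient(kubeContext string) (client.CommonAPIClient, error) {
	if kubeContext == "minikube" { // constants.DefaultMinikubeContext in skaffold
		return newMinikubeAPIClient() // env taken from `minikube docker-env`
	}
	return newEnvAPIClient() // env taken from the local DOCKER_* variables
}
```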
This I could work around by changing the name of the default Minikube context to something like …. I'm not really sure what the best course of action is, to be honest, as it's of course also very convenient having Skaffold automatically pull in the Docker env of Minikube. Maybe make it conditional on there not already being an active Docker env?
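(A sketch of that suggested conditional, reusing the helper names from the paraphrase above; hypothetical, not Skaffold's actual code:)

```go
// Hypothetical variant implementing the suggestion: only adopt minikube's
// docker-env when the user hasn't already pointed the CLI at a daemon
// via DOCKER_HOST.
func newAPIClientConditional(kubeContext string) (client.CommonAPIClient, error) {
	if os.Getenv("DOCKER_HOST") == "" && kubeContext == "minikube" {
		return newMinikubeAPIClient()
	}
	return newEnvAPIClient()
}
```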
I'm pretty sure the issue comes from the combo of minikube and the local docker CLI. In local docker CLI mode the image does not get built by the daemon in minikube and therefore does not exist in that daemon's context. And since the kubecontext is minikube, it automatically does not push it to a registry. So there are two workarounds: …
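(The no-push behavior mentioned above comes from Skaffold's local-cluster heuristic; a rough sketch of the rule, illustrative rather than the exact code:)

```go
// Rough sketch of the default-push heuristic: images are not pushed when
// the kubecontext points at a known local cluster, since that cluster is
// expected to read images straight from the (shared) local daemon.
func defaultPush(kubeContext string) bool {
	switch kubeContext {
	case "minikube", "docker-for-desktop", "docker-desktop":
		return false // local cluster: rely on the daemon's image store
	default:
		return true // remote cluster: push to a registry
	}
}
```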
@cpoole yep, you're on it. I think the real fix is for skaffold to figure out if we're running against minikube, and if we are, to pull in the minikube docker env and stick that onto the CLI calls we make. This shouldn't be too difficult of a change, if anyone in this thread is interested in sending a PR :)
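(A minimal sketch of the fix described here, assuming `minikube docker-env` prints bash-style `export KEY="VALUE"` lines; the function names are illustrative, not Skaffold's API. Parse the pairs and attach them to the `docker build` call:)

```go
package main

import (
	"os"
	"os/exec"
	"strings"
)

// minikubeDockerEnv shells out to `minikube docker-env` and parses the
// bash-style `export KEY="VALUE"` lines into KEY=VALUE pairs.
func minikubeDockerEnv() ([]string, error) {
	out, err := exec.Command("minikube", "docker-env").Output()
	if err != nil {
		return nil, err
	}
	var env []string
	for _, line := range strings.Split(string(out), "\n") {
		line = strings.TrimSpace(line)
		if !strings.HasPrefix(line, "export ") {
			continue // skip comments such as "# Run this command to configure..."
		}
		kv := strings.TrimPrefix(line, "export ")
		kv = strings.ReplaceAll(kv, `"`, "")
		env = append(env, kv)
	}
	return env, nil
}

// buildWithMinikubeDaemon runs `docker build` with minikube's DOCKER_*
// variables applied, so the image lands in the daemon the cluster uses.
func buildWithMinikubeDaemon(dockerfile, tag, context string) error {
	env, err := minikubeDockerEnv()
	if err != nil {
		return err
	}
	cmd := exec.Command("docker", "build", "-f", dockerfile, "-t", tag, context)
	cmd.Env = append(os.Environ(), env...)         // later entries win (os/exec keeps the last duplicate)
	cmd.Env = append(cmd.Env, "DOCKER_BUILDKIT=1") // keep Buildkit on
	cmd.Stdout, cmd.Stderr = os.Stdout, os.Stderr
	return cmd.Run()
}

func main() {
	if err := buildWithMinikubeDaemon("Dockerfile", "foo", "."); err != nil {
		panic(err)
	}
}
```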
@nkubala I'm interested in submitting a PR. I looked over this the other week, but the issue I came to is that the docker daemon in 18.06 does not support buildkit unless the …. I ended up forking minikube and installing 18.09 (https://github.com/kubernetes/minikube/blob/7e6c68811654e0b2482b12fb6a700998e94f835d/deploy/iso/minikube-iso/package/docker-bin/docker-bin.mk#L7), but I started running into bizarre errors when using buildkit over a remote daemon. I'll take another stab at this but it felt like a rabbit hole. Does this sound right or am I chasing my tail here?
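(The daemon-side constraint can at least be detected up front. A sketch using the Docker Engine API client; the version cutoffs are as I understand them: 18.09 exposes BuildKit on API 1.39, while 18.06 only had it behind the daemon's experimental flag.)

```go
package main

import (
	"context"
	"fmt"

	"github.com/docker/docker/api/types/versions"
	"github.com/docker/docker/client"
)

func main() {
	ctx := context.Background()
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		panic(err)
	}

	// Ping reports the daemon's API version; BuildKit support arrived with
	// API 1.39 (docker 18.09), so minikube's bundled 18.06 daemon fails
	// this check unless its experimental mode is on.
	ping, err := cli.Ping(ctx)
	if err != nil {
		panic(err)
	}
	if versions.LessThan(ping.APIVersion, "1.39") {
		fmt.Printf("daemon API %s is too old for BuildKit (need >= 1.39)\n", ping.APIVersion)
		return
	}
	fmt.Println("daemon supports BuildKit")
}
```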
Mmm yeah, these are two separate issues I think. Even if you had 18.09 installed in minikube, it still wouldn't work with skaffold because of the way we're calling docker in buildkit mode. I think there's an issue with 18.09 in minikube, which is why it's not installed; @balopat might have more info. As far as buildkit over a remote daemon goes, I have no idea :) For now, if you want to fix the issue on the skaffold side, that should at least be pretty straightforward. I think it's probably fine to assume the docker daemon is configured to have buildkit enabled, from skaffold's perspective.
OK, I'll take a stab at fixing it on the skaffold side. I'll integration-test locally with my own minikube ISO.
@cpoole any news on this one? Yeah, this does look a bit like a rabbit hole type of thing :)
I can confirm I get this issue on Linux using buildkit. If I delete my minikube instance, it gets past the build but fails on deployment because there is no minikube instance; but if I spin up minikube, it breaks right after the build is finished. @nkubala I was looking at the debug logs for skaffold and it looks like it pulls the docker env from minikube?
I too can confirm the same behavior as @meatherly on Linux. My builds (building the skaffold examples without any change to the Dockerfiles) fail when Minikube is running, like: …
When I stopped minikube, the build went through, but as expected the deployment failed. Then I tried to downgrade by running ….
This thread seemed a little too quiet, so I opened this issue: #2187, and referenced it in the skaffold Slack channel: https://kubernetes.slack.com/archives/CABQMSZA6/p1559010140012200. The issue was due to my minikube VM using an old ISO, so ….
OSX too.
Same problem for me. I've tried deleting minikube and completely reinstalling it. No luck: …
I accidentally reproduced this with @haf's repro on a separate issue: https://github.com/haf/skaffold-dockerignore - change the skaffold yaml to ….
I also have it set to …. Here is my config:

```yaml
apiVersion: skaffold/v1beta13
kind: Config
profiles:
  - name: dev
    activation:
      - command: dev
    deploy:
      kubectl:
        manifests:
          - api/app/k8s/dev/api.k8s-config.yaml
          - api/app/k8s/dev/db.k8s-config.yaml
          ...
          - web/k8s/dev/web.k8s-deployment.yaml
          - web/k8s/dev/web.k8s-service.yaml
    build:
      local:
        useDockerCLI: true
      artifacts:
        - image: dl-logstash
          sync:
            manual:
              - src: 'api/app/logstash/config/dev/pipeline/*'
                dest: pipeline
          docker:
            dockerfile: api/app/logstash/Dockerfile
            target: dev-env
            cacheFrom:
              - docker.elastic.co/logstash/logstash-oss:6.6.1
        - image: init-db
          docker:
            dockerfile: api/Dockerfile
            target: init-db
            buildArgs:
              DL_ORG_DEV_NAMESPACE: '{{.DL_ORG_DEV_NAMESPACE}}'
              DL_ORG_AWS_ACCOUNT_ID: '{{.DL_ORG_AWS_ACCOUNT_ID}}'
              DL_ORG_AWS_IAM_APP_USER_ACCESS_KEY_ID: '{{.DL_ORG_AWS_IAM_APP_USER_ACCESS_KEY_ID}}'
              DL_ORG_AWS_IAM_APP_USER_SECRET_ACCESS_KEY: '{{.DL_ORG_AWS_IAM_APP_USER_SECRET_ACCESS_KEY}}'
        # API service
        - image: dl-api.dev
          context: .
          sync:
            manual:
              - src: 'api/app/src/**/*.ts'
                dest: .
                strip: 'api'
              - src: 'api/app/src/**/*.json'
                dest: .
                strip: 'api'
          docker:
            dockerfile: api/Dockerfile
            target: dev-env
            buildArgs:
              DL_ORG_DEV_NAMESPACE: '{{.DL_ORG_DEV_NAMESPACE}}'
              DL_ORG_AWS_ACCOUNT_ID: '{{.DL_ORG_AWS_ACCOUNT_ID}}'
              DL_ORG_AWS_IAM_APP_USER_ACCESS_KEY_ID: '{{.DL_ORG_AWS_IAM_APP_USER_ACCESS_KEY_ID}}'
              DL_ORG_AWS_IAM_APP_USER_SECRET_ACCESS_KEY: '{{.DL_ORG_AWS_IAM_APP_USER_SECRET_ACCESS_KEY}}'
        - image: dl-api.test
          context: .
          sync:
            manual:
              - src: 'api/test/**/*'
                dest: .
                strip: 'api'
          docker:
            dockerfile: api/Dockerfile
            target: test
            buildArgs:
              DL_ORG_DEV_NAMESPACE: '{{.DL_ORG_DEV_NAMESPACE}}'
              DL_ORG_AWS_ACCOUNT_ID: '{{.DL_ORG_AWS_ACCOUNT_ID}}'
              DL_ORG_AWS_IAM_APP_USER_ACCESS_KEY_ID: '{{.DL_ORG_AWS_IAM_APP_USER_ACCESS_KEY_ID}}'
              DL_ORG_AWS_IAM_APP_USER_SECRET_ACCESS_KEY: '{{.DL_ORG_AWS_IAM_APP_USER_SECRET_ACCESS_KEY}}'
        - image: dl-web
          context: .
          sync:
            manual:
              - src: 'web/src/**/*'
                dest: .
                strip: 'web'
              - src: 'web/e2e/**/*'
                dest: .
                strip: 'web'
          docker:
            dockerfile: web/Dockerfile
            target: dev-env
    portForward: # assumed parent key; the entries below had lost their parent in the paste
      - resourceType: service
        resourceName: postgres
        port: 5432
      - resourceType: service
        resourceName: elasticsearch
        port: 9200
```
@demisx You don't need to use the docker CLI for passing env vars: https://github.com/haf/skaffold-dockerignore/blob/master/skaffold.yaml#L6-L10 works for me using docker on desktop.
I had this issue when I specified an image tag in the `image` property. I removed the tag.
@haf Oh, nice! Thanks man. I was told here it was necessary, but I guess not anymore. Maybe this was a requirement in an earlier version of skaffold. UPDATE: the older v0.31 skaffold requires the ….
Also, I'd like to confirm that removing ….
Using ….
Fix GoogleContainerTools#1749 Signed-off-by: David Gageot <[email protected]>
Should be all good now
I'm having the same problem on version 0.40.0 of Skaffold.

skaffold.yaml:

```yaml
apiVersion: skaffold/v1beta16
kind: Config
build:
  local:
    push: false
    useBuildkit: true
  artifacts:
    - image: nest
      sync:
        infer:
          - src/**/*.ts
          - .env.json
      custom:
        buildCommand: 'docker build -t nest -f Dockerfile.dev --ssh default .'
        dependencies:
          paths:
            - src
            - .env.json
deploy:
  kubectl:
    manifests:
      - k8s/nest-node-port.yaml
      - k8s/nest-deployment.yaml
```

Dockerfile.dev:

```Dockerfile
# syntax=docker/dockerfile:experimental
FROM node:current-alpine
RUN apk add --no-cache \
    openssh-client git \
    build-base \
    bash python \
    expect
RUN mkdir -m 0600 /root/.ssh/
RUN touch /root/.ssh/known_hosts
RUN ssh-keyscan gitlab.com >> /root/.ssh/known_hosts
RUN ssh-keyscan github.com >> /root/.ssh/known_hosts
WORKDIR /app
COPY package.json ./
COPY .env.json ./
COPY jest.* ./
COPY nodemon* ./
COPY tsconfig* ./
COPY scripts scripts
COPY src src
RUN --mount=type=ssh npm install
EXPOSE 3000
EXPOSE 3001
CMD ["npm", "run", "start:dev"]
```
@dolsem It looks like you are using another builder, the `custom` builder.
Solved for me as well.
When building (using `skaffold dev`) the following image with Buildkit enabled, Skaffold crashes:

…

**Expected behavior**
The image is built successfully.
**Actual behavior**
Skaffold crashes with the following error:
```
FATA[0004] exiting dev mode because first run failed: build failed: building [foo]: Error parsing reference: "" is not a valid repository/tag: invalid reference format
```