
Add caching to kaniko builder #1287

Merged: 2 commits, Nov 19, 2018

Conversation

@priyawadhwa (Contributor) commented:
I updated the version of the kaniko image so that I could add caching.
To use caching, the user has to specify 'cache' in their skaffold.yaml,
with an optional 'repo' field specifying where cached layers will be
stored.

Fixes #1275
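For illustration, the configuration described above might look like this in skaffold.yaml (a sketch based only on this PR's description; the image name, build-context bucket, and the exact placement of the 'cache' field under the kaniko builder are assumptions, not taken from this PR):

```yaml
build:
  artifacts:
  - image: gcr.io/my-project/my-image       # hypothetical image name
  kaniko:
    buildContext:
      gcsBucket: my-build-context-bucket    # hypothetical build-context bucket
    cache:                                  # enables layer caching (this PR)
      repo: gcr.io/my-project/cache         # optional: where cached layers are stored
```

If 'repo' is omitted, the cache location would presumably be derived from the image being built.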

@codecov-io commented Nov 16, 2018:

Codecov Report

Merging #1287 into master will decrease coverage by 0.03%.
The diff coverage is 0%.


@@            Coverage Diff            @@
##           master   #1287      +/-   ##
=========================================
- Coverage   44.24%   44.2%   -0.04%     
=========================================
  Files         104     104              
  Lines        4629    4633       +4     
=========================================
  Hits         2048    2048              
- Misses       2372    2376       +4     
  Partials      209     209
Impacted Files Coverage Δ
pkg/skaffold/build/kaniko/run.go 0% <0%> (ø) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@balopat (Contributor) left a comment:

LGTM, excited about the speed up in Kaniko!

@balopat (Contributor) left a comment:

One thing: maybe we should turn this on by default for integration tests. Can we do that?

@priyawadhwa (Author) replied:

For sure, should be an easy change.

@dgageot (Contributor) left a comment:

LGTM with a small nit

@@ -46,7 +46,7 @@ const (
 	DefaultKustomizationPath = "."
-	DefaultKanikoImage = "gcr.io/kaniko-project/executor:v0.4.0@sha256:0bbaa4859eec9796d32ab45e6c1627562dbc7796e40450295b9604cd3f4197af"
+	DefaultKanikoImage = "gcr.io/kaniko-project/executor@sha256:434bbb1d998ba1bd8ebc04c90d93afa859fd5c7ff93326bca9f6e7da0d6277ff"
A reviewer commented on the diff:
Can you leave the version here, in addition to the sha256? It makes it easier to find which version we are using.

@priyawadhwa (Author) replied:
kaniko actually hasn't released a version with the new caching support yet -- I was going to come back to this and add the new version once that happens.

@priyawadhwa (Author) replied:

I'll open an issue for it!

@priyawadhwa priyawadhwa merged commit 6667a2e into GoogleContainerTools:master Nov 19, 2018
@priyawadhwa priyawadhwa deleted the kaniko-cache branch November 19, 2018 16:56
@balopat balopat mentioned this pull request Nov 21, 2018
4 participants