
Basic support for using kubectl and helm together #586

Merged: 1 commit into GoogleContainerTools:master on Sep 21, 2018

Conversation

@ajbouh (Contributor) commented May 23, 2018

This is the minimal change needed to deploy with both kubectl and helm.

If only one configuration is present, it falls back to the existing logic. If both configurations are present, a new deployer is returned that runs both helm and kubectl in sequence (see the sketch below).

It is naive in a number of ways:

  • It does not determine which artifacts are used by which deployer, so a change in any artifact redeploys with both kubectl and helm.
  • If one deployer fails, the other is not attempted.
  • There is no test coverage.

Addresses #528
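
For context, here is a minimal Go sketch of the selection logic described above. The names and signatures (Deployer, getDeployer, multiDeployer, the stub namedDeployer) are illustrative assumptions for this sketch, not the code actually added by this PR:

```go
// Sketch only: the interface, types, and helpers below are illustrative
// stand-ins, not skaffold's real deploy API.
package main

import (
	"context"
	"fmt"
)

// Deployer is a minimal stand-in for a deploy interface.
type Deployer interface {
	Deploy(ctx context.Context) error
}

// multiDeployer runs its child deployers in sequence (helm, then kubectl).
// As noted above, a failure in one deployer stops the sequence and the
// remaining deployers are skipped.
type multiDeployer struct {
	deployers []Deployer
}

func (m multiDeployer) Deploy(ctx context.Context) error {
	for _, d := range m.deployers {
		if err := d.Deploy(ctx); err != nil {
			return err
		}
	}
	return nil
}

// namedDeployer is a stub standing in for the real helm/kubectl deployers.
type namedDeployer string

func (n namedDeployer) Deploy(ctx context.Context) error {
	fmt.Println("deploying with", string(n))
	return nil
}

// getDeployer mirrors the selection logic described above: fall back to a
// single deployer when only one config is present, otherwise wrap both in
// a multiDeployer that runs helm and kubectl in sequence.
func getDeployer(helmConfigured, kubectlConfigured bool) (Deployer, error) {
	switch {
	case helmConfigured && kubectlConfigured:
		return multiDeployer{deployers: []Deployer{
			namedDeployer("helm"),
			namedDeployer("kubectl"),
		}}, nil
	case helmConfigured:
		return namedDeployer("helm"), nil
	case kubectlConfigured:
		return namedDeployer("kubectl"), nil
	default:
		return nil, fmt.Errorf("no deploy configuration found")
	}
}

func main() {
	d, err := getDeployer(true, true)
	if err != nil {
		panic(err)
	}
	_ = d.Deploy(context.Background())
}
```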

@googlebot

Thanks for your pull request. It looks like this may be your first contribution to a Google open source project (if not, look below for help). Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

📝 Please visit https://cla.developers.google.com/ to sign.

Once you've signed (or fixed any issues), please reply here (e.g. I signed it!) and we'll verify it.



@dlorenc (Contributor) commented May 24, 2018

@ajbouh Would you be able to sign the CLA?

@ajbouh (Contributor, Author) commented May 24, 2018

Done

@dgageot (Contributor) commented Jun 5, 2018

@ajbouh Can you please rebase your code? Thanks a lot!

@dlorenc (Contributor) commented Jun 19, 2018

@ajbouh can you check that you signed the CLA with the right email address? The bot is still saying it's not signed.

@ajbouh (Contributor, Author) commented Jun 19, 2018 via email

@dgageot (Contributor) commented Jun 20, 2018

@ajbouh yes, that can be the issue. Can you also rebase your code, please? Thanks!

@r2d4 r2d4 added the cla: no label Jun 20, 2018
@balopat (Contributor) commented Aug 3, 2018

@ajbouh ping, do you still have issues with the CLA/rebasing?

@googlebot

CLAs look good, thanks!

@ajbouh (Contributor, Author) commented Sep 19, 2018

@balopat rebased. Also, it looks like the CLA bot doesn't re-check until you push again.

@codecov-io commented Sep 19, 2018

Codecov Report

Merging #586 into master will decrease coverage by 0.42%.
The diff coverage is 13.63%.


@@            Coverage Diff             @@
##           master     #586      +/-   ##
==========================================
- Coverage   40.61%   40.19%   -0.43%     
==========================================
  Files          68       68              
  Lines        2969     3005      +36     
==========================================
+ Hits         1206     1208       +2     
- Misses       1639     1671      +32     
- Partials      124      126       +2
Impacted Files                   Coverage Δ
pkg/skaffold/deploy/deploy.go    25% <0%> (-75%) ⬇️
pkg/skaffold/runner/runner.go    54.63% <54.54%> (+0.18%) ⬆️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f728fd9...2182fc0.

@r2d4 r2d4 added kokoro:run runs the kokoro jobs on a PR and removed cla: no labels Sep 19, 2018
@kokoro-team kokoro-team removed the kokoro:run runs the kokoro jobs on a PR label Sep 19, 2018
@dgageot dgageot merged commit c8eec1b into GoogleContainerTools:master Sep 21, 2018