Report Events on Pods that Karpenter is making decisions for #1894

Closed
diranged opened this issue Jun 3, 2022 · 2 comments
Labels
feature New feature or request

Comments


diranged commented Jun 3, 2022

Tell us about your request

The cluster-autoscaler reports back to application owners about how and why Pods could or could not be scheduled onto hardware. I would like to see Karpenter write out events for each Pod that it is making decisions about; these events would be incredibly useful for application owners as well as administrators. I could imagine events like the following (a sketch of how such events could be emitted follows the list):

  • Launched ip-10-10-10-10.ec2.internal, Pod scheduled to host
  • No provisioners match Pod hardware requirements, not launching any new hardware
  • Existing capacity exists for Pod (nodes ip-10-10-10-10, ip-10-11-10-10, ...), letting kube-scheduler handle scheduling
  • Scheduling pod on existing capacity (node ip-10-10-10-10...)
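For what it's worth, Kubernetes already ships the plumbing for this in client-go's `record` package. Below is a minimal sketch of how an event like the ones above could be attached to a Pod; the `newRecorder` and `reportNoProvisionerMatch` helpers and the reason strings are hypothetical, not Karpenter's actual code:

```go
package karpenterevents

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/kubernetes/scheme"
	typedcorev1 "k8s.io/client-go/kubernetes/typed/core/v1"
	"k8s.io/client-go/tools/record"
)

// newRecorder wires up an EventRecorder that writes Events into the
// namespace of whatever object they are attached to.
func newRecorder(clientset kubernetes.Interface) record.EventRecorder {
	broadcaster := record.NewBroadcaster()
	broadcaster.StartRecordingToSink(&typedcorev1.EventSinkImpl{
		Interface: clientset.CoreV1().Events(""),
	})
	return broadcaster.NewRecorder(scheme.Scheme, corev1.EventSource{Component: "karpenter"})
}

// reportNoProvisionerMatch is a hypothetical helper: it attaches a
// Warning event to the Pod itself, so `kubectl describe pod` surfaces
// the scheduling decision to the application owner.
func reportNoProvisionerMatch(recorder record.EventRecorder, pod *corev1.Pod) {
	recorder.Eventf(pod, corev1.EventTypeWarning, "NoProvisionerMatch",
		"No provisioners match Pod hardware requirements, not launching any new hardware")
}
```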

Tell us about the problem you're trying to solve. What are you trying to do, and why is it hard?

Scanning logs is useful for cluster administrators... but when a developer launches a Pod and cannot get hardware, we want them to be able to fish for themselves and find out what went wrong. Pod events seem like the natural place to put this kind of information, as they pass it directly to the end-user, e.g. via `kubectl describe pod` (see the example below).
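To illustrate, this is roughly what a developer might see; the reason and message here are hypothetical, not actual Karpenter output:

```console
$ kubectl describe pod my-app-7d4b9cd5f-xk2lp
...
Events:
  Type     Reason              Age  From       Message
  ----     ------              ---  ----       -------
  Warning  NoProvisionerMatch  12s  karpenter  No provisioners match Pod hardware requirements, not launching any new hardware
```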

Are you currently working around this issue?
We don't have a workaround yet... :/

Additional context
This is a wish-list item, but it applies to PRs like #1887 and #1888.

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or "me too" comments; they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment
tzneal (Contributor) commented Jun 3, 2022

I agree :) Closing as a duplicate of #1584. Feel free to add any details that are missing there.

tzneal closed this as completed Jun 3, 2022
diranged (Author) commented Jun 3, 2022

Ah, thank you... sorry I missed that!
