reflector going OOM caused by spiky behaviour #187

Closed
aeimer opened this issue Jun 1, 2021 · 20 comments · Fixed by #223

aeimer commented Jun 1, 2021

Hi guys,

my reflector pod just exited hard due to an out-of-memory event.
There are also regular errors popping up in the log.

Instana event: [screenshot]

Instana pod details: [screenshot]

Logs: [screenshot]

These kinds of logs repeat every few hours.

One thing that came up: maybe the pod just needs more RAM than the default set here: https://github.com/emberstack/kubernetes-reflector/blob/master/src/helm/reflector/values.yaml#L58
In general reflector works quite well, but I don't know where the memory spikes come from.
I also have reflector running in a second, smaller cluster; the RAM fills up there as well, just not as fast, and I haven't seen any OOMs there so far.
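
For reference, this is roughly the kind of values override I'm thinking of to give the pod more headroom. It is only a sketch: the key names below assume the chart exposes a standard resources block, so they should be checked against the chart's values.yaml before use.

```yaml
# Sketch of a Helm values override raising the reflector pod's memory limit.
# Assumption: the chart uses the conventional "resources" structure; verify the
# actual keys and defaults in src/helm/reflector/values.yaml.
resources:
  requests:
    cpu: 50m
    memory: 128Mi
  limits:
    cpu: 100m
    memory: 512Mi   # raised well above the observed spikes to avoid the OOM kill
```

Applied with something like `helm upgrade --install reflector emberstack/reflector -f values-override.yaml` (assuming the emberstack chart repo is already added). That would only paper over the spikes, though; I'd still like to understand where they come from.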

Thank you for helping.

BR
Alex

stale bot commented Jun 9, 2021

Automatically marked as stale due to no recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jun 9, 2021

aeimer commented Jun 11, 2021

@winromulus can you have a look into this? What can I do about it?

stale bot commented Jun 11, 2021

Removed stale label.

stale bot removed the stale label Jun 11, 2021

stale bot commented Jun 18, 2021

Automatically marked as stale due to no recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jun 18, 2021

aeimer commented Jun 19, 2021 via email

stale bot commented Jun 19, 2021

Removed stale label.

scarby commented Jun 21, 2021

@aeimer Are you using the reflector to reflect certificates from LetsEncrypt?

aeimer commented Jun 21, 2021

@scarby Yes, I actually do. Maybe #191 is the cause of this issue. Good point.

stale bot commented Jun 30, 2021

Automatically marked as stale due to no recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jun 30, 2021

aeimer commented Jun 30, 2021

@klimisa is there any chance that you can have a look into this issue?

stale bot commented Jun 30, 2021

Removed stale label.

stale bot removed the stale label Jun 30, 2021

stale bot commented Jul 8, 2021

Automatically marked as stale due to no recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jul 8, 2021

brokenjacobs commented Jul 8, 2021

Is the issue just the memory limit? Or is there some larger issue at play here causing a problem?

stale bot commented Jul 8, 2021

Removed stale label.

stale bot removed the stale label Jul 8, 2021

aeimer commented Jul 9, 2021

@brokenjacobs AFAIK everything seems to work. We have cert-manager with LE running.

Edit: @scarby suspected a link to #191

stale bot commented Jul 16, 2021

Automatically marked as stale due to no recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jul 16, 2021

aeimer commented Jul 16, 2021

No stale. The problem seems to persist.

stale bot commented Jul 16, 2021

Removed stale label.

stale bot removed the stale label Jul 16, 2021

stale bot commented Jul 23, 2021

Automatically marked as stale due to no recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Jul 23, 2021

stale bot commented Jul 31, 2021

Automatically closed stale item.

stale bot closed this as completed Jul 31, 2021
winromulus added a commit that referenced this issue Oct 16, 2021
- New multi-arch pipeline with proper tagging convention
- Removed cert-manager extension (deprecated due to new support from cert-manager) Fixes: #191
- Fixed healthchecks. Fixes: #208
- Removed Slack support links (GitHub issues only). Fixes: #199
- Simplified startup and improved performance. Fixes: #194
- Huge improvements in performance and stability. Fixes: #187 #182 #166 #150 #138 #121 #108