Fix python loadgenerator memory leak #816
Conversation
@puckpuck please take a look and try running it on your cluster
towards: #771
oooh... this looks super interesting. Will give it a try on a long-running test. Wonder if we can do the same for the emailservice now 😛
I have this running now. Loadgenerator was OOMing every 55-ish hours before.
Image size grew by 100 MB, but given the benefit this brings, I don't have anything against it.
Nice finding @cartersocha!
@puckpuck I'm guessing no news is good news on restarts! Are we good to merge once I resolve the conflicts, or have any issues popped up?
I'll confirm tonight one way or the other.
I still see a memory leak, at about the same rate as before :(
My good friend ChatGPT and I tried a couple of different solutions to eliminate the memory leak here.
The most effective method seemed to be switching our base image from a generic Python image to the locustio image.
I ran both versions locally for ~40 minutes. The locustio image stabilized around 67 MB of memory usage, while the generic Python image continued to grow over time and was around 86 MB when I stopped it.
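For context, here is a minimal sketch of what such a base-image switch might look like. The image tags, file paths, and locustfile name are assumptions for illustration, not the actual diff from this PR.

```dockerfile
# Hypothetical sketch of the base-image change described above.
# Tags, paths, and file names are illustrative assumptions.

# Before: generic Python base with Locust installed via pip
# FROM python:3.10-slim
# WORKDIR /loadgen
# COPY requirements.txt .
# RUN pip install --no-cache-dir -r requirements.txt
# COPY locustfile.py .
# CMD ["locust", "-f", "locustfile.py", "--headless"]

# After: official Locust image as the base; only the locustfile is copied in
FROM locustio/locust
COPY locustfile.py /home/locust/locustfile.py
```

The `locustio/locust` image already sets `locust` as its entrypoint, so the container would typically only need flags (or environment configuration) pointing at the copied locustfile rather than a full pip install of Locust and its dependencies.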