memory leaks with monitors #194

Closed

ashwinr opened this issue Sep 10, 2013 · 9 comments

Comments

@ashwinr

ashwinr commented Sep 10, 2013

Looks like the monitor code is holding references in memory indefinitely. This is causing us to leak a lot of memory. Please look at the attached screenshot.

[screenshot]

@briancavalier
Member

Hi @ashwinr. Even though it's a debugging aid, I'd certainly love to squash any memory leaks caused by the monitor. It's not clear to me from the screenshot, though, what exactly is being leaked, or by whom. Could you please provide more detail on what you think is happening?

The monitor will keep some references in a hash until a promise is either fulfilled, rejected, or then'd, at which point it releases the associated promise from the hash. It also keeps a partial graph of all the ancestral promise relationships. Hmmm, that may be where the problem is. My intent is that the graph gets pruned as promises are fulfilled and rejected, but maybe that isn't happening like it should be.
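To illustrate the idea (this is only a sketch of the bookkeeping, not the actual monitor code; names like track and pending are made up), it's roughly:

var pending = {};
var nextKey = 0;

function track(promise, parentKey) {
    var key = nextKey++;
    // remember a creation stack plus an edge to the parent promise's entry
    pending[key] = { stack: new Error().stack, parent: parentKey };
    // prune the entry once the promise fulfills or rejects
    function remove() { delete pending[key]; }
    promise.then(remove, remove);
    return key;
}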

I'll look into the graph, but if you can also provide more info, that would be very helpful. Thanks!

@ashwinr
Author

ashwinr commented Sep 13, 2013

Sure, the problem is that saving off the promise creation stack retains references to objects along the stack, and this is never cleaned up. My "fix" (https://github.com/ashwinr/when/commit/23fcbeb170492bf496ce65add7bdd4d7a76859dc) was to remove the creation stack altogether, since the failure stack is more useful than the stack at the time of promise creation.

@briancavalier
Member

the problem is that saving off the promise creation stack retains references to objects along the stack

Hmm, I don't believe this is the case. Saving an exception or stack trace does not pin objects that were allocated on those stack frames into memory. In fact, stack traces are generated lazily in most VMs now (the .stack property is actually a getter function that generates the stack trace on demand).
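You can see the laziness directly in Node/V8 (Error.prepareStackTrace is V8's formatting hook; it's used here only to show when the formatting actually happens):

Error.prepareStackTrace = function (err, frames) {
    console.log('stack is being formatted now');
    return frames.join('\n');
};

var e = new Error('boom');   // nothing logged yet
console.log('error created');
console.log(e.stack);        // formatting happens here, on first access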

That said, the change you made also prevents the monitor from stitching together long stack traces. It uses the in-memory promise graph to do that (see formatStackJumps, which traverses promise ancestor relationships, and is no longer called), so it could be that something about that graph traversal (or maybe the graph creation & maintenance) is using more memory than I thought.
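Conceptually, the stitching just walks those parent links and concatenates the recorded stacks. A simplified version (not the actual formatStackJumps; it assumes entries of the { stack: ..., parent: ... } shape from the sketch above) would look like:

function formatLongStack(entries, key) {
    var parts = [];
    var entry = entries[key];
    while (entry) {
        parts.push(entry.stack);
        // follow the ancestral edge to the parent promise's entry, if any
        entry = entry.parent != null ? entries[entry.parent] : null;
    }
    return parts.join('\nFrom previous event:\n');
}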

Thanks for linking to your change--it gives me a place to start looking. I should have time this week to look into it more, and I'll let you know what I find. If you find any more clues, let me know!

@ashwinr
Author

ashwinr commented Sep 19, 2013

Cool, thanks for looking into it.


@briancavalier
Member

Hey @ashwinr, the monitor in 3.0 (just released) is much lighter than 2.x. We decided to start over with something much simpler, and carefully evolve it as needed. If you have a chance to run your same profiling use cases with it, I'd be very interested to see the results. Thanks!

@briancavalier
Member

@ashwinr I'm gonna close this since the monitor in 3.x, at least so far, has been much better on memory. If you see the problem again in 3.x, please feel free to reopen or create a new issue. Thanks!

@ashwinr
Author

ashwinr commented Apr 2, 2014

Cool, sounds good.


@rkaw92
Contributor

rkaw92 commented Apr 13, 2016

Hi. I'm seeing a very similar problem on when 3.7.7.
This is a minimal test case, which resembles the code that I've been having problems with in production:

var when = require('when');
require('when/monitor/console');

function read() {
    return when.resolve().delay(0).then(function() {
        callRead();
    });
}

function callRead() {
    read();
}

callRead();

It simulates a ReadableStream's control flow, where callRead() would typically be the .push() into the stream's internal buffer (and .read() is automatically called again in flowing mode). The resemblance may be unimportant; the general point is that memory leaks when Promises are used as a trampoline for recursion.

The code's memory usage goes up to arbitrary numbers of megabytes, until the GC kicks in. Commenting out the require('when/monitor/console') makes the process float from 25MB to 32MB and back. Tested on Node 4.4.2, though I have been tracking this memory leak since we deployed our app on 0.10 and a few versions of when earlier.

I see two possibilities: either the code really is leaking references to Promises, and the console monitor only brings about the demise of the process faster, or it's the monitor that is leaking something (I've only seen Promise references in the Chrome inspector so far). I'll keep watching the "stable" process for a few hours to determine whether there is a memory leak without the monitor module, but so far it seems there is none.
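For anyone reproducing this, a minimal way to watch the resident set size alongside the test case is plain Node (the 5-second interval is an arbitrary choice):

setInterval(function () {
    var rss = process.memoryUsage().rss;
    console.log('rss: ' + Math.round(rss / 1024 / 1024) + ' MB');
}, 5000);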

@rkaw92
Contributor

rkaw92 commented Apr 13, 2016

I've been watching the process without the when/monitor/console for a few hours now, and it is hanging around 31MB of RSS. The monitor module must be the culprit.
