
event emitter leak in compression #58

Closed
usernameisalreadytaken2014 opened this issue Oct 25, 2015 · 5 comments

@usernameisalreadytaken2014

Here is my keep-alive function.

var ka = setInterval(function() {
    res.write(": keep-alive\n");
    res.flush();
}, 15000);
res.on('close', function() {
    clearInterval(ka);
});

Used on top of an HTTP response that delivers a stream of JSON updates, formatted as text/event-stream (hence the flush).

The keep-alive is useful for SSE streams that only update once in a while, such as transmitting new data points on graphs that have only one data point per minute.

Usually there are one or two (network) routers among the hops between client and server with a TCP state-table timeout of around 30 seconds, so the writes above help keep the channel open until the next event is ready for the client.

The expressjs/compression module, when enabled, has this side effect:

(node) warning: possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Gzip.addListener (events.js:239:17)
    at Gzip.Readable.on (_stream_readable.js:665:33)
    at Gzip.once (events.js:265:8)
    at Gzip.Zlib.flush (zlib.js:448:10)
    at ServerResponse.flush (compression/index.js:70:16)
    at null._repeat (sample.js:2:8)
    at wrapper [as _onTimeout] (timers.js:264:19)
    at Timer.listOnTimeout (timers.js:92:15)

I'm not adding any event listeners, however. Just flushing the output stream after writing.

Hmm, thinking about it, the keep-alive function is irrelevant. When using text/event-stream, the first big chunk of data will usually be pushed with just a single flush at the end, but once the stream gets "up to date" or whatever, every complete JSON object written will be followed by a flush().

Probably just random chance that this happened to me inside the keep-alive function...

Anyway, the error disappears if the compression module is disabled.

@dougwilson dougwilson self-assigned this Oct 25, 2015
@dougwilson
Contributor

Calling res.flush() will result in a listener leak in certain versions of Node.js due to a Node.js bug we cannot work around. Which version of Node.js are you running?
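Since the leaked listeners are only added when `flush()` is called under backpressure, one possible mitigation on affected Node versions is to skip the flush while the stream is backed up. This is a sketch, not an official fix, and `writeAndMaybeFlush` is a hypothetical helper name, not part of compression's API:

```javascript
// Write a chunk and flush only if the stream accepted it without
// backpressure; res is the (compression-wrapped) Express response.
function writeAndMaybeFlush(res, chunk) {
  const ok = res.write(chunk); // false means the stream needs to drain
  if (ok && typeof res.flush === 'function') res.flush();
  return ok;
}
```

The trade-off is that a chunk written while the stream is backed up stays in zlib's buffer until the next successful flush, so delivery of that chunk may be delayed.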

@dougwilson
Contributor

Looks like Node.js still hasn't fixed this issue. Here is a PR to Node.js core with a fix: nodejs/node-v0.x-archive#25679

@usernameisalreadytaken2014
Author

Hmm, I am running v4.2.1 on armv6l

@dougwilson
Contributor

@usernameisalreadytaken2014 , correct, it is not fixed. There is a PR to fix this in Node.js core: nodejs/node-v0.x-archive#25679 if you want to get them to accept the patch :)

@usernameisalreadytaken2014
Author

Hmm okay then. I left a note over in the nodejs tracker. Fingers crossed
