lib: revert primordials in a hot path #38246
Conversation
There is definitely more to revert; I did not touch the http client, fs, util, and many other subsystems.
@nodejs/tsc Should we put a pause on further porting to primordials until we have clear guidelines on where they are safely applicable and what the workflow is otherwise?
@@ -88,7 +72,7 @@ const { CRLF } = common;

  const kCorked = Symbol('corked');

- const nop = FunctionPrototype;
+ const nop = function () {};
Suggested change:
- const nop = function () {};
+ const nop = function() {};
+1. I think we need a volunteer to investigate what needs to be done in V8 to make
@@ -234,7 +249,7 @@ function Writable(options) {
  // the WritableState constructor, at least with V8 6.5.
  const isDuplex = (this instanceof Stream.Duplex);

- if (!isDuplex && !FunctionPrototypeSymbolHasInstance(Writable, this))
+ if (!isDuplex && !realHasInstance.call(Writable, this))
These are not strictly the same. I guess this is from the revert?
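To illustrate why the two checks are not strictly the same, here is a minimal standalone sketch (the `Writable` class below is hypothetical, not the real `stream.Writable`): a class can override `Symbol.hasInstance` to duck-type instances, and the original `Function.prototype[Symbol.hasInstance]` ignores that override and only walks the prototype chain.

```javascript
'use strict';

// Hypothetical class overriding Symbol.hasInstance to duck-type
// instances, similar in spirit to the streams code's realHasInstance.
class Writable {
  static [Symbol.hasInstance](obj) {
    return obj !== null && typeof obj === 'object' &&
           obj._writableState !== undefined;
  }
}

const duck = { _writableState: {} };

// The class's own hook accepts the duck-typed object...
console.log(duck instanceof Writable); // true

// ...while the original Function.prototype[Symbol.hasInstance] only
// walks the prototype chain, so it rejects it.
const original = Function.prototype[Symbol.hasInstance];
console.log(original.call(Writable, duck)); // false
```

So whether a reverted line restores the captured hook or the original built-in changes which objects pass the check.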
@@ -132,7 +120,7 @@ const DEFAULT_IPV6_ADDR = '::';

  const isWindows = process.platform === 'win32';

- const noop = FunctionPrototype;
+ const noop = function () {};
Suggested change:
- const noop = function () {};
+ const noop = function() {};
We've learned from #37168 that primordials added some performance benefits to streams; what about those?
I've spent days trying to investigate which changes were "safe" performance-wise; I'm sorry I failed. But I feel that reverting the whole commit is a bit "lazy". I'd prefer that we investigate which changes actually introduce the performance penalty, which I reckon means running loads of HTTP benchmark runs (one for each change) and would take a long time.
I'm putting a "Request changes"; I'm open to dismissing it if the consensus is that my ask is not reasonable.
A valid point. However, even if I did agree (which I'm not sure I do), I think the "lazy" approach is appropriate at this point, as we are so close to a v16 release and (I assume?) we want to resolve the primary performance regression before that. @mcollina has already put a huge amount of time into sorting out the performance regressions. Also, before we put more energy into primordials, I think it would be a good idea to see what the TSC's preliminary thoughts are about the whole situation.
Every single run of the http benchmarks takes around 9 to 10 hours on my workstation, and I have been working around the clock to reach this stage. Running them for every individual change is unrealistic given the effort required to validate them all. My analysis is that some of the primordials prevent functions from being inlined and/or optimized as well as their non-primordial counterparts. After identifying this fact, I took a lazy approach as much as I could and removed them from the "serving http calls" hot path.
I've opened an alternative PR (#38248) with a smaller diff. PTAL.
I appreciate the reluctance to revert the entire commit, but I think it's the right thing to do here. We can then open new PRs that make these changes more incrementally, allowing us to benchmark individual changes as we go.
To be clear, I only reverted one commit as-is. For the rest, I was driven by a combination of flamegraphs and benchmarks.
  ObjectDefineProperty,
  Promise,
  ReflectApply,
@aduh95 the entirety of this file is a critical hot path.
@@ -586,7 +574,7 @@ Socket.prototype._read = function(n) {

  Socket.prototype.end = function(data, encoding, callback) {
-   ReflectApply(stream.Duplex.prototype.end, this, [data, encoding, callback]);
+   stream.Duplex.prototype.end.call(this, data, encoding, callback);
@aduh95 note the difference here. `ReflectApply` was allocating an additional array.
Yes, I agree using `ReflectApply` here was a mistake. Although, in most cases I've encountered, replacing it with `FunctionPrototypeCall` was good enough. I'll update the other PR.
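The allocation difference discussed above can be sketched in a plain script (the `end` function and `socket` object are illustrative stand-ins, not Node internals): `Reflect.apply` receives its arguments as an array, so every call site allocates one, while a direct `.call` passes them positionally and skips that allocation.

```javascript
'use strict';

// Stand-in for a method invoked on a hot path.
function end(data, encoding, callback) {
  return `${this.name}:${data}:${encoding}:${callback}`;
}

const socket = { name: 'sock' };

// ReflectApply style: [data, encoding, callback] is a fresh array
// allocated on every invocation.
const viaApply = Reflect.apply(end, socket, ['hi', 'utf8', 'cb']);

// .call / FunctionPrototypeCall style: same semantics, no intermediate
// array allocation.
const viaCall = end.call(socket, 'hi', 'utf8', 'cb');

console.log(viaApply); // 'sock:hi:utf8:cb'
console.log(viaApply === viaCall); // true
```

On a path invoked per request, avoiding that per-call array is exactly the kind of micro-cost the revert targets.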
@@ -141,13 +139,13 @@ function slowCases(enc) {
    case 4:
      if (enc === 'UTF8') return 'utf8';
      if (enc === 'ucs2' || enc === 'UCS2') return 'utf16le';
-     enc = StringPrototypeToLowerCase(`${enc}`);
+     enc = `${enc}`.toLowerCase();
All those changes were needed because `slowCases` did not get inlined anymore.
@@ -22,11 +22,7 @@
  'use strict';

  const {
-   ArrayPrototypePushApply,
All the ArrayPrototype operations were clearly problematic in my analysis.
I'm surprised that all of them would be, but I guess if that's the case it's OK to remove them all and try to add them back later if V8 optimizes this away.
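For context, here is a minimal sketch of what an `ArrayPrototypePushApply`-style call does versus the direct idiom the revert restores; the helper defined below is a simplified stand-in, not the actual primordials implementation. The semantics are identical, so the concern above is purely about how well V8 optimizes the indirect form.

```javascript
'use strict';

// Simplified stand-in for the primordial: push all items onto arr via
// Reflect.apply instead of a direct method call.
const ArrayPrototypePushApply = (arr, items) =>
  Reflect.apply(Array.prototype.push, arr, items);

const a = [1, 2];
ArrayPrototypePushApply(a, [3, 4]);
console.log(a); // [ 1, 2, 3, 4 ]

// The direct form: same result, but V8 sees an ordinary array method
// call it already knows how to optimize.
const b = [1, 2];
b.push(3, 4);
console.log(b); // [ 1, 2, 3, 4 ]
```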
lib/_http_server.js
@@ -625,7 +616,7 @@ function socketOnData(server, socket, parser, state, d) {
  function onRequestTimeout(socket) {
    socket[kRequestTimeout] = undefined;
    // socketOnError has additional logic and will call socket.destroy(err).
-   ReflectApply(socketOnError, socket, [new ERR_HTTP_REQUEST_TIMEOUT()]);
+   Reflect.apply(socketOnError, socket, [new ERR_HTTP_REQUEST_TIMEOUT()]);
Note for me: `ReflectApply` is not actually needed here; we can avoid an array allocation.
@@ -645,7 +636,7 @@ function onParserTimeout(server, socket) {
    socket.destroy();
  }

- const noop = FunctionPrototype;
+ const noop = function() {};
I have a strong suspicion that using a noop function defined elsewhere is not optimized away. No hard data to prove it.
We've seen that before. Not sure what causes it but reused noops definitely cause issues.
All noop functions have been replaced by `FunctionPrototype` in most of the files I checked :/
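The two patterns under discussion look like this; behaviour is identical (calling `Function.prototype` returns `undefined`, just like an empty function), and the suspicion above is only about how readily V8 inlines the shared, cross-module one.

```javascript
'use strict';

// Shared pattern (what the primordials change used): alias the built-in
// Function.prototype, which is itself a callable no-op.
const FunctionPrototype = Function.prototype;
const noopShared = FunctionPrototype;

// Local pattern (what this PR reverts to): a fresh empty function
// defined next to its call sites.
const noopLocal = function() {};

console.log(noopShared()); // undefined
console.log(noopLocal());  // undefined
```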
@aduh95 sorry for the (probably silly) question. Given we can't, and probably won't be able to, tamper-proof the code: is the primordials work, with its (negative) impact on code readability, even worth it at this point? What actual value are we getting from these changes?
@benjamingr why wouldn't we be able to tamper-proof the code, modulo performance concerns? It is entirely achievable for Node to be robust against the code it's trying to run, and arguably it's a massive bug that it never has been.
It appears that for the foreseeable future big 2-3x performance regressions are going to prevent this from happening across the entire Node.js API. I think there are enough people in the project who feel that a significant performance regression is not worth adding that additional guarantee to the API. The guarantee is only meaningful to some people (like me) if it actually provides a security guarantee; if it's impossible to provide one, providing the pretence of one is actually worse.
Our security model never made such a guarantee. Node does not currently support running untrusted code (certainly not out of the box). The docs are super explicit about this in places where it might be implied (like the vm module).
@benjamingr I think identifying which primordials are harmful for performance would help V8 improve their optimization engines, and I'm still hoping Node.js core internals will be tamper-proof one day.
I would prefer not to stall this. The whole premise of primordials is to increase safety while not slowing things down. The vast majority of those PRs were never benchmarked against http, which is one of our primary use cases. Seeing a 20% drop in #37937 was definitely a significant surprise. We are still far from getting back on track, and the longer we wait the harder it gets, as we risk introducing new bottlenecks without noticing.
I agree with @mcollina's sentiment of not stalling. I do think the optics of Node.js v16.0.0 going out with a known significant performance regression are relevant. I imagine that when a new major release goes out, a number of our users will run benchmarks; if we can patch (or revert) any known regressions beforehand, I think it makes sense to do so.
@@ -379,7 +376,7 @@ function howMuchToRead(n, state) {
    return 0;
  if (state.objectMode)
    return 1;
- if (NumberIsNaN(n)) {
+ if (Number.isNaN(n)) {
Do you have any evidence that "static" primordials can lead to performance issues? I'm okay with reverting the prototype ones but this is too much imo.
I'm asking because `NumberIsNaN` is literally `Number.isNaN` (same function identity).
I don't. The approach I used was to remove all of them from hot paths, literally with `%s/primordial/replacement/g`. If you want I can try adding those back in. I'm 99.9% sure the ones from `*Prototype*` are the major culprit (functions and arrays specifically).
LGTM
Closing in favor of #38248. It is a good compromise, and it focuses on all the primordials we identified as causing the slowdown.
As part of #37937, I have been tracking all regressions that happened in the HTTP response hot path. I have strong evidence that part of that is caused by the introduction of primordials throughout our codebase.
I recommend we stop "primordial" activities and benchmark things extensively before adding them. Specifically, there have been a few cases where PRs landed without benchmarks, or without benchmarking the hot path where those primordials were going to be used. Quite a few of the areas touched by this PR are extremely hot paths.