setImmutably: don't clone all objects #56612
Conversation
Size Change: +626 B (0%) Total Size: 1.72 MB
Force-pushed from 7467bf0 to 343e776.
This looks good at first glance, though I don't feel competent enough to review it adequately. I have no objections to it.
}
return acc[ key ];
}, newObject );
shallowClone[ first ] = setImmutably( shallowClone[ first ], rest, value );
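For context, here is a hedged sketch of the recursive version the quoted line belongs to, reconstructed from the fragment above; the actual diff may differ in its details.
export function setImmutably( object, path, value ) {
	// Normalize single keys into an array path.
	path = Array.isArray( path ) ? path : [ path ];

	// End of the path: the new value replaces whatever was here.
	if ( path.length === 0 ) {
		return value;
	}

	const [ first, ...rest ] = path;

	// Shallow-clone only the container on the current path segment;
	// untouched siblings keep their identity.
	const shallowClone = Array.isArray( object )
		? [ ...object ]
		: { ...object };

	shallowClone[ first ] = setImmutably( shallowClone[ first ], rest, value );

	return shallowClone;
}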
Is there a way to achieve the same thing without recursion? I think recursion has a performance overhead that is best avoided in low-level functions that get called often.
Unlike cloneObject, which used a recursive function for every object inside, this one is only called for every item in the path. But I can check.
ChatGPT gave me this; is it equivalent? :P
export function setImmutably(object, path, value) {
path = Array.isArray(path) ? path : [path];
if (path.length === 0) {
return value;
}
const shallowClone = Array.isArray(object)
? [...object]
: { ...object };
let currentObj = shallowClone;
for (let i = 0; i < path.length - 1; i++) {
const key = path[i];
currentObj = currentObj[key] && typeof currentObj[key] === 'object'
? { ...currentObj[key] }
: {};
currentObj[key] = currentObj;
}
currentObj[path[path.length - 1]] = value;
return shallowClone;
}
I'm guessing it's wrong for nested arrays.
The for loop is weird; it should be:
for ( let i = 0; i < path.length - 1; i++ ) {
const key = path[ i ];
currentObj[ key ] = Array.isArray( currentObj[ key ] )
? [ ...currentObj[ key ] ]
: { ...currentObj[ key ] };
currentObj = currentObj[ key ];
}
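For completeness, a full iterative version assembled from that corrected loop could look like the sketch below; it illustrates the non-recursive alternative being discussed and is not the code that was committed.
export function setImmutably( object, path, value ) {
	path = Array.isArray( path ) ? path : [ path ];

	if ( path.length === 0 ) {
		return value;
	}

	// Shallow-clone only the root; deeper containers are cloned as the
	// loop walks down the path.
	const shallowClone = Array.isArray( object ) ? [ ...object ] : { ...object };

	let currentObj = shallowClone;
	for ( let i = 0; i < path.length - 1; i++ ) {
		const key = path[ i ];
		currentObj[ key ] = Array.isArray( currentObj[ key ] )
			? [ ...currentObj[ key ] ]
			: { ...currentObj[ key ] };
		currentObj = currentObj[ key ];
	}

	currentObj[ path[ path.length - 1 ] ] = value;
	return shallowClone;
}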
The number of nested function calls equals path.length. Even if you did something like a forEach or reduce loop, you'd have the same number of nested functions. It really seems OK to me. We're not deep cloning here.
non-tail-recursive
Yes, this is the main thing for me. I agree that it's harder to read, and for this particular use case there might not be much recursion (the path is short in most cases), so I'm fine shipping this. But I think in JavaScript specifically it's best to avoid recursion: function arguments get copied over and over again from caller to callee, and the same goes for return values.
I spent some time looking at the original code, and it used to do:
_.setWith( object ? _.clone( object ) : {}, path, value, _.clone )
That means we indeed did only a shallow clone of the root, plus shallow clones along the specified path.
Thanks for locating this discrepancy; it seems like we were way more opportunistic when we did the migration.
My take on the recursion approach here is that it will still be better than what we had before, which always deep-cloned the entire input object. I also find it quite a bit more readable than the loop alternative.
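For readers unfamiliar with _.setWith, here is a small illustration of that behaviour (the state shape is made up for the example; lodash is used directly):
import _ from 'lodash';

const state = { a: { x: 1 }, b: { y: 2 } };

// Shallow-clone the root, then let the _.clone customizer shallow-clone each
// container that _.setWith walks through along the path.
const next = _.setWith( _.clone( state ), [ 'a', 'x' ], 3, _.clone );

console.log( next.a === state.a );  // false: 'a' lies on the updated path
console.log( next.b === state.b );  // true: the untouched branch is shared
console.log( state.a.x, next.a.x ); // 1 3: the original is not mutated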
Pushed 7bdcfd0
I tweaked it even further after that; I think it makes it more readable and avoids the if statement. I also cached prev[ key ]. Anyway, at this point we're probably micro-optimising for performance and readability. :)
I think this is a neat simplification and optimization of the existing approach 👍
I've left some thoughts, but don't see anything that should block us from shipping it.
Thanks for improving it @ellatrix! 🚀
@@ -93,24 +45,20 @@ function cloneObject( object ) {
 * @return {Object} Cloned object with the new value set.
 */
export function setImmutably( object, path, value ) {
	const normalizedPath = normalizePath( path );
	const newObject = object ? cloneObject( object ) : {};
	path = Array.isArray( path ) ? path : [ path ];
Should we use a new constant here? Mutation can introduce potential issues as we refactor in the future, and storing in a new constant is cheap.
It's not mutating in this case, just reassigning.
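A minimal illustration of the distinction (hypothetical helper, not from the PR):
// Reassigning the parameter only rebinds the local variable; the caller's
// value is untouched. Mutation would mean modifying the value the caller holds.
function normalize( path ) {
	path = Array.isArray( path ) ? path : [ path ]; // local rebinding
	return path;
}

const key = 'color';
normalize( key );
console.log( key ); // 'color', the caller still holds the original string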
Furthermore, I see we're ditching the numeric path support, and I confirm that this is fine. I can't think of a use case that won't work properly without the specific handling we used to do.
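As a hedged illustration with the sketches above (exactly what the old numeric handling did is an assumption based on this comment, not something shown in the thread):
// A numeric segment on a missing container now yields a plain object keyed by
// the number; the old normalizePath handling may have produced an array here.
setImmutably( undefined, [ 'a', 0 ], 'x' );
// => { a: { '0': 'x' } }

// Containers that already exist as arrays are still cloned as arrays.
setImmutably( { a: [ 'old' ] }, [ 'a', 0 ], 'x' );
// => { a: [ 'x' ] }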
What?
Currently, setImmutably is deep cloning all objects blindly, while only the objects/arrays lying in the path to be updated should be cloned.
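In other words (an illustrative identity check with made-up state):
const prevState = { styles: { color: 'red' }, other: { untouched: true } };
const nextState = setImmutably( prevState, [ 'styles', 'color' ], 'blue' );

nextState.styles === prevState.styles; // false: it lies on the updated path
nextState.other === prevState.other;   // true: it keeps its identity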