
Question: Normalizr usage with nested objects in React/Redux #1255

Closed
elado opened this issue Jan 20, 2016 · 15 comments

@elado

elado commented Jan 20, 2016

My data models look like this:

user
message has sender: user and recipient: user
conversation has messages: arrayOf(message) and lastMessage: message

And UI structure is roughly this:

<Messenger>
  <ConversationList conversations={conversations}>
    {conversations.map(conversation => (
      <ConversationItem conversation={conversation}>
        <Message message={conversation.lastMessage}>
          <UserAvatar user={conversation.lastMessage.sender} /> {conversation.lastMessage.body}
        </Message>
      </ConversationItem>)
    )}
  </ConversationList>
  <Conversation conversation={selectedConversation}>
    {selectedConversation.messages.map(m => (
      <Message message={m}>
        <UserAvatar user={m.sender} /> {m.body}
      </Message>
    ))}
  </Conversation>
</Messenger>   

(tl;dr: a conversation list where each item shows its last message, and a selected conversation that shows its list of messages. Each component is in a separate file.)

When fetching, all entities go into the entities reducer as expected, and relationships turn into IDs.

I want to reload a component if any of its nested objects have been updated, so if some user has its avatar updated, a <UserAvatar> inside a <Message> anywhere should update.

Question:

Where does it make sense to denormalize the data (using @connect) while minimizing unnecessary updates?
I can do it in each component individually, but essentially that would make all my components @connected.

import React from 'react'
import { connect } from 'react-redux'

@connect(
  (state, ownProps) => {
    return {
      // ownProps.user can be an ID; if so, look the user up in entities
      user: typeof ownProps.user === 'string' ? state.entities.user[ownProps.user] : ownProps.user
    }
  }
)
export default class UserAvatar extends React.Component {
  render() {
    return <img src={this.props.user.avatarUrl} />
  }
}

Is it a good approach? Would it scale with many flowing updates?

@gaearon
Contributor

gaearon commented Jan 20, 2016

Not specific to Redux but here's the approach I recommend.

Let every component receive the data it uses as props, but use connect() to retrieve the data that only its children need. This is a very nice tradeoff, and I've found it pleasant to work with.
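A plain-JS sketch of that tradeoff, using a hypothetical normalized state shaped like the one in this thread (all names here are illustrative, not taken from anyone's actual code): only the container selects and denormalizes; the presentational children just receive objects.

```javascript
// Hypothetical normalized state, mirroring the entities reducer described above.
const state = {
  entities: {
    users: { u1: { id: 'u1', name: 'Ada', avatarUrl: '/ada.png' } },
    messages: { m1: { id: 'm1', body: 'hi', sender: 'u1' } },
    conversations: { c1: { id: 'c1', lastMessage: 'm1' } }
  },
  activeConversations: ['c1']
}

// What the container's mapStateToProps could do: resolve IDs into objects
// so <ConversationItem>, <Message>, <UserAvatar> can stay dumb.
function mapStateToProps(state) {
  const { users, messages, conversations } = state.entities
  return {
    conversations: state.activeConversations.map(id => {
      const c = conversations[id]
      const last = messages[c.lastMessage]
      return { ...c, lastMessage: { ...last, sender: users[last.sender] } }
    })
  }
}

const props = mapStateToProps(state)
console.log(props.conversations[0].lastMessage.sender.name) // → 'Ada'
```

The children never see an ID: by the time props reach them, every relationship has been resolved by the single connected container.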

This is also a duplicate of #815.

@markerikson
Contributor

For what it's worth: I'm prototyping pieces of my first React/Redux app right now, and I have a use case where the opposite seems to work well. In particular, I've got multiple types of items that I need to represent in a treeview, which can be nested as A>B, A>C>D>B, A>E>B. A is basically just a containing "folder" in the tree, while B/C/D/E are actual entity types.

My initial implementation has each parent rendering tree nodes for its children and passing in IDs, and each child has an appropriate mapStateToProps to pull in the entity it cares about by ID, plus some related data. Since a given normalized entity already has just the IDs of its children, and individual nodes need to update separately (say, drawing a highlight on the currently selected tree node), this seems to be working out well for me.

Again, still a newbie and still just prototyping, but tossing this out there as a counter-example.
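A minimal sketch of this ID-passing pattern (state shape and names are illustrative, not from the actual app): the parent renders children with only an `id` prop, and each child's own mapStateToProps looks up its entity plus related UI state.

```javascript
// Hypothetical normalized tree state.
const state = {
  entities: {
    nodes: {
      a: { id: 'a', label: 'Folder A', children: ['b'] },
      b: { id: 'b', label: 'Item B', children: [] }
    }
  },
  ui: { selectedNodeId: 'b' }
}

// What a connected <TreeNode> child could use as mapStateToProps(state, ownProps):
// the parent only passed ownProps.id, so the child pulls everything else itself.
function mapStateToProps(state, ownProps) {
  const node = state.entities.nodes[ownProps.id]
  return {
    node,
    isSelected: state.ui.selectedNodeId === ownProps.id,
    childIds: node.children // each child renders its own children the same way
  }
}

const props = mapStateToProps(state, { id: 'b' })
```

Because each node subscribes to the store individually, selecting one node only re-renders the nodes whose `isSelected` value changed.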

@gaearon
Contributor

gaearon commented Jan 20, 2016

The problem with passing just an ID is that it's hard to render arbitrary data. In some cases it can be useful to render a "draft" or "fake" state instead of what's currently in the store under the given ID. If you want to render empty states in a list of items, you can't, because the parent component doesn't reliably know whether the data for the corresponding IDs has been fetched. It's also harder to test components that require IDs. Finally, it's harder to create something like a "visual palette" of components, because instead of passing static fake props you have to prefill the store.

So not saying it's inherently bad but there are downsides to always passing IDs.

@elado
Author

elado commented Jan 22, 2016

Thanks.

I also don't like passing IDs as props.
However, making sure each parent denormalizes the data for its children is also a lot of work.

Before integrating Redux, my models just had defined properties for their relationships (I use js-data for models), so it was easy.

Maybe normalizr should have a denormalize method that uses getters and reads from some collection (e.g. the Redux state from the entities reducer). That way it wouldn't run into infinite loops while denormalizing, wouldn't read entities unless they are requested (lazily), and everything would stay flat and immutable in the store.

I'll sketch something.

@gaearon
Contributor

gaearon commented Jan 22, 2016

I don't think preemptive denormalization is a good approach. You'll likely end up recreating many objects that haven't really changed, thus killing performance optimizations at the component level. Creating objects gradually as components need them is likely to be more performant because you'll utilize existing performance optimizations at the component level and won't recreate more objects than necessary.
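One common way to get "create objects only as needed" without recreating them on every read is a memoized selector. This is a reselect-style sketch (reselect itself isn't mentioned in this thread, so treat the names and shape as illustrative): the output object is rebuilt only when one of the inputs changes by reference, so component-level shallow-equality optimizations keep working.

```javascript
// Minimal memoized-selector sketch: recompute only when an input reference changes.
function createSelector(inputFns, compute) {
  let lastArgs = null
  let lastResult
  return state => {
    const args = inputFns.map(fn => fn(state))
    if (lastArgs && args.every((a, i) => a === lastArgs[i])) return lastResult
    lastArgs = args
    lastResult = compute(...args)
    return lastResult
  }
}

// Illustrative usage: denormalize a user, but only when the inputs changed.
const selectUser = createSelector(
  [state => state.entities.users, state => state.selectedUserId],
  (users, id) => ({ ...users[id] }) // object is rebuilt only on input change
)

const state1 = { entities: { users: { u1: { id: 'u1' } } }, selectedUserId: 'u1' }
const a = selectUser(state1)
const b = selectUser(state1) // same inputs → same object reference, no recreation
```

Since `a === b` holds for unchanged inputs, a shallow props comparison in the component sees "nothing changed" and skips the re-render.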

@elado
Author

elado commented Jan 22, 2016

@gaearon

It can be something like:

let conversations = state.messenger.activeConversations
  .map(c => state.entities.conversations[c])
  .map(c => denormalize(c, schema.conversation))

const denormalize = (entity, schema, bag) => {
  const instance = { ...entity }

  for (let relation of /* schema relations */) {
    let relationId = entity[relation.key]
    if (relationId) {
      Object.defineProperty(instance, relation.key, {
        // TODO: memoize value, handle arrays, handle polymorphic etc.
        get() { return denormalize(bag[relation.type][relationId], relation, bag) }
      })
    }
    else {
      instance[relation.key] = null
    }
  }

  return instance
}

When denormalizing a conversation, only a single object is created, with getter properties. A relationship is fetched and denormalized lazily, only when the corresponding property is accessed. The returned object can be cached as long as the entity in the bag hasn't changed, so no extra denormalization happens each time a property is accessed.
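The caching described here can be sketched with a WeakMap keyed by the raw entity object: because the store is immutable, an updated entity is a new object, so the cache invalidates itself naturally. This is a simplified, runnable illustration with a single hard-coded relation (a real version would walk the schema, and would also need to account for the bag changing):

```javascript
// Cache of denormalized wrappers, keyed by the raw (immutable) entity.
// Simplified: keyed only by the message object; a full version would
// also invalidate when related entities in the bag change.
const cache = new WeakMap()

function denormalizeMessage(message, users) {
  if (cache.has(message)) return cache.get(message)
  const instance = { ...message }
  Object.defineProperty(instance, 'sender', {
    get() { return users[message.sender] } // resolved lazily, on access
  })
  cache.set(message, instance)
  return instance
}

const users = { u1: { id: 'u1', avatarUrl: '/a.png' } }
const msg = { id: 'm1', body: 'hi', sender: 'u1' }
const d1 = denormalizeMessage(msg, users)
const d2 = denormalizeMessage(msg, users) // same entity object → cached wrapper
```

`d1 === d2`, so components comparing props by reference won't re-render, while `d1.sender` still reads the latest user out of the bag on each access.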

What do you think?

@gaearon
Contributor

gaearon commented Jan 23, 2016

If you can make it work efficiently please feel free to make an attempt (either inside or outside normalizr).

@elado
Author

elado commented Jan 23, 2016

@gaearon
Initial commit, please take a look: elado/normalizr@35c8acb
It needs to be inside normalizr because the schema type references are not exposed. I'm also thinking about changing the way relationships are saved in the schema, because right now I need to skip all the keys that start with _ in order to get the relation names.

Before making a PR I'll try it in my own app to see performance and usage.

@gaearon
Contributor

gaearon commented Jan 23, 2016

Yeah, this would be best with an example proving that shouldComponentUpdate() optimizations still work well and that large updates are efficient.
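For context, the shouldComponentUpdate() optimization in question is usually a shallow props comparison, as in React's PureComponent. A minimal sketch (with illustrative data) of why it only helps when unchanged data keeps its object identity:

```javascript
// Shallow equality: same keys, each value identical by reference.
// This mirrors the check a pure component performs on its props.
function shallowEqual(a, b) {
  const ka = Object.keys(a)
  const kb = Object.keys(b)
  return ka.length === kb.length && ka.every(k => a[k] === b[k])
}

const user = { id: 'u1' }
const props1 = { user }
const props2 = { user }               // same reference inside → skip re-render
const props3 = { user: { id: 'u1' } } // fresh object → re-render, same data
```

This is why preemptively recreating denormalized objects defeats the optimization: `props3` holds equal data but fails the reference check, forcing a re-render.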

@markerikson
Contributor

@gaearon , just took a look at your new tree example, and saw you'd chosen to implement it by passing IDs to the child components. I'm a bit surprised, given that your comments here just a few days ago seemed to suggest you had reservations about that approach, though maybe I'm over-interpreting what you said.

Still, nice example, and thanks for taking the time to put it together.

@gaearon
Contributor

gaearon commented Jan 25, 2016

Haha, yeah, I was just about to check in on this thread again :-). Normally I'd use the approach I suggested, but it's harder to make it performant when there are thousands of entities and arrays are involved. So it's a tradeoff: some convenience in exchange for performance.

@elado
Author

elado commented Jan 26, 2016

So far, with hundreds to thousands of entities, my approach seems promising :)
No tradeoffs: the root container denormalizes the root entities, and all other components remain dumb.

@diegohaz

Any updates on this?

@gaearon
Contributor

gaearon commented May 16, 2016

Why would there be any updates on this? What kind of updates do you expect?

@diegohaz

Hi @gaearon, I was waiting for @elado's denormalize PR, but I have since found denormalizr.
