
Support custom field name for validation #17

Open
jaycle opened this issue May 1, 2018 · 9 comments

jaycle commented May 1, 2018

Sometimes the naming convention in the JSON from the server doesn't directly map to the style used on the front end. For example, data from the API may be in snake_case while the JavaScript uses camelCase. Since this library is already parsing and converting raw JSON to correctly formed TypeScript-compatible types, it seems this would be a great layer at which to adapt to these differences.

For example, something like the following syntax would be ideal:

const userDecoder: Decoder<User> = object({
  firstName: string('first_name'),
  lastName: string('last_name'),
  username: string()
});

where the string decoder has an optional string parameter.

cpjolicoeur (Member) commented:

Tagging @mulias for follow-up discussion.

cpjolicoeur (Member) commented:

Also, @jaycle we'd love to get a PR submitted with any changes for discussion if you are up for it.

mulias (Contributor) commented May 1, 2018

Hey @jaycle, thanks for pointing out this use case. The main project I've been using this tool on involves a pretty direct mapping from the input JSON to the objects I want to use, so this issue has not yet come up for me personally.

First off, this is how I would solve your problem with the tools already provided by the library:

const userDecoder: Decoder<User> = object({
  first_name: string(),
  last_name: string(),
  username: string()
}).map(user => ({
  ...user, 
  firstName: user.first_name,
  lastName: user.last_name
}));

I understand that this isn't a perfect solution, and you're probably looking for something more robust because you have lots of fields to map.

While your suggested syntax would be ideal for your situation, I hesitate to move forward with it for a few reasons. For one, adding the optional field name argument would allow the user to create decoders like this:

const stringDecoder: Decoder<string> = string('what_does_this_mean');

and then if you have an array, which of these would be the valid syntax? How do you communicate that in documentation?

const userDecoder: Decoder<User> = object({
  favoriteWords: array(string('favorite_words')),
  leastFavoriteWords: array(string(), 'least_favorite_words')
})

These issues are solvable, but I'm seeing a lot of new complexity which doesn't look promising.

In a situation like this the first place I'd look is an existing json combinator library. Here's what I'm thinking about right now, based on Elm's solution (http://package.elm-lang.org/packages/elm-lang/core/5.1.1/Json-Decode#map2). I don't love it, but maybe you'll see something worth pursuing?

const userDecoder: Decoder<User> = objectFields({
  firstName: valueAt(['first_name'], string()),
  lastName: valueAt(['last_name'], string()),
  username: valueAt(['username'], string())
})

For added ergonomics we could change valueAt to take a single string or an array.

I'm going to keep thinking about this for a bit, but I'm interested to hear your thoughts. If there's a direction you're feeling confident about I'd also love to see a PR.

jaycle (Author) commented May 2, 2018

First, thank you for providing this library. It seems really well designed and thought out, and I appreciate you considering my use case while balancing it against the overall design of the library. I'm fairly new to TypeScript and JavaScript, but I've looked around the source and it certainly looks like a non-trivial request. I'd love to take a stab at an implementation, but perhaps more discussion around the best (and most feasible/flexible) syntax should precede that. On the feasibility piece I would certainly defer to you @mulias.

I'm not opposed to the valueAt syntax but I'm with you in the sentiment that perhaps it could use some more thought. This isn't an urgent request for me and I'll try to investigate a bit further, too. Thanks for being willing to think through this with me.

mulias (Contributor) commented May 22, 2018

Hey @jaycle, have you had a chance to use this library more on your project? Have you tried out using map for the misnamed fields, or using some other helper method? Any other difficulties you've run into?

jaycle (Author) commented May 23, 2018

At the moment I've been using the fields with their names as they come from the server. I didn't like the map solution because it meant I had duplicate fields varying only in style. I did a bit more playing around with destructuring and found something a bit better along the same lines:

const userDecoder: Decoder<User> = object({
  first_name: string(),
  last_name: string(),
  username: string()
}).map(user => {
  const { first_name, last_name, ...splitUser } = user;
  return {
    ...splitUser,
    firstName: first_name,
    lastName: last_name
  };
});

Still a bit cumbersome but workable. Other than just looking at the code, I haven't had much time to play around with real solutions within the library. I think valueAt as you proposed is certainly better than the map solution. But since this isn't urgent I would hate for my use case to push you into a non-ideal solution if you have any hesitations. Have you had any better ideas come to mind?
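One way to cut the repetition in the destructuring approach would be a small standalone rename helper. This is a hypothetical sketch, not part of the library; `renameKeys` is a name invented here for illustration:

```typescript
// Hypothetical helper: copy an object, renaming selected keys.
// A sketch of how the .map() step could be generalized; not library code.
function renameKeys<T extends Record<string, unknown>>(
  obj: T,
  mapping: Record<string, string>
): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    // Use the mapped name if one is given, otherwise keep the original key.
    out[mapping[key] ?? key] = value;
  }
  return out;
}

// The decoder's .map() step would then become a one-liner:
//   .map(user => renameKeys(user, { first_name: 'firstName', last_name: 'lastName' }))
const renamed = renameKeys(
  { first_name: 'Ada', last_name: 'Lovelace', username: 'ada' },
  { first_name: 'firstName', last_name: 'lastName' }
);
```

This keeps a single source of truth for the snake_case-to-camelCase mapping, at the cost of losing some type precision on the mapped result.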


Yakimych commented Apr 16, 2019

Hi. Looks like this issue is a year old, but I've just discovered this library while looking for alternatives to bs-json in ReasonML, and this is exactly the functionality I was looking for :)

let line = json =>
  Json.Decode.{
    start:     json |> field("start", point),
    end_:      json |> field("end", point),
    thickness: json |> optional(field("thickness", int))
  };

Is there any reason to not have the string as a required parameter (rather than optional)?


mixvar commented May 21, 2019

For added ergonomics we could change valueAt to take a single string or an array.

I think this would be nice. Since the valueAt Decoder already exists I'd expect it to handle this use case.

@mulias would you like me to try to implement this?

mulias (Contributor) commented May 22, 2019

@mixvar That sounds fine to me -- the argument should be changed to something like paths: string | number | Array<string | number>. The reason why I didn't go that route was because I was following the design philosophy of ramda, which generally prefers having separate functions such as prop and path, instead of overloading the function signature. Since then I've started adding some other convenient overloads, so I think it's ok.
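As a sketch of that widened signature (type and function names here are assumptions for illustration, not the library's actual internals), the single argument could be normalized to an array once, so all call styles share one code path:

```typescript
// Sketch only: how valueAt's widened argument might be normalized internally.
// `Path` and `normalizePath` are hypothetical names, not the library's API.
type Path = string | number | Array<string | number>;

function normalizePath(paths: Path): Array<string | number> {
  // A bare string or number is treated as a one-element path.
  return Array.isArray(paths) ? paths : [paths];
}

const a = normalizePath('first_name');           // ['first_name']
const b = normalizePath(0);                      // [0]
const c = normalizePath(['user', 'first_name']); // ['user', 'first_name']
```

This mirrors how other libraries collapse the ramda-style prop/path split into one overloaded function.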

@Yakimych Ultimately the object decoder provided by this library works differently from bs-json. While a number of libraries like bs-json and elm's Json.Decode expect you to pluck specific fields to build your object, I decided to make the assumption that the json object getting decoded and the object returned by the decoder have the same fields. I had reasons for doing it that way, but I do still feel conflicted about it.

What I proposed in my initial response was a different decoder, which I named objectFields, which would work like the bs-json style of object decoding. That would solve the direct problem people are talking about in this thread, but I'm always sensitive to how adding new decoders might impact library api and documentation overhead.

The possible routes to take as I see them are:

  1. Do nothing, the library already has a fine solution for this situation with object({...}).map(...).
  2. Add a new decoder that is a variant on object but doesn't couple the resulting decoder's type to the shape of the json object. This might be the way to go, but if nothing else this new decoder needs a good name.
  3. Change the behavior of the object decoder. This would be a pretty major breaking change.
  4. Change the object decoder as in (3), but also keep the old object behavior as a variant decoder with a new name.

At this point I'm in no rush to make design decisions for this library, so if anyone wants to advocate for a particular option or put together a PR go for it.
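To make option 2 concrete, here is a minimal standalone sketch of what an objectFields-style decoder could look like. Everything below is hypothetical and heavily simplified: the "decoders" are plain functions that throw on failure rather than the library's Decoder type, and `at`, `str`, and `objectFields` are names assumed for illustration:

```typescript
// Hypothetical sketch of option 2: each result key is paired with a
// (fieldName, check) lookup into the raw json, so the result's shape
// is decoupled from the json's shape. Simplified stand-in for Decoder.
type Json = Record<string, unknown>;
type FieldSpec<T> = (json: Json) => T;

// Read a named field from the json and run a check on it.
const at = <T>(field: string, check: (v: unknown) => T): FieldSpec<T> =>
  (json) => check(json[field]);

// A trivial string check; a real decoder would return a Result instead.
const str = (v: unknown): string => {
  if (typeof v !== 'string') throw new Error('expected string');
  return v;
};

// Build an object by running each field spec against the same json value.
function objectFields<T>(spec: { [K in keyof T]: FieldSpec<T[K]> }) {
  return (json: Json): T => {
    const out = {} as T;
    for (const key in spec) {
      out[key] = spec[key](json);
    }
    return out;
  };
}

interface User { firstName: string; lastName: string; username: string; }

const userDecoder = objectFields<User>({
  firstName: at('first_name', str),
  lastName: at('last_name', str),
  username: at('username', str)
});

const user = userDecoder({ first_name: 'Ada', last_name: 'Lovelace', username: 'ada' });
```

The shape follows the bs-json/Elm field-plucking style discussed above: snake_case field names live only inside the at calls, and the result type drives the spec.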
