
mutations with different files in parallel, results lost #649

Closed
gabor opened this issue Mar 20, 2020 · 8 comments · Fixed by #650
Labels
bug 🐛 Oh no! A bug or unintended behaviour.

Comments

gabor commented Mar 20, 2020

codesandbox links:
server: https://codesandbox.io/s/graphql-server-1-l6ry3
client: https://codesandbox.io/s/graphql-client-1-dtnkx

i have a schema with a single mutation:

type Mutation {
  createMessage: String!
}

the server always returns a random string (it contains a random uuid).

then i have an urql-client, where if you press a button,
i execute two createMessage mutations at the same time,
and display their result.

expected behavior:

  • i get two different messages

actual behavior:

  • i get the same message two times

i looked into the code, and what seems to be happening is this:

  • two ajax-requests are sent out (the two mutations), and both ajax-requests return with different messages. so far so good.
  • but, both ajax-requests contain the same operation-key, so (it seems) as soon as one finishes, URQL finishes both mutations with the same result.
  • i tried to use a fetchPolicy: 'network-only', but that did not help
  • it seems the operation-key is a hash from the mutation and variables. so, when the mutation and variables are the same, the operation-key is the same.
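The key-collision described above can be sketched like this. This is a toy illustration of the idea, not urql's actual hashing code (the real key derivation lives in `@urql/core`); the djb2-style `hash` here is a stand-in:

```javascript
// Illustrative sketch: an operation key derived by hashing the query
// document together with the stringified variables. Identical documents
// plus identical variables collide on the same key.
function hash(str) {
  let h = 5381;
  for (let i = 0; i < str.length; i++) {
    h = ((h << 5) + h + str.charCodeAt(i)) | 0; // keep it in 32-bit range
  }
  return h;
}

function operationKey(query, variables) {
  return hash(query + JSON.stringify(variables ?? {}));
}

const mutation = 'mutation { createMessage }';

// Two identical mutations (same document, same variables) share one key,
// so results routed by key cannot be told apart.
const keyA = operationKey(mutation, null);
const keyB = operationKey(mutation, null);
console.log(keyA === keyB); // true
```

Note that a separately-attached file blob never enters `operationKey` here, which mirrors the suspicion above: if only the blob differs, the keys still collide.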

am i doing something wrong here? for me it seems mutations should not be deduplicated this way.

i understand that i can just send some mutation-variable that always changes and it will solve the problem, but that seems ugly :(

p.s: this is just a minimal example. the place where i found the issue was using the new multipart-form-data-exchange: i was uploading pictures, and only the picture changed between the mutations. i assume the key-hash does not take the separately traveling blobs into account.

gabor added the bug 🐛 label Mar 20, 2020
andyrichardson commented Mar 20, 2020

Hey @gabor I would guess the reason you're getting this is because we have a dedupExchange which prevents multiple identical requests from being in-flight at the same time.

You could try removing the dedup exchange from the client which would look something like this in your create client call:

{
  url: '...', // your API url
  exchanges: [cacheExchange, fetchExchange]
}

andyrichardson commented Mar 20, 2020

So I just forked the example, and it looks like it might be related to the fetch exchange rather than to the other exchanges you're using.

@gabor do you have a real world example for this use case where this is desired functionality? If you are looking to send many real-time mutations to a server, I think websocket communication might be the way to go.

kitten changed the title from "mutations in parallel, results lost" to "mutations with different files in parallel, results lost" Mar 20, 2020
gabor commented Mar 20, 2020

(sorry for the long response, i am trying to respond to multiple replies here :-)

@andyrichardson honestly i consider not deduplicating mutation-responses desired functionality for every mutation i write. for me they are conceptually similar to POST requests in a REST api: when those are triggered, they should happen, and their response should be returned.

for queries (not mutations), i see why one wants to deduplicate. also, please note, what is happening here is not really deduplication: the second mutation is not dropped. the second mutation is allowed to fire, it's the result from it that is dropped. i checked the dedup-exchange source code, and here https://github.com/FormidableLabs/urql/blob/master/packages/core/src/exchanges/dedup.ts#L13 it seems mutations are always allowed through...?
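The behaviour described in that dedup.ts line can be paraphrased as a sketch (this is a simplified stand-in, not the actual exchange source): mutations always pass the dedup filter, only queries with an in-flight key are dropped. The "half-deduplication" then happens later, when responses are routed back by operation key:

```javascript
// Paraphrased sketch of the dedup filter: the request side never dedupes
// mutations; only duplicate in-flight queries are dropped.
function makeDedupFilter() {
  const inFlightKeys = new Set();
  return function shouldForward(operation) {
    if (operation.kind === 'mutation') return true; // mutations always pass
    if (inFlightKeys.has(operation.key)) return false; // duplicate query: drop
    inFlightKeys.add(operation.key);
    return true;
  };
}

const forward = makeDedupFilter();
console.log(forward({ kind: 'query', key: 1 }));    // true (first time seen)
console.log(forward({ kind: 'query', key: 1 }));    // false (deduped)
console.log(forward({ kind: 'mutation', key: 2 })); // true
console.log(forward({ kind: 'mutation', key: 2 })); // true (never deduped)
```

So both mutation requests really do reach the network; it's the shared key on the response side that collapses their results.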

even if i think in use-cases:

  • imagine a mutation like 'createNote', where i send a string, it creates a note from it, and it returns the ID of the newly created note. if i write two notes too quickly, or the backend responds too slowly, the ID of the second note will be lost. or, in other words, there is an observable difference between writing the two notes fast and writing them slowly. i think there should not be.

trying out the recommended workaround: @andyrichardson @JoviDeCroock
i tried this:

const client = new Client({
  url: '...',
  exchanges: [cacheExchange, fetchExchange]
});

but it was still the same. i tried adding

client.mutation(MUTATION, null, { requestPolicy: "network-only" })

but it still did the "deduplication".

regarding the recommendation to use websockets:
my use-case does not really send that many mutations that fast. i am uploading images, and sometimes i upload multiple images at the same time. (i see #650 was opened to handle that case too)

summary:

  • in my opinion, mutations should never be cached. (of course i realize this may not be everyone's opinion)
  • in my opinion, what currently happens with mutations is a strange middle-ground, where they are somehow "half-deduplicated": the requests are sent out, but the responses are "deduplicated". i think it should be either not-deduplicated or fully-deduplicated like queries (i do have my preference :-)
  • even if the current behavior is considered correct, it would be nice to find a workaround, to be able to reach the behavior i prefer

kitten commented Mar 20, 2020

So to summarise this a little:

mutations should never be cached

they are not. But we recognise a response by its key, which is based on the query document and variables.

what currently is happening with mutations is a strange middle-ground, where they are somehow "half-deduplicated"

So we did find the File case to be bugged so I sent a quick fix PR :)

is considered correct, would be nice to find a workaround

Depends 😅 Is the File fix sufficient for you? In general, I haven't seen multiple identical mutations that needed to be sent multiple times. But I was planning to implement a "mutation series exchange" to let mutations wait for one another.

gabor commented Mar 20, 2020

thanks a lot for working on this problem.

what currently is happening with mutations is a strange middle-ground, where they are somehow "half-deduplicated"

So we did find the File case to be bugged so I sent a quick fix PR :)

thank you, that improves the situation. but.. hmm.. i find it confusing/surprising that two mutations go out and one's response is just ignored. for me that feels simply wrong. i mean, if you write code like this:

client.mutation(MUTATION).toPromise().then(response => console.log(response));
client.mutation(MUTATION).toPromise().then(response => console.log(response));

then the intuitive thing would be to get two different responses, right?

is considered correct, would be nice to find a workaround

Depends 😅 Is the File fix sufficient for you? In general, I haven't seen multiple identical mutations that needed to be sent multiple times. But I was planning to implement a "mutation series exchange" to let mutations wait for one another.

may i ask a question in the opposite direction? what is the value of deduplicating mutations? i think we both agree that they seldom happen. if that's so, why not do the less surprising thing there? i assume that maybe it's hard to do with how urql works internally.

to answer your question: yes, the file-fix will very probably fix my real-world problem. but, if i could flip a switch and make all my mutations not-deduplicated, i would flip the switch immediately, because then i just do not have to think about this problem. (if i could just send a new uuid() into every mutation as key, i would 😄)

kitten commented Mar 20, 2020

then the intuitive thing would be to get two different responses, right?

I'd argue in this case the intuitive result isn't necessarily correct. This is a pretty rare case already and given the semantics of urql as documented in "Concepts" it makes sense that this isn't the case 😅

But as I've said, that's more in the scope of another exchange that runs mutations in series and treats them as different, I believe.

if that's so, why not do the less surprising thing there? i assume that maybe it's hard to do with how urql works internally.

It's not particularly hard per se and we could go the extra mile to make it work, but it goes against a lot of the concepts we've established and is all-in-all a pretty unexplored use-case in GraphQL.

Mutations are often operations on data. That also means that they're not necessarily unique by operation, but unique by intention, which is very much what many assumptions in our normalized cache (or any normalized cache) are based on 😉

We assume that if we go down the route of every operation being unique, then we're basically in the realm of fetch.

i would flip the switch immediately. because then i just do not have to think about this problem.

I mean, suppose you hadn't run into this: my interpretation of real-world cases is that it's unlikely you would've run into it in the first place 😅 And in this case I even think this failure case is otherwise easy to spot 🤔

gabor commented Mar 22, 2020

(i started to write this a couple minutes before the ticket was closed with the fix, so i might as well send it)

@kitten thanks for the response. i will think more about how to design mutations for my use-cases. for this ticket i think #650 can be considered as fixing it.

p.s: i see a Math.random() in that pull-request, so perhaps our opinions are not that different after all 😉

kitten commented Mar 22, 2020

@gabor we were evaluating making mutations distinct all the time, but I believe keeping that for when it’s strictly necessary may be more intuitive for now 🙌 however we’ll definitely come back to this when it’s time for getting breaking changes in 👍

The interesting thing about not treating mutations as distinct is the use-cases where deduplication is implied by the user. For instance, in an example of ours where we implemented an unlimited upvote, users could press upvote as often as they'd like and see the number tick up.

Usually, for optimistic updates to work in that case, we'd have to manually block the button while the mutation is in progress, but with automatic deduplication we don't 😅 so there are also unseen advantages to this limitation for optimistic updates.
