-
Notifications
You must be signed in to change notification settings - Fork 462
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
mutations with different files in parallel, results lost #649
Comments
Hey @gabor I would guess the reason you're getting this is because we have a dedupExchange which prevents multiple identical requests from being in-flight at the same time. You could try removing the dedup exchange from the client which would look something like this in your create client call: {
url: // ...
exchanges: [cacheExchange, fetchExchange]
} |
So I just forked the example and it looks like it @gabor do you have a real world example for this use case where this is desired functionality? If you are looking to send many real-time mutations to a server, I think websocket communication might be the way to go. |
(sorry for the long response, i am trying to respond to multiple replies here :-) @andyrichardson honestly i consider not deduplicating-mutation-responses desired functionality for every mutation i write. for me they are conceptually similar to POST requests in a REST api. when those are triggered,they should happen, their response should be returned. for queries (not mutations), i see why one wants to deduplicate. also, please note,what here is happening, is not really deduplication. the second mutation is not dropped.the second mutation is allowed to fire, it's the result from it that is dropped. i checked the dedup-exchange source code, and here https://github.com/FormidableLabs/urql/blob/master/packages/core/src/exchanges/dedup.ts#L13 it seems mutations are allowed through always...? even if i think in use-cases:
trying out the recommended workaround: @andyrichardson @JoviDeCroock
but it was still the same.i tried adding
but it still did the "deduplication". regarding the recommendation to use websockets: summary:
|
So to summarise this a little:
they are not. But we recognise a response by its
So we did find the
Depends 😅 Is the File fix sufficient for you? In general, I haven't seen multiple mutations that are identical and needed to be sent multiple times. But I was planning to implement a "mutation series exchange" to let mutations wait for one another |
thanks a lot for working on this problem.
thank you, that improves the situation. but..hmm.. i find it confusing/surprising, that two mutations go out, and one's response is just ignored. for me that feels simply wrong. i mean, if you write code like this:
then the intuitive thing would be to get two different responses, right?
may i ask a question in the opposite direction? what is the value of deduplicating mutations? i think we both agree that they seldom happen. if that's so,why not do the less surprising thing there? i assume that maybe it's hard to do with how internally urql works. to answer your question: yes, the file-fix will very probably fix my real-world problem. but, if i could flip a switch and make all my mutations not-deduplicated, i would flip the switch immediately. because then i just do not have to think about this problem. ( if i could just send a new uuid() into every mutation as |
I'd argue in this case the intuitive result isn't necessarily correct. This is a pretty rare case already and given the semantics of But as I've said, that's more in the scope of another exchange that runs mutations in series and treats them as different, I believe.
It's not particularly hard per se and we could go the extra mile to make it work, but it goes against a lot of the concepts we've established and is all-in-all a pretty unexplored use-case in GraphQL. Mutations are often operations on data. That also means that they're not necessarily unique by operation, but unique by intention, which is very much what more assumptions in our normalized cache (or any normalized cache) is based on 😉 We assume that if we go down the route of every operation being unique then we're in the realm of
I mean, suppose though that you wouldn't have run into this — My interpretation of real-world cases is that I find it unlikely that you would've run into it in the first place 😅 And in this case I even think that it's something that this failure case is otherwise easy to spot 🤔 |
(i started to write this a couple minutes before the ticket was closed with the fix, so i might as well send it) @kitten thanks for the response. i will think more about how to design mutations for my use-cases. for this ticket i think #650 can be considered as fixing it. p.s: i see a |
@gabor we were evaluating making mutations distinct all the time, but I believe keeping that for when it’s strictly necessary may be more intuitive for now 🙌 however we’ll definitely come back to this when it’s time for getting breaking changes in 👍 The interesting thing about not treating mutations as distinct are use-cases where it’s implied by the user which makes for pretty interesting use-cases. For instance in an example of ours where we implemented an unlimited upvote, users could press upvote as often as they’d like and see the number tick up. Usually for optimistic updates to work in that case we’d have to manually block the button when the mutation is in progress, but with automatic deduplication, we don’t 😅 so there are also unseen advantages to this limitation for optimistic updates |
codesandbox links:
server: https://codesandbox.io/s/graphql-server-1-l6ry3
client: https://codesandbox.io/s/graphql-client-1-dtnkx
i have a schema with a single mutation:
the server always returns a random string (it contains a random uuid).
then i have an urql-client, where if you press a button,
i execute two
createMessage
mutations at the same time,and display their result.
expected behavior:
actual behavior:
i looked into the code,and what seems to be happening is this:
fetchPolicy: 'network-only'
, but that did not helpam i doing something wrong here? for me it seems mutations should not be deduplicated this way.
i understand that i can just send some mutation-variable that always changes and it willl solve the problem, but that seems ugly :(
p.s: this is just a minimal example, the place where i found the issue was with using the new multipart-form-data-exchange, i was uploading pictures, and only the picture changed in the mutations. i assume the key-hash does not take the separately traveling blobs into account.
The text was updated successfully, but these errors were encountered: