Request Streaming is extremely slow #1587
Which version of Sanic did you run this on?
Sanic version 19.3.1. As was mentioned in #1535, the correct behavior would be to honor the client's request for `Expect: 100-continue`.
@Leonidimus Sorry, I didn't see you had provided the details in your original post. I edited it to move them into place. I am taking a look at this also as it relates to ASGI #1475.
My machine:
These are my results when I run your same code. Granted, I am not sure how big your test files are.
Regardless, I am not seeing the long delay times that you are on the Sanic server. I tried the test also using ASGI servers.
It does not work out of the box without sending the `Expect:` header.
Same results with hypercorn.
As a side note, hypercorn is also okay without the `Expect:` header.
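For reference, serving the same app through an ASGI server is normally just a command-line invocation. A minimal sketch, assuming the test app lives in a module named `server.py` exposing an `app` object (both names and the ports are assumptions, and Sanic only gained ASGI support with the work tracked in #1475):

```bash
# Illustrative only; module path, app name, and ports are assumptions.
uvicorn server:app --host 127.0.0.1 --port 8000
hypercorn server:app --bind 127.0.0.1:8000
```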
So, when 19.6 is released (assuming we complete ASGI support by then), this will be "fixed" by using one of the ASGI servers. The questions that I believe still need to be answered:
My thoughts are that (1) yes, we should respond to `Expect: 100-continue`. I could be wrong on point 2, and I am open to debate, but my reading of RFC 7231 is that the response is not required if it is not requested.
I am removing the Bug label because I do not think this is a bug per se; it is more of a feature request.
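To make the `Expect: 100-continue` mechanics concrete, here is a minimal, hypothetical wire-level sketch using plain asyncio (this is not Sanic's implementation): when the request headers announce the expectation, the server writes an interim `100 Continue` response before reading the body, which is exactly what curl is waiting for.

```python
import asyncio


async def handle(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    # Read the request head (request line + headers).
    head = (await reader.readuntil(b"\r\n\r\n")).decode("latin-1").lower()

    # Naive Content-Length parsing; fine for an illustration, not for production.
    length = 0
    for line in head.split("\r\n"):
        if line.startswith("content-length:"):
            length = int(line.split(":", 1)[1].strip())

    # RFC 7231: if the client sent "Expect: 100-continue", tell it to go ahead
    # before it starts sending (or gives up waiting and sends) the body.
    if "expect: 100-continue" in head:
        writer.write(b"HTTP/1.1 100 Continue\r\n\r\n")
        await writer.drain()

    body = await reader.readexactly(length)  # the client sends the body now

    payload = b"received %d bytes\n" % len(body)
    writer.write(b"HTTP/1.1 200 OK\r\nContent-Length: %d\r\n\r\n%s" % (len(payload), payload))
    await writer.drain()
    writer.close()


async def main() -> None:
    server = await asyncio.start_server(handle, "127.0.0.1", 8001)
    async with server:
        await server.serve_forever()


if __name__ == "__main__":
    asyncio.run(main())
```

The question debated above is only whether a framework should emit that interim response automatically whenever the header is present, or leave it to the application.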
It should also be noted that in the last go-round of "Sanic is slow when I test it with curl", the culprit ended up being curl; when people tested with other methods the slowness could not be reproduced, but it could be reproduced with curl.
We probably could add this support.
@ahopkins Thanks for the quick responses! Now that I see the project is actively supported, I can safely continue using the framework :) BTW, I tested with a popular JS library and a 22M file, and it worked very quickly.
Output:
There is a core team of developers working on Sanic as a community. Part of our reasoning for moving to the community-supported model was to foster an ongoing group of developers so that the project stays active. I am glad you feel this way 😄 I updated my comment above.
I added this to 19.9, but if anyone thinks they can handle providing this sooner, feel free to pick it up.
Just to give more insight, libcurl by default waits 1 second for a 100 Continue response before timing out and continuing the request. There's a thread about this behaviour: https://curl.haxx.se/mail/lib-2017-07/0013.html

@ahopkins Is anyone currently working on this? I would be glad to help.
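Until the server responds with `100 Continue`, a common client-side workaround for the delay described above is to suppress curl's `Expect` header so the body is sent immediately; a sketch (the endpoint and file name are assumptions):

```bash
# Passing an empty Expect header stops curl from adding "Expect: 100-continue",
# so it skips the one-second wait. URL and file name are placeholders.
curl -H 'Expect:' --data-binary @small.txt http://127.0.0.1:8000/upload
```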
@LTMenezes Would be happy to have you take a stab at this. Check with @yunstanford; it looks like he self-assigned this, so he may already be working on it.
Describe the bug
I created a simple POC app to accept streaming binary data. The syntax is compliant with the official docs. The app works as expected, but the response time is more than 1000 ms even for small data sizes (1.1K). In the snippet below I tried uploading 2 files: `tiny.txt` (57B) and `small.txt` (1137B). File `tiny.txt` took 0.021s and `small.txt` took 1.026s on average. Testing was done on the same host via loopback, so no network delay is involved. I think the issue is caused by Sanic not responding with 100-continue, so the client wastes time waiting for it.

Code snippet
Source code:
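The original snippet was not preserved in this thread; what follows is a minimal sketch in the style of the Sanic 19.x request-streaming API (the route path, port, and response body are assumptions, not the reporter's exact code):

```python
from sanic import Sanic
from sanic.response import text

app = Sanic(__name__)


# stream=True tells Sanic not to buffer the body; the handler reads it in chunks.
@app.post("/upload", stream=True)
async def upload(request):
    received = 0
    while True:
        chunk = await request.stream.read()  # returns None when the body is done
        if chunk is None:
            break
        received += len(chunk)
    return text("received {} bytes\n".format(received))


if __name__ == "__main__":
    app.run(host="127.0.0.1", port=8000)
```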
curl testing with Sanic:
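The exact commands were likewise lost; a plausible reconstruction of the timing test (file names and endpoint are assumptions):

```bash
# Time a streaming POST of each file; %{time_total} reports the full request time.
for f in tiny.txt small.txt; do
  curl -sS -o /dev/null -w "$f: %{time_total}s\n" \
       --data-binary @"$f" http://127.0.0.1:8000/upload
done
```

Note that curl versions contemporary with this report only add `Expect: 100-continue` for request bodies larger than roughly 1 KB, which is consistent with the 57B file going out immediately while the 1137B file hits the one-second wait.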
curl testing with a similar Golang/go-chi based app that returns 100-continue:
Expected behavior
I'd expect all small POSTs to take less than 50 ms
Environment (please complete the following information):