basic request streaming support with flow control #1423
Conversation
Commits:
- Merge upstream master branch
- …ntation fix StreamBuffer buffer_size
Codecov Report
@@            Coverage Diff             @@
##           master    #1423      +/-   ##
==========================================
+ Coverage   84.62%   84.74%   +0.11%
==========================================
  Files          17       17
  Lines        1704     1717      +13
  Branches      322      322
==========================================
+ Hits         1442     1455      +13
  Misses        203      203
  Partials       59       59
Continue to review full report at Codecov.
Can anyone take a look?
@yunstanford can review tomorrow.
Would it be possible to add a full-fledged example showcasing the use of this feature, so that it can come in handy for end users?
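For reference, here is a minimal sketch of what such an example could look like, assuming a handler that opts in with `stream=True` and pulls chunks with `await request.stream.read()`, where `None` is taken to signal that the body is exhausted; the route, file path, and port are made up for illustration:

```python
from sanic import Sanic, response

app = Sanic("upload_example")


# Hypothetical upload route; stream=True means the body is not read into
# memory up front but delivered chunk by chunk through request.stream.
@app.post("/upload", stream=True)
async def upload(request):
    bytes_received = 0
    with open("/tmp/upload.bin", "wb") as f:
        while True:
            chunk = await request.stream.read()
            if chunk is None:  # assumed sentinel for "body fully consumed"
                break
            bytes_received += len(chunk)
            f.write(chunk)
    return response.json({"bytes_received": bytes_received})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```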
Hmm, I'm wondering if StreamBuffer is a leaky abstraction. I feel like a "Buffer" that defines a read method should take an n-bytes argument specifying at most how many bytes the call should return. Without that, this becomes a thin wrapper around a Queue and doesn't really look like a "Buffer" or support the same workflows.
This may be a misunderstanding of how on_body gets invoked on my part, but it seems like buffer_size is actually a pretty loose limit, because there is no control over how big each item on the queue is. I think this solution as it stands is a good way of asking for "any amount of bytes as soon as they arrive", but I'm not sure it provides the finer-grained traffic control you might expect from streaming.
TBH I don't have a whole lot of experience with streaming data to the server, so I may be overthinking the use case.
@abuckenheimer I agree in general, but httptools doesn't expose the transport in a way that lets us control how many bytes to read from the socket. We could build another layer in StreamBuffer, but this is a simple fix for #546, providing basic data flow control. We can look into improving it later.
Gotcha. I still think REQUEST_BUFFER_QUEUE_SIZE is kind of meaningless, though; we need to restrict based on the number of bytes in the queue somehow. Making an actual async buffer is kind of complicated. I'm fine with moving forward on this as long as there is some sort of "alpha/beta" marker on streaming incoming requests, because I have to imagine the API will change.
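Purely as a hypothetical sketch of restricting by bytes rather than by item count (the class name, default limit, and methods are made up and not part of this PR), the producer could be blocked until the consumer drains the buffer below a threshold:

```python
import asyncio


class ByteBoundedBuffer:
    """Hypothetical buffer that applies backpressure on total buffered bytes
    instead of the number of queued items."""

    def __init__(self, max_bytes=64 * 1024):
        self._max_bytes = max_bytes
        self._buffered = 0
        self._queue = asyncio.Queue()
        self._has_space = asyncio.Event()
        self._has_space.set()

    async def put(self, chunk):
        # The producer (e.g. the protocol's body callback) waits here while
        # the consumer is behind; None is an end-of-body sentinel with no size.
        await self._has_space.wait()
        self._buffered += len(chunk) if chunk is not None else 0
        if self._buffered >= self._max_bytes:
            self._has_space.clear()
        self._queue.put_nowait(chunk)

    async def read(self):
        chunk = await self._queue.get()
        self._buffered -= len(chunk) if chunk is not None else 0
        if self._buffered < self._max_bytes:
            self._has_space.set()
        return chunk
```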
Yeah, the API doesn't need to change, as we can set a default value for …
Each item could potentially be any size in bytes; how do you plan on rechunking the bytes across items in the queue? Also, ideally the default would be to return all bytes, like a real buffer. I realize this is kind of an anti-pattern because then you're really no longer streaming the request, but I'd expect …
We'd need to introduce another layer in StreamBuffer for rechunking, but I haven't thought deeply about this, and it would mean introducing another API like read(n). Also, in reality, the common use case, like uploading a large file, is usually a loop; users might not be very interested in the exact number of bytes returned by each single read. We can definitely look into improving this in the future as mentioned before (it's not a small amount of work), and provide the basic functionality/support first.
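As a rough illustration of what such a rechunking layer might look like (the class and method names are hypothetical and not part of this PR), a small amount of leftover state can turn arbitrarily sized queue items into reads of at most n bytes:

```python
import asyncio


class ChunkedReadBuffer:
    """Hypothetical rechunking layer: turns arbitrarily sized queued chunks
    into reads of at most n bytes."""

    def __init__(self, maxsize=100):
        self._queue = asyncio.Queue(maxsize=maxsize)
        self._leftover = b""
        self._eof = False

    async def put(self, chunk):
        # The protocol feeds body chunks here; None marks the end of the body.
        await self._queue.put(chunk)

    async def read(self, n=-1):
        # n < 0 means "read everything remaining", like a regular buffer.
        if self._eof and not self._leftover:
            return b""
        while not self._eof and (n < 0 or len(self._leftover) < n):
            chunk = await self._queue.get()
            if chunk is None:
                self._eof = True
            else:
                self._leftover += chunk
        if n < 0:
            data, self._leftover = self._leftover, b""
        else:
            data, self._leftover = self._leftover[:n], self._leftover[n:]
        return data
```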
@ahopkins any other feedback?
@yunstanford @ahopkins Can we get this feature into the next release?
👍 Looks good to me.
@pypycoder Unfortunately not. We put a freeze on features a few weeks back for the 18.12 release (which is ready to ship). It should be added in 19.03.
@yunstanford @ahopkins this PR broke the graceful shutdown of streaming responses in 19.03.1 (perhaps I'll create an issue or PR later). UPD: #1558
Basic request streaming support; could be useful in cases like uploading large files.
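As a rough client-side illustration of the large file upload case (the URL and file name are made up), the body can be streamed from disk instead of being read into memory first, for example by passing a file object to requests:

```python
import requests

# Passing a file object as data lets requests send the body from the file in
# blocks rather than loading the whole file into memory first.
with open("large_file.bin", "rb") as f:
    resp = requests.post("http://localhost:8000/upload", data=f)

print(resp.status_code, resp.text)
```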