
How to send response to client while streaming. #1153

Closed
sunbarve opened this issue Mar 8, 2018 · 11 comments

@sunbarve

sunbarve commented Mar 8, 2018

I need to know how to send a response to the client for every chunk during streaming, to avoid a timeout error.

@sourcepirate
Contributor

Could you be more precise? If you are asking about Sanic's streaming API, the docs are here:
https://sanic.readthedocs.io/en/latest/sanic/streaming.html
Or, if your question is about something else, please elaborate so that we can help you.

@sunbarve
Author

sunbarve commented Mar 8, 2018

@sourcepirate Yes, I want to use the streaming API to upload a file from the client, and I have also checked the Sanic streaming API. My question is how to send a response to the client as each chunk is processed, e.g. "first chunk is saved", to keep the connection between client and server alive.

@sourcepirate
Contributor

sourcepirate commented Mar 8, 2018

I think you should take a look at #546 and #697. Thanks.

@sunbarve
Author

sunbarve commented Mar 8, 2018

As per Sanic's streaming API, it takes the whole request, streams the body chunk by chunk, and then, after the request completes, sends only one response to the client. I want to use this for a web-application feature like file upload, and get a response for every chunk while uploading.

@sourcepirate
Contributor

I believe you could read the request body in chunks with await request.stream.get().
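A minimal sketch of that read loop, based on the 2018-era Sanic streaming API, where `request.stream` behaves like an `asyncio.Queue` and `get()` returns `None` once the body is complete. Here a plain `asyncio.Queue` stands in for `request.stream`, and the names `read_body` and `demo` are illustrative, not part of Sanic:

```python
import asyncio


async def read_body(stream):
    # Mirrors the Sanic-0.7-era pattern: `await request.stream.get()`
    # yields each body chunk, and None once the request body has ended.
    body = b""
    while True:
        chunk = await stream.get()
        if chunk is None:
            break
        body += chunk
    return body


async def demo():
    # A plain asyncio.Queue simulates request.stream in this sketch.
    stream = asyncio.Queue()
    for part in (b"first ", b"second ", b"third", None):
        stream.put_nowait(part)
    return await read_body(stream)


print(asyncio.run(demo()))  # b'first second third'
```

In an actual Sanic handler of that era, the route would be registered with `stream=True` so the server hands the handler the stream instead of buffering the whole body.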

@sourcepirate
Contributor

sourcepirate commented Mar 9, 2018

If your question is about how to make the client send a file in chunks, you could refer to https://www.html5rocks.com/en/tutorials/file/filesystem/

@sunbarve
Author

@sourcepirate Thanks

@sunbarve
Author

@sourcepirate I want to send stream content to the client with the stream method, because per Sanic's response streaming, it streams all the data in the body and finally sends one response to the client, instead of sending a response for each chunk.
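The per-chunk acknowledgement being asked for could be sketched like this: as each uploaded chunk arrives, immediately write a status line to a streaming response rather than replying once at the end. This is a simulation, not real Sanic code — `upload_with_acks`, `write`, and `demo` are hypothetical names, an `asyncio.Queue` stands in for `request.stream`, and the `write` callback stands in for `response.write` inside a `sanic.response.stream` writer coroutine:

```python
import asyncio


async def upload_with_acks(request_stream, write):
    # Hypothetical handler body: acknowledge every uploaded chunk as it
    # arrives, instead of sending one final response after the whole body.
    n = 0
    while True:
        chunk = await request_stream.get()
        if chunk is None:
            break
        n += 1
        # ... persist `chunk` here ...
        await write(f"chunk {n} saved\n")
    await write("upload complete\n")


async def demo():
    # Simulated request body arriving in two chunks, then end-of-body.
    request_stream = asyncio.Queue()
    for part in (b"aaaa", b"bbbb", None):
        request_stream.put_nowait(part)

    acks = []

    async def write(line):  # stands in for response.write in this sketch
        acks.append(line)

    await upload_with_acks(request_stream, write)
    return acks


print(asyncio.run(demo()))
```

Whether Sanic's protocol at the time actually allowed interleaving request-stream reads with response writes on the same connection is exactly the open question in this thread.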

@sjsadowski
Contributor

As I'm interpreting it, I think the question is how to accept chunked streams and respond as each chunk completes. But I'm not 100% sure.

@vltr
Member

vltr commented Oct 1, 2018

As Sanic is architected today, this might be possible with some knowledge of the inner protocols. The Sanic server waits for all chunks of the body to arrive before calling the request handler, unless the handler is a stream handler, in which case it reads the stream data instead of the body. But as for sending data back to the client (even a \0) as each chunk is received, I really don't know whether that is possible or whether it would break the connection. Perhaps you could implement a timer in a task, but I'm not sure what all this means in terms of functionality, or whether a clean solution is even possible. I've tried to keep up with this, as this kind of issue has grown alongside a lot of other issues and PRs. Probably @r0fls and @ashleysommer (especially) can enlighten us better here. Personally speaking, this would need some documentation and/or examples for a future "Tips & Tricks" page.

@sjsadowski
Contributor

Closing, thread has stagnated and requester has not added to comments since org move.

viniciusd added a commit to viniciusd/sanic that referenced this issue Sep 20, 2019
Even if it were fixed, it would then face mypy bug with try/except conditional imports sanic-org#1153
python/mypy#1153