curl does not agree with return in chunked mode #1722

Closed
HeeroYui opened this issue Nov 27, 2019 · 7 comments · Fixed by #1876

@HeeroYui commented Nov 27, 2019

For uploading a large file (> 200 MB) I wrote:

from sanic import Sanic, response

app = Sanic()

@app.post('/data', stream=True)
async def handler(_request):
    async def streaming(_response):
        total_size = 0
        while True:
            body = await _request.stream.read()
            if body is None:  # end of the request body
                break
            total_size += len(body)
            print("body " + str(len(body)) + "/" + str(total_size))
    return response.stream(streaming)

And I sent the file with:

curl -F 'file=@Totally_Spies.mp4' -H 'transfer-encoding:chunked' 127.0.0.1:15080/data -X POST -O


curl sends about 95 MB before stopping with this error:

curl: (56) Illegal or missing hexadecimal sequence in chunked-encoding


I do not understand what is wrong!
@Tronic (Member) commented Nov 28, 2019

Sanic's default request maximum size is 100 MB. Set REQUEST_MAX_SIZE=10000000000 for up to 10 GB.
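
For example (a minimal sketch, not from this thread, assuming a Sanic app object like the one in the snippet above):

app.config.REQUEST_MAX_SIZE = 10_000_000_000  # raise the global request-size limit to ~10 GB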

Security precaution: a remote attacker could consume all your RAM by sending a very large request to your non-streaming endpoints.

IMO the main limit ought not to apply to streaming handlers, where the handler itself may stop the transfer at any point, and where the whole request doesn't need to be buffered in RAM. But this requires some structural changes in Sanic, because currently such requests are rejected as soon as a header states a Content-Length larger than the limit. Routing should be done as soon as the path is received, so that the limit can be ignored for streaming handlers.

@HeeroYui (Author)
Thank you very much.
This is not a problem for me; memory is already capped by the virtualization of the service.
But is it possible to configure this limit dynamically, in a temporary way that lifts the limit and then restores it once the transfer is done?

@Tronic (Member) commented Nov 29, 2019

No, it is not possible to configure it per-request in any meaningful way. Even if you could hack chunked encoding to work around the limit, any upload that provides the body size in a request header (Content-Length instead of chunked encoding) would still get rejected before routing, middleware or your handler are called.
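
As an illustration (not from the thread): a plain multipart upload, where curl knows the file size and therefore sends a Content-Length header, would be rejected up front no matter what the handler does, assuming the file is larger than REQUEST_MAX_SIZE:

curl -F 'file=@Totally_Spies.mp4' 127.0.0.1:15080/data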

stale bot commented Feb 27, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. If this is incorrect, please respond with an update. Thank you for your contributions.

stale bot added the stale label Feb 27, 2020
@Tronic (Member) commented Feb 27, 2020

This is to be sorted out in #1791

stale bot removed the stale label Feb 27, 2020
Tronic mentioned this issue Feb 28, 2020
@Tronic (Member) commented Mar 1, 2020

This is now sorted out in #1791 where streaming handlers may change the limit before reading the body:

@app.post("/bigdata", stream=True)
async def bigdata(request):
    request.stream.request_max_size = 1_000_000_000
    async for data in request.stream:
        pass  # do something with each chunk

Changing the limit before await request.stream.read() or the async for loop affects content-length and chunked modes equivalently. The maximum amount buffered by Sanic is still limited by app.config.REQUEST_MAX_SIZE, which should always be set to a low value.
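
For the read()-style loop used in the original report, the same idea would look roughly like this (a sketch, not taken from #1791; the route name and byte counting are illustrative):

from sanic import Sanic, response

app = Sanic()

@app.post("/bigdata_read", stream=True)
async def bigdata_read(request):
    # Raise the per-request limit before the first read; per the comment above,
    # this applies to Content-Length and chunked uploads alike.
    request.stream.request_max_size = 1_000_000_000
    total = 0
    while True:
        chunk = await request.stream.read()
        if chunk is None:  # end of the request body
            break
        total += len(chunk)
    return response.text("received " + str(total) + " bytes")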

stale bot commented May 30, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. If this is incorrect, please respond with an update. Thank you for your contributions.

stale bot added the stale label May 30, 2020
stale bot closed this as completed Jun 30, 2020