Large file upload (streaming support) #546
@nszceta did you check out the
@r0fls input, not output. Request streaming to a file, not response streaming.
ah yes, that's why you said upload :)
I have successfully implemented a method for this in Flask, using a server-side session due to the insane way browsers stream files to the server. Here is a snippet of my Flask code I used in one of my projects:

```python
from flask import Flask, jsonify, request, session

app = Flask(__name__)

@app.route('/image', methods=['POST'])
def image_post():
    rf = request.files['files[]']
    ext = rf.mimetype.split('/')[1]
    dst = session['image'] = 'tmp/{}.{}'.format(session['key'], ext)
    try:
        # Extract the starting byte from the Content-Range header string.
        range_str = request.headers['Content-Range']
        start_bytes = int(range_str.split(' ')[1].split('-')[0])
        # Append mode always writes at the end of the file, so the seek is
        # effectively a no-op; this works because chunks arrive in order.
        with open(dst, 'ab') as f:
            f.seek(start_bytes)
            f.write(rf.stream.read())
        return jsonify({})
    except KeyError:
        # No Content-Range header means this is the first (or only) chunk.
        with open(dst, 'wb') as f:
            f.write(rf.stream.read())
        return jsonify({})
```

A few issues:
I'd love to see support for this in Sanic, and I may have time to implement it as well, but I would like more feedback first.
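For context, a client drives an endpoint like the Flask one above by sending the file in ordered chunks, each carrying a Content-Range header. Below is a minimal Python client sketch using the requests library; the URL, chunk size, and the 'files[]' field name are assumptions taken from the snippet above, not part of the original project.

```python
import os

import requests

UPLOAD_URL = "http://localhost:5000/image"  # assumed location of the endpoint above
CHUNK_SIZE = 1024 * 1024  # 1 MiB per request


def upload_in_chunks(path, mimetype="image/png"):
    total = os.path.getsize(path)
    # A Session keeps the server-side session cookie between chunk requests.
    client = requests.Session()
    with open(path, "rb") as f:
        offset = 0
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            headers = {}
            if offset > 0:
                # "bytes start-end/total" is the format the handler above parses.
                headers["Content-Range"] = "bytes {}-{}/{}".format(
                    offset, offset + len(chunk) - 1, total
                )
            files = {"files[]": (os.path.basename(path), chunk, mimetype)}
            client.post(UPLOAD_URL, files=files, headers=headers).raise_for_status()
            offset += len(chunk)


upload_in_chunks("large_image.png")
```

The first chunk carries no Content-Range, so it hits the handler's KeyError branch and creates the file; every later chunk appends.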
I would like to see this too. I miss the "request.stream" feature a lot (similar to request.body, but a file-like object that is not already in memory).
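As an illustration of the kind of interface being asked for, here is a minimal sketch of a file-like wrapper over a queue of incoming body chunks; the class name and the None end-of-stream sentinel are made up for this example.

```python
import asyncio


class QueueStream:
    """Hypothetical file-like view over a queue of incoming body chunks."""

    def __init__(self, queue: asyncio.Queue):
        self._queue = queue
        self._eof = False

    async def read(self) -> bytes:
        # Return one chunk per call, and b"" once exhausted, mirroring the
        # way a regular file object signals end-of-file.
        if self._eof:
            return b""
        chunk = await self._queue.get()
        if chunk is None:
            self._eof = True
            return b""
        return chunk
```

A handler could then loop on await read() and write each chunk to disk without ever holding the whole body in memory.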
@frnkvieira what would be the best way to get started?
I am happy to contribute some changes to Sanic to address this. It looks like some rework is needed, but not quite a "pretty big refactoring"; am I missing something?
This could be covered by #697, but I'm not sure. @nszceta, do you care to test with your example that was larger than the machine's RAM? cc @38elements
@r0fls sure. I can also add a compact unit test soon.
What's the current state on this?
I use Get to override default
Sanic stores the request body in memory even for streamed requests: https://github.com/channelcat/sanic/blob/master/sanic/server.py#L278. The framework has no flow control at all, unfortunately.
@asvetlov
The line mentioned above pushes data into the queue; the data is already in memory.
@Jeffwhen I agree with @asvetlov. You need to
Is there any way to do flow control using
Sure, there is, but somebody should make a Pull Request :)
One temporary solution for nginx users: nginx-upload-module
What a pity; aiohttp already supports it.
One temporary piece of advice for flow control in sanic/server.py:

```python
# line 247 nearby
def on_headers_complete(self):
    from multidict import CIMultiDict
    self.request = self.request_class(
        url_bytes=self.url,
        headers=CIMultiDict(self.headers),
        version=self.parser.get_http_version(),
        method=self.parser.get_method().decode(),
        transport=self.transport
    )
    # Remove any existing KeepAlive handler here,
    # it will be recreated if required on the new request.
    if self._keep_alive_timeout_handler:
        self._keep_alive_timeout_handler.cancel()
        self._keep_alive_timeout_handler = None
    if self.is_request_stream:
        self._is_stream_handler = self.router.is_stream_handler(
            self.request)
        if self._is_stream_handler:
            ################################## fix-1 start
            # Bounded queue so the producer can detect backpressure.
            self.request.stream = asyncio.Queue(10)  # whatever size you like
            ################################## fix-1 end
            self.execute_request_handler()

# line 267 nearby
def on_body(self, body):
    if self.is_request_stream and self._is_stream_handler:
        #################################### fix-2 start
        async def put_body(body):
            if self.request.stream.full():
                # Queue is full: stop reading from the socket until the
                # handler has drained a chunk, then resume.
                self.transport.pause_reading()
                await self.request.stream.put(body)
                self.transport.resume_reading()
            else:
                await self.request.stream.put(body)
        self._request_stream_task = self.loop.create_task(put_body(body))
        #################################### fix-2 end
        return
    self.request.body.append(body)
```
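For reference, a handler consuming the queue-based request.stream set up by the patch above could look like the sketch below. The route, destination path, stream=True registration, and the None end-of-stream sentinel are assumptions for illustration, not something specified by the patch itself.

```python
from sanic import Sanic
from sanic.response import json

app = Sanic("upload_example")

# stream=True is what makes router.is_stream_handler() (checked in the
# patch above) return True, so body chunks are queued instead of buffered.
@app.post("/upload", stream=True)
async def upload(request):
    written = 0
    # Placeholder destination; a real handler would pick a proper path.
    with open("/tmp/upload.bin", "wb") as out:
        while True:
            chunk = await request.stream.get()
            if chunk is None:  # assumed end-of-stream sentinel
                break
            out.write(chunk)  # blocking write, acceptable for a sketch
            written += len(chunk)
    return json({"bytes_received": written})
```

With the bounded queue from fix-1, a slow handler here causes the protocol to call pause_reading() in fix-2, which is exactly the backpressure the thread is asking for.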
Closed per #1423
This issue has been mentioned on Sanic Community Discussion. There might be relevant details there: https://community.sanicframework.org/t/large-file-upload-using-sanic-api/1326/1
I started the above thread on the Sanic community forum after going through the documentation and the issues reported on GitHub. I can't see any option to upload a file larger than RAM in Sanic. I just wanted to know whether this is a limitation of Sanic or there is a way to enable file uploads larger than RAM.
There is currently no support for uploading files larger than the amount of RAM available to the machine.