
Browser memory usage #89

Closed
ylt opened this issue Jul 27, 2022 · 9 comments · Fixed by #95

Comments

@ylt

ylt commented Jul 27, 2022

I've been finding that when uploading large files (e.g. 6 GB), the browser's RAM usage grows by roughly the size of the file. This can exceed the user's available RAM and end up crashing the browser tab, or even the user's machine.

Looking at this myself, I've been able to reproduce it on both Firefox (102) and Chrome (103) on both Windows and Mac M1.

Firefox's dev tools can identify a bunch of ArrayBuffers that take up as much RAM as the file itself, but can't attribute them to a JS call stack (maybe browser internals). Chrome's dev tools, however, can't see the memory at all.

Googling around, I'm not finding much that's authoritative, but it does seem like file.slice as used by UpChunk can be problematic:

this.reader.readAsArrayBuffer(this.file.slice(start, start + length));

The suggestions I'm finding point to using a ReadableStream instead, since that's stream-based and probably better suited to reading out chunks without loading the whole file into memory.
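For illustration, here's a rough sketch (not UpChunk's actual code) of how chunks could be read off the file's ReadableStream instead of slicing it, so that only about one chunk's worth of bytes is buffered at a time:

```ts
// Rough sketch only: read a File in fixed-size chunks via Blob.stream(),
// rather than file.slice() + FileReader.readAsArrayBuffer().
async function* streamChunks(file: File, chunkSize: number): AsyncGenerator<Uint8Array> {
  const reader = file.stream().getReader();
  let buffered = new Uint8Array(0);
  while (true) {
    const { done, value } = await reader.read();
    if (value) {
      // Append the freshly read bytes to whatever was left over from the last read.
      const merged = new Uint8Array(buffered.length + value.length);
      merged.set(buffered);
      merged.set(value, buffered.length);
      buffered = merged;
      // Emit full-size chunks as soon as enough bytes are available.
      while (buffered.length >= chunkSize) {
        yield buffered.slice(0, chunkSize);
        buffered = buffered.slice(chunkSize);
      }
    }
    if (done) break;
  }
  // Flush the final (possibly short) chunk.
  if (buffered.length > 0) yield buffered;
}

// Usage: upload each chunk as it becomes available.
// for await (const chunk of streamChunks(file, 30 * 1024 * 1024)) { /* PUT chunk */ }
```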

@dylanjha
Contributor

This is a really good find, @ylt. We'll dig into it and see what the best fix is.

Thanks for the clear debugging steps 🙏🏼

@cjpillsbury
Contributor

@ylt echoing @dylanjha, thanks for the sleuthing. Confirmed this is currently an issue. I'll be investigating possible solutions here; we'll want to strike a good balance between upload performance and memory efficiency. I'll be sure to post updates here on whatever direction we end up taking.

@digitalscream

@cjpillsbury - has there been any update on this?

@jfbaraky

jfbaraky commented Sep 1, 2022

We had problems trying to upload some large files (10 GB+) because we could only store up to 2 GB inside a Node Buffer.
So, instead of using UpChunk, we changed our code to a "manual approach":

  • Using fs.createReadStream to create a stream instead of loading the whole file into memory. Instead of the default chunk size, we used the highWaterMark option to increase the size of each chunk;
  • Buffer.byteLength to measure the size of each chunk;
  • axios to PUT each chunk to the upload URL (warning: axios treats a 308 response as an error by default);

Hope this helps until UpChunk handles streams directly. A rough sketch of the approach is below.
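For anyone wanting to try the same workaround, here's a sketch of that manual approach (not UpChunk code; the chunk size, the Content-Range header, and the assumption that the upload URL answers 308 for intermediate chunks are specific to our setup, so adjust for yours):

```ts
// Sketch of the manual Node approach described above (assumptions noted inline).
import fs from 'node:fs';
import axios from 'axios';

const CHUNK_SIZE = 64 * 1024 * 1024; // highWaterMark controls how many bytes each read yields

async function uploadInChunks(filePath: string, uploadUrl: string): Promise<void> {
  const fileSize = fs.statSync(filePath).size;
  // Stream the file instead of loading it all into memory.
  const stream = fs.createReadStream(filePath, { highWaterMark: CHUNK_SIZE });
  let offset = 0;

  // Node read streams are async iterable; each `chunk` is a Buffer of at most CHUNK_SIZE bytes.
  for await (const chunk of stream) {
    const length = Buffer.byteLength(chunk);
    await axios.put(uploadUrl, chunk, {
      headers: {
        'Content-Length': length,
        'Content-Range': `bytes ${offset}-${offset + length - 1}/${fileSize}`,
      },
      maxBodyLength: Infinity,
      // axios treats 308 ("Resume Incomplete") as an error by default, so accept it explicitly.
      validateStatus: (status) => (status >= 200 && status < 300) || status === 308,
    });
    offset += length;
  }
}
```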

@cjpillsbury
Contributor

@digitalscream None yet, unfortunately. Our team is split across multiple projects, so it may still be a few weeks before we can give this the attention it deserves.

@cjpillsbury
Contributor

If anyone on this thread wants to take a stab at migrating upchunk to use read streams, PRs are always welcome!

@cjpillsbury
Contributor

Hey all! I know this has been a long time coming, but we've got a refactor of UpChunk that uses ReadableStream. I haven't yet cut a new release (it will be a major release, since it's a fundamental rearchitecture of UpChunk), but I'll keep you posted.

@digitalscream

@cjpillsbury - thanks for that...I'll keep an eye out!

@cjpillsbury
Contributor

It's official:
https://www.npmjs.com/package/@mux/upchunk
https://github.com/muxinc/upchunk/releases/tag/v3.0.0
