I don't see how we can easily do that. Crates have a fairly predictable compression ratio, and that maximum size looks reasonable for them, but we could be downloading archives that are legitimately gigabytes large.
Perhaps we could add that for downloading from crates.io.
For downloading binaries, I was thinking about applying a different limit for each compression method, based on its maximum compression ratio, as in the sketch below.
Though now that I think about it again, it doesn't sound all that useful except for catching bugs in the implementation.
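A minimal sketch of that per-format idea, purely to make it concrete (the `LimitedWriter` type, the `unpack_limit` helper, and the ratio numbers are all assumptions for illustration, not existing binstalk-downloader APIs): cap the bytes written during extraction, deriving the cap from the compressed size and an assumed worst-case ratio for the format.

```rust
use std::io::{self, Write};

/// Hypothetical adapter: refuses to write more than `remaining` bytes in total.
struct LimitedWriter<W> {
    inner: W,
    remaining: u64,
}

impl<W: Write> LimitedWriter<W> {
    fn new(inner: W, limit: u64) -> Self {
        Self { inner, remaining: limit }
    }
}

impl<W: Write> Write for LimitedWriter<W> {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        if (buf.len() as u64) > self.remaining {
            return Err(io::Error::new(
                io::ErrorKind::InvalidData,
                "unpacked data exceeds the configured size limit",
            ));
        }
        let n = self.inner.write(buf)?;
        self.remaining -= n as u64;
        Ok(n)
    }

    fn flush(&mut self) -> io::Result<()> {
        self.inner.flush()
    }
}

/// Hypothetical helper: derive a per-format ceiling from the compressed size
/// and an assumed worst-case compression ratio (numbers are illustrative only).
fn unpack_limit(compressed_len: u64, format: &str) -> u64 {
    let max_ratio: u64 = match format {
        "gz" | "zip" => 1_100,  // DEFLATE tops out at roughly 1032:1
        "xz" | "zst" => 10_000, // assumed bound for LZMA/zstd streams
        _ => 100,
    };
    compressed_len.saturating_mul(max_ratio)
}
```

Wrapping the extraction target in something like this would bound both a malicious archive and an implementation bug that keeps emitting output.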
I guess we could set a default limit with a CLI flag to increase it?
Yes, I think a flag to limit the ratio / maximum data written could be useful, since many binaries are actually quite small.
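As a sketch only (the flag name, default value, and the use of clap here are assumptions, not the actual cargo-binstall interface), a user-overridable cap could look like this:

```rust
// Requires clap with the "derive" feature enabled.
use clap::Parser;

#[derive(Parser)]
struct Args {
    /// Hypothetical flag: maximum unpacked size in bytes; raise this for
    /// legitimately huge archives.
    #[arg(long, default_value_t = 512 * 1024 * 1024)]
    max_unpack_size: u64,
}

fn main() {
    let args = Args::parse();
    println!("unpack limit: {} bytes", args.max_unpack_size);
}
```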
Cargo has recently added a size limit to unpacking to mitigate zip bombs: rust-lang/cargo#11337.
I wonder, should we also do the same in binstalk-downloader?
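For context, the general shape of such a guard (this is only an illustration of the idea, not cargo's actual implementation) is to stop pulling data from the decompressor once a byte budget is exhausted:

```rust
use std::io::{self, Read};

/// Illustrative guard: read at most `limit` decompressed bytes, erroring if
/// the stream would exceed that budget (e.g. a zip/tar bomb).
fn read_with_limit<R: Read>(reader: R, limit: u64) -> io::Result<Vec<u8>> {
    let mut limited = reader.take(limit.saturating_add(1));
    let mut buf = Vec::new();
    limited.read_to_end(&mut buf)?;
    if buf.len() as u64 > limit {
        return Err(io::Error::new(
            io::ErrorKind::InvalidData,
            "unpacked data exceeds the size limit",
        ));
    }
    Ok(buf)
}
```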