
Support limits in PNG animation decoding #2112

Merged: 13 commits into image-rs:master on Jan 21, 2024

Conversation

@Shnatsel (Contributor) commented on Jan 16, 2024

Implements a memory limit for animation decoding and dimension limits for individual animation frames, similar to how
it's done for GIF.

Notable changes:

  • Added a helper function reserve_buffer on Limits to be reused in the PNG and GIF animation decoders (see the sketch after this list)
  • Implemented allocation tracking in ApngDecoder
  • Changed the PNG decoder's new() to default to no limits (consistent with other decoders) now that the limits are actually enforced
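
For illustration, here is a minimal sketch of what such a helper boils down to, written as a free function over image 0.24's existing Limits::check_dimensions, Limits::reserve and ColorType::bytes_per_pixel APIs; the exact name, signature, and placement in the PR may differ.

use image::{error::ImageResult, io::Limits, ColorType};

// Sketch only: the PR adds the helper on `Limits` itself so the PNG and GIF
// animation decoders can share it; this free-function form is illustrative.
fn reserve_buffer(limits: &mut Limits, width: u32, height: u32, color: ColorType) -> ImageResult<()> {
    // Reject frames whose dimensions alone exceed the configured limits.
    limits.check_dimensions(width, height)?;
    // Charge the decoded frame buffer against the remaining byte budget.
    limits.reserve(u64::from(width) * u64::from(height) * u64::from(color.bytes_per_pixel()))
}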

Out of scope of this PR:

Open questions:

let info = self.reader.info();
limits.check_dimensions(info.width, info.height)?;
self.limits = limits;
// TODO: add `png::Reader::change_limits()` and call it here
@fintelia (Contributor) commented:

We can't really implement a change_limits method on the png side.

The problem is that many of the allocations in that crate happen when you create the png::Reader by calling png::Decoder::read_info because that's where the image metadata is read and saved into the inner Info struct. Ironically, the crate allocates relatively little memory for decoding pixel data, because it is quite good at decoding that data in a streaming fashion.

Rather, the current plan is to switch to calling Decoder::set_limits in this method, and only later call read_info when the user queries metadata beyond what's stored in the IHDR.
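
As a rough sketch of that ordering using the png crate's existing Limits/set_limits API (the file handling and budget value here are made up for illustration):

use std::fs::File;
use std::io::BufReader;

// Set the byte budget before read_info(), since read_info() is where the
// metadata chunks are read and allocated.
fn open_with_budget(path: &str, max_bytes: usize) -> Result<png::Reader<BufReader<File>>, png::DecodingError> {
    let mut decoder = png::Decoder::new(BufReader::new(File::open(path)?));
    let mut limits = png::Limits::default(); // defaults to a 64 MiB budget
    limits.bytes = max_bytes;
    decoder.set_limits(limits);
    // Allocations for metadata happen here and are checked against `limits`.
    decoder.read_info()
}

Calling set_limits only after read_info would be too late, because by then the metadata allocations have already happened.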

An alternative possibility would be for the png crate to not store metadata chunks as it reads them, but rather to store the offsets of those chunks and then Seek past them. This would bound the memory used by the decoder while still enabling the metadata to be loaded later. In that case, the limits could safely be set later. Adding the Seek bound would, however, be a breaking change both here and in the png crate. (An added complication is that BufReader's seek() implementation has some performance gotchas. It should be possible to work around them once Seek::seek_relative is stabilized, but I don't know if/when that'll happen.)
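
A purely hypothetical sketch of that alternative, with made-up names (the png crate does not currently work this way), just to show the offset bookkeeping:

use std::io::{BufReader, Read, Seek, SeekFrom};

// Instead of buffering a metadata chunk, remember where it lives and skip it.
struct DeferredChunk {
    kind: [u8; 4], // chunk type, e.g. *b"iCCP"
    offset: u64,   // byte offset of the chunk data within the stream
    length: u32,   // chunk data length from the chunk header
}

fn skip_chunk<R: Read + Seek>(
    reader: &mut BufReader<R>,
    kind: [u8; 4],
    length: u32,
) -> std::io::Result<DeferredChunk> {
    // Record where the chunk data starts so it can be loaded on demand later...
    let offset = reader.stream_position()?;
    // ...then seek past the data and the 4-byte CRC instead of reading them.
    reader.seek(SeekFrom::Current(i64::from(length) + 4))?;
    Ok(DeferredChunk { kind, offset, length })
}

Note that the explicit seek() here discards BufReader's internal buffer, which is exactly the performance gotcha mentioned above; seek_relative avoids that.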

@Shnatsel (Contributor, Author) replied:

The idea is that change_limits() would look at the amount of memory already allocated by the png crate and report an error if the crate is already over budget. This would let us easily bail out if the total memory usage limit is exceeded (including all the metadata).
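
Purely as a hypothetical sketch of that check (none of these names exist in the png crate; they only illustrate the accounting idea):

// Hypothetical: a budget that also remembers how much has already been allocated.
struct TrackedLimits {
    max_bytes: usize,       // byte budget the caller wants to enforce
    allocated_bytes: usize, // bytes already allocated, e.g. for metadata chunks
}

#[derive(Debug)]
struct LimitsExceeded;

impl TrackedLimits {
    // Swap in a new budget, failing if existing allocations already exceed it.
    fn change_limits(&mut self, new_max_bytes: usize) -> Result<(), LimitsExceeded> {
        if self.allocated_bytes > new_max_bytes {
            return Err(LimitsExceeded);
        }
        self.max_bytes = new_max_bytes;
        Ok(())
    }
}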

I don't think the Seek bound is a good idea (too limiting, and it largely defeats streaming decoding use cases), but a refactor to move the compositing logic into the png crate and track the allocations there could be viable.

@fintelia (Contributor):

I think that at some point we should probably move the frame compositing logic into the png and gif crates respectively, which would also solve the issues around sharing the allocation limit across the crate boundary. But that'll be a larger refactor, so I'm OK with merging this for now.

@Shnatsel (Contributor, Author):

Then let's go ahead and merge this. I plan to do some more work on the png decoder and I don't want to run into editing conflicts with myself 😆

@fintelia merged commit 4989d5f into image-rs:master on Jan 21, 2024. 35 checks passed.