Add configurable processing limits for JSON parser (StreamReadConstraints) #637
Comments
Since 2.13 has been released, marking as 2.14 (earliest) due to the need for API additions and possible compatibility concerns for some maximum limits. Also: I think this issue is still important even if databind tackles some of the more immediate concerns (via FasterXML/jackson-databind#2816).
Some thinking out loud:
As to configuration settings, there probably needs to be a shared base type (…). One immediate concern I have, however, is that of adding state to ….
Will not happen for 2.14 but should be the first thing for 2.15.
Implemented for inclusion in 2.15 (see the list of constraints in the issue description below).
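For a concrete picture of how these limits are configured, here is a minimal sketch using the 2.15-style builder API as I understand it; the specific limit values are arbitrary, and the defaults noted in comments are the ones mentioned in this issue.

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.StreamReadConstraints;

public class ConstraintsConfigExample {
    public static void main(String[] args) {
        // Tighten limits below the defaults mentioned in this issue; values here are arbitrary
        StreamReadConstraints constraints = StreamReadConstraints.builder()
                .maxNestingDepth(500)       // default: 1000
                .maxNumberLength(100)       // default: 1000 chars
                .maxStringLength(1_000_000) // default: 5M chars
                .build();

        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(constraints)
                .build();

        // Parsers created from this factory now enforce the configured limits:
        // factory.createParser(...)
    }
}
```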
@cowtowncoder are there any more of these that you want to do for 2.15?
The total size could readily be limited using stream wrappers like those in https://github.com/pjfanning/json-size-limiter/tree/main/src/main/java/com/github/pjfanning/json/util (that code is borrowed). It should also be possible to find ByteBuffer equivalents. There is an argument that many web service frameworks have support for limiting the acceptable size of inputs, and that users should prefer those approaches. The earlier that the large input size is spotted, the better.
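For illustration, a minimal sketch of the stream-wrapper idea (not the linked code, just the same approach; the class name is hypothetical): wrap the input and fail fast once more bytes than allowed have been read, before the parser sees the rest.

```java
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch: rejects input once more than maxBytes have been read.
// Does not account for skip()/mark(); kept minimal on purpose.
public class SizeLimitingInputStream extends FilterInputStream {
    private final long maxBytes;
    private long readSoFar;

    public SizeLimitingInputStream(InputStream in, long maxBytes) {
        super(in);
        this.maxBytes = maxBytes;
    }

    @Override
    public int read() throws IOException {
        int b = super.read();
        if (b >= 0) {
            count(1);
        }
        return b;
    }

    @Override
    public int read(byte[] buf, int off, int len) throws IOException {
        int n = super.read(buf, off, len);
        if (n > 0) {
            count(n);
        }
        return n;
    }

    private void count(long n) throws IOException {
        readSoFar += n;
        if (readSoFar > maxBytes) {
            throw new IOException("Input exceeds configured maximum of " + maxBytes + " bytes");
        }
    }
}
```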
Yes, ideally we'd be able to target (1) by …. I think the existence of stream wrappers is useful and can work for now, even if in-core limits were eventually added. But deeper nesting checks should definitely be added in core at some point; just not sure if the timing works for 2.15. And field names... that's an interesting thing. Not commonly reported as an attack, but conceivably it would be a vector the same way (... and more) as max String token length.
It seems like (2) is something that can be left to users. I think it is reasonable for Jackson to concentrate on the cases where comparatively few bytes in the input can lead to bad outcomes.
@pjfanning It could be, although I implemented this with Woodstox a few years back and it really is rather easy to add. Then again, Jackson-core supports decorators, so it's... even easier for users to wrap. But I agree it's not a high priority at this point.
Filed #1046 for max document length -- I think it is worth adding, after thinking it through. Hope to work on it for 2.16.
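Assuming #1046 landed as an additional builder option on StreamReadConstraints (my understanding is that 2.16 added a maxDocumentLength() setting, but treat the method name as an assumption), usage would look roughly like this sketch:

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.StreamReadConstraints;

public class MaxDocumentLengthExample {
    public static void main(String[] args) {
        // Sketch: cap total document length at roughly 10 million input units
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxDocumentLength(10_000_000L)
                        .build())
                .build();
    }
}
```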
Out of 6 ideas, 5 are now implemented and included. Closing this issue as completed.
(note: related to/inspired by FasterXML/jackson-databind#2816)
Some aspects of an input document are prone to possible abuse, so that a malicious sender can create specifically crafted documents to try to overload a server. This includes things like:
- StreamReadConstraints.maxNestingDepth() to constrain max nesting depth (default: 1000) #943
- maximum number token length via StreamReadConstraints (fixes sonatype-2022-6438) -- default 1000 chars #827
- StreamReadConstraints limit for longest textual value to allow (default: 5M) #863

and although a streaming parser can typically handle many of these cases quite well, they can be very problematic for higher-level processing -- and even for streaming, for highly parallel processing.
So. It would be good to create a configurable set of options that are applied on a per-factory (JsonFactory) basis, although it'd be really nice if per-parser overrides were possible.

Further reading: related material.
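To show how the limits described above surface to callers, here is a small self-contained sketch. It assumes the 2.15 builder API and the StreamConstraintsException type; the tiny nesting limit of 10 is deliberately unrealistic so the example trips it.

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.StreamReadConstraints;
import com.fasterxml.jackson.core.exc.StreamConstraintsException;

public class NestingLimitDemo {
    public static void main(String[] args) throws Exception {
        // Configure a deliberately small nesting limit for demonstration
        JsonFactory factory = JsonFactory.builder()
                .streamReadConstraints(StreamReadConstraints.builder()
                        .maxNestingDepth(10)
                        .build())
                .build();

        // Build a document nested deeper than the limit: [[[[ ... ]]]]
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 20; i++) sb.append('[');
        for (int i = 0; i < 20; i++) sb.append(']');

        try (JsonParser p = factory.createParser(sb.toString())) {
            // Streaming through the tokens triggers the depth check
            while (p.nextToken() != null) {
            }
        } catch (StreamConstraintsException e) {
            System.out.println("Rejected: " + e.getMessage());
        }
    }
}
```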