Fix handling of very large numbers in json. (#864)
The fix was very easy: enable the `arbitrary_precision` feature of `serde_json`. The issue was reported by a customer and manifested in a very annoying and obscure manner.

Essentially what's going on: massive numbers in schemas (e.g. `-1.7976931348623157e+308`) were handled correctly when passed in exponent form, but not when passed as the full number with all of the ~300 zeros written out. This usually isn't a problem, as these numbers are wacky, but there was a bug that was triggered by event type schemas: we accept the `e+308` form, but rejected the fully written-out number. The problem is that we save schemas to a Postgres `jsonb` field, which doesn't preserve the original JSON representation. The field was saved without a problem (because of the short representation), but once read back from Postgres it came out in the long form, which our code failed to parse, so fetching the schema from the database failed.

It seems we aren't the first ones to hit this issue, and `serde_json` already supports very large numbers. From the docs for this feature:

> Use an arbitrary precision number representation for serde_json::Number. This
> allows JSON numbers of arbitrary size/precision to be read into a Number and
> written back to a JSON string without loss of precision.
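For reference, here is a minimal sketch (not part of this change) of the round-trip behavior the feature provides; it assumes a `serde_json` dependency with `arbitrary_precision` enabled:

```rust
// Assumes: serde_json = { version = "1", features = ["arbitrary_precision"] }
use serde_json::Value;

fn main() -> serde_json::Result<()> {
    // f64::MAX written out in full: the ~300-digit long form of 1.7976931348623157e+308.
    let long_form = format!("{{\"maximum\":{:.0}}}", f64::MAX);

    // With `arbitrary_precision`, the digits are kept verbatim inside
    // serde_json::Number instead of being coerced into an f64.
    let value: Value = serde_json::from_str(&long_form)?;

    // Serializing back reproduces the original digits unchanged; without the
    // feature the number would come back in a shortened float representation.
    let round_tripped = serde_json::to_string(&value)?;
    assert_eq!(round_tripped, long_form);

    println!("round-tripped {} bytes without loss", round_tripped.len());
    Ok(())
}
```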