Serialization of decimals does not respect precision #1726
Comments
That is expected behavior. Json.NET always serializes floats and decimals with a decimal point. You could write a JsonConverter for decimal.
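For reference, a minimal sketch of such a converter, assuming a write-only JsonConverter (the class name and formatting choices are illustrative, not from the thread):

```csharp
using System;
using System.Globalization;
using Newtonsoft.Json;

// Illustrative write-only converter: emits a decimal using its own
// ToString(), so the stored scale (e.g. "15" vs "15.0") is preserved.
public class PreciseDecimalConverter : JsonConverter
{
    public override bool CanConvert(Type objectType) =>
        objectType == typeof(decimal) || objectType == typeof(decimal?);

    public override bool CanRead => false;

    public override object ReadJson(JsonReader reader, Type objectType,
        object existingValue, JsonSerializer serializer) =>
        throw new NotSupportedException();

    public override void WriteJson(JsonWriter writer, object value,
        JsonSerializer serializer) =>
        // WriteRawValue keeps the exact textual form of the number.
        writer.WriteRawValue(((decimal)value).ToString(CultureInfo.InvariantCulture));
}
```

Note the caveat raised later in this thread: even with CanRead = false, registering a converter for decimal changes how the reader parses the value.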
You see, if a value becomes something else after we serialize and de-serialize it, that is generally not good. Don't you agree? And I had a real problem with this, so it is not just my perfectionism. Of course, the change is not desirable if it is potentially breaking; I do not know how much client code may rely on having .0 in serialized decimals. JSON follows JavaScript data types, and in JavaScript it is all the same number type. Can you please just think one more time about it? Thank you for the hint about a custom converter, I will try that.
When you explicitly deserialize to C# type definitions, you're right that there would be no difference, but many people use Json.NET's LINQ to JSON functionality (see here), where this would indeed break client code, because whether a value deserializes to an integer or a floating-point number is determined by the presence of a decimal point in the JSON.
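Roughly how that plays out with LINQ to JSON (a sketch; the token type is inferred from the textual form of the number):

```csharp
using Newtonsoft.Json.Linq;

var a = JToken.Parse("15");    // a.Type == JTokenType.Integer
var b = JToken.Parse("15.0");  // b.Type == JTokenType.Float
```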
If the type is determined dynamically during deserialization, this is a different story. You already have some uncertainty there: you cannot distinguish between float, double, or decimal, right? In this situation, the type of the deserialized value must be just large enough to accommodate the value, and since the value is actually an integer, it can be deserialized to an integer without losing information. Again, this is different from the case I described: in your case, exact deserialization is not possible, so nobody will expect it. In my case, exact reconstruction of the initial data type is doable, and the user will intuitively expect it, as I did.
I agree that this is unexpected behaviour at the very least, and imho it is also a bug. I wholly understand that this could be a major breaking change, but can you please reconsider it?
If you try to serialize the …
This code works just fine for me:

```csharp
var json = JsonConvert.SerializeObject(decimal.MaxValue);
var v = JsonConvert.DeserializeObject<decimal>(json);
```
@JamesNK please correct me if I'm wrong, but it does not seem possible to do this with a JsonConverter without inadvertently changing deserialization too. When deserializing a decimal, the reader checks whether a converter exists and, instead of reading the value with ReadAsDecimal(), it uses plain Read(), even if the JsonConverter has CanRead = false. That results in reading the decimal as a double and losing precision. I could then use FloatParseHandling = Decimal, but that changes the behavior globally and causes problems elsewhere. Would it be possible to change this line: Newtonsoft.Json/Src/Newtonsoft.Json/Serialization/JsonSerializerInternalReader.cs, line 155 at commit 9be95e0?
EDIT: it does seem possible => it does not seem possible 🤦‍♂️
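For context, a sketch of the global setting mentioned above, with the trade-off it brings (the sample value is illustrative):

```csharp
using Newtonsoft.Json;

var settings = new JsonSerializerSettings
{
    // Read JSON floating-point numbers as decimal instead of double.
    // This avoids the lossy double parse, but it applies to everything
    // this serializer reads, which is the "global" problem noted above.
    FloatParseHandling = FloatParseHandling.Decimal
};

object v = JsonConvert.DeserializeObject<object>("0.1000000000000000055", settings);
// v is a decimal holding 0.1000000000000000055; with the default
// setting it would be a double and round to 0.1.
```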
This behaviour puzzled me too. I hit a bug introduced by a new use of round-tripping via JSON, which messed up the recipient code when it tried to examine the digits. I solved it by switching to System.Text.Json.
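A minimal sketch of that approach, assuming the stock System.Text.Json serializer:

```csharp
using System.Text.Json;

// System.Text.Json formats a decimal with its stored scale,
// so 15m and 15.0m keep their distinct textual forms.
string a = JsonSerializer.Serialize(15m);    // "15"
string b = JsonSerializer.Serialize(15.0m);  // "15.0"
decimal back = JsonSerializer.Deserialize<decimal>(b);
// back.ToString() == "15.0": the scale survives the round trip.
```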
I was struggling with this today too. We're generating a SHA512 checksum of the data we're passing over the wire. Since the value is passed as 15.0 over the wire, it is de-serialized as such on the other side, and when the verification checksum is generated it uses 15.0 instead of the 15 we had on the caller side. Unfortunately the serialization/deserialization logic is .NET Framework and ServiceStack code that is out of my hands to modify, so this issue messes things up a bit, since 15.0 is handled differently in different development environments. The best behaviour would be to respect the value of the decimal: if it was 15.0, send 15.0; if it was 15, send 15. I will, however, try @danielearwicker's solution and switch to System.Text.Json.
I burned two hours on a similar issue today. It is very important that the precision not be changed, yet JsonConvert's deserialization may change it. I am trying to deserialize an object that comes from a database as a string containing array notation and comma-separated float values: "[0.00013546789876565,.... and so on]". To use these values I had to deserialize them, and I chose JsonConvert. It is unreliable as to whether it converts the string to the actual values or to some rounded version. I don't know the details of how Newtonsoft is implemented; I do know that casts in C# can mean different things in different places, so I am not using casts anymore. When I inspect the object that JsonConvert produces, it may hold values with slightly different precision, which is completely unacceptable. To resolve this, I chose a hacky approach: replace the unwanted array notation ("[", "]") and then split the string into an array myself. It would be preferable to deserialize the string and convert the resulting object into the form I needed. The larger point stands: precision should be controlled by the programmer, not the library.
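A sketch of that workaround, assuming invariant-culture numbers in the stored string (names are illustrative):

```csharp
using System.Globalization;
using System.Linq;

string json = "[0.00013546789876565,0.00013546789876566]";

// Strip the array brackets and parse each element directly as decimal,
// bypassing the double conversion that loses precision.
decimal[] values = json.Trim('[', ']')
    .Split(',')
    .Select(s => decimal.Parse(s, NumberStyles.Float, CultureInfo.InvariantCulture))
    .ToArray();
```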
You may try what I did on Rextester.com, as in this example: https://rextester.com/YSCN86101. Try copying the entire code into a console application in Visual Studio. The comma in the output there should be a dot; I think it was converted by Rextester, but in Visual Studio it displays correctly.
Problem

A C# decimal value serialized to JSON and de-serialized back to decimal gives a number with different precision.

Explanation
Decimals in .NET are tricky: besides the number itself, they store the number of digits necessary to represent it. For example, the numbers 15 and 15.0 stored in a decimal variable are represented differently in memory, though they are considered equal in comparisons. When we serialize and de-serialize such numbers, it is important to keep this information.
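A short illustration of that point (the scale is visible both in ToString() and in the flags word returned by decimal.GetBits):

```csharp
using System;

decimal a = 15m, b = 15.0m;
Console.WriteLine(a == b);   // True: equal in comparison
Console.WriteLine(a);        // 15
Console.WriteLine(b);        // 15.0
// Bits 16-23 of the fourth element hold the scale (digits after the point).
Console.WriteLine((decimal.GetBits(a)[3] >> 16) & 0xFF);  // 0
Console.WriteLine((decimal.GetBits(b)[3] >> 16) & 0xFF);  // 1
```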
Steps to reproduce
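A minimal reproduction, sketched from the behaviour described above:

```csharp
using System;
using Newtonsoft.Json;

decimal original = 15m;
string json = JsonConvert.SerializeObject(original);
Console.WriteLine(json);          // 15.0 (a decimal point is always added)
decimal roundTripped = JsonConvert.DeserializeObject<decimal>(json);
Console.WriteLine(original);      // 15
Console.WriteLine(roundTripped);  // 15.0: the precision changed
```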
Possible solution
The issue can be solved by keeping the necessary number of decimal digits in the JSON representation of the number, e.g. serializing decimal 15 as "15" and decimal 15.0 as "15.0". This is exactly how Decimal.ToString() works. The number of digits can then be respected when de-serializing back to decimal.