Serialization of decimals does not respect precision #1726

Closed
Alex-Hagen-Thorn-at-Experieco opened this issue May 28, 2018 · 12 comments

@Alex-Hagen-Thorn-at-Experieco

Alex-Hagen-Thorn-at-Experieco commented May 28, 2018

Problem

A C# decimal value serialized to JSON and deserialized back to decimal yields a number with different precision.

Explanation

Decimals in .NET are tricky: besides the number itself, they store the number of digits necessary to represent it. For example, the numbers 15 and 15.0 stored in a decimal variable are represented differently in memory, though they compare as equal. When we serialize and deserialize numbers, it is important to keep this information.

Steps to reproduce

using Newtonsoft.Json;
using System;

namespace JsonDecimalIssue
{
    class Program
    {
        static void Main(string[] args)
        {
            decimal before = 15;
            string serialized = JsonConvert.SerializeObject(before); // produces "15.0" <- incorrect
            decimal after = JsonConvert.DeserializeObject<decimal>(serialized);

            Console.WriteLine(before); // Writes "15"
            Console.WriteLine(after);  // Writes "15.0"
            Console.ReadKey();
        }
    }
}

Possible solution

The issue can be solved by keeping the necessary number of decimal digits in the JSON representation of the number, e.g. serializing decimal 15 as the integer "15" and decimal 15.0 as "15.0". This is exactly how Decimal.ToString() works. The number of digits can then be respected when deserializing back to decimal.
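
For illustration, a minimal sketch of the Decimal.ToString() behavior referred to above (the variable names are just for this example):

using System;

decimal a = 15m;            // scale 0
decimal b = 15.0m;          // scale 1

Console.WriteLine(a);       // 15
Console.WriteLine(b);       // 15.0
Console.WriteLine(a == b);  // True: the values compare equal despite different scales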

@JamesNK
Owner

JamesNK commented May 29, 2018

That is expected behavior. Json.NET always serializes floats and decimals with a decimal point.

You could write a JsonConverter for decimal and write it without the trailing .0
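
A minimal sketch of what such a converter might look like, assuming the goal is to emit decimals exactly as Decimal.ToString() renders them. The class name is hypothetical, and see the later comments in this thread for caveats about the read side.

using System;
using System.Globalization;
using Newtonsoft.Json;

// Hypothetical converter: writes decimals using their own ToString() representation,
// so 15m is emitted as 15 and 15.0m as 15.0.
public class RawDecimalConverter : JsonConverter<decimal>
{
    public override void WriteJson(JsonWriter writer, decimal value, JsonSerializer serializer)
    {
        // WriteRawValue emits the text as-is, without appending ".0".
        writer.WriteRawValue(value.ToString(CultureInfo.InvariantCulture));
    }

    public override decimal ReadJson(JsonReader reader, Type objectType, decimal existingValue,
        bool hasExistingValue, JsonSerializer serializer)
    {
        // Depending on reader settings, the token may already have been parsed as
        // long, double, or decimal; see the later comments about precision on read.
        return Convert.ToDecimal(reader.Value, CultureInfo.InvariantCulture);
    }
}

// Usage:
// string json = JsonConvert.SerializeObject(15m, new RawDecimalConverter()); // "15"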

JamesNK closed this as completed May 29, 2018
@Alex-Hagen-Thorn-at-Experieco
Author

Alex-Hagen-Thorn-at-Experieco commented May 29, 2018

  1. I would say this behavior is not expected unless you are already aware of it. It is special handling of the decimal data type, which an average user won't expect.

  2. When the expected behavior causes a problem, it is still a problem. The question is: can this expected behavior be changed?

You see, if a value becomes something else after we serialize and deserialize it, that is generally not good. Don't you agree? And I had a real problem with this, so it is not just my perfectionism.

Of course, the change is not desirable if it is potentially breaking. I do not know how much client code may rely on having .0 in serialized decimals... JSON follows JavaScript data types, and in JavaScript it is all one data type: Number. When we deserialize, we are guided by C# type definitions, so we do not need an extra hint (like having .0) to distinguish integer numbers from floating-point numbers and decimals. So I do not see any negative impact from the change I proposed.

Can you please just think one more time about it?

Thank you for the hint about the custom converter, I will try that.

@TylerBrinkley
Contributor

TylerBrinkley commented May 29, 2018

When you explicitly deserialize to C# type definitions, you're right that there would be no difference. But many people use Json.NET's LINQ to JSON functionality (see here), where this would indeed cause issues in client code, because whether a value is treated as an integer or a floating-point number is determined by whether the JSON contains a decimal point.
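
A small sketch of the LINQ to JSON behavior being described, where the token type is inferred from the presence of a decimal point:

using System;
using Newtonsoft.Json.Linq;

var withoutPoint = JToken.Parse("15");
var withPoint = JToken.Parse("15.0");

Console.WriteLine(withoutPoint.Type); // Integer
Console.WriteLine(withPoint.Type);    // Float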

@Alex-Hagen-Thorn-at-Experieco
Author

If the type is determined dynamically during deserialization, this is a different story. You already have some uncertainty: you cannot distinguish between float, double, or decimal, right? In this situation, the type of the deserialized value only needs to be large enough to accommodate the value. And as the value is actually an integer, it can be deserialized to an integer without losing information.

Again, this is different from the case I described: in your case, exact deserialization is not possible, so nobody will expect it. In my case, exact reconstruction of the initial data type is doable, and the user will intuitively expect it, as I did.

@stijnherreman

I agree that this is unexpected behaviour at the very least, and imho it is also a bug. For 15, the precision is 2 and the scale is 0. For 15.0, the precision is 3 and the scale is 1. They're two different things.
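
A minimal sketch illustrating that difference with decimal.GetBits (the scale is stored in bits 16-23 of the fourth element):

using System;

int scaleOf15 = (decimal.GetBits(15m)[3] >> 16) & 0xFF;
int scaleOf15Point0 = (decimal.GetBits(15.0m)[3] >> 16) & 0xFF;

Console.WriteLine(scaleOf15);       // 0
Console.WriteLine(scaleOf15Point0); // 1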

I wholly understand that this could be a major breaking change, but can you please reconsider it?

@AntiPasha

If you try to serialize Decimal.MaxValue, which is 79228162514264337593543950335, and deserialize it back, you'll get an OverflowException, because the trailing .0 doesn't allow 79228162514264337593543950335.0 to be parsed.

@TylerBrinkley
Contributor

@AntiPasha

This code works just fine for me.

var json = JsonConvert.SerializeObject(decimal.MaxValue);
var v = JsonConvert.DeserializeObject<decimal>(json);

@asgerhallas

asgerhallas commented Apr 30, 2020

@JamesNK please correct me if I'm wrong, but it does not seem possible to do this with a JsonConverter without inadvertently changing deserialization too.

When deserializing a decimal, the reader checks whether a converter exists and, instead of reading the value with ReadAsDecimal(), uses plain Read(), even if the JsonConverter has CanRead = false.

That results in reading the decimal as a double and losing precision.

I could then use FloatParseHandling = Decimal, but that changes the behavior globally and results in problems elsewhere.
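
For reference, a minimal sketch of that FloatParseHandling workaround, applied per call here; the same concern applies, since every float read with these settings is parsed as decimal:

using System;
using Newtonsoft.Json;

var settings = new JsonSerializerSettings
{
    FloatParseHandling = FloatParseHandling.Decimal
};

// Without the setting, floats with no target type are read as double;
// with it, they are read as decimal and keep all their digits.
object asDouble = JsonConvert.DeserializeObject<object>("1.0000000000000000000000000001");
object asDecimal = JsonConvert.DeserializeObject<object>("1.0000000000000000000000000001", settings);

Console.WriteLine(asDouble);  // 1
Console.WriteLine(asDecimal); // 1.0000000000000000000000000001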

Would it be possible to change this line

if (reader.TokenType == JsonToken.None && !reader.ReadForType(contract, converter != null))
to check if the converter can actually be used for reading?

EDIT: it does seem possible => it does not seem possible 🤦‍♂️

@danielearwicker

This behaviour puzzled me too. I hit a bug introduced by a new use of round-tripping via JSON, which messed up the recipient code when it tried to examine the digits.

I solved it by switching to System.Text.Json, which doesn't output an unnecessary .0 suffix. Note that it still doesn't fully round-trip the precision, it just renders the minimum digits needed to preserve the value. So 15.0 comes back as 15.
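
A minimal comparison sketch of the two serializers' output for the same decimal value:

using System;

decimal value = 15m;

// Newtonsoft.Json always emits a decimal point for decimals:
Console.WriteLine(Newtonsoft.Json.JsonConvert.SerializeObject(value)); // 15.0

// System.Text.Json does not append a trailing ".0":
Console.WriteLine(System.Text.Json.JsonSerializer.Serialize(value));   // 15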

@dozer75

dozer75 commented Dec 15, 2020

I was struggling with this today too. We're generating a SHA512 checksum of the data we're passing over the wire...

Since it is passed as 15.0 over the wire, it is deserialized as such on the other side, and when the verification checksum is generated it uses 15.0 instead of the 15 we had on the caller side.

Unfortunately, the serialization/deserialization logic is .NET Framework and ServiceStack code that is out of my hands to modify, so this issue messes things up a bit, since 15.0 is handled differently in different development environments.

The best approach would be to respect the value of the decimal: if it was 15.0, send 15.0; if it was 15, send 15.

I will, however, try @danielearwicker's solution and switch to System.Text.Json.

@wendyjboss

wendyjboss commented Dec 30, 2020

I burned two hours on a similar issue today. It is very important that the precision not be changed, yet JsonConvert deserialization may change it.

I am trying to deserialize a value that comes from a database as a string containing array notation and comma-separated float values: "[0.00013546789876565,.... and so on]". To use these values I had to deserialize them, and I chose JsonConvert for that. It is unreliable as to whether it produces the actual values or slightly rounded versions; when I inspect the object that JsonConvert produces, it may hold values with slightly different precision. I don't know the details of how Newtonsoft is written. I also know that casts in C# can mean different things in different places, so I am not relying on casts any more. Either way, this behaviour is unacceptable for my case.

To work around it, I resorted to a hacky approach: stripping the array notation ("[", "]") and splitting the string into an array myself. It would be preferable to deserialize the string and convert the resulting object into the form I need. The broader point stands: precision should be controlled by the programmer, not the library.
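
A minimal sketch of one approach that should avoid the rounding described above: target decimal[] directly, so Json.NET reads the numbers as decimal rather than going through double. The JSON string below is a shortened, hypothetical stand-in for the database value.

using System;
using Newtonsoft.Json;

// Hypothetical stand-in for the database string described above.
string json = "[0.00013546789876565, 0.00000000012345678901234567]";

// Targeting decimal[] makes Json.NET read each number as a decimal,
// so no intermediate double rounding occurs.
decimal[] values = JsonConvert.DeserializeObject<decimal[]>(json);

Console.WriteLine(values[0]); // 0.00013546789876565
Console.WriteLine(values[1]); // 0.00000000012345678901234567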

@csharpibocbina

csharpibocbina commented Apr 22, 2021

You may want to check what I did on Rextester.com: https://rextester.com/YSCN86101

Try copying the entire code into a console application in Visual Studio. The comma in the Rextester output should be a dot; I think Rextester converted it, but in Visual Studio it comes out with a dot.
