Internal json converters cannot function independently of their original options #50205
Tagging subscribers to this area: @eiriktsarpalis, @layomia

Issue Details

Internal json collection converters for instance assume JsonClassInfo.ElementInfo is not null, however it can actually be null if the converter has been pre-acquired from another options instance.

runtime/src/libraries/System.Text.Json/src/System/Text/Json/Serialization/Converters/Collection/IEnumerableDefaultConverter.cs
Line 42 in 79ae74f

ElementInfo returns null if ElementType is null, and ElementType will only be filled under specific circumstances.

When we start at the point of our custom converter's Read() method (see below) we get the following steps to an NRE:

1. options.GetOrAddClassForRootType(type) creates a new JsonClassInfo(type).
2. It resolves to ClassType.None, because all custom converters do.
3. ElementType = null, because it didn't fall into the ClassType.Collection arm.
4. JsonClassInfo.ElementInfo then returns null, and the internal converter dereferences it.
using System;
using System.Collections.Immutable;
using System.Text.Json;
using System.Text.Json.Serialization;

namespace StjRepro
{
    public enum Cases
    {
        One,
        Two,
        Three,
        Four
    }

    class JsonImmutableArrayConverter<T> : JsonConverter<ImmutableArray<T>>
    {
        JsonConverter<ImmutableArray<T>> _originalConverter;

        public JsonImmutableArrayConverter()
            => _originalConverter = (JsonConverter<ImmutableArray<T>>)new JsonSerializerOptions().GetConverter(typeof(ImmutableArray<T>));

        public override ImmutableArray<T> Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
            // We're passing the current options and not the default options because we do want our value converters to work
            // in this example, from strings to the enum 'Cases'.
            => _originalConverter.Read(ref reader, typeToConvert, options);

        public override void Write(Utf8JsonWriter writer, ImmutableArray<T> value, JsonSerializerOptions options)
        {
            throw new NotSupportedException();
        }
    }

    class JsonImmutableArrayConverter : JsonConverterFactory
    {
        public override bool CanConvert(Type typeToConvert)
            => typeToConvert.IsGenericType && typeToConvert.GetGenericTypeDefinition() == typeof(ImmutableArray<>);

        public override JsonConverter CreateConverter(Type typeToConvert, JsonSerializerOptions options)
            => (JsonConverter)Activator.CreateInstance(typeof(JsonImmutableArrayConverter<>).MakeGenericType(typeToConvert.GenericTypeArguments[0]));
    }

    class Program
    {
        static void Main(string[] args)
        {
            var options = new JsonSerializerOptions();
            options.Converters.Add(new JsonImmutableArrayConverter());
            options.Converters.Add(new JsonStringEnumConverter(JsonNamingPolicy.CamelCase));
            JsonSerializer.Deserialize<ImmutableArray<Cases>>(@"[""one"",""two"",""three"",""four""]", options);
        }
    }
}
The reason for pulling out an 'original' converter like this is because some code paths in a custom converter should just be able to default to existing converters, merely wrapping over them.

The only way I see to hack around this without a proper fix is passing a specially crafted options instance into the framework converter that has this custom converter removed; it will resolve the correct JsonClassInfo and include the required custom value converters, but in terms of perf (and usability) it seems far from ideal:

public override ImmutableArray<T> Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
    var optionsDiff = new JsonSerializerOptions(options);
    JsonConverter factory = null;
    foreach (var converter in optionsDiff.Converters)
    {
        if (converter.GetType() == typeof(JsonImmutableArrayConverter))
            factory = converter;
    }
    if (factory != null)
        optionsDiff.Converters.Remove(factory);

    // We're passing the current options and not the default options because we do want our value converters to work
    // in this case, from strings to the enum 'Cases'.
    return _originalConverter.Read(ref reader, typeToConvert, optionsDiff);
}

If there is some other method by which to achieve what I want, I'd be glad to use it. In any case I think it's good to add some documentation on how to call into the framework converters from a custom converter; this has been something I've wanted to do (and done to various degrees of success) multiple times now.

/cc @layomia
@NinoFloris thanks! I haven't fully grokked this yet but @eiriktsarpalis and I discussed potential issues related to this just yesterday. I'll take a look.
@NinoFloris I bumped into the very same issue yesterday and agree with your analysis. I couldn't really work around it without making changes to System.Text.Json internals, so we need to fix the issue in the general case. I don't think relying on a "default" options instance is a good solution, since while it may (in some cases) keep the internal converter functioning, it ignores the converters configured on the options instance actually in use. Fundamentally this stems from weakness in the current converter model.
That's such an unlikely coincidence, amusing! ^_^
Evidently it's a hack to get what I want, though I'm not sure what your suggested alternative is? The converter override and caching process won't suddenly give me the behavior I want — getting the converter that would have been resolved if not for the custom one overriding it and getting cached — without api additions. To 'quickly' fix the caching issue, converters could be cached under a composite key of the requested type and the options instance they were resolved from.
Obviously when going down the composite key route the slower lookup will have some unknown effect on overall perf — I'm expecting tiny? — which still needs careful analysis. If we expect it to be unfruitful it might make sense to keep most as-is and to push resolution of these differences into the internal converters themselves. Anything else seems like it would require a larger overhaul of the internals, do you have any suggestions? Either way thanks for the quick reply!
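For illustration, a composite-key cache of the kind discussed above could be shaped like this. This is a hypothetical sketch, not STJ internals; CompositeKeyConverterCache and its members are made-up names:

using System;
using System.Collections.Concurrent;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical sketch: cache converters under (type, options) so a converter
// resolved against one options instance is never handed out for another.
static class CompositeKeyConverterCache
{
    static readonly ConcurrentDictionary<(Type Type, JsonSerializerOptions Options), JsonConverter> s_cache =
        new ConcurrentDictionary<(Type Type, JsonSerializerOptions Options), JsonConverter>();

    public static JsonConverter Get(Type type, JsonSerializerOptions options) =>
        // GetConverter honors options.Converters, so every options instance
        // resolves (and keeps) exactly the converter it would pick on its own.
        s_cache.GetOrAdd((type, options), key => key.Options.GetConverter(key.Type));
}

One caveat with a static cache like this is that it roots every options instance used as a key; a real implementation would presumably hang the cache off the options instance itself, or use a ConditionalWeakTable.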
A workaround here based on your original approach is to create and cache a separate options instance, populated with the JsonStringEnumConverter:

class JsonImmutableArrayConverter<T> : JsonConverter<ImmutableArray<T>>
{
    JsonSerializerOptions _originalOptions;
    JsonConverter<ImmutableArray<T>> _originalConverter;

    public JsonImmutableArrayConverter()
    {
        _originalOptions = new JsonSerializerOptions()
        {
            Converters = { new JsonStringEnumConverter(JsonNamingPolicy.CamelCase) }
        };
        _originalConverter = (JsonConverter<ImmutableArray<T>>)_originalOptions.GetConverter(typeof(ImmutableArray<T>));
    }

    public override ImmutableArray<T> Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
        => _originalConverter.Read(ref reader, typeToConvert, _originalOptions);

    public override void Write(Utf8JsonWriter writer, ImmutableArray<T> value, JsonSerializerOptions options)
    {
        _originalConverter.Write(writer, value, _originalOptions);
    }
}

Then, the enum converter doesn't need to be specified at the root call to the serializer:

static void Main(string[] args)
{
    var options = new JsonSerializerOptions();
    options.Converters.Add(new JsonImmutableArrayConverter());
    //options.Converters.Add(new JsonStringEnumConverter(JsonNamingPolicy.CamelCase));
    ImmutableArray<Cases> deserialized = JsonSerializer.Deserialize<ImmutableArray<Cases>>(@"[""one"",""two"",""three"",""four""]", options);
    string serialized = JsonSerializer.Serialize(deserialized, options);
    Console.WriteLine(serialized);
}

Still investigating the issue here, but generally, multiple options instances shouldn't be passed arbitrarily as inputs to converters or the serializer. The mitigation here might be for the serializer to detect and guard against unsupported options patterns, and throw meaningful exceptions.
Yes, we'll provide some documentation for this. We can add it to this page - https://docs.microsoft.com/dotnet/standard/serialization/system-text-json-converters-how-to?pivots=dotnet-5-0#error-handling. cc @tdykstra
In principle, I do agree that custom converters should be able to compose nicely with framework/STJ internal converters. I'm just curious about this specific scenario. What motivated the use of a custom converter here? Why not just:

static void Main(string[] args)
{
    var options = new JsonSerializerOptions();
    options.Converters.Add(new JsonStringEnumConverter(JsonNamingPolicy.CamelCase));
    ImmutableArray<Cases> deserialized = JsonSerializer.Deserialize<ImmutableArray<Cases>>(@"[""one"",""two"",""three"",""four""]", options);
    Console.WriteLine(JsonSerializer.Serialize(deserialized, options));
}
@layomia obviously this is not a tenable solution; the ImmutableArrayConverter should not have to know about all Ts (and the right type of converter) for the entire application. It's also common enough to create converters that get shipped as a package, like https://github.com/Tarmil/FSharp.SystemTextJson.
Proper compositionality is important; doing the easy thing here seems like it undermines the custom converter model of STJ a lot (and it's already — no doubt with good intentions — much more limited than the internal converters).
ImmutableArray is slow to serialize due to it falling into the IEnumerable path, yet as we like immutability we have many instances where we use it, so we have a converter that optimizes writes (via GetMemory and MemoryMarshal.TryGetArray, which is entirely safe) but leaves reads as-is. I've also had cases around discriminated unions, custom collections or other functorial types where I just want to fall back to the framework converters when some pattern match or condition is hit; most of the time this is in the read path, as going from a reader to arbitrary .net objects is a pain I try to avoid (at least without the public metadata apis that were talked about). Anyway, I have at least somewhat optimized the 'hacky fix' I showed in my original comment by caching the diffed options on an instance field. We can then do a reference comparison against the incoming options and only rebuild the diffed copy when a new options instance shows up.
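A rough sketch of that caching approach, reusing the names from the repro at the top (this is not the author's exact code, and it is deliberately not thread-safe to keep it short):

// Drop-in variant of the generic converter from the original repro; assumes
// the same usings and the non-generic JsonImmutableArrayConverter factory.
class JsonImmutableArrayConverter<T> : JsonConverter<ImmutableArray<T>>
{
    // Cache the last-seen options instance together with its 'diffed' copy
    // (the copy with our factory removed) and the converter resolved from it.
    JsonSerializerOptions _lastSeenOptions;
    JsonSerializerOptions _diffedOptions;
    JsonConverter<ImmutableArray<T>> _innerConverter;

    public override ImmutableArray<T> Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        if (!ReferenceEquals(options, _lastSeenOptions))
        {
            // Only rebuild the diffed copy when a new options instance shows up.
            var diffed = new JsonSerializerOptions(options);
            for (int i = diffed.Converters.Count - 1; i >= 0; i--)
            {
                // Remove the factory that created us, so resolution falls
                // through to the framework converter instead of recursing.
                if (diffed.Converters[i] is JsonImmutableArrayConverter)
                    diffed.Converters.RemoveAt(i);
            }
            _innerConverter = (JsonConverter<ImmutableArray<T>>)diffed.GetConverter(typeof(ImmutableArray<T>));
            _diffedOptions = diffed;
            _lastSeenOptions = options;
        }
        return _innerConverter.Read(ref reader, typeToConvert, _diffedOptions);
    }

    public override void Write(Utf8JsonWriter writer, ImmutableArray<T> value, JsonSerializerOptions options)
        => throw new NotSupportedException(); // the write path is a separate concern here
}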
The value converter model based on directly deriving from JsonConverter<T> has its limitations here. A couple of workarounds have been mentioned, so I hope for now your scenario is unblocked. In the meantime, we're evaluating adding a safe-guard in the code that bridges custom converter calls into the serializer:

Lines 23 to 24 in e503be0

It may look like this:

ReadStack state = default;
state.Initialize(typeToConvert, options, supportContinuation: false);

if (state.Current.JsonPropertyInfo.ConverterBase != this)
{
    throw new InvalidOperationException();
}

TryRead(ref reader, typeToConvert, options, ref state, out T? value);

The original repro would fail with this exception. This exception may help uncover more interesting patterns/dependencies that our callers have. cc @steveharter

We'll look into improving the performance of (de)serializing immutable collections. They are currently based on the generic IEnumerable conversion paths, which is where the overhead you describe comes from.
Thanks!
I'll be sure to subscribe to #36785 :)
Nice to see some love for deserializing into immutable collections! For us writing them fast is important as we have http read-heavy workloads. Hopefully you could look into using MemoryMarshal as well for ImmutableArray. This is 100% binary compatible as long as you error out or fall back to IEnumerable when TryGetArray returns false, in the highly unlikely event ImmutableArray suddenly starts using native memory ;)
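As a sketch of that write-path trick (the converter name FastImmutableArrayWriteConverter is hypothetical, and the guard assumes only that ImmutableArray<T> stays array-backed, which is exactly the fallback condition described above):

using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Runtime.InteropServices;
using System.Text.Json;
using System.Text.Json.Serialization;

class FastImmutableArrayWriteConverter<T> : JsonConverter<ImmutableArray<T>>
{
    public override ImmutableArray<T> Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
        => throw new NotSupportedException(); // reads are left to the built-in converter in this sketch

    public override void Write(Utf8JsonWriter writer, ImmutableArray<T> value, JsonSerializerOptions options)
    {
        if (value.IsDefault)
        {
            writer.WriteNullValue();
            return;
        }

        // AsMemory() wraps the backing array without copying; TryGetArray
        // recovers that array so the serializer can take its T[] fast path.
        if (MemoryMarshal.TryGetArray(value.AsMemory(), out ArraySegment<T> segment)
            && segment.Array is T[] array && segment.Offset == 0 && segment.Count == array.Length)
        {
            JsonSerializer.Serialize(writer, array, options);
        }
        else
        {
            // Fall back to the IEnumerable path, per the comment above.
            JsonSerializer.Serialize<IEnumerable<T>>(writer, value, options);
        }
    }
}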
We don't have work planned to address this for .NET 6.0; we should consider it in 7.
One possible fix is to have the internal converters encapsulate their corresponding JsonClassInfo metadata, rather than resolving it from whatever options instance is passed in. Tagging @krwq who might be interested in this topic.
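To picture that proposal: JsonClassInfo is internal, so this purely illustrative sketch captures the resolving options instance as the closest public analog; OptionsBoundConverter and its members are invented names:

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Illustrative shape only: a converter that remembers the options instance it
// was resolved from, so metadata lookups stay consistent even when callers
// pass a different options instance into Read.
abstract class OptionsBoundConverter<T> : JsonConverter<T>
{
    protected JsonSerializerOptions BoundOptions { get; }

    protected OptionsBoundConverter(JsonSerializerOptions boundOptions)
        => BoundOptions = boundOptions;

    public sealed override T Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
        // Ignore the ambient options for metadata purposes; use the options
        // this converter was created from, which are guaranteed to carry the
        // element metadata the converter needs.
        => ReadCore(ref reader, typeToConvert, BoundOptions);

    protected abstract T ReadCore(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options);
}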
We won't have time to work on this for .NET 7, moving to Future.
As part of a recent AOT-compatibility update, Json.More.Net now includes several additions relevant to this scenario.