Performance regression when deserializing compressed binary XML streams using data contract serializers #75437
Comments
I couldn't figure out the best area label to add to this issue. If you have write-permissions please help me learn by adding exactly one area label.
Can you try wrapping the stream in a BufferedStream?
Yes, this is the same as #39233. Here Deserialize is your benchmark and Deserialize2 is your benchmark but with a BufferedStream added in between:
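Based on the discussion, the buffered variant presumably looks something like the following sketch. The method name `DeserializeBuffered` and the surrounding scaffolding are illustrative, not taken from the thread; the key point is the `BufferedStream` wrapped around the `DeflateStream` so the binary XML reader's many small reads are served from an in-memory buffer rather than going to the decompressor one call at a time:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Runtime.Serialization;
using System.Xml;

// Illustrative sketch of the suggested fix: insert a BufferedStream between
// the DeflateStream and the binary XML reader.
static object DeserializeBuffered(DataContractSerializer serializer, byte[] serialized)
{
    using var compressed = new MemoryStream(serialized);
    using var decompressor = new DeflateStream(compressed, CompressionMode.Decompress);
    using var buffered = new BufferedStream(decompressor); // batches the reader's small reads
    using var reader = XmlDictionaryReader.CreateBinaryReader(buffered, XmlDictionaryReaderQuotas.Max);
    return serializer.ReadObject(reader);
}
```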
Thank you for the help. Using the BufferedStream does indeed significantly improve the performance. I am porting a large code base from .NET Framework 4.8 to .NET 6 and came across this issue. I am not sure if there are other issues that could have negative performance implications when porting from Framework. Is there documentation I should have read that highlights potential performance regressions when porting from Framework and the solutions? Thanks again for your help.
@PriyaPurkayastha do we have a place for such info? @StephenBonikowsky I know such migration bumps are something you have an interest in also. This is similar to the discussion we had in #72266 - where do we record speed bumps that aren't breaking by a stricter definition. Cc @ericstj
@danmoseley I am aware of "What's New in .NET 6", "Known Issues" and "Breaking changes" published per release. It appears that nobody is comfortable using the existing documentation channels for such issues, so we might just need to figure out a path forward. Since such speed bumps might be spread out over different Fundamentals areas, I think it would be valuable to hear the opinions/thoughts from the respective Fundamentals area owners. E.g. this one as well as #72266 would be something that we can talk to @sblom about. I will start a discussion on this and also include the documentation team since they have valuable input as well.
I think this makes sense as a known issue; we certainly didn't want this particular usage to be so much slower, and it happened because of a change in one of our dependencies. The downside is it'll likely be a known issue in several releases, as long as we're using that dependency and it itself makes the same tradeoffs it currently does.
Tagging subscribers to this area: @dotnet/area-system-io

Issue Details

Description

When compared to .NET Framework 4.8, the output from BenchmarkDotNet shows execution time has greatly increased in .NET 6/7 when deserializing from compressed binary XML streams using data contract serializers for large collections of objects (>1000). For example, using my configuration, deserializing 10,000 objects is around 200 times slower in .NET 6/7 when compared to .NET Framework 4.8.

Data Contract Code

```csharp
[DataContract(IsReference = true)]
public class Person
{
    [DataMember]
    public Person Parent { get; set; }

    [DataMember]
    public List<Person> Children { get; set; }
}
```

Benchmark Code

```csharp
[SimpleJob(RuntimeMoniker.Net48, baseline: true)]
[SimpleJob(RuntimeMoniker.Net60)]
[SimpleJob(RuntimeMoniker.Net70)]
public class Benchmark
{
    [Params(10, 100, 1000, 10000)]
    public int N { get; set; }

    private byte[] _serialized;
    private readonly DataContractSerializer _serializer = new(typeof(Person));

    [GlobalSetup]
    public void Serialize()
    {
        var person = new Person();
        person.Children = Enumerable.Range(0, N - 1).Select(_ => new Person { Parent = person }).ToList();
        using var compressed = new MemoryStream();
        using var compressor = new DeflateStream(compressed, CompressionMode.Compress);
        using var writer = XmlDictionaryWriter.CreateBinaryWriter(compressor);
        _serializer.WriteObject(writer, person);
        writer.Close();
        _serialized = compressed.ToArray();
    }

    [Benchmark]
    public object Deserialize()
    {
        using var compressed = new MemoryStream(_serialized);
        using var decompressor = new DeflateStream(compressed, CompressionMode.Decompress);
        using var reader = XmlDictionaryReader.CreateBinaryReader(decompressor, XmlDictionaryReaderQuotas.Max);
        return _serializer.ReadObject(reader);
    }
}
```

Configuration

```
BenchmarkDotNet=v0.13.2, OS=Windows 11 (10.0.22000.856/21H2)
11th Gen Intel Core i7-11800H 2.30GHz, 1 CPU, 16 logical and 8 physical cores
.NET SDK=7.0.100-preview.7.22377.5
  [Host]             : .NET 7.0.0 (7.0.22.37506), X64 RyuJIT AVX2
  .NET 6.0           : .NET 6.0.8 (6.0.822.36306), X64 RyuJIT AVX2
  .NET 7.0           : .NET 7.0.0 (7.0.22.37506), X64 RyuJIT AVX2
  .NET Framework 4.8 : .NET Framework 4.8 (4.8.4510.0), X64 RyuJIT VectorSize=256
```

Data
Moving this to System.IO so the appropriate team can decide how they want to document this recommendation for BufferedStream when moving from 6.0 -> 7.0.
I assume this regression has been around since we bumped to v1.2.11 dotnet/corefx#32732? |
Yes.