Very Long Deserialization Time on Few Systems #18
Hi, the only thing I can suspect is BufferPool.cs related to byte[] allocation:
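The referenced snippet is elided above. As a stand-in, here is a minimal, purely hypothetical sketch of the kind of byte[] pool in question - it is not AqlaSerializer's actual BufferPool.cs. The relevant behavior: every pool miss allocates a fresh array and every overflow drops one to the GC, which is exactly the sort of allocation churn that later shows up as GC time; the bitness-dependent limit is an assumption that may correspond to the "32 bit way" mentioned in the next comments.

```csharp
using System;

// Hypothetical sketch only - not the actual AqlaSerializer BufferPool.cs.
static class SimpleBufferPool
{
    private const int Slots = 20;
    // Hypothetical bitness-dependent limit: a pool might pick different
    // buffer limits on 64 bit vs 32 bit processes.
    private static readonly int MaxPooledLength =
        IntPtr.Size == 8 ? 8 * 1024 * 1024 : 1024 * 1024;
    private static readonly byte[][] pool = new byte[Slots][];

    public static byte[] Take(int minLength)
    {
        lock (pool)
        {
            for (int i = 0; i < Slots; i++)
            {
                byte[] buf = pool[i];
                if (buf != null && buf.Length >= minLength)
                {
                    pool[i] = null;
                    return buf;
                }
            }
        }
        return new byte[minLength]; // pool miss: fresh allocation, more GC work
    }

    public static void Release(byte[] buffer)
    {
        if (buffer.Length > MaxPooledLength) return; // too big to pool: becomes garbage
        lock (pool)
        {
            for (int i = 0; i < Slots; i++)
            {
                if (pool[i] == null) { pool[i] = buffer; return; }
            }
        }
        // pool full: buffer becomes garbage for the next GC
    }
}
```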
Please use a profiler.
First thanks a lot for your extremely quick reply! :-) Good point; for testing I could just change that line to always use the "32 bit" way and see if that helps, right? That's easier as a first test than using a profiler on the customer machine. ;-)
I just sent a changed version (...). Really annoying if you cannot reproduce a problem yourself and need to wait for customer feedback... Thanks again so far!
Just got feedback from the customer: ... Do you have an idea how to profile that easily on the customer computer? Thanks a lot again for your support.
I prefer to use JetBrains dotTrace; they have a trial version, so you should be able to run it there. Or maybe even connect remotely. Good luck!
Thanks for the suggestion! I will try to get on the client computer via TeamViewer or so to collect some data with dotTrace...
Just had the chance to access the customer PC via TeamViewer and recorded some dotTrace samples here: ... The "Sampling" dump contains only the first 10 seconds or so of the database loading process; for the "Tracing" dump I really waited the full time (I think it was 7-8 min. or so) until the database loading had completed in 64 bit mode. Please note that the AqlaSerializer version used is not your very latest code, but the version after your commits from May 6, 2016: ... Thanks a lot in advance for any ideas on this strange issue!
I checked the samples; it looks like GC takes a very long time: 9.6 of 14.8 seconds. There may be some unnecessary allocations. There is a feature in dotTrace called "Compare Snapshots" which could be useful here if you profile it again in 32 bit mode (sampling, timeline, tracing). Also, to track down the memory allocation issue, I suggest you capture dotMemory samples too (both for 64 bit and 32 bit). Note that you may improve performance by precompiling serialization dlls (RuntimeTypeModel.Compile) instead of using the automatic runtime compilation.
First thanks a lot for your analysis! OK, I will then re-record the dumps on the client computer in 32 bit to be able to compare them. Additionally I will capture some memory dumps as well.
So, now I captured the three dotTrace snapshot variants (sampling, timeline, tracing) both for 64 bit and 32 bit: ... Additionally, one memory snapshot taken while running in 64 bit is included. I hope this sheds some light on this strange issue. :-) Thanks again for your support!
Please also add the memory profiling results for 32 bit, as I asked previously. I need to compare which objects are allocated more in 64 bit.
Also I have a suspicion that ...
Interesting that the ... OK, I'll try to collect some more data according to what you wrote: ... Thanks again!
You can skip only one - "Sampling" - because "Tracing" replaces it, but I need "Timeline" anyway to find memory allocations.
I think we really found the problem - it's the auto compilation! As you requested, I've tested now with ... Still, I recorded all the snapshots you requested, as it might still be interesting to find out why ... I'm currently waiting for the snapshots to be transferred: ... Still, I wanted to let you know already that it obviously has something to do with the auto compilation.
@ab-tools, for ... The fast solution for you would be to pre-generate serialization dlls with RuntimeTypeModel.Compile.
In both cases you need to actually use the model returned by Compile (see the sketch below).
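A minimal sketch of the suggested pre-generation, assuming AqlaSerializer keeps protobuf-net's Compile(name, path) overload; "MySerializer", "MyDatabaseRoot", and the file names are placeholders.

```csharp
using System.IO;
using AqlaSerializer.Meta;

// Placeholder root type standing in for the real database root.
class MyDatabaseRoot { }

static class PrecompileExample
{
    static void Main()
    {
        // Register types/surrogates as usual before compiling.
        var model = TypeModel.Create();
        model.Add(typeof(MyDatabaseRoot), true);

        // Writes MySerializer.dll to disk and returns the compiled model;
        // all further (de)serialization should go through "compiled".
        TypeModel compiled = model.Compile("MySerializer", "MySerializer.dll");

        using (var file = File.OpenRead("database.bin"))
        {
            var db = (MyDatabaseRoot)compiled.Deserialize(
                file, null, typeof(MyDatabaseRoot));
        }
    }
}
```

The key point from the comment above: Compile both writes the dll and returns a compiled TypeModel, and it is that returned model, not the original runtime model, that must do the actual work.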
Thanks for your quick reply again!
The customer computer is quite slow (only an old Core2Duo), so it took about 10 sec. or so, but still A LOT quicker than the roughly 10 min. with the auto compilation. I just uploaded the new dumps here: ... I will prepare the two tests you mentioned and ask the customer to run them.
Just trying to prepare the test versions for the customer. Maybe a stupid question, but I never used these pre-compiled assemblies yet: till now I only registered the required surrogates on the type model, nothing else. And the parameter applyDefaultBehaviour ...?
Right, the default behavior. But you don't need to add them manually. Just export once the state of your model after deserializing your data (which auto-adds types). Then, instead of adding types at runtime, modify your code to use Add(type, true) with the exported list.
Depending on your situation you may need to reorder the surrogates registration after that; see the sketch below. More: https://github.com/AqlaSolutions/AqlaSerializer/wiki/Batch-types-registration
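A hedged sketch of the batch-registration idea from the wiki link, under the assumption that Add(type, true) matches the auto-add default behavior; KnownTypes and RegisterSurrogates are placeholders for the exported type list and the existing surrogate setup.

```csharp
using System;
using AqlaSerializer.Meta;

static class BatchRegistrationExample
{
    // Placeholder list: in practice this would be exported once from a model
    // state captured after a full deserialization run.
    static readonly Type[] KnownTypes = { typeof(MyDatabaseRoot) /*, ... */ };

    static RuntimeTypeModel BuildModel()
    {
        var model = TypeModel.Create();

        // Hypothetical helper standing in for the existing surrogate setup;
        // per the comment above, it may need to run after the Add() calls.
        RegisterSurrogates(model);

        foreach (var type in KnownTypes)
            model.Add(type, true); // true = applyDefaultBehaviour, as auto-add does

        return model;
    }

    static void RegisterSurrogates(RuntimeTypeModel model) { /* existing setup */ }
}
```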
Can you also check it with ...?
OK, I see there's still a bit more to test again, so it's probably better if I do that via TeamViewer on the customer computer again myself. Just wanted to let you know, so that you don't think I forgot it when I don't reply quickly again.
Just wanted to prepare the tests for the customer already, but when trying to do something like ...
it seems that the ... How can I access this? Best regards
@ab-tools RuntimeTypeModel has ...
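The member name is elided above. Assuming AqlaSerializer keeps protobuf-net's RuntimeTypeModel.GetTypes(), which enumerates the registered MetaTypes as a non-generic IEnumerable, listing the registered types could look like this:

```csharp
using System;
using System.Linq;
using AqlaSerializer.Meta;

static class ModelInspection
{
    // Assumption: RuntimeTypeModel exposes protobuf-net's GetTypes().
    public static void DumpRegisteredTypes(RuntimeTypeModel model)
    {
        foreach (MetaType mt in model.GetTypes().Cast<MetaType>())
            Console.WriteLine(mt.Type.FullName);
    }
}
```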
Ah, I see, thanks!
Unfortunately I still have a lot of problems getting the pre-compiled serialization dlls to work: ... Then the compile function worked fine, but it could no longer read the existing database. So I thought I'd completely rebuild the database with this "new" type model, but the rebuilt database cannot be read afterwards either, resulting in this exception upon deserialization:
All types were added before as you suggested by:
This also seemed to work fine - at least there was no exception, but it just fails upon deserialization when I call ... Do you have any idea what I need to do to get the database deserialized again after compiling the type model? Tomorrow afternoon I would have another chance to test something on the customer computer. Thanks
It's hard to say anything from the code you provided.
Thanks for your quick reaction again! Your point 1 was the problem: ... OK, as I already mentioned, tomorrow afternoon I should have the next chance to run tests on the customer computer.
So, now I did all the tests on the customer PC that you requested, with the following results: ...
As I had time differences of +/- 60 sec. in the slow 64 bit tests anyway, the slightly faster database load of ... Please also find attached the two type model DLLs, as requested.
I still have access to the customer computer right now (for the next hour or so):
Thanks for testing it. I checked the dlls you sent me and they are 100% identical except for the GUID field. The same code just runs at different speeds.
@ab-tools, if the reason is the JIT compiler, this could be solved by updating the .NET Framework plus the latest patches. There is an issue with how the x64 JIT handles long methods (https://connect.microsoft.com/VisualStudio/feedback/details/508748/memory-consumption-alot-higher-on-x64-for-xslcompiledtransform-transform-then-on-x86), but since the issue is not reproducible on your machine it could already be fixed in .NET.
OK, I will try to get access to the customer PC for another test round. ;-)
In fact not: ...
What's the best (and reliable) way to check the .NET 4.0 version number?
@ab-tools, maybe this: https://docs.microsoft.com/en-us/dotnet/framework/migration-guide/how-to-determine-which-net-framework-updates-are-installed - compare the outputs on your machine and your customer's machine. Or just ensure that Windows Update works properly.
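A small sketch of the registry check described in the linked docs; the Release values are taken from that documentation, where 394802/394806 mean .NET 4.6.2 or later.

```csharp
using System;
using Microsoft.Win32;

static class NetVersionCheck
{
    static void Main()
    {
        // Per the linked Microsoft docs: the installed 4.x version is
        // encoded in the DWORD "Release" value of this key.
        const string subkey = @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full";
        using (var baseKey = RegistryKey.OpenBaseKey(RegistryHive.LocalMachine,
                                                     RegistryView.Registry64))
        using (var key = baseKey.OpenSubKey(subkey))
        {
            var release = key?.GetValue("Release") as int?;
            Console.WriteLine(release == null
                ? ".NET 4.5+ not detected"
                : release >= 394802
                    ? $"4.6.2 or later (Release={release})"
                    : $"older than 4.6.2 (Release={release})");
        }
    }
}
```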
OK, I will check that. Just to be sure I got the "cold" and "hot" start right: ... Here I think I can directly say that this does not have a considerable influence: ...
Hi guys,
During the previous test it was not exactly the same assembly; those assemblies were produced by different compilation calls, so each was JIT-ed separately.
Do you mean that writes are just as slow? Thanks, @aienabled
Thanks, @aienabled, for helping. :-) So far it is reproducible on about 3-4 customer computers I know of.
@AqlaSolutions, why writing?
@ab-tools, if the issue is reproducible on other computers then it's more likely to be a software issue than hardware. Other than updating and repairing the .NET Framework, I would also recommend looking into the system monitor to check if there are any background applications which could affect the performance. It could be antivirus software.
@ab-tools, in the ... what do you mean by: ...
@aienabled: Good point, but there was no anti-virus or firewall program running - that's always the first thing I check. :-) But I will create a SystemInfo dump the next time I'm on this customer PC to check the running processes in more detail. @AqlaSolutions: Sorry, this sentence was really a bit confusing. But anyway, I can do such a test again just to be sure.
@ab-tools ok, ensure that you are not making new models with ...
I had the chance to test on the customer computer again and I can't believe it, but we really found the problem!

On the customer computer all Windows Updates were disabled (with the last update at the end of 2014), which is normal as it's a flight simulation computer - everybody who has a system that "just needs to run as it is" will first of all disable all Windows Updates. :-) But due to another application the customer uses, he had to install .NET 4.6.0 on this computer manually. Of course he did not want to enable Windows Update, but I suggested installing the latest bug fix update, .NET 4.6.2, manually again - and this did the trick!

Although our product is just based on .NET 4.0, after upgrading from .NET 4.6.0 to 4.6.2 the massive database loading delay on 64 bit was immediately gone. (The .NET 4.6 releases are in-place updates that replace the runtime, including the new 64 bit RyuJIT compiler, so even applications targeting .NET 4.0 run on it.) I really didn't think that .NET could be the problem here, but obviously version 4.6.0 was a bit buggy. :-)

That was a hard way to go - thanks a lot for your support in finding this issue, which in the end was not related to either of our products at all!
@ab-tools, we learned the same lesson at least two times... Since then the first thing we do is to run the .NET Framework Repair Tool and install the latest patch version (i.e. 4.5 -> 4.5.2, 4.6 -> 4.6.2).
@ab-tools cool, finally we resolved this!
@aienabled: We didn't look into .NET Core yet, to be honest. Our product is still WinForms based. Is that supported by .NET Core 2.0? @AqlaSolutions: Two other short questions: ...
Thanks again!
Unfortunately, no, and it seems it will never be supported. AFAIK, only UWP (which is very similar to WPF) and Xamarin Forms have support now. |
Thank you both! I now also got the DLL loading to work - you just need to use ... But as the runtime compilation works fast, I probably won't use that; it only makes everything more complicated, adding another source of errors. ;-) Thanks again for your great support!
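The exact call is elided above. One hedged possibility, reusing the placeholder names from the earlier sketch: the dll generated by Compile("MySerializer", "MySerializer.dll") contains a TypeModel subclass named after the first argument, which can be loaded and instantiated like this:

```csharp
using System;
using System.IO;
using System.Reflection;
using AqlaSerializer.Meta;

static class LoadPrecompiledExample
{
    static void Main()
    {
        // Assumption: a previous Compile("MySerializer", "MySerializer.dll")
        // run produced a TypeModel subclass called "MySerializer" in that dll.
        var asm = Assembly.LoadFrom("MySerializer.dll");
        var serializer = (TypeModel)Activator.CreateInstance(asm.GetType("MySerializer"));

        using (var file = File.OpenRead("database.bin"))
        {
            var db = (MyDatabaseRoot)serializer.Deserialize(
                file, null, typeof(MyDatabaseRoot));
        }
    }
}
```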
Hello @AqlaSolutions again!
We are still using your great serialization library; just as a reminder, we are the ones with the bigger graph database, as discussed some time ago here:
#1
Everything really works well, with a database file size of about 100 MB and a deserialization time of about 3-5 seconds on most modern computers, which is fine.
But since we changed our product to use 64 bit if available (compiled with "Any CPU"), we do have a few customers - all running Windows 7 64 bit - who complain about extremely long database loading times:
Instead of 3-5 seconds it takes about 5-10 minutes for them!
For testing we then forced our product to start in 32 bit mode (via "CorFlags.exe"), and the database loads "normally" fast again in just 3-5 seconds; but when switching back to 64 bit mode it immediately becomes extremely slow again.
We cannot reproduce this problem ourselves, not even on a Windows 7 64 bit system, so we are running out of ideas as to what could really cause this issue and how to track it down.
Do you have any idea what could cause such an extreme performance drop when deserializing in a 64 bit process instead of a 32 bit process?
Thanks in advance for any ideas on that
Andreas