Test failure Regressions\coreclr\GitHub_22348\Test22348\Test22348.cmd #81120
Hi @mangod9, it is a one-time failure in outerloop. PTAL. Please feel free to remove the blocking label. |
This failed in #81121. Closed the duplicate there.
|
I did a brief investigation at #11360 (comment); the result pointed to runtime/src/coreclr/vm/comutilnative.cpp, lines 949 to 977 (at 3d61658).
|
This comment is related to #81109.
|
Failed again in: runtime-coreclr crossgen2 20230201.1 Failed test:
Error message:
|
Failed again in: runtime-coreclr r2r 20230211.1 Failed tests:
Error message:
Stack trace:
|
@dotnet/crossgen-contrib Another case of this failure happened yesterday:
Unfortunately, the failures seem to occur when compiling various tests, so we can't just disable one or two specific tests to get the CI clean. It looks like most reports have been win-arm64, but some are win-arm. Can someone investigate? @trylek @markples It looks like you have investigated the same issue over in #81109? |
Thanks @janvorli for your feedback. I have reassigned the issue to myself and I'll investigate what's going on. |
Another case: JIT/Methodical/xxobj/sizeof/sizeof64_Target_64Bit_and_arm_il_d/sizeof64_Target_64Bit_and_arm.cmd R2R-CG2 windows arm64 Checked @ Windows.11.Arm64.Open
|
Encouraging news - I've finally managed to locally repro this on my Surface Pro X after about 500 iterations of the DelegateTest. I'm looking into improving internal diagnostics and I'll hopefully be able to shed more light on the issue soon. |
Another case: Interop\StringMarshalling\VBByRefStr\VBByRefStrTest\VBByRefStrTest.cmd |
I understand this a bit more now. The exception thrown from |
Another case: ilasm\PortablePdb\IlasmPortablePdbTests\IlasmPortablePdbTests.cmd R2R-CG2 windows arm64 Checked jitstress1_tiered @ Windows.11.Arm64.Open |
Hmm, I tried to repro this locally, but while I'm usually able to repro it after 5000-10000 iterations when running crossgen2 through dotnet, I was unable to repro it after about 30000 iterations when running crossgen2 through corerun. I tried it with a checked build of corerun, so I guess I'll try once more with a release build of the runtime and, if that still doesn't repro, I'll try to figure out how to catch the dotnet crash in the debugger. |
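For context on what the iteration loop might look like, below is a minimal repro-loop sketch; the command name Test22348.cmd, the iteration cap, and the use of std::system are placeholders for illustration, not the exact harness used in this thread.

```cpp
// Minimal repro-loop sketch: run a flaky test command repeatedly and stop at
// the first failure so the failing state can be inspected. The command line
// and iteration cap below are placeholders, not the actual harness.
#include <cstdio>
#include <cstdlib>

int main()
{
    const char* command = "Test22348.cmd"; // hypothetical test command
    const int maxIterations = 30000;       // roughly the counts mentioned above

    for (int i = 1; i <= maxIterations; ++i)
    {
        int exitCode = std::system(command);
        if (exitCode != 0)
        {
            std::printf("Failure after %d iterations (exit code %d)\n", i, exitCode);
            return 1;
        }
    }

    std::printf("No failure in %d iterations\n", maxIterations);
    return 0;
}
```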
Failed again in: runtime-coreclr r2r-extra 20230226.1 Failed tests:
Error message:
Stack trace:
|
@trylek Have you had any more luck with the repro attempts? We have both this issue and #81109 open. It appears that both have the same initial symptom (resolveToken). 81109 then fails in
whereas this one has
Until recently, 81109 was arm and this one was arm64, but 81109 now also has an arm64 repro. Neither call stack makes a lot of sense (agree?), so do you think it is reasonable to guess that both are coming from some sort of corruption (possibly something arm{,64}-specific like the memory model)? |
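To make the memory-model guess concrete, here is a small illustrative C++ sketch (not code from the runtime) of the kind of publication race that usually stays hidden on x64 but can leave a reader on arm/arm64 observing a pointer before the data it points to:

```cpp
// Illustrative only: a publication race that tends to be benign on x86/x64
// but can surface on weakly ordered CPUs such as arm and arm64.
#include <atomic>
#include <cstdio>
#include <thread>

struct Payload { int value = 0; };

std::atomic<Payload*> g_published{nullptr};

void Writer()
{
    Payload* p = new Payload();
    p->value = 42;
    // Bug: a relaxed store does not order the write to p->value before the
    // pointer becomes visible. The fix would be memory_order_release here,
    // paired with memory_order_acquire in the reader.
    g_published.store(p, std::memory_order_relaxed);
}

void Reader()
{
    Payload* p;
    while ((p = g_published.load(std::memory_order_relaxed)) == nullptr) { }
    // On arm/arm64 this may print 0: the pointer was seen before the payload.
    std::printf("observed value = %d\n", p->value);
}

int main()
{
    std::thread writer(Writer), reader(Reader);
    writer.join();
    reader.join();
    return 0;
}
```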
I am able to reproduce this locally on my arm64 Surface Pro X, but the repro rate is low - it typically takes 5000 to 20000 iterations to trigger the failure - and so far I have been mostly unsuccessful trying to repro this using an instrumented CoreCLR that would let me investigate the issue in more detail in the debugger. Yesterday was the only time I managed to hit a repro with the instrumented runtime; it opened the Just-in-Time debugging dialog, but when I confirmed I wanted to JIT debug the failure in the installed VS 2022 preview, it hung, so it was no use. I keep trying, but I don't have any great ideas on how to debug the failure without using an instrumented local build with symbols etc. |
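One way around the flaky Just-in-Time debugging dialog is to have the instrumented local build spin at the suspected failure point until a debugger attaches manually. A minimal Windows-only sketch of that idea (the function name and placement are hypothetical, not something present in the runtime) looks like this:

```cpp
// Windows-only sketch of a temporary "wait for debugger" hook that an
// instrumented local build could call at the suspected failure point,
// instead of relying on the Just-in-Time debugging dialog.
#include <windows.h>
#include <cstdio>

void WaitForDebuggerAttach()
{
    std::printf("Repro hit in process %lu; attach a debugger now...\n",
                GetCurrentProcessId());
    while (!IsDebuggerPresent())
    {
        Sleep(1000); // poll once per second until a debugger attaches
    }
    DebugBreak();    // break immediately once attached
}
```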
This is crashing in the LKG runtime that comes with the SDK (8.0.0-alpha.1.23058.2, built on Jan 8 2023). The LKG runtime that comes with the SDK was updated last week: https://github.com/dotnet/runtime/pull/82599/files#diff-8df3cd354bc584349d04ad5675b33c042d8b99b741b8b95af394c55e0f5001bfL3 Have we seen any instances of these crashes since the LKG runtime was updated? |
This is an interesting observation, as I've spent several days trying to repro this with the newest CoreCLR runtime without any success, while I'm able to make it crash relatively easily with the older SDK. It would be great to hear that it's fixed now. |
a40d411#diff-8df3cd354bc584349d04ad5675b33c042d8b99b741b8b95af394c55e0f5001bf updated it on Jan 12 from
to
The earliest known failure was in runtime-coreclr outerloop 20230119.3, which seems to support this theory. |
I'm keeping my fingers crossed... |
Failed again in: runtime-coreclr r2r 20230306.1 Failed test:
Error message:
|
Failed in Run: runtime-coreclr outerloop 20230315.3 Failed tests:
Error message:
Stack trace:
|
Failed in Run: runtime-coreclr r2r-extra 20230325.1 Failed tests:
Error message:
Stack trace:
|
Closing alongside #81109 as the SDK has been updated to preview.3.
Failed in run: runtime-coreclr outerloop 20230119.3
Failed tests:
Error message:
Stack trace: