DotNetCoreCLI@2's `dotnet test` hangs indefinitely when a child process inherited the STDIO streams and has not yet exited #17548
Comments
I cannot verify this currently, but this might be fixed by this agent bug fix.
Same issue here.
With the previous version, Azure Pipelines sometimes failed the execution of the tests, not completing the test step. Looking into these failures, I also found the following message in the logs from Azure Pipelines:

> The STDIO streams did not close within 10 seconds of the exit event from process 'C:\hostedtoolcache\windows\dotnet\dotnet.exe'. This may indicate a child process inherited the STDIO streams and has not yet exited.

For discussions of the problem, see also:

+ microsoft/azure-pipelines-tasks#17548
+ microsoft/azure-pipelines-tasks#13033
+ microsoft/azure-pipelines-tasks#18476
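To make the warning above concrete, here is a minimal, hedged sketch (plain C#, not the agent's actual code) of the mechanism it describes: a parent redirects a child's stdout, the child spawns a grandchild that inherits the same pipe handle, and the parent's read on that stream does not complete until the grandchild also exits, even though the direct child is long gone. The `cmd /c start /b ping` combination is just one convenient way to create such a grandchild on Windows.

```csharp
using System;
using System.Diagnostics;

class StdIoInheritanceDemo
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "cmd.exe",
            // "start /b" launches a grandchild (ping) that inherits the redirected
            // stdout handle; cmd.exe itself exits almost immediately.
            Arguments = "/c start /b ping -n 30 127.0.0.1",
            RedirectStandardOutput = true,
            UseShellExecute = false
        };

        using var child = Process.Start(psi)!;
        child.WaitForExit();   // returns quickly: the direct child (cmd.exe) is gone
        Console.WriteLine($"Child exited, HasExited = {child.HasExited}");

        // Blocks until the grandchild (ping) releases the write end of the pipe,
        // which is roughly the condition the agent is waiting on when it prints
        // the "STDIO streams did not close within 10 seconds" warning.
        child.StandardOutput.ReadToEnd();
        Console.WriteLine("Streams finally closed.");
    }
}
```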
I am hitting the same issue. What circumstances make it hang?
I'm experiencing a very similar issue with a different task (in my case https://www.unitydevops.com/). The problem here is that Unity.exe, a complex program outside our control, leaves some child processes alive when it exits cleanly. The executable ends cleanly with exit code 0, but Azure Pipelines still waits indefinitely until it times out and considers the task failed. Unfortunately, Azure Pipelines offers no info about which child process is still running and no means to kill the child processes. Following discussions on various forums, I would even have assumed that it should kill child processes by default and provide some info (error?) about that.
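There is no built-in option for this in the task as far as I know, but if you control the step that launches the tool, one workaround is to sweep up lingering processes yourself once the main executable has returned. The sketch below is a hedged illustration in plain C#: the process name passed in is whatever child your tool is known to leave behind (the `"Unity"` value in the usage comment is purely illustrative), and `Kill(entireProcessTree: true)` requires .NET Core 3.0 or later.

```csharp
using System;
using System.Diagnostics;

static class LingeringProcessCleanup
{
    /// <summary>Kills every remaining process with the given name, including its descendants.</summary>
    public static void KillByName(string processName)
    {
        foreach (var p in Process.GetProcessesByName(processName))
        {
            try
            {
                p.Kill(entireProcessTree: true);   // .NET Core 3.0+; also terminates descendants
                p.WaitForExit(5000);
                Console.WriteLine($"Killed lingering process {p.Id} ({processName}).");
            }
            catch (InvalidOperationException)
            {
                // The process exited on its own between enumeration and Kill; nothing to do.
            }
            finally
            {
                p.Dispose();
            }
        }
    }
}

// Example usage after the main tool has exited (the process name is an assumption):
// LingeringProcessCleanup.KillByName("Unity");
```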
Seeing the same issue with SpecFlow, with chromedriver staying active.
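For the SpecFlow/chromedriver combination, the usual mitigation is to make sure the driver is explicitly quit in an `[AfterScenario]` hook, so chromedriver does not outlive the test host and keep the inherited STDIO streams open. A hedged sketch, assuming a single statically shared driver (the `WebDriverHooks` class and that sharing scheme are illustrative, not taken from this issue):

```csharp
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using TechTalk.SpecFlow;

[Binding]
public sealed class WebDriverHooks
{
    private static IWebDriver? _driver;

    // Lazily created, shared driver; swap in whatever lifetime your tests actually use.
    public static IWebDriver Driver => _driver ??= new ChromeDriver();

    [AfterScenario]
    public static void TearDownDriver()
    {
        // Quit() ends the browser session and shuts down the chromedriver child process,
        // so nothing is left holding the test host's STDIO handles.
        _driver?.Quit();
        _driver = null;
    }
}
```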
Hi @rvairavelu! Is there any news regarding this issue?
Hi guys. Having the same problem here when running multiple test projects. Any workarounds available?
Is it the same problem reported on dotnet/sdk#27106 and dotnet/sdk#9452?
Required Information
Question, Bug, or Feature?
Type: Bug
Enter Task Name: DotNetCoreCLI@2
Environment
Issue Description
Instead of showing the current execution time as usual, executing the `test` command via the DotNetCoreCLI@2 task shows `100%` while hanging indefinitely (until the configured timeout triggers), and no traces of the log can be found anywhere (there are bits and pieces of the log in the agent's `_diag/pages` folder, but nothing complete). This happens when a child process of `dotnet` locks its IO stream(s), and `dotnet test` reports
Task logs
N/A
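To make the failure mode concrete, here is a hedged repro sketch (xUnit is assumed; the report does not name a test framework): a test that starts a long-lived child process without redirecting its output. With `UseShellExecute = false` the child inherits the test host's STDIO handles, so the test itself finishes immediately, but the task keeps waiting for the streams to close until the child exits or the job times out.

```csharp
using System.Diagnostics;
using Xunit;

public class HangRepro
{
    [Fact]
    public void Spawning_a_detached_child_keeps_the_pipeline_waiting()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "ping",
            Arguments = "-n 600 127.0.0.1",   // keeps a child alive for ~10 minutes
            UseShellExecute = false           // child inherits stdout/stderr handles
        };

        Process.Start(psi);

        // The assertion passes right away; the hang shows up after the test run,
        // while DotNetCoreCLI@2 waits for the inherited handles to close.
        Assert.True(true);
    }
}
```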