Miscellaneous
Fix URLs
Fix commands
sdmaclea committed Oct 17, 2019
1 parent 49fb20a commit 04a00d8
Showing 3 changed files with 7 additions and 7 deletions.
docs/core/diagnostics/app_is_leaking_memory_eventual_crash.md (2 additions, 2 deletions)

@@ -62,7 +62,7 @@ The output should be similar to:

Here we can see that right after startup, the managed heap memory is 4 MB.

-Now, let's hit the URL http://localhost:5000/api/diagscenario/memleak/200000
+Now, let's hit the URL [http://localhost:5000/api/diagscenario/memleak/200000](http://localhost:5000/api/diagscenario/memleak/200000)

Rerun the dotnet-counters command. We should see an increase in memory usage as shown below:
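As a reference for the step above, a monitoring invocation along these lines watches the counters live (the PID `4807` is the sample process ID used later in this doc; substitute your own, and treat this as a sketch of the `dotnet-counters` command rather than the doc's exact invocation):

```dotnetcli
dotnet-counters monitor --refresh-interval 1 -p 4807
```

`--refresh-interval` controls how often (in seconds) the counter values are redrawn.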

@@ -89,7 +89,7 @@ When analyzing possible memory leaks, we need access to the app's memory heap. We
Using the previous [Sample debug target](sample-debug-target.md) started above, run the following command to generate a core dump:

```dotnetcli
-sudo ./dotnet-dump collect -p 4807
+sudo dotnet-dump collect -p 4807
```

4807 is the process ID that can be found using `dotnet-trace list-processes`. The result is a core dump located in the same folder.
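Putting the two steps together, a minimal sketch (the PID placeholder is hypothetical; `dotnet-trace list-processes` prints one line per running .NET process, including its process ID):

```dotnetcli
dotnet-trace list-processes
sudo dotnet-dump collect -p <pid-from-the-list>
```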
docs/core/diagnostics/app_running_slow_highcpu.md (1 addition, 1 deletion)

@@ -49,7 +49,7 @@ The output should be similar to the below:

Here we can see that right after startup, the CPU isn't being consumed at all (0%).

-Now, let's hit the URL (http://localhost:5000/api/diagscenario/highcpu/60000)
+Now, let's hit the URL [http://localhost:5000/api/diagscenario/highcpu/60000](http://localhost:5000/api/diagscenario/highcpu/60000)

Rerun the [dotnet-counters](dotnet-counters.md) command. We should see an increase in CPU usage as shown below:

docs/core/diagnostics/hung_app.md (4 additions, 4 deletions)

@@ -39,26 +39,26 @@ dotnet-trace list-processes

Navigate to the following URL:

-http://localhost:5000/api/diagscenario/deadlock
+[http://localhost:5000/api/diagscenario/deadlock](http://localhost:5000/api/diagscenario/deadlock)

Let the request run for about 10-15 seconds, then create the dump:

```dotnetcli
-sudo ./dotnet-dump collect -p 4807
+sudo dotnet-dump collect -p 4807
```

## Analyzing the core dump

To start our investigation, let's open the core dump using `dotnet-dump analyze`:

```dotnetcli
-./dotnet-dump analyze ~/.dotnet/tools/core_20190513_143916
+dotnet-dump analyze ~/.dotnet/tools/core_20190513_143916
```

Since we're looking at a potential hang, we want an overall feel for the thread activity in the process. We can use the `threads` command as shown below:

```console
-threads
+> threads
*0 0x1DBFF (121855)
1 0x1DC01 (121857)
2 0x1DC02 (121858)
Expand Down
