
Lock contention in LoggerFactory.CreateLogger #83609

Closed · davidfowl opened this issue Mar 17, 2023 · 2 comments · Fixed by #87904

@davidfowl (Member)

Description

There's lock contention when creating a logger via LoggerFactory.CreateLogger: the same global lock that protects mutation of filters and logger providers is taken on every call, and it shows up under heavy load.

Will fill in more details later, filing this as a reminder.
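
For context, a minimal sketch of the shape of the problem (assumed shape with hypothetical names, not the actual runtime source): the logger cache and the provider/filter state share one monitor, so concurrent CreateLogger calls serialize even on pure cache hits.

```csharp
using System.Collections.Generic;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;

// Minimal sketch of the contention pattern (assumed shape with hypothetical
// names, not the actual runtime source). The logger cache and the
// provider/filter state share one monitor, so concurrent CreateLogger calls
// serialize on _sync even when the requested logger already exists.
public sealed class ContendedLoggerFactory
{
    private readonly object _sync = new object();
    private readonly Dictionary<string, ILogger> _loggers = new();

    public ILogger CreateLogger(string categoryName)
    {
        lock (_sync) // the same lock AddProvider and filter reloads would take
        {
            if (!_loggers.TryGetValue(categoryName, out ILogger logger))
            {
                logger = CreateLoggerCore(categoryName);
                _loggers[categoryName] = logger;
            }
            return logger;
        }
    }

    // Placeholder: a real factory would apply filter rules and wrap the
    // registered providers here.
    private static ILogger CreateLoggerCore(string categoryName) => NullLogger.Instance;
}
```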

@davidfowl added the tenet-performance (Performance related issue) label on Mar 17, 2023
@ghost added the untriaged (New issue has not been triaged by the area owner) label on Mar 17, 2023

@ghost commented Mar 17, 2023

Tagging subscribers to this area: @dotnet/area-extensions-logging
See info in area-owners.md if you want to be subscribed.

@tarekgh added this to the 8.0.0 milestone on Mar 17, 2023
@ghost removed the untriaged (New issue has not been triaged by the area owner) label on Mar 17, 2023

@brantburnett (Contributor) commented

I was curious about this and did a bit of playing around (in advance of the details). ConcurrentDictionary on its own doesn't seem to work, because the lock guards more than just the collection.

I tried ReaderWriterLockSlim since, in theory, the locks would be almost entirely reader locks. While I believe this eliminated a lot of the contention (untested), it increased time on the single-threaded read path by about 40%.

| Method       | Job        | Mean     | Error    | StdDev   | Median   | Min      | Max      | Ratio | RatioSD |
|--------------|------------|----------|----------|----------|----------|----------|----------|-------|---------|
| CreateLogger | Monitor    | 31.21 ns | 0.784 ns | 0.872 ns | 31.47 ns | 28.91 ns | 32.17 ns | 1.00  | 0.00    |
| CreateLogger | RWLockSlim | 43.63 ns | 1.171 ns | 1.349 ns | 43.35 ns | 42.16 ns | 46.33 ns | 1.40  | 0.05    |
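
For reference, a rough sketch of what the ReaderWriterLockSlim variant could look like (my reconstruction with hypothetical names, not the code that was actually benchmarked):

```csharp
using System.Collections.Generic;
using System.Threading;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;

// Rough reconstruction of the ReaderWriterLockSlim experiment (hypothetical
// names, not the benchmarked code). Cache hits take only the read lock;
// misses retry under the write lock so two racing creators can't both insert.
public sealed class RwlsLoggerFactory
{
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
    private readonly Dictionary<string, ILogger> _loggers = new();

    public ILogger CreateLogger(string categoryName)
    {
        _lock.EnterReadLock();
        try
        {
            if (_loggers.TryGetValue(categoryName, out ILogger existing))
                return existing; // the common, read-only path
        }
        finally
        {
            _lock.ExitReadLock();
        }

        _lock.EnterWriteLock();
        try
        {
            // Double-check: another thread may have created it meanwhile.
            if (!_loggers.TryGetValue(categoryName, out ILogger logger))
            {
                logger = NullLogger.Instance; // placeholder for real creation
                _loggers[categoryName] = logger;
            }
            return logger;
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }
}
```

Even an uncontended EnterReadLock/ExitReadLock pair is more work than a plain monitor enter/exit, which would be consistent with the ~40% single-threaded regression above.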

Then I tried this approach using a delayed build of a FrozenDictionary: 05d5c63

The (unproven) theory here is:

- Consumers using DI only will get the same path as before, because they never call CreateLogger for the same name more than once (the logger is cached in the DI container as a singleton)
- Loggers tend to be mostly initialized early in the life of the application from a limited list of names, then stabilize to a read-only scenario
- Consumers using LoggerFactory.CreateLogger directly will eventually hit the optimized path once a threshold is passed (the threshold would need to be tuned with heuristics)

Because it pays the (more expensive) cost of building a read-optimized FrozenDictionary, this actually improves the speed of dictionary reads after the threshold is passed, even before considering the elimination of lock contention. If the assumptions above hold, this would mean an amortized overall improvement in exchange for a bit more up-front computation to build the optimized dictionary.
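
A minimal sketch of the delayed-freeze idea (hypothetical names and threshold; the real experiment is in the commit linked above):

```csharp
using System.Collections.Frozen;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;

// Sketch of the delayed FrozenDictionary build (hypothetical names and
// threshold; the real experiment is in the commit linked above). Reads first
// probe a lock-free frozen snapshot; the locked dictionary remains the source
// of truth, and the snapshot is rebuilt after enough hits on a stable set.
public sealed class FreezingLoggerFactory
{
    private const int FreezeThreshold = 256; // would need tuning via heuristics

    private readonly object _sync = new object();
    private readonly Dictionary<string, ILogger> _loggers = new();
    private volatile FrozenDictionary<string, ILogger> _frozen;
    private int _hitsSinceChange;

    public ILogger CreateLogger(string categoryName)
    {
        // Lock-free fast path once a snapshot has been built.
        FrozenDictionary<string, ILogger> frozen = _frozen;
        if (frozen != null && frozen.TryGetValue(categoryName, out ILogger cached))
            return cached;

        lock (_sync)
        {
            if (!_loggers.TryGetValue(categoryName, out ILogger logger))
            {
                logger = NullLogger.Instance; // placeholder for real creation
                _loggers[categoryName] = logger;
                _frozen = null;           // snapshot is stale; start counting again
                _hitsSinceChange = 0;
            }
            else if (_frozen == null && ++_hitsSinceChange >= FreezeThreshold)
            {
                // Expensive one-time build of a read-optimized snapshot,
                // amortized over all subsequent lock-free reads.
                _frozen = _loggers.ToFrozenDictionary();
            }
            return logger;
        }
    }
}
```

ToFrozenDictionary deliberately spends extra time at construction to produce a faster lookup structure, which is what the amortization argument relies on.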

| Method       | Job        | NumLoggers | Mean     | Error    | StdDev   | Median   | Min      | Max      | Ratio | RatioSD |
|--------------|------------|------------|----------|----------|----------|----------|----------|----------|-------|---------|
| CreateLogger | Monitor    | 128        | 33.30 ns | 0.841 ns | 0.864 ns | 33.54 ns | 31.37 ns | 34.63 ns | 1.00  | 0.00    |
| CreateLogger | FrozenDict | 128        | 17.82 ns | 0.486 ns | 0.559 ns | 17.91 ns | 16.70 ns | 18.84 ns | 0.54  | 0.02    |
| CreateLogger | Monitor    | 1024       | 33.14 ns | 0.997 ns | 1.148 ns | 33.30 ns | 30.79 ns | 34.73 ns | 1.00  | 0.00    |
| CreateLogger | FrozenDict | 1024       | 19.02 ns | 0.466 ns | 0.499 ns | 19.05 ns | 18.22 ns | 20.04 ns | 0.58  | 0.03    |

I guess we'll see if this makes sense once the problem is explained fully.

@tarekgh modified the milestones: 8.0.0, Future on Jun 1, 2023
@ghost added the in-pr (There is an active PR which will close this issue when it is merged) label on Jun 22, 2023
@ghost removed the in-pr label on Jun 22, 2023
@ghost locked as resolved and limited conversation to collaborators on Jul 22, 2023