
Restrict trace events to correct log level #264

Merged
7 commits merged into open-telemetry:main on Sep 25, 2024

Conversation

alecsammon
Contributor

We noticed during testing that debug messages were appearing as events in our traces.

This change only adds events if the level would be handled by the logger - i.e. the level is at least the same as the defined minimum log level.
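
For illustration, a minimal sketch of the gating idea, assuming the underlying logger is Monolog (whose Logger::isHandling() reports whether a record at a given level would be handled by any handler); the function name and structure here are simplified, not the exact patch:

<?php
use Monolog\Logger;

// Sketch only: skip emitting a span event when the logger would not
// handle a record at this level (e.g. "debug" below a "warning" threshold).
function shouldRecordEvent(Logger $logger, string $level): bool
{
    return $logger->isHandling(Logger::toMonologLevel($level));
}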

alecsammon requested a review from a team on May 17, 2024 at 14:45

welcome bot commented May 17, 2024

Thanks for opening your first pull request! If you haven't yet signed our Contributor License Agreement (CLA), please do so, so that we can accept your contribution. A link should appear shortly in this PR if you have not already signed one.


linux-foundation-easycla bot commented May 17, 2024

CLA Signed

The committers listed above are authorized under a signed CLA.


codecov bot commented May 17, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 84.80%. Comparing base (9b97522) to head (b3cee45).
Report is 4 commits behind head on main.


@@            Coverage Diff            @@
##               main     #264   +/-   ##
=========================================
  Coverage     84.80%   84.80%           
  Complexity      947      947           
=========================================
  Files            80       80           
  Lines          3798     3798           
=========================================
  Hits           3221     3221           
  Misses          577      577           
Flag Coverage Δ
Aws 85.75% <ø> (ø)
Context/Swoole 0.00% <ø> (ø)
Instrumentation/CodeIgniter 73.94% <ø> (ø)
Instrumentation/ExtAmqp 89.58% <ø> (ø)
Instrumentation/Guzzle 69.73% <ø> (ø)
Instrumentation/HttpAsyncClient 81.33% <ø> (ø)
Instrumentation/IO 70.90% <ø> (ø)
Instrumentation/MongoDB 77.33% <ø> (ø)
Instrumentation/OpenAIPHP 86.82% <ø> (ø)
Instrumentation/PDO 89.56% <ø> (ø)
Instrumentation/Psr14 78.12% <ø> (ø)
Instrumentation/Psr15 93.50% <ø> (ø)
Instrumentation/Psr16 97.50% <ø> (ø)
Instrumentation/Psr18 82.08% <ø> (ø)
Instrumentation/Psr3 61.03% <ø> (ø)
Instrumentation/Psr6 97.61% <ø> (ø)
Instrumentation/Slim 86.95% <ø> (ø)
Instrumentation/Symfony 89.03% <ø> (ø)
Instrumentation/Yii 77.77% <ø> (ø)
Logs/Monolog 100.00% <ø> (ø)
Propagation/ServerTiming 100.00% <ø> (ø)
Propagation/TraceResponse 100.00% <ø> (ø)
ResourceDetectors/Container 93.02% <ø> (ø)
Shims/OpenTracing 92.99% <ø> (ø)
Symfony 88.20% <ø> (ø)

Flags with carried forward coverage won't be shown.



Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@brettmc
Collaborator

brettmc commented May 18, 2024

Looks like a sensible change to me. Can you take a look at the phan issue? It might just need to be suppressed...

@alecsammon
Contributor Author

Thanks Brett. I've added a phan suppression - hopefully the build will be 🟢 now.

@brettmc
Collaborator

brettmc commented May 18, 2024

@alecsammon progress, now some phpstan issues. FYI you can run all of the CI checks locally: PHP_VERSION=8.3 PROJECT=Instrumentation/Laravel make all

$underlyingLogger = $this->logger->getLogger();

/** @phan-suppress-next-line PhanUndeclaredMethod */
if (method_exists($underlyingLogger, 'isHandling') && !$underlyingLogger->isHandling($log->level)) {
Contributor Author

It looks like isHandling is only available in Monolog - my understanding is that someone could pass a custom logger.

I'm not sure I like the method_exists - but maybe it's the safest way of ensuring this doesn't break any code?

Contributor

Hey Alec 👋

I had the same understanding, as it looks like PSR-3 doesn't account for that functionality.

This isn't the first method_exists() usage at least across contrib. Another option could be an instanceof check to see if the method exists and is compatible.

I wonder if OTel has some filtering mechanism in general for this, as I imagine the PSR-3 instrumentation would log everything as well.
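
For comparison, a rough sketch of the instanceof-based alternative mentioned above, assuming Monolog is the concrete logger of interest; this is illustrative only, not the approach the PR ultimately takes:

<?php
use Monolog\Logger as MonologLogger;
use Psr\Log\LoggerInterface;

// Sketch only: narrow the PSR-3 logger to Monolog before calling
// isHandling(), instead of probing with method_exists().
function levelWouldBeHandled(LoggerInterface $logger, string $level): bool
{
    if ($logger instanceof MonologLogger) {
        return $logger->isHandling(MonologLogger::toMonologLevel($level));
    }

    // Unknown logger implementation: assume the level is handled, which
    // preserves the previous behaviour of always adding the event.
    return true;
}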

Contributor Author

👋🏻 Hi!

I do think that this isn't perfect - but it maybe prevents at least some leaking of debug logs into environments where they shouldn't be.

Looks like a processor could also be used - i.e. in a collector - to give more options for dropping events - see https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/processor/filterprocessor/README.md.
However this is maybe more advanced - and for people with basic setups it's probably better to have this as a fallback.
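
For illustration only, a hypothetical collector filterprocessor snippet along those lines, in YAML. It assumes the span events expose the log level in an attribute named "level", which is an assumption about how the events are recorded rather than anything this PR defines:

processors:
  # Sketch only: drop debug-level span events in the collector, as an
  # alternative (or addition) to filtering in the instrumentation.
  filter/drop-debug-span-events:
    error_mode: ignore
    traces:
      spanevent:
        - 'attributes["level"] == "debug"'

Such a processor would still need to be added to a traces pipeline in the collector's service section.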

Contributor

Is it expected that the underlying logger could change? If not, maybe it is possible to get the underlying one already in register() and only assign it to a property if method_exists passes there.

src/Instrumentation/Laravel/src/Watchers/LogWatcher.php (outdated, resolved)

@alecsammon
Contributor Author

@brettmc - I'm hoping that the build should all be sorted now!

Are you able to re-run the workflow when you're able?

@alecsammon
Contributor Author

Investigating build failures again - sorry :(

@alecsammon
Contributor Author

I'm wondering if this is the right approach at all.

It may be better to use an environment variable to determine which levels should be logged.

E.g. https://opentelemetry-python-contrib.readthedocs.io/en/latest/instrumentation/logging/logging.html#envvar-OTEL_PYTHON_LOG_LEVEL

https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/ does describe an OTEL_LOG_LEVEL, which is maybe the right one to use? Or would OTEL_PHP_LOG_LEVEL be better?
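
As a thought experiment only, a sketch of what an environment-variable gate could look like. The variable name OTEL_PHP_LOG_LEVEL and its semantics here are hypothetical, not an existing SDK or instrumentation option:

<?php
use Psr\Log\LogLevel;

// Sketch only: rank PSR-3 level names and compare against a hypothetical
// OTEL_PHP_LOG_LEVEL environment variable (defaulting to "debug").
function eventLevelEnabled(string $level): bool
{
    $rank = [
        LogLevel::DEBUG => 0, LogLevel::INFO => 1, LogLevel::NOTICE => 2,
        LogLevel::WARNING => 3, LogLevel::ERROR => 4, LogLevel::CRITICAL => 5,
        LogLevel::ALERT => 6, LogLevel::EMERGENCY => 7,
    ];
    $minimum = strtolower(getenv('OTEL_PHP_LOG_LEVEL') ?: LogLevel::DEBUG);

    return ($rank[strtolower($level)] ?? 0) >= ($rank[$minimum] ?? 0);
}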

@agoallikmaa
Contributor

I looked into how Java instrumentation handles this, and there, for the most common logging library (see the Logback appender instrumentation), the instrumented method itself is only ever called if some log level checks have already passed, so this filter technically exists there, even if only because the instrumentation hook is never reached otherwise. So there is definitely also a precedent for this approach in other OpenTelemetry implementations.

As for what a global configuration option would do or if one is even needed, that could be a separate topic of discussion. If such an option were to be added, it could act merely as an additional filter, for example to exclude debug and lower even if they are configured to get logged somewhere otherwise.

@brettmc
Collaborator

brettmc commented May 22, 2024

I'm wondering whether attaching logs as trace events is the right thing to do at all? Is there any precedent? OTel has a logging signal, and we already have contrib modules which will take Monolog/PSR-3 log messages and forward them in the OpenTelemetry format, including trace + span id.

If we were to go with an environment variable, I don't think OTEL_LOG_LEVEL is the correct one, because that's for configuring the SDK logger, ie errors, warnings etc coming out of our SDK, which is unrelated to application log level.

@bobstrecansky
Collaborator

@alecsammon - are you still interested in pushing this forward?

brettmc merged commit e2ba8fa into open-telemetry:main on Sep 25, 2024
104 of 117 checks passed