[loggingexporter] Add InstrumentationLibararyScope attributes #5976
Conversation
Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #5976   +/-  ##
=======================================
  Coverage   92.24%   92.24%
=======================================
  Files         212      212
  Lines       13263    13264    +1
=======================================
+ Hits        12235    12236    +1
  Misses        811      811
  Partials      217      217
```
These are added after the processors are evaluated. So if the logging exporter implements instrumentation_scope, it will break the LogQL json pipeline processor, and there is no way to manage that without removing it from the export code.
@ElementalWarrior I am not sure I follow. This PR only made the logging exporter log the instrumentation scope attributes. How does that break the pipeline? Do you mean that the InstrumentationLibraryScope attributes themselves break your pipeline?
Our logs don't include attributes in the downstream Loki data. By adding these after the processors, I am unable to filter them out. I cannot do: So yes, because of the presence of the attributes, it breaks the JSON pipeline operator in LogQL.
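For illustration, the failing pattern is a LogQL query with a `json` pipeline stage. The stream selector and label below are hypothetical, a sketch of the kind of query involved:

```logql
{job="lambda-logs"} | json | level = "error"
```

When the stored log line is no longer plain JSON because extra attribute text is appended, the `json` stage errors out instead of extracting labels.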
To make sure I understand correctly: Do they show up even though you don't expect them, or do they not show up even though you expect them? I am also not sure how you get your logs: are you parsing the logs output of the OTel collector?
So yes, I wasn't expecting them to show up. We have a lambda that processes AWS ECS stdout, does some minor processing, and then ships to the opentelemetry-collector. The collector then writes to a Loki cluster, which we query with Grafana. I have as of now changed the lambda to set It seems odd to append these attributes after the Effectively, after updating the otel collector version, log statements went from:
to
And I was unable to remove them with the collector itself. When querying with Grafana, the json function/pipeline processor fails with the latter.
The instrumentation library scope attributes must have been on the collector's internal representation of the data the whole time, just not visible in the loggingexporter. In my understanding, the loggingexporter is basically a debugging tool for inspecting the data that flows through the collector. This change just added them to the data dumped to the console, and I think it is important for them to be printed if they exist in the data. Honestly, I am surprised that you are parsing the collector's output instead of using something like the Loki exporter or exporting OTLP to Loki directly (not sure if Loki can accept OTLP). I never expected the loggingexporter to be used as an input for downstream processing. Have you looked into the transformprocessor to remove these attributes? The Scope context in particular might be interesting: https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/pkg/ottl/contexts/ottlscope
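As a concrete sketch of that suggestion (the attribute key below is a placeholder; substitute whatever keys actually show up in your data), a transformprocessor configuration using the scope context to drop an attribute could look like:

```yaml
processors:
  transform:
    log_statements:
      - context: scope
        statements:
          # "example.scope.attribute" is a hypothetical key, not a real attribute name
          - delete_key(attributes, "example.scope.attribute")
```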
I'm not parsing the output of the collector.
Perhaps my investigation into the source of the added attributes led me to the wrong PR. I have not tried the transform processor; I had tried the attribute processor. Perhaps that's where the problem lay when trying to remove them.
Ah, I see, then I misunderstood. If you recently updated your collector, there might have been some change in the internal data format. I don't think this issue is the culprit then. Unfortunately, I don't have any other pointers for you in that case :(
Description:
Added the Scope attributes to be logged in the logging exporter.
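For context, a minimal collector configuration that exercises this change might look like the following (a sketch, assuming the `loglevel: debug` setting the logging exporter used at the time):

```yaml
receivers:
  otlp:
    protocols:
      grpc:

exporters:
  # the logging exporter dumps received telemetry, now including scope attributes, to stdout
  logging:
    loglevel: debug

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [logging]
```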