
[loggingexporter] Add InstrumentationLibraryScope attributes #5976

Merged

Conversation

pirgeo
Member

@pirgeo pirgeo commented Aug 26, 2022

Description:
Added the Scope attributes to be logged in logging exporter

@pirgeo pirgeo requested review from a team and dmitryax August 26, 2022 08:21
@codecov

codecov bot commented Aug 26, 2022

Codecov Report

Merging #5976 (63b4bac) into main (940943c) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff           @@
##             main    #5976   +/-   ##
=======================================
  Coverage   92.24%   92.24%           
=======================================
  Files         212      212           
  Lines       13263    13264    +1     
=======================================
+ Hits        12235    12236    +1     
  Misses        811      811           
  Partials      217      217           
Impacted Files Coverage Δ
...er/loggingexporter/internal/otlptext/databuffer.go 99.12% <100.00%> (+<0.01%) ⬆️


@bogdandrutu bogdandrutu merged commit 25192af into open-telemetry:main Aug 26, 2022
@pirgeo pirgeo deleted the log-scope-attributes branch August 26, 2022 16:22
@ElementalWarrior

These are added after the processors are evaluated. So if the logs exporter emits instrumentation_scope attributes, it will break the LogQL json pipeline processor, and there is no way to manage that without removing it from the export code.

@pirgeo
Member Author

pirgeo commented Mar 20, 2023

@ElementalWarrior I am not sure I follow. This PR only made the logging exporter print the instrumentation scope attributes that are already in the data. How does that break the pipeline? Do you mean that the InstrumentationLibraryScope attributes themselves break your pipeline?

@ElementalWarrior

Our logs don't include attributes in the downstream Loki data.

Because these are added after the processors, I am unable to filter them out.

I cannot do
{foo="bar"} | json
in LogQL on the message body in that case.

So yes, because of the presence of the attributes, it breaks the JSON pipeline operator in LogQL.

@pirgeo
Member Author

pirgeo commented Mar 21, 2023

To make sure I understand correctly: Do they show up even though you don't expect them, or do they not show up even though you expect them?

I am also not sure how you get your logs: are you parsing the logs output of the OTel collector?

@ElementalWarrior

So yes, I wasn't expecting them to show up.

We have a lambda that processes AWS ECS stdout, does some minor processing, and then ships the data to the opentelemetry-collector.

The collector then writes to a loki cluster, which we query with grafana.

For now, I have changed the lambda to set instrumentation_scope=None in the Python code that ships the LogData class.

It seems odd to append these attributes after the processor step in the collector pipelines. It takes away control from the administrators of the collector instance, and would force another processing step downstream of the collector.

Effectively, after updating the OTel collector version, log statements went from:

{"some_key": "some_value", ...}

to

instrumentation_scope_version=1 instrumentation_scope_library=whatever {"some_key": "some_value", ...}

And I was unable to remove them with the collector itself.

When querying with grafana, the json function/pipeline processor fails with the latter.

@pirgeo
Member Author

pirgeo commented Mar 21, 2023

The instrumentation library scope attributes must have been in the collector's internal representation of the data the whole time, just not visible in the loggingexporter. In my understanding, the loggingexporter is basically a debugging tool to look at the data that flows through the collector. This change just added them to the data dumped to the console, and I think it is important for them to be printed if they exist in the data.

Honestly, I am surprised that you are parsing the collector's output instead of using something like the Loki exporter or exporting OTLP to Loki directly (not sure if Loki can accept OTLP). I never expected the loggingexporter to be used as an input for downstream processing.

Have you looked into the transformprocessor to remove these attributes? Especially the Scope context might be interesting: https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/pkg/ottl/contexts/ottlscope
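For illustration, a minimal sketch of what that could look like; the processor name and attribute key below are placeholders, and it assumes a collector-contrib build recent enough to support the scope context:

  processors:
    transform/drop_scope_attrs:
      log_statements:
        - context: scope
          statements:
            - delete_key(attributes, "placeholder.scope.attribute")

The processor would then also need to be added to the logs pipeline's processors list for it to take effect.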

@ElementalWarrior

I'm not parsing the output of the collector.

  logs:
      receivers: [otlp]
      processors:
        - batch
      exporters:
        - loki

Perhaps my investigation into the source of the attributes being added came to the wrong PR.

I have not tried the transform processor; I had tried using the attributes processor. Perhaps that's where the problem lay when trying to remove this.
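For reference, a sketch of the sort of attributes processor configuration that was tried (the key name is a placeholder); as far as I can tell, that processor edits log record attributes rather than scope attributes, which might explain why it did not help here:

  processors:
    attributes/logs:
      actions:
        - key: placeholder.attribute
          action: delete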

@pirgeo
Member Author

pirgeo commented Mar 21, 2023

Ah, I see, then I misunderstood. If you recently updated your collector, there might have been some change in the internal data format. I don't think this PR is the culprit then. Unfortunately, I don't have any other pointers for you in that case :(
