
Container tagging for traces in the datadog exporter #19398

Closed
nils-borrmann-y42 opened this issue Mar 8, 2023 · 6 comments
Labels
enhancement (New feature or request) · exporter/datadog (Datadog components) · Stale · waiting for author

Comments

nils-borrmann-y42 commented Mar 8, 2023

Component(s)

exporter/datadog

Is your feature request related to a problem? Please describe.

In my service instrumentation I tag my traces with the attributes container.id and container.name so that traces can be linked to the corresponding infrastructure container metrics in Datadog. The trace attributes do arrive in Datadog and are transformed into the Datadog format of container_id and container_name, but Datadog still does not connect the trace to the infrastructure container metrics.
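For reference, a minimal sketch of that kind of instrumentation, assuming the Python OpenTelemetry SDK; the service name, container id, and container name below are placeholders, not real values:

```python
# Minimal sketch, assuming the Python OpenTelemetry SDK; the values below
# are placeholders for the real container id and name.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider

resource = Resource.create({
    "service.name": "my-service",               # placeholder service name
    "container.id": "0123456789abcdef" * 4,     # 64-char hex container id
    "container.name": "my-app-container",       # container name as known to k8s
})
trace.set_tracer_provider(TracerProvider(resource=resource))
```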

Looking through the code, I figured out that the Datadog exporter would need to set the attribute _dd.tags.container. This even seems to have been implemented in the past (originally implemented in #1895, removed in #9693?).

I confirmed this by manually setting the _dd.tags.container attribute in my service instrumentation code, and sure enough, traces and container metrics were connected in Datadog.

(For anyone stumbling on this issue with the same problem: what you need to do is set this attribute on your OTel resource: "_dd.tags.container": "container_id:{},container_name:{}", where container_id is a 64-character hexadecimal string and container_name is the plain-text name of the container in k8s.)
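A sketch of that workaround, again assuming the Python OpenTelemetry SDK; the container id and name are placeholders for your own values:

```python
# Workaround sketch: set _dd.tags.container manually in the OTel resource.
# container_id / container_name below are placeholders for your own values.
from opentelemetry.sdk.resources import Resource

container_id = "0123456789abcdef" * 4   # 64-char hexadecimal container id
container_name = "my-app-container"     # plain-text container name in k8s

resource = Resource.create({
    "container.id": container_id,
    "container.name": container_name,
    "_dd.tags.container": f"container_id:{container_id},container_name:{container_name}",
})
```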

Describe the solution you'd like

Populate the _dd.tags.container tag with the container attributes needed to connect traces and infrastructure metrics.
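Illustratively, the mapping being asked for would look roughly like the sketch below. This is only an illustration of the attribute mapping, not the exporter's actual Go code, and the function name is hypothetical:

```python
# Sketch of the requested mapping only; the real exporter is written in Go
# and this function name is hypothetical.
from typing import Optional


def build_dd_container_tag(resource_attrs: dict) -> Optional[str]:
    """Build the Datadog _dd.tags.container value from OTel container.* attributes."""
    parts = []
    if "container.id" in resource_attrs:
        parts.append(f"container_id:{resource_attrs['container.id']}")
    if "container.name" in resource_attrs:
        parts.append(f"container_name:{resource_attrs['container.name']}")
    return ",".join(parts) if parts else None


# build_dd_container_tag({"container.id": "abc123", "container.name": "web"})
# -> "container_id:abc123,container_name:web"
```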

Describe alternatives you've considered

No response

Additional context

I also opened a ticket with Datadog (1121695) about this; their response helped get me on the right track.

@nils-borrmann-y42 nils-borrmann-y42 added enhancement New feature or request needs triage New item requiring triage labels Mar 8, 2023
@github-actions github-actions bot added the exporter/datadog Datadog components label Mar 8, 2023

github-actions bot commented Mar 8, 2023

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@atoulme atoulme removed the needs triage New item requiring triage label Mar 10, 2023
github-actions bot commented

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label May 10, 2023
@mx-psi mx-psi removed the Stale label May 10, 2023

gbbr commented May 10, 2023

Hi @nils-borrmann-y42 👋🏻 I think this should work now, provided you are using >=v0.76.3. Can you please try it out and report back?

github-actions bot commented

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

github-actions bot commented

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Sep 11, 2023

mx-psi commented Sep 11, 2023

I am going to close this as completed, given that we released a fix for it and we haven't gotten a reply on this thread. If the problem persists, please comment so we can reopen the issue. Thanks!

@mx-psi mx-psi closed this as completed Sep 11, 2023