
Using Sentry breadcrumbs and tags alongside OpenTelemetry isn't working correctly #7538

Closed
3 tasks done
seeARMS opened this issue Mar 20, 2023 · 40 comments · Fixed by #7577
Comments

@seeARMS

seeARMS commented Mar 20, 2023

Is there an existing issue for this?

How do you use Sentry?

Sentry Saas (sentry.io)

Which SDK are you using? If you use the CDN bundles, please specify the exact bundle (e.g. bundle.tracing.min.js) in your SDK setup.

@sentry/node

SDK Version

7.43.0

Framework Version

No response

Link to Sentry event

https://paragraph.sentry.io/performance/node:08b586a3e77243548a96ac02db296d0f/?project=6606495&query=&statsPeriod=5m&transaction=POST+%2Fmetrics%2Fpost-track

SDK Setup

Sentry.init({
  dsn: "MY_DSN",

  enabled: isProduction,

  instrumenter: "otel",

  release: process.env.COMMIT,
  environment: process.env.FLY_APP_NAME || "development",


  tracesSampler: (samplingContext) => {
    // Drop transactions for routes we don't care about
    for (const route of SENTRY_IGNORED_ROUTES) {
      if (samplingContext?.request?.url?.endsWith(route)) return 0
    }

    // Drop transactions for non-production environments. (We still care about errors
    // on non-production environments, but we don't need to see all the transactions.)
    if (process.env.FLY_APP_NAME !== "paragraph-master") return 0

    return TRACE_SAMPLE_PERCENTAGE
  },

  profilesSampleRate: 1.0, // Profiling sample rate is relative to tracesSampleRate

  beforeSend: (event) => {
    if (!event.request?.data) return event

    const size = JSON.stringify(event.request?.data || "").length

    // Arbitrarily cut off massive payloads. Sentry silently drops events that are too large,
    // so trim them.
    // See https://github.com/getsentry/sentry-javascript/issues/3036#issuecomment-1066491190
    if (size > 500000) {
      event.request.data = JSON.stringify(event.request.data).slice(0, 500000)
    }

    return event
  },

  integrations: [new ProfilingIntegration()],
})

// Only load OpenTelemetry in production,
// so we aren't spamming Sentry with test data.
if (isProduction) {
  const sdk = new opentelemetry.NodeSDK({
    /*
    sampler: new opentelemetry.tracing.TraceIdRatioBasedSampler(
      TRACE_SAMPLE_PERCENTAGE
    ),
    */
    // Existing config
    traceExporter: new OTLPTraceExporter(),
    instrumentations: [
      new HttpInstrumentation({
        ignoreIncomingRequestHook(req) {
          for (const route of SENTRY_IGNORED_ROUTES) {
            if (req.url.endsWith(route)) {
              return true
            }
          }
          return false
        },
      }),

      new ExpressInstrumentation(),
      new GrpcInstrumentation({
        ignoreGrpcMethods: [
          "google.logging.v2.LoggingServiceV2/WriteLogEntries",
          new RegExp(".*google.logging.*"),
          new RegExp(".*WriteLogEntries.*"),
          (name) => {
            return name.startsWith("google.logging")
          },
        ],
      }),
    ],

    // Sentry config
    spanProcessor: new SentrySpanProcessor(),
  })

  otelApi.propagation.setGlobalPropagator(new SentryPropagator())

  sdk.start()
}

Steps to Reproduce

I've been happily using Sentry for some time, using @sentry/node with Express.

Today we switched from using Sentry tracing to OpenTelemetry-based tracing, using @sentry/opentelemetry-node. Transactions are getting logged fine, but it appears that breadcrumbs and tags are not correctly added to the transaction.

Viewing any Sentry event (e.g. the one linked above) shows tags from what seem like other requests. Additionally, the breadcrumbs don't correspond to this request either.

There's a chance I'm doing something incorrectly in Sentry.init() above. For example, I don't need to use any Sentry tracing functionality now that I have OpenTelemetry tracing working, right? The documentation here is a bit light, so it's unclear to me.

Additionally, I'm getting an exception on startup (see logs below).

Expected Result

I'd expect to be able to use Sentry.addBreadcrumb and Sentry.setTag throughout the codebase, and have the breadcrumb and tag added to the current transaction/span created by OpenTelemetry.

Actual Result

I get this non-fatal exception on startup:

2023-03-20T19:21:52Z app[53d5f709] iad [info]Error: @opentelemetry/api: Attempted duplicate registration of API: propagation
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at registerGlobal (/workspace/node_modules/@opentelemetry/api/build/src/internal/global-utils.js:32:21)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at PropagationAPI.setGlobalPropagator (/workspace/node_modules/@opentelemetry/api/build/src/api/propagation.js:52:50)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at NodeTracerProvider.register (/workspace/node_modules/@opentelemetry/sdk-trace-base/build/src/BasicTracerProvider.js:100:31)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at NodeTracerProvider.register (/workspace/node_modules/@opentelemetry/sdk-trace-node/build/src/NodeTracerProvider.js:43:15)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at NodeSDK.start (/workspace/node_modules/@opentelemetry/sdk-node/build/src/sdk.js:158:24)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at Object.<anonymous> (/workspace/packages/api/dist/trace.cjs:127:7)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at Module._compile (node:internal/modules/cjs/loader:1159:14)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at Module._extensions..js (node:internal/modules/cjs/loader:1213:10)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at Module.load (node:internal/modules/cjs/loader:1037:32)
2023-03-20T19:21:52Z app[53d5f709] iad [info]    at Module._load (node:internal/modules/cjs/loader:878:12)

If I replace the NodeSDK init with the following, the error goes away, but I don't know if it's actually using SentryPropagator now:

  const sdk = new opentelemetry.NodeSDK({
    textMapPropagator: new SentryPropagator(),
... 
})
@AbhiPrasad
Member

Hey @seeARMS, thanks for writing in!

To isolate your breadcrumbs/tags, you need to use our express request handler!

As per the docs,

// The request handler must be the first middleware on the app
app.use(Sentry.Handlers.requestHandler());

This should make sure sentry data is isolated.
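Concretely, with the SDK versions in this thread (Sentry v7), the wiring would look something like the following. This is a sketch assuming an otherwise standard Express app; the route and port are placeholders.

```javascript
const express = require("express");
const Sentry = require("@sentry/node");

const app = express();

// The request handler must be the first middleware on the app, so that
// every incoming request gets its own isolated Sentry scope.
app.use(Sentry.Handlers.requestHandler());

// Routes are registered after it.
app.get("/", (req, res) => {
  // This breadcrumb stays scoped to the current request.
  Sentry.addBreadcrumb({ message: "handling /" });
  res.send("ok");
});

// The error handler goes after the routes, before any other error middleware.
app.use(Sentry.Handlers.errorHandler());

app.listen(8085);
```

If requestHandler is registered after other middleware, anything those earlier handlers add (breadcrumbs, tags, logs) lands on a shared scope and can bleed across requests.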

I get this non-fatal exception on startup:

Yes, we need to adjust our docs for this; using a dependency-injected propagator is more correct than the global propagation API. Will fix!

@AbhiPrasad
Member

Created getsentry/sentry-docs#6501 and #7548 to update the docs!

@seeARMS
Author

seeARMS commented Mar 21, 2023

Thanks for the help!

I gave that a try and now no breadcrumbs or tags are getting logged.

Transaction after the change, with no tags/breadcrumbs: https://paragraph.sentry.io/performance/node:7cb35584c76c4a598319153e8ad6b3aa/?project=6606495&query=&referrer=performance-transaction-summary&statsPeriod=2m&transaction=GET+%2Fblogs%2FxeXAY7C8qmb1h6Cp8zTa%2Fcommunities&unselectedSeries=p100%28%29

Same transaction but different event which occurred yesterday, with the erroneous tags/breadcrumbs as described above: https://paragraph.sentry.io/performance/node:a87a3b8398ee42e39c33dc5b9b1955f5/?project=6606495&query=&referrer=performance-transaction-summary&showTransactions=recent&statsPeriod=24h&transaction=GET+%2Fblogs%2FxeXAY7C8qmb1h6Cp8zTa%2Fcommunities&unselectedSeries=p100%28%29

Do I need to add the tracing integration (new Tracing.Integrations.Express({ app })) to Sentry.init, as well as the tracing handler middleware (app.use(Sentry.Handlers.tracingHandler()))? Or is this not needed since I'm now using OpenTelemetry?

@AbhiPrasad
Member

AbhiPrasad commented Mar 21, 2023

You shouldn't need to use the Express integration nor the tracingHandler, since we are relying completely on OpenTelemetry to generate the spans.

The Sentry.Handlers.requestHandler is needed because it isolates the context between your requests. I don't understand why this is not working though 🤔. Is the requestHandler the first Express middleware getting registered?

Need to reproduce this, will try tomorrow! If you could provide a quick repro, would be a huge help as well!

Also 👋 to fellow uwaterloo grad ^^

@seeARMS
Author

seeARMS commented Mar 21, 2023

Here's a repro: https://github.com/seeARMS/sentry-javascript-7538-reproduction

Clone it, run npm install, then npm run start. It instantiates OpenTelemetry and Sentry, and the transaction is created successfully when the / route is hit, but the single breadcrumb never shows up in Sentry.

Here's an event in question that was logged using the repro (no breadcrumb present): https://paragraph.sentry.io/performance/node:f116399485744a8ab2452906b6d39fee/?environment=sentry-reproduction&project=6606495&query=&referrer=performance-transaction-summary&statsPeriod=2m&transaction=GET+%2F&unselectedSeries=p100%28%29

And hello! Always great to see other uWaterloo grads in the wild 😄

@AbhiPrasad
Member

Hey @seeARMS, thank you so much for the reproduction! It took me a while to debug, but the issue is due to an inconsistency between how we isolate requests and how OpenTelemetry isolates requests. I opened a PR to fix this: #7577

Thank you for your patience while we figured it out - this will be part of the next release of the SDK (going out end of week).

@AbhiPrasad AbhiPrasad self-assigned this Mar 22, 2023
@seeARMS
Author

seeARMS commented Mar 23, 2023

Nice! Thanks for the quick turn-around; really appreciate it!

@seeARMS
Author

seeARMS commented Aug 9, 2023

@AbhiPrasad, this seems to be happening again - did anything change in Sentry or OpenTelemetry?

A given transaction seems to have a lot of tags, breadcrumbs, and console.logs from other requests.

Eg: https://paragraph.sentry.io/issues/4018310436/?project=6606495&query=is%3Aunresolved&referrer=issue-stream&stream_index=1

Using @sentry/node and @sentry/opentelemetry-node version 7.61.1.

@AbhiPrasad
Member

Hey, apologies for the trouble - let's try to figure out what's happening.

Looking at https://github.com/getsentry/sentry-javascript/commits/develop/packages/opentelemetry-node, nothing seems to have changed for the top-level otel integration, but maybe it was a regression in the core functionality or otel itself.

Did you upgrade your node version since then? Also could you share your otel version?

A reproduction would help speed up the investigation. I don't have a ton of bandwidth over the next couple of days; @mydea, could you help take a look?

@AbhiPrasad AbhiPrasad reopened this Aug 9, 2023
@AbhiPrasad AbhiPrasad removed their assignment Aug 9, 2023
@seeARMS
Author

seeARMS commented Aug 9, 2023

I cloned my existing repo above (https://github.com/seeARMS/sentry-javascript-7538-reproduction), updated all dependencies to the latest using npm-check-updates (ncu --upgrade), and now the / route isn't even getting logged anymore.

Using the old dependencies (e.g. cloning the repo above as-is) shows Sentry working as intended:

➜ npm run start
npm WARN cli npm v9.8.1 does not support Node.js v17.8.0. This version of npm supports the following node versions: `^14.17.0 || ^16.13.0 || >=18.0.0`. You can find the latest version at https://nodejs.org/.

> start
> node --require ./tracing.js app.js

Sentry Logger [log]: Integration installed: InboundFilters
Sentry Logger [log]: Integration installed: FunctionToString
Sentry Logger [log]: Integration installed: Console
Sentry Logger [log]: HTTP Integration is skipped because of instrumenter configuration.
Sentry Logger [log]: Integration installed: Http
Sentry Logger [log]: Integration installed: Undici
Sentry Logger [log]: Integration installed: OnUncaughtException
Sentry Logger [log]: Integration installed: OnUnhandledRejection
Sentry Logger [log]: Integration installed: ContextLines
Sentry Logger [log]: Integration installed: LocalVariables
Sentry Logger [log]: Integration installed: Context
Sentry Logger [log]: Integration installed: Modules
Sentry Logger [log]: Integration installed: RequestData
Sentry Logger [log]: Integration installed: LinkedErrors
Sentry Logger [log]: Integration installed: ProfilingIntegration
http server listening on port 8085
Sentry Logger [log]: [Tracing] starting undefined transaction - GET
Sentry Logger [log]: [Profiling] started profiling transaction: GET
Sentry Logger [log]: [Tracing] Starting '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Sentry Logger [log]: [Tracing] Finishing '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Sentry Logger [log]: [Tracing] Starting '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Sentry Logger [log]: [Tracing] Finishing '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Sentry Logger [log]: [Tracing] Starting '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Sentry Logger [log]: [Tracing] Finishing '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Sentry Logger [log]: [Tracing] Starting '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Sentry Logger [log]: [Tracing] Finishing '< unknown op >' span on transaction 'GET' (9cfa7e02b183fb92).
Triggered route
Sentry Logger [log]: [Profiling] stopped profiling of transaction: GET /
Sentry Logger [log]: [Tracing] Finishing http.server transaction: GET /.
Sentry Logger [log]: [Tracing] starting undefined transaction - POST
Sentry Logger [log]: [Profiling] started profiling transaction: POST

After upgrading all deps to the latest, there are no Sentry debug lines showing that tracing or profiling is working:

➜ npm run start

> start
> node --require ./tracing.js app.js

Sentry Logger [log]: Integration installed: InboundFilters
Sentry Logger [log]: Integration installed: FunctionToString
Sentry Logger [log]: Integration installed: Console
Sentry Logger [log]: HTTP Integration is skipped because of instrumenter configuration.
Sentry Logger [log]: Integration installed: Http
Sentry Logger [log]: Integration installed: Undici
Sentry Logger [log]: Integration installed: OnUncaughtException
Sentry Logger [log]: Integration installed: OnUnhandledRejection
Sentry Logger [log]: Integration installed: ContextLines
Sentry Logger [log]: Integration installed: LocalVariables
Sentry Logger [log]: Integration installed: Context
Sentry Logger [log]: Integration installed: Modules
Sentry Logger [log]: Integration installed: RequestData
Sentry Logger [log]: Integration installed: LinkedErrors
Sentry Logger [log]: Integration installed: ProfilingIntegration
http server listening on port 8085
Triggered route

Here are my OTEL versions of the prod install:

    "@opentelemetry/api": "^1.4.1",
    "@opentelemetry/instrumentation-bunyan": "^0.31.1",
    "@opentelemetry/instrumentation-fs": "^0.7.1",
    "@opentelemetry/instrumentation-grpc": "^0.36.0",
    "@opentelemetry/instrumentation-http": "^0.36.0",
    "@opentelemetry/sdk-node": "^0.36.1",
    "@sentry/opentelemetry-node": "^7.61.1",
    "@sentry/profiling-node": "^1.1.2",
    "@sentry/tracing": "^7.61.1",
    "opentelemetry-instrumentation-express": "^0.39.1",

I don't believe I've upgraded Node.

@mydea
Member

mydea commented Aug 10, 2023

Thank you for the repro! I dug into it, and it seems the problem is @opentelemetry/instrumentation-grpc v0.41.2; it works when pinned to v0.41.1.

It works for me with these dependencies:

"@opentelemetry/api": "1.4.1",
"@opentelemetry/core": "1.15.2",
"@opentelemetry/exporter-trace-otlp-grpc": "0.41.2",
"@opentelemetry/instrumentation-express": "0.33.0",
"@opentelemetry/instrumentation-grpc": "0.41.1",
"@opentelemetry/instrumentation-http": "0.41.2",
"@opentelemetry/sdk-node": "0.41.2",
"@opentelemetry/semantic-conventions": "1.15.2",
"@sentry/node": "7.62.0",
"@sentry/opentelemetry-node": "7.62.0",
"@sentry/profiling-node": "1.1.2",

I need to dig a bit further, but there are two relevant changes between 0.41.1 and 0.41.2:

(just looking at the "not capturing at all" issue right now, not yet into breadcrumbs)

@mydea
Member

mydea commented Aug 10, 2023

After some further digging, I opened an issue for the gRPC instrumentation here: open-telemetry/opentelemetry-js#4053

@seeARMS
Author

seeARMS commented Aug 29, 2023

Thanks! Saw your comments. I upgraded all @sentry packages to 7.65.0 and confirmed the GrpcInstrumentation is instantiated after the HttpInstrumentation, but tags/breadcrumbs are still not getting logged correctly.

@mydea
Member

mydea commented Jan 30, 2024

@seeARMS How are you getting samplingContext?.request?.url?

I am only getting:

{
  samplingContext: {
    transactionContext: { name: 'GET', parentSampled: undefined },
    parentSampled: undefined
  }
}

#10399 (comment)

and looking at the implementation of @sentry/opentelemetry, that's the only thing they pass:

https://github.com/getsentry/sentry-javascript/blob/7dc92604e4a9448a17512d5622e28a3976b9267e/packages/opentelemetry/src/sampler.ts#L56C24-L62

We'll adjust this (still WIP) so that the tracesSampler receives the span's attributes as the sampling context. That would work for this case, wouldn't it?
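Under that assumption (that samplingContext.attributes will carry the span's OTel attributes once the WIP change lands), the route filter from the Sentry.init earlier in this thread could be rewritten against them. A sketch; the `http.target` attribute name follows OTel HTTP semantic conventions and may differ in practice:

```javascript
// Sketch assuming samplingContext.attributes carries the span's OTel
// attributes. Route list and rate mirror the config earlier in this thread.
const SENTRY_IGNORED_ROUTES = ["/metrics/post-track"];
const TRACE_SAMPLE_PERCENTAGE = 0.1;

const tracesSampler = (samplingContext) => {
  const target = samplingContext.attributes?.["http.target"];
  if (target && SENTRY_IGNORED_ROUTES.some((r) => target.endsWith(r))) {
    return 0; // drop ignored routes entirely
  }
  return TRACE_SAMPLE_PERCENTAGE;
};
```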

@gajus
Contributor

gajus commented Jan 30, 2024

We'll adjust this (still WIP) so that the tracesSampler receives the attributes of the span as sampling context - that would work for this case, wouldn't it?

Any pointers on how to work around this in the meantime?

@mydea
Member

mydea commented Jan 30, 2024

I'll open a PR to add this for the next release!

@mydea
Member

mydea commented Jan 30, 2024

See: #10426

@mydea
Member

mydea commented Jan 31, 2024

Moving this here, out of the PR - reply to #10426 (comment):

@mydea Related – how do I know inside of request if the request was sampled or not?

I need to generate and inject sentry-trace meta attribute.

<meta content="${sentryTrace}" name="sentry-trace" />

I currently do this:

import { context, trace } from '@opentelemetry/api';

const activeContext = context.active();

const activeSpan = trace.getSpan(activeContext);

if (!activeSpan) {
  return {
    sentryTrace: null,
    spanId: null,
    traceId: null,
  };
}

const { spanId, traceId } = activeSpan.spanContext();

const sentryTrace = `${traceId}-${spanId}-0`;

but I need to figure out how to adjust the -0 to -1 when the request is sampled.

const traceparentData = spanToTraceHeader(span);
const dynamicSamplingContext = dynamicSamplingContextToSentryBaggageHeader(
  getDynamicSamplingContextFromSpan(span),
);
const contentMeta = `<head>
    <meta name="sentry-trace" content="${traceparentData}"/>
    <meta name="baggage" content="${dynamicSamplingContext}"/>
`;

Should work?

@gajus
Contributor

gajus commented Jan 31, 2024

Just to save others some time, spanToTraceHeader, dynamicSamplingContextToSentryBaggageHeader and getDynamicSamplingContextFromSpan come from @sentry/utils and @sentry/core.

@mydea The span accepted by these methods is not the same as what is returned by OpenTelemetry's getSpan.

@gajus
Contributor

gajus commented Jan 31, 2024

This was useful:

/**
 * @see https://github.com/getsentry/sentry-javascript/blob/8bec42e0285ee301e8fc9bcaf02046daf48e0495/packages/core/src/utils/spanUtils.ts#L103
 */
export const spanIsSampled = (span: opentelemetry.api.Span) => {
  const { traceFlags } = span.spanContext();

  // eslint-disable-next-line no-bitwise
  return Boolean(traceFlags & 0x1);
};
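Combining this with the earlier snippet that builds the sentry-trace value, the trailing sampled flag can be derived from the same bit. A small self-contained sketch; `buildSentryTrace` is a hypothetical helper name, not a Sentry API:

```javascript
// Sampled means bit 0x1 of the OTel traceFlags is set, as in
// spanIsSampled above.
const spanIsSampled = (span) => {
  const { traceFlags } = span.spanContext();
  // eslint-disable-next-line no-bitwise
  return Boolean(traceFlags & 0x1);
};

// Hypothetical helper composing the sentry-trace value, with -1 for
// sampled requests and -0 for unsampled ones.
const buildSentryTrace = (span) => {
  const { traceId, spanId } = span.spanContext();
  return `${traceId}-${spanId}-${spanIsSampled(span) ? "1" : "0"}`;
};
```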

@mydea
Member

mydea commented Feb 1, 2024

Just to save others some time, spanToTraceHeader, dynamicSamplingContextToSentryBaggageHeader and getDynamicSamplingContextFromSpan come from @sentry/utils and @sentry/core.

@mydea The span accepted by these methods is not the same as what is returned by OpenTelemetry's getSpan.

Ah right, of course - this work is still WIP. In v8, spans will be the same everywhere: you will generally get OpenTelemetry spans when interacting with spans. We hope to have an alpha version of this in a few weeks, where you should be able to have a fully integrated experience with the Sentry SDK & OpenTelemetry!

@seeARMS
Author

seeARMS commented Feb 8, 2024

Generally looks good. I think whatever the baseLogger is doing might lose the async context that isolates requests.

Any other suggestions on what I can do here? I'm a bit at my wits' end and have spent far too long trying to get this to work.

@lforst
Member

lforst commented Feb 9, 2024

@seeARMS Nope, I am lost. Either we have a bug in the SDK or your logger is the offender. Can you reproduce this with a minimal setup and share it with us? That would be amazing!

@AbhiPrasad
Member

Things have changed a lot since we last updated this issue: we now have official OTel support, with the SDK using OpenTelemetry by default, which should make this experience much better.

Express with no custom OTEL setup: https://docs.sentry.io/platforms/javascript/guides/express/

Custom OTEL setup (override sampler, processor, exporter): https://docs.sentry.io/platforms/javascript/guides/express/tracing/instrumentation/opentelemetry/#using-a-custom-opentelemetry-setup
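For reference, the no-custom-OTel path from the first link boils down to initializing the SDK before anything else is loaded. A minimal sketch of that default setup (SDK v8+); the DSN and sample rate are placeholders:

```javascript
// instrument.js: minimal sketch of the current default setup, where
// @sentry/node wires up OpenTelemetry itself. DSN and rate are placeholders.
const Sentry = require("@sentry/node");

Sentry.init({
  dsn: "MY_DSN",
  tracesSampleRate: 0.1, // replaces the custom OTel sampler/processor wiring
});

// Load this file before anything else, e.g.:
//   node --require ./instrument.js app.js
```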

Closing this issue as such.
