OTEL span logging in HotRod example #4550
Conversation
f12c2ae to bf37a8a

```go
	return logger
}

if otelSpan := trace.SpanFromContext(ctx); otelSpan != nil {
```
A hack to fall back to fetching the OTEL span when fetching the OT span fails.
Thanks @albertteoh for hacking on this. I suspected we would need something like this, and you made it happen. Awesome! 💯
Codecov Report: patch coverage has no change; project coverage changed by -0.02%.

```
@@            Coverage Diff             @@
##             main    #4550      +/-   ##
==========================================
- Coverage   97.05%   97.04%   -0.02%
==========================================
  Files         301      301
  Lines       17817    17817
==========================================
- Hits        17292    17290       -2
- Misses        421      423       +2
  Partials      104      104
```
I think this reveals one gotcha we will need to work around everywhere: when plain OTEL is used higher in the call chain, the OT instrumentation in lower layers will not find the span (OTEL->OT), while the reverse apparently works (OT->OTEL). This is something to point out in the blog, and also in the bridge README.
```go
@@ -53,7 +53,14 @@ func (b Factory) For(ctx context.Context) Logger {
			zap.String("span_id", otelSpan.SpanContext().SpanID().String()),
		}
	}

	return logger
}
```
I don't recall where L50-55 came from; they seem to do something similar.
```go
	spanFields []zapcore.Field
}

func (sl otelSpanLogger) Debug(msg string, fields ...zapcore.Field) {
```
These methods do not directly depend on the underlying span, so we could keep a single spanLogger type and differentiate which span it is dealing with lower in the implementation.
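The suggestion could look something like this minimal, stdlib-only sketch: keep one `spanLogger` and hide the span difference behind a small interface. `eventRecorder` and both span types here are hypothetical stand-ins for the OT and OTEL span APIs.

```go
package main

import "fmt"

// eventRecorder is the only thing the logger needs from a span.
type eventRecorder interface {
	recordEvent(msg string)
}

type otSpan struct{ events []string }

// In the real code this would map to the OT span.LogFields call.
func (s *otSpan) recordEvent(msg string) { s.events = append(s.events, msg) }

type otelSpan struct{ events []string }

// In the real code this would map to the OTEL span.AddEvent call.
func (s *otelSpan) recordEvent(msg string) { s.events = append(s.events, msg) }

// spanLogger's methods are span-agnostic; only the recorder differs.
type spanLogger struct{ span eventRecorder }

func (sl spanLogger) Debug(msg string) { sl.span.recordEvent("debug: " + msg) }
func (sl spanLogger) Error(msg string) { sl.span.recordEvent("error: " + msg) }

func main() {
	ot, otel := &otSpan{}, &otelSpan{}
	spanLogger{span: ot}.Debug("finding nearest drivers")
	spanLogger{span: otel}.Error("redis timeout")
	fmt.Println(ot.events, otel.events)
}
```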
```go
	logger     *zap.Logger
	span       trace.Span
	spanFields []zapcore.Field
}
```
Another thing worth looking into: when the Context contains an OT span generated by the OTEL Bridge, is there a way to access the real OTEL Span that it wraps? If so, rather than maintaining two implementations here, we can simply upgrade this spanLogger to work only with the OTEL API.
Namely, if they don't have it already, I recommend creating a PR for the bridge to implement an Unwrap method:

```go
func Unwrap(opentracing.Span) (trace.Span, error)
```
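One way such an Unwrap could work is a type assertion on the bridge's concrete span type. This is a stdlib-only sketch of the idea, not the bridge's actual API: `bridgeSpan`, `plainSpan`, and `otelSpan` are hypothetical stand-ins, and the real bridge would need to expose its wrapped `trace.Span` for this to be possible.

```go
package main

import (
	"errors"
	"fmt"
)

// Hypothetical stand-in for the wrapped OTEL span.
type otelSpan struct{ op string }

// Hypothetical stand-in for the opentracing.Span interface.
type otSpan interface{ operation() string }

// bridgeSpan models a bridge-created OT span wrapping an OTEL span.
type bridgeSpan struct{ wrapped *otelSpan }

func (b bridgeSpan) operation() string { return b.wrapped.op }

// plainSpan models an OT span not created by the bridge.
type plainSpan struct{}

func (plainSpan) operation() string { return "plain" }

// unwrap returns the OTEL span hidden inside a bridge-created OT span.
func unwrap(s otSpan) (*otelSpan, error) {
	if b, ok := s.(bridgeSpan); ok {
		return b.wrapped, nil
	}
	return nil, errors.New("span was not created by the OTEL bridge")
}

func main() {
	inner := &otelSpan{op: "drive"}
	got, err := unwrap(bridgeSpan{wrapped: inner})
	fmt.Println(got == inner, err)
}
```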
It doesn't look like they have a public method for that: https://pkg.go.dev/go.opentelemetry.io/otel/bridge/opentracing
It looks like we can keep a single spanLogger: when the span is written to the context, its tracer is obtained and, because it's a bridge tracer, it's able to inject the span the "OTEL-way", with the correct context key and OTEL span interface. See: https://github.com/opentracing/opentracing-go/blob/master/gocontext.go#L14.
I've opened another PR that takes advantage of this fact: #4556.
Superseded by #4556.
## Short description of the changes

- A cleaned-up version of #4550.
- It uses a single spanLogger for both OpenTracing and OpenTelemetry spans.
- The logging works for both types of spans because of [this clever line of code](https://github.com/opentracing/opentracing-go/blob/master/gocontext.go#L14), where `tracerWithHook` is an OTEL bridge tracer, allowing the span to be stored in the `ctx` the "OTEL-way", making the span discoverable as a legitimate OTEL span such that `ok == true`:

```go
if span, ok := ctx.Value(currentSpanKey).(Span); ok { ... }
```

## Testing

Ran locally to confirm logs appearing for both OpenTracing spans:

![Screenshot 2023-07-01 at 8 12 16 pm](https://github.com/jaegertracing/jaeger/assets/26584478/572b037a-8fbb-43aa-abdf-6a8a2a137f89)

and OpenTelemetry spans:

![Screenshot 2023-07-01 at 8 12 08 pm](https://github.com/jaegertracing/jaeger/assets/26584478/681da3b3-8f07-4f85-b60e-8235592c89bd)

Signed-off-by: albertteoh <[email protected]>
## Description

This is a throw-away draft to illustrate the changes necessary to get logging working in the HotRod example after switching `driver.Server` to use the OTEL tracer.

The key to the problem is that the `otelgrpc.UnaryServerInterceptor` introduced during the switch, AFAIK, inserts an OTEL span into the context when extracting the span context. This means the `logger.For(ctx)` lookup will fail to find an OT span and return the default stdout logger, which explains why the logs are visible in the terminal but not in spans when run locally.

This PR adds an `otelSpanLogger`, which is almost identical to the OT `spanLogger` but with a different (incomplete) encoding implementation. When extracted from the context, it successfully logs to the span as an event and surfaces as a log entry in the Jaeger UI.

Note that only the bare minimum, least-effort changes were introduced, so they should not simply be copy-pasted as a solution but carefully considered before incorporating. Little care was taken for code correctness, style, etc.
The changes from this PR are made on top of #4548, so there may be some unrelated changes present.