feat: add spans to Trace::ExportError #1582

Merged · 4 commits · Jan 25, 2024
14 changes: 13 additions & 1 deletion sdk/lib/opentelemetry/sdk/trace/export.rb
@@ -10,7 +10,19 @@ module Trace
       # The Export module contains the built-in exporters and span processors for the OpenTelemetry
       # reference implementation.
       module Export
-        ExportError = Class.new(OpenTelemetry::Error)
+        # Raised when an export fails; spans are available via :spans accessor
+        class ExportError < OpenTelemetry::Error
+          # Returns the {Span} array for this exception
+          #
+          # @return [Array<OpenTelemetry::SDK::Trace::Span>]
+          attr_reader :spans
+
+          # @param [Array<OpenTelemetry::SDK::Trace::Span>] spans the array of spans that failed to export
+          def initialize(spans)
+            super("Unable to export #{spans.size} spans")
+            @spans = spans
+          end
+        end
 
         # Result codes for the SpanExporter#export method and the SpanProcessor#force_flush and SpanProcessor#shutdown methods.

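With the spans attached to the exception, a custom error handler can see exactly which spans failed to export instead of only a count. A minimal sketch follows (not part of this PR); it assumes the standard OpenTelemetry.error_handler hook, which OpenTelemetry.handle_error invokes with exception: and message: keywords, and it converts the failed Spans with to_span_data just as the batch processor (below) does.

# Sketch only: a custom error handler that inspects ExportError#spans.
# Assumes OpenTelemetry.error_handler is the hook called by
# OpenTelemetry.handle_error with exception:/message: keywords.
OpenTelemetry.error_handler = lambda do |exception: nil, message: nil|
  if exception.is_a?(OpenTelemetry::SDK::Trace::Export::ExportError)
    # The failed spans ride along on the exception; convert them to
    # SpanData to read their names.
    names = exception.spans.map(&:to_span_data).map(&:name)
    OpenTelemetry.logger.warn("export failed for spans: #{names.join(', ')}")
  else
    OpenTelemetry.logger.error([message, exception&.message].compact.join(' - '))
  end
end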
23 changes: 10 additions & 13 deletions sdk/lib/opentelemetry/sdk/trace/export/batch_span_processor.rb
@@ -110,7 +110,7 @@ def force_flush(timeout: nil) # rubocop:disable Metrics/MethodLength
               remaining_timeout = OpenTelemetry::Common::Utilities.maybe_timeout(timeout, start_time)
               return TIMEOUT if remaining_timeout&.zero?
 
-              batch = snapshot.shift(@batch_size).map!(&:to_span_data)
+              batch = snapshot.shift(@batch_size)
               result_code = export_batch(batch, timeout: remaining_timeout)
               return result_code unless result_code == SUCCESS
             end
@@ -162,7 +162,7 @@ def work
                 @condition.wait(@mutex, @delay_seconds) while spans.empty? && @keep_running
                 return unless @keep_running
 
-                fetch_batch
+                spans.shift(@batch_size)
               end
 
               @metrics_reporter.observe_value('otel.bsp.buffer_utilization', value: spans.size / max_queue_size.to_f)
@@ -183,35 +183,32 @@ def reset_on_fork(restart_thread: true)
             OpenTelemetry.handle_error(exception: e, message: 'unexpected error in BatchSpanProcessor#reset_on_fork')
           end
 
-          def export_batch(batch, timeout: @exporter_timeout_seconds)
+          def export_batch(span_array, timeout: @exporter_timeout_seconds)
+            batch = span_array.map(&:to_span_data)

Contributor review comment on the line above:
✅ calling to_span_data here instead of elsewhere in bsp
✅ no longer mutating span_array so that we can pass an array of Spans to report_result

             result_code = @export_mutex.synchronize { @exporter.export(batch, timeout: timeout) }
-            report_result(result_code, batch)
+            report_result(result_code, span_array)
             result_code
           rescue StandardError => e
-            report_result(FAILURE, batch)
+            report_result(FAILURE, span_array)
             @metrics_reporter.add_to_counter('otel.bsp.error', labels: { 'reason' => e.class.to_s })
             FAILURE
           end
 
-          def report_result(result_code, batch)
+          def report_result(result_code, span_array)
             if result_code == SUCCESS
               @metrics_reporter.add_to_counter('otel.bsp.export.success')
-              @metrics_reporter.add_to_counter('otel.bsp.exported_spans', increment: batch.size)
+              @metrics_reporter.add_to_counter('otel.bsp.exported_spans', increment: span_array.size)
             else
-              OpenTelemetry.handle_error(exception: ExportError.new("Unable to export #{batch.size} spans"))
+              OpenTelemetry.handle_error(exception: ExportError.new(span_array))
               @metrics_reporter.add_to_counter('otel.bsp.export.failure')
-              report_dropped_spans(batch, reason: 'export-failure')
+              report_dropped_spans(span_array, reason: 'export-failure')
             end
           end
 
           def report_dropped_spans(dropped_spans, reason:, function: nil)
             @metrics_reporter.add_to_counter('otel.bsp.dropped_spans', increment: dropped_spans.size, labels: { 'reason' => reason, OpenTelemetry::SemanticConventions::Trace::CODE_FUNCTION => function }.compact)
           end
 
-          def fetch_batch
-            spans.shift(@batch_size).map!(&:to_span_data)
-          end
-
           def lock
             @mutex.synchronize do
               yield
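The non-mutation point in the review comment is the crux of the change: the old fetch_batch used map!, which converted the shifted batch in place, so by the time an export failed only SpanData structs remained and ExportError could report nothing but a count. Converting with map inside export_batch leaves span_array holding the Span objects, which can then be handed to report_result and attached to ExportError. A self-contained sketch of the distinction, using a hypothetical FakeSpan stand-in rather than the real SDK Span:

# Illustration only: map! vs map, with FakeSpan standing in for SDK Span.
FakeSpan = Struct.new(:name) do
  def to_span_data
    "span-data(#{name})"   # stands in for Span#to_span_data
  end
end

spans = [FakeSpan.new('a'), FakeSpan.new('b')]

# Old behaviour: map! converts the batch in place, so no Span objects remain.
old_batch = spans.dup
old_batch.map!(&:to_span_data)
old_batch    # => ["span-data(a)", "span-data(b)"] - nothing left to attach on failure

# New behaviour: map returns a new array, leaving span_array intact so it can
# be passed to report_result and ExportError.new when the export fails.
span_array = spans.dup
batch = span_array.map(&:to_span_data)
batch        # => ["span-data(a)", "span-data(b)"] handed to the exporter
span_array   # => [#<struct FakeSpan name="a">, #<struct FakeSpan name="b">] still Spans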