drop failed to exporter batches and return error when forcing flush a span processor #1860

Merged (6 commits into open-telemetry:main on Apr 29, 2021)
Conversation

@paivagustavo (Member) commented:
This resolves a problem in the batch span processor and adds a nice-to-have error report to the ForceFlush() method.

Failed span batches are now dropped as expected. Quoting specs:

Failure - exporting failed. The batch must be dropped. For example, this can happen when the batch contains bad data and cannot be serialized.

The BatchSpanProcessor.ForceFlush() method now correctly reports export errors. Quoting specs:

ForceFlush SHOULD provide a way to let the caller know whether it succeeded, failed or timed out.

Fixes #1833
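
For illustration, here is a minimal sketch of the changed export path. It assumes the batchSpanProcessor's fields (bsp.batch, bsp.batchMutex, bsp.e) and its exportSpans method from the SDK, plus the standard context import; the body below only approximates the behavior described above and is not the merged code:

func (bsp *batchSpanProcessor) exportSpans(ctx context.Context) error {
	bsp.batchMutex.Lock()
	defer bsp.batchMutex.Unlock()

	if len(bsp.batch) > 0 {
		err := bsp.e.ExportSpans(ctx, bsp.batch)
		// Drop the batch whether or not the export succeeded: per the
		// spec quoted above, a failed batch is dropped, not retried.
		bsp.batch = bsp.batch[:0]
		if err != nil {
			return err
		}
	}
	return nil
}

With that shape, ForceFlush can surface whatever error exportSpans returns, satisfying the "succeeded, failed or timed out" requirement quoted above.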


codecov bot commented Apr 28, 2021

Codecov Report

Merging #1860 (0ad6cc6) into main (f6a9279) will increase coverage by 0.0%.
The diff coverage is 100.0%.


@@          Coverage Diff          @@
##            main   #1860   +/-   ##
=====================================
  Coverage   78.6%   78.6%           
=====================================
  Files        137     137           
  Lines       7304    7305    +1     
=====================================
+ Hits        5743    5749    +6     
+ Misses      1316    1313    -3     
+ Partials     245     243    -2     
Impacted Files                        Coverage Δ
sdk/trace/batch_span_processor.go     87.1% <100.0%> (+3.1%) ⬆️
exporters/trace/jaeger/jaeger.go      92.8% <0.0%> (+0.5%) ⬆️

@paivagustavo added the area:trace (Part of OpenTelemetry tracing) and pkg:SDK (Related to an SDK package) labels on Apr 28, 2021
@Aneurysm9 (Member) left a comment:
It shouldn't matter now that the batch is reset regardless of errors, but there's a line in processQueue() that should be updated. It's currently:

shouldExport := len(bsp.batch) == bsp.o.MaxExportBatchSize

and should probably be:

shouldExport := len(bsp.batch) >= bsp.o.MaxExportBatchSize
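
For context, a rough sketch of how that check sits in the span-enqueue path. The helper name enqueueAndMaybeExport is invented for illustration, and the surrounding details (field names, the ReadOnlySpan parameter, error handling via otel.Handle) only approximate the SDK's processQueue loop:

func (bsp *batchSpanProcessor) enqueueAndMaybeExport(ctx context.Context, sd ReadOnlySpan) {
	bsp.batchMutex.Lock()
	bsp.batch = append(bsp.batch, sd)
	// >= rather than ==: if the batch ever grows past MaxExportBatchSize
	// (the runaway-batch scenario from #1833), the export still fires
	// instead of the trigger being missed from then on.
	shouldExport := len(bsp.batch) >= bsp.o.MaxExportBatchSize
	bsp.batchMutex.Unlock()
	if shouldExport {
		if err := bsp.exportSpans(ctx); err != nil {
			otel.Handle(err)
		}
	}
}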

Review threads on sdk/trace/batch_span_processor.go and sdk/trace/batch_span_processor_test.go (×2) were marked outdated and resolved.
@MrAlias MrAlias merged commit e399d35 into open-telemetry:main Apr 29, 2021
@paivagustavo paivagustavo deleted the drop_failed_batches branch April 29, 2021 16:43
@Aneurysm9 Aneurysm9 mentioned this pull request Jun 17, 2021
@pellared pellared added this to the untracked milestone Nov 8, 2024
Labels
area:trace (Part of OpenTelemetry tracing), pkg:SDK (Related to an SDK package)
Projects
None yet
Development
Successfully merging this pull request may close these issues:
Batch processor batch size will grow endlessly on error
4 participants