Concurrent batch processor features → batch processor #11248
Closed
Conversation
Confused how you fix #11213?
@bogdandrutu that was a mistake.
bogdandrutu reviewed Sep 30, 2024
Hi Josh, this change is a bit too large and complicated to be reviewed in one PR. Can we split the 2 functionalities in 2 different PRs?
- The PR for multiple consumers.
- The PR for sync.
Description
Contributes downstream functionality back to the core component.

- EarlyReturn feature: enable legacy behavior (true)
- MaxConcurrency feature: enable legacy behavior (1)

TODO: Add feature gate.

- EarlyReturn=true legacy behavior should transition to false.
- MaxConcurrency=1 legacy behavior should transition to unlimited.

Caveats

Despite a year+ long effort to add equivalent and satisfactory batching support in the exporterhelper subcomponent, we still lack support for back-pressure with batching and reliable error transmission. I believe it is time to say "Yes/And" here. I support the efforts to improve exporterhelper and will contribute to that project myself; however, I think repairing the legacy batch processor is also a good idea, given how long this has taken.

The changes here were developed downstream; see https://github.com/open-telemetry/otel-arrow/blob/main/collector/processor/concurrentbatchprocessor
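To make the two settings above concrete, here is a minimal sketch of how they might surface in the processor's configuration. The struct shape, mapstructure tags, and zero-value semantics are assumptions for illustration, not taken from this PR:

```go
// Hypothetical sketch of a batch processor Config carrying the two
// proposed settings; only the EarlyReturn and MaxConcurrency names come
// from this PR's description, the rest is illustrative scaffolding.
package batchprocessor

import "time"

type Config struct {
	// Existing-style batching knobs, shown only for context.
	Timeout       time.Duration `mapstructure:"timeout"`
	SendBatchSize uint32        `mapstructure:"send_batch_size"`

	// EarlyReturn=true preserves legacy behavior: Consume calls return
	// as soon as data is accepted into a batch, before the export
	// completes, so export errors cannot propagate back to the caller.
	EarlyReturn bool `mapstructure:"early_return"`

	// MaxConcurrency bounds the number of concurrent batch exports.
	// Legacy behavior corresponds to 1; per the TODO above, the default
	// would later transition toward unlimited behind a feature gate.
	MaxConcurrency uint32 `mapstructure:"max_concurrency"`
}
```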
Link to tracking issue
I'm listing a number of issues that are connected with this, both issues pointing to unmet needs in the exporterhelper batcher and missing features in the legacy batcher. Accepting these changes will allow significantly improved batching support in the interim period until the new batching support is complete.
- Part of #10825 -- until this feature is complete, users who depend on metadata_keys in the batch processor will not be able to upgrade.
- Part of #10368 -- I see this as the root cause; we haven't been able to introduce concurrency to the exporterhelper without also introducing a queue, which interferes with error transmission.
- Fixes #8245 -- my original report about the problem solved here; we add concurrency with batching and error transmission and do not depend on a queue, persistent or in-memory (see the sketch after this list).
- Part of #9591 -- users must use one of the available memory-limiting mechanisms in conjunction with the batch processor.
- Part of #8122 -- until this is finished, users depend on the original batch processor.
- Part of #7460 -- another statement of #8245; the batch processor does not propagate errors today, and this change fixes the batch processor's contribution to the problem.
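As a rough illustration of the no-queue approach referenced in the #8245 item above, the following sketch (hypothetical names throughout: pending, exportLoop, send) shows producers blocking on a per-batch result channel while a semaphore bounds export concurrency. This is the general pattern, not the PR's actual implementation: because the producer waits for its batch's outcome, export errors flow back to the caller and back-pressure is preserved without a persistent or in-memory queue.

```go
package main

import (
	"context"
	"errors"
	"fmt"
)

// pending pairs a batch with a channel that delivers the export outcome
// back to the producer that contributed the data.
type pending struct {
	items []string
	done  chan error
}

// exportLoop drains batches, allowing up to maxConcurrency in-flight
// exports; the buffered-channel semaphore is the concurrency bound.
func exportLoop(ctx context.Context, in <-chan *pending, maxConcurrency int) {
	sem := make(chan struct{}, maxConcurrency)
	for p := range in {
		sem <- struct{}{} // acquire a concurrency slot
		go func(p *pending) {
			defer func() { <-sem }() // release the slot
			p.done <- send(ctx, p.items)
		}(p)
	}
}

// send stands in for the real export call; any error it returns is
// delivered to the producer rather than swallowed by a queue.
func send(_ context.Context, items []string) error {
	if len(items) == 0 {
		return errors.New("rejected: empty batch")
	}
	fmt.Println("exported", len(items), "items")
	return nil
}

func main() {
	in := make(chan *pending)
	go exportLoop(context.Background(), in, 4)

	p := &pending{items: []string{"span-1", "span-2"}, done: make(chan error, 1)}
	in <- p
	// The producer blocks here until its export finishes, receiving the
	// error directly; this is what gives callers back-pressure.
	if err := <-p.done; err != nil {
		fmt.Println("export failed:", err)
	}
}
```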
Testing
New tests are included.
Documentation
TODO/WIP