1.x: request rebatch operator #3971
Conversation
* </dl>
*
* @param n the initial request amount; further requests will happen after 75% of this value
* @return the Observable that rebatches request amounts from downstream
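The 75% replenishment strategy in this Javadoc can be sketched as a simple counter: request `n` upstream initially, then re-request once three quarters of the batch has been delivered. This is a hedged illustration only, not the PR's actual code; the class and method names below are invented, and the list of upstream requests stands in for calls to a real `Producer.request`.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of the 75% rebatching strategy described above.
class RebatchCounter {
    private final long limit;          // re-request after 75% of n is delivered
    private long emittedSinceRequest;  // items delivered since the last upstream request
    private final List<Long> upstreamRequests = new ArrayList<>();

    RebatchCounter(long n) {
        this.limit = n - (n >> 2);     // 75% of n (e.g. 96 for n = 128)
        requestUpstream(n);            // prime the upstream with the full batch
    }

    /** Call for every item delivered downstream. */
    void onNext() {
        if (++emittedSinceRequest == limit) {
            emittedSinceRequest = 0;
            requestUpstream(limit);    // replenish what was consumed
        }
    }

    private void requestUpstream(long amount) {
        upstreamRequests.add(amount);  // stand-in for producer.request(amount)
    }

    List<Long> requests() {
        return upstreamRequests;
    }
}
```

With `n = 128` the counter issues an initial request of 128, then a request of 96 after every 96 delivered items, so the upstream never sees demand trickle in one item at a time.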
Added.
👍
@abersnaze, @stealthcode, you had some use cases for this; any objections?
The reuse of the observeOn machinery is interesting, but couldn't it be done without allocating a queue?
If the downstream request is unbounded and the downstream has caught up, the queue can be skipped. Otherwise, the upstream emissions have to be stored temporarily for an underrequesting downstream.
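That fast path can be sketched roughly as follows. This is an assumed illustration, not the operator's actual drain loop (which also has to handle serialization and terminal events); the `FastPathBuffer` class and its methods are hypothetical.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Queue;

// Hypothetical illustration of the queue-bypass fast path discussed above.
class FastPathBuffer<T> {
    private final Queue<T> queue = new ArrayDeque<>();
    private long requested;                 // outstanding downstream demand
    private final List<T> delivered = new ArrayList<>();

    void request(long n) {
        if (n == Long.MAX_VALUE) {
            requested = Long.MAX_VALUE;     // unbounded demand
        } else {
            requested += n;                 // overflow ignored in this sketch
        }
        drain();
    }

    void onNext(T item) {
        // Fast path: unbounded demand and an empty queue means the downstream
        // has caught up, so the emission can skip the queue entirely.
        if (requested == Long.MAX_VALUE && queue.isEmpty()) {
            delivered.add(item);
            return;
        }
        // Slow path: store the emission until the downstream requests more.
        queue.offer(item);
        drain();
    }

    private void drain() {
        while (requested > 0 && !queue.isEmpty()) {
            delivered.add(queue.poll());
            if (requested != Long.MAX_VALUE) {
                requested--;
            }
        }
    }

    List<T> delivered() {
        return delivered;
    }
}
```

The design point under discussion is exactly this branch: bounded or lagging downstreams still pay for the queue, while an unbounded, caught-up downstream avoids it.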
👍
I know that @abersnaze still had reservations about this. I think that this should not be using
My concern is this: if @abersnaze implemented the batching functionality, then why wouldn't we use that? The queue in the observeOn scheduling creates a layer of indirection that seems unnecessary.
Remember, this started out as a change to
Thanks for reminding me of the context of this work. It seems like we have two implementations of the same functionality. I think @abersnaze and I agree that the two features, request batching and request-valve-type functionality, could be composed. However, I think that using
I personally would be okay with either implementation. It's also interesting to note that users are gravitating more and more toward taking direct control over the
For example, #3781 does something similar but requests exactly n (it could be modified to have an optional 25% threshold) and without a queue.
This is a follow-up on #3964 but with a separate operator on Observable.