
1.x: request rebatch operator #3971

Merged: 3 commits into ReactiveX:1.x from akarnokd:OperatorRebatch, Jun 1, 2016
Conversation

akarnokd (Member)

This is a follow-up on #3964 but with a separate operator on Observable.

[Review thread on the operator's Javadoc:]

 * </dl>
 *
 * @param n the initial request amount; further requests will happen after 75% of this value
 * @return the Observable that rebatches request amounts from downstream
DavidMGross (Collaborator) commented May 31, 2016

akarnokd (Member, Author)

Added.
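For readers without the diff open, here is a minimal usage sketch of the operator this PR adds (rebatchRequests and the 75% refill come from the PR itself; the doOnRequest logging is only there to make the batching visible, and the example assumes a 1.x build that includes this change):

```java
import rx.Observable;

public class RebatchDemo {
    public static void main(String[] args) {
        Observable.range(1, 1000)
                // log what the upstream actually sees being requested
                .doOnRequest(r -> System.out.println("upstream request: " + r))
                // ask for 128 up front, then 96 more (75% of 128) each time
                // three quarters of the current batch has been delivered
                .rebatchRequests(128)
                // an unbounded consumer downstream; requests flowing upstream
                // are still capped at the batch size
                .subscribe(v -> { });
    }
}
```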

zsxwing (Member) commented Jun 1, 2016

👍

akarnokd (Member, Author) commented Jun 1, 2016

@abersnaze, @stealthcode you had some use cases for this, any objections?

abersnaze (Contributor)

The reuse of the observeOn internals is interesting, but couldn't it be done without the allocation of a queue?

akarnokd (Member, Author) commented Jun 1, 2016

If the downstream request is unbounded and the downstream has caught up, the queue can be skipped. In that case, though, observeOn's internals can't be reused anymore and custom drain logic has to be implemented.

Otherwise, the upstream emissions have to be stored temporarily for an underrequesting downstream.
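To make that trade-off concrete, here is a single-threaded toy model of the two paths (my sketch for illustration, not the merged operator; RebatchModel and its callback parameters are hypothetical names):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Consumer;

// Toy model: an unbounded, caught-up child gets items directly (no queue),
// while a bounded child forces the surplus of each batch to be parked.
class RebatchModel<T> {
    private final int batch;
    private final Queue<T> queue = new ArrayDeque<>();
    private final Consumer<T> child;           // downstream onNext
    private final Consumer<Long> upstream;     // upstream request(n)
    private long childDemand;                  // Long.MAX_VALUE = unbounded
    private int deliveredInBatch;              // drives the 75% refill

    RebatchModel(int batch, Consumer<T> child, Consumer<Long> upstream) {
        this.batch = batch;
        this.child = child;
        this.upstream = upstream;
        upstream.accept((long) batch);         // initial batch request
    }

    void onNext(T value) {                     // called by the upstream
        if (childDemand == Long.MAX_VALUE && queue.isEmpty()) {
            emit(value);                       // fast path: queue skipped
        } else {
            queue.offer(value);                // slow path: park the surplus
            drain();
        }
    }

    void childRequest(long n) {                // called by the downstream
        if (childDemand != Long.MAX_VALUE) {   // (overflow ignored in this toy)
            childDemand = (n == Long.MAX_VALUE) ? n : childDemand + n;
        }
        drain();
    }

    private void drain() {
        while (childDemand > 0 && !queue.isEmpty()) {
            if (childDemand != Long.MAX_VALUE) {
                childDemand--;
            }
            emit(queue.poll());
        }
    }

    private void emit(T value) {
        child.accept(value);
        int threshold = batch - (batch >> 2);  // 75% of the batch
        if (++deliveredInBatch == threshold) {
            deliveredInBatch = 0;
            upstream.accept((long) threshold); // refill by another 75%
        }
    }
}
```

The real operator additionally has to serialize onNext and request calls arriving on different threads, which is exactly the drain machinery observeOn already has; reusing it trades the queue allocation against not duplicating that logic.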

stevegury (Member)

👍

akarnokd merged commit bc83db0 into ReactiveX:1.x on Jun 1, 2016
akarnokd deleted the OperatorRebatch branch on Jun 1, 2016 at 21:53
stealthcode
I know that @abersnaze still had reservations about this. I think that this should not be using observeOn.

stealthcode
My concern is this: if @abersnaze already implemented the batching functionality, why wouldn't we use that? The queue in the observeOn scheduling creates a layer of indirection that seems unnecessary.

akarnokd (Member, Author) commented Jun 2, 2016

Remember, this started out as a change to observeOn so that it would no longer ignore the immediate scheduler, but people wanted this behavior exposed behind a proper name.

stealthcode
Thanks for reminding me of the context of this work. It seems like we have two implementations of the same functionality. I think @abersnaze and I agree that the two features, request batching and request-valve behavior, could be composed. However, I don't think using observeOn is necessarily the best way to accomplish this.

stealthcode
I personally would be okay with either implementation. I think observeOn is a nice choice because of its request management, though I would prefer it even more if there weren't a queue. Realistically, though, I think the choice is fine.

It's also interesting to note that users are gravitating more and more toward taking direct control over the requester-producer interactions.
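As an aside, "direct control" in 1.x terms usually means a Subscriber that manages its own demand instead of defaulting to Long.MAX_VALUE; a small illustrative example:

```java
import rx.Observable;
import rx.Subscriber;

public class ManualRequestDemo {
    public static void main(String[] args) {
        Observable.range(1, 5)
                .doOnRequest(r -> System.out.println("request(" + r + ")"))
                .subscribe(new Subscriber<Integer>() {
                    @Override
                    public void onStart() {
                        request(2);            // no implicit Long.MAX_VALUE
                    }

                    @Override
                    public void onNext(Integer v) {
                        System.out.println("got " + v);
                        request(1);            // pull one item at a time
                    }

                    @Override
                    public void onError(Throwable e) {
                        e.printStackTrace();
                    }

                    @Override
                    public void onCompleted() {
                        System.out.println("done");
                    }
                });
    }
}
```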

abersnaze (Contributor)

For example, #3781 does something similar but with exactly n (it could be modified to have an optional 25%) and without a queue.
