feat: Support Parallel Segment Fetching #4784
Design-wise, I'm not sure it makes sense to always clear all of the segments away in this case.
For example, if you are going from a pre-fetch limit of 4 to a pre-fetch limit of 6, couldn't you just keep the stuff you have already pre-fetched?
It seems like the most efficient thing to do would be to store an array that keeps track of the order of the pre-fetched segments, then pop and clear segments from the end of it until you fit within the new limit. That way nothing happens if the limit has been increased, and as few pre-fetched segments as possible are cleared if the limit has been decreased.
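Roughly what I have in mind (just a sketch; `prefetchOrder_` and `clearAll()` are made-up names standing in for whatever the real field and cleanup call would be):

```js
// Sketch only: names here are illustrative, not the actual API.
class PrefetchPool {
  constructor() {
    /** Keys in the order their segments were pre-fetched. */
    this.prefetchOrder_ = [];
    /** @type {!Map<string, {clearAll: function()}>} key -> prefetch object. */
    this.segmentPrefetchMap_ = new Map();
  }

  resetLimit(newLimit) {
    // Limit increased: nothing to do, keep everything already pre-fetched.
    // Limit decreased: clear only the most recently pre-fetched segments.
    while (this.prefetchOrder_.length > newLimit) {
      const key = this.prefetchOrder_.pop();
      const prefetch = this.segmentPrefetchMap_.get(key);
      if (prefetch) {
        prefetch.clearAll();  // assumed: aborts/frees that prefetch
      }
      this.segmentPrefetchMap_.delete(key);
    }
  }
}
```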
Updated. It actually turns out that `Map.keys()` preserves the insertion order, so instead of creating a new ordered array, I just grab `segmentPrefetchMap_.keys()` and iterate from the end to pop and clear the segments. I'm not sure whether `Array.from(iterator)` does any extra iteration, but the performance impact should be minimal since `resetLimit` should be called very rarely.
Open to other suggestions :)
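Something along these lines (a sketch of the same `resetLimit` as above, not the actual diff; `clearAll()` is again an assumed cleanup call):

```js
resetLimit(newLimit) {
  // Map preserves insertion order, so the keys array already reflects the
  // order in which the segments were pre-fetched.
  const keys = Array.from(this.segmentPrefetchMap_.keys());
  for (let i = keys.length - 1; i >= 0; i--) {
    if (this.segmentPrefetchMap_.size <= newLimit) {
      break;
    }
    const key = keys[i];
    this.segmentPrefetchMap_.get(key).clearAll();  // assumed cleanup call
    this.segmentPrefetchMap_.delete(key);
  }
}
```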
Pushed a new commit to simplify `resetLimit` by using `while` + `arr.pop()` instead of a `for` loop from the end of the array.
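i.e. something like this (same sketch as above, just with the simpler loop):

```js
resetLimit(newLimit) {
  // Same eviction logic, expressed as while + pop.
  const keys = Array.from(this.segmentPrefetchMap_.keys());
  while (keys.length > newLimit) {
    const key = keys.pop();
    this.segmentPrefetchMap_.get(key).clearAll();  // assumed cleanup call
    this.segmentPrefetchMap_.delete(key);
  }
}
```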