
Add support for fetching multiple segments in parallel #4658

Closed
tyrelltle opened this issue Nov 7, 2022 · 4 comments · Fixed by #4784
Assignees
Labels
flag: seeking PR We are actively seeking PRs for this; we do not currently expect the core team will resolve this priority: P2 Smaller impact or easy workaround status: archived Archived and locked; will not be updated type: enhancement New feature or request
Milestone

Comments

@tyrelltle (Contributor)

Have you read the FAQ and checked for duplicate open issues? Yes

Is your feature request related to a problem? Please describe.
Currently Shaka fetches segments sequentially for each of the audio and video streams.
[screenshot: network waterfall showing sequential segment requests, with gaps]
When the client network is slow, and especially when the playback speed is set to 2x, rebuffering may happen frequently once the buffer is starved.

Describe the solution you'd like
We would like to restart the previous PR #2809, which adds parallel fetching of MPEG-DASH segments. However, before doing that we would like to confirm whether there is an existing feature plan to solve this problem, since that PR is old (2020).

Describe alternatives you've considered
We could also avoid changing Shaka code by creating a custom HttpFetchPlugin that kicks off the fetch for the segment at the next presentation time while fetching the segment at the current presentation time. However, this sounds like a hacky solution; for example, it could introduce race conditions.
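A minimal sketch of that "no core changes" alternative: a fetch wrapper that, while serving the current segment, also starts the request for the next one and caches the pending promise. All names here (rawFetch, nextSegmentUri, fetchWithPrefetch) are hypothetical stand-ins, not real Shaka APIs, and the uncoordinated shared cache is exactly where the race-condition worry lives.

```javascript
// Shared cache of pending prefetch promises, keyed by segment URI.
const prefetchCache = new Map();

// Stand-in for the real network request; resolves to the segment payload.
async function rawFetch(uri) {
  return `payload:${uri}`;
}

// Hypothetical helper: derive the next segment's URI from the current one,
// assuming URIs end in a numeric index (e.g. "seg/1" -> "seg/2").
function nextSegmentUri(uri) {
  return uri.replace(/\d+$/, (n) => String(Number(n) + 1));
}

async function fetchWithPrefetch(uri) {
  // Start the next segment's request in parallel, if not already pending.
  const next = nextSegmentUri(uri);
  if (!prefetchCache.has(next)) {
    prefetchCache.set(next, rawFetch(next));
  }
  // Serve the current segment from the cache when a prefetch already exists.
  const pending = prefetchCache.get(uri);
  if (pending) {
    prefetchCache.delete(uri);
    return pending;
  }
  return rawFetch(uri);
}
```

Nothing here coordinates with seeks or ABR switches, which is why the thread treats this as a hack rather than a real fix.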

Additional context
Not at the moment.

@tyrelltle tyrelltle added the type: enhancement New feature or request label Nov 7, 2022
@github-actions github-actions bot added this to the Backlog milestone Nov 7, 2022
@avelad (Member)

avelad commented Nov 17, 2022

@tyrelltle All improvements are welcome; I've assigned the issue to you so you can work on it. Thank you!

@avelad avelad added flag: seeking PR We are actively seeking PRs for this; we do not currently expect the core team will resolve this priority: P2 Smaller impact or easy workaround labels Nov 17, 2022
@tyrelltle (Contributor, Author)

Thanks @avelad, I will create a PR for it.

@joeyparrish (Member)

Overlapping segment requests have the potential to increase bandwidth utilization (bringing throughput closer to actual bandwidth capability). This could let some users stream at higher bitrates/resolutions.

The component responsible for that decision is StreamingEngine, which makes the requests, so I expect you would need to focus your efforts there.

Keep in mind, though, that once bufferingGoal is met there is no longer any opportunity for parallel fetches, since we naturally stop fetching entirely. That probably explains the gaps in your screenshot above.
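For context, bufferingGoal is the amount of content (in seconds) that StreamingEngine tries to keep buffered ahead of the play head; once it is reached, fetching pauses regardless of parallelism. It can be raised through the player configuration (the `player` variable and the value 60 below are illustrative):

```javascript
// Ask StreamingEngine to buffer further ahead, giving overlapping
// fetches more room to run before the goal is met.
player.configure({
  streaming: {
    bufferingGoal: 60,  // seconds; the default is 10
  },
});
```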

@tyrelltle (Contributor, Author)

tyrelltle commented Nov 22, 2022

@joeyparrish @avelad
I am preparing to create a PR. Just sharing my current plan in case you have early feedback.

After going through a couple of options, I am thinking of using the solution implemented in this abandoned PR:
#2809

What the abandoned PR does:

  • before fetchAndAppend(), it kicks off prefetching of N segments ahead of the currently requested segment and stores the pending results in a Map
  • the main flow of StreamingEngine is unchanged; it still requests segments sequentially, but it reuses the payload stored in the Map when the requested segment is already there
  • Pro
    • less dramatic change to the main flow
  • Con
    • we keep a separate buffer of segments in a Map, besides the video buffer, so more care is needed to clean it up
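The approach above can be sketched as follows. All names here (fetchSegment, prefetchAhead, the Map, the limit of 3) are hypothetical stand-ins, not the actual StreamingEngine code: before fetching segment i, requests for segments i+1..i+N are started and their pending promises stored in a Map, and the sequential main loop then reuses a Map entry when one exists.

```javascript
const PREFETCH_LIMIT = 3;
const segmentPrefetchMap = new Map();

// Stand-in for a NetworkingEngine request, keyed by segment reference.
async function fetchSegment(ref) {
  return `data:${ref}`;
}

// Start requests for the next PREFETCH_LIMIT segments, skipping any
// that are already in flight.
function prefetchAhead(refs, current) {
  const last = Math.min(current + PREFETCH_LIMIT, refs.length - 1);
  for (let i = current + 1; i <= last; i++) {
    if (!segmentPrefetchMap.has(refs[i])) {
      segmentPrefetchMap.set(refs[i], fetchSegment(refs[i]));
    }
  }
}

// The sequential main flow is unchanged: one segment is appended at a
// time, but its payload may come from an earlier prefetch.
async function fetchAndAppend(refs, i, mediaBuffer) {
  prefetchAhead(refs, i);
  const pending = segmentPrefetchMap.get(refs[i]) || fetchSegment(refs[i]);
  segmentPrefetchMap.delete(refs[i]);
  mediaBuffer.push(await pending);
}
```

Because appends stay sequential, the "no buffer ahead of the others" invariant discussed below is preserved; only the network requests overlap.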

Another option I considered, but suggest not proceeding with:

  • avoid having an additional Map storing the prefetched segments
  • instead, change fetchAndAppend() so that it not only fetches segments in parallel, but each parallel operation is also responsible for appending its segment to the media buffer
  • Pro
    • no additional Map buffer is maintained, and prefetched segments that will not be used get cleaned up by the existing clearBuffer logic
  • Con
    • more changes to the main flow
    • it is hard to ensure the requirement that no buffer gets ahead of the others
      • say we have separate audio and video streams, and each stream's update_() kicks off 10 parallel operations to fetch AND append the next 10 segments; the video stream would then always get buffered further ahead than the audio stream, breaking the rule

I am thinking of following the approach from the abandoned PR and, in addition, adding logic to ensure the Map buffer is cleaned up when a seek happens and when the makeAbortDecision_ code path is called.
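A minimal sketch of that extra cleanup step, assuming a prefetch Map like the one above (the function name is hypothetical):

```javascript
// On seek, or when the makeAbortDecision_ path aborts in-flight work,
// the prefetch Map must be emptied so stale segments are never appended
// after the play head moves. A real implementation would also abort the
// underlying network operations; here the pending promises are simply
// dropped.
function clearSegmentPrefetch(prefetchMap) {
  prefetchMap.clear();
}
```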

But let me know if you suggest other approaches, meanwhile I will work on the PR.

Thanks!

avelad pushed a commit that referenced this issue Jan 31, 2023
closes #4658.

This solution is inspired by the abandoned PR
#2809, which
implements segment prefetching ahead of the current play head.


![image](https://user-images.githubusercontent.com/3315733/205465795-75c605d2-c2e3-4d03-90f5-46a72a7189d2.png)
@avelad avelad modified the milestones: Backlog, v4.4 Feb 3, 2023
@github-actions github-actions bot added the status: archived Archived and locked; will not be updated label Apr 1, 2023
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Apr 1, 2023
gkatsev pushed a commit to sky-hugolima/shaka-player-contrib that referenced this issue Dec 6, 2023