
stream has too much local buffering based on user behavior #56

Closed
hburgund opened this issue Mar 20, 2017 · 7 comments
@hburgund
Member

I am not sure of the inner workings on the client side, but for some reason there is a very large amount of local audio buffering of the stream, compared to the typical 5-7 seconds we have seen in other RW apps.

I have a theory, but it is unproven.

  • User chooses Institution
  • stream is instantiated on server
  • stream begins buffering locally
  • user takes time proceeding through the next few screens to get to the Listen screen where audio plays
  • local buffering is increased over this time period
  • user selects a tag and the stream begins playing, but playback starts from the beginning of the buffer, so the asset content is delayed by the amount of time the user took to navigate the intervening screens plus the typical 5-7 seconds

I don't know how the media player works, but I'm guessing that the solution will simply be to not instantiate the media player until the user gets to the Listen screen and expects to hear audio. I realize this will likely add a few seconds of delay to the initial start of playback, but we already have a spinner, and that seems reasonable compared to the ridiculous delays we can now encounter.

@hburgund
Member Author

I have tested more rigorously and am quite confident that the stream is being instantiated at the wrong time in the flow. Or at least the local buffering of the stream starts at the wrong time.

Currently, the stream is instantiated with a POST api/2/streams/ call when the user selects an Institution (in this case the only choice is PEM). I assume there are a number of API requests that happen at this point, including the creation of the stream.

I have tested by waiting varying amounts of time between selecting the Institution and clicking the Play button on the Listen screen, and the amount of time I wait is added almost exactly onto the resulting latency, so I am quite confident about what is happening.
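One way to confirm this kind of measurement directly (a diagnostic sketch, not part of the app; it assumes you can get a reference to the AVPlayer) is to inspect the player item's loaded time ranges, which report how much audio is buffered locally:

```swift
import AVFoundation

// Diagnostic sketch: total seconds of the stream currently buffered
// locally by the player. Assumes access to the AVPlayer instance.
func bufferedSeconds(of player: AVPlayer) -> Double {
    guard let ranges = player.currentItem?.loadedTimeRanges else { return 0 }
    return ranges.reduce(0) { total, value in
        total + CMTimeGetSeconds(value.timeRangeValue.duration)
    }
}
```

Logging this value periodically between Institution selection and pressing Play should show the buffer growing over exactly the interval the user spends on the intervening screens.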

I see two potential solutions here:

  • wait until the user selects a tag or presses Play to instantiate the stream on the server
    • this will result in a larger delay between pressing Play and hearing anything, which isn't ideal
  • OR instantiate the stream at the same time as currently, but don't instantiate the local media player until the user actually wants to hear the audio, i.e. when selecting a tag or pressing Play on the Listen screen. I think what is happening now is that the media player begins buffering as soon as the stream is created on the server, so it stores up a lot of local buffer data that increases latency.

I think the second solution makes more sense functionally, and I'm guessing it would be easier to implement in the code as well, but I need advice from @seeReadCode.

Please let me know your reactions to my theory and proposed solutions, and for whichever solution you think is best, please point me to some key spots in the code to get me ramped up ASAP, as this codebase is entirely new to me right now.

@seeReadCode
Collaborator

@hburgund
Member Author

Yes; it resulted in an RWFramework error: RWGetProjectsIdFailure.

@seeReadCode
Collaborator

Ah, because a user/session hasn't been established I imagine.

So let's keep that start line. We could try 1. removing this one:
https://github.com/seeRead/roundware-ios-framework-v2/blob/master/Pod/Classes/RWFrameworkAPI.swift#L134-L136

And then 2. adding rwf.apiPostStreams() to https://github.com/seeRead/roundware-ios-digita11y/blob/master/Digita11y/RoomsViewController.swift#L262

@hburgund
Member Author

I tried this suggestion, but it didn't make a difference with the issue. As stated above, I think that the best solution is to get the AVPlayer to not start buffering until the user actually wants to hear audio rather than way in advance.

So with this approach in mind, @zobkiw was able to help me get it working. I will post code soon, but essentially we removed the createPlayer() call in the framework, which prevents the AVPlayer from being created until the user taps Play or a tag. That seems to do the trick.
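In rough terms, the shape of the fix looks like this (an illustrative sketch with hypothetical names, not the framework's actual code; see the referenced framework commit for the real change):

```swift
import AVFoundation

// Illustrative sketch: the framework no longer creates the AVPlayer when
// the server-side stream is created, so buffering cannot start early.
// The player is created lazily, on the first Play/tag tap instead.
final class LazyStreamPlayer {
    private var player: AVPlayer?
    private let streamURL: URL

    init(streamURL: URL) {
        self.streamURL = streamURL
        // Previously an eager createPlayer()-style call ran here; with it
        // removed, no local buffering happens until play() is called.
    }

    // Called from the Listen screen when the user taps Play or a tag.
    func play() {
        if player == nil {
            player = AVPlayer(url: streamURL)  // buffering starts only now
        }
        player?.play()
    }
}
```

The trade-off is the few seconds of spinner on first Play mentioned above, in exchange for the asset content starting close to live rather than minutes behind.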

@hburgund hburgund self-assigned this Mar 25, 2017
hburgund added a commit to roundware/roundware-ios-framework-v2 that referenced this issue Mar 27, 2017
@hburgund
Member Author

fixed in framework with: roundware/roundware-ios-framework-v2@443c1c8

hburgund added a commit to roundware/roundware-ios-framework-v2 that referenced this issue Mar 31, 2017