Does an MPD with SegmentTimeline necessarily need to use wallclock? #1233
I read some closed issues:
And all of them stick with this notion of
I am quite sure the answer is "yes, it must be in wall clock time and any dash.js behavior to the contrary is an accident", but I will leave it to player authors to go in depth on that. I just wanted to mention that clock sync is solved by the UTCTiming element in DASH (defined in the 2014 Amendment 1), which provides an authoritative time source to the client and thus defines "wall clock time" unambiguously.

Edit: this is if you want to be IOP-conforming. Non-IOP-conforming behavior can be far more flexible, of course.
Thank you @sandersaares 😄
We currently take a strict view of this and require the presentation timeline (derived from AST) to be correct. In future, we may be more flexible, to better tolerate drift in live streams. See #999 for details.
Hi nice people, I changed my server-side implementation to rely on the sequential scheme (

Please note that I didn't update my

Should I remove the

<?xml version="1.0" encoding="utf-8"?>
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" xmlns:ns1="http://www.w3.org/2001/XMLSchema-instance" availabilityStartTime="2017-10-18T16:23:54Z" minBufferTime="PT2S" minimumUpdatePeriod="PT0S" publishTime="2017-12-06T13:41:34Z" timeShiftBufferDepth="PT120S" type="dynamic" ns1:schemaLocation="urn:mpeg:dash:schema:mpd:2011 DASH-MPD.xsd" profiles="urn:mpeg:dash:profile:isoff-live:2011,http://dashif.org/guidelines/dash-if-simple">
<Period start="PT0S" id="0">
<AdaptationSet id="1990" mimeType="audio/mp4" contentType="audio" lang="por" segmentAlignment="true" startWithSAP="1" audioSamplingRate="48000">
<Representation id="1988" bandwidth="96000" codecs="mp4a.40.5">
<SegmentTemplate timescale="1" media="116_audio-$Number$.mp4" startNumber="4463029" initialization="data:video/mp4;base64,CAFE">
<SegmentTimeline>
<S t="5263668" d="10" r="12"/>
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
<AdaptationSet id="1989" mimeType="video/mp4" contentType="video" frameRate="30/1" segmentAlignment="true" startWithSAP="1" par="16:9" maxWidth="768" maxHeight="432">
<Representation id="1986" bandwidth="626000" codecs="avc1.64001F" sar="1:1" width="512" height="288">
<SegmentTemplate timescale="1" media="116_626-$Number$.mp4" startNumber="4463029" initialization="data:video/mp4;base64,CAFE">
<SegmentTimeline>
<S t="5263668" d="10" r="12"/>
</SegmentTimeline>
</SegmentTemplate>
</Representation>
<Representation id="1987" bandwidth="1485000" codecs="avc1.64001F" sar="1:1" width="768" height="432">
<SegmentTemplate timescale="1" media="116_1485-$Number$.mp4" startNumber="4463029" initialization="data:video/mp4;base64,CAFE">
<SegmentTimeline>
<S t="5263668" d="10" r="12"/>
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
</Period>
</MPD>
I also tried to use the

<?xml version="1.0" encoding="utf-8"?>
<MPD xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:mpeg:dash:schema:mpd:2011" xsi:schemaLocation="urn:mpeg:dash:schema:mpd:2011 http://standards.iso.org/ittf/PubliclyAvailableStandards/MPEG-DASH_schema_files/DASH-MPD.xsd" id="110" type="dynamic" publishTime="2018-02-09T12:26:44" minimumUpdatePeriod="PT15S" availabilityStartTime="2017-11-19T17:41:42.573000+00:00" minBufferTime="PT15S" suggestedPresentationDelay="PT30.00S" timeShiftBufferDepth="PT59.50S" profiles="urn:hbbtv:dash:profile:isoff-live:2012,urn:mpeg:dash:profile:isoff-live:2011">
<Period start="PT0.00S" id="1">
<AdaptationSet mimeType="video/mp4" scanType="progressive" segmentAlignment="true" subsegmentAlignment="true" startWithSAP="1" subsegmentStartsWithSAP="1" bitstreamSwitching="true">
<Representation id="1" width="768" height="432" frameRate="30/1" bandwidth="1485000" codecs="avc1.64001F">
<SegmentTemplate timescale="30" media="coelhao_video_1_2_$Number$.mp4?m=1516907594" initialization="coelhao_video_1_2_init.mp4?m=1516907594" startNumber="4467355">
<SegmentTimeline>
<S t="211975782" d="300" r="4"/>
</SegmentTimeline>
</SegmentTemplate>
</Representation>
<Representation id="2" width="512" height="288" frameRate="30/1" bandwidth="626000" codecs="avc1.64001F">
<SegmentTemplate timescale="30" media="coelhao_video_1_3_$Number$.mp4?m=1516907594" initialization="coelhao_video_1_3_init.mp4?m=1516907594" startNumber="4467355">
<SegmentTimeline>
<S t="211975782" d="300" r="4"/>
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
<AdaptationSet mimeType="audio/mp4" segmentAlignment="0" lang="eng">
<Representation id="3" bandwidth="96000" audioSamplingRate="48000" codecs="mp4a.40.5">
<AudioChannelConfiguration schemeIdUri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="2"></AudioChannelConfiguration>
<SegmentTemplate timescale="48000" media="coelhao_audio_1_1_$Number$.mp4?m=1516907594" initialization="coelhao_audio_1_1_init.mp4?m=1516907594" startNumber="4467355">
<SegmentTimeline>
<S t="339161257599" d="479232"/>
<S t="339161736831" d="481280"/>
<S t="339162218111" d="479232"/>
<S t="339162697343" d="481280"/>
<S t="339163178623" d="479232"/>
</SegmentTimeline>
</SegmentTemplate>
</Representation>
</AdaptationSet>
</Period>
</MPD>
I'm planning to find a way to expose these resources so you can test them.
Every live manifest type requires clock sync. Changing from

Looking at the Elemental manifest, if we fetched your manifest at
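To make the clock arithmetic concrete, here is a rough sketch in Ruby (mine, not from this thread), assuming the usual DASH mapping where a segment's wall-clock start is AST + Period@start + (S@t - presentationTimeOffset) / @timescale; the variable names are mine and the numbers come from the Elemental manifest above.

require 'time'

# availabilityStartTime from the second manifest
ast = Time.parse("2017-11-19T17:41:42.573000+00:00").to_i
period_start = 0   # Period@start = PT0.00S
pto = 0            # no @presentationTimeOffset declared

# First video segment: S@t=211975782, timescale=30
t = 211_975_782
timescale = 30.0

segment_wall_clock_start = ast + period_start + (t - pto) / timescale
puts Time.at(segment_wall_clock_start).utc
# => roughly 2018-02-09 12:26 UTC, close to the manifest's publishTime,
# which is what ties the SegmentTimeline back to the wall clock.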
Thanks, @TheModMaker, I'll take a closer look at it ;D
I was trying to understand what you've described and I'm still wondering how this information is used to determine the segment that Shaka needs to request. For instance, I ran a Ruby program here to understand the relation between these variables (mostly

AST_UTC=1511113302
#=> 1511113302
NOW=Time.now.to_i
#=> 1518199697
LIVE_EDGE=NOW - AST_UTC
#=> 7086395 (is this the wallclock?)
t_x=339163178623
#=> 339163178623 (last S@t for audio)
timescale=48000
#=> 48000 (ts for audio)
STARTS_AT=t_x/timescale
#=> 7065899
LIVE_EDGE
#=> 7086395
STARTS_AT
#=> 7065899
(LIVE_EDGE-STARTS_AT)
#=> 20496
(LIVE_EDGE-STARTS_AT)/60/60
# => 5 (in hours)

At the end, how does Shaka use the
You can look at #1265 for an in-depth description of how segment times work. It deals with multi-Period live, but the basic concepts are the same.

Basically there is a concept of a "presentation time". This represents the time that a segment will be played. For our case, this also maps to the

When we start streaming, we need to determine what presentation time to start playing at. We use

Note that you can't change the
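Since the specific attributes got lost above, here is a minimal sketch of the start-time calculation, assuming the relationship the reporter observed (start ≈ now - (AST + Period@start) - delay); the concrete numbers and the presentation_delay variable are illustrative, not Shaka's actual code.

ast = 1000                 # availabilityStartTime as a Unix timestamp
period_start = 0           # Period@start in seconds
presentation_delay = 0     # e.g. a suggestedPresentationDelay, if any

now = 1900                 # client wall-clock time (ideally UTCTiming-corrected)

start_presentation_time = now - (ast + period_start) - presentation_delay
# => 900; from here on, everything (including seeking and segment lookup)
# happens in presentation time.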
Thank you very much @TheModMaker, I learned more here and I'll read the docs again =D. If I have any relevant doubt I'll post it here.
Hi @TheModMaker, @sandersaares and @joeyparrish, thank you again for the attention and help. I read the guidelines and the ISO spec again (at least the sections on the timeline) and I still have some doubts, but first, let's see if I really understood it right. Let's continue with this pseudo-code scenario; after that I'll post my questions:

AST = 1000
PUBLISH = 1900
UTC_TAG = 1900
PERIOD_START = 0
PTO = 0
SEGMENTS = [
{SEQ: 0, DURATION: 10, T: 1880},
{SEQ: 1, DURATION: 10, T: 1890},
{SEQ: 2, DURATION: 10, T: 1900}
]
CLIENT_NOW = UTC_TAG
START_PRESENTATION_TIME = CLIENT_NOW - AST # or live edge
# 900
Some random thoughts: I think that we should offer, or default to, a less strict handling of timing, but I know that deep down this is mostly an MPEG-DASH spec "problem". It'd be pretty good if there was a

I really loved what US proposed (although I thought it was in the specs ¬¬); basically a DASH client should rely only on
See #999 for ignoring

I will write up a doc describing this in much more detail, since there has been a lot of confusion about how live works in DASH. It is important to note: segment times are in presentation time, not wall clock time. Please ignore wall clock time since it is ONLY used to determine where we start playing. We only use it to determine where to set

So you start streaming, what presentation time do we start at? It is

From this point forward, we only deal with presentation times. These times start at 0 and go forward (think like a VOD stream). While we are streaming, we look at

So let's convert the three segments you listed to their presentation times. A segment has a presentation time of:

Segment 0 will have a presentation time of:
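Since the exact formula above was elided, here is a rough sketch, assuming the standard DASH mapping presentationTime = Period@start + (S@t - presentationTimeOffset) / @timescale, applied to the pseudo-code scenario above (timescale 1):

period_start = 0
pto = 0
timescale = 1

segments = [
  { seq: 0, duration: 10, t: 1880 },
  { seq: 1, duration: 10, t: 1890 },
  { seq: 2, duration: 10, t: 1900 },
]

segments.each do |s|
  presentation_time = period_start + (s[:t] - pto) / timescale
  puts "segment #{s[:seq]} starts at presentation time #{presentation_time}"
end
# => 1880, 1890 and 1900 respectively. With a start presentation time of 900
# (CLIENT_NOW - AST above), none of these would be near the playhead, which
# appears to be the mismatch being discussed here.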
This exists, it is called the IOP Guidelines. This is much more realistic and restrictive than the DASH spec itself. But even that is a bit permissive on what is allowed.
The segment times are in presentation time, so this doesn't use AST. Once we have 900, we look up a segment with a presentation time of 900.
That is not correct. AST is equivalent to 0 in presentation time. So if AST is 1000 in wall clock time, then that is equivalent to 0 in presentation time. But again, this only means that if we start playing at 1900 wall clock time (e.g. using
Yes and no. Once we start streaming we will just pick the next segment. But if we seek, we need to look up which segment to start at. But again, we are using
You can't. After #999 we will do it by default, but there is no way to do it now.
There is NO difference between different kinds of manifests. You can use

So this:

<SegmentTemplate timescale="1" media="Foo-$Number$.mp4" initialization="init.mp4">
<SegmentTimeline>
<S t="0" d="10"/>
<S d="10"/>
<S d="10"/>
<S d="10"/>
<S d="10"/>
</SegmentTimeline>
</SegmentTemplate>

is exactly the same as:

<SegmentTemplate timescale="1" media="Foo-$Number$.mp4" initialization="init.mp4"
duration="10" /> |
First, sorry to make you go over your explanation again and again. It'd be very helpful to have such a doc! Sample code (with controlled presentation and wall-clock timing) and diagrams would help.
Redoing my example in pseudo code:

# -------------------------|AST|------|NOW|->
AST = 100
# at AST=100 my encoder started to split segments of 10 seconds
# starting with S[0]@t=0 then ten seconds later S[1]@t=10 and so on..
PUBLISH = 130
UTC_TAG = 130
# thirty seconds after I publish an (new) mpd
PERIOD_START = 0
PTO = 0
# it has some previous segments
SEGMENTS = [
{SEQ: 0, DURATION: 10, T: 10},
{SEQ: 1, DURATION: 10, T: 20},
{SEQ: 2, DURATION: 10, T: 30}
]
CLIENT_NOW = UTC_TAG
START_PRESENTATION_TIME = CLIENT_NOW - AST
# video.currentTime=30
# from this point and beyond it'll rely mostly on S[x]@t
It seems to me that the
No problem. I'm happy to help.
Actually we will be playing at segment 1. I forgot we need to offset by the segment duration. So "right now" is 30, but that time is still being recorded on the server. We need to wait for the whole segment to be recorded, so we will be playing at least 10 seconds in the past. So the real live edge will be 20.
TSB only changes whether a segment is "available". This can be thought of as a DVR window: you can play the last TSB seconds of a live stream. Anything older than that isn't available, so you can't seek to it. For example, if the TSB was 15, then in your example, we'll allow seeking in the range
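Putting the last two comments together, a small sketch using the numbers from the pseudo-code scenario above; the formulas are my reading of this thread, not Shaka's actual code:

ast = 100
now = 130              # CLIENT_NOW / UTC_TAG
max_seg_duration = 10
tsb = 15               # timeShiftBufferDepth

# The newest segment is still being recorded, so offset "right now" by one
# segment duration to get the live edge.
live_edge = (now - ast) - max_seg_duration
# => 20

# DVR window: only the last TSB seconds before the live edge stay seekable.
seek_range = [live_edge - tsb, live_edge]
# => [5, 20]

# Which segment does the live edge fall into?
segments = [{ seq: 0, d: 10, t: 10 }, { seq: 1, d: 10, t: 20 }, { seq: 2, d: 10, t: 30 }]
current = segments.find { |s| s[:t] <= live_edge && live_edge < s[:t] + s[:d] }
# => the segment with t=20, i.e. segment 1, matching "we will be playing at
# segment 1" above.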
Thank you very much @TheModMaker, I think I finally understood, and I'll adapt my server-side code to be compliant with DASH. I'll be happy to help, give feedback, or read the doc you're going to write. Cheers.
New doc cherry-picked for v2.3.4.
Have you read the FAQ and checked for duplicate issues: Yes
What version of Shaka Player are you using: v2.3.0-uncompiled
Can you reproduce the issue with our latest release version: Yes
Can you reproduce the issue with the latest code from master: Yes
Are you using the demo app or your own custom app: Demo
If custom app, can you reproduce the issue using our demo app:
What browser and OS are you using: MacOS 10.13, Firefox 57.0.4
What are the manifest and license server URIs:

What did you do?
Try to play a live stream that I had already played using the DASHjs reference player.

What did you expect to happen?
The player to use solely the SegmentTimeline values to retrieve the live-edge segment.

What actually happened?
It did not. It seems that it uses now() - (AST + Period.start) - delay as the base to calculate the edge segment Time. I came to this conclusion by debugging this live stream.

Long story:
While reading some docs on that, I was under the impression that the player could find the first (most recent) segment by itself, with no dependency on any clock at all.
For instance, let's take this SegmentTemplate:
The way I know which segment is the edge of this live stream is:
Is this assumption backed by the DASH specs? (I couldn't find it in the guidelines, but at the same time DASHjs is able to play it.)