Commit a78b2a8

[MSE] Fix sample DTS to prevent deletion of preexisting samples when PTS doesn't conflict
https://bugs.webkit.org/show_bug.cgi?id=305208
Reviewed by Alicia Boya Garcia.

Let's consider preexisting samples A, B, C, and a new sample N with the following timestamps:

              Decoding time    Presentation time
              -------------    -----------------
Sample A:     827.826164000    827.826164000
Sample B:     827.826164100    827.826164100
Sample C:     827.826164200    827.826164200
Sample N:     827.826125000    827.909542000

These disparities in DTS vs. PTS offsets between samples A, B, C and the new sample N exist because the samples come from completely different videos (e.g. because of ad insertion). In a timeline:

DTS ···NABC·······
PTS ····ABC···N···

Judging by PTS, the samples A, B, C should remain and theoretically don't need to be erased. However, considering DTS, the samples A, B, C have to be erased, because the sample N would reach the video decoder earlier (in decoding order) and change the state of the decoder, making it unsuitable for samples A, B, C.

This commit fixes the situation by changing the DTS of sample N (the new sample) to a value slightly higher than the DTS of sample C (that is, that DTS plus epsilon, a small value), like this:

              Decoding time     Presentation time
              -------------     -----------------
Sample A:     827.826164000     827.826164000
Sample B:     827.826164100     827.826164100
Sample C:     827.826164200     827.826164200
Sample N:    *827.826164201*    827.909542000

DTS ····ABCN······
PTS ····ABC···N···

This fix avoids the deletion of samples A, B, C and the potential problems associated with that. A layout test has been added to show the problem and prove that the fix solves it.

* LayoutTests/media/media-source/media-source-samples-out-of-order-erase-last-previous-frames-expected.txt: Added.
* LayoutTests/media/media-source/media-source-samples-out-of-order-erase-last-previous-frames.html: Added.
* Source/WebCore/platform/graphics/SourceBufferPrivate.cpp:
(WebCore::SourceBufferPrivate::processMediaSample): If a new sync sample is being added and there are next samples in decode order whose PTS is lower than that of the new sample, fix the DTS of the new sample by assigning it a safe decode time (the DTS of the highest preexisting sample in DTS order, plus epsilon), but only if that DTS wouldn't be higher than the DTS of the next sample expected after the new one (estimated as sample.decodeTime() + 2*duration, keeping a safety margin).

Canonical link: https://commits.webkit.org/307359@main
1 parent 132c067 commit a78b2a8

File tree

3 files changed: +386 −12 lines changed

LayoutTests/media/media-source/media-source-samples-out-of-order-erase-last-previous-frames-expected.txt
Lines changed: 124 additions & 0 deletions
@@ -0,0 +1,124 @@

RUN(video.src = URL.createObjectURL(source))
EVENT(sourceopen)
RUN(sourceBuffer = source.addSourceBuffer("video/mock; codecs=mock"))
Test 1: First segment has normal, monotonically increasing samples with decode timestamp equal
to presentation timestamp.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '4') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
Test 1: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't
cause any deletion if decoding order is corrected to avoid conflicts.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '5') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({40/10 = 4.000000}), DTS({3000100/1000000 = 3.000100}), duration({50/10 = 5.000000}), flags(1), generation(1)}
RUN(sourceBuffer.remove(0, Infinity))
EVENT(updateend)
Test 2: First segment has normal, monotonically increasing samples with decode timestamp equal
to presentation timestamp, and a last sync sample.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '5') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({100/10 = 10.000000}), DTS({100/10 = 10.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
Test 2: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't
cause any deletion if decoding order is corrected to avoid conflicts.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '6') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({40/10 = 4.000000}), DTS({3000100/1000000 = 3.000100}), duration({50/10 = 5.000000}), flags(1), generation(1)}
{PTS({100/10 = 10.000000}), DTS({100/10 = 10.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
RUN(sourceBuffer.remove(0, Infinity))
EVENT(updateend)
Test 3: First segment has normal, monotonically increasing samples with decode timestamp equal
to presentation timestamp.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '4') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
Test 3: Segment with overlap in decoding order but no overlap in presentation order shouldn't
cause any deletion if decoding order is corrected to avoid conflicts. This correction can't extend
beyond the duration of the sample, to avoid conflicts with the hypothetical decode timestamp of a
sample appended in the future (DTS 39/10), so some samples will be deleted to prevent that.
Same duration samples.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '5') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({40/10 = 4.000000}), DTS({3000100/1000000 = 3.000100}), duration({10/10 = 1.000000}), flags(1), generation(1)}
RUN(sourceBuffer.remove(0, Infinity))
EVENT(updateend)
Test 4: First segment has normal, monotonically increasing samples with decode timestamp equal
to presentation timestamp, and a last sync sample.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '5') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({100/10 = 10.000000}), DTS({100/10 = 10.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
Test 4: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't
cause any deletion if decoding order is corrected to avoid conflicts. This correction can't extend
beyond the duration of the sample, to avoid conflicts with the hypothetical decode timestamp of a
sample appended in the future (DTS 39/10), so some samples will be deleted to prevent that.
Same duration samples.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '6') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({40/10 = 4.000000}), DTS({3000100/1000000 = 3.000100}), duration({10/10 = 1.000000}), flags(1), generation(1)}
{PTS({100/10 = 10.000000}), DTS({100/10 = 10.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
RUN(sourceBuffer.remove(0, Infinity))
EVENT(updateend)
Test 5: First segment has normal, monotonically increasing samples with decode timestamp equal
to presentation timestamp, and a last sync sample.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '5') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({20/10 = 2.000000}), DTS({20/10 = 2.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({30/10 = 3.000000}), DTS({30/10 = 3.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({100/10 = 10.000000}), DTS({100/10 = 10.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
Test 5: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't
cause any deletion if decoding order is corrected to avoid conflicts. This correction can't extend
beyond the duration of the sample, to avoid conflicts with the hypothetical decode timestamp of a
sample appended in the future (DTS 29/10). This isn't met, so the whole correction algorithm bails out,
and all the conflicting samples are deleted (old behaviour). Same duration samples.
RUN(sourceBuffer.appendBuffer(mediaSegment))
EVENT(updateend)
EXPECTED (bufferedSamples.length == '4') OK
{PTS({0/10 = 0.000000}), DTS({0/10 = 0.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
{PTS({10/10 = 1.000000}), DTS({10/10 = 1.000000}), duration({10/10 = 1.000000}), flags(0), generation(0)}
{PTS({40/10 = 4.000000}), DTS({19/10 = 1.900000}), duration({10/10 = 1.000000}), flags(1), generation(1)}
{PTS({100/10 = 10.000000}), DTS({100/10 = 10.000000}), duration({10/10 = 1.000000}), flags(1), generation(0)}
RUN(sourceBuffer.remove(0, Infinity))
EVENT(updateend)
END OF TEST
LayoutTests/media/media-source/media-source-samples-out-of-order-erase-last-previous-frames.html
Lines changed: 225 additions & 0 deletions
@@ -0,0 +1,225 @@
<!DOCTYPE html>
<html>
<head>
    <title>media-source-samples-out-of-order-erase-last-previous-frames</title>
    <script src="mock-media-source.js"></script>
    <script src="../video-test.js"></script>
    <script>
    var source;
    var sourceBuffer;
    var initSegment;
    var mediaSegment;
    var bufferedSamples;

    if (window.internals)
        internals.initializeMockMediaSource();

    window.addEventListener('load', async event => {
        findMediaElement();

        source = new MediaSource();
        run('video.src = URL.createObjectURL(source)');
        await waitFor(source, 'sourceopen');

        run('sourceBuffer = source.addSourceBuffer("video/mock; codecs=mock")');

        // Test 1.

        consoleWrite("Test 1: First segment has normal, monotonically increasing samples with decode timestamp equal ");
        consoleWrite("to presentation timestamp.");

        mediaSegment = concatenateSamples([
            makeAInit(4, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            // Syntax: makeASample(presentationTime, decodeTime, duration, timeScale, trackID, flags, generation)
            makeASample( 0,  0, 10, 10, 0, SAMPLE_FLAG.SYNC, 0),
            makeASample(10, 10, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample(20, 20, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample(30, 30, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 4);
        bufferedSamples.forEach(consoleWrite);

        consoleWrite("Test 1: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't ");
        consoleWrite("cause any deletion if decoding order is corrected to avoid conflicts.");

        mediaSegment = concatenateSamples([
            makeAInit(1, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(40, 9, 50, 10, 0, SAMPLE_FLAG.SYNC, 1),
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 5);
        bufferedSamples.forEach(consoleWrite);

        run('sourceBuffer.remove(0, Infinity)');
        await waitFor(sourceBuffer, 'updateend');

        // Test 2.

        consoleWrite("Test 2: First segment has normal, monotonically increasing samples with decode timestamp equal ");
        consoleWrite("to presentation timestamp, and a last sync sample.");

        mediaSegment = concatenateSamples([
            makeAInit(4, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(  0,   0, 10, 10, 0, SAMPLE_FLAG.SYNC, 0),
            makeASample( 10,  10, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample( 20,  20, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample( 30,  30, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample(100, 100, 10, 10, 0, SAMPLE_FLAG.SYNC, 0)
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 5);
        bufferedSamples.forEach(consoleWrite);

        consoleWrite("Test 2: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't ");
        consoleWrite("cause any deletion if decoding order is corrected to avoid conflicts.");

        mediaSegment = concatenateSamples([
            makeAInit(1, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(40, 9, 50, 10, 0, SAMPLE_FLAG.SYNC, 1),
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 6);
        bufferedSamples.forEach(consoleWrite);

        run('sourceBuffer.remove(0, Infinity)');
        await waitFor(sourceBuffer, 'updateend');

        // Test 3.

        consoleWrite("Test 3: First segment has normal, monotonically increasing samples with decode timestamp equal ");
        consoleWrite("to presentation timestamp.");

        mediaSegment = concatenateSamples([
            makeAInit(4, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample( 0,  0, 10, 10, 0, SAMPLE_FLAG.SYNC, 0),
            makeASample(10, 10, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample(20, 20, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample(30, 30, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 4);
        bufferedSamples.forEach(consoleWrite);

        consoleWrite("Test 3: Segment with overlap in decoding order but no overlap in presentation order shouldn't ");
        consoleWrite("cause any deletion if decoding order is corrected to avoid conflicts. This correction can't extend ");
        consoleWrite("beyond the duration of the sample, to avoid conflicts with the hypothetical decode timestamp of a ");
        consoleWrite("sample appended in the future (DTS 39/10), so some samples will be deleted to prevent that. ");
        consoleWrite("Same duration samples.");

        mediaSegment = concatenateSamples([
            makeAInit(1, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(40, 29, 10, 10, 0, SAMPLE_FLAG.SYNC, 1),
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 5);
        bufferedSamples.forEach(consoleWrite);

        run('sourceBuffer.remove(0, Infinity)');
        await waitFor(sourceBuffer, 'updateend');

        // Test 4.

        consoleWrite("Test 4: First segment has normal, monotonically increasing samples with decode timestamp equal ");
        consoleWrite("to presentation timestamp, and a last sync sample.");

        mediaSegment = concatenateSamples([
            makeAInit(4, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(  0,   0, 10, 10, 0, SAMPLE_FLAG.SYNC, 0),
            makeASample( 10,  10, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample( 20,  20, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample( 30,  30, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample(100, 100, 10, 10, 0, SAMPLE_FLAG.SYNC, 0)
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 5);
        bufferedSamples.forEach(consoleWrite);

        consoleWrite("Test 4: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't ");
        consoleWrite("cause any deletion if decoding order is corrected to avoid conflicts. This correction can't extend ");
        consoleWrite("beyond the duration of the sample, to avoid conflicts with the hypothetical decode timestamp of a ");
        consoleWrite("sample appended in the future (DTS 39/10), so some samples will be deleted to prevent that. ");
        consoleWrite("Same duration samples.");

        mediaSegment = concatenateSamples([
            makeAInit(1, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(40, 29, 10, 10, 0, SAMPLE_FLAG.SYNC, 1),
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 6);
        bufferedSamples.forEach(consoleWrite);

        run('sourceBuffer.remove(0, Infinity)');
        await waitFor(sourceBuffer, 'updateend');

        // Test 5.

        consoleWrite("Test 5: First segment has normal, monotonically increasing samples with decode timestamp equal ");
        consoleWrite("to presentation timestamp, and a last sync sample.");

        mediaSegment = concatenateSamples([
            makeAInit(4, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(  0,   0, 10, 10, 0, SAMPLE_FLAG.SYNC, 0),
            makeASample( 10,  10, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample( 20,  20, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample( 30,  30, 10, 10, 0, SAMPLE_FLAG.NONE, 0),
            makeASample(100, 100, 10, 10, 0, SAMPLE_FLAG.SYNC, 0)
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 5);
        bufferedSamples.forEach(consoleWrite);

        consoleWrite("Test 5: Second, segment with overlap in decoding order but no overlap in presentation order shouldn't ");
        consoleWrite("cause any deletion if decoding order is corrected to avoid conflicts. This correction can't extend ");
        consoleWrite("beyond the duration of the sample, to avoid conflicts with the hypothetical decode timestamp of a ");
        consoleWrite("sample appended in the future (DTS 29/10). This isn't met, so the whole correction algorithm bails out, ");
        consoleWrite("and all the conflicting samples are deleted (old behaviour). Same duration samples.");

        mediaSegment = concatenateSamples([
            makeAInit(1, [makeATrack(0, 'mock', TRACK_KIND.VIDEO)]),
            makeASample(40, 19, 10, 10, 0, SAMPLE_FLAG.SYNC, 1),
        ]);
        run('sourceBuffer.appendBuffer(mediaSegment)');
        await waitFor(sourceBuffer, 'updateend');

        bufferedSamples = await internals.bufferedSamplesForTrackId(sourceBuffer, 0);
        testExpected("bufferedSamples.length", 4);
        bufferedSamples.forEach(consoleWrite);

        run('sourceBuffer.remove(0, Infinity)');
        await waitFor(sourceBuffer, 'updateend');

        endTest();
    });
    </script>
</head>
<body>
    <video></video>
</body>
</html>
