
Synchronize buffering of a stream of images to be displayed alongside the video #347

Closed
neilRGS opened this issue Apr 20, 2016 · 13 comments
Labels
status: archived Archived and locked; will not be updated type: question A question from the community

Comments

@neilRGS

neilRGS commented Apr 20, 2016

Hi.
We have a strategic requirement to be able to read the timestamps in a live stream.
Our application synchronises slides with video output, so we need to be able to tell how long the stream has been running for. It doesn't matter what format the time is in, so long as it can be read by the player.

Does Shaka have the capability to read timestamps from a live stream? If so, how do I retrieve the data?
If not, are there any plans for it to be able to do that?
Finally, if there are no plans for that, can anyone suggest a player which can do it? JWPlayer can't; I've asked them already.

Many thanks

Neil.

@tdrews
Contributor

tdrews commented Apr 20, 2016

By timestamps in a live stream, do you mean the actual timestamps in the media segments? We have tentatively planned to support EMSG boxes (#259); the work required to address #259 could be used as a starting point for implementing more general box parsing. Or, perhaps there's a simpler solution: do you just need the player to output the position of the live-edge? Or the start time of the last segment inserted?

@tdrews tdrews added the type: question A question from the community label Apr 20, 2016
@neilRGS
Author

neilRGS commented Apr 21, 2016

It would be the actual timestamps in the media segments. The idea is to mitigate buffering and latency, so that when the player receives the timestamp, perhaps as a piece of metadata, we can display the relevant slide for that time.

Shall I transfer this to #259 ?

Thanks

N.

@tdrews
Contributor

tdrews commented Apr 21, 2016

I'll add this as a separate, potential enhancement for v2+.

@tdrews tdrews changed the title Question: Timestamps in live stream Implement a network response filter to read media timestamps. Apr 21, 2016
@tdrews tdrews added type: enhancement New feature or request and removed type: question A question from the community labels Apr 21, 2016
@tdrews tdrews added this to the v2+ milestone Apr 21, 2016
@neilRGS
Author

neilRGS commented Apr 22, 2016

Testing this morning in an updated Chrome v50 and in Canary shows that Chrome's media-internals is able to see a timestamp of some description.

Occasionally, the diagnostics shows something like this:

00:05:37 701 debug Detected an append sequence with keyframe following a non-keyframe, both with the same decode time-stamp of 1269.2

I have tested starting and stopping the stream to observe that number. It correlates with the amount of time that the stream (the Wowza application) has been running, so that is good. I am not yet sure how to get at this information. Ideas gratefully received, and I will post anything I find as well.

Ultimately, if we can get a time-stamp of some description from the stream, then I'll be happy. I'm currently looking at Chrome's media_internals.js file to see if I can find where it is reading the timestamp.

@joeyparrish
Member

If you are synchronizing slides to the video, you need the currently-displayed video timestamp, correct? Not the timestamp of the segment we are currently downloading? Our default buffering configuration is to download 30 seconds ahead of what is displayed.

You can trivially get the timestamp of what is being displayed using video.currentTime. The video element also has a timeupdate event to let you know when currentTime changes.
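A minimal sketch of this approach. The slide schedule and the showSlide call are hypothetical, illustrating how an application might react to timeupdate; they are not part of Shaka's API:

```javascript
// Hypothetical slide schedule: media-time start positions (seconds)
// mapped to slide image URLs.
const slides = [
  {start: 0,   url: 'slides/intro.png'},
  {start: 120, url: 'slides/agenda.png'},
  {start: 300, url: 'slides/demo.png'},
];

// Pick the latest slide whose start time is at or before time t,
// or null if no slide has started yet. Assumes slides is sorted by start.
function slideForTime(slides, t) {
  let current = null;
  for (const slide of slides) {
    if (slide.start <= t) current = slide;
  }
  return current;
}

// Wiring it to the video element would look roughly like:
// video.addEventListener('timeupdate', () => {
//   const slide = slideForTime(slides, video.currentTime);
//   if (slide) showSlide(slide.url);  // showSlide is your own display code
// });
```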

Is this what you're looking for?

@neilRGS
Author

neilRGS commented Apr 27, 2016

Hi Joey.
Not quite.
That will just show me the time that the player has been playing for.
What I need is the time that the stream has been running for.
In tests using JW Player, they have a method (on('meta')) through which I can get a certain amount of metadata from the stream. I have found that metadata.segment.mediaSequenceNumber will give me the number of the chunk that is loaded. I can set the chunk duration in the Wowza application config to, say, 2 seconds and then multiply the mediaSequenceNumber value by that. This gives a close approximation of the time the stream has been running for. I believe that will be close enough for our immediate needs, but I am running tests just now to see if there is any drift, and if so, by how much over a period of time.
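The approximation described above can be sketched as a small helper. This assumes a fixed chunk duration configured on the Wowza application; it will drift if actual segment durations vary from that configured value:

```javascript
// Approximate elapsed stream time from the HLS media sequence number,
// assuming every chunk has the same fixed duration in seconds.
function approxStreamTime(mediaSequenceNumber, chunkDurationSec) {
  return mediaSequenceNumber * chunkDurationSec;
}

// With 2-second chunks, sequence number 150 corresponds to roughly
// 300 seconds of stream time:
// approxStreamTime(150, 2) === 300
```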

However, moving forward using Shaka, I need to be able to access data in a similar way. Even better, the encoder we use (Streamcoders Mediasuite) will allow us to send out additional tracks, and we will be able to send a track alongside the video and audio, which can contain timing data.

Is there any way track data such as that can be read through Shaka? The stream gets sent from the encoder to Wowza, and then played through the player. I have established that injecting metadata at the Wowza end is problematic, and I want to avoid that. It really needs to be done in the encoding application and read by the player. Only then will we know everything is synchronised well enough for our player platform to be able to do what it needs at the time it needs to do it.

Cheers,

Neil.

@joeyparrish
Member

Hi Neil,

Sorry to drag this out, but I just wanted to clarify a few things and make sure we understand what you need.

Earlier, you asked about "the actual timestamps in the media segments". video.currentTime shows the current media timestamp being displayed. Unless we apply an offset (driven by presentationTimeOffset and/or Period.start), video.currentTime shows exactly what's in the media.

In your most recent comment, though, it sounds like you want the time the stream has been running. So, for example, if the live stream begins at midnight, and it is now 1am, the time you want would be 3600s. This is independent of the media timestamps, though, which do not have to start at t=0 when the live broadcast begins.

It's also not clear to me if you really want the time the stream has been running, or the time the stream had been running when the currently-displayed frame was created. To synchronize with another stream, that would make the most sense to me.

So, to clarify:

  • Do you want media timestamps or the amount of time elapsed since the stream began broadcasting?
  • Do you want this time to correspond to the live edge, or to the video frame being displayed?

Also, what do you mean by "track data ... read through Shaka"? Do you mean adding an additional track client-side that is not in the manifest?

@neilRGS
Author

neilRGS commented Apr 28, 2016

Hi Joey.
I have been dragged off this (kicking and screaming!) and on to a pressing issue on a different project. However, in the meantime, I'll give you a summary of what I am trying to achieve and will welcome your thoughts:

We have a presentation platform ('stage') through which the user will watch a live stream and next to the player, slides will appear at certain times as the stream plays.

We need to be able to sync the slides to the stream, so due to buffering and latency, we can't use a separate web service to push the slide markers to the 'stage'.

Therefore, we need to get time data of some sort from the stream itself.

Bear in mind that the stream might start well before the presenter starts presenting; the presenter will basically push a button when they start speaking. That will be our 'zero' point.

Ideally then, we need a way of looking into the stream to get the event data (timestamp, or a piece of text etc.) so that our stage can read that event and pull in the correct slide, or fire off whatever event of its own, as required.

I hope that clarifies it a bit better for you.

Thanks in advance,

Neil.

@joeyparrish
Member

Let me try to restate your problem in a different way. Please tell me if I am getting it right or not.

You need to display timed images alongside the video. You want to load the images before the corresponding video frames are being displayed so that the image can be displayed without latency.

Is that right?

@joeyparrish joeyparrish removed this from the v2+ milestone May 3, 2016
@joeyparrish joeyparrish added type: question A question from the community and removed type: enhancement New feature or request labels May 3, 2016
@neilRGS
Author

neilRGS commented May 3, 2016

Hi Joey.
Yes, that pretty much sums it up. :-)

@joeyparrish
Member

Then I believe I have a relatively simple suggestion.

Shaka is buffering ahead of the playhead by a configurable amount. By default, this is 30 seconds. It also drops old buffers behind the playhead by a configurable amount. By default, this is also 30 seconds.

If you want to load your images ahead of their display, you should just do a very simple buffer-ahead type algorithm similar to what Shaka does. You don't actually need to know the timestamps in the segment we just downloaded.

  1. Listen for timeupdate events on the video tag or set a timer to poll every few seconds or so.
  2. Let T be the value of video.currentTime when the event or timer fires.
  3. Let A be the value of player.getConfiguration().streaming.bufferingGoal.
  4. Let B be the value of player.getConfiguration().streaming.bufferBehind.
  5. If there are any images that need to be shown between times T and T + A that you haven't loaded yet, start loading them now in order.
  6. If there are images loaded that are no longer needed after time T - B, release them so that they are destroyed by the garbage collector.
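The steps above can be sketched as a pure planning function. The slide entry shape ({start, loaded}) is hypothetical, and step 6 is simplified here to release any loaded slide whose start time falls behind T - B:

```javascript
// Given playhead time T, bufferingGoal A, and bufferBehind B, partition a
// slide schedule (sorted by start time) into slides to load now and
// slides to release.
function planImageBuffer(slides, T, A, B) {
  const toLoad = [];
  const toRelease = [];
  for (const s of slides) {
    if (!s.loaded && s.start >= T && s.start <= T + A) {
      // Step 5: needed within the buffer-ahead window and not yet loaded.
      toLoad.push(s);
    } else if (s.loaded && s.start < T - B) {
      // Step 6: already behind the buffer-behind window; safe to drop.
      toRelease.push(s);
    }
  }
  return {toLoad, toRelease};
}
```

The caller would run this from a timeupdate listener or a periodic timer, fetch everything in toLoad in order, and drop references to everything in toRelease so the garbage collector can reclaim them.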

@joeyparrish joeyparrish changed the title Implement a network response filter to read media timestamps. Synchronize buffering of a stream of images to be displayed alongside the video May 3, 2016
@joeyparrish
Member

@neilRGS Is this still an issue for you or can we close it?

@joeyparrish joeyparrish added the status: waiting on response Waiting on a response from the reporter(s) of the issue label Jul 2, 2016
@joeyparrish
Member

@neilRGS, since we haven't heard from you in a while, I'm closing this issue now. If you need further support, just let us know and we'll be happy to help.

@joeyparrish joeyparrish removed the status: waiting on response Waiting on a response from the reporter(s) of the issue label Jul 14, 2016
@shaka-project shaka-project locked and limited conversation to collaborators Mar 22, 2018
@shaka-bot shaka-bot added the status: archived Archived and locked; will not be updated label Apr 15, 2021