Synchronize buffering of a stream of images to be displayed alongside the video #347
By timestamps in a live stream, do you mean the actual timestamps in the media segments? We have tentatively planned to support EMSG boxes (#259); the work required to address #259 could be used as a starting point for implementing more general box parsing. Or, perhaps there's a simpler solution: do you just need the player to output the position of the live-edge? Or the start time of the last segment inserted?
It would be the actual timestamps in the media segments. The idea is to mitigate buffering and latency: when the player receives the timestamp, perhaps as a piece of metadata, we can display the relevant slide for that time. Shall I transfer this to #259? Thanks, N.
I'll add this as a separate, potential enhancement for v2+. |
Testing this morning in an updated Chrome v50 and Canary shows that Chrome's media-internals page is able to see a timestamp of some description. Occasionally, the diagnostics show something like this:
I have tested starting and stopping the stream to observe that number. It correlates with the amount of time that the stream (Wowza application) has been running, so that is good. I am not yet sure how to get at this information; ideas gratefully received, and I will post anything I find as well. Ultimately, if we can get a timestamp of some description from the stream, I'll be happy. I'm currently looking at Chrome's media_internals.js file to see if I can find where it reads the timestamp.
If you are synchronizing slides to the video, you need the currently-displayed video timestamp, correct? Not the timestamp of the segment we are currently downloading? Our default buffering configuration is to download 30 seconds ahead of what is displayed. You can trivially get the timestamp of what is currently being displayed from the video element. Is this what you're looking for?
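For illustration, the displayed position comes from the standard HTML5 media element (`videoElement.currentTime`), not from any Shaka-specific API. A minimal sketch of mapping that position to a slide, assuming an illustrative `slides` array of cue points sorted by time (the helper name and data shape are mine, not from the thread):

```javascript
// Given slide cue points (seconds since the presenter's "zero" point),
// sorted ascending by `time`, return the slide that should be visible at
// `time` seconds, or null if no cue has been reached yet.
// In a browser, `time` would typically be videoElement.currentTime, which
// reports the media time of the frame currently being displayed.
function currentSlideFor(time, slides) {
  let active = null;
  for (const slide of slides) {
    if (slide.time <= time) {
      active = slide;  // the latest cue at or before the playhead wins
    } else {
      break;           // cues are sorted, so we can stop early
    }
  }
  return active;
}
```

In a page, this would be driven by the element's standard `timeupdate` event, swapping the displayed image whenever the active slide changes.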
Hi Joey. However, moving forward using Shaka, I need to be able to access data in a similar way. Even better, the encoder we use (Streamcoders Mediasuite) will allow us to send out additional tracks, and we will be able to send a track alongside the video and audio, which can contain timing data. Is there any way track data such as that can be read through Shaka? The stream gets sent from the encoder to Wowza, and then played through the player. I have established that injecting metadata at the Wowza end is problematic, and I want to avoid that. It really needs to be done in the encoding application and read by the player. Only then will we know everything is synchronised well enough for our player platform to be able to do what it needs at the time it needs to do it. Cheers, Neil. |
Hi Neil, sorry to drag this out, but I just wanted to clarify a few things and make sure we understand what you need. Earlier you asked about "the actual timestamps in the media segments". In your most recent comment, though, it sounds like you want the time the stream has been running. So, for example, if the live stream begins at midnight and it is now 1am, the time you want would be 3600s. This is independent of the media timestamps, which do not have to start at t=0 when the live broadcast begins. It's also not clear to me whether you want the time the stream has been running now, or the time the stream had been running when the currently-displayed frame was created. For synchronizing with another stream, the latter would make the most sense to me. So, to clarify:
Also, what do you mean by "track data ... read through Shaka"? Do you mean adding an additional track client-side that is not in the manifest? |
Hi Joey. We have a presentation platform ('stage') through which the user watches a live stream; next to the player, slides appear at certain times as the stream plays. We need to sync the slides to the stream, and because of buffering and latency we can't use a separate web service to push the slide markers to the 'stage'. Therefore, we need to get time data of some sort from the stream itself. Bear in mind that the stream might start well before the presenter starts presenting; the presenter will basically push a button when they start speaking, and that will be our 'zero' point. Ideally, then, we need a way of looking into the stream to get the event data (a timestamp, a piece of text, etc.) so that our stage can read that event and pull in the correct slide, or fire off whatever event of its own is required. I hope that clarifies it a bit better for you. Thanks in advance, Neil.
Let me try to restate your problem in a different way. Please tell me if I am getting it right or not. You need to display timed images alongside the video. You want to load the images before the corresponding video frames are being displayed so that the image can be displayed without latency. Is that right? |
Hi Joey. |
Then I believe I have a relatively simple suggestion. Shaka is buffering ahead of the playhead by a configurable amount. By default, this is 30 seconds. It also drops old buffers behind the playhead by a configurable amount. By default, this is also 30 seconds. If you want to load your images ahead of their display, you should just do a very simple buffer-ahead type algorithm similar to what Shaka does. You don't actually need to know the timestamps in the segment we just downloaded.
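The buffer-ahead idea described above could be sketched as follows. This is a generic illustration, not Shaka API: the slide schedule, the `fetched` set, and the 30-second window (chosen to mirror Shaka's default buffering goal) are all assumptions of mine.

```javascript
// Return the slides whose cue times fall inside the look-ahead window
// [now, now + aheadSeconds] and that have not been fetched yet.
function slidesToPrefetch(now, aheadSeconds, slides, fetched) {
  return slides.filter(s =>
      s.time >= now &&
      s.time <= now + aheadSeconds &&
      !fetched.has(s.url));
}

// In a browser, this could be wired to the playhead so images are loaded
// ahead of display, much as Shaka buffers media 30s ahead by default:
//
// video.addEventListener('timeupdate', () => {
//   for (const s of slidesToPrefetch(video.currentTime, 30, slides, fetched)) {
//     new Image().src = s.url;  // warms the browser cache
//     fetched.add(s.url);
//   }
// });
```

Because the prefetch window runs ahead of the playhead the same way the media buffer does, each image should already be in cache by the time its cue point is reached, without needing to inspect the timestamps inside the downloaded segments.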
@neilRGS Is this still an issue for you or can we close it? |
@neilRGS, since we haven't heard from you in a while, I'm closing this issue now. If you need further support, just let us know and we'll be happy to help. |
Hi.
We have a strategic requirement to be able to read the timestamps in a live stream.
Our application synchronises slides with video output, so we need to be able to tell how long the stream has been running for. It doesn't matter what format the time is in, so long as it can be read by the player.
Does Shaka have the capability to read timestamps from a live stream? If so, how do I retrieve the data?
If not, are there any plans for it to be able to do that?
Finally, if there are no plans for that, can anyone suggest a player which can do it? JWPlayer can't; I've asked them already.
Many thanks
Neil.