MPEG-DASH real-time streaming delay from start

Hello, I'm researching real-time streaming with MPEG-DASH and RTMP using nginx-rtmp-module.
When the client uses RTMP, playback starts with a delay of about 1-2 seconds.
But when the client uses MPEG-DASH with dash.js, the delay from start is about 8 seconds or more, even with setLiveDelay(1).
This is my setup:
dash on;                      # enable MPEG-DASH output
dash_path /tmp/dash;          # directory where the manifest and fragments are written
dash_fragment 2s;             # target duration of each fragment
dash_playlist_length 10s;     # manifest covers ~10s, i.e. about five 2s fragments
Is this an inherent drawback of MPEG-DASH for real-time streaming?
Thanks for your advice.

Is this an inherent drawback of MPEG-DASH for real-time streaming?
It's an attribute of any segmented streaming; HLS has the same problem. The player has to wait for whole fragments to be encoded and published, and it typically buffers several of them before playback begins, so with 2-second fragments a startup delay of 6-8 seconds or more is expected even with setLiveDelay(1).
There are tradeoffs to be made when choosing your streaming media technologies. If latency actually matters to you, DASH is not the right choice; look into WebRTC. If you need to stream one-to-many, DASH is certainly easier to do.
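That said, you can usually trim the delay to a couple of fragment durations. A minimal dash.js sketch (the manifest URL is a placeholder, and setLiveDelay() is the same dash.js v2 call used in the question):

declare const dashjs: any; // global provided by the dash.js <script> tag

const video = document.querySelector("video") as HTMLVideoElement;
const player = dashjs.MediaPlayer().create();
// Target roughly two fragment durations behind the live edge; the real
// floor is set by the fragment length, not by this value alone.
player.setLiveDelay(4);
player.initialize(video, "http://example.com/dash/mystream.mpd", true);

Shortening dash_fragment on the nginx side (say, to 1s) lowers that floor further, at the cost of more requests and encoder overhead.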

Related

Is there any way to set the network-caching option in the Flutter VLC player?

I am streaming a video to the Android platform with flutter_vlc_player.
When I stream the video, it has a delay of 2-3 seconds. However, if I stream in the Linux VLC player with the network-caching option set to 0, the delay is about 0.5 seconds. So I thought the delay in flutter_vlc_player could also be reduced by adjusting the caching option, but I could not find it in flutter_vlc_player.
Where can I find the option?
The image below shows how I set the caching option in the Linux VLC player; you can see the Caching value is 0 ms.
I'm looking for the same. I'm trying to achieve ultra low-latency streams in Flutter, which for my application probably means switching from MJPEG to RTSP.
In the meantime though, and perhaps a good fit for your application, the flutter_mjpeg package is pretty simple to use, and doesn't cripple the latency with a long buffer:
https://pub.dev/packages/flutter_mjpeg

Difference between playing a video stream and a video file on a browser

I was reading about thin clients and streaming video. How is streaming different from downloading a file locally and then playing it in a browser? Internally, how does streaming work? Does streaming take less CPU and memory than playing from a file?
The concept behind streaming is very simple: essentially, the server sends the video either byte by byte or in 'chunks', and the client receives the bytes or chunks into a 'first in, first out' queue and plays them in the order they are received (and at the speed required to play the video properly).
More sophisticated streaming techniques allow the client to switch between different bit-rate encodings while downloading the chunks of a file. This means that if network conditions change during playback, the client can choose a lower or higher bit-rate chunk as the next chunk to download. This is referred to as Adaptive Bit Rate streaming.
Advantages of streaming include fast video start-up and seeking, better utilisation of bandwidth, and no need to download the whole video if the user decides to seek or stop watching.
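To make the FIFO and adaptive-bit-rate ideas concrete, here is a minimal TypeScript sketch (the rendition URLs are placeholders, and a real player would append the chunks to a MediaSource buffer rather than an array):

// Hypothetical renditions of the same content at different bit rates,
// sorted from lowest to highest.
const renditions = [
  { bitrate: 500_000, urlPrefix: "http://example.com/video/500k/seg" },
  { bitrate: 2_000_000, urlPrefix: "http://example.com/video/2000k/seg" },
];

const fifo: ArrayBuffer[] = []; // first-in, first-out playback queue
let current = 0;                // start on the lowest bit rate

async function downloadSegments(count: number): Promise<void> {
  for (let i = 0; i < count; i++) {
    const started = performance.now();
    const response = await fetch(`${renditions[current].urlPrefix}${i}.m4s`);
    const chunk = await response.arrayBuffer();
    fifo.push(chunk); // the player drains this queue in arrival order

    // Estimate throughput of the last download (bits per second)...
    const seconds = (performance.now() - started) / 1000;
    const throughput = (chunk.byteLength * 8) / seconds;

    // ...and pick the highest rendition the network can sustain, with 50%
    // headroom so playback doesn't stall on small fluctuations.
    current = renditions.reduce(
      (best, r, idx) => (r.bitrate * 1.5 < throughput ? idx : best), 0);
  }
}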
The following article gives a very good overview: http://www.jwplayer.com/blog/what-is-video-streaming/

Flash Playback and HTTP Live Streaming

I'm looking for a solution to provide streaming video to a variety of clients. I have iPhone clients as well as Flash-based clients. I'd like to not have to provide two separate mechanisms for delivering streaming content. Apple has decreed that HTTP Live Streaming is the way to provide streaming video to the iPhone (though it does carve out an exception for small progressive downloads).
My question: Are there examples of Flash implementations consuming HTTP Live Streaming content? What challenges might be faced if I were to try and implement such a player? Are there other technologies I should consider?
Thanks!
Not yet. Maybe never. But...
What you could do is stream from a Wowza Media Server, which will allow you to publish one stream that can be consumed by various clients, including both Apple client devices and Flash browser clients.

iPhone: HTTP Live Streaming without any server-side processing

I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration. But I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, but that is not what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save it to a video file; see the AV Foundation Guide on how to do that. Once saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open-source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments. There are lots of HTTP servers out there you could adapt.
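For illustration, the serving step could be as small as this Node.js sketch (the port and the ./segments directory are assumptions; the segmenter is presumed to have written playlist.m3u8 and the .ts segments there):

import * as fs from "fs";
import * as http from "http";
import * as path from "path";

const root = path.resolve("./segments");

http.createServer((req, res) => {
  // basename() strips directory components, so clients can only reach
  // files inside the segments directory.
  const file = path.join(root, path.basename(req.url ?? "/playlist.m3u8"));
  fs.readFile(file, (err, data) => {
    if (err) { res.writeHead(404); res.end(); return; }
    // HLS clients expect these MIME types for playlists and segments.
    const type = file.endsWith(".m3u8")
      ? "application/vnd.apple.mpegurl"
      : "video/mp2t";
    res.writeHead(200, { "Content-Type": type });
    res.end(data);
  });
}).listen(8080);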
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264; for that you want ffmpeg. Basically, you feed the images into ffmpeg's AVPicture structures to build a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live H.264 source. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need an HTTP server serving that stream.
What might actually be easier would be to use an RTP/RTSP-based stream instead. There are open-source RTP implementations, and ffmpeg supports that approach fully. It's not HTTP Live Streaming, but it will work well enough.

Live streaming on iPhone

Will the ALAC format support live streaming on the iPhone? If ALAC-encoded audio is streamed to a server machine, will I be able to play back the audio chunk data? Does the ALAC format support this?
Thank you.
Assuming you mean "Apple Lossless" audio...
I don't see why it wouldn't, but I don't know the details. You'll probably need to embed it in a transport stream instead of an MPEG-4 container (but then, I don't know how HTTP Live Streaming works either).
I don't think streaming lossless audio is sensible, though.
Streaming lossless audio is certainly possible: we stream FLAC using Icecast, and it works beautifully. However, we are not using HTTP Live Streaming (HLS) to do it. We stream FLAC from the source generator to a number of servers, and they create HLS streams from there.
It is technically possible to mux ALAC into MPEG-TS (ffmpeg can do this) as well as play it back (using ffmpeg), but there isn't a registered format identifier for other clients. Adding this feature to HLS would be as simple as calling/writing Apple and asking them to add ALAC to this list:
http://www.smpte-ra.org/mpegreg/mpegreg.html
and update their products accordingly. If you've purchased an Apple product within the last 90 days, or if you have AppleCare, give them a call. They have to work on the issue for you if you are covered. The more requests that get escalated to their engineers, the more likely they are to add support for ALAC in HLS.