How to achieve low latency live-streaming with Wowza and Video.js

I've set up a Wowza streaming server and am serving Apple HLS video to a web page. I've read the articles on achieving low-latency live streaming and implemented every detail I can find. I have the Wowza server set to low-latency mode for the live stream.
The issue: The live video stream works for hours, but the latency is around 60-90 seconds behind real time. How do I get the latency from camera to web page down to 5 seconds or less?
Here is my HTML layout for Video.js:
<div style="padding:5px;text-align:center;">
<video id="example-video" width="720" height="405" class="video-js vjs-default-skin" controls preload="none">
<source src="http://1.2.3.4:1935/live/myStream/playlist.m3u8" type="application/x-mpegURL">
</video>
</div>
Any insights on how to knock the latency down from 60 seconds to 5 or less would be appreciated!

You will need to customise the encoder configuration as recommended below:
Audio AAC encoder – bitrate: 96 kbps
Video H.264 encoder – bitrate: 800 kbps
Keyframe interval: 2 seconds
Tune: zerolatency
Buffer size: 50
FPS: 30
Note: this assumes you are using FFmpeg or OBS to capture the live stream and supply it to the server in RTMP format.
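Assuming FFmpeg as the capture/encode tool, a command along these lines would apply the settings above. The input file, server address, and application/stream names are placeholders, and the mapping of "buffer size" onto an encoder flag depends on your tooling, so this is a sketch rather than a drop-in command:

```shell
# Sketch: push an RTMP stream to Wowza with the recommended settings.
# input.mp4, the IP address, and live/myStream are placeholders.
# GOP size 60 = 30 fps * 2 s keyframe interval.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -b:v 800k -tune zerolatency \
  -g 60 -keyint_min 60 -r 30 \
  -c:a aac -b:a 96k \
  -f flv rtmp://1.2.3.4:1935/live/myStream
```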

Related

How to resolve Internet Explorer 11 problem with Ant Media Server?

Ant Media Server works fine with all modern browsers, but I have to support Internet Explorer 11 for playing the stream. What is the best way to do that?
Playing the stream with "https://mydomain:5443/WebRTCAppEE/streams/stream1.m3u8" in Flowplayer in IE11 works, but the latency is up to 20 seconds, which is too long for live video.
I think RTMP has a shorter latency, but it doesn't work for me: the URL "rtmp://mydomain/WebRTCAppEE/stream1" plays in neither Flowplayer, ffplay, nor VLC.
Is there something special to configure on the server side? Is the URL correct? Or do you have any other ideas?
Unfortunately, IE 11 does not support WebRTC, so it cannot play WebRTC streams.
https://caniuse.com/#search=webrtc
The URL is correct; there is no problem with that.
Here are the things to do:
Latency can be decreased from 20 seconds to the 6-10 second range in HLS by setting the keyframe interval to 1 second.
You can let users know they can switch to Chrome or Firefox for ultra-low-latency streaming.
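Assuming the stream is produced with FFmpeg (the same idea applies to any encoder), a 1-second keyframe interval at 30 fps could be set like this; the input file, domain, and stream name are placeholders:

```shell
# Sketch: force a keyframe every second (GOP 30 = 30 fps * 1 s) and
# disable scene-cut keyframes so the interval stays fixed.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -g 30 -keyint_min 30 -sc_threshold 0 \
  -c:a aac \
  -f flv rtmp://mydomain/WebRTCAppEE/stream1
```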

How to play video while it is downloading using AVPro video in unity3D?

I want to play a video while it is still downloading via UnityWebRequest. Does AVPro Video support this? If so, please give me some guidance, as I am new to Unity and AVPro Video. I am able to play a fully downloaded video through the FullscreenVideo.prefab in the AVPro demo. Any help will be much appreciated.
There are two main options you could use for displaying the video while it is still downloading.
Through livestream
You can stream a video to AVPro Video using the "absolute path or URL" option on the media player component, then linking this to a stream in RTSP, MPEG-DASH, HLS, or HTTP progressive streaming format. Depending on which platforms you are targeting, some of these options will work better than others.
A table of which format supports which platform can be found in the AVPro Video user manual included with the plugin, from page 12 onwards.
If you want to use streaming, you also need to set the "internet access" option to "required" in the player settings, as a video cannot stream without internet access.
A video that is being streamed will automatically start/resume playing once enough of it is buffered.
This does, however, require a constant internet connection, which may not be ideal if you're targeting mobile devices, or may be unnecessary if you're planning to play videos in a loop.
HLS m3u8
HTTP Live Streaming (HLS) works by cutting the overall stream into shorter, manageable chunks of data, which are then downloaded in sequence regardless of how long the stream is. m3u8 is a playlist file format that holds the locations of multiple small media files instead of an entire video; the playlist can be fed to an HLS player, which plays the media files in the sequence the m3u8 file dictates.
Using this method is useful if you're planning to play smaller videos on repeat, as the user only has to download each chunk of the video once, and you can store the chunks for later use.
You can also make these chunks as long or short as you want, and set a buffer of how many chunks you want pre-loaded. If, for example, you set the chunk size to 5 seconds with a buffer of 5 chunks, the only loading time you'll have is for the first 25 seconds of the video. Once those first 5 chunks are loaded, the video starts playing and the remaining chunks load in the background without interrupting playback (given that your internet speed can handle it).
A con is that you have to convert all your videos to m3u8 yourself, though a tool such as FFmpeg can help with this.
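As a concrete sketch of that FFmpeg conversion (file names and the 5-second chunk length are assumptions matching the example above):

```shell
# Sketch: split input.mp4 into 5-second HLS chunks plus an m3u8 playlist.
# -c copy remuxes without re-encoding; -hls_list_size 0 keeps every
# segment in the playlist so the whole video remains playable.
ffmpeg -i input.mp4 \
  -c copy \
  -hls_time 5 \
  -hls_list_size 0 \
  -f hls playlist.m3u8
```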
References
HLS
m3u8
AVPro documentation

mpeg-dash real-time streaming delay from start

Hello, I'm researching real-time streaming with MPEG-DASH and RTMP using nginx-rtmp-module.
When the client uses RTMP, the video delay from start is about 1-2 seconds.
But when the client uses MPEG-DASH with dash.js, the delay from start is about 8 seconds or more, even with setLiveDelay(1).
This is my setup:
dash on;
dash_path /tmp/dash;
dash_fragment 2s;
dash_playlist_length 10s;
Is this a con of MPEG-DASH for real-time streaming?
Thanks for your advice.
Is this a con of MPEG-DASH for real-time streaming?
It's an attribute of any segmented streaming. HLS has the same problem.
There are tradeoffs to be made when choosing your streaming media technologies. If latency actually matters to you, DASH is not the right choice. Look into WebRTC. If you need to stream one-to-many, DASH is certainly easier to do.
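The delay observed in the question is roughly what segmented streaming predicts: worst-case startup latency is about the segment duration times the number of segments the player buffers before starting playback. dash.js typically buffers a few segments; assuming around four here:

```shell
# Back-of-envelope latency for segmented streaming:
# startup delay ≈ fragment duration * fragments buffered before playback
echo $((2 * 4))   # 2 s fragments, ~4 buffered by dash.js
```

That lands in the same range as the "8 seconds or more" the question reports, which is why shrinking dash_fragment helps only up to a point.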

Why does FLV load much faster than MP4?

I have a website where I stream my own videos. I use two video formats, MP4 and FLV, and both files are almost the same size.
I notice that the FLV loads almost instantly, but the MP4 takes 35-60 seconds before it starts playing.
Media info:
MP4:
http://pastebin.com/kCET5YKP
FLV:
http://pastebin.com/tfKgZEBg
Tested with Apache and Litespeed web-server
Server config:
CPU: Intel Xeon E3
RAM: 8GB
HDD: 1TB
Network: 1Gbps dedicated
First, you should understand that FLV is mainly used for live streaming and MP4 for VOD (video on demand). The MP4 format generally has a large header (many boxes) containing metadata for all frames, so the player cannot start playing until it has downloaded the entire header. FLV, by contrast, has only a small header (codec data); each tag contains one frame, and the tag header describes the tag type, codec ID, PTS, DTS, and so on. It is similar to TS in that respect.
On the other hand, it may also be related to the behavior of the player.
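A common fix that follows from this explanation: if the MP4's index (the moov box) is stored at the end of the file, remuxing with FFmpeg's faststart flag moves it to the front so playback can begin before the whole file has downloaded. File names here are placeholders:

```shell
# Remux only (no re-encode) so the moov box is written at the
# start of the file, allowing progressive playback over HTTP.
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
```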

how to stream high quality video and audio from iPhone to remote server through Internet

I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
Sample code for AVFoundation capture into file chunks (~10 seconds) that are ready for compression
Sample code for compressing the video and audio for transmission across the Internet (ffmpeg?)
Sample code for HTTP Live Streaming that sends files from the iPhone to an Internet server
My goal is to use the iPhone as a high-quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer on the iPhone.
Thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length. The AV media will be encoded into the container as H264/AAC. You could then simply upload this to a remote server. If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server-to-client streaming protocol; there is no notion of push as far as I know, so you would have to create a custom server that accepts pushed segmented video streams (which does not make much sense given the way HLS is designed; see the RFC draft). A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could be done easily with FFmpeg, either on the command line or via a custom program.
I also wanted to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
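The server-side FFmpeg segmentation suggested above could look roughly like this; the file names and 10-second segment length are assumptions:

```shell
# Sketch: segment an uploaded MP4 into 10-second HLS chunks for
# client viewers, remuxing without re-encoding. vod playlist type
# marks the playlist complete so players can seek the whole file.
ffmpeg -i upload.mp4 \
  -c:v copy -c:a copy \
  -hls_time 10 \
  -hls_playlist_type vod \
  -f hls stream.m3u8
```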
Capture video and audio using AVFoundation. You can specify the audio and video codecs as kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC, along with frame sizes and frame rates, in the capture format description. This will give you compressed H264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP server such as Live555 Media.