I suspect the Microsoft Azure CDN is too slow to stream 360-degree video. The video is 2880x1440 pixels, which is nothing unusual, but the bitrate is relatively high at 25 Mb/s (360-degree video).
http://realgrad.azureedge.net/realgrad/index.html
Can anybody please check that streaming goes without delay?
Related
I am using LibVlcSharp to play an adaptive video stream (HLS) transcoded by Azure Media Services with the EncoderNamedPreset.AdaptiveStreaming setting in my Xamarin.Forms app.
When viewing my video I notice that the first few (5-6) seconds of my video are very blurry.
This is probably because the player starts with a "safe" low bitrate; after a few chunks of data have been downloaded, it determines that the bandwidth is sufficient for higher-quality video and switches up in quality.
These first few seconds of low quality bother me, and I would rather have the video start at a higher bitrate.
I would be happy if the video would switch up sooner (<2 seconds), which would probably mean I need a different encoder setting that produces smaller "chunks" of video.
But maybe the easier solution is to set the starting bitrate to a higher value.
I have seen this done on other media players in the form of
ams.InitialBitrate = ams.AvailableBitrates.Max<uint>();
Does LibVlc have a similar option?
This is probably because the player starts with a "safe" low bitrate; after a few chunks of data have been downloaded, it determines that the bandwidth is sufficient for higher-quality video and switches up in quality.
Probably. You could verify that assumption by checking the bitrate at the start of the video and again once the quality has improved, like this:
// Parse the media over the network so the track metadata is populated
await media.Parse(MediaParseOptions.ParseNetwork);
foreach (var track in media.Tracks)
{
    Debug.WriteLine($"{nameof(track.Bitrate)}: {track.Bitrate}");
}
Does LibVlc have a similar option?
Try this
--adaptive-logic={,predictive,nearoptimal,rate,fixedrate,lowest,highest}
Adaptive Logic
You can try it like new LibVLC("--adaptive-logic=highest")
See the docs for more info, or the forums. If that does not work, I'd set it server-side.
I've set up a Wowza streaming server and am serving Apple HLS video to a web page. I've read the articles on achieving low-latency live streaming and implemented all the details I can find. The Wowza server is set to low-latency mode for the live stream.
The issue: The live video stream works for hours, but the latency is around 60-90 seconds behind real time. How do I get the latency from camera to web page down to 5 seconds or less?
Here is my html layout for video.js:
<div style="padding:5px;text-align:center;">
<video id="example-video" width="720" height="405" class="video-js vjs-default-skin" controls preload="none">
<source src="http://1.2.3.4:1935/live/myStream/playlist.m3u8" type="application/x-mpegURL">
</video>
</div>
Any insights on how to knock the latency down from 60 seconds to 5 or less would be appreciated!
You will need to customise the encoder configuration as recommended below:
Audio AAC encoder bitrate: 96 kbps
Video H.264 encoder bitrate: 800 kbps
Keyframe interval: 2 seconds
Tune: zerolatency
Buffer size: 50
FPS: 30
Note: I have shared this based on the assumption that you're using FFmpeg or OBS to capture the live stream and publish it in RTMP format.
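Assuming FFmpeg (as in the note above), the settings might translate into a command line along these lines. This is only a sketch: the input source and RTMP URL are placeholders, and you should verify the flags against your FFmpeg build.

```python
# Sketch: build an FFmpeg command line matching the low-latency
# settings above. "input_source" and the RTMP URL are placeholders.
ffmpeg_args = [
    "ffmpeg",
    "-i", "input_source",                  # placeholder capture input
    "-c:v", "libx264",
    "-b:v", "800k",                        # video bitrate 800 kbps
    "-g", "60",                            # keyframe every 2 s at 30 fps
    "-r", "30",                            # 30 fps
    "-tune", "zerolatency",
    "-c:a", "aac",
    "-b:a", "96k",                         # audio bitrate 96 kbps
    "-f", "flv",
    "rtmp://1.2.3.4:1935/live/myStream",   # placeholder RTMP URL
]
print(" ".join(ffmpeg_args))
```

Note the GOP size (`-g 60`) is derived from the settings above: 30 fps times a 2-second keyframe interval.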
Reference - Why sliced thread affect so much on realtime encoding using ffmpeg x264?
With zerolatency / sliced-threads enabled, I am observing that the decoding time shoots up. I am encoding on my Windows 10 laptop and streaming to a Samsung S4 phone, where the video is decoded and rendered. Decoding usually takes 2-3 ms, but it shoots up to around 25 ms when I use sliced threads. This is a real-time streaming application, so I need low latency, which is why I enabled zerolatency. Can someone help please?
I am using the hardware decoder on the phone.
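One thing worth trying (a sketch, assuming FFmpeg with libx264; the input and stream target are placeholders): the zerolatency tune enables sliced threads, but you can keep the tune and explicitly disable slicing via `-x264-params`, so the encoder emits whole-frame NALs that a hardware decoder may handle faster.

```python
# Sketch: keep -tune zerolatency but override its sliced-threads setting.
# Input and output are placeholders; verify flags against your FFmpeg build.
args = [
    "ffmpeg",
    "-i", "input_source",                 # placeholder capture input
    "-c:v", "libx264",
    "-tune", "zerolatency",
    "-x264-params", "sliced-threads=0",   # override the tune's slicing
    "-f", "mpegts",
    "udp://192.168.0.2:5000",             # placeholder stream target
]
print(" ".join(args))
```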
I was reading about thin clients and streaming videos. How is streaming different from downloading a file locally and then playing it in a browser? Internally, how does streaming work? Does streaming take less CPU and memory than playing from a file?
The concept behind streaming is very simple: the server sends the video either byte by byte or in 'chunks', and the client receives the bytes or chunks into a 'first in, first out' queue and plays them in the order they are received (at the speed required to play the video properly).
More sophisticated streaming techniques allow the client to switch between different bit rate encodings while downloading the chunks of a file - this means that if network conditions change during playback, the client can choose a lower or higher bit rate for the next chunk it downloads. This is referred to as Adaptive Bit Rate streaming.
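As an illustration of the idea, here is a minimal Python sketch of adaptive bit rate rendition selection. The bitrate ladder and safety factor are made-up values for the example, not from any real player:

```python
# Minimal sketch of adaptive bit rate (ABR) rendition selection.
# The client measures throughput while downloading each chunk and
# picks the highest rendition that fits within a safety margin.

RENDITIONS_KBPS = [400, 800, 1600, 3200, 6400]  # hypothetical bitrate ladder
SAFETY_FACTOR = 0.8  # only commit to 80% of the measured throughput

def pick_rendition(measured_kbps):
    """Return the highest bitrate rendition that fits the measured bandwidth."""
    budget = measured_kbps * SAFETY_FACTOR
    candidates = [r for r in RENDITIONS_KBPS if r <= budget]
    # Fall back to the lowest rendition if even that exceeds the budget.
    return max(candidates) if candidates else RENDITIONS_KBPS[0]

print(pick_rendition(5000))  # plenty of bandwidth -> 3200
print(pick_rendition(300))   # constrained network  -> 400
```

A real player also smooths the throughput estimate over several chunks and considers buffer occupancy, but the core decision is this comparison.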
Advantages of streaming include fast video start up and seeking, better utilisation of bandwidth and no need to download the whole video if the user decides to seek or stop watching.
The following article gives a very good overview: http://www.jwplayer.com/blog/what-is-video-streaming/
My app was recently denied because it is set to stream 60+ mb files from my web server and play them; the MPMoviePlayerController downloaded the entire file in 5-10 minutes while simultaneously playing it. From the testing perspective, the app worked great, but Apple limits audio streaming to 5mb/5min.
How would I go about limiting the buffer rate to only buffer 5mb/5min?
I have no idea which direction to go. I am willing to overhaul as long as the player can still stream the files from my web server.
All replies are appreciated.
Live streaming is restricted because of cellular network limits, so the only way to do this is the following:
Google "HTTP Live Streaming"
Download the tools from Apple's Dev Center and install them
You will need to use Terminal to run mediafilesegmenter on the files:
mediafilesegmenter /Path\ to\ File/Name.mp3
Then upload the .m3u8 files and the segments to your server (same directory) and stream the m3u8. Problem solved!
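For reference, the generated playlist is a plain-text index of the segments, along the lines of this sketch (segment names and durations here are illustrative, not the segmenter's exact output):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
Name0.ts
#EXTINF:10.0,
Name1.ts
#EXT-X-ENDLIST
```

The player fetches the .m3u8, then downloads and plays the listed segments in order, which keeps each individual transfer small.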