set start bitrate for adaptive stream in libvlcsharp - azure-media-services

I am using LibVlcSharp to play an adaptive video stream (HLS) transcoded by Azure Media Services with the EncoderNamedPreset.AdaptiveStreaming setting in my Xamarin.Forms app.
When viewing my video I notice that the first few (5-6) seconds of my video are very blurry.
This is probably because the player starts with a "safe" low bitrate; after a few chunks of data have been downloaded, it determines that the bandwidth is sufficient for a higher-quality rendition and switches up.
These first few seconds of low quality bother me, and I would rather have it start at a higher bitrate.
I would be happy if the video would switch up sooner (<2 seconds), which would probably mean I need a different encoder setting that produces smaller "chunks" of video.
But maybe the easier solution is to set the starting bitrate to a higher value.
I have seen this done on other media players in the form of
ams.InitialBitrate = ams.AvailableBitrates.Max<uint>();
Does LibVlc have a similar option?

This is probably because the player starts with a "safe" low bitrate; after a few chunks of data have been downloaded, it determines that the bandwidth is sufficient for a higher-quality rendition and switches up.
Probably. You could verify that assumption by checking the bitrate at the start of the video and again once the quality improves, like this:
await media.Parse(MediaParseOptions.ParseNetwork);
foreach (var track in media.Tracks)
{
    Debug.WriteLine($"{nameof(track.Bitrate)}: {track.Bitrate}");
}
Does LibVlc have a similar option?
Try this
--adaptive-logic={,predictive,nearoptimal,rate,fixedrate,lowest,highest}
Adaptive Logic
You can try it like new LibVLC("--adaptive-logic=highest")
See the docs for more info, or the forums. If that does not work, I'd set it server-side.
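For intuition, the adaptive-logic modes differ mainly in how the player picks a rendition from the stream's bitrate ladder. A rough sketch in Python of what highest does compared to the rate-based logic (the ladder values are made up for illustration; this is not LibVLC's actual implementation):

```python
# Hypothetical bitrate ladder (bits per second) from an HLS master playlist.
available_bitrates = [400_000, 1_200_000, 3_500_000, 6_000_000]

def pick_initial_bitrate(logic, measured_bandwidth=None):
    """Rough sketch of how an adaptive logic chooses the starting rendition."""
    if logic == "highest":
        return max(available_bitrates)      # always start at the top
    if logic == "lowest":
        return min(available_bitrates)      # always start at the bottom
    # rate-based logic: highest rendition that fits the measured bandwidth,
    # falling back to the lowest when nothing has been measured yet
    candidates = [b for b in available_bitrates
                  if measured_bandwidth and b <= measured_bandwidth]
    return max(candidates) if candidates else min(available_bitrates)

print(pick_initial_bitrate("highest"))          # 6000000
print(pick_initial_bitrate("rate"))             # 400000 (nothing measured yet)
print(pick_initial_bitrate("rate", 4_000_000))  # 3500000
```

This is why highest removes the blurry intro at the cost of buffering risk on slow connections.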

Related

Video encoding quality, azure media services

The video quality of the first few seconds of videos encoded using Azure does not look very good; it is low quality and blurry. The quality improves dramatically afterwards.
What settings do you recommend to make sure the first frames are of excellent quality? First perception is very important.
Thanks,
Osama
It may be possible that you observe this kind of behavior if you have encoded your file for adaptive streaming. In this case, the output asset is composed of several files of different quality (from low to high).
When you play an adaptive stream, the first parts downloaded come from the low-quality files; then the player detects your bandwidth and automatically adapts the stream to a higher quality. If you look at YouTube, Netflix or Dailymotion, you will observe exactly the same behavior. It allows the stream to adapt to the available bandwidth.
If you do not want an adaptive stream, you need to use a preset that encodes the file at a given bitrate / quality.
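The ramp-up can be sketched with a toy simulation (all numbers are invented and the estimator is deliberately simplistic): the player starts from a safe estimate and only climbs the ladder as measured throughput allows.

```python
# Toy simulation of adaptive-bitrate ramp-up with made-up numbers.
renditions = [400, 1200, 3500, 6000]   # kbps ladder of the encoded asset

def choose(throughput_kbps):
    """Highest rendition that fits the current throughput estimate."""
    fits = [r for r in renditions if r <= throughput_kbps]
    return fits[-1] if fits else renditions[0]

playback = []
estimate = renditions[0]                    # "safe" starting assumption
for measured in [5000, 5200, 4900, 5100]:   # per-chunk download throughput
    playback.append(choose(estimate))
    estimate = (estimate + measured) / 2    # simple smoothed estimate

print(playback)   # [400, 1200, 3500, 3500]: low quality first, then up
```

The first chunks play at the lowest quality even though the link could carry much more, which is exactly the blurry-intro effect described above.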
The list of supported presets is here: https://msdn.microsoft.com/en-us/library/azure/mt269960.aspx
Multiple bitrates presets are for adaptive streaming.
Single bitrate presets are for single bitrate file.
For example, if your original video is 1080p, you can use this preset: https://msdn.microsoft.com/en-us/library/azure/mt269926.aspx
But be careful: users with low bandwidth may not be able to play your content smoothly.
Agree with Julien,
You are probably seeing the Adaptive Bitrate streaming ramp up when playing back the content. That's normal behavior for adaptive streaming.
You can eliminate some of the lower bitrates from the encoding preset or restrict them at the client side using the Azure Media Player SDK.
Keep in mind that you can always customize your encoding presets. We support a JSON schema for presets, and you can define your own based on the existing presets that we ship as a "best practice" to get folks started.
I would recommend using the Azure Media Explorer tool to play around with different encoding settings and easily launch the player for preview. Go here to access the tool from our download page:
http://aka.ms/amse

Streaming Live audio to the browser - Alternatives to the Web Audio API?

I am attempting to stream live audio from an iOS device to a web browser. The iOS device sends small, mono wav files (as they are recorded) through a web socket. Once the client receives the wav files, I have the Web Audio API decode and schedule them accordingly.
This gets me about 99% of the way there, except I can hear clicks between each audio chunk. After some reading around, I have realized the likely source of my problem: the audio is being recorded at a sample rate of only 4 kHz, and this cannot be changed. It appears that the Web Audio API's decodeAudioData() function does not handle sample rates other than 44.1 kHz with exact precision, resulting in gaps between chunks.
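One way to see where such gaps can come from: after resampling to the context rate, each decoded chunk must contain a whole number of output samples, so its playable duration is quantized and no longer matches the recorded duration exactly. A rough sketch with a made-up chunk size:

```python
# Why resampling 4 kHz chunks to a 44.1 kHz context can glitch at the seams.
# The chunk size is made up; real chunk sizes depend on the recorder.
src_rate, ctx_rate = 4_000, 44_100
chunk_samples = 1_001                         # samples per incoming wav chunk

src_duration = chunk_samples / src_rate       # exact duration as recorded
out_samples = round(src_duration * ctx_rate)  # decoder keeps whole samples
out_duration = out_samples / ctx_rate         # duration actually played

gap = src_duration - out_duration             # error per chunk, accumulates
print(f"{gap * 1e6:.2f} microseconds of drift per chunk")
```

A sub-microsecond error sounds negligible, but when chunks are scheduled back to back it accumulates into audible discontinuities.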
I have tried literally everything I could find about this problem (scriptProcessorNodes, adjusting the timing, creating new buffers, even manually upsampling) and none of them have worked. At this point I am about to abandon the Web Audio API.
Is the Web Audio API appropriate for this?
Is there a better alternative for what I am trying to accomplish?
Any help/suggestions are appreciated, thanks!
At last! AudioFeeder.js works wonders. I just specify the 4 kHz sampling rate, feed it raw 32-bit PCM data, and it outputs a consistent stream of seamless audio! It even has built-in buffer handling events, so there is no need to set up loops or timeouts to schedule chunk playback. I did have to tweak it a bit, though, to connect it to the rest of my Web Audio nodes and not just context.destination.
Note: AudioFeeder automatically upsamples to the audio context's sampling rate. Going from 4 kHz to 44.1 kHz did introduce some pretty gnarly-sounding artifacts in the high end, but a 48 dB/octave low-pass filter (four 12 dB/octave stages) at 2 kHz got rid of them. I chose 2 kHz because, thanks to Harry Nyquist, I know that a 4 kHz sampling rate could not possibly have produced frequencies above 2 kHz in the original file.
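A quick sanity check of that cutoff choice, assuming the filter really is four 12 dB/octave stages in series:

```python
# Nyquist limit and cascaded filter slope for the setup described above.
sample_rate = 4_000
nyquist = sample_rate / 2      # highest frequency the recording can contain
total_slope = 4 * 12           # dB/octave of four cascaded 12 dB/octave stages
print(nyquist, total_slope)    # 2000.0 48
```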
All hail Brion Vibber.

Get total frame count from MPMoviePlayerController

Is there any way to get total frames in a video before actually playing the video using MPMoviePlayerController?
The answer is pretty short and simple: no
Reasoning: MPMoviePlayerController plays all sorts of content, and the vast range of supported formats includes HTTP streaming. For an HTTP video stream it is impossible to get a deterministic frame count up front, as that value depends on the bandwidth chosen while the stream is being played.
You should clarify in your question which content exactly you are trying to get this information for; e.g. local video, streaming video, or progressively downloaded video. Even though you will always be out of luck with streaming video, you might have a chance with the other two options by using AVFoundation (I did not actually verify this; I just know the information is available, but I do not know offhand whether the iOS SDK helps you gather it).
About as close as you can get with the high-level MediaPlayer.framework classes is to check the movie player's duration property (if it is set) and multiply by the frame rate of the video. The next problem is finding the frame rate of the video...
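The estimate itself is just this arithmetic (both numbers below are invented; the duration would come from the movie player and the frame rate from inspecting the asset):

```python
# Estimating total frames from duration and frame rate, as suggested above.
duration_s = 95.4           # hypothetical value from the player's duration
frame_rate = 29.97          # NTSC-style rate; rarely a round number
total_frames = round(duration_s * frame_rate)
print(total_frames)         # 2859
```

Note it is only an approximation: variable frame rates or a slightly wrong duration throw it off.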

What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?

I would like to stream video from an iPhone camera to an app running on a Mac. Think sorta like video chat but only one way, from the device to a receiver app (and it's not video chat).
My basic understanding so far:
You can use AVFoundation to get 'live' video camera data without saving to a file, but it is uncompressed data, and thus I'd have to handle compression on my own.
There's no built in AVCaptureOutput support for sending to a network location, I'd have to work this bit out on my own.
Am I right about the above or am I already off-track?
Apple Tech Q&A 1702 provides some info on saving individual frames as images - is this the best way to go about it? Just saving frames at 30 fps and then using something like ffmpeg to compress them?
There's a lot of discussion of live streaming to the iPhone but far less info on people that are sending live video out. I'm hoping for some broad strokes to get me pointed in the right direction.
You can use AVCaptureVideoDataOutput and a sampleBufferDelegate to capture frames, then you just need to stream them over the network. AVFoundation provides an API to encode frames to local video files, but none for streaming to the network. Your best bet is to find a library that streams raw frames over the network. I'd start with ffmpeg; I believe libavformat supports RTSP, look at the ffserver code.
Note that you should configure AVCaptureVideoDataOutput to give you compressed frames, so you avoid having to compress raw video frames without the benefit of hardware encoding.
This depends a lot on your target resolution and what type of frame rate performance you are targeting.
From an abstract point of view, I would probably have a capture thread filling a buffer directly from AVCaptureOutput, and a communications thread sending and re-zeroing the buffer (padded if need be) to a previously specified host every x milliseconds.
After you accomplish the initial data transfer, I would work on achieving 15 fps at the lowest resolution, then work my way up until the buffer overflows before the communications thread can transmit it. Getting that balance right involves the image resolution, the buffer size (probably dependent on GSM, and soon CDMA, frame sizes), and finally the maximum rate at which you can transmit that buffer.
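That balancing act can be sanity-checked with back-of-the-envelope numbers (all invented): the capture thread fills the buffer at the video bitrate while the communications thread drains it every x milliseconds.

```python
# Does the communications thread keep up with the capture thread?
# All numbers are hypothetical, for illustration only.
capture_kbps = 1_500        # rate at which the capture thread produces data
link_kbps = 2_000           # rate at which the comms thread can push upstream
interval_ms = 100           # how often the buffer is transmitted

produced = capture_kbps * interval_ms / 1000   # kbits added per interval
drained = link_kbps * interval_ms / 1000       # kbits sent per interval

overflows = produced > drained
print("buffer overflows" if overflows else "keeps up")
```

Pushing the resolution or frame rate up raises capture_kbps until overflows flips, which is the crossover point the answer describes.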

smallest format for videos in an iphone app

I have a lot of videos I would like to embed in an app; currently I am just streaming them using a UIWebView browser I set up.
I know there are formats available for emailing videos, where a video can be around 6 MB or less.
What is the best way to do this for an iPhone app, keeping the picture quality to some extent while getting smaller file sizes?
thanks
It is not the file format (or container) that determines the file size, but the bitrate of the video stream chosen when compressing. Since you're going to use these in an iPhone app, I would go with .mov, since it is Apple's own format.
As for compression, it isn't really a topic that can be explained in one post, but long story short, the bitrate must be chosen according to the resolution of the video being compressed. Go for H.264 multi-pass encoding, start with a bitrate of 1000 kbps, and see if you're satisfied with the results; then keep pushing the bitrate lower and lower until you get the most satisfying results with the lowest file size. It's really just a matter of finding the right balance, so it's going to take a few tries.
For audio, use AAC with a sample rate of 44.1 kHz and a bitrate of 128 kbps if there is music in the audio, or a sample rate of 32 kHz and a bitrate of 96 kbps, which is pretty decent when there is only voice/narration, or even lower, as long as you're happy with the results.
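To get a feel for the resulting file sizes, size is roughly (video bitrate + audio bitrate) x duration. With the starting points suggested above and a made-up one-minute clip:

```python
# Rough file-size estimate from stream bitrates; the clip length is made up.
video_kbps = 1_000      # H.264 starting bitrate suggested above
audio_kbps = 128        # AAC with music
duration_s = 60

total_kbits = (video_kbps + audio_kbps) * duration_s
size_mb = total_kbits / 8 / 1024   # kbits -> kilobytes -> megabytes
print(f"{size_mb:.1f} MB per minute")
```

So each time you halve the video bitrate, you roughly halve the size of the clip, which is why the bitrate (not the container) is the knob to turn.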
I explained this process in an answer for a similar question - you can read it here.
Hope this helps! :)