Get total frame count from MPMoviePlayerController - iPhone

Is there any way to get total frames in a video before actually playing the video using MPMoviePlayerController?

The answer is pretty short and simple: no.
Reasoning: MPMoviePlayerController plays all sorts of content, and that range of formats includes HTTP streaming. For an HTTP video stream it is impossible to get a deterministic frame count upfront, as that value depends on the bandwidth chosen while the stream is being played.
You should clarify in your question which kind of content you are trying to get this information for; e.g. local video, streaming video, or progressively downloaded video. While you will always be out of luck with streaming video, you may have a chance with the other two options by using AVFoundation (I have not actually verified this; I just know the information is available, but I do not know offhand whether the iOS SDK helps you gather it).

About as close as you can get using the high-level MediaPlayer.framework classes would be to check the movie player's duration property (if it's set) and multiply it by the frame rate of the video. The next problem is finding the frame rate of the video; AVFoundation can supply that, as sketched below.
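If you can use AVFoundation, both pieces are available for local or progressively downloaded files. A minimal sketch (the file path is hypothetical, and this will not work for HTTP live streams):

import AVFoundation

let videoURL = URL(fileURLWithPath: "movie.mp4")  // hypothetical local file
let asset = AVAsset(url: videoURL)
let durationSeconds = CMTimeGetSeconds(asset.duration)

if let videoTrack = asset.tracks(withMediaType: .video).first {
    // nominalFrameRate is the track's stated frames per second; for
    // variable-frame-rate content it is only an approximation.
    let fps = videoTrack.nominalFrameRate
    let estimatedFrameCount = Int(durationSeconds * Double(fps))
    print("~\(estimatedFrameCount) frames (\(fps) fps over \(durationSeconds)s)")
}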

Related

Set start bitrate for adaptive stream in LibVLCSharp

I am using LibVLCSharp to play an adaptive video stream (HLS) transcoded by Azure Media Services with the EncoderNamedPreset.AdaptiveStreaming setting in my Xamarin.Forms app.
When viewing my video I notice that the first few (5-6) seconds of my video are very blurry.
This is probably because the player starts with a "safe" low bitrate, and after a few chunks of data have been downloaded, it determines that the bandwidth is sufficient for a higher-quality video and switches up in quality.
These first few seconds of low quality bother me, and I would rather have it start at a higher bitrate.
I would be happy if the video would switch up sooner (<2 seconds), which would probably mean I need a different encoder setting that produces smaller "chunks" of video.
But maybe the easier solution is to set the starting bitrate to a higher value.
I have seen this done on other media players in the form of
ams.InitialBitrate = ams.AvailableBitrates.Max<uint>();
Does LibVlc have a similar option?
This is probably because the player starts with a "safe" low bitrate, and after a few chunks of data have been downloaded, it determines that the bandwidth is sufficient for a higher-quality video and switches up in quality.
Probably. You could verify that assumption by checking the bitrate at the start of the video and again once the quality has improved, like this:
// Parsing the network media populates its track metadata, including bitrates.
await media.Parse(MediaParseOptions.ParseNetwork);
foreach (var track in media.Tracks)
{
    Debug.WriteLine($"{nameof(track.Bitrate)}: {track.Bitrate}");
}
Does LibVlc have a similar option?
Try this option:
--adaptive-logic={,predictive,nearoptimal,rate,fixedrate,lowest,highest}
Adaptive Logic
You can try it like new LibVLC("--adaptive-logic=highest").
See the docs for more info, or the forums. If that does not work, I'd set it server-side.

Is there a way to select the bit rate while using AVPlayer for HTTP live audio streaming?

I'm using AVPlayer to stream audio content delivered in two quality formats.
The problem is that when passing from the lower format to the higher one (done automatically by the framework when Wi-Fi is available) there is a delay in playback.
Is there a way to manually select a desired quality in order to prevent that delay?
It's possible now in iOS 8.
Check out preferredPeakBitRate on AVPlayerItem.
The following is copied from Apple's documentation:
The desired limit, in bits per second, of network bandwidth consumption for this item.
SWIFT: var preferredPeakBitRate: Double
OBJECTIVE-C: @property(nonatomic) double preferredPeakBitRate
Set preferredPeakBitRate to non-zero to indicate that the player should attempt to limit item playback to that bit rate, expressed in bits per second.
If network bandwidth consumption cannot be lowered to meet the preferredPeakBitRate, it will be reduced as much as possible while continuing to play the item.
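A minimal sketch, capping playback at a hypothetical 2 Mbps (the URL is made up for illustration):

import AVFoundation

let url = URL(string: "https://example.com/stream/index.m3u8")!  // hypothetical
let item = AVPlayerItem(url: url)

// Keep the player at or below ~2 Mbps so it never switches up to the
// higher-quality variant (and never incurs the switching delay).
item.preferredPeakBitRate = 2_000_000

let player = AVPlayer(playerItem: item)
player.play()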
Update: This was accurate at the time for iOS 4. For an updated iOS 8 answer, see here.
I've researched this very question for myself and have not found an answer, which means I'm pretty positive there is no way to do this. The Apple docs don't always give all the details of what you can do with things, but if you look at all the available properties, methods, etc., you will find that there is nothing that lets you tweak the stream.
I think this is the whole point of HLS. Apple wants iPhone users to have the best streaming experience possible. If they gave the developer controls to tweak which stream is being used, that would defeat the purpose. The system knows best when it comes to switching streams. If the phone cannot handle the additional bandwidth, then it won't (or shouldn't) switch to the higher stream. Some things that I have found that you may want to look at...
Are your files chunked into 10-second increments? If they're longer than that, you might want to shorten them.
Some file conversion programs don't get the bit rates exactly right, and if that is the case your phone may think it has the bandwidth for, say, a 96 kbps feed when in reality your feed is 115 kbps (see the playlist snippet below). Take a look at the accepted answer in this post: iPhone - App Rejected again, HTTP Live Streaming 64kbps baseline feed
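On that second point: the BANDWIDTH value declared for each variant in the master playlist is what the player compares against measured throughput, so it has to reflect the feed's real peak bit rate. A hypothetical master playlist (names and rates made up for illustration):

#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=96000
audio_96k/index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=128000
audio_128k/index.m3u8

If the 96k feed actually peaks at 115 kbps, the player is making its switching decisions from bad data.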
Use Pantomime, a lightweight framework for iOS, OS X and tvOS that can read and parse HTTP Live Streaming manifests.
Pantomime

What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?

I would like to stream video from an iPhone camera to an app running on a Mac. Think sorta like video chat but only one way, from the device to a receiver app (and it's not video chat).
My basic understanding so far:
You can use AVFoundation to get 'live' video camera data without saving to a file but it is uncompressed data and thus I'd have to handle compression on my own.
There's no built in AVCaptureOutput support for sending to a network location, I'd have to work this bit out on my own.
Am I right about the above or am I already off-track?
Apple Tech Q&A 1702 provides some info on saving off individual frames as images - is this the best way to go about this? Just saving off 30fps and then something like ffmpeg to compress 'em?
There's a lot of discussion of live streaming to the iPhone but far less info on people that are sending live video out. I'm hoping for some broad strokes to get me pointed in the right direction.
You can use AVCaptureVideoDataOutput and a sampleBufferDelegate to capture raw frames; then you just need to encode and stream them over the network. AVFoundation provides an API to encode frames to local video files, but doesn't provide any for streaming to the network. Your best bet is to find a library that streams frames over the network. I'd start with ffmpeg; I believe libavformat supports RTSP; look at the ffserver code.
Note that AVCaptureVideoDataOutput hands you uncompressed pixel buffers, so you will have to compress the frames yourself before sending them, and you want the hardware encoder for that where possible (e.g. AVAssetWriter writes hardware-encoded H.264, though only to a file); software-compressing raw video frames on the device is expensive.
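A minimal sketch of the capture side, with encoding and networking left as comments (production code should also check canAddInput/canAddOutput and camera permission):

import AVFoundation

final class FrameCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "capture.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per frame; the sample buffer wraps an uncompressed CVPixelBuffer.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Hand `pixelBuffer` to your encoder, then enqueue the encoded frame
        // for the network thread here.
        _ = pixelBuffer
    }
}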
This depends a lot on your target resolution and what type of frame rate performance you are targeting.
From an abstract point of view, I would probably have a capture thread filling a buffer directly from AVCaptureOutput, and a communications thread sending and re-zeroing the buffer (padded if need be) to a previously specified host every x milliseconds; a sketch of that layout follows below.
After you accomplish the initial data transfer, I would work on achieving 15 fps at the lowest resolution, then work my way up until the buffer overflows before the communication thread can transmit. Getting that right means balancing image resolution, buffer size (probably dependent on GSM, and soon-to-be CDMA, frame sizes), and finally the maximum rate at which you can transmit that buffer.
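A minimal sketch of that two-thread layout, assuming a hypothetical send(_:) that writes to an already-open connection:

import Foundation

final class FrameRelay {
    private var buffer = Data()
    private let lock = NSLock()
    private var timer: DispatchSourceTimer?

    // Capture thread: append each new chunk of (encoded) frame data.
    func append(_ chunk: Data) {
        lock.lock()
        buffer.append(chunk)
        lock.unlock()
    }

    // Communications thread: flush and re-zero the buffer every `intervalMs`.
    func start(intervalMs: Int) {
        let t = DispatchSource.makeTimerSource(queue: DispatchQueue(label: "relay.send"))
        t.schedule(deadline: .now(), repeating: .milliseconds(intervalMs))
        t.setEventHandler { [weak self] in
            guard let self = self else { return }
            self.lock.lock()
            let outgoing = self.buffer
            self.buffer.removeAll(keepingCapacity: true)  // re-zero the buffer
            self.lock.unlock()
            if !outgoing.isEmpty { self.send(outgoing) }
        }
        t.resume()
        timer = t
    }

    // Hypothetical: write `data` to the previously opened connection.
    private func send(_ data: Data) {
        // e.g. an OutputStream or socket write goes here.
    }
}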

Smallest format for videos in an iPhone app

I have a lot of videos I would like to embed into an app, and am currently just streaming them using a UIWebView browser I set up.
I know there are formats available for emailing videos, where videos can be around 6 MB or less.
What is the best way to do this for an iPhone app, keeping the picture quality decent at smaller file sizes?
thanks
It's not the file format (or container) that determines the file size, but the bitrate of the video stream chosen when compressing. Since you're going to use these in an iPhone app, I would go with .mov, since it's Apple's own format.
As for compression, it isn't really a topic that can be explained in one post, but long story short, the bitrate must be chosen according to the resolution of the video being compressed. Go for an H.264 multi-pass encoding, start with a bitrate of 1000 kbps, and see if you're satisfied with the results; then keep pushing the bitrate lower and lower until you get satisfying results at the lowest file size. It's really just a matter of finding the right balance, so it's going to take a few tries.
For audio, use AAC with a sample rate of 44.1 kHz and a bitrate of 128 kbps if there is music in the audio, or a sample rate of 32 kHz and a bitrate of 96 kbps, which is pretty decent when there's only voice/narration, or even lower, as long as you're happy with the results.
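If you end up compressing in-app rather than in a desktop encoder, those numbers map onto AVFoundation's encoder settings. A sketch of the settings dictionaries (values are the starting points above; note AVAssetWriter does single-pass rather than multi-pass encoding):

import AVFoundation

// Video: H.264 at ~1000 kbps for a 640x360 target (tune downward from here).
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 640,
    AVVideoHeightKey: 360,
    AVVideoCompressionPropertiesKey: [AVVideoAverageBitRateKey: 1_000_000]
]

// Audio: AAC at 44.1 kHz / 128 kbps (use 32 kHz / 96 kbps for voice-only).
let audioSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 128_000
]

let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)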
I explained this process in an answer to a similar question - you can read it here.
Hope this helps! :)

Live streaming on iPhone?

I have read a lot of posts about live streaming on the iPhone, but none of them really works.
The project I want to work out is as follows:
There is a MUTE movie playing in a movie theater. I want to get the time code (the position at which it is playing) over Wi-Fi and have an iPhone/iPod Touch play/stream an audio track at the same time code.
May I ask how to achieve this?
UPDATE: Latency is expected and will be taken into consideration. A small time difference is acceptable in this case.
The variable nature of a wireless connection and the latency involved will completely obliterate the video/audio sync you are trying to achieve.