Smallest format for videos in an iPhone app

I have a lot of videos I would like to embed into an app; currently I'm just streaming them through a UIWebView browser I set up.
I know there are formats used for emailing videos, where a clip can be around 6 MB or less.
What is the best way to do this for an iPhone app, keeping the picture quality reasonable while getting smaller file sizes?
Thanks.

It's not the file format (or container) that determines the file size, but the bitrate chosen for the video stream when compressing. Since you're going to use these in an iPhone app, I would go with .mov, since it's Apple's own QuickTime format.
As for compression, it isn't really a topic that can be covered in one post, but long story short: the bitrate must be chosen according to the resolution of the video being compressed. Go for an H.264 multi-pass encode, start with a bitrate of 1000 kbps and see if you're satisfied with the result, then keep pushing the bitrate lower until you get the most satisfying result with the lowest file size. It's really just a matter of finding the right balance, so it will take a few tries.
For audio, use AAC with a sample rate of 44.1 kHz and a bitrate of 128 kbps if there is music in the audio, or a sample rate of 32 kHz and a bitrate of 96 kbps, which is pretty decent when there's only voice/narration; you can go even lower, as long as you're happy with the result.
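For reference, this is roughly what such a two-pass encode looks like if you script it from C# on your desktop before bundling the files (a sketch only: it assumes ffmpeg is installed and on the PATH, and input.mov / output.mp4 are placeholder names):
using System.Diagnostics;

// Pass 1: analysis only -- no audio, output discarded (use NUL instead of /dev/null on Windows).
Process.Start("ffmpeg",
    "-y -i input.mov -c:v libx264 -b:v 1000k -pass 1 -an -f null /dev/null")?.WaitForExit();

// Pass 2: the real encode -- 1000 kbps H.264 video plus 128 kbps / 44.1 kHz AAC audio.
Process.Start("ffmpeg",
    "-i input.mov -c:v libx264 -b:v 1000k -pass 2 -c:a aac -b:a 128k -ar 44100 output.mp4")?.WaitForExit();
Re-run the second pass with -b:v 800k, 600k and so on, and stop when the quality/size trade-off is no longer acceptable.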
I explained this process in an answer for a similar question - you can read it here.
Hope this helps! :)

Related

Set start bitrate for adaptive stream in LibVLCSharp

I am using LibVLCSharp to play an adaptive video stream (HLS) transcoded by Azure Media Services with the EncoderNamedPreset.AdaptiveStreaming setting in my Xamarin.Forms app.
When viewing my video I notice that the first few (5-6) seconds of my video are very blurry.
This is probably because the player starts with a "safe" low bitrate, and after a few chunks of data have been downloaded it determines that the bandwidth is sufficient for a higher-quality rendition and switches up in quality.
These first few seconds of low quality bother me, and I would rather have it start at a higher bitrate.
I would be happy if the video would switch up sooner (<2 seconds), which would probably mean I need a different encoder setting that produces smaller "chunks" of video.
But maybe the easier solution is to set the starting bitrate to a higher value.
I have seen this done on other media players in the form of
ams.InitialBitrate = ams.AvailableBitrates.Max<uint>();
Does LibVLC have a similar option?
This is probably because the player starts with a "safe" low bitrate, and after a few chunks of data have been downloaded it determines that the bandwidth is sufficient for a higher-quality rendition and switches up in quality.
Probably. You could verify that assumption by checking the bitrate at the start of the video and again once the quality has improved, like this:
await media.Parse(MediaParseOptions.ParseNetwork);
foreach (var track in media.Tracks)
{
    Debug.WriteLine($"{nameof(track.Bitrate)}: {track.Bitrate}");
}
Does LibVLC have a similar option?
Try the --adaptive-logic option:
--adaptive-logic={,predictive,nearoptimal,rate,fixedrate,lowest,highest}    (Adaptive Logic)
You can pass it when creating the LibVLC instance: new LibVLC("--adaptive-logic=highest").
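A minimal LibVLCSharp sketch of that (the HLS manifest URL is a placeholder; whether the top rendition really gets picked from the first segment still depends on libVLC's adaptive logic):
using LibVLCSharp.Shared;

Core.Initialize();

// Pass the switch to the LibVLC instance itself; it applies to every player created from it.
var libVLC = new LibVLC("--adaptive-logic=highest");
var player = new MediaPlayer(libVLC);

// Placeholder Azure Media Services HLS manifest URL.
var media = new Media(libVLC, "https://example.com/manifest(format=m3u8-aapl)", FromType.FromLocation);
player.Play(media);
If the option takes effect, the track-dump snippet above should report the higher bitrate right from the start.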
See the docs for more info, or the forums. If that does not work, I'd set it server-side.

Bandwidth estimation / calculation

I'm working on a new project (a web page) where people can upload their creations (in video format). Before starting to code I need to make a plan and do some calculations. I know video streaming "eats" a lot of bandwidth, which is why I need to estimate the bandwidth for each video at an acceptable quality.
I know streaming a full-HD (1920×1080) video "eats" about 7-9 megabits per second per client. So I'm trying to find the best compromise: less bandwidth at an acceptable quality.
What's the best acceptable quality (dimensions)? 860p or higher?
I found a pretty good company here in my country where I could colocate a dedicated server with a 1 Gbit/s connection at an acceptable price. How many video streams could that bandwidth handle?
The best acceptable quality depends on the display and the resizing algorithms used: a well-encoded 360p video will often look great to most people on a large 1080p display if it's upscaled well, and on a 640p phone display even 160p might look great to most people.
It also depends immensely on the codecs used, as well as on the video content itself (high motion requires more bits to encode well).
There is no real answer to this, and you haven't given a great starting point for even a rule-of-thumb answer. Sorry, but you'll need to encode and evaluate videos at different dimensions, bitrates and codecs and decide for yourself what quality loss is acceptable; "best acceptable quality" is an entirely subjective question.
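The bandwidth half of the question does have a back-of-envelope answer, though. Assuming the 1 Gbit/s port mentioned in the question and roughly 8 Mbit/s per full-HD viewer:
// Rough capacity estimate; the numbers are assumptions taken from the question above.
double linkMbps = 1000.0;        // a 1 Gbit/s port
double perViewerMbps = 8.0;      // middle of the quoted 7-9 Mbit/s full-HD range
double usableFraction = 0.75;    // keep ~25% headroom for protocol overhead and bursts

int concurrentViewers = (int)(linkMbps * usableFraction / perViewerMbps);
System.Console.WriteLine(concurrentViewers);   // ~93 simultaneous full-HD viewers
Serving a lower-bitrate rendition at half that rate roughly doubles the number of viewers, which is exactly why the encoding decisions matter so much.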

Most efficient way to format a large number of audio files

I am currently developing an app that will contain a large amount of audio, around 60-120 minutes, mostly voice files. My question is really about the best way to store them: for example, one single large file, separate audio files, or download-as-needed cached files.
Any suggestions on file format?
These are the audio formats that iPhone OS plays back with a hardware codec, so they should take the least power to play; other formats are decoded in software.
These formats are:
AAC
ALAC (Apple Lossless)
MP3
Whether to have the audio distributed with the app or separately would depend on the use. If you could reasonably expect the user to go through the material sequentially, you may want to allow the user to download part by part or stream the audio to let them conserve space on their device, while if the audio is more random access, you'd probably want it all on the device.
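If you go the download-as-needed route, the caching part is only a few lines in any language; here is a rough C# sketch (the base URL, file name and cache folder are placeholders, and error handling is omitted):
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Download a chapter on first use and cache it locally so it is only fetched once.
static async Task<string> GetChapterAsync(string fileName)
{
    string cachePath = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), fileName);

    if (!File.Exists(cachePath))
    {
        using var http = new HttpClient();
        byte[] data = await http.GetByteArrayAsync("https://example.com/audio/" + fileName);
        await File.WriteAllBytesAsync(cachePath, data);
    }
    return cachePath;   // hand this path to the audio player
}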
Several apps, including Apple's own, appear to use the open source speex codec for compressed voice-quality audio, even though this seems not to be supported by the hardware or any public API.
As Joachim suggested, you can choose from the AAC/ALAC/MP3 audio formats. What I'd propose now is to also consider the issue from a user experience point of view:
Convert all your audio to the chosen format, with quality options that satisfy you and your potential users.
Next, calculate the total size of all your files and ask yourself: "are X megabytes too much to bundle for my kind of app?" and "will that big/small app bundle encourage users to download my app?"
Optionally, play a bit with the quality options to shrink the files (iterate).
In the next step, decide (based on your app's characteristics) whether to bundle all the files. For example, a game is expected to have all its files in place and can be big (users accept that). If your app has e.g. podcasts only, then select the best one and bundle it; once the user is hooked they can download the rest (let the user trigger that), so the files are stored on the device. Also tell the user how much data they are about to download, and warn them if the file is reasonably big and they're not on Wi-Fi, or introduce an option to download only on Wi-Fi.
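If the app happens to be built with something like Xamarin.Forms, that Wi-Fi check is a one-liner with Xamarin.Essentials; a rough sketch (the file size and threshold are arbitrary example values):
using System.Linq;
using Xamarin.Essentials;

// Before a large download, check whether the user is on Wi-Fi and how big the file is.
bool onWifi = Connectivity.ConnectionProfiles.Contains(ConnectionProfile.WiFi);
long downloadSizeMb = 120;   // known from your backend / content catalogue

if (!onWifi && downloadSizeMb > 50)
{
    // Warn the user (e.g. via an alert) and offer to wait until they are on Wi-Fi.
}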
I hope that sounds reasonable.
The following applies to voice; for music the approach would be quite different.
Since it's just voice, you can reduce the sample rate significantly in the majority of cases. Try something in the 8-20 kHz range.
If the files are multichannel, downmix them; mono should be fine for voice.
Once that's done, I'd recommend AAC for the best balance of size and quality.
Do some listening tests on your devices, tweak the settings if needed, and then batch process/convert them all. That can reduce your sizes by a factor of ten or more if the sources are 16-bit/44.1 kHz.
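As a concrete example of that batch step, a throwaway script along these lines does the job (assuming ffmpeg is available on your machine; the folder name, 16 kHz and 48 kbps are just illustrative values within the ranges above):
using System.Diagnostics;
using System.IO;

// Batch-convert voice masters: downmix to mono, drop the sample rate, encode to AAC at 48 kbps.
foreach (string src in Directory.GetFiles("voice_masters", "*.wav"))
{
    string dst = Path.ChangeExtension(src, ".m4a");
    Process.Start("ffmpeg",
        $"-y -i \"{src}\" -ac 1 -ar 16000 -c:a aac -b:a 48k \"{dst}\"")?.WaitForExit();
}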
Unless the files are very small (e.g. a few seconds each) or you have to open and read many of them quickly, I wouldn't bother with one huge file. A few MB per file is a good size for many cases.

What's the best way of live streaming the iPhone camera to a media server?

According to this: What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it is possible to get compressed data from the iPhone camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.
So the questions are:
1) How to get compressed frames and audio from iPhone's camera?
2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?
Any help will be really appreciated.
Thanks.
You most likely already know....
1) How to get compressed frames and audio from iPhone's camera?
You cannot do this. The AVFoundation API has prevented this from every angle. I even tried named pipes and some other sneaky Unix foo. No such luck. You have no choice but to write to a file. In your linked post a user suggests setting up a callback to deliver encoded frames; as far as I am aware this is not possible for H.264 streams. The capture delegate will deliver images encoded in a specific pixel format. It is the movie writers and AVAssetWriter that do the encoding.
2) Is encoding uncompressed frames with ffmpeg's API fast enough for real-time streaming?
Yes, it is. However, you will have to use libx264, which gets you into GPL territory, and that is not exactly compatible with the App Store.
I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.
I agree with Steve. I'd add that if you try this with Apple's API, you're going to have to do some seriously nasty hacking. AVAssetWriter by default spends a second before spilling its buffer to file, and I haven't found a way to change that with settings. The way around it seems to be to force small file writes and file closes by using multiple AVAssetWriters, but that introduces a lot of overhead. It's not pretty.
Definitely file a new feature request with Apple (if you're an iOS developer). The more of us that do, the more likely they'll add some sort of writer that can write to a buffer and/or to a stream.
One addition I'd make to what Steve said on the x264 GPL issue: I think you can get a commercial license for it, which avoids the GPL but of course costs money. That means you could still use it, get pretty decent results, and not have to open up your own app's source. Not as good as an augmented Apple API using their hardware codecs, but not bad.

What is the smallest audio file format?

I know this is not a specific programming question, but I hope someone can give me a suggestion. My applications (iPhone and BlackBerry applications) use a lot of audio files, and I need a solution in order to save some space.
Is it right that .aac is the most suitable audio format for the iPhone? Is it the smallest one? Is it also suitable for BlackBerry?
Is there any way to make the audio files smaller without losing much sound quality? How about the bitrate, sampling frequency and channels? Do they really matter?
AAC is a good format for the iPhone; iOS is optimized to play AAC.
Yes, things like bitrate, sampling frequency and number of channels are all factors in the audio file's size.
What you should do is take your audio and convert it to different formats with different settings and then just play them on a real device to see if the quality is acceptable.
Sorry, there is no simple answer. Experiment.
It depends on what type of audio you're encoding. For speech, AMR is supported by all major smartphones and will generally give the smallest file sizes. The quality degradation is noticeable enough that it's not suitable for music, but it's optimized for voice recording (the voice notes app on the BlackBerry uses it as its file format), so it'll give you very nice results with spoken audio.