iPhone streaming debugging information

I'm looking for a way (it doesn't need to be App Store safe!) to get hold of video-streaming-relevant debugging information.
What I'm trying to do is write an application that opens a video stream and displays information like:
framerate
audio / video bitrate
codec information
etc.
Basically, I want to display as much information as possible for any given stream.
Thanks for any information in advance,
best regards
sam

Even though you tagged your question with MPMoviePlayerController, that class probably isn't going to help you very much. First of all, there's a limited amount of information you can access from it at a high level, certainly nothing about codecs or audio bitrate. And even if the class does store this type of information somehow, your app would be disqualified from the iTunes App Store if you accessed non-public methods or properties.
Secondly, MPMoviePlayerController only supports a limited number of codecs itself, namely the ones that can be decoded in hardware on the iPhone/iPad (H.264 baseline and MPEG-4 videos).
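To give you an idea of how little is exposed, this is roughly everything the public API will tell you about a loaded movie (a sketch, assuming iOS 3.2+ where these properties exist):

// 'player' is an MPMoviePlayerController you've already created and loaded
NSTimeInterval length   = player.duration;          // total length, once known
NSTimeInterval buffered = player.playableDuration;  // how much has been buffered
CGSize size             = player.naturalSize;       // pixel dimensions of the video
// ...and that's about it; no codec, bitrate, or framerate properties.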
Anyway, a good option could be FFmpeg for the iPhone. Getting the information you need seems to be much more straightforward that way; check out this blog post for a nice tutorial on using the libraries.
I'm not sure about the potential legal issues of distributing such a program in the App Store, but statically linking it into your binary would at least satisfy Apple; you'll have to check the FFmpeg legal site for their end.
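To make that concrete, here's a minimal sketch (untested; it assumes you've already cross-compiled the FFmpeg libraries for iOS, and on older FFmpeg versions the per-stream codec fields live on st->codec rather than st->codecpar):

#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

void dump_stream_info(const char *url) {
    AVFormatContext *fmt = NULL;
    av_register_all();        // required on older FFmpeg releases
    avformat_network_init();  // needed for network streams
    if (avformat_open_input(&fmt, url, NULL, NULL) != 0)
        return;               // couldn't open the stream
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return;
    }
    av_dump_format(fmt, 0, url, 0);  // logs container, codec, and bitrate info
    for (unsigned i = 0; i < fmt->nb_streams; i++) {
        AVStream *st = fmt->streams[i];
        printf("stream %u: codec=%s bitrate=%lld fps=%.2f\n", i,
               avcodec_get_name(st->codecpar->codec_id),
               (long long)st->codecpar->bit_rate,
               av_q2d(st->avg_frame_rate));
    }
    avformat_close_input(&fmt);
}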

Related

iOS Advanced Audio API for decompressing format

On iOS, is it possible to get the user's audio stream in a decompressed format? For example, an MP3 returned as WAV data that can be used for audio analysis? I'm relatively new to the iOS platform, and I remember seeing that this wasn't possible in older iOS versions. I read that iOS 4 brought in some advanced APIs, but I'm not sure where I can find documentation/samples for them.
If you don't mind using APIs available in iOS 4.1 and above, you could try the AVAssetReader class and friends. In this similar question you have a full example of how to extract video frames. I would expect the same to work for audio, and the nice thing is that the reader deals with all the details of decompression. You can even do composition with AVComposition to merge several streams.
These classes are part of AVFoundation, which allows not only reading but also creating your own content.
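As a rough sketch of the idea (assuming iOS 4.1+ and a songURL you're allowed to read; error handling omitted), decompressing an audio track to linear PCM looks something like this:

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:songURL options:nil];
NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
// asking for linear PCM makes the reader hand you decompressed samples
NSDictionary *settings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track outputSettings:settings];
[reader addOutput:output];
[reader startReading];
CMSampleBufferRef sample;
while ((sample = [output copyNextSampleBuffer]) != NULL) {
    // the PCM bytes live in the sample's CMBlockBuffer; analyze them here
    CFRelease(sample);
}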
Apple has an OpenAL example at http://developer.apple.com/library/mac/#samplecode/OpenALExample/Introduction/Intro.html where Scene.m should interest you.
The Apple documentation has a diagram in which the Core Audio framework clearly shows that it gives you MP3 out. It also states that you can access audio units directly at a lower level if you need to.
The same Core Audio document also gives some information about using MIDI, if that helps you.
Edit:
You're in luck today.
In this example an audio file is loaded and fed into an AudioUnit graph. You could fairly easily write an AudioUnit of your own to put into this graph and analyze the PCM stream as you see fit. You can even do the analysis in the render callback function, although that's probably not a good idea because callbacks are encouraged to be as simple as possible.
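If writing a full custom AudioUnit feels like too much, one lighter-weight option (a sketch; the tap function name is mine, and someUnitInYourGraph stands for whichever unit you pulled out of the AUGraph) is to hang a render notification off an existing unit and peek at the PCM as it flows past:

static OSStatus MyRenderTap(void *inRefCon,
                            AudioUnitRenderActionFlags *ioActionFlags,
                            const AudioTimeStamp *inTimeStamp,
                            UInt32 inBusNumber,
                            UInt32 inNumberFrames,
                            AudioBufferList *ioData)
{
    // only inspect the buffers after the unit has rendered into them
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        // ioData->mBuffers[0].mData holds inNumberFrames of PCM; keep this cheap
    }
    return noErr;
}

// in your setup code, once the graph is initialized:
AudioUnitAddRenderNotify(someUnitInYourGraph, MyRenderTap, NULL);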

What's the best way of live streaming iphone camera to a media server?

According to this question, What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it is possible to get compressed data from the iPhone camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.
So the questions are:
1) How to get compressed frames and audio from iPhone's camera?
2) Is encoding uncompressed frames with FFmpeg's API fast enough for real-time streaming?
Any help will be really appreciated.
Thanks.
You most likely already know....
1) How to get compressed frames and audio from iPhone's camera?
You cannot do this. The AVFoundation API has prevented this from every angle. I even tried named pipes and some other sneaky Unix foo. No such luck: you have no choice but to write it to file. In your linked post a user suggests setting up a callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate will deliver images encoded in a specific pixel format; it is the movie writers and AVAssetWriter that do the encoding.
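To make the limitation concrete, here's a sketch of the usual AVCaptureVideoDataOutput setup; no matter what you configure, the delegate only ever sees raw pixel buffers (captureSession is assumed to be set up elsewhere):

// setup: ask for BGRA pixel buffers (there is no compressed option here)
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[captureSession addOutput:output];

// delegate: every frame arrives as an uncompressed CVPixelBuffer
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // this is never H.264; if you want compressed frames, you have to encode
    // pixelBuffer yourself (e.g. with ffmpeg/x264)
}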
2) Is encoding uncompressed frames with FFmpeg's API fast enough for real-time streaming?
Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.
I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.
I agree with Steve. I'd add that in trying this with Apple's APIs, you're going to have to do some seriously nasty hacking. AVAssetWriter by default spends a second before spilling its buffer to file, and I haven't found a way to change that with settings. The way around it seems to be to force small file writes and file closes by using multiple AVAssetWriters, but that introduces a lot of overhead. It's not pretty.
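Very roughly, the rotation hack looks like this (a sketch only; currentWriter, currentInput, startNewWriter and uploadSegmentAtURL: are hypothetical pieces of your own code, not API):

// close out the current writer every couple of seconds, hand the finished
// segment to whatever streams it to the server, then spin up the next writer
- (void)rotateWriter
{
    AVAssetWriter *finished = self.currentWriter;
    [self.currentInput markAsFinished];
    [finished finishWriting];                      // blocks; iOS 6+ adds an async variant
    [self uploadSegmentAtURL:finished.outputURL];  // hypothetical: push segment to server
    [self startNewWriter];                         // hypothetical: next writer/input pair
}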
Definitely file a new feature request with Apple (if you're an iOS developer). The more of us that do, the more likely they'll add some sort of writer that can write to a buffer and/or to a stream.
One addition I'd make to what Steve said on the x264 GPL issue: I think you can get a commercial license for it, which is better than GPL but of course costs you money. That means you could still use it, get pretty decent results, and not have to open up your own app's source. Not as good as an augmented Apple API using their hardware codecs, but not bad.

iPhone SDK: Is it possible to process audio file from local library

Well, I will try my best not to make this an 'I just want the code' question...
I'm recently working on a project which requires some audio signal processing from local music files (e.g. iTunes Library). The whole work includes:
Get the PCM data of an audio file (normally from the iTunes library); <-- AudioQueue (?)
Write the PCM data to a new file (it seems that Apple does not allow direct modification of music tracks); <-- Core Audio (?)
Do some processing and modification, like filters, manipulators, etc.; <-- will be developed in C++
Play the processed track. <-- RemoteIO
The problem is, after going through some blogs and discussions:
http://lists.apple.com/archives/coreaudio-api/2009/Aug/msg00100.html, http://atastypixel.com/blog/using-remoteio-audio-unit/
http://osdir.com/ml/coreaudio-api/2009-08/msg00093.html
as well as the official sample code, I got the feeling that the Core Audio SDK only lets us apply audio processing to voice demos recorded from the mic.
My questions are:
Can I get raw data from iTunes library tracks instead of mic input?
If the answer to the first question is 'no', is there a way to 'fool' the SDK into thinking it is getting data from the mic input rather than from iTunes? (I have done some similar 'hacking' in C# before XD)
If the whole processing just doesn't work, can anyone provide some alternative ideas?
Any help will be appreciated. Thank you very much :-)
Just found something really cool yesterday.
From iPhone Media Library to PCM Samples in Dozens of Confounding, Potentially Lossy Steps
(http://www.subfurther.com/blog/?p=1103)
And also a class library (MIT-licensed):
TSLibraryImport: an Objective-C class + sample code for importing files from the user's iPod library in iOS 4.
(http://bitbucket.org/artgillespie/tslibraryimport/changeset/a81838f8c78a)
Hope they help!
Cheers,
Manca
1) No. Apple does not allow direct access to the PCM data of songs; otherwise you could create music-sharing apps, which is not in Apple's interest.
2) No. Hacking it and getting approved is impossible due to Apple's app review process.
3) The only alternative I can think of is to do the processing part on a PC/Mac and then transfer the result to the iPhone. Or you could store the files in your own application's folder; you should be able to load and process those via Core Audio.
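For files in your own folder, the ExtAudioFile API will do the decoding to PCM for you; a minimal sketch (error checks omitted, fileURL assumed to point at a file in your sandbox):

#import <AudioToolbox/AudioToolbox.h>

ExtAudioFileRef file;
ExtAudioFileOpenURL((CFURLRef)fileURL, &file);

// describe the decoded client format you want handed back to you
AudioStreamBasicDescription pcm = {0};
pcm.mSampleRate       = 44100.0;
pcm.mFormatID         = kAudioFormatLinearPCM;
pcm.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
pcm.mChannelsPerFrame = 2;
pcm.mBitsPerChannel   = 16;
pcm.mBytesPerFrame    = 4;  // 2 channels * 2 bytes per sample
pcm.mFramesPerPacket  = 1;
pcm.mBytesPerPacket   = 4;
ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                        sizeof(pcm), &pcm);

// then pull decoded frames in a loop with ExtAudioFileRead(...), process them,
// and write results to a second file with ExtAudioFileWrite
ExtAudioFileDispose(file);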
I know this thread is old but... did this work for you, Manca? And did this app get approved?
EDIT: just discovered the AVAssetReader class, introduced in iOS 4.1, which should help.

How do I extract a screenshot from a video in the iPhone SDK?

I'd like to be able to take screenshots at set intervals from an MPEG video recorded using the iPhone camera.
I've seen a few ways to do this, namely compiling and using FFmpeg (Using FFMPEG library with iPhone SDK for video encoding); however, it seems quite difficult to comply with the LGPL (http://ffmpeg.org/legal.html) for commercial use.
This entry from their legal FAQ pretty much makes it useless to us:
Q: Is it perfectly alright to incorporate the whole FFmpeg core into my own commercial product?
A: You might have a problem here. There have been cases where companies have used FFmpeg in their products. These companies found out that once you start trying to make money from patented technologies, the owners of the patents will come after their licensing fees. Notably, MPEG LA is vigilant and diligent about collecting for MPEG-related technologies.
Is there any other way? Or, simply by accessing the rendering layer of an MPEG, am I going to be "making money from patented technologies"?
As usual - any help on this would be greatly appreciated.
Cheers!
Yes, you can do it; if I'm not wrong, since iOS 3.2... at least for the videos you have in your library. After loading the movie into your MPMoviePlayerController object, do this:

UIImage *aThumbnail = [player thumbnailImageAtTime:timeCode
                                        timeOption:MPMovieTimeOptionExact];
// timeCode is a time within the video's length, for example 3.12 seconds
// player is the MPMoviePlayerController object
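So for your "set intervals" requirement you could just loop over the time codes once the movie's duration is known (this call is synchronous, so do it off the main thread for long videos):

for (NSTimeInterval t = 0.0; t < player.duration; t += 5.0) {  // every 5 seconds
    UIImage *frame = [player thumbnailImageAtTime:t
                                       timeOption:MPMovieTimeOptionExact];
    // save or display 'frame' here
}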
Unfortunately there is no official way to grab image frames from the camera in realtime.
I encourage you to file a bug report / feature request with Apple. Many people want this; if enough people request a specific feature, they might consider actually putting it in.

Encode an Array of Images into a movie file? (iPhone)

My app takes time-lapse photos and also records audio to go with them. The problem is, I have absolutely no idea how to go about turning this into a .mov/.mpeg file (I am new to this type of iPhone development). I have heard some things about FFmpeg, but apparently its license doesn't cover the public distribution of iPhone apps. Anyone have any suggestions?
You can use Theora, aka VP3. It is free to use in any application and has a pretty decent quality/bitrate ratio.
I do not know whether the parts of FFmpeg necessary to do this are GPL or not, but there are parts of FFmpeg that are LGPL-licensed.
They have a legal page that covers this in detail, so FFmpeg might be worth a closer look.
FFmpeg itself can be used in iPhone apps distributed on the App Store; see Wunderradio as an example: http://www.wunderradio.com/code.html
BUT... I am experimenting with it right now, and I am kind of disappointed with the quality of the result (not to mention that encoding is sloooow on the iPhone). It seems to me that without the x264 library it is impossible to create MPEG-4 videos with decent quality. And x264 is GPL-licensed, so if you use it, you must disclose the full source of your project. (Or has anyone figured out how to select a usable codec from the LGPL'd FFmpeg?)
What I don't understand is that the App Store now has a lot of video-editing apps. How do they work? I made a pretty thorough search and couldn't find any MPEG-4 codec with a permissive enough license. Do they violate the GPL? Do they use private APIs? I really don't believe they built a homebrew MPEG-4 encoder.