Raw H.264 NALU hardware decode on iOS (iPhone)

I receive raw H.264 NALUs from an IP camera (via Live555) and I want to decode them using the hardware, because FFmpeg is great but software decoding is too slow (the camera sensor is large, so the frames are high resolution).
The only solution I see is to write the NALUs to some movie container file such as MPEG-4, and then read and decode that file using an AVAssetReader.
Am I off in the weeds? Is anyone having success decoding H.264 NALUs from a stream? Does anyone have any tips for writing NALUs to an MPEG-4 file? Other ideas?

Like Matt mentioned, there is no direct access to Apple's H.264 decoder.
However, I have had success with FFmpeg and H.264 decoding. Like you mentioned, I built FFmpeg under the LGPL, and I was able to decode H.264 streams in real time, all the way up to HD, with no latency on both iPad and iPhone. Nothing fancy is required from FFmpeg; you can find plenty of standard decoding C/C++ code that will work just fine on iOS. In my case, the H.264 NALUs were delivered via RTP/RTSP in real time.
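For reference, here's a minimal sketch of that decode loop against libavcodec (LGPL build). I'm using the current send/receive function names; older builds used avcodec_decode_video2, but the structure is the same, and the struct and function names I define here are just illustrative:

    // Minimal H.264 Annex B decode loop with libavcodec (LGPL build).
    #include <libavcodec/avcodec.h>

    typedef struct {
        AVCodecContext *ctx;
        AVFrame *frame;
        AVPacket *pkt;
    } H264Decoder;

    int decoder_init(H264Decoder *d) {
        const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
        if (!codec) return -1;
        d->ctx = avcodec_alloc_context3(codec);
        if (avcodec_open2(d->ctx, codec, NULL) < 0) return -1;
        d->frame = av_frame_alloc();
        d->pkt = av_packet_alloc();
        return 0;
    }

    // Feed one NALU, start code included. The input buffer should have
    // AV_INPUT_BUFFER_PADDING_SIZE zero bytes of padding after the data.
    // Returns 1 when a decoded frame is available in d->frame.
    int decoder_feed(H264Decoder *d, uint8_t *nalu, int size) {
        d->pkt->data = nalu;
        d->pkt->size = size;
        if (avcodec_send_packet(d->ctx, d->pkt) < 0) return -1;
        if (avcodec_receive_frame(d->ctx, d->frame) == 0)
            return 1;  // d->frame holds YUV420P planes, ready to display
        return 0;      // decoder needs more input (e.g. SPS/PPS first)
    }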
Also, if I were you I would run your app through Xcode Instruments to see where your bottleneck really is, but I would be highly surprised if it were the FFmpeg decoding step. Hopefully this info helps.

Unfortunately, you cannot do this at present. Feel free to file a Radar with Apple asking for this sort of access to the hardware decoder; it'll certainly be resolved as a duplicate :-). I assume licensing is the reason they can't give this sort of access to the hardware codec.
So, you're going to have to use a software decoder. Be aware that if you're going to ship to the App Store, you need something with a non-GPL license (unless you want to open-source your app as well).

Related

What's the best way of live streaming the iPhone camera to a media server?

According to this question, What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?, it is possible to get compressed data from the iPhone camera, but as I've been reading in the AVFoundation reference, you only get uncompressed data.
So the questions are:
1) How do I get compressed frames and audio from the iPhone's camera?
2) Is encoding uncompressed frames with FFmpeg's API fast enough for real-time streaming?
Any help will be really appreciated.
Thanks.
You most likely already know....
1) How do I get compressed frames and audio from the iPhone's camera?
You cannot do this. The AVFoundation API prevents it from every angle. I even tried named pipes and some other sneaky Unix foo; no such luck. You have no choice but to write to a file. In your linked post, a user suggests setting up a callback to deliver encoded frames. As far as I am aware, this is not possible for H.264 streams. The capture delegate delivers images in a specific uncompressed pixel format; it is the movie writers and AVAssetWriter that do the encoding.
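To make that concrete, here's a sketch of what the delegate actually gives you: an uncompressed CVPixelBuffer that you can read through the C-level CoreMedia/CoreVideo calls. It assumes you requested a BGRA pixel format on the capture output, and error checking is omitted:

    // The capture delegate hands you uncompressed pixels, not H.264.
    #include <CoreMedia/CoreMedia.h>
    #include <CoreVideo/CoreVideo.h>

    // Called with the CMSampleBufferRef the capture delegate receives.
    void read_raw_frame(CMSampleBufferRef sample) {
        CVImageBufferRef pixels = CMSampleBufferGetImageBuffer(sample);
        CVPixelBufferLockBaseAddress(pixels, kCVPixelBufferLock_ReadOnly);
        void  *base   = CVPixelBufferGetBaseAddress(pixels);
        size_t height = CVPixelBufferGetHeight(pixels);
        size_t stride = CVPixelBufferGetBytesPerRow(pixels);
        // base[0 .. stride*height) is raw BGRA data (given a BGRA capture
        // format). This is what you would hand to your own encoder.
        (void)base;
        CVPixelBufferUnlockBaseAddress(pixels, kCVPixelBufferLock_ReadOnly);
    }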
2) Is encoding uncompressed frames with FFmpeg's API fast enough for real-time streaming?
Yes, it is. However, you will have to use libx264, which gets you into GPL territory. That is not exactly compatible with the App Store.
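If you do go down that road, the x264 setup is roughly this. A hedged sketch: the resolution, fps, and ultrafast/zerolatency preset are illustrative choices for real-time work on a phone, not requirements:

    // Minimal real-time x264 encoder setup (GPL!).
    #include <stdint.h>
    #include <x264.h>

    x264_t *open_encoder(int w, int h, int fps) {
        x264_param_t p;
        // ultrafast/zerolatency keeps CPU load and delay as low as x264 allows
        x264_param_default_preset(&p, "ultrafast", "zerolatency");
        p.i_width   = w;
        p.i_height  = h;
        p.i_fps_num = fps;
        p.i_fps_den = 1;
        return x264_encoder_open(&p);
    }

    // Encode one I420 frame already loaded into pic (x264_picture_alloc'd,
    // with pic->i_pts set). Returns the byte count; *out points at the NALUs.
    int encode_frame(x264_t *enc, x264_picture_t *pic, uint8_t **out) {
        x264_nal_t *nals;
        int n_nals;
        x264_picture_t pic_out;
        int bytes = x264_encoder_encode(enc, &nals, &n_nals, pic, &pic_out);
        if (bytes > 0 && n_nals > 0)
            *out = nals[0].p_payload;  // x264 lays the NALUs out contiguously
        return bytes;
    }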
I would suggest using AVFoundation and AVAssetWriter for efficiency reasons.
I agree with Steve. I'd add that if you try it with Apple's API, you're going to have to do some seriously nasty hacking. AVAssetWriter by default buffers for a second before spilling to file, and I haven't found a way to change that through its settings. The way around it seems to be to force small file writes and file closes by using multiple AVAssetWriters, but that introduces a lot of overhead. It's not pretty.
Definitely file a new feature request with Apple (if you're an iOS developer). The more of us that do, the more likely they'll add some sort of writer that can write to a buffer and/or to a stream.
One addition I'd make to what Steve said on the x264 GPL issue: I think you can get a commercial license for x264, which avoids the GPL but of course costs money. That would let you use it and get pretty decent results without having to open up your own app's source. Not as good as an augmented Apple API using their hardware codecs, but not bad.

Streaming live H.264 video via RTSP to the iPhone does work! (with example)

Using FFmpeg, Live555, JSON
I'm not sure how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects: FFmpeg, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras viewable with the free Dropcam For iPhone app on the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which suggests that some of the stream is tunneled.
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iPhone.
Thanks for the info TinC0ils. After digging a little deeper, I've read that they have modified the Axis camera with custom firmware to limit the streaming to a single 320x240 H.264 feed, to better provide consistent video quality over different networks and, as you point out, to be less of a draw on the phone's hardware. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the inherent latency of HTTP Live Streaming. I think Dropcam has done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: THE RTSP PROTOCOL DOES WORK ON THE IPHONE!
They are using open source projects to receive the frames and decode them in software, instead of using the hardware decoders. This will work; however, it runs counter to Apple's requirement that you use their HTTP Live Streaming. It also requires more CPU, so it may not decode video at the desired fps/resolution on older devices, and it will drain the battery faster than HTTP streaming would.

Encode an Array of Images into a movie file? (iPhone)

My app takes time-lapse photos, and also records audio to go with them. The problem is, I have absolutely no idea how to go about turning that into a .mov/.mpeg file (I am new to this type of iPhone development). I have heard some things about FFmpeg, but apparently the license doesn't cover the public distribution of iPhone apps. Anyone have any suggestions?
You can use Theora, aka VP3. It is free to use in any application and has a pretty decent quality/bitrate ratio.
I do not know whether the parts of FFmpeg necessary to do this are GPL or not, but there are parts of FFmpeg that are LGPL-licensed.
They have a legal page that covers this in detail, so FFmpeg might be worth a closer look.
FFmpeg itself can be used in iPhone apps distributed on the App Store. See wunderradio as an example: http://www.wunderradio.com/code.html
BUT... I am experimenting with it right now and I am somewhat disappointed with the quality of the result (not to mention that encoding is slooow on the iPhone). It seems to me that without the x264 library it is impossible to create MPEG-4 videos with decent quality, and x264 is GPL-licensed, so if you use it you must disclose the full source of your project. (Or did anyone figure out how to select a usable codec from the LGPL parts of FFmpeg?)
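For what it's worth, FFmpeg's native MPEG-4 Part 2 encoder lives in the LGPL part of libavcodec, so it can be selected without x264. A sketch using modern API names (the quality per bitrate will not match x264, as noted above):

    // Select FFmpeg's built-in (LGPL) MPEG-4 Part 2 encoder.
    #include <libavcodec/avcodec.h>

    AVCodecContext *open_mpeg4_encoder(int w, int h, int fps, int64_t bitrate) {
        const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MPEG4);
        if (!codec) return NULL;
        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        ctx->width     = w;
        ctx->height    = h;
        ctx->time_base = (AVRational){1, fps};
        ctx->pix_fmt   = AV_PIX_FMT_YUV420P;
        ctx->bit_rate  = bitrate;  // quality lives or dies by this number
        if (avcodec_open2(ctx, codec, NULL) < 0) {
            avcodec_free_context(&ctx);
            return NULL;
        }
        return ctx;
    }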
What I don't understand is that the App Store now has a lot of video editing apps. How do they work? I did a pretty thorough search and couldn't find any MPEG-4 codec with a permissive enough license. Do they violate the GPL? Do they use private APIs? I really don't believe they built a homebrew MPEG-4 encoder.

Encoding H.264 video (or similar) on the iPhone directly?

What's the best way to encode video (with audio) on the iPhone? It looks like QTKit isn't available, so I might have to link with FFmpeg, but FFmpeg doesn't look like it encodes H.264 on its own (judging from its home page).
If it is possible, I'm also curious how fast I can expect it to perform on the ARM. I imagine it might take minutes to encode a 20-second movie.
Both ffmpeg and mencoder will encode H.264 video when combined with x264, but I'd imagine getting it all running on the iPhone would be an absolute nightmare, let alone getting acceptable performance once you've got it running.
A while ago I wrote an AVI encoder for the iPhone that used raw file I/O. I just started work on a QuickTime encoder that encodes BMP data into a QuickTime container. If it is H.264 you want to encode, I would try making a server that uses QTKit and having your app connect to that for conversion.

Lossy compressed format to raw PCM on iPhone

I want to start with an audio file of a modest filesize, and finish with an array of unsigned chars that can be loaded into OpenAL with alBufferData. My trouble is the steps that happen in the middle.
I thought AAC would be the way to go, but according to Apple representative Rincewind (circa 12/08):
Currently hardware assisted compression formats are not supported for decode on iPhone OS. These formats are AAC, MP3 and ALAC.
Using ExtAudioFile with a client format set generates PERM errors, so he's not making things up.
So, brave knowledge-havers, what are my options here? Package the app with .wav's and just suck up having a massive download? Write my own decoder?
Any links to resources or advice you might have would be greatly appreciated.
Offline rendering of compressed audio is now possible; see Apple's QA1562.
While Vorbis and the others suggested are good, they can be fairly slow on the iPhone as there is no hardware acceleration.
One codec that is natively supported (though with only a 4:1 compression ratio) is ADPCM, aka ima4. It's handled through the ExtAudioFile interface and is only the tiniest bit slower than loading .wav files directly.
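A minimal sketch of that path, assuming a mono 44.1 kHz ima4 .caf and omitting error checking (every call below returns an OSStatus worth inspecting in real code):

    // Decode an ima4 (IMA ADPCM) file to 16-bit PCM and load it into OpenAL.
    #include <stdlib.h>
    #include <AudioToolbox/AudioToolbox.h>
    #include <OpenAL/al.h>

    ALuint load_ima4(CFURLRef url) {
        ExtAudioFileRef f;
        ExtAudioFileOpenURL(url, &f);

        // Ask ExtAudioFile to hand us signed 16-bit mono PCM at 44.1 kHz.
        AudioStreamBasicDescription pcm = {0};
        pcm.mSampleRate       = 44100;
        pcm.mFormatID         = kAudioFormatLinearPCM;
        pcm.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                              | kAudioFormatFlagIsPacked;
        pcm.mChannelsPerFrame = 1;
        pcm.mBitsPerChannel   = 16;
        pcm.mFramesPerPacket  = 1;
        pcm.mBytesPerFrame    = 2;
        pcm.mBytesPerPacket   = 2;
        ExtAudioFileSetProperty(f, kExtAudioFileProperty_ClientDataFormat,
                                sizeof(pcm), &pcm);

        SInt64 frames = 0;
        UInt32 propSize = sizeof(frames);
        ExtAudioFileGetProperty(f, kExtAudioFileProperty_FileLengthFrames,
                                &propSize, &frames);

        UInt32 bytes = (UInt32)frames * pcm.mBytesPerFrame;
        AudioBufferList list;
        list.mNumberBuffers = 1;
        list.mBuffers[0].mNumberChannels = 1;
        list.mBuffers[0].mDataByteSize   = bytes;
        list.mBuffers[0].mData           = malloc(bytes);

        UInt32 toRead = (UInt32)frames;
        ExtAudioFileRead(f, &toRead, &list);
        ExtAudioFileDispose(f);

        // Exactly the alBufferData step from the question.
        ALuint buf;
        alGenBuffers(1, &buf);
        alBufferData(buf, AL_FORMAT_MONO16, list.mBuffers[0].mData,
                     bytes, 44100);
        free(list.mBuffers[0].mData);
        return buf;
    }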
There are some good open source audio decoding libraries that you could use:
mpg123
FAAC
Both are licensed under the LGPL, meaning you can use them in closed-source applications provided that modifications to the library itself, if any, are open-sourced.
You could always make your wave files mono and cut your file size in half, but that might not be the best alternative for you.
Another option for doing your own decoding would be Ogg Vorbis. There's even a low-memory version of their library for integer processors called "Tremor".
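The decode loop looks roughly like this with vorbisfile; a sketch, and Tremor's ov_read drops the three output-format arguments (it always yields native-endian signed 16-bit), but the loop is otherwise the same:

    // Decode an .ogg file to interleaved 16-bit PCM (ready for alBufferData).
    #include <vorbis/vorbisfile.h>

    long decode_ogg(const char *path, char *pcm_out, long max_bytes) {
        OggVorbis_File vf;
        if (ov_fopen(path, &vf) < 0) return -1;

        long total = 0;
        int section = 0;
        while (total < max_bytes) {
            // args: little endian (0), 2-byte samples (2), signed (1)
            long got = ov_read(&vf, pcm_out + total,
                               (int)(max_bytes - total), 0, 2, 1, &section);
            if (got <= 0) break;  // 0 = end of file, < 0 = stream error
            total += got;
        }
        ov_clear(&vf);
        return total;  // bytes of PCM decoded
    }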