How to encode and stream a video from an iPhone

I'm capturing CGImageRef frames using UIGetScreenImage() on an iPhone 3GS (OS 3.1.3) and want to stream them over the Internet in any appropriate, well-known format.
What video codec libraries are available that would let me encode the raw pixel data into a video? (I'm not too familiar with video codecs, but I am familiar with the network programming needed to actually send the data out.)

I wouldn't recommend encoding on the client. Have the video already encoded on the server and use MPMoviePlayerController to present it.
Here's a good tutorial on how to stream video:
http://buildmobilesoftware.com/2010/08/09/how-to-stream-videos-on-the-iphone-or-ipad/

Related

how to stream high quality video and audio from iPhone to remote server through Internet

I am seeking the following three items, which I cannot find on Stack Overflow or anywhere else:
1. Sample code for AVFoundation capture into file chunks (~10 seconds) that are ready for compression
2. Sample code for compressing the video and audio for transmission across the Internet (ffmpeg?)
3. Sample code for HTTP Live Streaming, sending files from the iPhone to an Internet server
My goal is to use the iPhone as a high-quality AV camcorder that streams to a remote server.
If the intervening data rate bogs down, files should buffer on the iPhone.
Thanks.
You can use AVAssetWriter to encode an MP4 file of your desired length; the AV media will be encoded into the container as H.264/AAC. You could then simply upload this to a remote server.
If you wanted, you could segment the video for HLS streaming, but keep in mind that HLS is designed as a server-to-client streaming protocol; there is no notion of push as far as I know. You would have to create a custom server to accept pushed segmented video streams, which does not really make a lot of sense given the way HLS is designed (see the RFC draft).
A better approach might be to simply upload the MP4(s) via a TCP socket and have your server segment the video for streaming to client viewers. This could easily be done with FFmpeg, either on the command line or via a custom program.
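As a sketch of that server-side segmentation step, FFmpeg's hls muxer can split an uploaded MP4 into a playlist plus TS segments without re-encoding (the file names here are placeholders):

```shell
# Split upload.mp4 into ~10-second TS segments plus an m3u8 playlist,
# copying the existing H.264/AAC streams instead of re-encoding them.
ffmpeg -i upload.mp4 -c copy -f hls -hls_time 10 -hls_list_size 0 stream.m3u8
```

Because `-c copy` avoids a transcode, segmentation is cheap enough to run per upload even on a modest server.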
I also want to add that if you try to stream 720p video over a cellular connection, your app will more than likely get rejected for excessive data use.
Capture video and audio using AVFoundation. You can specify the audio and video codecs (kCMVideoCodecType_H264 and kAudioFormatMPEG4AAC), as well as frame sizes and frame rates, in the capture format description. That will give you compressed H.264 video and AAC audio.
Encapsulate this and transmit it to the server using an RTP server such as Live555 Media Server.

iOS SDK mms video streaming

Basically, I want to stream and play an MMS video with the iOS SDK. I can stream some videos with MPMoviePlayer, but not MMS or RTSP.
I researched this, but I couldn't find a clear solution. Can anybody help me?
I tried VLC Mobile: http://wiki.videolan.org/MobileVLC
Dropcam: https://github.com/dropcam/dropcam_for_iphone
But I am unable to use these options.
You should use the ffmpeg library, as it can connect to nearly any streaming server (supporting RTSP, MMS, TCP, UDP, RTMP, ...) and then draw the pictures to the screen (for drawing you can use OpenGL ES; UIImage also works).
First of all, use avformat_open_input to connect to your IP address, then use avcodec_find_decoder and avcodec_open2 to find the codecs and open them (you should call them for both audio and video).
Then, in a loop, read packets from the server using av_read_frame. When you get a frame: if it is audio, send it to an AudioUnit or AudioQueue; if it is video, convert it from YUV to RGB format using sws_scale and draw the picture to the screen.
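For intuition about what sws_scale is doing, here is the per-pixel BT.601 (video-range) YUV-to-RGB math as a minimal C sketch. In practice you would still call sws_scale, which does this in bulk with SIMD and also handles chroma subsampling:

```c
#include <assert.h>

/* Clamp an intermediate value into the 0..255 byte range, rounding. */
static int clamp8(double v) {
    if (v < 0)   return 0;
    if (v > 255) return 255;
    return (int)(v + 0.5);
}

/* Convert one BT.601 video-range YUV sample to 8-bit RGB.
 * Video range means Y spans 16..235 rather than 0..255. */
static void yuv_to_rgb(int y, int u, int v, int *r, int *g, int *b) {
    double c = 1.164 * (y - 16);   /* expand luma to full range */
    double d = u - 128;            /* center the chroma planes  */
    double e = v - 128;
    *r = clamp8(c + 1.596 * e);
    *g = clamp8(c - 0.392 * d - 0.813 * e);
    *b = clamp8(c + 2.017 * d);
}
```

For example, video black is Y=16 with U=V=128, which maps to RGB (0, 0, 0).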
That's all.
Also look at this wrapper (http://www.videostreamsdk.com); it's built on the ffmpeg library and supports iOS.
You can also take a look at Apple's HTTP Live Streaming. Some docs here.

How to effectively transfer real-time video between two iOS devices (like FaceTime, Skype, Fring, Tango)

I know how to get frames from the iOS SDK
(How to capture video frames from the camera as images using AV Foundation: http://developer.apple.com/library/ios/#qa/qa1702/_index.html).
That gives me pixel data, and I can convert it to JPEG.
The way I want to transfer the video is like this:
On iOS device A:
Get the pixels or JPEG from the callback
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
Encode to H.264 using existing technology (ffmpeg)
Encapsulate the video in a TS stream
Run an HTTP server and wait for requests
On the other iOS device, B:
Make an HTTP request to A (simply using HTTP instead of RTP/RTSP)
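Regarding the "encapsulate in a TS stream" step: MPEG-TS is a sequence of fixed 188-byte packets, each starting with a 0x47 sync byte and carrying a 13-bit PID that identifies the elementary stream. A minimal C parse of the fixed 4-byte header (illustrative only, not a full demuxer):

```c
#include <assert.h>
#include <stdint.h>

/* Parsed fields from the fixed 4-byte MPEG-TS packet header. */
typedef struct {
    int sync_ok;            /* first byte is the 0x47 sync byte    */
    int payload_unit_start; /* PUSI flag: packet starts a new PES  */
    int pid;                /* 13-bit packet identifier            */
} ts_header;

/* Decode the header at the front of a 188-byte TS packet. */
static ts_header parse_ts_header(const uint8_t *p) {
    ts_header h;
    h.sync_ok            = (p[0] == 0x47);
    h.payload_unit_start = (p[1] >> 6) & 1;          /* bit 6 of byte 1  */
    h.pid                = ((p[1] & 0x1F) << 8) | p[2]; /* 5 + 8 PID bits */
    return h;
}
```

A muxer writes this header in front of each 184-byte slice of the H.264 PES data; a player resynchronizes on the 0x47 bytes and routes packets by PID.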
So my question is: do I need to use ffmpeg to get an H.264 stream, or can I get one from an iOS API?
If I use ffmpeg to encode to H.264 (libx264), how do I do that? Is there any sample code or guideline?
I've read the post What's the best way of live streaming iphone camera to a media server?
It's a pretty good discussion, but I want to know the details.
The license for ffmpeg is incompatible with iOS applications distributed through the App Store.
If you want to transfer real-time video with any kind of usable frame rate, you won't want to use HTTP or TCP.
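To illustrate the alternative, here is a minimal POSIX UDP send in C; the address and port are placeholders, and a real app would send RTP-framed H.264 NAL units rather than raw bytes:

```c
#include <arpa/inet.h>
#include <stdint.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Send one datagram of frame data; returns bytes sent or -1 on error.
 * Unlike TCP, a lost UDP packet never stalls delivery of later frames
 * behind a retransmit, which is why real-time video prefers it. */
static ssize_t send_frame(const char *host, int port,
                          const void *data, size_t len) {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) return -1;

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_port   = htons((uint16_t)port);
    inet_pton(AF_INET, host, &addr.sin_addr);

    ssize_t n = sendto(fd, data, len, 0,
                       (struct sockaddr *)&addr, sizeof addr);
    close(fd);
    return n;
}
```

The trade-off is that the receiver must tolerate loss and reordering, which is exactly what RTP's sequence numbers and timestamps are for.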
Although this doesn't directly answer your question about which video format to use, I would suggest looking into third-party frameworks like TokBox or QuickBlox. There is a fantastic tutorial using Parse and OpenTok here:
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/

Stream video from the server {iPhone SDK}

I'm trying to stream video from a static IP, http://38.117.88.148/GemTVLink,
on the iPhone. Can you give me some information on how I could implement this? I see an Apple video streaming app, but it seems it can show only .mp4 movies; am I right?
I want my app to load the HTTP address and play the movie, that's it.
This link works in Media Player.
The iPhone does not support all streaming video formats.
You should start by reading HTTP Live Streaming Overview
The iOS native video players (AVPlayer, MPMoviePlayerViewController, ...) can stream from an HTTP server using the m3u8 format.
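For reference, m3u8 is just a text playlist format; a minimal HLS media playlist looks like this (the segment file names are placeholders):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST
```

The player fetches this file over plain HTTP, then downloads each listed .ts segment in order; for a live stream, the server keeps appending segments and omits #EXT-X-ENDLIST.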
I looked at the link you mentioned (GemTVLink); it's an MMS stream, and iOS cannot stream from Microsoft streaming servers (MMS). If you want to do that, you should use the ffmpeg library, as it can connect to nearly any streaming server (supporting RTSP, MMS, TCP, UDP, RTMP, ...) and then draw the pictures to the screen (for drawing you can use OpenGL ES; UIImage also works).
First of all, use avformat_open_input to connect to your IP address, then use avcodec_find_decoder and avcodec_open2 to find the codecs and open them (you should call them for both audio and video).
Then, in a loop, read packets from the server using av_read_frame. When you get a frame: if it is audio, send it to an AudioUnit or AudioQueue; if it is video, convert it from YUV to RGB format using sws_scale and draw the picture to the screen.
That's all.
Also look at this wrapper (http://www.videostreamsdk.com); it's built on the ffmpeg library and supports iOS.

encoding H.264 video (or similar) on the iPhone directly?

What's the best way to encode video (with audio) on the iPhone? It looks like QTKit isn't available... so I might have to link against ffmpeg, but ffmpeg doesn't look like it encodes H.264 (judging from its home page).
If it is possible, I'm also curious how fast I can expect it to perform on the ARM. I imagine it might take minutes to encode a 20sec movie.
Both ffmpeg and MEncoder will encode H.264 video when combined with x264, but I'd imagine getting it all running on the iPhone would be an absolute nightmare, let alone the performance once you've got it running.
A while ago I wrote an AVI encoder for the iPhone that used raw file I/O. I just started work on a QuickTime encoder that encodes BMP data into a QuickTime container. If it's H.264 you want to encode, I would try making a server that uses QTKit and having your app connect to that for conversion.