iPhone video camera streaming

Can anyone confirm how an app like PocketCam is done?
Is the way to capture the camera's video stream to use AVFoundation?
http://developer.apple.com/iphone/library/documentation/AVFoundation/Reference/AVCaptureVideoDataOutput_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40009544
To be clear, I don't want to capture the video and save it; I want to stream it over a wireless network as an IP camera.
thanks,

I can't confirm how PocketCam works; however, as of iOS 4, AVFoundation is the correct way to do this. You will receive a callback with each frame, and at that point you would push the frame as an image to a server listening on the network waiting for frames. Keep in mind that you could be receiving a lot of frames, and you may not have enough bandwidth to push all of them, depending on the quality/size/resolution of the frames. Apple has a technical note that discusses how to implement frame capture with AVFoundation.
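I can't say this is how PocketCam does it, but a rough sketch of that frame callback might look like the following, assuming the AVCaptureVideoDataOutput's videoSettings request kCVPixelFormatType_32BGRA and that sendFrameData: is a hypothetical method of yours that writes the bytes to your server's socket. The sample-buffer-to-UIImage conversion follows Apple's documented bitmap-context approach.

    // AVCaptureVideoDataOutputSampleBufferDelegate callback.
    // Needs AVFoundation, CoreMedia, CoreVideo and UIKit; the output must be
    // configured for kCVPixelFormatType_32BGRA so the bitmap context matches.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void  *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width       = CVPixelBufferGetWidth(imageBuffer);
        size_t height      = CVPixelBufferGetHeight(imageBuffer);

        // Wrap the BGRA pixel data in a CGImage so it can be JPEG-compressed.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        UIImage *frame = [UIImage imageWithCGImage:cgImage];
        NSData *jpeg = UIImageJPEGRepresentation(frame, 0.5); // trade quality for bandwidth

        [self sendFrameData:jpeg]; // hypothetical: push the bytes to the listening server

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
    }

Whether you send individual JPEGs like this or feed the frames to a real video encoder is the bandwidth/latency trade-off mentioned above.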

Related

Record video in cocos2d iOS game, low resolution for video and high resolution for normal cases

I am using cocos2d's CCRenderTexture to record video of my game. Recording video at Retina display resolution costs a lot of CPU and memory, so I want to use a low resolution for the video recording but keep the Retina resolution for normal gameplay. Is that possible?
I've tried "[[CCDirector sharedDirector] enableRetinaDisplay:NO];" while recording, but it doesn't seem to work; the generated output is totally wrong.
This is not feasible.
You'd have to render each frame twice, once on the screen, then onto the render texture. A serious drop in framerate is inevitable even if you lower the resolution of the render texture somehow.
The reason is simply that you'll also have to write each render texture as an image to flash memory. This is extremely slow. You'll also end up with a huge amount of data. If each (PNG/JPG) image file ends up being a reasonably small 50 KB then one second of recorded data at 60 fps will consume 3 Megabytes of flash memory. One minute would be around 180 Megabytes.
To record a demo of your game, most games follow the simple principle of recording the user input and then playing that input back as if the user had issued those commands. This requires careful planning, no breaking changes when updating the app (or old demos become invalid), and no non-deterministic randomizers (i.e., none seeded with the current time).
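That input-recording approach is mostly bookkeeping. A minimal sketch of the idea follows; the class name, the event dictionary layout, and applyEvent: are all illustrative, not part of cocos2d.

    #import <Foundation/Foundation.h>
    #import <QuartzCore/QuartzCore.h> // for CACurrentMediaTime()

    // Records timestamped input events during play and replays them later by
    // re-dispatching them to the game logic at the same time offsets.
    @interface DemoRecorder : NSObject
    - (void)startRecording;
    - (void)recordEventNamed:(NSString *)name payload:(NSDictionary *)payload;
    - (void)replay;
    - (void)applyEvent:(NSDictionary *)event; // feed the event back into your game logic
    @end

    @implementation DemoRecorder {
        NSMutableArray *_events;
        NSTimeInterval  _startTime;
    }

    - (void)startRecording {
        _events = [NSMutableArray array];
        _startTime = CACurrentMediaTime();
    }

    // Call this from your touch/button handlers while recording.
    - (void)recordEventNamed:(NSString *)name payload:(NSDictionary *)payload {
        [_events addObject:@{ @"t": @(CACurrentMediaTime() - _startTime),
                              @"name": name,
                              @"payload": payload ?: @{} }];
    }

    - (void)replay {
        for (NSDictionary *event in _events) {
            NSTimeInterval delay = [event[@"t"] doubleValue];
            dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(delay * NSEC_PER_SEC)),
                           dispatch_get_main_queue(), ^{
                [self applyEvent:event];
            });
        }
    }

    - (void)applyEvent:(NSDictionary *)event {
        // Dispatch to whatever game code normally handles this input.
    }

    @end

The deterministic-randomness caveat is the hard part: the replay only matches the original play if every random decision the game makes comes from a seed you store alongside the events.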
If you need to record a demo for making a trailer video, there are plenty of screen-grabbing solutions around. Some even specialize in grabbing iPhone video, either from the device (usually requiring a source code/library component) or from the Simulator.
You should check out the Kamcord SDK for recording gameplay. See http://kamcord.com/
Kamcord has built-in gameplay video and audio recording technology for iOS. It allows you, the game developer, to capture gameplay videos with an API. Your users can then replay and share these gameplay videos via YouTube, Facebook, Twitter, and email.

Capture video without displaying the actual video feed

So I have an application that can currently capture video with the front-facing iPhone camera and then do some processing on the video feed in real time. What I'm trying to do, however, is make this process run in the background and put other controls onscreen. For example, say I'd like to run the camera and process the image feed, but I want the user to see a black screen with some buttons on it. Any ideas on how to do this?
Just so we get terminology right, by "in the background", you mean running the camera capture while your application is in the foreground, but not displaying the actual video feed. This is possible, but I wanted to make clear that if you move your whole application into the background you will not have access to the camera then.
There are a few ways to do this, but the one that I've spent the most time with is grabbing frames of video (or photos) via AV Foundation. Using an AVCaptureDevice and AVCaptureSession, you can grab the frames of video and route them to an encoder for saving to disk or for processing using your own custom code. None of this requires the camera feed to be displayed onscreen, so you can put up whatever interface you like and do this video recording or photo capture without any onscreen indication.
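For what it's worth, a rough sketch of that kind of session setup, with no preview layer at all, might look like this. Error handling is omitted, self is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate, and the queue name is arbitrary.

    // Front camera capture with nothing displayed onscreen: frames arrive in
    // captureOutput:didOutputSampleBuffer:fromConnection:, and the UI can show
    // whatever you like because no AVCaptureVideoPreviewLayer is ever created.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPreset640x480;

    AVCaptureDevice *frontCamera = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionFront) frontCamera = device;
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    if (input && [session canAddInput:input]) [session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("com.example.cameraframes", NULL)];
    if ([session canAddOutput:output]) [session addOutput:output];

    [session startRunning]; // keep a strong reference to the session, or capture stops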
I would caution that you should make it explicit to your users what you are doing, so that you do not run the risk of violating someone's privacy. Apple does not react kindly to those who do this (for good reason).
I encapsulate a lot of this within my open source GPUImage video and photo processing framework, so you could look at the code for the GPUImageVideoCamera class there to see how I configure the capture inputs. I hand the video frames off to OpenGL ES for the application of filters and other processing operations, but you could ignore that portion of it if you just wanted to do your own encoding or processing.
Here's example code from Apple's documentation:
http://developer.apple.com/library/ios/#samplecode/PhotoPicker/Introduction/Intro.html
It also shows a way to customize the camera interface.

Playing a real time video stream from iPhone camera on a 20 second delay

I am trying to see if it is possible to record a video from the iPhone's camera and write this to a file. I then want the video to start playing on the screen a set time after. This all needs to happen continuously. For example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like for his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad always 20 seconds behind what is recorded. This needs to continually run until practice is over. (I would connect the iPad to the TV either with a cable or AirPlay to an Apple TV). The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from it at the same time to accomplish this? Is there a better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
I would start by looking at what AVCaptureVideoDataOutput and its delegate methods provide (that's where you can get the frame data from).
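Sketching the buffer idea a little further (the names and numbers here are mine, not an established API): raw frames for 20 seconds would not fit in memory, so the assumption below is that each frame is compressed, e.g. to JPEG data, in the capture callback before being pushed into a fixed-size sliding window.

    // Holds the most recent `capacity` frames (e.g. 20 s * 30 fps = 600).
    // pushFrame: is called from the capture callback; oldestFrame returns the
    // frame from ~20 seconds ago once the window has filled up.
    @interface FrameRingBuffer : NSObject
    - (instancetype)initWithCapacity:(NSUInteger)capacity;
    - (void)pushFrame:(NSData *)frameData;
    - (NSData *)oldestFrame;
    @end

    @implementation FrameRingBuffer {
        NSMutableArray *_frames;
        NSUInteger      _capacity;
    }

    - (instancetype)initWithCapacity:(NSUInteger)capacity {
        if ((self = [super init])) {
            _capacity = capacity;
            _frames = [NSMutableArray arrayWithCapacity:capacity];
        }
        return self;
    }

    - (void)pushFrame:(NSData *)frameData {
        @synchronized (self) {
            [_frames addObject:frameData];
            if (_frames.count > _capacity) [_frames removeObjectAtIndex:0]; // drop the oldest frame
        }
    }

    - (NSData *)oldestFrame {
        @synchronized (self) {
            // nil until the full delay has elapsed, then always ~capacity frames behind live
            return (_frames.count >= _capacity) ? [_frames objectAtIndex:0] : nil;
        }
    }

    @end

A timer or display link on the playback side would then read the oldest frame at the capture frame rate and draw it, so the on-screen image stays a fixed number of frames behind the camera.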

iPhone Dev: Process each frame of a live recording movie for AR app?

I'm doing research into AR on the iPhone and am trying to figure out how people are getting each frame of video. I want to do the AR using computer vision (OpenCV): basically I will have a pattern on a piece of paper that I will find using OpenCV and place a graphic on top of the pattern.
I know about UIImagePickerController for capturing movies, but I'm unsure how you would go about getting at each frame.
Can someone point me in the right direction?
UIImagePickerController is the means for displaying a camera view and taking single pictures with a camera-like front end. It's not what you're looking for.
Instead you need to look into AVFoundation, particularly the classes surrounding AVCaptureSession. You'll want to acquire a meaningful AVCaptureDevice (which can be the front or back camera on the iPhone 4 and current iPod Touch), create an AVCaptureDeviceInput that references it and add that as an input to an AVCaptureSession. Then just create an AVCaptureVideoDataOutput and set it up with a meaningful delegate and a Grand Central Dispatch dispatch queue.
Once you start the session, you'll receive delegate callbacks on the queue you created, providing CMSampleBufferRefs from which you can pull a CVImageBufferRef and hence the pixel data.
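A minimal sketch of that callback, assuming the output's videoSettings request kCVPixelFormatType_32BGRA and that detectMarkerInPixels:... is your own (hypothetical) OpenCV-based routine:

    // AVCaptureVideoDataOutputSampleBufferDelegate callback: runs on the GCD
    // queue you passed to setSampleBufferDelegate:queue:.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        unsigned char *baseAddress = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t width       = CVPixelBufferGetWidth(pixelBuffer);
        size_t height      = CVPixelBufferGetHeight(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

        // baseAddress points at width x height BGRA pixels (rows are bytesPerRow apart);
        // this is exactly the kind of buffer you would wrap and hand to OpenCV.
        [self detectMarkerInPixels:baseAddress width:width height:height bytesPerRow:bytesPerRow];

        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    }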

Replacing FaceTime video calls on iOS 4

I would like to change what the camera 'captures' to something else during a video call.
Let's say I have an image that I want to be seen on the other side instead of the video sent from the camera.
I want to 'hack' the camera on the iPhone and get control over the data being sent.
Is this feasible?
Not without jailbreaking and a lot of work inside MobilePhone.app.
Start by running class-dump-z on /Applications/MobilePhone.app/MobilePhone.