Playing a real-time video stream from the iPhone camera on a 20-second delay

I am trying to see if it is possible to record video from the iPhone's camera and write it to a file, and then have the video start playing on the screen a set amount of time later. This all needs to happen continuously. For example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like for his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad always 20 seconds behind what is recorded. This needs to continually run until practice is over. (I would connect the iPad to the TV either with a cable or AirPlay to an Apple TV). The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from it at the same time to accomplish this? Is there a better way to accomplish this?
Thanks for your time.

Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
I would start by looking at what AVCaptureVideoDataOutput and its delegate methods provide (that's where you can get the frame data from).
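
To make the circular-buffer idea concrete, here is a minimal Objective-C sketch of the capture side (assuming ARC). DelayedCaptureController, the 600-frame cap, and the JPEG compression are illustrative choices of mine, not anything from the question; holding 20 seconds of frames in memory is the hard part, so in practice you may need to lower the resolution or frame rate, or spill compressed segments to disk.

    #import <AVFoundation/AVFoundation.h>
    #import <CoreImage/CoreImage.h>
    #import <UIKit/UIKit.h>

    // Hypothetical controller: captures frames and keeps the last ~20 seconds in memory.
    @interface DelayedCaptureController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @property (nonatomic, strong) CIContext *ciContext;
    @property (nonatomic, strong) NSMutableArray *frameBuffer; // the circular buffer (JPEG NSData per frame)
    @end

    @implementation DelayedCaptureController

    - (void)startCapture {
        self.frameBuffer = [NSMutableArray array];
        self.ciContext = [CIContext contextWithOptions:nil];

        self.session = [[AVCaptureSession alloc] init];
        self.session.sessionPreset = AVCaptureSessionPreset640x480; // keep frames small

        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (input) [self.session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        dispatch_queue_t queue = dispatch_queue_create("delayed.capture.frames", DISPATCH_QUEUE_SERIAL);
        [output setSampleBufferDelegate:self queue:queue];
        [self.session addOutput:output];

        [self.session startRunning];
    }

    // Delegate callback, invoked once per captured frame on the serial queue.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // Copy the frame out of the capture pipeline's buffer pool and compress it;
        // holding hundreds of raw pool-backed buffers would stall the camera.
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CGImageRef cgImage = [self.ciContext createCGImage:ciImage fromRect:[ciImage extent]];
        NSData *jpeg = UIImageJPEGRepresentation([UIImage imageWithCGImage:cgImage], 0.5);
        CGImageRelease(cgImage);
        if (jpeg) [self.frameBuffer addObject:jpeg];

        // At ~30 fps, 20 seconds is roughly 600 frames. Once the buffer is full,
        // pop the oldest frame and hand it to whatever view renders the delayed feed.
        if (self.frameBuffer.count > 600) {
            NSData *oldest = [self.frameBuffer objectAtIndex:0];
            [self.frameBuffer removeObjectAtIndex:0];
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImage *delayedFrame = [UIImage imageWithData:oldest];
                // e.g. self.delayedImageView.image = delayedFrame;  (hypothetical image view)
                (void)delayedFrame;
            });
        }
    }

    @end

Playback is then just a matter of drawing the popped frames into an image view (or a window on the external display) at the capture frame rate.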

Related

Record video in a cocos2d iOS game: low resolution for the video, high resolution for normal gameplay

I am using cocos2d's CCRenderTexture to record video of my game. But recording video at Retina display resolution costs a lot of CPU and memory, so I want to use a low resolution for the video recording while keeping Retina resolution for normal gameplay. Is this possible?
I've tried "[[CCDirector sharedDirector] enableRetinaDisplay:NO];" while recording video, but it does not seem to work; the generated output is totally wrong.
This is not feasible.
You'd have to render each frame twice, once on the screen, then onto the render texture. A serious drop in framerate is inevitable even if you lower the resolution of the render texture somehow.
The reason is simply that you'll also have to write each render texture as an image to flash memory, which is extremely slow. You'll also end up with a huge amount of data: if each (PNG/JPG) image file is a reasonably small 50 KB, then one second of recorded data at 60 fps consumes 3 megabytes of flash memory, and one minute would be around 180 megabytes.
To record a demo of your game, most games follow the simple principle of recording the user input and then playing it back as if the user had issued those commands. This requires careful planning, no breaking changes when updating the app (or else old demos are invalidated), and no non-deterministic randomizers (i.e. randomizers seeded with the current time).
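
As a rough illustration of the record-the-input approach (not tied to cocos2d or any particular engine), here is a hedged Objective-C sketch; DemoRecorder, InputEvent, and recordTouchAtLocation: are hypothetical names, and wiring this into your game's touch handlers is left out.

    #import <Foundation/Foundation.h>
    #import <CoreGraphics/CoreGraphics.h>

    // Hypothetical record of a single input event: where it happened and when,
    // relative to the start of the recording.
    @interface InputEvent : NSObject
    @property (nonatomic, assign) CGPoint location;
    @property (nonatomic, assign) NSTimeInterval timestamp;
    @end

    @implementation InputEvent
    @end

    @interface DemoRecorder : NSObject
    @property (nonatomic, strong) NSMutableArray *events;
    @property (nonatomic, assign) NSTimeInterval recordingStart;
    @end

    @implementation DemoRecorder

    - (void)startRecording {
        self.events = [NSMutableArray array];
        self.recordingStart = [NSDate timeIntervalSinceReferenceDate];
    }

    // Call this from wherever the game already handles touches (e.g. its touch-began handler).
    - (void)recordTouchAtLocation:(CGPoint)location {
        InputEvent *event = [[InputEvent alloc] init];
        event.location = location;
        event.timestamp = [NSDate timeIntervalSinceReferenceDate] - self.recordingStart;
        [self.events addObject:event];
    }

    // Replay: re-issue each recorded event at its original offset. The game logic
    // must be deterministic (fixed random seed, no time-based randomness) for the
    // replay to look identical to the original session.
    - (void)replayWithHandler:(void (^)(CGPoint location))handler {
        for (InputEvent *event in self.events) {
            dispatch_after(dispatch_time(DISPATCH_TIME_NOW,
                                         (int64_t)(event.timestamp * NSEC_PER_SEC)),
                           dispatch_get_main_queue(), ^{
                handler(event.location); // feed the event back into the game as if the user tapped
            });
        }
    }

    @end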
If you need to record a demo for making a trailer video, there's plenty of screengrabbing solutions around. Some even specialize in grabbing iPhone video, either from the device (usually requires a source code/library component) or from the Simulator.
You should check out the Kamcord SDK for recording gameplay; see http://kamcord.com/
Kamcord has built-in gameplay video and audio recording technology for iOS. It allows you, the game developer, to capture gameplay videos with an API. Your users can then replay and share these gameplay videos via YouTube, Facebook, Twitter, and email.

How to play more than one video at a time in Cocoa Touch?

I'm trying to play more than one video at the same time in Cocoa Touch.
I can play one video with MPMoviePlayerController, but it does not allow playing more than one: "Note: Although you can create multiple MPMoviePlayerController objects and present their views in your interface, only one movie player at a time can play its movie."
Basically, I'm trying to display a complex, high-resolution animation in a loop with autostart, but using PNGs makes the app size way too big (over 500 MB).
I converted one object of the animation to a movie which is about 50x smaller now.
I think it's not possible to play two videos at the same time, but you can play videos in a queue; refer to this article...
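
If sequential (queued) playback is enough, one common approach with MPMoviePlayerController is to listen for MPMoviePlayerPlaybackDidFinishNotification and start the next movie from there. A rough sketch, where QueuedPlayerViewController and playNextVideo are hypothetical names:

    #import <UIKit/UIKit.h>
    #import <MediaPlayer/MediaPlayer.h>

    // Hypothetical view controller that plays a list of video URLs one after another.
    @interface QueuedPlayerViewController : UIViewController
    @property (nonatomic, strong) NSMutableArray *queue;   // NSURLs of the videos to play
    @property (nonatomic, strong) MPMoviePlayerController *player;
    @end

    @implementation QueuedPlayerViewController

    - (void)playNextVideo {
        if (self.queue.count == 0) return;

        NSURL *url = [self.queue objectAtIndex:0];
        [self.queue removeObjectAtIndex:0];

        self.player = [[MPMoviePlayerController alloc] initWithContentURL:url];
        self.player.view.frame = self.view.bounds;
        [self.view addSubview:self.player.view];

        // When this movie finishes, tear it down and start the next one.
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playbackDidFinish:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:self.player];
        [self.player play];
    }

    - (void)playbackDidFinish:(NSNotification *)notification {
        [[NSNotificationCenter defaultCenter] removeObserver:self
                                                        name:MPMoviePlayerPlaybackDidFinishNotification
                                                      object:self.player];
        [self.player.view removeFromSuperview];
        [self playNextVideo];
    }

    @end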

Play an MP3 file smoothly while dragging a slider, using AVToolbox or OpenAL

I have been struggling with this for many days but have not reached any conclusion.
My problem is this: I want to play an MP3 file, but not simply by tapping a play button.
This is how I want to play it:
There is a slider that I can drag with my finger. I want the MP3 to play at a rate that matches the speed with which I am dragging: dragging quickly should give a fast-forward effect (a funny, sped-up voice), and dragging slowly should make the output slow.
The problem is that the sound output is not smooth; it is very distorted and disturbed. I want the output to be smoother.
Please help; any suggestions are welcome. Presently I am using AVAudioPlayer and passing a time value, based on the slider input, to play the file (that does not seem to be feasible, though).
I feel this is possible only with OpenAL and no other way, because with OpenAL we can modify the frequency (pitch) of the sound file.
Can someone please point me to a link to an OpenAL implementation for iPhone? I have never played a sound file using OpenAL.
Help!
You won't be able to do it with AVAudioPlayer, as it does not support pitch operations.
You can load and decode the entire track into memory and play it back with OpenAL (which supports pitch), or you can do real-time loading/decoding and pitch changing using Audio Units (which is much lower level and more complicated, though).
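
For what it's worth, the OpenAL calls that matter for pitch are few; the hard part is decoding the MP3 to raw PCM first (for example with ExtAudioFile), which this hedged sketch skips. setupSource and updatePitchFromSlider are illustrative names, not part of any framework:

    #import <OpenAL/al.h>
    #import <OpenAL/alc.h>
    #include <math.h>

    // Minimal OpenAL setup: one device, one context, one source playing one buffer.
    // pcmData, pcmSize, and sampleRate are assumed to come from decoding the MP3 to
    // 16-bit mono PCM beforehand (e.g. with ExtAudioFile); that step is omitted here.
    static ALuint setupSource(const void *pcmData, ALsizei pcmSize, ALsizei sampleRate) {
        ALCdevice *device = alcOpenDevice(NULL);
        ALCcontext *context = alcCreateContext(device, NULL);
        alcMakeContextCurrent(context);

        ALuint buffer, source;
        alGenBuffers(1, &buffer);
        alBufferData(buffer, AL_FORMAT_MONO16, pcmData, pcmSize, sampleRate);

        alGenSources(1, &source);
        alSourcei(source, AL_BUFFER, (ALint)buffer);
        alSourcei(source, AL_LOOPING, AL_TRUE);
        alSourcePlay(source);
        return source;
    }

    // Pitch 1.0 is normal speed; higher values play faster (chipmunk effect),
    // lower values play slower.
    static void updatePitchFromSlider(ALuint source, float sliderVelocity) {
        float pitch = fmaxf(0.5f, fminf(sliderVelocity, 2.0f)); // clamp to a sensible range
        alSourcef(source, AL_PITCH, pitch);
    }

You would call updatePitchFromSlider from the slider's value-changed handler, mapping how fast the finger is moving to a pitch value.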

How to step frames in a movie recorded by iPhone

Is there a way to have my iPhone program step frame by frame through a movie recorded by the iPhone? What I want to do is have the user record a QuickTime movie, then be able to step through it frame by frame.
Alternatively, I suppose if there were a way to extract every single frame of the movie to a JPG, then I could easily step through the pictures. Does anyone know of a way to do this?
I suppose the third option (which might not get past Apple's store) is to capture the movie the way the old jailbroken apps did, by somehow grabbing the pictures directly from the camera view.
Any help is appreciated. Thanks in advance!
You cannot step through a movie frame by frame. That functionality does not exist in the public API.
You can include your own media decoder code (open source or not) and use that to parse your movies of course. It is perfectly fine to do that.

Play audio and video at the same time in an iPhone application

Is it possible to play an audio file and a video file at the same time? I want to play a separate audio file and a video file simultaneously, and I also want controls for both. Is this possible?
Sorry, not that I know of. The movie view automatically stops all audio files and takes up the whole screen, so you are forced to listen to the audio for the video.
If the audio is part of the video file, yes.
If it's an MP3 file or some other external type, how are you planning on playing it? The phone might allow this to happen, depending on what you're up to.
You might be in luck soon, though...