Record video in a cocos2d iOS game: low resolution for video and high resolution for normal cases - iphone

I am using cocos2d's CCRenderTexture to record video of my game. But recording video at retina-display resolution costs a lot of CPU and memory, so I want to use a low resolution for the video recording while keeping retina resolution for normal gameplay. Is this possible?
I've tried "[[CCDirector sharedDirector] enableRetinaDisplay:NO];" during record video, but it seems not work. the generated output totally wrong.

This is not feasible.
You'd have to render each frame twice: once on the screen, then again onto the render texture. A serious drop in framerate is inevitable even if you somehow lower the resolution of the render texture.
The reason is simply that you'll also have to write each render texture to flash memory as an image, which is extremely slow, and you'll end up with a huge amount of data: if each (PNG/JPG) image file is a reasonably small 50 KB, then one second of recorded data at 60 fps consumes 3 megabytes of flash memory, and one minute around 180 megabytes.
To record a demo of your game, most games follow the simple principle of recording the user input and then playing that input back as if the user were issuing the commands again. This requires careful planning, no breaking changes when updating the app (or else old demos are invalidated), and no non-deterministic randomizers (i.e. anything seeded with the time); a minimal sketch of the idea follows.
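For illustration, here is a minimal Objective-C sketch of that input-recording pattern. Everything here is illustrative, not cocos2d API: timestamp every input, keep the list, replay each event when its time comes around, and seed the randomizer with a stored value rather than the clock.

    #import <UIKit/UIKit.h>
    #import <QuartzCore/QuartzCore.h>   // CACurrentMediaTime()

    @interface DemoRecorder : NSObject
    @property (nonatomic, strong) NSMutableArray *events;   // recorded touches
    @property (nonatomic, assign) NSTimeInterval startTime;
    @end

    @implementation DemoRecorder

    - (void)beginRecordingWithSeed:(unsigned)seed {
        self.events = [NSMutableArray array];
        self.startTime = CACurrentMediaTime();
        srandom(seed);   // fixed, stored seed -- never seeded with the time
    }

    // Call from the touch handler while recording.
    - (void)recordTouchAt:(CGPoint)point {
        [self.events addObject:@{ @"t" : @(CACurrentMediaTime() - self.startTime),
                                  @"x" : @(point.x),
                                  @"y" : @(point.y) }];
    }

    // Call once per frame while replaying; fires every event whose time has come.
    - (void)replayUpToTime:(NSTimeInterval)elapsed
                usingBlock:(void (^)(CGPoint point))applyTouch {
        while (self.events.count > 0 &&
               [self.events[0][@"t"] doubleValue] <= elapsed) {
            NSDictionary *e = self.events[0];
            applyTouch(CGPointMake([e[@"x"] floatValue], [e[@"y"] floatValue]));
            [self.events removeObjectAtIndex:0];
        }
    }

    @end

For a persistent demo you would serialize the events array (and the seed) to disk, then call replayUpToTime: from your scheduled update with the elapsed playback time.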
If you need to record a demo for making a trailer video, there are plenty of screen-grabbing solutions around. Some even specialize in grabbing iPhone video, either from the device (usually requiring a source code/library component) or from the Simulator.

You should check out the Kamcord SDK for recording gameplay; see http://kamcord.com/
Kamcord provides built-in gameplay video and audio recording technology for iOS. It lets you, the game developer, capture gameplay videos with an API, and your users can then replay and share these videos via YouTube, Facebook, Twitter, and email.

Related

Capture video without displaying the actual video feed

So I have an application that can currently capture video with the front-facing iPhone camera and then do some real-time processing on the video feed. What I'm trying to do, however, is make this process run in the background while putting other controls onscreen. For example, say I'd like to run the camera and process the image feed, but I want the user to see a black screen with some buttons on it. Any ideas on how to do this?
Just so we get the terminology right: by "in the background", you mean running camera capture while your application is in the foreground, but not displaying the actual video feed. This is possible, but I want to make clear that if you move your whole application into the background, you will not have access to the camera.
There are a few ways to do this, but the one I've spent the most time with is grabbing frames of video (or photos) via AV Foundation. Using an AVCaptureDevice and an AVCaptureSession, you can grab the frames of video and route them to an encoder for saving to disk, or to your own custom processing code. None of this requires the camera feed to be displayed onscreen, so you can put up whatever interface you like and do the video recording or photo capture without any onscreen indication; a sketch follows.
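To make that concrete, here is a rough sketch (the class name is mine) of an AVCaptureSession wired to an AVCaptureVideoDataOutput with no preview layer attached, leaving the screen free for any interface:

    #import <AVFoundation/AVFoundation.h>

    @interface HiddenCameraCapture : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @end

    @implementation HiddenCameraCapture

    - (void)start {
        self.session = [[AVCaptureSession alloc] init];

        // Find the front-facing camera.
        AVCaptureDevice *frontCamera = nil;
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == AVCaptureDevicePositionFront) { frontCamera = device; break; }
        }

        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
        if (input) [self.session addInput:input];

        // Route frames to a delegate callback instead of the screen.
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                  @(kCVPixelFormatType_32BGRA) };
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("camera.frames", NULL)];
        [self.session addOutput:output];

        [self.session startRunning];   // note: no AVCaptureVideoPreviewLayer anywhere
    }

    // Called for every captured frame; process or encode it here.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // Custom processing, or hand-off to an AVAssetWriter for encoding, goes here.
    }

    @end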
I would caution that you should make it explicit to your users what you are doing, so that you do not run the risk of violating someone's privacy. Apple does not react kindly to those who do this (for good reason).
I encapsulate a lot of this within my open source GPUImage video and photo processing framework, so you could look at the code for the GPUImageVideoCamera class there to see how I configure the capture inputs. I hand the video frames off to OpenGL ES for the application of filters and other processing operations, but you could ignore that portion of it if you just wanted to do your own encoding or processing.
Here's example code from Apple's documentation:
http://developer.apple.com/library/ios/#samplecode/PhotoPicker/Introduction/Intro.html
It also shows how to customize the camera interface.

Playing a real time video stream from iPhone camera on a 20 second delay

I am trying to see if it is possible to record video from the iPhone's camera and write it to a file, and then have the video start playing on the screen a set time later. This all needs to happen continuously; for example, I want the video on the screen to always be 20 seconds behind what the camera is recording.
Some background:
I have a friend who is a coach and would like for his players to be able to see their last play. This could be accomplished by a feed going to a TV from an iPad always 20 seconds behind what is recorded. This needs to continually run until practice is over. (I would connect the iPad to the TV either with a cable or AirPlay to an Apple TV). The video would never need to be saved and should just be discarded after playing.
Is this even possible with the APIs AVFoundation offers? Will the iPhone let you write to a file and read from a file at the same time to accomplish this? Any other better way to accomplish this?
Thanks for your time.
Instead of writing to a file, how about saving your frames in a circular buffer big enough to hold X seconds of video?
The way I would start is to look at what's provided by AVCaptureVideoDataOutput and its delegate methods (where you can get the frame data from); a rough sketch of the buffering logic follows.
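Here is a rough sketch of that buffering logic (the class and the display method are mine). Note that raw frames are large, so in practice you would reduce the resolution or compress rather than hold 20 seconds of full-size BGRA frames in memory; this only shows the queueing:

    #import <AVFoundation/AVFoundation.h>

    @interface DelayedFrameBuffer : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, assign) CMTime delay;            // how far behind live to play
    @property (nonatomic, strong) NSMutableArray *frames;  // FIFO of CMSampleBufferRefs
    @end

    @implementation DelayedFrameBuffer

    - (instancetype)init {
        if ((self = [super init])) {
            _frames = [NSMutableArray array];
            _delay = CMTimeMake(20, 1);   // 20-second delay
        }
        return self;
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // Keep the buffer alive beyond this callback.
        CFRetain(sampleBuffer);
        [self.frames addObject:(__bridge id)sampleBuffer];

        // Release every frame that is now older than the configured delay.
        CMTime now = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        while (self.frames.count > 0) {
            CMSampleBufferRef oldest = (__bridge CMSampleBufferRef)self.frames[0];
            CMTime age = CMTimeSubtract(now, CMSampleBufferGetPresentationTimeStamp(oldest));
            if (CMTimeCompare(age, self.delay) < 0) break;   // not old enough yet
            [self displayFrame:oldest];
            CFRelease(oldest);
            [self.frames removeObjectAtIndex:0];
        }
    }

    - (void)displayFrame:(CMSampleBufferRef)frame {
        // Hand the delayed frame to whatever renders it (e.g. convert to a UIImage).
    }

    @end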

How to play a video slowly for marking

I am creating an application for coaching, and I am stuck on marking up the video. I chose ffmpeg to convert the video into image frames, but that gives me delays as well as memory issues. I need to let the user play the video slowly, frame by frame. Is there any other way to do this without the image conversion? V1 Golf does this very quickly. Please help me.
I would try converting video frames in a separate thread, extracting a few frames ahead as images in the background when the user enters 'slow motion mode'.
Here is an example for one frame, so you should be quick with the others: Video frame capture by NSOperation.
This should reduce delays, since frames can be converted while the user is still viewing the previous ones; a sketch of the idea follows.
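For instance, a sketch using AVAssetImageGenerator, which skips the ffmpeg image-conversion pipeline entirely; the function name and its callback block are mine:

    #import <AVFoundation/AVFoundation.h>

    // Asynchronously extract the next `count` frames after `startTime`, assuming
    // a nominal frame rate of `fps`, and hand each one to `frameHandler`.
    static void PrefetchFrames(NSURL *videoURL, CMTime startTime, int32_t fps, int count,
                               void (^frameHandler)(CGImageRef image, CMTime actualTime)) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
        AVAssetImageGenerator *generator =
            [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];

        // Exact timing matters when stepping frame by frame; disable snapping.
        generator.requestedTimeToleranceBefore = kCMTimeZero;
        generator.requestedTimeToleranceAfter = kCMTimeZero;

        // Build the list of frame times to extract ahead of the playhead.
        NSMutableArray *times = [NSMutableArray arrayWithCapacity:count];
        for (int i = 0; i < count; i++) {
            CMTime t = CMTimeAdd(startTime, CMTimeMake(i, fps));
            [times addObject:[NSValue valueWithCMTime:t]];
        }

        // Extraction runs in the background; frames arrive one at a time.
        [generator generateCGImagesAsynchronouslyForTimes:times
                                        completionHandler:^(CMTime requestedTime,
                                                            CGImageRef image,
                                                            CMTime actualTime,
                                                            AVAssetImageGeneratorResult result,
                                                            NSError *error) {
            if (result == AVAssetImageGeneratorSucceeded) {
                frameHandler(image, actualTime);   // cache for immediate display
            }
        }];
    }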

Play mp3 file smoothly upon dragging a scroll using AVToolbox or openAL

I have been facing this for many days now, but I have not reached any conclusion.
My problem is this: I want to play an mp3 file, but not simply by tapping a play button.
This is the way I want to play it:
There is a slider that I can drag with a finger. I want the mp3 to play at the speed with which I am dragging the finger, so that dragging fast gives a fast-forward effect (a funny sort of voice) and dragging the slider slowly makes the output slow.
The problem is that the sound output does not come out smooth; the voice is very distorted and disturbed.
I want the output to be smoother.
Please help; any suggestions are welcome. Presently I am using AVAudioPlayer and passing a time value based on the slider input to play the file (this does not seem feasible, though).
I feel that it is possible using OpenAL only and no other way, because with OpenAL we can modify the frequency (pitch) of the sound file.
Can someone please point me to an OpenAL implementation for iPhone? I have never played a sound file using OpenAL.
Help!
You won't be able to do it with AVAudioPlayer, as it does not support pitch operations.
You can load and decode the entire track into memory for playback with OpenAL (which supports pitch), or you can do realtime loading/decoding and pitch shifting using Audio Units (much lower level and more complicated, though).
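Here is a minimal OpenAL sketch, assuming the mp3 has already been decoded to 16-bit mono PCM (OpenAL does not read mp3 directly; you would decode first, e.g. with ExtAudioFile):

    #import <OpenAL/al.h>
    #import <OpenAL/alc.h>

    static ALuint sourceID;

    // One-time setup: open the device, upload the decoded samples, start playing.
    void SetUpOpenALWithPCM(const void *pcmData, ALsizei byteSize, ALsizei sampleRate) {
        ALCdevice *device = alcOpenDevice(NULL);
        ALCcontext *context = alcCreateContext(device, NULL);
        alcMakeContextCurrent(context);

        ALuint bufferID;
        alGenBuffers(1, &bufferID);
        alBufferData(bufferID, AL_FORMAT_MONO16, pcmData, byteSize, sampleRate);

        alGenSources(1, &sourceID);
        alSourcei(sourceID, AL_BUFFER, bufferID);
        alSourcePlay(sourceID);
    }

    // Call from the slider callback: 1.0 = normal speed, higher = faster/higher voice.
    void UpdatePitchFromSlider(float dragSpeed) {
        alSourcef(sourceID, AL_PITCH, dragSpeed);
    }

Mapping the slider's drag velocity onto AL_PITCH is what produces the fast-forward "funny voice" effect; clamping the value to a sane range (say 0.5 to 2.0) keeps the output from degenerating.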

Audio Toolbox playback only plays part of output buffers

I'm working on a project which is using Audio Toolbox for recording and playback of PCM data, and I'm having trouble with playback. In the simulator, I can record and play audio just fine, using a custom class to handle storing and sourcing PCM bytes for the recording and playback buffers as needed. On device (iPhone (3.0.1) and iPod 2G (3.1.2)) recording works fine, the audio files produced are correct, but in-app playback stutters, like it's only playing part of each playback buffer. My buffers are one second long, and I've got 3 buffers, which are preloaded before playback starts; stuttering occurs during those first 3 seconds as well, which I think rules out a latency problem.
I've written Audio Toolbox code before that worked, and I'm not doing anything strange here except that I'm using my own class to source PCM data instead of AudioFileReadBytes().
I know the data that comes out of my source is good, because it plays correctly in the sim and it writes to disk as a correct audio file.
I've played around with sample rates a bit; I normally use 11025 Hz sampling to cut down on file size (it's all voice, so it sounds fine). At 44100 Hz, with the same buffer size, I get the same stuttering problem, but the audio segments come about 4 times faster. That's why I think it's only playing part of each buffer.
The only reason I can conceive that it would play only part of each buffer is a latency problem, like the Audio Toolbox code running out of full buffers while I'm still filling an empty one. But that would mean it plays the preloaded buffers correctly and then starts stuttering, and that doesn't happen; it stutters the whole way through.
I've tried humongous buffers, like 10MB buffers, and I just get silence and a single stutter of audio at the end of playback. I've also tried preloading more buffers than normal, like 10 seconds worth of audio, and it behaves the same.
The audio session is being set with AVAudioSession, not the Audio Toolbox calls, and it's being set to the Playback category for playback
I have no idea how to attack this problem; it makes no sense to me that it works fine in the simulator but not on the device.
Code for the playing callback and the set up for the audio queue services: http://pastebin.com/mfaa546c
It turns out that the use of NSData's getBytes:length: was causing the problem: a playback buffer filled with that method played incorrectly. However, filling an intermediate buffer with that method and then doing a memcpy from it into the playback buffer prevented the problem.
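Here is a sketch of that workaround inside the Audio Queue output callback; PlaybackState is my stand-in for the custom PCM-sourcing class described in the question:

    #import <Foundation/Foundation.h>
    #import <AudioToolbox/AudioToolbox.h>

    typedef struct {
        __unsafe_unretained NSData *pcmData;  // decoded PCM source (qualifier keeps the struct legal under ARC)
        NSUInteger offset;                    // current read position in bytes
    } PlaybackState;

    static void PlaybackCallback(void *inUserData, AudioQueueRef inAQ,
                                 AudioQueueBufferRef inBuffer) {
        PlaybackState *state = (PlaybackState *)inUserData;
        NSUInteger capacity = inBuffer->mAudioDataBytesCapacity;
        NSUInteger remaining = [state->pcmData length] - state->offset;
        NSUInteger bytesToCopy = MIN(capacity, remaining);

        // Stage the bytes in a temporary buffer, then memcpy into the queue
        // buffer; filling the queue buffer with -getBytes: directly played wrong.
        void *staging = malloc(bytesToCopy);
        [state->pcmData getBytes:staging range:NSMakeRange(state->offset, bytesToCopy)];
        memcpy(inBuffer->mAudioData, staging, bytesToCopy);
        free(staging);

        inBuffer->mAudioDataByteSize = (UInt32)bytesToCopy;
        state->offset += bytesToCopy;
        AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
    }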