How many FPS can the iPhone's UIGetScreenImage() actually do?

Now that Apple is officially allowing UIGetScreenImage() to be used in iPhone apps, I've seen a number of blogs saying that this "opens the floodgates" for video capture on iPhones, including older models. But I've also seen blogs that say the fastest frame rate they can get with UIGetScreenImage() is like 6 FPS.
Can anyone share specific frame-rate results you've gotten with UIGetScreenImage() (or other approved APIs)? Does restricting the area of the screen captured improve frame rate significantly?
Also, for the wishful thinking segment of today's program, does anyone have pointers to code/library that uses UIGetScreenImage() to capture video? For instance, I'd like an API something like Capture( int fps, Rect bounds, int durationMs ) that would turn on the camera and for the given duration record a sequence of .png files at the given frame rate, copying from the given screen rect.

There is no specific frame rate. UIGetScreenImage() is not a movie recorder; it just returns as quickly as it can, which is unfortunately still very slow.
Restricting the captured area won't help: UIGetScreenImage() takes no input parameters, so it always returns the whole screen. Cropping the output image afterwards can make the frame rate even worse because of the extra work.

UIGetScreenImage() returns an image of the current screen contents. It's said to be slow, but whether it's fast enough depends on the use case. The video recording app iCamcorder uses this function.
According to their blog,
iCamcorder records at a remarkable average minimum of 10 frames per second and a maximum of 15 frames per second.
The UIGetScreenImage method Apple recently allowed developers to use captures the current screen contents. Unfortunately it is really slow, about 15% of the processing time of the App just goes into calling this method. http://www.drahtwerk.biz/EN/Blog.aspx/iCamcorder-v19-and-Giveaway/?newsID=27
Since the call accounts for only about 15% of the app's processing time, the raw performance of UIGetScreenImage() itself must be well above 15 fps.
To crop the returned image, you can try:
// UIGetScreenImage() has no public header, so declare it yourself.
extern CGImageRef UIGetScreenImage(void);
...
// Grab the full screen, then crop it to the rect you care about.
CGImageRef cgoriginal = UIGetScreenImage();
CGImageRef cgimg = CGImageCreateWithImageInRect(cgoriginal, rect); // rect is your crop area
UIImage *viewImage = [UIImage imageWithCGImage:cgimg];
// Both CGImageRefs are owned by this code and must be released.
CGImageRelease(cgoriginal);
CGImageRelease(cgimg);
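For the Capture( fps, bounds, durationMs ) style API the question wishes for, here is a minimal sketch along those lines; the function name, the pacing loop, and the save-to-Documents approach are my own assumptions, not an official API, and on older hardware the achievable rate will be far below the requested fps:

extern CGImageRef UIGetScreenImage(void);

// Hypothetical helper: grab the screen at a fixed rate for a fixed duration
// and write cropped PNG frames to the Documents directory. Run this off the
// main thread, since sleeping in the loop would otherwise block the UI.
static void CaptureFrames(NSInteger fps, CGRect bounds, NSInteger durationMs)
{
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                          NSUserDomainMask, YES) lastObject];
    NSInteger frameCount = (durationMs * fps) / 1000;
    NSTimeInterval interval = 1.0 / fps;

    for (NSInteger i = 0; i < frameCount; i++) {
        CGImageRef full = UIGetScreenImage();
        CGImageRef cropped = CGImageCreateWithImageInRect(full, bounds);
        NSData *png = UIImagePNGRepresentation([UIImage imageWithCGImage:cropped]);
        NSString *path = [docs stringByAppendingPathComponent:
                             [NSString stringWithFormat:@"frame_%04d.png", (int)i]];
        [png writeToFile:path atomically:YES];
        CGImageRelease(full);
        CGImageRelease(cropped);
        // Crude pacing; a real implementation would subtract the time the
        // capture itself took, and would likely encode PNGs on another queue.
        [NSThread sleepForTimeInterval:interval];
    }
}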

Related

How Can I delay the stream to UIImageview using AVCaptureVideoPreviewLayer from camera?
See below how I bind them; I just can't figure out how to delay it (I don't want it in real time):
AVCaptureVideoPreviewLayer* captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.imageView.bounds;
[self.imageView.layer addSublayer:captureVideoPreviewLayer];
First, you're going to want to remove the preview layer you have right now, as there is no way to delay those preview frames out of the box.
You're going to want to create a buffer. If we're talking about a few frames, you could have an NSMutableArray that you fill on one end with UIImages while feeding your image view from the other end.
Your UIImage would come from the didOutputSampleBuffer: delegate method; use something like this: UIImage created from CMSampleBufferRef not displayed in UIImageView?
Now, a few challenges you will have to deal with:
You're talking about a delay of multiple seconds; 5 seconds is about 150 frames at 30 fps. Storing 150 UIImages in memory isn't going to happen unless they're very small and you're on the latest devices.
You would solve that by saving the images to disk and having your array store only the paths of those images instead of the images themselves. But now you're likely to run into performance issues, since you're doing read/write operations in real time and your frame rate will suffer.
Because of that reduced frame rate, you'll have to make sure you don't lose synchronization between the recorded feed and the live feed, otherwise you'll start with a 5-second delay and end up with a much longer one.
Good luck with it; it can be done with some trade-offs (a slower frame rate, for one), but it can be done. (I have done something very similar myself several times; I can't share the code for IP reasons.)
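A rough sketch of the disk-backed buffer described above; imageFromSampleBuffer: is the helper from the linked answer, while framePaths and delayInFrames are hypothetical properties of your own class, not framework API:

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Write the incoming frame to disk and remember its path (the FIFO's tail).
    UIImage *frame = [self imageFromSampleBuffer:sampleBuffer];
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
                         [NSString stringWithFormat:@"%f.jpg", CACurrentMediaTime()]];
    [UIImageJPEGRepresentation(frame, 0.7) writeToFile:path atomically:YES];
    [self.framePaths addObject:path];

    // Once the buffer spans the desired delay, drain the head into the image view.
    if (self.framePaths.count > self.delayInFrames) {
        NSString *oldest = [self.framePaths objectAtIndex:0];
        [self.framePaths removeObjectAtIndex:0];
        UIImage *delayed = [UIImage imageWithContentsOfFile:oldest];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = delayed; // UI updates belong on the main queue
        });
        [[NSFileManager defaultManager] removeItemAtPath:oldest error:NULL];
    }
}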

record video in cocos2d iOS game, low resolution for video and high resolution for normal cases

I am using cocos2d's CCRenderTexture to record video of my game. But recording video at Retina display resolution costs a lot of CPU and memory, so I want to use a low resolution for the video recording while keeping the Retina resolution for normal gameplay. Is that possible?
I've tried "[[CCDirector sharedDirector] enableRetinaDisplay:NO];" while recording, but it doesn't seem to work; the generated output is totally wrong.
This is not feasible.
You'd have to render each frame twice, once on the screen, then onto the render texture. A serious drop in framerate is inevitable even if you lower the resolution of the render texture somehow.
The reason is simply that you'll also have to write each render texture to flash memory as an image file, which is extremely slow, and you'll end up with a huge amount of data: if each (PNG/JPG) image file is a reasonably small 50 KB, then one second of recording at 60 fps consumes 3 megabytes of flash memory, and one minute around 180 megabytes.
To record a demo of your game, most games follow the simple principle of recording the user input and then playing it back as if the user were issuing those commands live. This requires careful planning: no breaking changes when updating the app (or old demos are invalidated), and no non-deterministic randomizers (i.e., none seeded with the current time).
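As an illustration of that principle, a minimal sketch of timestamped input recording and playback (all names here are hypothetical):

// Record: timestamp every user command relative to the start of the game.
- (void)recordCommand:(NSString *)command
{
    [self.recordedEvents addObject:@{ @"t"   : @(CACurrentMediaTime() - self.gameStartTime),
                                      @"cmd" : command }];
}

// Replay: re-issue each command at its original offset. For determinism,
// you would also record the random seed and re-seed the generator on replay.
- (void)replayEvents
{
    self.gameStartTime = CACurrentMediaTime();
    for (NSDictionary *event in self.recordedEvents) {
        [self performSelector:@selector(applyCommand:)
                   withObject:event[@"cmd"]
                   afterDelay:[event[@"t"] doubleValue]];
    }
}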
If you need to record a demo for making a trailer video, there are plenty of screen-grabbing solutions around. Some even specialize in grabbing iPhone video, either from the device (usually requiring a source code/library component) or from the Simulator.
You should check out the Kamcord SDK for recording gameplay: http://kamcord.com/
Kamcord has built-in gameplay video and audio recording technology for iOS. It allows you, the game developer, to capture gameplay videos with an API. Your users can then replay and share these videos via YouTube, Facebook, Twitter, and email.

How to play a video slowly for marking

I am creating an application for coaching, and I am stuck on marking up the video. I chose ffmpeg to convert the video to image frames, but that introduces delays as well as memory issues. I need to let the user play the video slowly, frame by frame. Is there any other way to do that without converting to images? V1 Golf does this very quickly. Please help me.
I would convert video frames in a separate thread, extracting a few frames ahead as images in the background once the user enters 'slow motion mode'.
Here is an example for one frame, so you should be able to handle the others quickly: Video frame capture by NSOperation.
This should reduce delays, and frames can be converted while the user is still viewing the preceding ones.
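One alternative worth noting: AVFoundation's AVAssetImageGenerator can pull frames straight out of the video, with no ffmpeg round-trip. A sketch of prefetching the next few frames in the background (the 30 fps assumption and the caching strategy are mine):

#import <AVFoundation/AVFoundation.h>

- (void)prefetchFramesFromAsset:(AVAsset *)asset startingAt:(CMTime)current
{
    AVAssetImageGenerator *gen = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    gen.requestedTimeToleranceBefore = kCMTimeZero; // exact frames (slower, iOS 5+)
    gen.requestedTimeToleranceAfter  = kCMTimeZero;

    // Queue up the next 10 frame times, assuming 30 fps source material.
    NSMutableArray *times = [NSMutableArray array];
    for (int i = 0; i < 10; i++) {
        [times addObject:[NSValue valueWithCMTime:CMTimeAdd(current, CMTimeMake(i, 30))]];
    }

    [gen generateCGImagesAsynchronouslyForTimes:times
                              completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                  CMTime actualTime,
                                                  AVAssetImageGeneratorResult result,
                                                  NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            UIImage *frame = [UIImage imageWithCGImage:image];
            // Cache frame keyed by actualTime so the scrubber can show it instantly.
        }
    }];
}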

How to set iPhone video output image size

I'm trying to do some image processing on iPhone.
I'm using http://developer.apple.com/library/ios/#qa/qa2010/qa1702.html to capture the camera frames.
I saw that I can set the AVCaptureVideoDataOutput image format using setVideoSettings, but is it possible to get the images at a lower resolution?
If not, is there an efficient way to downscale the resulting image?
Thanks,
Asaf.
This is how we can get a lower resolution output so we get a higher FPS when manipulating the image:
// sessionPreset governs the quality of the capture. we don't need high-resolution images,
// so we'll set the session preset to low quality.
self.captureSession.sessionPreset = AVCaptureSessionPresetLow;
Asaf Pinhassi.
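For context, here is roughly where that preset fits in the QA1702-style setup linked in the question (a sketch with error handling omitted):

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetLow; // small frames, higher attainable FPS

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
[session addInput:input];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// BGRA is convenient for CoreGraphics-based processing of each frame.
output.videoSettings = [NSDictionary dictionaryWithObject:
                           [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("videoQueue", NULL)];
[session addOutput:output];
[session startRunning];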

How do I test a camera in the iPhone simulator?

Is there any way to test the iPhone camera in the simulator without having to deploy on a device? This seems awfully tedious.
There are a number of device-specific features that you have to test on the device, but doing so is no harder than using the simulator. Just build a debug target for the device and leave it attached to the computer.
List of actions that require an actual device:
the actual phone
the camera
the accelerometer
real GPS data
the compass
vibration
push notifications...
I needed to test some custom overlays for photos. The overlays needed to be adjusted based on the size/resolution of the image.
I approached this in a way that was similar to the suggestion from Stefan, I decided to code up a "dummy" camera response.
When the simulator is running I execute this dummy code instead of the standard "captureStillImageAsynchronouslyFromConnection".
In this dummy code, I build up a "black photo" of the necessary resolution and then send it through the pipeline to be treated like a normal photo, essentially providing the feel of a very fast camera.
// Pick portrait or landscape dimensions based on the current device orientation.
CGSize sz = UIDeviceOrientationIsPortrait([[UIDevice currentDevice] orientation]) ? CGSizeMake(2448, 3264) : CGSizeMake(3264, 2448);
// Draw an opaque black image (scale factor 1, regardless of screen scale).
UIGraphicsBeginImageContextWithOptions(sz, YES, 1);
[[UIColor blackColor] setFill];
UIRectFill(CGRectMake(0, 0, sz.width, sz.height));
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Encode as JPEG, matching what the real camera pipeline would hand back.
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
The image above is equivalent to the 8 MP photos that most current devices produce. Obviously, to test other resolutions you would change the size.
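To make the switch concrete, one way to branch between the real capture call and the dummy path (processCapturedImageData: stands in for whatever your real completion code does; it is not a framework method):

#if TARGET_IPHONE_SIMULATOR
    // Simulator: synthesize the black photo above and feed it to the same
    // handler the real capture path uses.
    [self processCapturedImageData:imageData];
#else
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
            NSData *realData = [AVCaptureStillImageOutput
                                   jpegStillImageNSDataRepresentation:buffer];
            [self processCapturedImageData:realData];
        }];
#endif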
I never tried it myself, but you could give iCimulator a try.
Nope (unless they've added a way to do it in 3.2, haven't checked yet).
I wrote a replacement view to use in debug mode. It implements the same API and makes the same delegate callbacks. In my case I made it return a random image from my test set. Pretty trivial to write.
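For illustration, the simulator-only stand-in could be as simple as this; the cameraView:didCaptureImage: delegate method and the test image names are my own invention, mirroring whatever callback your real camera view makes:

#if TARGET_IPHONE_SIMULATOR
// Debug-only replacement: same delegate callback shape as the real camera
// view, but it returns a random image from a bundled test set.
- (void)startCapture
{
    NSArray *samples = [NSArray arrayWithObjects:@"test1.jpg", @"test2.jpg", @"test3.jpg", nil];
    UIImage *image = [UIImage imageNamed:
                         [samples objectAtIndex:arc4random_uniform((uint32_t)samples.count)]];
    [self.delegate cameraView:self didCaptureImage:image];
}
#endif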
A common reason for needing camera access is to take screenshots for the App Store.
Since the camera is not available in the simulator, a good trick (the only one I know) is to resize your view to the size you need, just long enough to take the screenshots. You will crop them later.
Of course, you need to have the device with the bigger screen available.
The iPad is perfect for testing layouts and taking snapshots for all devices.
Screenshots for the iPhone 6+ will have to be stretched a little (scaled by 1.078125, not a big deal…).
A good quick reference for iOS device resolutions: http://www.iosres.com/
Edit: In a recent project that uses a custom camera view controller, I replaced the AVPreview with a UIImageView in a target that I only use to run in the simulator. This way I can automate screenshots for iTunes Connect upload. Note that the camera control buttons are not in an overlay, but in a view over the camera preview.
Craig's answer below describes another method that I found quite smart; unlike mine, it also works with a camera overlay.
I just found a repo on GitHub that helps simulate camera functions in the iOS Simulator using images, videos, or your MacBook camera:
Repo