How to take a picture programmatically in an iPhone app using the device camera? - iphone

In an iPhone app, can we take pictures at particular time intervals programmatically using the iPhone device camera?
If yes, how can we take pictures programmatically in an iPhone app?
Please help and suggest.
Thanks,

UIImagePickerController has a takePicture method that can be called programmatically.

You can use UIImagePickerController's takePicture method to take a picture.
For more control over the capture, you can use the AVFoundation framework, which provides the AVCaptureStillImageOutput class for capturing still images. More Info
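For the interval-based capture asked about, a minimal sketch (assuming a UIImagePickerController property named picker that is already configured with the camera source type, showsCameraControls set to NO, and presented on screen) could drive takePicture from an NSTimer:

// Assumes an existing UIImagePickerController property named picker,
// already configured for the camera and presented on screen.
- (void)startIntervalCapture
{
    // Fire every 10 seconds; adjust the interval as needed.
    [NSTimer scheduledTimerWithTimeInterval:10.0
                                     target:self
                                   selector:@selector(captureTimerFired:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)captureTimerFired:(NSTimer *)timer
{
    // takePicture triggers the same capture as the on-screen shutter button;
    // the result arrives in imagePickerController:didFinishPickingMediaWithInfo:.
    [self.picker takePicture];
}

Each shot is then delivered through imagePickerController:didFinishPickingMediaWithInfo:, just like a manual capture.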

For the AVFoundation approach, import these headers in your .h file:
#import <AVFoundation/AVCaptureInput.h>
#import <AVFoundation/AVCaptureDevice.h>
#import <AVFoundation/AVCaptureOutput.h>
#import <AVFoundation/AVMediaFormat.h>
Put this in your .m file:
- (AVCaptureDevice *)frontCamera
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices)
    {
        if ([device position] == AVCaptureDevicePositionFront)
        {
            return device;
        }
    }
    return nil;
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    self.theImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    NSCalendar *sysCalendar = [[NSCalendar alloc] initWithCalendarIdentifier:NSCalendarIdentifierGregorian];
    NSDateFormatter *df = [[NSDateFormatter alloc] init];
    df.calendar = sysCalendar;
    [df setDateFormat:@"dd_MM_yyyy hh:mm:ss"];
    StrCapture = [NSString stringWithFormat:@"%@.jpeg", [df stringFromDate:[NSDate date]]];
    NSLog(@"StrCapture : %@", StrCapture);

    NSData *imageData = UIImageJPEGRepresentation(self.theImage, 1);
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *fullPath = [documentsDirectory stringByAppendingPathComponent:StrCapture];
    [fileManager createFileAtPath:fullPath contents:imageData attributes:nil];
    NSLog(@"ImagePath : %@", fullPath);
}
- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return newImage;
}
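The snippet above assumes a capture session that feeds the delegate callback; the setup itself is not shown. A minimal sketch of that wiring (the method name, the captureSession property, and the queue name here are assumptions, not from the original post) might look like:

- (void)setupCaptureSession
{
    // Connect the front camera to a video data output so that
    // captureOutput:didOutputSampleBuffer:fromConnection: starts firing.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:[self frontCamera]
                                                                        error:&error];
    if (input) {
        [session addInput:input];
    }

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    output.videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("captureQueue", NULL)];
    [session addOutput:output];

    [session startRunning];
    self.captureSession = session; // assumed property, keeps the session alive
}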

Related

Sharing an image to Instagram on iOS

I am trying to open one of my pictures in Instagram, but every time I press the action button (as you can see from my code) it shows the Instagram icon, and when I press the Instagram icon the app crashes. What am I doing wrong? I have been stuck on this for a while.
@interface ViewController : UIViewController <UIDocumentInteractionControllerDelegate> {
    IBOutlet UIImageView *onlyImageVIew;
    IBOutlet UIImageView *myImageView;
}
@property (nonatomic, retain) UIDocumentInteractionController *docController;
- (IBAction)actionButton:(id)sender;
@end
- (IBAction)actionButton:(id)sender {
    NSURL *instagramURL = [NSURL URLWithString:@"instagram://app"];
    if ([[UIApplication sharedApplication] canOpenURL:instagramURL]) {
        NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
        NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:@"Image.ig"];
        UIImage *image = [UIImage imageNamed:@"01.png"];
        NSData *imageData = UIImagePNGRepresentation(image);
        [imageData writeToFile:savedImagePath atomically:YES];
        NSURL *imageUrl = [NSURL fileURLWithPath:savedImagePath];
        NSLog(@"%@", imageUrl);
        UIDocumentInteractionController *docController = [[UIDocumentInteractionController alloc] init];
        docController.delegate = self;
        docController.UTI = @"com.instagram.photo";
        docController.URL = imageUrl;
        //[docController setURL:imageUrl];
        [docController presentOpenInMenuFromRect:CGRectZero inView:self.view animated:YES];
    }
}
Here is the log output when I implemented the BOOL delegate method:
2013-02-10 13:34:46.206 share to instagram[2197:907] file://localhost/var/mobile/Applications/0AABBF7B-F479-44E7-BA7F-B0FAA636F1CB/Documents/Image.ig
Hope the code below works for you.
- (void)ShareInstagram
{
    UIImagePickerController *imgpicker = [[UIImagePickerController alloc] init];
    imgpicker.delegate = self;

    [self storeimage];

    NSURL *instagramURL = [NSURL URLWithString:@"instagram://app"];
    if ([[UIApplication sharedApplication] canOpenURL:instagramURL])
    {
        CGRect rect = CGRectMake(0, 0, 612, 612);
        NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/15717.ig"];
        NSURL *igImageHookFile = [[NSURL alloc] initWithString:[[NSString alloc] initWithFormat:@"file://%@", jpgPath]];

        dic = [self setupControllerWithURL:igImageHookFile usingDelegate:self];
        dic.UTI = @"com.instagram.photo";
        dic.delegate = self;
        [dic presentOpenInMenuFromRect:rect inView:self.view animated:YES];
        // [[UIApplication sharedApplication] openURL:instagramURL];
    }
    else
    {
        // NSLog(@"instagramImageShare");
        UIAlertView *errorToShare = [[UIAlertView alloc] initWithTitle:@"Instagram unavailable"
                                                               message:@"You need to install Instagram on your device in order to share this image"
                                                              delegate:nil
                                                     cancelButtonTitle:@"OK"
                                                     otherButtonTitles:nil];
        errorToShare.tag = 3010;
        [errorToShare show];
    }
}
- (void)storeimage
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:@"15717.ig"];
    UIImage *NewImg = [self resizedImage:imageCapture :CGRectMake(0, 0, 612, 612)];
    NSData *imageData = UIImagePNGRepresentation(NewImg);
    [imageData writeToFile:savedImagePath atomically:NO];
}
- (UIImage *)resizedImage:(UIImage *)inImage :(CGRect)thumbRect
{
    CGImageRef imageRef = [inImage CGImage];
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    // There's a weirdness with kCGImageAlphaNone and CGBitmapContextCreate;
    // see "Supported Pixel Formats" in the Quartz 2D Programming Guide,
    // "Creating a Bitmap Graphics Context" section.
    // Only RGB 8-bit images with alpha of kCGImageAlphaNoneSkipFirst, kCGImageAlphaNoneSkipLast,
    // kCGImageAlphaPremultipliedFirst, and kCGImageAlphaPremultipliedLast, plus a few other
    // oddball image kinds, are supported.
    // The images on input here are likely to be PNG or JPEG files.
    if (alphaInfo == kCGImageAlphaNone)
        alphaInfo = kCGImageAlphaNoneSkipLast;
    // Build a bitmap context that's the size of the thumbRect
    CGContextRef bitmap = CGBitmapContextCreate(
        NULL,
        thumbRect.size.width,                 // width
        thumbRect.size.height,                // height
        CGImageGetBitsPerComponent(imageRef), // really needs to always be 8
        4 * thumbRect.size.width,             // rowbytes
        CGImageGetColorSpace(imageRef),
        alphaInfo
    );
    // Draw into the context; this scales the image
    CGContextDrawImage(bitmap, thumbRect, imageRef);
    // Get an image from the context and a UIImage
    CGImageRef ref = CGBitmapContextCreateImage(bitmap);
    UIImage *result = [UIImage imageWithCGImage:ref];
    CGContextRelease(bitmap); // ok if NULL
    CGImageRelease(ref);
    return result;
}
- (UIDocumentInteractionController *)setupControllerWithURL:(NSURL *)fileURL usingDelegate:(id <UIDocumentInteractionControllerDelegate>)interactionDelegate
{
    UIDocumentInteractionController *interactionController = [UIDocumentInteractionController interactionControllerWithURL:fileURL];
    interactionController.delegate = self;
    return interactionController;
}
- (void)documentInteractionControllerWillPresentOpenInMenu:(UIDocumentInteractionController *)controller
{
}
- (BOOL)documentInteractionController:(UIDocumentInteractionController *)controller canPerformAction:(SEL)action
{
    // NSLog(@"5dsklfjkljas");
    return YES;
}
- (BOOL)documentInteractionController:(UIDocumentInteractionController *)controller performAction:(SEL)action
{
    // NSLog(@"dsfa");
    return YES;
}
- (void)documentInteractionController:(UIDocumentInteractionController *)controller didEndSendingToApplication:(NSString *)application
{
    // NSLog(@"fsafasd;");
}
This code works for me.
Best of Luck.
First, make sure your image isn't nil after the imageNamed: call.
Then, make sure the imageData isn't nil after the UIImagePNGRepresentation call.
Finally, check the result of writing the image to the file before attempting to use it:
BOOL writingResult = [imageData writeToFile:savedImagePath atomically:YES];
if (writingResult == NO)
{
    NSLog(@"Failed to write to %@", savedImagePath);
    return;
}
It's possible your file isn't being written, and so trying to share a file that doesn't exist is causing your crash.

Lag in UITableView scroll when image is loaded from document directory

I have a UITableView which shows items with images and their names. The problem is that scrolling lags because of the image loading for each cell.
This is how I am saving the images captured from the iPhone Camera and resized to 320 by 480.
- (void)saveImage:(UIImage *)image {
    if (image != nil)
    {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                             NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *imageName = [NSString stringWithFormat:@"image_%@.png", [NSDate date]];
        item.imageName = imageName;
        NSString *path = [documentsDirectory stringByAppendingPathComponent:imageName];
        NSLog(@"%@", path);
        NSData *data = UIImagePNGRepresentation(image);
        [data writeToFile:path atomically:YES];
        NSUserDefaults *userDefaults = [NSUserDefaults standardUserDefaults];
        [userDefaults setInteger:[image imageOrientation] forKey:@"kImageOrientation"];
    }
}
This is how I am resizing it
- (UIImage *)resizeImage:(UIImage *)image newSize:(CGSize)newSize {
    CGRect newRect = CGRectIntegral(CGRectMake(0, 0, newSize.width, newSize.height));
    CGImageRef imageRef = image.CGImage;
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // Set the quality level to use when rescaling
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, newSize.height);
    CGContextConcatCTM(context, flipVertical);
    // Draw into the context; this scales the image
    CGContextDrawImage(context, newRect, imageRef);
    // Get the resized image from the context and a UIImage
    CGImageRef newImageRef = CGBitmapContextCreateImage(context);
    UIImage *newImage = [UIImage imageWithCGImage:newImageRef];
    CGImageRelease(newImageRef);
    UIGraphicsEndImageContext();
    return newImage;
}
And this is how I am loading them in cellForRowAtIndexPath
- (UIImage *)loadImage:(NSString *)imageName {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:imageName];
    UIImage *image = [UIImage imageWithContentsOfFile:path];
    NSUserDefaults *userDefaults = [NSUserDefaults standardUserDefaults];
    UIImage *orientedImage = [UIImage imageWithCGImage:image.CGImage
                                                 scale:1.0
                                           orientation:(UIImageOrientation)[userDefaults integerForKey:@"kImageOrientation"]];
    return orientedImage;
}
One approach I was thinking of is to have an NSMutableArray with the UIImages loaded into it once and then fetch the images from there. The problem with that approach is that not all cells will have images: say there are 10 cells but only three images, in the last 3 cells; the NSMutableArray approach is a problem here because the images added will be at indices 0, 1, 2.
I have one more solution in mind which I would like to highlight here for you experts to verify whether it will work. I am actually using Core Data and have generated the model for Item. If I were to manually add a UIImage field to this class and save the image to those objects, would that work, or would it be wrong to do that?
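One way to avoid both the index mismatch and the per-scroll disk reads is to cache images keyed by imageName rather than by row index, and to load them off the main thread. A rough sketch (the imageCache property, the configureCell:forItem: method, and the Item parameter are assumptions, not from the original code) might look like:

// Assumed property on the controller: @property (nonatomic, retain) NSCache *imageCache;
- (void)configureCell:(UITableViewCell *)cell forItem:(Item *)item
{
    if (item.imageName == nil) {
        cell.imageView.image = nil; // cell without an image
        return;
    }
    UIImage *cached = [self.imageCache objectForKey:item.imageName];
    if (cached) {
        cell.imageView.image = cached;
        return;
    }
    // Load from disk in the background, then update the cell on the main queue.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *image = [self loadImage:item.imageName];
        if (image) {
            [self.imageCache setObject:image forKey:item.imageName];
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            cell.imageView.image = image;
            [cell setNeedsLayout];
        });
    });
}

Because cells are reused, a production version would also check that the cell still represents the same item before assigning the image.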

Memory Issue while reading video frames iPhone

I'm having memory issues while reading video frames from an existing video chosen from the iPhone library. First I added the UIImage frames themselves to an array, but I thought the array was getting too big for memory after a while, so instead I save the UIImages to the Documents folder and add the image path to the array. However, I still get the same memory warnings, even though checking allocations in Instruments shows the total allocated memory never goes over 2.5 MB. Also, no leaks are found... Can anyone think of something?
- (void)addFrame:(UIImage *)image
{
    NSString *imgPath = [NSString stringWithFormat:@"%@/Analysis%d-%d.png", docFolder, currentIndex, framesArray.count];
    [UIImagePNGRepresentation(image) writeToFile:imgPath atomically:YES];
    [framesArray addObject:imgPath];
    frameCount++;
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [picker dismissModalViewControllerAnimated:YES];
    [framesArray removeAllObjects];
    frameCount = 0;

    // incoming video
    NSURL *videoURL = [info valueForKey:UIImagePickerControllerMediaURL];
    //NSLog(@"Video : %@", videoURL);

    // AVURLAsset to read input movie (i.e. mov recorded to local storage)
    NSDictionary *inputOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVURLAsset *inputAsset = [[AVURLAsset alloc] initWithURL:videoURL options:inputOptions];

    // Load the input asset's track information
    [inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{
        NSError *error = nil;
        nrFrames = CMTimeGetSeconds([inputAsset duration]) * 30;
        NSLog(@"Total frames = %d", nrFrames);

        // Check the status of "tracks"; make sure they were loaded
        AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error];
        if (tracksStatus != AVKeyValueStatusLoaded)
            // failed to load
            return;

        /* Read video samples from the input asset's video track */
        AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:inputAsset error:&error];
        NSMutableDictionary *outputSettings = [NSMutableDictionary dictionary];
        [outputSettings setObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        AVAssetReaderTrackOutput *readerVideoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[[inputAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] outputSettings:outputSettings];

        // Assign the track output to the reader and start reading
        [reader addOutput:readerVideoTrackOutput];
        if ([reader startReading] == NO) {
            // Handle error
            NSLog(@"Error reading");
        }

        NSAutoreleasePool *pool = [NSAutoreleasePool new];
        while (reader.status == AVAssetReaderStatusReading)
        {
            if (!memoryProblem)
            {
                CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer];
                if (sampleBufferRef)
                {
                    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBufferRef);
                    /* Lock the image buffer */
                    CVPixelBufferLockBaseAddress(imageBuffer, 0);
                    /* Get information about the image */
                    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
                    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
                    size_t width = CVPixelBufferGetWidth(imageBuffer);
                    size_t height = CVPixelBufferGetHeight(imageBuffer);
                    /* Unlock the image buffer */
                    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
                    /* Create a CGImageRef from the CVImageBufferRef */
                    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
                    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
                    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
                    /* Release some components */
                    CGContextRelease(newContext);
                    CGColorSpaceRelease(colorSpace);

                    UIImage *image = [UIImage imageWithCGImage:newImage scale:[UIScreen mainScreen].scale orientation:UIImageOrientationRight];
                    //[self addFrame:image];
                    [self performSelectorOnMainThread:@selector(addFrame:) withObject:image waitUntilDone:YES];
                    /* Release the CGImageRef */
                    CGImageRelease(newImage);
                    CMSampleBufferInvalidate(sampleBufferRef);
                    CFRelease(sampleBufferRef);
                    sampleBufferRef = NULL;
                }
            }
            else
            {
                break;
            }
        }
        [pool release];
        NSLog(@"Finished");
    }];
}
Try one thing: move the NSAutoreleasePool inside the while loop and drain it inside the loop, so that it looks like this:
while (reader.status == AVAssetReaderStatusReading)
{
    NSAutoreleasePool *pool = [NSAutoreleasePool new];
    .....
    [pool drain];
}
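If the project uses ARC (where NSAutoreleasePool is unavailable), the equivalent pattern, as a sketch, is an @autoreleasepool block inside the loop:

while (reader.status == AVAssetReaderStatusReading)
{
    @autoreleasepool {
        // copyNextSampleBuffer, the image conversion, and addFrame: go here;
        // the temporary objects created for each frame are released every iteration.
    }
}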

iPhone Realtime Image Processing using OpenCV and AVFoundation Frameworks?

I want to do image processing in real time using OpenCV.
My final goal is to show the result on the screen in real time while the camera is capturing video using the AVFoundation framework.
How can I process every video frame with OpenCV and show the result on the screen in real time?
Use AVAssetReader.
//Setup Reader
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:urlvalue options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{ dispatch_async(dispatch_get_main_queue(), ^{
    AVAssetTrack *videoTrack = nil;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] == 1) {
        videoTrack = [tracks objectAtIndex:0];
        NSError *error = nil;
        _movieReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
        if (error)
            NSLog(@"%@", error.localizedDescription);
        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_4444AYpCbCr16];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [_movieReader addOutput:[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoSettings]];
        [_movieReader startReading];
    }
});
}];
To get the next movie frame:
static int frameCount = 0;
- (void)readNextMovieFrame {
    if (_movieReader.status == AVAssetReaderStatusReading) {
        AVAssetReaderTrackOutput *output = [_movieReader.outputs objectAtIndex:0];
        CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
        if (sampleBuffer) {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            // Lock the image buffer
            CVPixelBufferLockBaseAddress(imageBuffer, 0);
            // Get information about the image
            uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
            size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);
            /* Create a CGImageRef from the CVImageBufferRef */
            CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
            CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
            CGImageRef newImage = CGBitmapContextCreateImage(newContext);
            /* Release some components */
            CGContextRelease(newContext);
            CGColorSpaceRelease(colorSpace);
            /* We could display the result on a custom layer */
            /* self.customLayer.contents = (id)newImage; */
            /* Display the result in the image view (the orientation of the image needs to be changed so that the video is displayed correctly) */
            UIImage *image = [UIImage imageWithCGImage:newImage scale:0.0 orientation:UIImageOrientationRight];
            UIGraphicsBeginImageContext(image.size);
            [image drawAtPoint:CGPointMake(0, 0)];
            // UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
            videoImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            //videoImage = image;
            // if (frameCount < 40) {
            NSLog(@"readNextMovieFrame==%d", frameCount);
            NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
            NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
            [UIImagePNGRepresentation(videoImage) writeToFile:pngPath atomically:YES];
            frameCount++;
            // }
            /* Release the CGImageRef and unlock the buffer only after it has been used */
            CGImageRelease(newImage);
            CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
            CFRelease(sampleBuffer);
        }
    }
}
Once your _movieReader reaches the end, you need to create a new reader and start reading again.
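The answer defines readNextMovieFrame but does not show how it is driven. One possible sketch (assuming the _movieReader setup above has completed; the displayLink property and polling method names are illustrative) is to pull frames from a CADisplayLink and hand each videoImage to your OpenCV routine there:

// A sketch: poll frames with a CADisplayLink once the reader is set up.
- (void)startFramePolling
{
    self.displayLink = [CADisplayLink displayLinkWithTarget:self
                                                   selector:@selector(displayLinkFired:)];
    [self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)displayLinkFired:(CADisplayLink *)link
{
    if (_movieReader.status == AVAssetReaderStatusReading) {
        [self readNextMovieFrame];      // fills videoImage as in the answer above
        // hand the latest frame to whatever processing you need, e.g. an OpenCV routine
    } else {
        [self.displayLink invalidate];  // reader finished or failed; stop polling
        self.displayLink = nil;
    }
}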

How do you only capture select camera frames using AVCaptureSession?

I'm trying to use AVCaptureSession to get images from the front camera for processing. So far each time a new frame was available I simply assigned it to a variable, and ran an NSTimer that checks every tenth of a second if there's a new frame, and if there is it processes it.
I would like to get a frame, freeze the camera, and get the next frame whenever I like. Something like [captureSession getNextFrame] you know?
Here's a part of my code although I'm not sure how helpful it may be:
- (void)startFeed {
    loopTimerIndex = 0;
    NSArray *captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput
                                          deviceInputWithDevice:[captureDevices objectAtIndex:1]
                                                          error:nil];
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.minFrameDuration = CMTimeMake(1, 10);
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", nil);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetLow;
    [captureSession addInput:captureInput];
    [captureSession addOutput:captureOutput];
    imageView = [[UIImage alloc] init];
    [captureSession startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    loopTimerIndex++;

    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    imageView = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationLeftMirrored];
    [delegate updatePresentor:imageView];
    if (loopTimerIndex == 1) {
        [delegate feedStarted];
    }

    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    [pool drain];
}
You don't actively poll the camera to get frames back, because that's not how the capture process is architected. Instead, if you would like to only display frames every tenth of a second instead of every 1/30th or faster, you should just ignore the frames in between.
For example, you could maintain a timestamp that you would compare against every time -captureOutput:didOutputSampleBuffer:fromConnection: was triggered. If the timestamp is greater than or equal to 0.1 seconds from right now, process and display the camera frame and reset the timestamp to the current time. Otherwise, ignore the frame.
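A hedged sketch of that timestamp comparison inside the delegate callback (lastFrameTime is an assumed NSTimeInterval ivar, and processFrame: stands in for whatever processing and display you already do) could look like:

// Assumed ivar: NSTimeInterval lastFrameTime;
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSTimeInterval now = [NSDate timeIntervalSinceReferenceDate];
    if (now - lastFrameTime < 0.1) {
        return; // ignore frames arriving within 0.1 s of the last processed one
    }
    lastFrameTime = now;

    // Process and display this frame, as in the existing callback above.
    [self processFrame:sampleBuffer]; // hypothetical helper
}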