AVCaptureVideoOrientation landscape modes result in upside down still images - iPhone

I'm using AVCaptureSession to take pictures and store them to the album. When I tap the button it takes a snapshot and stores it to the album. But when I take the picture in landscape mode, the resulting still image is upside down.
code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    [self addVideoInputFrontCamera:NO]; // set to YES for the front camera, NO for the back camera
    [self addStillImageOutput];
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CGRect layerRect = [[[self view] layer] bounds];
    [[self previewLayer] setBounds:layerRect];
    [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    [[[self view] layer] addSublayer:[self previewLayer]];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(saveImageToPhotoAlbum) name:kImageCapturedSuccessfully object:nil];
    [[self captureSession] startRunning];
    camera = [UIButton buttonWithType:UIButtonTypeCustom];
    [camera setImage:[UIImage imageNamed:@"button.png"] forState:UIControlStateNormal];
    [camera setFrame:CGRectMake(150, 10, 40, 30)];
    [camera addTarget:self action:@selector(takephoto:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:camera];
}
Button action for taking the picture:
- (void)takephoto:(id)sender
{
    [self captureStillImage];
}
- (void)captureStillImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [self setStillImage:image];
        // [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}

You need to set the videoConnection's videoOrientation property based on the orientation of the device. Do this in captureStillImage, after you have found the AVCaptureConnection:
UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
AVCaptureVideoOrientation avcaptureOrientation;
if (deviceOrientation == UIDeviceOrientationLandscapeLeft)
    avcaptureOrientation = AVCaptureVideoOrientationLandscapeRight;
else if (deviceOrientation == UIDeviceOrientationLandscapeRight)
    avcaptureOrientation = AVCaptureVideoOrientationLandscapeLeft;
[videoConnection setVideoOrientation:avcaptureOrientation];
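The landscape cases are deliberately crossed: UIDeviceOrientation describes which way the device is rotated, while AVCaptureVideoOrientation describes which way the captured image should be oriented, so left and right swap. Also note that the snippet above leaves avcaptureOrientation uninitialized in portrait. A fuller mapping might look like this (a sketch; the helper name is mine, and the portrait fallback for face-up/face-down is a design choice, not something from the original answer):

// Hypothetical helper: map a device orientation to the matching capture orientation.
static AVCaptureVideoOrientation AVOrientationForDeviceOrientation(UIDeviceOrientation deviceOrientation)
{
    switch (deviceOrientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        case UIDeviceOrientationLandscapeLeft:
            return AVCaptureVideoOrientationLandscapeRight; // note the crossover
        case UIDeviceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeLeft;  // note the crossover
        default: // portrait, face up, face down, unknown
            return AVCaptureVideoOrientationPortrait;
    }
}

With that, the call in captureStillImage becomes [videoConnection setVideoOrientation:AVOrientationForDeviceOrientation([[UIDevice currentDevice] orientation])];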

Related

Preview captured image using AVCaptureSession

I'm using AVCaptureSession to take a snapshot and save it to the album. After taking the snapshot, the image is stored to the album, but I also need to show the captured image as a preview. When I was using UIImagePickerController, it displayed a preview after taking the picture. How do I do that with AVCaptureSession?
code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    [self addVideoInputFrontCamera:NO]; // set to YES for the front camera, NO for the back camera
    [self addStillImageOutput];
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CGRect layerRect = [[[self view] layer] bounds];
    [[self previewLayer] setBounds:layerRect];
    [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    [[[self view] layer] addSublayer:[self previewLayer]];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(saveImageToPhotoAlbum) name:kImageCapturedSuccessfully object:nil];
    [[self captureSession] startRunning];
}
- (void)addVideoInputFrontCamera:(BOOL)front
{
    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera = nil;
    AVCaptureDevice *backCamera = nil;
    for (AVCaptureDevice *device in devices) {
        NSLog(@"Device name: %@", [device localizedName]);
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device position] == AVCaptureDevicePositionBack) {
                NSLog(@"Device position : back");
                backCamera = device;
            } else {
                NSLog(@"Device position : front");
                frontCamera = device;
            }
        }
    }
    NSError *error = nil;
    if (front) {
        AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
        if (!error) {
            if ([[self captureSession] canAddInput:frontFacingCameraDeviceInput]) {
                [[self captureSession] addInput:frontFacingCameraDeviceInput];
            } else {
                NSLog(@"Couldn't add front facing video input");
            }
        }
    } else {
        AVCaptureDeviceInput *backFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
        if (!error) {
            if ([[self captureSession] canAddInput:backFacingCameraDeviceInput]) {
                [[self captureSession] addInput:backFacingCameraDeviceInput];
            } else {
                NSLog(@"Couldn't add back facing video input");
            }
        }
    }
}
- (void)addStillImageOutput
{
    [self setStillImageOutput:[[AVCaptureStillImageOutput alloc] init]];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];
    // NOTE: an output has no connections until it has been added to a session,
    // so this loop finds nothing here; the connection lookup belongs after addOutput:.
    videoConnection = nil;
    for (connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [[self captureSession] addOutput:[self stillImageOutput]];
}
- (void)captureStillImageWithOverlay
{
    videoConnection = nil;
    for (connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [self setStillImage:image];
        // [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}
- (void)saveImageToPhotoAlbum
{
    UIImageWriteToSavedPhotosAlbum([self stillImage], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
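No answer was recorded for this question. Since the code already posts kImageCapturedSuccessfully once the still image is stored, one approach (a sketch of mine, not from the thread; imagePreview is a hypothetical UIImageView property layered over the preview) is to observe that same notification and display the saved image:

// Register alongside the existing observer in viewDidLoad:
// [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(showImagePreview)
//                                              name:kImageCapturedSuccessfully object:nil];
- (void)showImagePreview
{
    dispatch_async(dispatch_get_main_queue(), ^{
        // The completion handler may post the notification off the main thread,
        // so hop back to the main queue before touching UIKit.
        [self.imagePreview setImage:[self stillImage]]; // imagePreview is an assumed UIImageView
        [self.imagePreview setHidden:NO];
    });
}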

AVCaptureSession capture image landscape mode

I'm using AVCaptureSession to take pictures and store them to the album. When I tap the button it takes a snapshot and stores it to the album. But when I use landscape mode and tap the button, it stores the landscape image; when I preview that image, it appears in portrait mode.
code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    [self addVideoInputFrontCamera:NO]; // set to YES for the front camera, NO for the back camera
    [self addStillImageOutput];
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]]];
    // UIView *newView = [[UIView alloc] initWithFrame:self.view.bounds];
    // [newView.layer addSublayer:self.previewLayer];
    // [self.view addSubview:newView];
    // previewLayer.frame = self.previewLayer.bounds;
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    // [[self previewLayer] setVideoGravity:AVLayerVideoGravityResize];
    // self.previewLayer.orientation = AVCaptureVideoOrientationPortrait;
    CGRect layerRect = [[[self view] layer] bounds];
    [[self previewLayer] setBounds:layerRect];
    [[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect), CGRectGetMidY(layerRect))];
    [[[self view] layer] addSublayer:[self previewLayer]];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(saveImageToPhotoAlbum) name:kImageCapturedSuccessfully object:nil];
    [[self captureSession] startRunning];
    camera = [UIButton buttonWithType:UIButtonTypeCustom];
    [camera setImage:[UIImage imageNamed:@"button.png"] forState:UIControlStateNormal];
    [camera setFrame:CGRectMake(150, 10, 40, 30)];
    [camera addTarget:self action:@selector(takephoto:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:camera];
}
capture snapshot:
- (void)takephoto:(id)sender
{
    [self captureStillImage];
}
- (void)captureStillImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [self setStillImage:image];
        // [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}
I'm using self.previewLayer.orientation = UIInterfaceOrientationLandscapeLeft; for landscape mode
In addition to the previewLayer orientation, you should set the videoOrientation on the still image connection to correct this issue.
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) {
        break;
    }
}
After this:
[videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
Set the orientation by checking whether the app is in landscape or portrait mode, as sketched below.
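For example (a sketch; it assumes you want to match the current interface orientation, which maps directly onto AVCaptureVideoOrientation with no left/right crossover, unlike UIDeviceOrientation):

UIInterfaceOrientation interfaceOrientation = [[UIApplication sharedApplication] statusBarOrientation];
AVCaptureVideoOrientation videoOrientation;
switch (interfaceOrientation) {
    case UIInterfaceOrientationPortraitUpsideDown:
        videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
        break;
    case UIInterfaceOrientationLandscapeLeft:
        videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
        break;
    case UIInterfaceOrientationLandscapeRight:
        videoOrientation = AVCaptureVideoOrientationLandscapeRight;
        break;
    default: // portrait
        videoOrientation = AVCaptureVideoOrientationPortrait;
        break;
}
[videoConnection setVideoOrientation:videoOrientation];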

AVCaptureSession with STT (speech to text)

I am using AVCaptureSession to record video while at the same time using STT (the Google speech-to-text API) to convert voice into text. The problem I'm facing: when I tap the speak button, the camera freezes. Any correct answer will be acceptable. Thanks in advance.
To start the camera, in -(void)viewDidLoad:
if ([[self captureManager] setupSession]) {
    // Create video preview layer and add it to the UI
    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[[self captureManager] session]];
    UIView *view = [self videoPreviewView];
    CALayer *viewLayer = [view layer];
    [viewLayer setMasksToBounds:YES];
    CGRect bounds = [view bounds];
    [newCaptureVideoPreviewLayer setFrame:bounds];
    if ([newCaptureVideoPreviewLayer isOrientationSupported]) {
        // NOTE: AVCaptureVideoOrientation values are not bit flags; OR-ing two of them
        // produces an invalid orientation. Pick a single value here.
        [newCaptureVideoPreviewLayer setOrientation:AVCaptureVideoOrientationLandscapeLeft | AVCaptureVideoOrientationLandscapeRight];
    }
    [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];
    [self setCaptureVideoPreviewLayer:newCaptureVideoPreviewLayer];
    [newCaptureVideoPreviewLayer release];
    // Start the session. This is done asynchronously since -startRunning doesn't return until the session is running.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [[[self captureManager] session] startRunning];
    });
    [self updateButtonStates];
}
- (BOOL)setupSession
{
    BOOL success = NO;
    // Set torch and flash mode to auto
    if ([[self backFacingCamera] hasFlash]) {
        if ([[self backFacingCamera] lockForConfiguration:nil]) {
            if ([[self backFacingCamera] isFlashModeSupported:AVCaptureFlashModeAuto]) {
                [[self backFacingCamera] setFlashMode:AVCaptureFlashModeAuto];
            }
            [[self backFacingCamera] unlockForConfiguration];
        }
    }
    if ([[self backFacingCamera] hasTorch]) {
        if ([[self backFacingCamera] lockForConfiguration:nil]) {
            if ([[self backFacingCamera] isTorchModeSupported:AVCaptureTorchModeAuto]) {
                [[self backFacingCamera] setTorchMode:AVCaptureTorchModeAuto];
            }
            [[self backFacingCamera] unlockForConfiguration];
        }
    }
    // Init the device inputs
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontFacingCamera] error:nil];
    AVCaptureDeviceInput *newAudioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];
    // Set up the still image file output
    AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey,
                                    nil];
    [newStillImageOutput setOutputSettings:outputSettings];
    [outputSettings release];
    // Create session (use default AVCaptureSessionPresetHigh)
    AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
    // Add inputs and output to the capture session
    if ([newCaptureSession canAddInput:newVideoInput]) {
        [newCaptureSession addInput:newVideoInput];
    }
    if ([newCaptureSession canAddInput:newAudioInput]) {
        [newCaptureSession addInput:newAudioInput];
    }
    if ([newCaptureSession canAddOutput:newStillImageOutput]) {
        [newCaptureSession addOutput:newStillImageOutput];
    }
    [self setStillImageOutput:newStillImageOutput];
    [self setVideoInput:newVideoInput];
    [self setAudioInput:newAudioInput];
    [self setSession:newCaptureSession];
    [newStillImageOutput release];
    [newVideoInput release];
    [newAudioInput release];
    [newCaptureSession release];
    // Set up the movie file output
    NSURL *outputFileURL = [self tempFileURL];
    AVCamRecorder *newRecorder = [[AVCamRecorder alloc] initWithSession:[self session] outputFileURL:outputFileURL];
    [newRecorder setDelegate:self];
    // Send an error to the delegate if video recording is unavailable
    if (![newRecorder recordsVideo] && [newRecorder recordsAudio]) {
        NSString *localizedDescription = NSLocalizedString(@"Video recording unavailable", @"Video recording unavailable description");
        NSString *localizedFailureReason = NSLocalizedString(@"Movies recorded on this device will only contain audio. They will be accessible through iTunes file sharing.", @"Video recording unavailable failure reason");
        NSDictionary *errorDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   localizedDescription, NSLocalizedDescriptionKey,
                                   localizedFailureReason, NSLocalizedFailureReasonErrorKey,
                                   nil];
        NSError *noVideoError = [NSError errorWithDomain:@"AVCam" code:0 userInfo:errorDict];
        if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
            [[self delegate] captureManager:self didFailWithError:noVideoError];
        }
    }
    [self setRecorder:newRecorder];
    [newRecorder release];
    success = YES;
    return success;
}
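No answer was posted for this one. A likely culprit (my assumption, not confirmed in the thread): the speech-to-text engine reconfigures the shared AVAudioSession when it starts listening, which interrupts a running AVCaptureSession that holds an audio input. Two directions worth trying are removing the audio input from the capture session while STT is active, or observing the interruption notifications and restarting the session when they end. A sketch of the latter:

// Register once, e.g. in viewDidLoad after setupSession succeeds:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(sessionInterruptionEnded:)
                                             name:AVCaptureSessionInterruptionEndedNotification
                                           object:[[self captureManager] session]];

- (void)sessionInterruptionEnded:(NSNotification *)notification
{
    // Restart off the main thread, mirroring the startRunning call above.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        if (![[[self captureManager] session] isRunning]) {
            [[[self captureManager] session] startRunning];
        }
    });
}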

Capture overlay image using the AVFoundation framework

I want to take a picture with an overlay image.
code:
- (void)addStillImageOutput
{
    // NSLog(@"You capture image");
    [self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [[self captureSession] addOutput:[self stillImageOutput]];
}
- (void)captureStillImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [self setStillImage:image];
        [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}
Overlay image:
- (void)ButtonPressed1
{
    UIImageView *overlayImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"sof.png"]];
    [overlayImageView setFrame:CGRectMake(30, 100, 260, 200)];
    [[self view] addSubview:overlayImageView];
    [overlayImageView release];
}
captureStillImageAsynchronouslyFromConnection: captures only the data from the connection (the AVCaptureConnection); the additional images are not part of the connection, they live in the view.
So, to generate an image with all elements (picture and overlay), you have to do something like this:
UIGraphicsBeginImageContext(stillImage.size);
[stillImage drawInRect:CGRectMake(0, 0, stillImage.size.width, stillImage.size.height)];
[overlayImageView.image drawInRect:overlayImageView.frame];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
This code assumes that the picture size equals the screen size. If the picture size is different, you have to calculate the coordinates at which to draw the overlay. Sorry for my bad English.
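A small refinement (my addition, not part of the original answer): plain UIGraphicsBeginImageContext renders at a scale of 1.0, so on Retina screens the composite loses resolution. The scale-aware variant preserves it:

// Same compositing, but respecting the still image's scale on Retina screens.
UIGraphicsBeginImageContextWithOptions(stillImage.size, NO, stillImage.scale);
[stillImage drawInRect:CGRectMake(0, 0, stillImage.size.width, stillImage.size.height)];
[overlayImageView.image drawInRect:overlayImageView.frame];
UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();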

How to detect faces within a video session with iOS 5

Within my app I'm using a video session to take pictures. However, I'd like to have face detection within this video session. I looked at Apple's "SquareCam" example, which does exactly what I'm looking for, but implementing their code in my project is driving me bonkers.
#import "CaptureSessionManager.h"
#import <ImageIO/ImageIO.h>
#implementation CaptureSessionManager
#synthesize captureSession;
#synthesize previewLayer;
#synthesize stillImageOutput;
#synthesize stillImage;
#pragma mark Capture Session Configuration
- (id)init
{
    if ((self = [super init])) {
        [self setCaptureSession:[[AVCaptureSession alloc] init]];
    }
    return self;
}

- (void)didReceiveMemoryWarning
{
    // Releases the view if it doesn't have a superview.
    NSLog(@"memorywarning");
    // Release any cached data, images, etc that aren't in use.
}

- (void)addVideoPreviewLayer
{
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

- (void)addVideoInput
{
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([videoDevice isFocusModeSupported:AVCaptureFocusModeLocked]) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            NSLog(@"focus");
            videoDevice.focusMode = AVCaptureFocusModeLocked;
            videoDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
            // videoDevice.focusMode = AVCaptureSessionPresetPhoto
            videoDevice.whiteBalanceMode = AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance;
            [captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
            [videoDevice unlockForConfiguration];
        } else {
        }
    }
    // Respond to the failure as appropriate.
    if (videoDevice) {
        NSError *error;
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            if ([[self captureSession] canAddInput:videoIn])
                [[self captureSession] addInput:videoIn];
            else
                NSLog(@"Couldn't add video input");
        } else {
            NSLog(@"Couldn't create video input");
        }
    } else {
        NSLog(@"Couldn't create video capture device");
    }
}
- (void)addStillImageOutput
{
    [self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    [[self captureSession] addOutput:[self stillImageOutput]];
    [outputSettings release];
}
- (void)captureStillImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [self setStillImage:image];
        [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}
- (void)dealloc
{
    [[self captureSession] stopRunning];
    [previewLayer release], previewLayer = nil;
    [captureSession release], captureSession = nil;
    [stillImageOutput release], stillImageOutput = nil;
    [stillImage release], stillImage = nil;
    [super dealloc];
}

@end
Besides my video session, I succeeded in recognizing faces within a UIImage that I imported into my project. I did this following the example by @Abhinav Jha (How to properly instantiate CIDetector class object in iOS 5 face detection API).
CIImage *ciImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"Photo.JPG"]];
if (ciImage == nil)
    [imageView setImage:[UIImage imageNamed:@"Photo.JPG"]];
// Value first, then key: use the CIDetector constants rather than bare strings.
NSDictionary *options = [[NSDictionary alloc] initWithObjectsAndKeys:
                         CIDetectorAccuracyHigh, CIDetectorAccuracy, nil];
CIDetector *ciDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                            context:nil
                                            options:options];
NSArray *features = [ciDetector featuresInImage:ciImage];
NSLog(@"no of faces detected: %d", (int)[features count]);
Hope someone can point me in the right direction with combining the two examples!
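No answer was recorded here, so to sketch the direction SquareCam takes (the wiring below uses my own naming: faceDetector is assumed to be an ivar, and the class must declare AVCaptureVideoDataOutputSampleBufferDelegate conformance): add an AVCaptureVideoDataOutput next to the still image output and run the CIDetector over each frame in the sample buffer delegate.

- (void)addVideoDataOutput
{
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
    // BGRA is the easiest pixel format to hand to Core Image.
    [videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObject:
                                       [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                  forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    dispatch_queue_t videoDataOutputQueue = dispatch_queue_create("videoDataOutputQueue", NULL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([[self captureSession] canAddOutput:videoDataOutput]) {
        [[self captureSession] addOutput:videoDataOutput];
    }
    [videoDataOutput release];
    // Create the detector once; building it is expensive. Low accuracy is the
    // usual choice for live video, as in SquareCam.
    faceDetector = [[CIDetector detectorOfType:CIDetectorTypeFace
                                       context:nil
                                       options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                                           forKey:CIDetectorAccuracy]] retain];
}

// AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // For rotated devices, pass CIDetectorImageOrientation in an options dictionary
    // here, as SquareCam does; otherwise faces are only found in one orientation.
    NSArray *features = [faceDetector featuresInImage:frame];
    if ([features count] > 0) {
        NSLog(@"faces in frame: %d", (int)[features count]);
    }
}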