UIImage orientation incorrect from AVCaptureStillImageOutput - iPhone

I'm trying to capture pixel data from an AVCaptureStillImageOutput and noticed that upon cropping the image to a CGImage, it becomes re-oriented. To test this, I output a temporary image to the photo library. Now I've noticed that even before the image is cropped, its thumbnail is rotated while the full image is not. (This later becomes a problem when I pass the UIImage to my custom pixelDataManager, which requires the image's proper dimensions.)
Setting captureVideoPreviewLayer.orientation = AVCaptureVideoOrientationPortrait; and [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait]; did not seem to change anything.
Any thoughts?
I'll break it up into sections...
1) Setting up the session, input, and output:
// Create a capture session
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

// Add capture layer to visible view
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResize;
captureVideoPreviewLayer.frame = screenBounds;
captureVideoPreviewLayer.bounds = screenBounds;
captureVideoPreviewLayer.orientation = AVCaptureVideoOrientationPortrait;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Setup error handling
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];

// Start session
[session startRunning];

// Output image data
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];
2) Setting up the video connection:
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections)
{
    for (AVCaptureInputPort *port in [connection inputPorts])
    {
        if ([[port mediaType] isEqual:AVMediaTypeVideo])
        {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}

if ([videoConnection isVideoOrientationSupported])
{
    [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
}
3) Output the captured image:
NSLog(#"about to request a capture from: %#", stillImageOutput);
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
//......
}

When you do this, though the image is provided to you as a UIImage, it's actually backed by the underlying Quartz CGImage data, which uses a different origin point (lower left, I believe), so when you use the image it appears rotated to the side.
I found a C function that you can call with the UIImage as a parameter; it fixes the orientation and returns the corrected UIImage.
http://blog.logichigh.com/2008/06/05/uiimage-fix/
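For reference, the fix boils down to redrawing the UIImage into a graphics context so the pixel data ends up upright. A minimal sketch of that approach (the function name is mine, not the linked post's):

// Redraw the image; UIKit applies the orientation transform while drawing,
// so the resulting bitmap matches the display orientation.
UIImage *FixedOrientationImage(UIImage *image)
{
    if (image.imageOrientation == UIImageOrientationUp) return image;
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *fixed = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return fixed;
}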

Related

Integrate iphone camera into ViewController's subview UIView

I am using the following code to embed the camera into my application's view. Here is my code:
- (void)viewDidLoad
{
    [super viewDidLoad];

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.cameraView.bounds;
    [self.cameraView.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input)
    {
        [Utilities alertDisplay:@"Error" message:@"Camera not found. Please use Photo Gallery instead."];
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}
- (AVCaptureDevice *)backFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    for (AVCaptureDevice *device in videoDevices) {
        if (device.position == AVCaptureDevicePositionBack) {
            captureDevice = device;
            break;
        }
    }

    // Couldn't find one on the back, so just get the default video device.
    if (!captureDevice) {
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }
    return captureDevice;
}
- (IBAction)scanButtonPressed:(id)sender
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        //self.vImage.image = image;
    }];
}
The problem I am facing is that I don't get any camera feed in my cameraView, and on scanButtonPressed I get
stillImageOutput.connections = 0 objects.
What is wrong?
Well, I just copy-pasted your code into a blank project and it worked fine when changing self.cameraView.layer to self.view.layer. However, I did try creating self.cameraView and never initializing it, and that had consequences similar to those you described.
Overall, I would check to make sure that self.cameraView isn't nil. If it's created programmatically, make sure you're calling alloc/init and setting a frame, and if it's an IBOutlet make sure it's properly linked. A quick check is sketched below.
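For instance (a hypothetical snippet; adjust to however cameraView is actually created in your project):

// If cameraView is built in code rather than in a nib:
self.cameraView = [[UIView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:self.cameraView];

// Either way, verify the view exists before adding the preview layer:
NSAssert(self.cameraView != nil, @"cameraView is nil - outlet not connected?");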

How to get FNumber and ISOSpeedRatings from iPhone cam in realtime

I need to get camera parameters (EXIF data) such as FNumber and ISOSpeedRatings in realtime, without taking real photos. Is there any way to do that?
Here is the complete solution. Don't forget to import the appropriate frameworks and headers. (A note on reading the individual EXIF fields follows the code.)
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>
AVCaptureStillImageOutput *stillImageOutput;
AVCaptureSession *session;
- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupCaptureSession];
    // Do any additional setup after loading the view, typically from a nib.
}
- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
    }];
}
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPreset352x288;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    [device lockForConfiguration:nil];
    device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
    device.focusMode = AVCaptureFocusModeLocked;
    [device unlockForConfiguration];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    // Start the session running to start the flow of data
    [session startRunning];
    [self captureNow];
}
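Since the question asks for the values rather than the photo, note that inside the completion handler you can pull the specific fields straight out of the EXIF attachment dictionary. A sketch (key availability varies by device and capture mode):

// Inside the completionHandler, after CMGetAttachment:
NSDictionary *exif = (__bridge NSDictionary *)exifAttachments;
NSNumber *fNumber = exif[(NSString *)kCGImagePropertyExifFNumber];
NSArray *isoRatings = exif[(NSString *)kCGImagePropertyExifISOSpeedRatings];
NSLog(@"FNumber: %@, ISO: %@", fNumber, [isoRatings firstObject]);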

Capture image & Video without using uiimagepicker controller

I need to capture image & video without opening the imagePickerController.
You can capture video and photos using AVCaptureSession;
refer to iPhone SDK 4 AVFoundation - How to use captureStillImageAsynchronouslyFromConnection correctly? The still-image path is below, with a video-output sketch after it.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}
- (IBAction)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments)
        {
            // Do something with the attachments.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        self.vImage.image = image;
    }];
}
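For video, the same session can also drive an AVCaptureMovieFileOutput. A minimal sketch (the output path is an assumption, and self must adopt AVCaptureFileOutputRecordingDelegate):

AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput])
    [session addOutput:movieOutput];

// Record to a temporary file; the delegate is called when recording finishes.
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"];
[movieOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:path] recordingDelegate:self];
// ... later, to finish: [movieOutput stopRecording];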

How to capture image without displaying preview in iOS

I want to capture images at specific instances, for example when a button is pushed, but I don't want to show any video preview screen. I guess captureStillImageAsynchronouslyFromConnection is what I need to use for this scenario. Currently, I can capture an image if I show a video preview. However, if I remove the code that shows the preview, the app crashes with the following output:
2012-04-07 11:25:54.898 imCapWOPreview[748:707] *** Terminating app due to uncaught exception
'NSInvalidArgumentException', reason: '*** -[AVCaptureStillImageOutput
captureStillImageAsynchronouslyFromConnection:completionHandler:] - inactive/invalid connection passed.'
*** First throw call stack: (0x336ee8bf 0x301e21e5 0x3697c35d 0x34187 0x33648435 0x310949eb 0x310949a7
0x31094985 0x310946f5 0x3109502d 0x3109350f 0x31092f01 0x310794ed 0x31078d2d 0x37db7df3 0x336c2553
0x336c24f5 0x336c1343 0x336444dd 0x336443a5 0x37db6fcd 0x310a7743 0x33887 0x3382c)
terminate called throwing an exception (lldb)
So here is my implementation:
BIDViewController.h:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface BIDViewController : UIViewController
{
    AVCaptureStillImageOutput *stillImageOutput;
}
@property (strong, nonatomic) IBOutlet UIView *videoPreview;
- (IBAction)doCap:(id)sender;
@end
The relevant stuff inside BIDViewController.m:
#import "BIDViewController.h"
@interface BIDViewController ()
@end

@implementation BIDViewController
@synthesize capturedIm;
@synthesize videoPreview;
- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupAVCapture];
}

- (BOOL)setupAVCapture
{
    NSError *error = nil;

    AVCaptureSession *session = [AVCaptureSession new];
    [session setSessionPreset:AVCaptureSessionPresetHigh];

    /*
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.videoPreview.bounds;
    [self.videoPreview.layer addSublayer:captureVideoPreviewLayer];
    */

    // Select a video device, make an input
    AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
    if (error)
        return NO;
    if ([session canAddInput:input])
        [session addInput:input];

    // Make a still image output
    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    [session startRunning];
    return YES;
}
- (IBAction)doCap:(id)sender {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) {
        // Do something with the captured image
    }];
}
With the above code, the crash occurs when doCap is called. On the other hand, if I remove the comment markers around the following lines in the setupAVCapture function
/*
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.videoPreview.bounds;
[self.videoPreview.layer addSublayer:captureVideoPreviewLayer];
*/
then it works without any problem.
In summary, my question is: how can I capture images at controlled instants without showing a preview?
I use the following code for capturing from the front-facing camera (if available) or else the back camera. It works well on my iPhone 4S.
- (void)viewDidLoad
{
    [super viewDidLoad];

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *device = [self frontFacingCameraIfAvailable];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    // stillImageOutput is an ivar declared in the .h file: "AVCaptureStillImageOutput *stillImageOutput;"
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    for (AVCaptureDevice *device in videoDevices) {
        if (device.position == AVCaptureDevicePositionFront) {
            captureDevice = device;
            break;
        }
    }

    // Couldn't find one on the front, so just get the default video device.
    if (!captureDevice) {
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }
    return captureDevice;
}
- (IBAction)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            // Do something with the attachments if you want to.
            NSLog(@"attachments: %@", exifAttachments);
        }
        else
            NSLog(@"no attachments");

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        self.vImage.image = image;
    }];
}
Well, I was facing a similar issue whereby captureStillImageAsynchronouslyFromConnection:completionHandler: was raising an exception that the passed connection is invalid. Later on, I figured out that the issue was resolved when I made the session and stillImageOutput properties that retain their values.
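In other words, something like the following (a sketch; strong under ARC, retain under MRC):

// Owned by the view controller, so the session and its connections
// stay alive between setup and capture.
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;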
I am a JavaScript developer and wanted to create a native iOS framework for my cross-platform JavaScript project.
When I started, I faced many issues with deprecated methods and other runtime errors.
After fixing all the issues, below is an answer that is compliant with iOS 13.5.
The code helps you take a picture on a button click, without a preview.
Your .h file
@interface NoPreviewCameraViewController : UIViewController <AVCapturePhotoCaptureDelegate> {
    AVCaptureSession *captureSession;
    AVCapturePhotoOutput *photoOutput;
    AVCapturePhotoSettings *photoSetting;
    AVCaptureConnection *captureConnection;
    UIImageView *imageView;
}
@end
Your .m file
- (void)viewDidLoad {
    [super viewDidLoad];

    imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height - 140)];
    [self.view addSubview:imageView];

    UIButton *takePicture = [UIButton buttonWithType:UIButtonTypeCustom];
    [takePicture addTarget:self action:@selector(takePicture:) forControlEvents:UIControlEventTouchUpInside];
    [takePicture setTitle:@"Take Picture" forState:UIControlStateNormal];
    takePicture.frame = CGRectMake(40.0, self.view.frame.size.height - 140, self.view.frame.size.width - 40, 40);
    [self.view addSubview:takePicture];

    [self initCaptureSession];
}

- (void)initCaptureSession {
    captureSession = [[AVCaptureSession alloc] init];
    if ([captureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        [captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
    }

    AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];
    if ([captureSession canAddInput:deviceInput]) {
        [captureSession addInput:deviceInput];
    }

    photoOutput = [[AVCapturePhotoOutput alloc] init];
    if ([captureSession canAddOutput:photoOutput]) {
        [captureSession addOutput:photoOutput];
    }

    [captureSession startRunning];
}

- (void)setNewPhotoSetting {
    photoSetting = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey : AVVideoCodecTypeJPEG}];
    [photoOutput setPhotoSettingsForSceneMonitoring:photoSetting];
}

- (IBAction)takePicture:(id)sender {
    captureConnection = nil;
    [self setNewPhotoSetting];

    for (AVCaptureConnection *connection in photoOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                captureConnection = connection;
                NSLog(@"Value of connection = %@", connection);
                NSLog(@"Value of captureConnection = %@", captureConnection);
                break;
            }
        }
        if (captureConnection) {
            break;
        }
    }

    [photoOutput capturePhotoWithSettings:photoSetting delegate:self];
}
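One thing the listing above leaves out is the AVCapturePhotoCaptureDelegate callback that actually delivers the photo. A minimal sketch of it (iOS 11+ API):

// Called by photoOutput when the capture finishes.
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    if (error) {
        NSLog(@"Capture failed: %@", error);
        return;
    }
    NSData *imageData = [photo fileDataRepresentation];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    dispatch_async(dispatch_get_main_queue(), ^{
        self->imageView.image = image; // the image view created in viewDidLoad
    });
}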

iPhone image ratio captured from AVCaptureSession

I am using somebody's source code for capturing an image with AVCaptureSession. However, I found that the CaptureSessionManager's previewLayer is shorter than the final captured image.
I found that the resulting image always has the ratio 720x1280 = 9:16. Now I want to crop the resulting image to a UIImage with ratio 320:480 so that it captures only the portion visible in the previewLayer. Any ideas? Thanks a lot.
Related questions on Stack Overflow (no good answer yet):
Q1,
Q2
Source Code:
- (id)init {
    if ((self = [super init])) {
        [self setCaptureSession:[[[AVCaptureSession alloc] init] autorelease]];
    }
    return self;
}

- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

- (void)addVideoInput {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSError *error;
        if ([videoDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [videoDevice lockForConfiguration:&error]) {
            [videoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            [videoDevice unlockForConfiguration];
        }
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            if ([[self captureSession] canAddInput:videoIn])
                [[self captureSession] addInput:videoIn];
            else
                NSLog(@"Couldn't add video input");
        }
        else
            NSLog(@"Couldn't create video input");
    }
    else
        NSLog(@"Couldn't create video capture device");
}

- (void)addStillImageOutput
{
    [self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];

    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [[self captureSession] addOutput:[self stillImageOutput]];
}
- (void)captureStillImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [self setStillImage:image];
        [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}
Edit, after doing some more research and testing:
AVCaptureSession's sessionPreset property has the following constants. I haven't checked each one of them, but I noted that most of their ratios are either 9:16 or 3:4:
NSString *const AVCaptureSessionPresetPhoto;
NSString *const AVCaptureSessionPresetHigh;
NSString *const AVCaptureSessionPresetMedium;
NSString *const AVCaptureSessionPresetLow;
NSString *const AVCaptureSessionPreset352x288;
NSString *const AVCaptureSessionPreset640x480;
NSString *const AVCaptureSessionPresetiFrame960x540;
NSString *const AVCaptureSessionPreset1280x720;
NSString *const AVCaptureSessionPresetiFrame1280x720;
In my project, I have a fullscreen preview (frame size 320x480), with:
[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
I have done it this way: take the photo at 9:16 and crop it to 320:480, exactly the visible part of the previewLayer. It looks perfect.
The piece of code for resizing and cropping, to replace the old code, is:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [UIImage imageWithData:imageData];
UIImage *scaledimage = [ImageHelper scaleAndRotateImage:image];

// Going to crop the 9:16 image to 2:3, with the width fixed
float width = scaledimage.size.width;
float height = scaledimage.size.height;
float top_adjust = (height - width * 3 / 2.0) / 2.0;
// (rectToCrop was left undefined in the original post; this definition matches
// the top_adjust computation above)
CGRect rectToCrop = CGRectMake(0, top_adjust, width, width * 3 / 2.0);
[self setStillImage:[scaledimage croppedImage:rectToCrop]];
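croppedImage: here comes from a UIImage category in the original project; if you don't have it, a minimal stand-in (ignoring orientation edge cases) is:

// Crop by slicing the underlying CGImage.
static UIImage *CroppedImage(UIImage *image, CGRect rect) {
    CGImageRef cgCropped = CGImageCreateWithImageInRect(image.CGImage, rect);
    UIImage *cropped = [UIImage imageWithCGImage:cgCropped
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(cgCropped);
    return cropped;
}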
The iPhone's camera is natively 4:3. The 16:9 images you get are already cropped from 4:3, and cropping those 16:9 images again to 4:3 is not what you want. Instead, get the native 4:3 images from the iPhone's camera by setting self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto (before adding any inputs/outputs to the session).
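If you then still need to match the preview exactly, the visible region of an aspect-fill preview can be mapped back into image coordinates. A sketch (previewSize, and the image variable, are assumptions taken from the discussion above):

// Center crop that reproduces AVLayerVideoGravityResizeAspectFill.
CGSize previewSize = CGSizeMake(320, 480);
CGFloat scale = MAX(previewSize.width / image.size.width,
                    previewSize.height / image.size.height);
CGSize visible = CGSizeMake(previewSize.width / scale, previewSize.height / scale);
CGRect cropRect = CGRectMake((image.size.width  - visible.width)  / 2.0,
                             (image.size.height - visible.height) / 2.0,
                             visible.width, visible.height);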