I am trying to set the AVFoundation camera size based on my UIImageView (selectedImage) size like this:
self.sess = [AVCaptureSession new];
self.snapper = [AVCaptureStillImageOutput new];
self.snapper.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
self.sess.sessionPreset = AVCaptureSessionPresetPhoto;
[self.sess addOutput:self.snapper];
AVCaptureDevice* cam = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:cam error:nil];
[self.sess addInput:input];
AVCaptureVideoPreviewLayer* lay = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.sess];
lay.videoGravity = AVLayerVideoGravityResizeAspectFill;
lay.frame = selectedImage.frame;
[self.view.layer addSublayer:lay];
In the app, the preview shows just fine, a 320x320 square, but when I look in the photo gallery, the saved image is taller than a square.
I also tried removing lay.videoGravity = AVLayerVideoGravityResizeAspectFill;, but then the preview in the app does not fill the view.
How can I make the captured image match what the user sees in the camera preview, with nothing extra outside it?
This is the code that saves the image to gallery:
AVCaptureConnection *vc = [self.snapper connectionWithMediaType:AVMediaTypeVideo];
// deal with image when it arrives
typedef void(^MyBufBlock)(CMSampleBufferRef, NSError*);
MyBufBlock h = ^(CMSampleBufferRef buf, NSError *err) {
    NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buf];
    UIImage *im = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        selectedImage.hidden = NO;
        selectedImage.contentMode = UIViewContentModeScaleAspectFill;
        selectedImage.clipsToBounds = YES;
        selectedImage.image = im;
        [self.previewLayer removeFromSuperlayer];
        self.previewLayer = nil;
        [self.sess stopRunning];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeImageToSavedPhotosAlbum:[selectedImage.image CGImage]
                                  orientation:(ALAssetOrientation)[selectedImage.image imageOrientation]
                              completionBlock:nil];
    });
};
[self.snapper captureStillImageAsynchronouslyFromConnection:vc completionHandler:h];
I believe you'll need to crop the image yourself; the reduced size of your viewfinder only affects what is shown, not the amount of image data the camera captures.
An example of how to crop an image can be found at: How to properly crop an image taken on iPhone 4G (with EXIF rotation data)?
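For the square preview above, a minimal sketch of such a crop might look like the following; it takes the centred square of the captured photo in CGImage pixel space, and squareImageFromImage: is only an illustrative name, not part of any API:
// Minimal sketch: crop a UIImage to a centred square, working on the raw pixels.
// The original scale and orientation are reapplied when rebuilding the UIImage.
- (UIImage *)squareImageFromImage:(UIImage *)image {
    CGImageRef cgImage = image.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    size_t side   = MIN(width, height);

    CGRect cropRect = CGRectMake((width - side) / 2.0, (height - side) / 2.0, side, side);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(cgImage, cropRect);

    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}
You would call something like this on im inside the completion block above, before passing the result to writeImageToSavedPhotosAlbum:.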
How does one use a UIImageView to show a .mov file's preview frame, i.e. the thumbnail frame that is visible in Finder or for any YouTube video?
I would like to be able to get the preview frame from .mov files saved to an app's Library directory, so they will not be AVAssets.
The AVAssetImageGenerator in AVFoundation can be used to load videos both from albums and from the app's local directories.
Here's a helper method that will return an image from a video URL (inside or outside the app) at any given time interval:
+ (UIImage *)thumbnailImageForVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);

    AVAssetImageGenerator *assetImageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetImageGenerator.appliesPreferredTrackTransform = YES;
    assetImageGenerator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    NSError *thumbnailImageGenerationError = nil;
    // CMTimeMakeWithSeconds keeps fractional seconds; CMTimeMake(time, 60) would truncate them.
    CGImageRef thumbnailImageRef = [assetImageGenerator copyCGImageAtTime:CMTimeMakeWithSeconds(time, 60)
                                                               actualTime:NULL
                                                                    error:&thumbnailImageGenerationError];
    if (!thumbnailImageRef) {
        NSLog(@"Thumbnail generation failed: %@", thumbnailImageGenerationError);
        return nil;
    }

    UIImage *thumbnailImage = [[UIImage alloc] initWithCGImage:thumbnailImageRef];
    CGImageRelease(thumbnailImageRef); // copyCGImageAtTime: returns an owned reference
    return thumbnailImage;
}
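Hypothetical usage, assuming the helper lives in a class called VideoUtils and the movie sits in the app's Documents directory (both names are placeholders):
// Hypothetical usage; VideoUtils, clip.mov and myImageView are placeholders.
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSURL *movieURL = [NSURL fileURLWithPath:[docsDir stringByAppendingPathComponent:@"clip.mov"]];
UIImage *previewFrame = [VideoUtils thumbnailImageForVideo:movieURL atTime:1.0];
myImageView.image = previewFrame;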
I am taking a photo with AVFoundation as described here:
http://red-glasses.com/index.php/tutorials/ios4-take-photos-with-live-video-preview-using-avfoundation/
One snippet:
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments)
    {
        // Do something with the attachments.
        NSLog(@"attachments: %@", exifAttachments);
    }
    else
        NSLog(@"no attachments");

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    // ...
}];
As far as I found out, the picture size is set here:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium; // medium on my iPhone 4 is 640x480
But only a few preset values are possible here, and they can't be set to a custom size.
I would like the captured pictures to have a 1:1 aspect ratio if possible, for example 400x400 pixels (square pictures).
It seems Instagram is doing it (or are they cropping the images?).
How can I specify that the captured picture should be square? I know how to resize it on the phone, but if the captured picture is not square, the result is not good.
Any idea?
You should probably resize your UIImage by drawing it into a new image context. Example below:
.....
UIImage *image = [[UIImage alloc] initWithData:imageData];
CGSize targetSize = CGSizeMake(400, 400);

UIGraphicsBeginImageContext(targetSize);
CGRect thumbnailRect = CGRectMake(0, 0, targetSize.width, targetSize.height);
[image drawInRect:thumbnailRect]; // note: this stretches the image if the source isn't already square
UIImage *tempImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// tempImage is resized to 400x400
Hope this helps!
I have found some code for taking a picture using AVCaptureSession in a great tutorial by "JJob". It is the best tutorial I have seen about AVCaptureSession; please read all the comments and the previous tutorials on that site, as they will help you a lot. You can download working sample code from here.
I wish to compress the image before storing it as an NSData object.
Below is the code that gets me the NSData object for an image.
NSURL *referenceURL = [info objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library1 = [[ALAssetsLibrary alloc] init];
[library1 assetForURL:referenceURL resultBlock:^(ALAsset *asset)
{
    long long byteArraySize = asset.defaultRepresentation.size;
    // initWithLength: (not initWithCapacity:) so mutableBytes points at a buffer of that size
    NSMutableData *rawData = [[NSMutableData alloc] initWithLength:(NSUInteger)byteArraySize];
    void *bufferPointer = [rawData mutableBytes];
    NSError *error = nil;
    [asset.defaultRepresentation getBytes:bufferPointer fromOffset:0 length:(NSUInteger)byteArraySize error:&error];
    if (error) {
        NSLog(@"%@", error);
    }
} failureBlock:^(NSError *error) {
    NSLog(@"%@", error);
}];
Any help will be appreciated.
UIImagePickerController does return a compressed image, but you can control the format and compression yourself with this built-in UIKit function and a related function for PNGs:
NSData* UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);
You might need to create an NSURL if referenceURL returns a string.
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:referenceURL]];
NSData *compressedImage = UIImageJPEGRepresentation(image, .1); //.1 is low quality
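The PNG counterpart mentioned above works the same way, except that it is lossless and takes no quality argument:
NSData *pngImage = UIImagePNGRepresentation(image); // lossless, usually larger than a JPEG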
If you're using a UIImagePickerController, the image returned will be a JPEG, which is already compressed (I think). If not, you can use AVAssetWriter to write the image as a JPEG or PNG.
Simple to use:
- (UIImage *)fireYourImageForCompression:(UIImage *)imgComing {
    NSData *dataImgBefore = [[NSData alloc] initWithData:UIImageJPEGRepresentation(imgComing, 1.0)]; // size before compression
    int imageSizeBefore = (int)dataImgBefore.length;
    NSLog(@"SIZE OF IMAGE: %i ", imageSizeBefore);
    NSLog(@"SIZE OF IMAGE in Kb: %i ", imageSizeBefore / 1024);

    NSData *dataCompressedImage = UIImageJPEGRepresentation(imgComing, .1); // .1 is low quality
    int sizeCompressedImage = (int)dataCompressedImage.length;
    NSLog(@"SIZE AFTER COMPRESSION OF IMAGE: %i ", sizeCompressedImage);
    NSLog(@"SIZE AFTER COMPRESSION OF IMAGE in Kb: %i ", sizeCompressedImage / 1024);

    // now recreate the image from the compressed data
    imgComing = [UIImage imageWithData:dataCompressedImage];
    return imgComing;
}
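Hypothetical usage; pickedImage stands for whatever UIImage you start from:
UIImage *compressedImage = [self fireYourImageForCompression:pickedImage];
// If all you need is the NSData, you can skip the round-trip and keep the JPEG data directly:
NSData *dataToStore = UIImageJPEGRepresentation(pickedImage, 0.1);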
I'm developing an iPhone app with XMPP.
I'm trying to print the images of the logged in users in the chat room. This is the code I have.
XMPPvCardAvatarModule *avatar = [[XMPPvCardAvatarModule alloc]initWithvCardTempModule:[[self appDelegate]xmppvCardTempModule]];
XMPPJID *jidUser = [XMPPJID jidWithString:key];
NSData *fotoData = [[NSData alloc] initWithData:[avatar photoDataForJID:jidUser]];
UIImage *pic = [UIImage imageWithData:fotoData];
UIImageView *picVista = [[UIImageView alloc] initWithImage:pic];
[self.commentsGente addSubview:picVista];
But that NSData is always 0KB, so there's obviously no image. How to do this?
NSData *photoData = [[[self appDelegate] xmppvCardAvatarModule] photoDataForJID:user.jid];
UIImageView *picVista = [[UIImageView alloc] init];
picVista.image = [UIImage imageWithData:photoData];
Try this; it works fine for me.
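If photoDataForJID: keeps returning empty data, the avatar usually just has not been fetched and cached yet; the module delivers it asynchronously. A rough sketch, assuming the XMPPvCardAvatarDelegate callback present in recent XMPPFramework versions (check XMPPvCardAvatarModule.h for the exact signature; avatarImageView is a placeholder):
// Assumption: this delegate callback matches your XMPPFramework version; verify the header.
// Register for avatar updates once the module has been activated on the stream.
[[[self appDelegate] xmppvCardAvatarModule] addDelegate:self delegateQueue:dispatch_get_main_queue()];

- (void)xmppvCardAvatarModule:(XMPPvCardAvatarModule *)vCardTempModule
              didReceivePhoto:(UIImage *)photo
                       forJID:(XMPPJID *)jid
{
    // Called when the vCard photo actually arrives; update the matching image view here.
    self.avatarImageView.image = photo;
}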
I have an odd problem with captureStillImageAsynchronouslyFromConnection. If I save the image using jpegStillImageNSDataRepresentation while the video is mirrored, the image in the camera roll is rotated 90 degrees clockwise. However, if it's not mirrored, the orientation is fine.
I'll post the code, anyone else have this problem/know of a fix?
Update: Just ran some tests; the heights and widths (640x480) are fine and reflect the device's orientation. When I take a picture in portrait, it reports UIImageOrientationLeft, and when mirrored, UIImageOrientationLeftMirrored.
Update 2: When I view the saved photo in the camera roll, the preview of the image has the right orientation, as does the image when you swipe between photos, but when the photo is fully loaded, it rotates 90 degrees. Could this be a camera roll problem? (I'm on 4.3.3)
- (void) captureImageAndSaveToCameraRoll
{
AVCaptureConnection *stillImageConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self stillImageOutput] connections]];
if ([stillImageConnection isVideoOrientationSupported])
[stillImageConnection setVideoOrientation:[self orientation]];
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
if (error) {
if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
[[self delegate] captureManager:self didFailWithError:error];
}
}
};
if (imageDataSampleBuffer != NULL) {
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
UIImage *image = [[UIImage alloc] initWithData:imageData];
[library writeImageToSavedPhotosAlbum:[image CGImage]
orientation:(ALAssetOrientation)[image imageOrientation]
completionBlock:completionBlock];
[image release];
[library release];
}
else
completionBlock(nil, error);
if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
[[self delegate] captureManagerStillImageCaptured:self];
}
}];
}
I'm wondering if the newly created image really has its orientation set as you are assuming here:
[library writeImageToSavedPhotosAlbum:[image CGImage] orientation: (ALAssetOrientation) [image imageOrientation] completionBlock:completionBlock];
[image imageOrientation] in particular seems like a potential source of the problem, especially with the cast required...
You say you are seeing the image orientation somewhere; is it from [image imageOrientation] just after creating the image with:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation: imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
I'm wondering if you are assuming that this metadata is in the imageData returned from AVCaptureStillImageOutput jpegStillImageNSDataRepresentation: when that in fact only contains the image pixel data itself?
According to some reports, the iPhone's photos application back in 2009 didn't support any of the mirrored orientations at all; it's certainly possible that they partially fixed this but still have bugs in some cases (I know, for example, that UIImageView doesn't handle contentStretch correctly in some cases). This should be easy enough to test: just load the sample images from that blog into the camera roll and see what happens.
Create the category for UIImage described in iOS UIImagePickerController result image orientation after upload
Call this method on the image before saving it to your library.
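The linked category works by redrawing the image so its pixel data matches UIImageOrientationUp. A minimal sketch of the idea (the category and method names here are illustrative, not the exact code from the linked answer):
// Illustrative orientation-normalising category; redraws so the pixels are "up".
@interface UIImage (NormalizedOrientation)
- (UIImage *)normalizedImage;
@end

@implementation UIImage (NormalizedOrientation)
- (UIImage *)normalizedImage {
    if (self.imageOrientation == UIImageOrientationUp) return self;

    // drawInRect: honours the orientation flag, so the drawn copy needs no flag at all.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
@end
Saving [image normalizedImage] means the ALAssetOrientation cast always maps from UIImageOrientationUp, which sidesteps the mirrored orientation values entirely.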