I have an odd problem with captureStillImageAsynchronouslyFromConnection. If I save the image using jpegStillImageNSDataRepresentation while the video is mirrored, the image in the camera roll is rotated 90 degrees clockwise. However, if it's not mirrored, the orientation is fine.
I'll post the code below; does anyone else have this problem or know of a fix?
Update: Just ran some tests. The heights and widths (640x480) are fine and reflect the device's orientation. When I take a picture in portrait, it reports UIImageOrientationLeft, and when mirrored, UIImageOrientationLeftMirrored.
Update 2: When I view the saved photo in the camera roll, the preview of the image has the right orientation, as does the image when you swipe between photos, but when the photo is fully loaded, it rotates 90 degrees. Could this be a camera roll problem? (I'm on 4.3.3)
- (void)captureImageAndSaveToCameraRoll
{
    AVCaptureConnection *stillImageConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self stillImageOutput] connections]];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:[self orientation]];

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) {
                if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
                    [[self delegate] captureManager:self didFailWithError:error];
                }
            }
        };

        if (imageDataSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
            [library writeImageToSavedPhotosAlbum:[image CGImage]
                                      orientation:(ALAssetOrientation)[image imageOrientation]
                                  completionBlock:completionBlock];
            [image release];
            [library release];
        }
        else
            completionBlock(nil, error);

        if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)]) {
            [[self delegate] captureManagerStillImageCaptured:self];
        }
    }];
}
I'm wondering whether the newly created image really has its orientation set as you are assuming here:
[library writeImageToSavedPhotosAlbum:[image CGImage] orientation: (ALAssetOrientation) [image imageOrientation] completionBlock:completionBlock];
[image imageOrientation] in particular seems like a potential source of the problem, especially with the cast required...
You say you are seeing the image orientation somewhere; is it from [image imageOrientation] just after creating the image with:
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation: imageDataSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];
I'm wondering if you are assuming that this orientation metadata is in the imageData returned from AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:, when in fact that may only contain the image pixel data itself.
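If it helps to narrow this down, here is a minimal diagnostic sketch (not from your code) that reads the orientation tag straight out of the JPEG data with ImageIO, so you can compare it against what UIImage reports after initWithData:. It assumes the same imageDataSampleBuffer as above and manual reference counting:

#import <ImageIO/ImageIO.h>

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)imageData, NULL);
if (source) {
    NSDictionary *properties = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    // kCGImagePropertyOrientation uses the TIFF/EXIF numbering (1-8),
    // which is NOT the same enumeration as UIImageOrientation.
    NSNumber *exifOrientation = [properties objectForKey:(NSString *)kCGImagePropertyOrientation];
    NSLog(@"EXIF orientation embedded in the JPEG data: %@", exifOrientation);
    [properties release];
    CFRelease(source);
}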
According to some reports, the iPhone's photos application back in 2009 didn't support any of the mirrored orientations at all; it's certainly possible that they partially fixed this but still have bugs in some cases (I know, for example, that UIImageView doesn't handle contentStretch correctly in some cases). This should be easy enough to test: just load the sample images from that blog into the camera roll and see what happens.
Create the category for UIImage described in iOS UIImagePickerController result image orientation after upload
Call this method on the image before saving it to your library.
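The linked category essentially redraws the image so that its pixels are physically in the "up" orientation before it is written out. A minimal sketch of that idea (the method name here is illustrative, not necessarily the exact code from the linked answer):

@interface UIImage (NormalizedOrientation)
- (UIImage *)normalizedImage; // hypothetical name
@end

@implementation UIImage (NormalizedOrientation)
- (UIImage *)normalizedImage {
    if (self.imageOrientation == UIImageOrientationUp) return self;

    // drawInRect: applies the orientation while drawing, so the
    // resulting image's imageOrientation is always Up.
    UIGraphicsBeginImageContextWithOptions(self.size, NO, self.scale);
    [self drawInRect:CGRectMake(0, 0, self.size.width, self.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
@end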
If you supply iOS's MKMapView with a raster source that is 256x256 pixels, it will load the four tiles of the same region at +1 zoom level. Therefore the tiles appear to be in high-DPI mode. Awesome!
Now I have an app that uses a third party source of raster tiles. The problem is that the data looks horribly low-dpi.
Is there a way to tell Mapbox that it should load the next zoom level of each tile of a given source and use that instead?
So instead of loading tile 0/0/0.jpg for the whole world it should load 1/0/0.jpg, 1/1/0.jpg, 1/0/1.jpg and 1/1/1.jpg and use those for zoom level 0. So basically instead of having one 256x256 image it would have four of them, giving it a 512x512 image that looks much crisper.
The question would be... is there a way to do that not only for iOS, but in the description of the source? So that it works for Web and Android as well?
You can request a 512x512 tile from Mapbox using the @2x flag on your request, for both raster and vector tiles.
Your query should look like this. Notice the @2x right before the query string:
https://api.mapbox.com/v4/mapbox.satellite/1/0/0@2x.png?access_token={access_token}
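If you are displaying these through MapKit's MKTileOverlay, a minimal sketch of how the @2x template could be wired up without any custom subclass (the template URL and token are placeholders):

NSString *template = @"https://api.mapbox.com/v4/mapbox.satellite/{z}/{x}/{y}@2x.png?access_token=YOUR_TOKEN";
MKTileOverlay *overlay = [[MKTileOverlay alloc] initWithURLTemplate:template];
overlay.tileSize = CGSizeMake(512, 512);  // the @2x endpoint delivers 512x512 tiles
overlay.canReplaceMapContent = YES;       // optional: hide Apple's base map under the raster tiles
[self.mapView addOverlay:overlay level:MKOverlayLevelAboveLabels];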
Why not just upscale the 256x256 image in your tile overlay subclass?
- (void)loadTileAtPath:(MKTileOverlayPath)path result:(void (^)(NSData *, NSError *))result {
    if (!result) {
        return;
    }
    self.tileSize = CGSizeMake(512, 512);
    NSString *URLString = [self.internalTemplate stringByReplacingOccurrencesOfString:@"{x}" withString:[NSString stringWithFormat:@"%li", (long)path.x]];
    URLString = [URLString stringByReplacingOccurrencesOfString:@"{y}" withString:[NSString stringWithFormat:@"%li", (long)path.y]];
    URLString = [URLString stringByReplacingOccurrencesOfString:@"{z}" withString:[NSString stringWithFormat:@"%li", (long)path.z]];
    NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:URLString]];
    NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithRequest:request completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        if (error) {
            NSLog(@"%@", error);
        }
        UIImage *image = [UIImage imageWithData:data];
        UIImage *resized = [self imageWithImage:image scaledToSize:CGSizeMake(512, 512)];
        result(UIImagePNGRepresentation(resized), error);
    }];
    [task resume];
}

- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
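For completeness, a usage sketch for the subclass above. The class name "UpscalingTileOverlay" and the exposed "internalTemplate" property are assumptions on my part, not part of the original answer:

// Somewhere in your view controller setup:
UpscalingTileOverlay *overlay = [[UpscalingTileOverlay alloc] initWithURLTemplate:nil];
overlay.internalTemplate = @"https://tiles.example.com/{z}/{x}/{y}.jpg"; // placeholder tile server
[self.mapView addOverlay:overlay level:MKOverlayLevelAboveLabels];

// And in the MKMapViewDelegate, hand the overlay to a tile renderer:
- (MKOverlayRenderer *)mapView:(MKMapView *)mapView rendererForOverlay:(id<MKOverlay>)overlay {
    if ([overlay isKindOfClass:[MKTileOverlay class]]) {
        return [[MKTileOverlayRenderer alloc] initWithTileOverlay:(MKTileOverlay *)overlay];
    }
    return [[MKOverlayRenderer alloc] initWithOverlay:overlay];
}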
I am trying to set the AVFoundation camera size based on my UIImageView (selectedImage) size like this:
self.sess = [AVCaptureSession new];
self.snapper = [AVCaptureStillImageOutput new];
self.snapper.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
self.sess.sessionPreset = AVCaptureSessionPresetPhoto;
[self.sess addOutput:self.snapper];
AVCaptureDevice* cam = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:cam error:nil];
[self.sess addInput:input];
AVCaptureVideoPreviewLayer* lay = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.sess];
lay.videoGravity = AVLayerVideoGravityResizeAspectFill;
lay.frame = selectedImage.frame;
[self.view.layer addSublayer:lay];
In the app, the preview shows just fine at 320x320, but when I look in the photo gallery, the saved image is longer than a square.
I also tried removing lay.videoGravity = AVLayerVideoGravityResizeAspectFill; but then the image in the app does not fill the screen.
How can I also make the captured image match what the user sees in the camera preview, with no extra parts?
This is the code that saves the image to the gallery:
AVCaptureConnection *vc = [self.snapper connectionWithMediaType:AVMediaTypeVideo];

// deal with image when it arrives
typedef void(^MyBufBlock)(CMSampleBufferRef, NSError *);
MyBufBlock h = ^(CMSampleBufferRef buf, NSError *err) {
    NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buf];
    UIImage *im = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        selectedImage.hidden = NO;
        selectedImage.contentMode = UIViewContentModeScaleAspectFill;
        selectedImage.clipsToBounds = YES;
        selectedImage.image = im;
        [self.previewLayer removeFromSuperlayer];
        self.previewLayer = nil;
        [self.sess stopRunning];

        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeImageToSavedPhotosAlbum:[selectedImage.image CGImage]
                                  orientation:(ALAssetOrientation)[selectedImage.image imageOrientation]
                              completionBlock:nil];
    });
};
// kick off the capture with the block above
[self.snapper captureStillImageAsynchronouslyFromConnection:vc completionHandler:h];
I believe you'll need to crop the image yourself; the reduced size of your viewfinder only affected its visible shape, not the amount of data it would be processing.
An example on how to crop an image can be found at: How to properly crop an image taken on iPhone 4G (with EXIF rotation data)?
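For reference, a minimal sketch of a centered square crop (the helper name is mine, not from the linked answer). CGImageCreateWithImageInRect works on the raw pixel buffer, so the original orientation has to be carried over explicitly:

// Hypothetical helper: crop a UIImage to a centered square,
// preserving the original orientation metadata.
- (UIImage *)squareImageFromImage:(UIImage *)image {
    CGImageRef cgImage = image.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    size_t side = MIN(width, height);
    CGRect cropRect = CGRectMake((width - side) / 2, (height - side) / 2, side, side);

    CGImageRef croppedCGImage = CGImageCreateWithImageInRect(cgImage, cropRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedCGImage
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedCGImage);
    return cropped;
}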
I'm putting together a photo gallery program. Is there a way to store images from my app to the device's photo gallery? I've seen sample code for getting images using UIImagePickerController, but nothing for storing them in the photo gallery.
You can use ALAssetsLibrary to save the image to the photo library.
UIImage *saveImage = ...; // your image
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[saveImage CGImage]
                          orientation:(ALAssetOrientation)[saveImage imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error)
{
    // Check error here; assetURL is the library URL of the saved image.
}];
You can also store the image using UIKit's UIImageWriteToSavedPhotosAlbum().
As H2CO3 stated in the comments, UIImageWriteToSavedPhotosAlbum() is what you're looking for.
Below is the simplest use of this function. You can replace "image" with an image from your photo gallery, any image from your app, etc.
UIImage *image = [UIImage imageNamed:@"image.png"];
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
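If you need to know whether the save succeeded, UIImageWriteToSavedPhotosAlbum() can also report back through a callback selector; a minimal sketch, with the selector signature required by UIKit:

UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);

// The callback must have exactly this signature.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Saving failed: %@", error);
    } else {
        NSLog(@"Saved to the photo album");
    }
}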
I am taking photo with AVFoundation like here:
http://red-glasses.com/index.php/tutorials/ios4-take-photos-with-live-video-preview-using-avfoundation/
One snippet:
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments)
    {
        // Do something with the attachments.
        NSLog(@"attachments: %@", exifAttachments);
    }
    else
        NSLog(@"no attachments");

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
Picture size, as far as I found out, is set here:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium; // medium on my iPhone 4 is 640x480
But only a few presets are possible here, and the size can't be set to a custom value.
I would like the taken pictures to have an aspect ratio of 1:1 if possible, for example 400x400 pixels (square pictures).
It seems Instagram is doing it (or are they cropping the images?).
How can I specify that the taken picture should be square? I know how to change the size on the phone, but if the taken picture is not square, the result is not good.
Any idea?
You should probably resize your UIImage by drawing it into a new graphics context. Example below:
.....
UIImage *image = [[UIImage alloc] initWithData:imageData];
UIImage *tempImage = nil;
CGSize targetSize = CGSizeMake(400,400);
UIGraphicsBeginImageContext(targetSize);
CGRect thumbnailRect = CGRectMake(0, 0, targetSize.width, targetSize.height);
[image drawInRect:thumbnailRect];
tempImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// tempImage is now resized to 400x400
Hope this helps!
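One caveat on the snippet above: drawing a 640x480 capture straight into a 400x400 context squashes the image. If you want a true square without distortion, you could compute an aspect-fill rect before drawing; a sketch (not part of the answer above, same "image" variable assumed):

// Draw the image aspect-fill into the square context; the overflow
// outside the 400x400 canvas is simply clipped, so nothing is stretched.
CGSize targetSize = CGSizeMake(400, 400);
CGFloat fillScale = MAX(targetSize.width / image.size.width,
                        targetSize.height / image.size.height);
CGSize scaledSize = CGSizeMake(image.size.width * fillScale, image.size.height * fillScale);
CGRect drawRect = CGRectMake((targetSize.width - scaledSize.width) / 2.0,
                             (targetSize.height - scaledSize.height) / 2.0,
                             scaledSize.width,
                             scaledSize.height);

UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0);
[image drawInRect:drawRect];
UIImage *squareImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();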
I have found some code for taking a picture using AVCaptureSession, from a great tutorial by "JJob". It is the best tutorial about AVCaptureSession; please read all the comments and the previous tutorials on that site, as they will help you a lot. You can download working sample code from here.
I've created my own custom Table View Cell comprising a number, an image (thumbnail) and a description of something that is successfully being stored in an SQLite database using Core Data.
The image below will give a better idea of what I currently have in Interface Builder:
In the screen before, I am storing the filepath of an image taken with the camera and saving it to the database. I want to be able to load each UIImageView on the Table View Cells with the filepaths stored against each item.
I have tried the following which does not work:
UIImageView *thingPhotoView = (UIImageView *)[cell viewWithTag:11];
NSString *path = thing.photo;
thingPhotoView.image = [UIImage imageWithContentsOfFile:path];
I've not used ALAssetsLibrary myself, but judging by the URL, that is what you are using. Therefore you will need to use ALAssetsLibrary to access the file.
Check the docs here ALAssetsLibrary
You probably want this method
assetForURL:resultBlock:failureBlock:
It might look like this:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:[NSURL URLWithString:thing.photo]   // assetForURL: takes an NSURL, not an NSString
         resultBlock:^(ALAsset *asset) {
             thingPhotoView.image = [UIImage imageWithCGImage:[asset thumbnail]];
         }
        failureBlock:^(NSError *error) {
             NSLog(@"Couldn't load asset %@ => %@", error, [error localizedDescription]);
         }];
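If the square thumbnail is too small or the wrong shape for your cell's image view, the asset also exposes larger representations; a sketch of what the body of the resultBlock could use instead:

// Inside the resultBlock above:
ALAssetRepresentation *rep = [asset defaultRepresentation];
// fullScreenImage is already rotated correctly for display;
// fullResolutionImage would still need the orientation applied manually.
thingPhotoView.image = [UIImage imageWithCGImage:[rep fullScreenImage]];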