Camera images not displaying correctly - iPhone

In my app, the user picks an image with UIImagePickerController. They can then view a gallery of the pictures they've selected. If they pick screenshots taken with the iPhone, the images display correctly, but if they pick images taken with the camera (or take a new photo with the camera), after a short while one image will appear black, followed by every other image appearing black. I've tried for days to get rid of this behaviour without any success. The code is pretty straightforward:
if ([[info objectForKey:UIImagePickerControllerMediaType] isEqualToString:@"public.image"]) {
    [mediaSource addImage:[info objectForKey:UIImagePickerControllerOriginalImage]];
}
mediaSource adds the image to an NSDictionary, and the gallery puts this image into a UIImageView when it's needed, although for testing purposes I've tried simply displaying the image straight away, which gives the same result. The key variable here seems to be that it only happens with images from the camera, so perhaps the solution is to somehow remake these images before displaying them again.
Any ideas?

It's not [info objectForKey:UIImagePickerControllerOriginalImage].
Change this to [info valueForKey:UIImagePickerControllerOriginalImage] and try it.
And by the way, what is thumbnail?
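Black images from the camera (but not from screenshots) are a classic symptom of memory pressure: camera photos are far larger than screen-sized screenshots, and holding many of them decoded at once can exhaust memory until UIKit stops rendering them. One hedged workaround is to downscale each picked image to gallery size before storing it. This is a sketch, not the asker's code; the helper `scaledImage:maxDimension:` is a hypothetical name, not part of UIKit.

```objectivec
// Sketch: downscale a picked camera image before caching it, assuming it
// will only ever be shown at gallery size. scaledImage:maxDimension: is
// an illustrative helper, not a UIKit method.
- (UIImage *)scaledImage:(UIImage *)image maxDimension:(CGFloat)maxDim {
    CGSize size = image.size;
    CGFloat scale = MIN((CGFloat)1.0, maxDim / MAX(size.width, size.height));
    CGSize target = CGSizeMake(size.width * scale, size.height * scale);

    // Redraw into a new bitmap context; the result no longer references
    // the full-resolution camera data.
    UIGraphicsBeginImageContextWithOptions(target, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, target.width, target.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
```

In the picker callback you would then store `[self scaledImage:original maxDimension:640.0]` instead of the original image.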

Related

How do you get the cropped version of an image using ALAsset?

I'm trying to get the cropped version of an image that's pulled using ALAsset. Specifically, I'm selecting items from the user's Photo Library and then uploading them. The issue is that in the library thumbnail view, iOS is showing us the cropped version. When you select that thumbnail and pull that image's asset using ALAsset, I get the full resolution version.
I did some research and couldn't find anything that describes a second coordinate system indicating where the cropping happens.
To test it, you need iOS 5 to edit the image in your library. Select an image in your image library, select "Edit", and crop the image. When you get the ALAsset you'll get the full image, and if you sync using iPhoto, iPhoto also pulls the full image. Also, you can re-edit the image and undo your crop.
This is how I'm getting the image:
UIImage *tmpImage = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullResolutionImage]];
That gives me the full resolution image, obviously. There's a fullScreenImage flag which scales the full resolution image to the size of the screen. That's not what I want.
The ALAssetRepresentation class has a scale field, but that's a float value, which is also not what I want.
If anyone can tell me where this cropped coordinate system can be found, I'd appreciate it.
Your Options:
Option 1 (ALAssetLibrary)
Use the - (CGImageRef)fullScreenImage method of ALAssetRepresentation.
Pros:
All the hard work is done for you, you get an image that looks just like the one in the Photos app. This includes cropping, and other changes. Easy.
Cons:
The resolution is "screen size", only as big as the device you are using, not the full possible resolution of the cropped image. If this doesn't concern you, then this is the perfect option.
Option 2 (ALAssetLibrary)
Extract the cropping data using the AdjustmentXMP key in the image's metadata (what @tom is referring to). Apply the crop.
Pros:
It is possible to get a cropped image at the best possible resolution.
Cons:
You only get the cropping edits, not any other adjustments (like red-eye).
Who knows what Apple will support in the future in "Edit" mode; you may have to apply more edits down the road.
It's complicated: you first have to parse the XMP data to read the crop rectangle, crop the unrotated image, and then apply the rotation.
Option 3 (Wishful Thinking)
Beg Apple to include a method like fullResolutionEditedImage which gives you the best possible quality photo, with all edits applied.
Pros:
Everything magically solved.
Cons:
Apple may never add this method.
Option 4 (UIImagePickerController)
This option only applies if you are using the image picker; you can't use it directly with the asset library.
In the NSDictionary returned by -(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info,
you can extract the full-sized, adjusted image from the UIImagePickerControllerOriginalImage key. Save this image somewhere. Then, instead of retrieving the image from the asset library, load the copy you made.
Pros:
You get the full size image, with adjustments
This is the only option Apple gives us for getting the full size image with all adjustments (like red-eye, etc), and not just the crop. This is particularly important in iOS 7 with the introduction of filters that can drastically alter the image.
Cons:
Can only be used with the image picker (not ALAssetRepresentation)
You must keep around a full-sized copy of the image. Depending on the number of such images, the disk usage by your app could grow substantially.
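Option 4 can be sketched as follows. This is a minimal illustration, assuming a single saved image; the file name `picked.jpg` and the JPEG quality are arbitrary choices, and a real app would need a per-asset naming scheme.

```objectivec
// Sketch of Option 4: persist the adjusted full-size image handed back by
// the picker, then later load this saved copy instead of the asset
// library's unedited original.
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // The picker returns the image with crops/filters already applied.
    UIImage *adjusted = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSData *jpeg = UIImageJPEGRepresentation(adjusted, 0.9);

    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                          NSUserDomainMask, YES) lastObject];
    NSString *path = [docs stringByAppendingPathComponent:@"picked.jpg"];
    [jpeg writeToFile:path atomically:YES];

    [picker dismissViewControllerAnimated:YES completion:nil];
}
```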
Update for iOS 7: you may wish to consider Option 4, or Option 1, as iOS 7 supports many operations now like filters, and your users will probably notice if they are missing. These two options support filters (and other edits), with Option 4 giving you a higher resolution result.
When a photo has been cropped with the iOS Photos App, the cropping coordinates can be found in the ALAssetRepresentation's metadata dictionary. fullResolutionImage will give you the uncropped photo, you have to perform the cropping yourself.
The AdjustmentXMP metadata contains not only the cropping coordinates but also indicates if auto-enhance or remove-red-eyes has been applied.
As of iOS 6.0, CIFilter provides filterArrayFromSerializedXMP:inputImageExtent:error:. You can probably use the ALAssetRepresentation's AdjustmentXMP metadata here and apply the CIFilters to the ALAssetRepresentation's fullResolutionImage to recreate the modified image.
Be aware that the iOS Photos App handles JPG and RAW images differently. For JPG images a new ALAsset with the XMP metadata is stored in the Camera Roll. For RAW images an ALAssetRepresentation is added to the original ALAsset. I'm not sure if this additional ALAssetRepresentation is the modified image and if it has the AdjustmentXMP metadata. In addition to JPG and RAW images you should also test the behaviour for RAW+JPG images.
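The XMP-based approach above might look roughly like this. It is a sketch under assumptions: the exact type stored under the "AdjustmentXMP" metadata key is not documented (it may be an NSString needing UTF-8 conversion rather than NSData), and error handling is minimal.

```objectivec
// Sketch: rebuild the edited photo by applying the serialized XMP edits
// to the unedited full-resolution image (iOS 6+).
ALAssetRepresentation *rep = [asset defaultRepresentation];
// Assumption: the XMP arrives as NSData; it may instead be an NSString
// that needs -dataUsingEncoding:NSUTF8StringEncoding first.
NSData *xmpData = [rep.metadata objectForKey:@"AdjustmentXMP"];

CIImage *image = [CIImage imageWithCGImage:[rep fullResolutionImage]];
NSError *error = nil;
NSArray *filters = [CIFilter filterArrayFromSerializedXMP:xmpData
                                         inputImageExtent:image.extent
                                                    error:&error];
// Chain the deserialized filters in order.
for (CIFilter *filter in filters) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = filter.outputImage;
}

// Render the filtered CIImage back into a UIImage.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef edited = [context createCGImage:image fromRect:image.extent];
UIImage *result = [UIImage imageWithCGImage:edited];
CGImageRelease(edited);
```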

Saving Image to default camera folder - iOS

I'm currently finishing a project. But there is one last thing I need and can't figure out how to do.
I'm looking for a way to save images to the default folder (camera roll folder). I was thinking of making a button. So if the user decides to save his image he can tap on the save button.
If it matters, I have 4 imageview on top of each other. Would it be possible to save all those view to one image?
Thank you,
Here is how to save an image to the camera roll:
- (IBAction)savePhoto {
    UIImageWriteToSavedPhotosAlbum(myImageView.image, nil, nil, nil);
}
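The question also asked about combining four stacked image views into one image. One common approach, sketched here under the assumption that the four UIImageViews share a common superview called `containerView` (a hypothetical name), is to render that superview's layer into a bitmap context and save the result:

```objectivec
// Sketch: flatten the superview holding the four stacked image views into
// a single UIImage, then save it to the camera roll. containerView is an
// assumed outlet to the views' common superview. Requires QuartzCore.
- (IBAction)savePhoto {
    UIGraphicsBeginImageContextWithOptions(containerView.bounds.size, NO, 0.0);
    [containerView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *flattened = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    UIImageWriteToSavedPhotosAlbum(flattened, nil, nil, nil);
}
```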

Performance problem about showing images

My problem is I have 50 images, each 157x157 pixels and about 25 KB. They are connected to a slider, and as the user slides from 0 to 100 the displayed image changes. It works, but on the iPhone 3G and 3GS it is very slow: it's hard to see the images, and sometimes it gets stuck for a second before showing the next image. I use a UIImageView to show the images. How can I do this better, so that pre-Retina phones can show them without stuttering? Thanks for your answers and your time.
Edit: all the pictures are in the app bundle; they are not fetched from a URL.
Preload them when your app launches. Sending each of them a -size message should be enough to force them to load.
Maybe you can preload a low-res version of your pictures (like 78px).
While the slider moves, you display the low-res version, and when it stops, you swap in the original picture.
Convert them to the size they will be drawn at (don't rescale them on the fly).
P.S.: a sample would help a lot.
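The preloading suggestions above can be sketched like this. Assumptions: the images are named `frame_0` through `frame_49` (an invented scheme), and sending `-size` is enough to force decoding, as the first answer claims; on some iOS versions images are decoded lazily at draw time, so drawing each image into a small offscreen context would be a more reliable way to warm them up.

```objectivec
// Sketch: decode all slider frames up front so scrubbing doesn't stall.
// frames would be an ivar retained for the lifetime of the view controller.
NSMutableArray *frames = [NSMutableArray arrayWithCapacity:50];
for (int i = 0; i < 50; i++) {
    NSString *name = [NSString stringWithFormat:@"frame_%d", i];
    UIImage *image = [UIImage imageNamed:name];
    [image size]; // touching the image is claimed to force decoding
    [frames addObject:image];
}

// In the slider callback, pick the nearest preloaded frame:
// int index = (int)((slider.value / 100.0) * 49);
// imageView.image = [frames objectAtIndex:index];
```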

Loading image slow my app down

I'm new to iPhone dev; I'm just trying to get something done.
This is my first XML parser app. When my app launches I set up a table view with a custom cell. The custom cell has two labels and one image.
When the table view is loaded I call my custom cell's method, which takes the string of the image's address, makes a URL from it, creates an NSData from the contents of that URL, and then creates my image from that data.
But my table view is really, really slow, even in the simulator.
What is the best way to display images on the iPhone from the internet?
I know they have to be in PNG format, but even with PNG it is too slow.
Thanks for everything.
I use:
NSURL *imgUrl = [NSURL URLWithString:_text];
NSData *imgData = [NSData dataWithContentsOfURL:imgUrl];
limage.image = [UIImage imageWithData:imgData];
Is your image saved at its final resolution? Say you want to display a 25x25 pixel image in the table: is the file saved as a 25x25 image, or at some higher resolution? That will cause some slowness. More importantly, you should be loading the image on a background thread, so that scrolling isn't held up waiting for the image to load; the network transfer to fetch the image is most likely the longest part of the process. You should also cache the image in some way instead of just assigning it to the limage.image property, so it isn't redownloaded every time the cell is redrawn.
Edit
Also, you don't have to use PNG images for the pictures in the table cells. You can use regular JPG or GIF images if you want; the image format should be whatever is appropriate for the type of image content you have. You only have to use PNGs for icons and splash screens, although PNG is preferred for all embedded resources.
[NSData dataWithContentsOfURL:imgUrl];
will certainly slow your app down, since it waits for the image to finish loading; this is called a synchronous call. What you really want is an asynchronous call for fetching the content of the image.
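The background-thread-plus-cache advice can be sketched with GCD. This is an illustration only: `imageCache` is an assumed NSMutableDictionary owned by the data source, and a production version would also check whether the cell has been reused before assigning the image.

```objectivec
// Sketch: fetch the cell image asynchronously and cache it by URL string,
// so scrolling is never blocked by network I/O.
UIImage *cached = [imageCache objectForKey:_text];
if (cached) {
    limage.image = cached;
} else {
    NSURL *imgUrl = [NSURL URLWithString:_text];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Slow network fetch happens off the main thread.
        NSData *imgData = [NSData dataWithContentsOfURL:imgUrl];
        UIImage *image = [UIImage imageWithData:imgData];
        dispatch_async(dispatch_get_main_queue(), ^{
            if (image) {
                [imageCache setObject:image forKey:_text];
                // Real code should verify the cell still shows this URL.
                limage.image = image;
            }
        });
    });
}
```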
Have a look at http://joehewitt.com/post/the-three20-project/ where you'll find a nice subclass of UIImage that supports loading images by URL.

Image overlapping in photo viewer on three20 framework?

I just integrated the photo viewer from the three20 framework. It's working fine, but sometimes images overlap; that happens only for the thumbnail image, while the original image loads perfectly. Until the original image is loaded, the images overlap.
Did any one face this problem and have any solution for that?
Thanks
If images are overlapping, you are not correctly setting their size when you are including them in the photo view controller. You have to (unfortunately) tell three20 the exact size so it knows how to display them in paging mode of the scrollview.
Make sure you are resizing your thumbnails to sizes similar to those in his example (somewhere around 100 pixels tall or wide, depending on whether the image is portrait or landscape).
[[[MockPhoto alloc]
    initWithURL:@"http://farm4.static.flickr.com/3444/3223645618_13fe36887a_o.jpg"
       smallURL:@"http://farm4.static.flickr.com/3444/3223645618_f5e2fa7fea_t.jpg"
           size:CGSizeMake(320, 480) // see how he sets the size here for each and every photo? this is crucial
        caption:@"These are the wood tiles that we had installed after the accident."] autorelease],
If you look at the thumbnail, it is 67 pixels by 100 pixels: http://farm4.static.flickr.com/3444/3223645618_f5e2fa7fea_t.jpg
If you look at the regular photo, it is 320 pixels by 480 pixels: http://farm4.static.flickr.com/3444/3223645618_13fe36887a_o.jpg
These are two independent files, the three20 code does not create the thumbnail for you based on the larger photo. You must do this manually or subclass whatever container class he uses to do it for you.
Just setting line 135 of TTPhotoView.m to
self.contentMode = UIViewContentModeScaleAspectFit;
will help.