ABPersonSetImageData is extremely slow - iPhone

I am writing an app that interfaces with the iPhone address book.
Here is the relevant section of my code (from my UIImagePickerControllerDelegate):
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingImage:(UIImage *)image
                  editingInfo:(NSDictionary *)editingInfo
{
    CFErrorRef error = NULL;
    ABPersonSetImageData(record, (__bridge CFDataRef)UIImagePNGRepresentation(image), &error);
}
My app lets you take a picture with the camera (using UIImagePickerController) and then stores it as the contact photo for someone in your address book.
I'm finding that the call above hangs for 5-10 seconds. 1) Is there a better approach? 2) Why is this so slow?

Saving as a JPEG:
UIImageJPEGRepresentation(UIImage *image, CGFloat compressionQuality);
will be faster than UIImagePNGRepresentation, especially if compressionQuality is set to a low value. However, encoding is still a CPU-intensive process, so there's no way to avoid the wait entirely.
The best you can do is show a message that work is being done, so the interface doesn't feel unresponsive. Use something like SVProgressHUD to do that.
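A hedged sketch of that approach, assuming ARC, and assuming `record` and the ABAddressBookRef were created on the main thread (Address Book objects are not safe to share across threads): do the JPEG encoding on a background queue so the UI stays responsive, then make the address-book call back on the main thread.

// Sketch only: `record` is the ABRecordRef from the question's code.
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingImage:(UIImage *)image
                  editingInfo:(NSDictionary *)editingInfo
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // JPEG at a modest quality encodes much faster than PNG
        NSData *jpegData = UIImageJPEGRepresentation(image, 0.7f);
        dispatch_async(dispatch_get_main_queue(), ^{
            CFErrorRef error = NULL;
            ABPersonSetImageData(record, (__bridge CFDataRef)jpegData, &error);
            // Remember to save the address book afterwards.
        });
    });
}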

Related

Speed up UIImagePickerController by using a low res image then later swapping in the high res image

There is a great wiki about image loading from the camera picker, which made me aware of the cost of taking an image at full resolution.
At the moment, when a photo is picked, I push a new view controller and display the image at full resolution. Pushing the view is a really slow and choppy experience (about 1 fps!) that I want to smooth out. Compared to picking a photo in Instagram, I noticed that they use a low-resolution image and later swap in the full image. (I need the full-res image because the user should be able to zoom and pan.)
The idea I want is something like this:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *fullImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    // Push a view controller and give it the image.....
}

- (void)viewDidLoad
{
    CGSize smallerImageSize = _imageView.bounds.size;
    UIImage *smallerImage = [MyHelper quickAndDirtyImageResize:_fullImage
                                                        toSize:smallerImageSize];
    // Set the low-res image for now... then later swap in the high-res one
    _imageView.image = smallerImage;
    // Swap in the high-res image async
    // This is the part I'm unsure about... I'm sure UIKit isn't thread-safe!
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        _imageView.image = _fullImage;
    });
}
I think that a UIImage isn't memory-mapped in until it is used, so it doesn't slow things down until it's given to the image view. Is this correct?
I think image decoding is already done asynchronously by the system; however, it still slows the phone down considerably while it's loading.
Is there a way to perform some of the work required to display an image on a very low-priority background queue?
You're trying to do things the most complicated way :)
Why not just prepare the small image before pushing the view controller, and pass it along? Look at this code:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *fullImage = [info objectForKey:UIImagePickerControllerOriginalImage];
    // imageScaledToSize: is a UIImage category method (e.g. from a UIImage+Resize category)
    UIImage *smallImage = [fullImage imageScaledToSize:self.view.bounds.size];
    // Push a view controller and give it BOTH images
}

// And in your pushed view controller
- (void)viewDidLoad
{
    [super viewDidLoad];
    _imageView.image = self.smallImage;
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    _imageView.image = self.fullImage;
}
The main thing is that viewDidAppear: will be called right after the animation is done so you can switch images here without any worries.
In addition to Andrey's answer: instead of using imageScaledToSize, use CGImageSourceCreateThumbnailAtIndex. In fact, it's very possible (I am pretty sure it's the case) that any image in the photo album already has a thumbnail. So instead of processing the image itself, grab the existing thumbnail and display it, then use Andrey's code to switch in the main image. This way you do as little work as possible during the animation period.
Calling CGImageSourceCreateThumbnailAtIndex will return the thumbnail image whether it already exists or needs to be generated, so it's quite safe to use, and probably at least as fast as imageScaledToSize.
You can find complete code samples to use it in the Apple docs, no need to duplicate it here.
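For quick reference, a condensed sketch of the ImageIO route (the Apple docs cover it more thoroughly), assuming you have the image bytes as NSData and link ImageIO.framework:

#import <ImageIO/ImageIO.h>

// Returns the existing thumbnail if present, otherwise generates one.
UIImage *ThumbnailForImageData(NSData *data, CGFloat maxPixelSize)
{
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    if (!source) return nil;
    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailFromImageIfAbsent : @YES,
        (id)kCGImageSourceCreateThumbnailWithTransform : @YES, // honor orientation
        (id)kCGImageSourceThumbnailMaxPixelSize : @(maxPixelSize)
    };
    CGImageRef thumb = CGImageSourceCreateThumbnailAtIndex(source, 0,
                                        (__bridge CFDictionaryRef)options);
    CFRelease(source);
    if (!thumb) return nil;
    UIImage *result = [UIImage imageWithCGImage:thumb];
    CGImageRelease(thumb);
    return result;
}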
Have you tried using ALAssetsLibrary to load the thumbnail of that image instead of loading the image at full resolution? It's also faster than resizing it yourself.

App crashes after showing memory warning in iOS 5.1.1, only while taking an image from the camera

Hi, I have gone through many questions on SO, including this one, but they aren't helping me on iOS 5.1.1. When I take an image, the first 2 times it works fine; the 3rd time, the app shows a memory warning and crashes. Here is my code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissModalViewControllerAnimated:YES];
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    if (image) {
        if ([appdel.arrImageData count] == 0) {
            count = 0;
        }
        count++;
        [appdel.arrImageData addObject:[image copy]];
    }
}
Any help would be appreciated.
Each time you take a picture, you keep a copy of it in arrImageData, filling up memory until iOS kills your app for using too much. Re-think your design so that you keep only one image in memory. If you need all the pictures for whatever reason, save them to the Documents or Caches directory and free memory before taking another picture.
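A hedged sketch of that idea (the helper name and JPEG quality are made up for illustration): write each picture to the Caches directory and keep only its path, not the UIImage, in memory.

// Sketch: persist the picture and store the path instead of the image.
- (NSString *)savePickedImage:(UIImage *)image
{
    NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                NSUserDomainMask, YES) objectAtIndex:0];
    NSString *fileName = [NSString stringWithFormat:@"picked-%@.jpg",
                                [[NSProcessInfo processInfo] globallyUniqueString]];
    NSString *path = [cachesDir stringByAppendingPathComponent:fileName];
    NSData *data = UIImageJPEGRepresentation(image, 0.8f);
    [data writeToFile:path atomically:YES];
    return path; // add this to your array instead of [image copy]
}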
I corrected your code; check whether it helps:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
    if (image) {
        if ([appdel.arrImageData count] == 0) {
            count = 0;
        }
        count++;
        [appdel.arrImageData addObject:[image copy]];
    }
    [picker dismissModalViewControllerAnimated:YES];
    [picker release]; // non-ARC: only release the picker if you own it
}
I do not see the memory management of your UIImagePickerController, but I had a problem with my picker being deallocated just after I called dismissModal...
Try to dismiss the modal view and the picker after you get the image.
UPDATE:
I agree with you. I had too many issues trying to make the picker controller work across many iOS versions on several devices, from iOS 3 to 5. As a result, I dropped support for iOS 3 and started implementing my own code to work with pictures and movies, based on the AV Foundation Programming Guide and the AV Foundation Framework Reference.
An app quite often receives a memory warning when it is using UIImagePickerController. When you take an image again and again, memory usage keeps increasing every time if you are not managing memory correctly (in my case it grew by about 1.5 MB per picture). So it might work the first, second, or third time and receive a memory warning the next time, or it could receive one on the very first attempt if there are too many apps running in the background.
What is important here is how you handle this memory warning. Once an app receives one, viewDidUnload is called on all active view controllers; there you should release all unwanted objects that can be recreated later. Your app might be crashing because you are doing something wrong there. In short, we would need to see both your .h and .m files.
Here you can get what you want.
In that code I have simply put an autorelease pool in place to release memory.
I hope this helps.
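For reference, the autorelease-pool idea is roughly this (a sketch using the modern @autoreleasepool syntax; adapt to NSAutoreleasePool on older toolchains):

// Sketch: drain the temporary objects created while handling each picture.
@autoreleasepool {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    NSData *data = UIImageJPEGRepresentation(image, 0.8f);
    // ... write `data` to disk here rather than keeping whole images in memory ...
}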

writeImageToSavedPhotosAlbum saves only a few images

I've been following Apple's example, QA1702, on how to capture images using AVFoundation. I won't cite the code here because of space concerns. A brief description of what I'm trying to achieve:
Use the iPhone camera to pass a "video" (actually a sequence of images) to a web server; I know this is possible. However, in order to pass an image using HTTP POST as in this example, I have to save the image. It doesn't necessarily have to go to the Photos album, but I want to be able to view the pictures there as well, for debugging purposes.
The apple QA1702 contains 3 methods:
- (void)setupCaptureSession
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
// this is modified to return void, as you might see; I'll get back to this
- (void) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
In setupCaptureSession I start the session as in the example. captureOutput only runs imageFromSampleBuffer, and that's where I've added some changes:
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
// Unlock the pixel buffer
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
// Free up the context and color space
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
//library is declared in .h and is a ALAssetsLibrary
[library writeImageToSavedPhotosAlbum:quartzImage orientation:ALAssetOrientationDown completionBlock:nil];
// Release the Quartz image
CGImageRelease(quartzImage);
I've removed the creation of the UIImage and changed the method to return void, since I call writeImageToSavedPhotosAlbum: with the CGImageRef here instead.
The problem as I see it: during the 10 seconds that I capture images, ~150 calls to captureOutput are made, and therefore the same number of calls to writeImageToSavedPhotosAlbum, but only ~5-10 pictures are saved. I'm aware this abuses memory, but since I'm not getting any warnings, I can't figure out why more images aren't created, or what I can do about it. Is it because (and I'm only guessing now) writeImageToSavedPhotosAlbum starts new threads and the iPhone can't handle more than a certain number of them? I've read something about NSOperationQueue; should I look into it?
On a side note, I use an NSTimer in setupCaptureSession:
[NSTimer scheduledTimerWithTimeInterval:10.0 target:self selector:@selector(timerFireMethod:) userInfo:nil repeats:NO];
However, I want to start it in the first call to captureOutput, to avoid time elapsing during the startup of the video camera. But if I move this line to captureOutput, then timerFireMethod: is never called. Any ideas?
This is solvable with NSOperationQueue, but it's no longer interesting to me, since writing to file is way too inefficient for most applications.
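For anyone still curious, a hedged sketch of throttling the writes: limit the number of in-flight writeImageToSavedPhotosAlbum: calls with a dispatch semaphore, dropping frames when too many writes are pending (`library` and `quartzImage` are the names from the question's code; the slot count is illustrative).

// Created once, e.g. in setupCaptureSession: allow at most 4 pending writes.
dispatch_semaphore_t writeSlots = dispatch_semaphore_create(4);

// In captureOutput:didOutputSampleBuffer:fromConnection:, per frame:
if (dispatch_semaphore_wait(writeSlots, DISPATCH_TIME_NOW) == 0) {
    CGImageRetain(quartzImage); // keep the image alive until the write finishes
    [library writeImageToSavedPhotosAlbum:quartzImage
                              orientation:ALAssetOrientationDown
                          completionBlock:^(NSURL *assetURL, NSError *error) {
        CGImageRelease(quartzImage);
        dispatch_semaphore_signal(writeSlots); // free a slot
    }];
}
// else: skip this frame; too many writes are already in flight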

How to programmatically take a low resolution snapshot using camera?

I am developing a fax client that is able to take snapshots. It communicates with the server via web service calls. Is there a way to reduce the image resolution so that we can have faster uploads and downloads? Right now the user has to wait a good 50 seconds for anything to happen.
This is the code I am using presently:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)img editingInfo:(NSDictionary *)editInfo {
    [image setImage:img];
    [[picker parentViewController] dismissModalViewControllerAnimated:YES];
}
You have to resize the image the UIImagePickerController gives you before you send it. You can find much more information and ready-made UIImage categories in this article: http://vocaro.com/trevor/blog/2009/10/12/resize-a-uiimage-the-right-way/
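A minimal sketch of the resize-before-upload idea (the target size and JPEG quality here are illustrative; pick values that match your fax resolution):

// Sketch: scale the picked image down, then upload a compressed JPEG.
UIImage *ScaledImage(UIImage *image, CGSize targetSize)
{
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, targetSize.width, targetSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

// e.g. roughly letter-size at ~100 dpi, at 50% JPEG quality:
NSData *upload = UIImageJPEGRepresentation(ScaledImage(img, CGSizeMake(850, 1100)), 0.5f);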

IPhone - UIImage to Binary String

I have this bit of code that gets the image I picked from the camera and displays it in a UIImageView. I want to convert the image to a binary string so I can then make a URL call and pass it back to the server. How would I modify this code to get my binary string?
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingImage:(UIImage *)image
editingInfo:(NSDictionary *)editingInfo {
imageView.image = image;
[picker dismissModalViewControllerAnimated:YES];
}
You would be better off passing the NSData from the image's JPEG representation to the server, using the POST method.
Marco
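A hedged sketch of that approach (the URL is a placeholder; your server may expect multipart/form-data instead of a raw body):

// Sketch: send the JPEG bytes as the body of a POST request.
NSData *imageData = UIImageJPEGRepresentation(imageView.image, 0.8f);
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
                        [NSURL URLWithString:@"https://example.com/upload"]];
request.HTTPMethod = @"POST";
[request setValue:@"image/jpeg" forHTTPHeaderField:@"Content-Type"];
request.HTTPBody = imageData;
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    // handle the server's response here
}];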
You can find an example of how to do it from Apple in this technote. It's not as easy as it should be, but it appears to be the only way. Is it possible for you to get the data before it becomes a UIImage, to make your life a little easier? If not, just try the method presented at the link.