I am trying to create a thumbnail of an image imported from the photo library or camera using a UIImagePickerController, via - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)selectedImage editingInfo:(NSDictionary *)editingInfo. When I use the camera, the thumbnail of the captured image is created without trouble, but when I pick an image from the photo library my app crashes. With NSZombie enabled, it reports -[UIImage drawInRect:]: message sent to deallocated instance 0x5abc140 on my myThumbNail variable.
What am I possibly doing wrong?
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)selectedImage editingInfo:(NSDictionary *)editingInfo
{
[self saveImage:selectedImage];
}
- (void)saveImage:(UIImage *)img
{
myThumbNail = img;
UIGraphicsBeginImageContext(CGSizeMake(60.0,60.0));
[myThumbNail drawInRect:CGRectMake(0.0, 0.0, 60.0, 60.0)];// gets killed here
}
What is image in your saveImage: method? I think it should be img.
Hope this helps
- (void)saveImage:(UIImage *)img
{
// retain the picked image so it outlives the delegate callback (MRC):
[myThumbNail release];
myThumbNail = [img retain];
UIGraphicsBeginImageContext(CGSizeMake(60.0,60.0));
[myThumbNail drawInRect:CGRectMake(0.0, 0.0, 60.0, 60.0)]; // no longer deallocated here
UIGraphicsEndImageContext();
}
+ (UIImage *)resizeImage:(UIImage *)image toResolution:(int)resolution
{
NSData *imageData = UIImagePNGRepresentation(image);
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)imageData, NULL);
CFDictionaryRef options = (__bridge CFDictionaryRef) @{
(id) kCGImageSourceCreateThumbnailWithTransform : @YES,
(id) kCGImageSourceCreateThumbnailFromImageAlways : @YES,
(id) kCGImageSourceThumbnailMaxPixelSize : @(resolution)
};
CGImageRef thumbnail = CGImageSourceCreateThumbnailAtIndex(src, 0, options);
CFRelease(src);
UIImage *img = [[UIImage alloc] initWithCGImage:thumbnail];
CGImageRelease(thumbnail); // we own this CGImage (Create rule), so release it
return img;
}
This method takes a UIImage and the resolution as the arguments and returns a UIImage with the required resolution.
For example: If you want to convert 1024x1024 resolution image to 240x240, then pass 1024x1024 UIImage in argument1 and 240 in argument2. In return you get 240x240 UIImage (or whatever thumbnail size you want).
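For a non-square input, kCGImageSourceThumbnailMaxPixelSize caps the longer side and the shorter side scales proportionally. The arithmetic behind that 1024-to-240 example can be sketched in plain C (thumb_size is a hypothetical helper, not part of ImageIO):

```c
/* Scale (w, h) so the longer side equals max_px, preserving the
   aspect ratio -- the same rule kCGImageSourceThumbnailMaxPixelSize uses. */
void thumb_size(int w, int h, int max_px, int *out_w, int *out_h) {
    if (w >= h) {
        *out_w = max_px;
        *out_h = (int)((double)h * max_px / w + 0.5); /* round to nearest */
    } else {
        *out_h = max_px;
        *out_w = (int)((double)w * max_px / h + 0.5);
    }
}
```

So a 1024x1024 source with a max pixel size of 240 yields 240x240, while a 1024x768 source would yield 240x180.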
Are you calling alloc init on myThumbNail anywhere? It sounds like you either have not allocated memory for it in the first place, or you have released it somewhere you shouldn't have.
I am using UIImagePickerController to take a photo with the iPhone's default camera in my app and storing it in the Documents directory. The process takes a long time to complete, and the image also displays very slowly in a table view. Would resizing the image help here?
-(IBAction)takePhoto:(id)sender
{
if ([UIImagePickerController isSourceTypeAvailable: UIImagePickerControllerSourceTypeCamera])
{
imgPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
[self presentModalViewController:imgPicker animated:YES];
}
}
-(void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
UIImage *pickedImage = [info objectForKey:UIImagePickerControllerOriginalImage];
[self dismissModalViewControllerAnimated:YES];
NSData *imageData = UIImagePNGRepresentation(pickedImage);
NSString *path = [SAVEDIMAGE_DIR stringByAppendingPathComponent:@"image.png"];
[imageData writeToFile:path atomically:YES];
}
Sure!
I do the following in my app:
store the image in an image store in a background thread
create a thumbnail (also on a background thread) and store this thumbnail in a Core Data table, in a field of type ID
So I get a smooth UI, where the user can take a picture about every 2 seconds.
The smoothness of the table views is also no problem, although I populate the table view cells' image views from a background thread as well (preparing the image in the background and assigning it to the UIImageView on the main thread, of course).
I hope, that helps you. Further questions are welcome.
Some code for your convenience:
As the image store I use this: https://github.com/snowdon/Homepwner/blob/master/Homepwner/ImageStore.m
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
[self performSelectorInBackground:@selector(saveFoto:) withObject:info];
// you should add some code for indicating the save process
}
// saves the photo in background-thread
-(void)saveFoto:(NSDictionary*)info {
// the following is some stuff that I do in my app - you will probably do some other things
UIImage *image = [ImageHelper normalizeImageRotation: [info objectForKey:UIImagePickerControllerOriginalImage]];
UIImage *thumb = [ImageHelper image:image fitInSize:CGSizeMake(imgWidth, imgWidth) trimmed:YES];
NSString *myGUID = myGUIDCreator();
[[ImageStore defaultImageStore] setImage:image forKey:myGUID];
myCoreDataManagedObject.thumb = thumb;
[self performSelectorOnMainThread:@selector(showYourResultsInTheUI:) withObject:thumb waitUntilDone:NO]; // every UI update has to be done on the main thread!
}
I'm attempting to take and save photos with the camera rapidly, as quickly as the iPhone can. The problem is that they don't save until the end and it then takes forever, or about 1/2 to 3/4 don't save at all (Write busy error or -[NSKeyedUnarchiver initForReadingWithData:]: data is NULL).
I bet I'm just overloading the phone's memory, but I can't think of a way to handle it efficiently. The standard iPhone camera app can handle it just fine -- snap away at almost 1 photo/second and it saves with no problem.
Any ideas on how to manage the process/memory better so that it can save as it goes but still shoot rapidly?
Here's a bit of my code. takePicture is called whenever self.readyToTake is YES.
- (void)takePicture {
self.delegate = self;
[super takePicture];
self.readyToTake = NO;
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
self.readyToTake = YES;
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
- (void)image:(UIImage*)image didFinishSavingWithError:(NSError *)error contextInfo:(NSDictionary*)info {
if (error)
{
NSLog(@"** ERROR SAVING PHOTO: %@", [error localizedDescription]);
}
}
Thank you for your help!
EDIT
If I resize the photos to much smaller dimensions before saving, like 480x640, I have no problem saving quickly. However, I'm wanting to capture and save full-size images. The native Camera app seems to handle it fine.
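A rough back-of-the-envelope for why full-size frames hurt: once decompressed for drawing or saving, an image costs width x height x 4 bytes (RGBA). Assuming a 5-megapixel 2592x1936 camera frame (the exact sensor size varies by device), that is about 20 MB per shot, versus roughly 1.2 MB for 480x640:

```c
/* Size in bytes of a decompressed 32-bit RGBA bitmap (4 bytes per pixel). */
long long bitmap_bytes(int width, int height) {
    return (long long)width * height * 4;
}
/* 2592 x 1936 -> 20,072,448 bytes (~20 MB)
   480  x 640  ->  1,228,800 bytes (~1.2 MB) */
```

A handful of in-flight full-size frames is enough to pressure an older iPhone's memory, which fits the symptoms described above.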
You can do it like this: enable taking the next photo only after the previous photo has been saved.
Also introduce an autorelease pool for memory management.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
[pool drain];
}
- (void)image:(UIImage*)image didFinishSavingWithError:(NSError *)error contextInfo:(NSDictionary*)info {
if (error)
{
NSLog(@"** ERROR SAVING PHOTO: %@", [error localizedDescription]);
}
self.readyToTake = YES;
}
The first thing I'd try is offloading those calls asynchronously to a dispatch queue:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
self.readyToTake = YES;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
});
}
If you're still having trouble, I'd look into AVCaptureSession and friends. This gives you a lot more control and generally better performance. There's a great sample project using this method at the developer portal: SquareCam
EDIT
You could also try using the AssetsLibrary framework instead of UIImageWriteToSavedPhotosAlbum (this is what we use in all our apps). Something like:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
self.readyToTake = YES;
// you probably want to create this once and keep it around as a property
// on your controller
ALAssetsLibrary* library = [[ALAssetsLibrary alloc] init];
UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
[library writeImageToSavedPhotosAlbum:image.CGImage orientation:(ALAssetOrientation)image.imageOrientation completionBlock:^(NSURL *assetURL, NSError *error) {
// do whatever you need to do when the image is saved
}];
}
There is a good example app on developer.apple.com that shows a solution to your problem: it saves photos automatically at one-second intervals when the user taps the "timed" button. It should surely help you; with it I was able to store 5 consecutive photos at 1 photo/second.
See MyViewController.m; after collecting the images in an array, you can save them to the photo album using a function on another queue.
Can anybody tell me how a UIImagePickerController can crop a picture, and how to save the cropped picture in the iPhone's Documents directory, so that the image's path can be stored in a database and reused?
Try this:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{
UIImage *img = [info objectForKey:UIImagePickerControllerEditedImage];
// UIImage's size and scale are read-only, so redraw at 75 percent of the original size instead:
CGSize newSize = CGSizeMake(img.size.width * 0.75, img.size.height * 0.75);
UIGraphicsBeginImageContext(newSize);
[img drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
UIImageWriteToSavedPhotosAlbum(scaled, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}
-(void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo{
if(!error){
NSLog(@"Photo saved to library!");
} else{
NSLog(@"Saving failed: %@", error);
}
}
I have written an application in which I want to convert a .jpg image to .bmp format, but it fails with the warning:
'class_name' may not respond to 'method_name'
My code is the following:
#import "CalculateRGBViewController.h"
@implementation CalculateRGBViewController
@synthesize skinImage;
@synthesize lblRedColor,btn,img;
struct pixel {
unsigned char r, g, b,a;
};
-(IBAction)btnClick:(id) sender{
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
[self presentModalViewController:picker animated:YES];
}
-(void)imagePickerController:(UIImagePickerController *) picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
[picker dismissModalViewControllerAnimated:YES];
skinImage.image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
}
+(void)calculateRGB:(UIImage *)skinImage {
struct pixel *pixels = (struct pixel *) calloc(1, skinImage.size.width * skinImage.size.height * sizeof(struct pixel));
if (pixels != nil)
{
// Create a new bitmap
CGContextRef context = CGBitmapContextCreate(
(void*) pixels,
skinImage.size.width,
skinImage.size.height,
8,
skinImage.size.width * 4,
CGImageGetColorSpace(skinImage.CGImage),
kCGImageAlphaPremultipliedLast
);
NSLog(@"Pixel data one red (%i)", context);
}
}
-(IBAction)btnCalRGB:(id) sender
{
[self calculateRGB];
}
The compiler showed me the following warning for this code:
-(IBAction)btnCalRGB:(id) sender
{
[self calculateRGB];
}
Warning: CalculateRGBViewController may not respond to calculateRGB, in the btnCalRGB button method.
Edit01:
I have also implemented the following in my code, but it still shows the same warning:
[self calculateRGB:anUIImage];
-(IBAction)btnCalRGB:(id) sender;
When the button above is pressed, the program throws an exception:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[UIImage skinImage]: unrecognized selector sent to
You are calling the method wrong. You must call it like this:
[self calculateRGB:anUIImage];
Your method takes a UIImage as an argument, so you must pass one when you call the method. You should also add
+(void)calculateRGB:(UIImage *)skinImage;
to your .h file.
Use this
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingImage:(UIImage *)image editingInfo:(NSDictionary *)editingInfo {
instead of this
-(void)imagePickerController:(UIImagePickerController *) picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
and you can write
skinImage.image = image;
First of all, to get rid of the error:
+(void)calculateRGB:(UIImage *)skinImage
Should be
- (void)calculateRGB:(UIImage *)skinImage
The + is for class methods; you are calling this on an instance (which is why you need to use the -). So the error occurs because you try to call an instance method, but there is no instance method named calculateRGB:.
The method wants a UIImage to do its calculation on. This means you should call it (as EEE told you already):
[self calculateRGB:anImage];
anImage would be a UIImage instance you provide. This could be the skinImage that you're already using in the implementation, as shown in your code.
Apart from this, I would recommend not using a UIViewController subclass for things like calculating the RGB values of an image. You should probably use a category on UIImage for this.
When you take photos with the iPhone camera, they are automatically geo-tagged.
I have written the following code to take photos in my GPS Recorder app, but the photos are not being geo-tagged.
What am I doing wrong? Should I be using a completely different method, or is there some way I can modify my code to add the geo-tag and other data to the EXIF info?
Note that self.photoInfo is the info dictionary returned by this delegate method:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
Here's the code:
UIImage *oImage = [self.photoInfo objectForKey:UIImagePickerControllerOriginalImage];
CGSize size = oImage.size;
oImage = [CameraController imageWithImage:oImage
scaledToSize:CGSizeMake(size.width/2.56,size.height/2.56)];
NSDictionary *info = [[NSDictionary alloc] initWithObjectsAndKeys:
oImage, @"image", titleString, @"title", nil];
TrailTrackerAppDelegate *t = (TrailTrackerAppDelegate *)[[UIApplication sharedApplication] delegate];
[NSThread detachNewThreadSelector:@selector(saveImage:) toTarget:t withObject:info];
+ (UIImage*)imageWithImage:(UIImage*)image scaledToSize:(CGSize)newSize {
UIGraphicsBeginImageContext( newSize );
[image drawInRect:CGRectMake(0,0,newSize.width,newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return newImage;
}
Here is a photo taken with the Camera app, which has all the good EXIF data: http://www.flickr.com/photos/33766454@N02/3978868900/
And here is one taken with my code, which doesn't have the EXIF data:
http://www.flickr.com/photos/33766454@N02/3978860380/
UIImage does not carry any EXIF metadata, so you are not given an image with EXIF attached. If you generate a JPEG or PNG yourself, you have to add any EXIF you need in place...
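For reference, EXIF's GPS block (exposed by ImageIO as kCGImagePropertyGPSDictionary) stores latitude and longitude as unsigned values plus a hemisphere reference letter, so a signed decimal-degrees value has to be split before it can be written. A minimal sketch of that split in plain C (the helper names are made up for illustration):

```c
#include <math.h>

/* EXIF stores coordinates as an unsigned magnitude... */
double exif_gps_value(double decimal_degrees) {
    return fabs(decimal_degrees);
}

/* ...plus a reference letter: 'N'/'S' for latitude... */
char exif_lat_ref(double decimal_degrees) {
    return decimal_degrees >= 0 ? 'N' : 'S';
}

/* ...and 'E'/'W' for longitude. */
char exif_lon_ref(double decimal_degrees) {
    return decimal_degrees >= 0 ? 'E' : 'W';
}
```

You would put the resulting value and reference into the GPS dictionary and write the JPEG out with a CGImageDestination.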