iOS: How to load an image from storage (iPhone)

I've saved an image to storage using this:
UIImageWriteToSavedPhotosAlbum(image, self,
    @selector(image:didFinishSavingWithError:contextInfo:), nil);
Now, how can I load that image into an image view?

Image views have an image property. Assign the image to it.
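For example, with an image view outlet (the name imageView here is just an assumption):
self.imageView.image = image;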
Edit: If you want to save a persistent reference to the entry in the photo album, don't use that function to save it; use ALAssetsLibrary's writeImageToSavedPhotosAlbum:metadata:completionBlock: instead. The completion block will be passed an assetURL parameter.
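A rough sketch of that approach (photo.photoURL is borrowed from the code below; this is an outline, not the exact implementation):
// Keep the library alive (e.g. in an ivar) until the completion block has run.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                             metadata:nil
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error == nil) {
        // Persist the URL string so the image can be looked up again with assetForURL:
        photo.photoURL = [assetURL absoluteString];
    }
}];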

This is the code I now use:
// Load the asset URL from my photo object
NSURL *asseturl = [NSURL URLWithString:photo.photoURL];
ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];

// If the image was found
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    // Get the full resolution image
    CGImageRef iref = [rep fullResolutionImage];
    if (iref) {
        // Get the orientation of the image
        ALAssetOrientation orientation = [[myasset valueForProperty:ALAssetPropertyOrientation] intValue];
        // Set the image view image
        [self.BigImageView setImage:[UIImage imageWithCGImage:iref
                                                        scale:1.0
                                                  orientation:(UIImageOrientation)orientation]];
    }
};

// If the image isn't available
ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
{
    NSLog(@"Can't get the image - %@", [myerror localizedDescription]);
    [self.BigImageView setImage:nil];
};

[assetslibrary assetForURL:asseturl
               resultBlock:resultblock
              failureBlock:failureblock];

Related

Placing UIImage jacks up memory

Simply loading a UIImage with this code makes memory jump by around 50 MB. The image itself is 2448 x 3264 pixels:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref)
    {
        UIImage *selectedBackground = [UIImage imageWithCGImage:iref
                                                          scale:[rep scale]
                                                    orientation:(UIImageOrientation)[rep orientation]];
        CGImageRelease(iref);
        [self setBackgroundWithImage:selectedBackground orColor:nil];
    }
};

ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
{
    NSLog(@"Can't get image - %@", [myerror localizedDescription]);
};

ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
[assetslibrary assetForURL:photoURL resultBlock:resultblock failureBlock:failureblock];
Why the huge jump in memory?

MFMailComposeViewController: Attaching Images from Photo Gallery

I am having a bit of trouble attaching images from the Photo Gallery to an email.
Basically, one of the features of my application allows the user to take photos. When they snap a shot, I record the URL reference to the image in Core Data. I understand that you have to go through ALAssetRepresentation to get to the image, and I have this up and running within my application for when the user wants to review an image they have taken.
I am now attempting to allow the user to attach all of the photos taken for an event to an email. To do this, I iterate through the Core Data entity that stores the URL references, call a method that returns a UIImage from the ALAssetsLibrary, and then attach it using UIImageJPEGRepresentation and MFMailComposeViewController's addAttachmentData:mimeType:fileName: method.
The problem is: when the email is presented to the user, there are small blue squares where the images should be, and the images are not attached.
Here is the code:
- (void)sendReportReport
{
    if ([MFMailComposeViewController canSendMail])
    {
        MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
        mailer.mailComposeDelegate = self;
        [mailer setSubject:@"Log: Report"];

        NSArray *toRecipients = [NSArray arrayWithObjects:@"someone@someco.com", nil];
        [mailer setToRecipients:toRecipients];

        NSError *error;
        NSFetchRequest *fetchPhotos = [[NSFetchRequest alloc] init];
        NSEntityDescription *entity = [NSEntityDescription entityForName:@"Photo"
                                                   inManagedObjectContext:__managedObjectContext];
        [fetchPhotos setEntity:entity];
        NSArray *fetchedPhotos = [__managedObjectContext executeFetchRequest:fetchPhotos error:&error];

        int counter = 0;
        for (NSManagedObject *managedObject in fetchedPhotos) {
            Photo *photo = (Photo *)managedObject;
            // UIImage *myImage = [UIImage imageNamed:[NSString stringWithFormat:@"%@.png", counter++]];
            NSData *imageData = UIImageJPEGRepresentation([self getImage:photo.referenceURL], 0.5);
            // NSData *imageData = UIImagePNGRepresentation([self getImage:photo.referenceURL]);
            // [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"%i", counter]];
            [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:@"a.jpg"];
            counter++;
        }

        NSString *emailBody = [self getEmailBody];
        [mailer setMessageBody:emailBody isHTML:NO];
        [self presentModalViewController:mailer animated:YES];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Failure"
                                                        message:@"Your device doesn't support the composer sheet"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
    }
}
and the method that returns the UIImage:
#pragma mark - Get Photo from Asset Library

+ (ALAssetsLibrary *)defaultAssetsLibrary {
    static dispatch_once_t pred = 0;
    static ALAssetsLibrary *library = nil;
    dispatch_once(&pred, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}

- (UIImage *)getImage:(NSString *)URLReference
{
    __block UIImage *xPhoto = nil;

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        UIImage *xImage;
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            xImage = [UIImage imageWithCGImage:iref];
        }
        xPhoto = xImage;
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
    };

    NSURL *asseturl = [NSURL URLWithString:URLReference];
    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];

    return xPhoto;
}
NOTE: The above code does run; it simply does not attach the images. Also note that I am able to successfully attach images from the gallery within my application, as long as I have already set them into a UIImageView's image (basically, I take the pointer to the image from the UIImageView and pass it to the addAttachmentData method). It is only when I try to iterate through Core Data and attach the images without first setting them into a UIImageView that I have trouble.
Any tips would be greatly appreciated!
Thanks!
Jason
Oh, now I see it, sorry. You are using an asynchronous block to get the image from the asset library, but right after starting that operation you return xPhoto. The asynchronous operation only finishes later, so you are returning nil.
You need to change your architecture to something like this:
In your .h file you need two new members:
NSMutableArray* mArrayForImages;
NSInteger mUnfinishedRequests;
In your .m file, do something like this:
- (void)sendReportReport
{
    // get fetchedPhotos
    [...]

    // save the image count and prepare the array
    mUnfinishedRequests = [fetchedPhotos count];
    mArrayForImages = [[NSMutableArray alloc] init];

    // first step: load the images
    for (NSManagedObject *managedObject in fetchedPhotos)
    {
        Photo *photo = (Photo *)managedObject;
        [self loadImage:photo.referenceURL];
    }
}
Change your getImage method to loadImage:
- (void)loadImage:(NSString *)URLReference
{
    NSURL *asseturl = [NSURL URLWithString:URLReference];

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            [mArrayForImages addObject:[UIImage imageWithCGImage:iref]];
        } else {
            // handle error
        }
        [self performSelectorOnMainThread:@selector(imageRequestFinished)
                               withObject:nil
                            waitUntilDone:NO];
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
        [self performSelectorOnMainThread:@selector(imageRequestFinished)
                               withObject:nil
                            waitUntilDone:NO];
    };

    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
}
Create a new callback method:
- (void)imageRequestFinished
{
    mUnfinishedRequests--;
    if (mUnfinishedRequests <= 0)
    {
        [self sendMail];
    }
}
And an extra method for finally sending the mail, after getting the images:
- (void)sendMail
{
    // create the mail composer etc.
    [...]

    // attach the images
    for (UIImage *photo in mArrayForImages)
    {
        NSData *imageData = UIImageJPEGRepresentation(photo, 0.5);
        [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:@"a.jpg"];
    }

    // send the mail
    [...]
}

iPhone image ALAsset problem

First of all, sorry for the formatting; I don't know how to format code here. I am having a problem retrieving an image from the photo library. I have pasted my code below; please help me. I have had this problem for many days.
Here, I am storing the image path in the database using the reference URL. It is in the following delegate method:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *imageUrl = [info valueForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        NSLog(@"asset");
        CGImageRef iref = [myasset thumbnail];
        if (iref)
        {
            NSLog(@"ifref");
            UIImage *thethumbnail = [UIImage imageWithCGImage:iref];
            NSLog(@"the thumbnail %@", thethumbnail);
            [[self photo] setImage:thethumbnail];
            lpdel.imageurl = UIImageJPEGRepresentation(thethumbnail, 1);
        }
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"failed");
        NSLog(@"cant get image -- %@", [myerror localizedDescription]);
    };

    if (imageUrl) {
        NSLog(@"if url");
        [assetLibrary assetForURL:imageUrl
                      resultBlock:resultblock
                     failureBlock:failureblock];
        //lpdel.imageurl = imageUrl;
        NSLog(@"the image string %@", lpdel.imageurl);
        NSLog(@"lpdel.image %@", lpdel.imageurl);
        [picker dismissModalViewControllerAnimated:YES];
        [imageUrl release];
    }
    [picker dismissModalViewControllerAnimated:YES];
    [picker release];
}
I am storing the image URL in the database. Then, in another view controller, I am trying to retrieve the image using the same URL with the code below:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    NSLog(@"asset");
    CGImageRef iref = [myasset thumbnail];
    if (iref)
    {
        NSLog(@"ifref");
        thethumbnail = [UIImage imageWithCGImage:iref];
        NSLog(@"the thumbnail to upload %@", thethumbnail);
        [self uploadImage:UIImageJPEGRepresentation(thethumbnail, 1)];
        //[[self photo] setImage:thethumbnail];
    }
};

ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
{
    NSLog(@"failed");
    NSLog(@"cant get image -- %@", [myerror localizedDescription]);
};

if (image)
{
    NSLog(@"if url");
    ALAssetsLibrary *assetLibrary = [[[ALAssetsLibrary alloc] init] autorelease];
    NSLog(@"url");
    [assetLibrary assetForURL:image
                  resultBlock:resultblock
                 failureBlock:failureblock];
}
NSLog(@"the thumbnail %@", thethumbnail);
The application crashes at this point without any error message. Can anyone please help me with this?
I found the solution: it was the last NSLog that caused the problem. But I need to know why. The variable thethumbnail is declared with class scope; does this have anything to do with blocks?
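This is the same asynchrony issue discussed above: assetForURL:resultBlock:failureBlock: returns immediately and the result block runs later, so any code placed straight after the call still sees the old (nil) value of thethumbnail. A minimal sketch of the ordering (the numbered log messages are only illustrative):
NSLog(@"1: before assetForURL:");
[assetLibrary assetForURL:imageUrl
              resultBlock:^(ALAsset *myasset) {
                  NSLog(@"3: the result block runs last, after the method has moved on");
              }
             failureBlock:^(NSError *myerror) {
                  NSLog(@"error: %@", myerror);
              }];
NSLog(@"2: thethumbnail has not been set yet at this point");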

Saving a thumbnail on a background thread

I'm trying to create thumbnails (288x288) of selected photos from the iPad photo library. I have an array of ALAsset objects presented in a UITableView, and as I select a row, a larger preview (288x288) of that image is displayed. To prevent blocking the main thread, I'm trying to create the thumbnail on a background thread and also cache a copy of the thumbnail to the file system.
In a view controller, when a table view row is selected, I call loadPreviewImage in the background:
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    // get the upload object from an array that contains an ALAsset object
    upload = [uploads objectAtIndex:[indexPath row]];
    [self performSelectorInBackground:@selector(loadPreviewImage:)
                           withObject:upload];
}
I pass a custom upload object that contains an asseturl property:
- (void)loadPreviewImage:(MyUploadClass *)upload
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    UIImage *preview = [upload previewImage];
    [self performSelectorOnMainThread:@selector(setPreviewImage:)
                           withObject:preview
                        waitUntilDone:YES];
    [pool release];
}
This is called on main thread to display the thumbnail after it's loaded:
- (void)setPreviewImage:(UIImage *)image
{
    self.imageViewPreview.image = image;
    [self layoutSubviews];
}
This is a method of MyUploadClass:
- (UIImage *)previewImage
{
    __block UIImage *previewImage = [[UIImage imageWithContentsOfFile:
                                         [self uploadPreviewFilePath]] retain];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            previewImage = [UIImage imageWithCGImage:[rep fullScreenImage]];
            previewImage = [[previewImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                               bounds:CGSizeMake(288, 288)
                                                 interpolationQuality:kCGInterpolationHigh] retain];
            NSData *previewData = UIImageJPEGRepresentation(previewImage, 1.0);
            [previewData writeToFile:[self uploadPreviewFilePath] atomically:YES];
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    return [previewImage autorelease];
}
The problem is that I always get a nil previewImage the first time, and only after the thumbnail has been cached do I get an image object. What am I doing wrong? Is there a better approach to this problem?
I didn't clearly understand how the resultBlock of ALAssetsLibrary operates; my mistake was to think that execution is linear. It turns out that in my case the resultBlock executes on the main thread, while the rest of the code in previewImage executes on a background thread. I was getting nil because previewImage returned before the resultBlock had a chance to finish executing. I solved the problem by replacing previewImage with the following method:
- (void)loadPreviewImage:(CGSize)size withTarget:(id)target andCallback:(SEL)callback
{
    NSString *path = [self uploadPreviewFilePath];
    UIImage *previewImage = [UIImage imageWithContentsOfFile:path];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            if (asset) {
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                UIImage *img = [UIImage imageWithCGImage:[rep fullScreenImage]];
                img = [img resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                bounds:size
                                  interpolationQuality:kCGInterpolationHigh];
                NSData *previewData = UIImageJPEGRepresentation(img, 1.0);
                [previewData writeToFile:path atomically:YES];
                [target performSelectorOnMainThread:callback
                                         withObject:img
                                      waitUntilDone:YES];
            }
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    else {
        [target performSelectorOnMainThread:callback
                                 withObject:previewImage
                              waitUntilDone:YES];
    }
}
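For reference, a possible call site (the 288x288 size and the setPreviewImage: callback are taken from the earlier snippets; the exact wiring is an assumption):
[upload loadPreviewImage:CGSizeMake(288, 288)
              withTarget:self
             andCallback:@selector(setPreviewImage:)];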

Display image from URL retrieved from ALAsset on iPhone

I am using the ALAsset framework to access the files in the device's photo gallery.
So far I am able to access the thumbnail and display it.
I want to display the actual image in an image view, but I am unable to figure out how to do this.
I tried using the URLs field in the ALAsset object but was unsuccessful.
Does anybody know how this can be done?
Here's some code where I was able to access the thumbnail and place it in a table cell:
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    static NSString *CellIdentifier = @"Cell";
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
    if (cell == nil) {
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault
                                       reuseIdentifier:CellIdentifier] autorelease];
    }
    // here 'asset' represents the ALAsset object
    asset = [assets objectAtIndex:indexPath.row];
    // I am accessing the thumbnail here
    [cell.imageView setImage:[UIImage imageWithCGImage:[asset thumbnail]]];
    [cell.textLabel setText:[NSString stringWithFormat:@"Photo %d", indexPath.row + 1]];
    return cell;
}
The API has changed the rules slightly and you don't get direct file-system access to the photo library any more. Instead you get asset-library URLs like this:
assets-library://asset/asset.JPG?id=1000000003&ext=JPG
You use an ALAssetsLibrary object to access the ALAsset object via the URL.
So, from the docs for ALAssetsLibrary, put this in a header (or your source file):
typedef void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *asset);
typedef void (^ALAssetsLibraryAccessFailureBlock)(NSError *error);
This isn't strictly needed, but it keeps things tidy.
Then, in your source:
- (void)findLargeImage
{
    NSString *mediaurl = [self.node valueForKey:kVMMediaURL];

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        if (iref) {
            largeimage = [UIImage imageWithCGImage:iref];
            [largeimage retain];
        }
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"booya, cant get image - %@", [myerror localizedDescription]);
    };

    if (mediaurl && [mediaurl length] && ![[mediaurl pathExtension] isEqualToString:AUDIO_EXTENSION])
    {
        [largeimage release];
        NSURL *asseturl = [NSURL URLWithString:mediaurl];
        ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];
        [assetslibrary assetForURL:asseturl
                       resultBlock:resultblock
                      failureBlock:failureblock];
    }
}
A couple of things to note: this uses blocks, which were new to me before I started my iOS 4 porting, so you might like to look at
https://www.mikeash.com/pyblog/friday-qa-2008-12-26.html
and
https://developer.apple.com/library/content/documentation/Cocoa/Conceptual/Blocks/Articles/00_Introduction.html
They bend your head a little, but if you think of them as notification selectors or callbacks it helps.
Also, when findLargeImage returns, the resultblock won't have run yet, as it's a callback, so largeimage won't be valid yet. largeimage needs to be an instance variable, not scoped to the method.
I use this construct when calling the method, but you may find something more suitable for your use:
[node.view findLargeImage];
UIImage *thumb = node.view.largeImage;
if (thumb) { /* blah blah */ }
That's what I learned while trying to get this working, anyway.
iOS 5 update
When the result block fires seems to be a bit slower on iOS 5, and maybe on single-core devices, so I couldn't rely on the image being available directly after calling findLargeImage. So I changed it to call out to a delegate:
@protocol HiresImageDelegate <NSObject>
@optional
- (void)hiresImageAvailable:(UIImage *)aimage;
@end
and then, like this:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref) {
        UIImage *largeimage = [UIImage imageWithCGImage:iref];
        [delegate hiresImageAvailable:largeimage];
    }
};
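On the receiving side, a sketch of what adopting the delegate might look like (the controller class and image view names are made up for illustration):
@interface PhotoDetailViewController : UIViewController <HiresImageDelegate>
{
    UIImageView *largeImageView;
}
@end

@implementation PhotoDetailViewController

- (void)hiresImageAvailable:(UIImage *)aimage
{
    // Called once the asynchronous asset lookup has delivered the full-resolution image.
    largeImageView.image = aimage;
}

@end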
Warren's answer worked well for me. One useful thing for some people is to include the image orientation and scale metadata at the same time. You do this in your result block like so:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref)
    {
        UIImage *largeimage = [UIImage imageWithCGImage:iref
                                                  scale:[rep scale]
                                            orientation:(UIImageOrientation)[rep orientation]];
        [delegate hiresImageAvailable:largeimage];
    }
};
The imageWithCGImage:scale:orientation: call in that case applies the scale and orientation when it creates the UIImage for you:
[UIImage imageWithCGImage:iref scale:[rep scale] orientation:(UIImageOrientation)[rep orientation]];
One trick to note: if you use [rep fullScreenImage] instead of [rep fullResolutionImage] on iOS 5, you get an image that is already rotated; it is, however, at the resolution of the iPhone screen, i.e. a lower resolution.
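A quick sketch of that variant (no scale or orientation arguments are needed because fullScreenImage is already rotated):
ALAssetRepresentation *rep = [myasset defaultRepresentation];
CGImageRef iref = [rep fullScreenImage];
if (iref) {
    UIImage *screenSizedImage = [UIImage imageWithCGImage:iref]; // screen resolution, correct orientation
}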
Just to combine Warren's and oknox's answers into a shorter snippet:
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
[assetsLibrary assetForURL:self.selectedPhotos[i] resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    CGImageRef imageRef = [representation fullResolutionImage];
    if (imageRef) {
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
        imageView.image = [UIImage imageWithCGImage:imageRef
                                              scale:representation.scale
                                        orientation:(UIImageOrientation)representation.orientation];
        // ...
    }
} failureBlock:^(NSError *error) {
    // Handle failure.
}];
I personally like setting my failureBlock to nil.
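That is, reusing a named result block from the snippet above (the name resultblock is just for illustration), the call simply becomes:
[assetsLibrary assetForURL:self.selectedPhotos[i]
               resultBlock:resultblock
              failureBlock:nil];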
NSURL* aURL = [NSURL URLWithString:#"URL here"];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:aURL resultBlock:^(ALAsset *asset)
{
UIImage *copyOfOriginalImage = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage] scale:0.5 orientation:UIImageOrientationUp];
cell.backgroundView = [[UIImageView alloc] initWithImage:copyOfOriginalImage];
}
failureBlock:^(NSError *error)
{
// error handling
NSLog(#"failure-----");
}];
Just provide the reference URL you got for the image in the photo library, as above; it works fine. I displayed it in a UICollectionView cell. If you just want to display it in a UIImageView instead, change
cell.backgroundView = [[UIImageView alloc] initWithImage:copyOfOriginalImage];
to
imageView.image = copyOfOriginalImage;