Saving a thumbnail on a background thread - iPhone

I'm trying to create thumbnails (288x288) of selected photos from iPad photo library. I have an array of ALAsset objects presented in a UITableView and as I select a row, a larger preview (288x288) of that image is displayed. In order to prevent main thread blocking, I'm trying to create the thumbnail on a background thread and also cache a copy of the thumbnail to the file system.
In a view controller, when a table view row is selected, I call loadPreviewImage in the background:
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    // get the upload object from an array that contains a ALAsset object
    upload = [uploads objectAtIndex:[indexPath row]];
    [self performSelectorInBackground:@selector(loadPreviewImage:)
                           withObject:upload];
}
I pass a custom upload object that contains an asseturl property:
- (void)loadPreviewImage:(MyUploadClass *)upload
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    UIImage *preview = [upload previewImage];
    [self performSelectorOnMainThread:@selector(setPreviewImage:)
                           withObject:preview
                        waitUntilDone:YES];
    [pool release];
}
This is called on main thread to display the thumbnail after it's loaded:
- (void)setPreviewImage:(UIImage *)image
{
    self.imageViewPreview.image = image;
    [self layoutSubviews];
}
This is a method of MyUploadClass:
- (UIImage *)previewImage
{
    __block UIImage *previewImage =
        [[UIImage imageWithContentsOfFile:[self uploadPreviewFilePath]] retain];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            previewImage = [UIImage imageWithCGImage:[rep fullScreenImage]];
            previewImage = [[previewImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                               bounds:CGSizeMake(288, 288)
                                                 interpolationQuality:kCGInterpolationHigh] retain];
            NSData *previewData = UIImageJPEGRepresentation(previewImage, 1.0);
            [previewData writeToFile:[self uploadPreviewFilePath] atomically:YES];
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    return [previewImage autorelease];
}
The problem is that I always get a nil previewImage the first time; only after the thumbnail has been cached do I get an image object. What am I doing wrong? Is there a better approach to this problem?

I didn't clearly understand how the resultBlock of ALAssetsLibrary operates; my mistake was to assume that execution is linear. It turns out that in my case the resultBlock executes on the main thread while the rest of the code in previewImage executes on a background thread. I was getting nil because previewImage returned before resultBlock had a chance to finish executing. I solved the problem by replacing previewImage with the following method:
- (void)loadPreviewImage:(CGSize)size withTarget:(id)target andCallback:(SEL)callback
{
    NSString *path = [self uploadPreviewFilePath];
    UIImage *previewImage = [UIImage imageWithContentsOfFile:path];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            if (asset) {
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                UIImage *img = [UIImage imageWithCGImage:[rep fullScreenImage]];
                img = [img resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                bounds:size
                                  interpolationQuality:kCGInterpolationHigh];
                NSData *previewData = UIImageJPEGRepresentation(img, 1.0);
                [previewData writeToFile:path atomically:YES];
                [target performSelectorOnMainThread:callback
                                         withObject:img
                                      waitUntilDone:YES];
            }
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    else {
        // pass the cached image loaded above ("img" is not in scope here)
        [target performSelectorOnMainThread:callback withObject:previewImage waitUntilDone:YES];
    }
}
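For reference, this is roughly how the view controller side can drive the new method. The table view callback, the 288x288 size and setPreviewImage: are taken from the code above; the rest is a sketch rather than the exact implementation:
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    upload = [uploads objectAtIndex:[indexPath row]];
    // The asset lookup itself is asynchronous, so this can be called from the main thread;
    // only the cache-file read happens synchronously here (push it to a background queue if that matters).
    [upload loadPreviewImage:CGSizeMake(288, 288)
                  withTarget:self
                 andCallback:@selector(setPreviewImage:)];
}

// Invoked on the main thread with either the cached or the freshly generated thumbnail.
- (void)setPreviewImage:(UIImage *)image
{
    self.imageViewPreview.image = image;
}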

Related

ALAsset Photo Library Image Performance Improvements When Its Loading Slow

Hi, I've got a problem displaying an image in my scroll view.
First I create a new UIImageView with an asset URL:
-(void)findLargeImage:(NSNumber *)arrayIndex
{
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep;
        if ([myasset defaultRepresentation] == nil) {
            return;
        } else {
            rep = [myasset defaultRepresentation];
        }
        CGImageRef iref = [rep fullResolutionImage];
        itemToAdd = [[UIImageView alloc] initWithFrame:CGRectMake([arrayIndex intValue] * 320, 0, 320, 320)];
        itemToAdd.image = [UIImage imageWithCGImage:iref];
        [self.scrollView addSubview:itemToAdd];
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Cant get image - %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:[self.photoPath objectAtIndex:[arrayIndex intValue]]];
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:asseturl
                   resultBlock:resultblock
                  failureBlock:failureblock];
}
where itemToAdd is a UIImageView defined in the interface:
__block UIImageView *itemToAdd;
and scrollView is defined as a property:
@property (nonatomic, strong) __block UIScrollView *scrollView;
Then in my viewWillAppear I do this:
- (void)viewWillAppear:(BOOL)animated {
    self.scrollView.delegate = self;
    [self findLargeImage:self.actualPhotoIndex];
    [self.view addSubview:self.scrollView];
}
But the image doesn't appear. Should I refresh self.view after adding the image to the scroll view, or should I do something else?
The ALAssetsLibrary block can execute on a separate thread, so do the UI-related work on the main thread.
To do this, either use dispatch_sync(dispatch_get_main_queue(), ...) or performSelectorOnMainThread:.
Some important notes:
Use ALAsset's aspectRatioThumbnail instead of fullResolutionImage for better performance.
Example:
CGImageRef iref = [myasset aspectRatioThumbnail];
itemToAdd.image = [UIImage imageWithCGImage:iref];
Example:
-(void)findLargeImage:(NSNumber *)arrayIndex
{
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        CGImageRef iref = [myasset aspectRatioThumbnail];
        dispatch_sync(dispatch_get_main_queue(), ^{
            itemToAdd = [[UIImageView alloc] initWithFrame:CGRectMake([arrayIndex intValue] * 320, 0, 320, 320)];
            itemToAdd.image = [UIImage imageWithCGImage:iref];
            [self.scrollView addSubview:itemToAdd];
        }); // end block
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Cant get image - %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:[self.photoPath objectAtIndex:[arrayIndex intValue]]];
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:asseturl
                   resultBlock:resultblock
                  failureBlock:failureblock];
}
Also change the order of operations in viewWillAppear:
- (void)viewWillAppear:(BOOL)animated {
    self.scrollView.delegate = self;
    [self.view addSubview:self.scrollView];
    [self findLargeImage:self.actualPhotoIndex];
}
You are manipulating the view from another thread. You must use the main thread to manipulate the view.
Add the image to the scroll view using:
dispatch_async(dispatch_get_main_queue(), ^{
    [self.scrollView addSubview:itemToAdd];
});
or using:
[self.scrollView performSelectorOnMainThread:@selector(addSubview:) withObject:itemToAdd waitUntilDone:NO];
Please refer:
NSObject Class Reference
GCD

Is it possible to share an image via UIActivityViewController and retain exif data?

I have an app which saves images to a custom album within the camera roll via
[library writeImageToSavedPhotosAlbum:[newTestImage CGImage] metadata:metadata
                      completionBlock:^(NSURL *assetURL, NSError *error) {
                          //error
                      }];
This works fine. However, I would then like to allow the user to share these images from within the app. I was doing it via:
ALAsset *assetToShare = self.images[indexPath.row];
NSString *stringToShare = @"…..";
NSArray *dataToShare = @[assetToShare, stringToShare];
UIActivityViewController *activityVC = [[UIActivityViewController alloc]
    initWithActivityItems:dataToShare applicationActivities:nil];
[self presentViewController:activityVC animated:YES completion:^{
    //Some Expression
}];
However, in doing so the image is stripped of all EXIF data. Is there a way I can implement the same functionality but retain the metadata? Thanks very much for any assistance.
Edit: code used to gather the ALAssets:
if ([[group valueForProperty:ALAssetsGroupPropertyName] isEqualToString:@"my pics"]) {
    [self.assetGroups addObject:group];
    _displayArray = [self.assetGroups objectAtIndex:0];
    ... }];
Then:
[_displayArray enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if (result) {
        [self.images addObject:result];
    }
}];
Have you tried something like this? Just passing the actual image.
ALAsset *assetToShare = self.images[indexPath.row];
ALAssetRepresentation *rep = [assetToShare defaultRepresentation];
UIImage *imageToShare = [UIImage imageWithCGImage:[rep fullResolutionImage]];
NSString *stringToShare = @"…..";
NSArray *dataToShare = @[imageToShare, stringToShare];
UIActivityViewController *activityVC = [[UIActivityViewController alloc]
    initWithActivityItems:dataToShare applicationActivities:nil];
[self presentViewController:activityVC animated:YES completion:^{
    //Some Expression
}];
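Note that passing a UIImage re-encodes the pixels, so the EXIF is still dropped. If keeping the metadata is the goal, one option, sketched here and not tested against this exact code, is to hand the activity controller the original file bytes from the ALAssetRepresentation, which still contain the EXIF block (whether a given activity then preserves it is up to that activity):
ALAsset *assetToShare = self.images[indexPath.row];
ALAssetRepresentation *rep = [assetToShare defaultRepresentation];

// Copy the original, already-encoded file bytes; these still contain the EXIF data.
Byte *buffer = (Byte *)malloc((size_t)rep.size);
NSError *error = nil;
NSUInteger length = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:&error];
NSData *originalData = [NSData dataWithBytesNoCopy:buffer length:length freeWhenDone:YES];

NSString *stringToShare = @"…..";
NSArray *dataToShare = @[originalData, stringToShare];
UIActivityViewController *activityVC = [[UIActivityViewController alloc]
    initWithActivityItems:dataToShare applicationActivities:nil];
[self presentViewController:activityVC animated:YES completion:nil];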

Raw image data from camera

I've been searching this forum up and down but I couldn't find what I really need. I want to get raw image data from the camera. Up till now I tried to get the data out of the imageDataSampleBuffer from that method captureStillImageAsynchronouslyFromConnection:completionHandler: and to write it to an NSData object, but that didn't work.
Maybe I'm on the wrong track or maybe I'm just doing it wrong.
What I don't want is for the image to be compressed in any way.
The easy way is to use jpegStillImageNSDataRepresentation: from AVCaptureStillImageOutput, but like I said I don't want it to be compressed.
Thanks!
This is how I do it.
First, I open the camera using:
- (void)openCamera
{
    imagePicker = [[UIImagePickerController alloc] init];
    imagePicker.delegate = self;
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        imagePicker.mediaTypes = [NSArray arrayWithObjects:
                                  (NSString *)kUTTypeImage,
                                  (NSString *)kUTTypeMovie, nil];
        imagePicker.allowsEditing = NO;
        [self presentModalViewController:imagePicker animated:YES];
    }
    else {
        lblError.text = NSLocalizedStringFromTable(@"noCameraFound", @"Errors", @"");
    }
}
When the picture is taken this method gets called:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Save and get the path of the image
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage])
    {
        // Save the image
        image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
        [library writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error) {
            if (!error) {
                // Save path and location to database
                NSString *pathLocation = [[NSString alloc] initWithFormat:@"%@", assetURL];
            } else {
                NSLog(@"CameraViewController: Error on saving image : %@ {imagePickerController}", error);
            }
        }];
    }
    [imagePicker dismissModalViewControllerAnimated:YES];
}
Then with that path I get the picture from the library at full resolution (note the compression quality of 1 passed to UIImageJPEGRepresentation):
- (void)preparePicture:(NSString *)filePathPicture {
    ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *myasset)
    {
        if (myasset != nil) {
            ALAssetRepresentation *assetRep = [myasset defaultRepresentation];
            CGImageRef imageRef = [assetRep fullResolutionImage];
            if (imageRef) {
                NSData *imageData = UIImageJPEGRepresentation([UIImage imageWithCGImage:imageRef], 1);
            }
        } else {
            //error
        }
    };
    ALAssetsLibraryAccessFailureBlock failureBlock = ^(NSError *error)
    {
        NSString *errorString = [NSString stringWithFormat:@"can't get image, %@", [error localizedDescription]];
        NSLog(@"%@", errorString);
    };
    if (filePathPicture && [filePathPicture length])
    {
        NSURL *assetUrl = [NSURL URLWithString:filePathPicture];
        ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
        [assetslibrary assetForURL:assetUrl
                       resultBlock:resultBlock
                      failureBlock:failureBlock];
    }
}
Hope this helps you a bit further :-).
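Note that going through UIImageJPEGRepresentation still compresses the image, which is what the question wanted to avoid. If truly uncompressed data straight from the capture is required, one approach, sketched here under the assumption of an already-configured AVCaptureSession with a still image output (stillImageOutput is a hypothetical name, not from this thread), is to ask AVCaptureStillImageOutput for BGRA pixel buffers instead of JPEG and copy the raw bytes out of the sample buffer:
#import <AVFoundation/AVFoundation.h>

// Ask for uncompressed 32BGRA output instead of JPEG.
stillImageOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                         @(kCVPixelFormatType_32BGRA) };

AVCaptureConnection *connection =
    [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];

[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (!imageDataSampleBuffer) {
            return;
        }
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        void   *base   = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t  length = CVPixelBufferGetDataSize(pixelBuffer);
        // Raw, uncompressed BGRA pixel data.
        NSData *rawPixelData = [NSData dataWithBytes:base length:length];
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        // ... use rawPixelData (keep width/height/bytesPerRow if you need to reinterpret it later)
    }];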

MFMailComposeViewController: Attaching Images from Photo Gallery

I am having a bit of trouble attaching images from the Photo Gallery to an email.
Basically, one of the features of my application allows the user to take photos. When they snap the shot, I record the URL reference to the image in Core Data. I understand that you have to go through the ALAssetRepresentation to get to the image. I have this up and running within my application for when the user wants to review an image that they have taken.
I am now attempting to allow the user to attach all of the photos taken for an event to an email. To do this, I iterate through the Core Data entity that stores the URL references, call a method that returns a UIImage from the ALAssetsLibrary, and then attach it using NSData/UIImageJPEGRepresentation and MFMailComposeViewController's addAttachmentData method.
The problem is: when the email is presented to the user, there are small blue squares where the images should be, and the images are not attached.
Here is the code:
- (void)sendReportReport
{
    if ([MFMailComposeViewController canSendMail])
    {
        MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
        mailer.mailComposeDelegate = self;
        [mailer setSubject:@"Log: Report"];
        NSArray *toRecipients = [NSArray arrayWithObjects:@"someone@someco.com", nil];
        [mailer setToRecipients:toRecipients];
        NSError *error;
        NSFetchRequest *fetchPhotos = [[NSFetchRequest alloc] init];
        NSEntityDescription *entity = [NSEntityDescription entityForName:@"Photo"
                                                   inManagedObjectContext:__managedObjectContext];
        [fetchPhotos setEntity:entity];
        NSArray *fetchedPhotos = [__managedObjectContext executeFetchRequest:fetchPhotos error:&error];
        int counter;
        for (NSManagedObject *managedObject in fetchedPhotos) {
            Photo *photo = (Photo *)managedObject;
            // UIImage *myImage = [UIImage imageNamed:[NSString stringWithFormat:@"%@.png", counter++]];
            NSData *imageData = UIImageJPEGRepresentation([self getImage:photo.referenceURL], 0.5);
            // NSData *imageData = UIImagePNGRepresentation([self getImage:photo.referenceURL]);
            // [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"%i", counter]];
            [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"a.jpg"]];
            counter++;
        }
        NSString *emailBody = [self getEmailBody];
        [mailer setMessageBody:emailBody isHTML:NO];
        [self presentModalViewController:mailer animated:YES];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Failure"
                                                        message:@"Your device doesn't support the composer sheet"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
    }
}
and the method that returns the UIImage:
#pragma mark - Get Photo from Asset Library
+ (ALAssetsLibrary *)defaultAssetsLibrary {
    static dispatch_once_t pred = 0;
    static ALAssetsLibrary *library = nil;
    dispatch_once(&pred, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}
- (UIImage *)getImage:(NSString *)URLReference
{
    __block UIImage *xPhoto = nil;
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        UIImage *xImage;
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            xImage = [UIImage imageWithCGImage:iref];
        }
        xPhoto = xImage;
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:URLReference];
    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
    return xPhoto;
}
NOTE: The above code does run; it simply does not attach the images. Also note that I am able to successfully attach images from the gallery within my application, as long as I have already set them into a UIImageView's image (basically, I take the pointer to the image from the UIImageView and pass it to the addAttachmentData method). It is only when I attempt to iterate through Core Data and attach without first setting the image into a UIImageView that I have trouble.
Any tips would be greatly appreciated!
Thanks!
Jason
Oh, now I see it, sorry. You are using an asynchronous block to get the image from the asset library, but right after starting that operation you return xPhoto. The asynchronous operation only finishes later, so you are returning nil.
You need to change your architecture to something like this:
In your .h file you need two new members:
NSMutableArray* mArrayForImages;
NSInteger mUnfinishedRequests;
In your .m file do something like this:
- (void)sendReportReport
{
    // save image count
    mUnfinishedRequests = [fetchedPhotos count];
    // get fetchedPhotos
    [...]
    // first step: load images
    for (NSManagedObject *managedObject in fetchedPhotos)
    {
        Photo *photo = (Photo *)managedObject; // cast as in the question's code
        [self loadImage:photo.referenceURL];
    }
}
Change your getImage method to loadImage:
- (void)loadImage:(NSString *)URLReference
{
    NSURL *asseturl = [NSURL URLWithString:URLReference];
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            [mArrayForImages addObject:[UIImage imageWithCGImage:iref]];
        } else {
            // handle error
        }
        // the full performSelectorOnMainThread:withObject:waitUntilDone: signature is required
        [self performSelectorOnMainThread:@selector(imageRequestFinished)
                               withObject:nil
                            waitUntilDone:NO];
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
        [self performSelectorOnMainThread:@selector(imageRequestFinished)
                               withObject:nil
                            waitUntilDone:NO];
    };
    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
}
Create a new callback method:
- (void)imageRequestFinished
{
    mUnfinishedRequests--;
    if (mUnfinishedRequests <= 0)
    {
        [self sendMail];
    }
}
And an extra method for finally sending the mail, after getting the images:
- (void)sendMail
{
    // create mail composer etc.
    [...]
    // attach images
    for (UIImage *photo in mArrayForImages)
    {
        NSData *imageData = UIImageJPEGRepresentation(photo, 0.5);
        [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"a.jpg"]];
    }
    // send mail
    [...]
}

display image from URL retrieved from ALAsset in iPhone

I am using the ALAsset framework to access the files in the device's photo gallery.
So far I am able to access the thumbnail and display it.
I want to display the actual image in an image view, but I am unable to figure out how to do this.
I tried using the URLs field in the ALAsset object but was unsuccessful.
Does anybody know how this can be done?
Here's some code where I was able to access the thumbnail and place it in a table cell:
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    static NSString *CellIdentifier = @"Cell";
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
    if (cell == nil) {
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:CellIdentifier] autorelease];
    }
    // here 'asset' represents the ALAsset object
    asset = [assets objectAtIndex:indexPath.row];
    // I am accessing the thumbnail here
    [cell.imageView setImage:[UIImage imageWithCGImage:[asset thumbnail]]];
    [cell.textLabel setText:[NSString stringWithFormat:@"Photo %d", indexPath.row + 1]];
    return cell;
}
The API has changed the rules slightly, and you don't get direct file-system access to the photo library any more. Instead you get asset library URLs like this:
assets-library://asset/asset.JPG?id=1000000003&ext=JPG
You use the ALAssetsLibrary object to access the ALAsset object via the URL.
So, from the docs for ALAssetsLibrary, put this in a header (or your source):
typedef void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *asset);
typedef void (^ALAssetsLibraryAccessFailureBlock)(NSError *error);
which isn't strictly needed but keeps things tidy.
Then, in your source:
-(void)findLargeImage
{
    NSString *mediaurl = [self.node valueForKey:kVMMediaURL];
    //
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        if (iref) {
            largeimage = [UIImage imageWithCGImage:iref];
            [largeimage retain];
        }
    };
    //
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"booya, cant get image - %@", [myerror localizedDescription]);
    };
    if (mediaurl && [mediaurl length] && ![[mediaurl pathExtension] isEqualToString:AUDIO_EXTENSION])
    {
        [largeimage release];
        NSURL *asseturl = [NSURL URLWithString:mediaurl];
        ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];
        [assetslibrary assetForURL:asseturl
                       resultBlock:resultblock
                      failureBlock:failureblock];
    }
}
A couple of things to note: this uses blocks, which were new to me before I started my iOS 4 porting, so you might like to look at
https://www.mikeash.com/pyblog/friday-qa-2008-12-26.html
and
https://developer.apple.com/library/content/documentation/Cocoa/Conceptual/Blocks/Articles/00_Introduction.html
They bend your head a little, but if you think of them as notification selectors or callbacks it kind of helps.
Also:
When findLargeImage returns, the resultblock won't have run yet, as it is a callback, so largeImage won't be valid yet.
largeImage needs to be an instance variable, not a variable scoped to the method.
I use this construct when calling the method, but you may find something more suitable for your use:
[node.view findLargeImage];
UIImage *thumb = node.view.largeImage;
if (thumb) { blah blah }
That's what I learned while trying to get this working, anyway.
iOS 5 update
The result block seems to fire a bit later with iOS 5, and maybe on single-core devices, so I couldn't rely on the image being available directly after calling findLargeImage. I changed it to call out to a delegate instead:
@protocol HiresImageDelegate <NSObject>
@optional
- (void)hiresImageAvailable:(UIImage *)aimage;
@end
and like so:
//
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref) {
        UIImage *largeimage = [UIImage imageWithCGImage:iref];
        [delegate hiresImageAvailable:largeimage];
    }
};
Warren's answer worked well for me. One useful thing for some people is to include the image orientation and scale metadata at the same time. You do this in your result block like so:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref)
    {
        UIImage *largeimage = [UIImage imageWithCGImage:iref scale:[rep scale] orientation:[rep orientation]];
        [delegate hiresImageAvailable:largeimage];
    }
};
The imageWithCGImage: call in that case applies the scale and orientation when it creates the UIImage for you:
[UIImage imageWithCGImage:iref scale:[rep scale] orientation:[rep orientation]];
One trick to note: if you use [rep fullScreenImage] instead of [rep fullResolutionImage] on iOS 5, you get an image that is already rotated; however, it is only at the resolution of the device screen, i.e. a lower resolution.
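To make that trade-off concrete, here is a minimal comparison using the same rep variable as above (the cast on the orientation is added here only to quiet the enum-mismatch warning; the ALAssetOrientation and UIImageOrientation values line up):
// Already rotated for display, but only at screen resolution.
UIImage *screenSized = [UIImage imageWithCGImage:[rep fullScreenImage]];

// Full pixel resolution, but you must supply scale and orientation yourself.
UIImage *fullRes = [UIImage imageWithCGImage:[rep fullResolutionImage]
                                       scale:[rep scale]
                                 orientation:(UIImageOrientation)[rep orientation]];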
Just to combine Warren's and oknox's answers into a shorter snippet:
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
[assetsLibrary assetForURL:self.selectedPhotos[i] resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    CGImageRef imageRef = [representation fullResolutionImage];
    if (imageRef) {
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
        imageView.image = [UIImage imageWithCGImage:imageRef scale:representation.scale orientation:representation.orientation];
        // ...
    }
} failureBlock:^(NSError *error) {
    // Handle failure.
}];
I personally like setting my failureBlock to nil.
NSURL *aURL = [NSURL URLWithString:@"URL here"];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:aURL resultBlock:^(ALAsset *asset)
{
    UIImage *copyOfOriginalImage = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage] scale:0.5 orientation:UIImageOrientationUp];
    cell.backgroundView = [[UIImageView alloc] initWithImage:copyOfOriginalImage];
}
failureBlock:^(NSError *error)
{
    // error handling
    NSLog(@"failure-----");
}];
Just provide the asset reference URL you got for the image in the photo library, as shown above; it works fine. I displayed it in a UICollectionView cell. If you just want to display it in a UIImageView instead, then:
Change
cell.backgroundView = [[UIImageView alloc] initWithImage:copyOfOriginalImage];
To
imageView.image = copyOfOriginalImage;