I am using the ALAsset framework to access the files in the device's photo gallery.
So far I am able to access the thumbnail and display it.
I want to display the actual image in an image view but I am unable to figure out how to do this.
I tried using the URLs field in the ALAsset object but was unsuccessful.
Does anybody know how this can be done?
Here's some code where I was able to access the thumbnail and place it in a table cell:
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    static NSString *CellIdentifier = @"Cell";
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
    if (cell == nil) {
        cell = [[[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:CellIdentifier] autorelease];
    }
    // here 'asset' represents the ALAsset object
    asset = [assets objectAtIndex:indexPath.row];
    // accessing the thumbnail here
    [cell.imageView setImage:[UIImage imageWithCGImage:[asset thumbnail]]];
    [cell.textLabel setText:[NSString stringWithFormat:@"Photo %d", indexPath.row + 1]];
    return cell;
}
The API has changed the rules slightly and you don't get direct file system access to the photo library any more. Instead you get asset library URLs like this:
assets-library://asset/asset.JPG?id=1000000003&ext=JPG
You use the ALAssetsLibrary object to access the ALAsset object via the URL.
So, from the docs for ALAssetsLibrary, put this in a header (or your source):
typedef void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *asset);
typedef void (^ALAssetsLibraryAccessFailureBlock)(NSError *error);
which isn't strictly needed, but keeps things tidy.
And then in your source:
-(void)findLargeImage
{
    NSString *mediaurl = [self.node valueForKey:kVMMediaURL];

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        if (iref) {
            largeimage = [UIImage imageWithCGImage:iref];
            [largeimage retain];
        }
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"booya, can't get image - %@", [myerror localizedDescription]);
    };

    if (mediaurl && [mediaurl length] && ![[mediaurl pathExtension] isEqualToString:AUDIO_EXTENSION])
    {
        [largeimage release];
        NSURL *asseturl = [NSURL URLWithString:mediaurl];
        ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];
        [assetslibrary assetForURL:asseturl
                       resultBlock:resultblock
                      failureBlock:failureblock];
    }
}
A couple of things to note: this uses blocks, which were new to me before I started my iOS 4 porting, so you might like to look at
https://www.mikeash.com/pyblog/friday-qa-2008-12-26.html
and
https://developer.apple.com/library/content/documentation/Cocoa/Conceptual/Blocks/Articles/00_Introduction.html
They bend your head a little, but if you think of them as notification selectors or callbacks it kind of helps.
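For example, here is a minimal, hypothetical block used as a callback, just to illustrate the shape (names are illustrative, not from the code above):

void (^onDone)(NSString *result) = ^(NSString *result) {
    NSLog(@"finished with: %@", result);   // runs whenever the block is invoked
};
onDone(@"hello");   // invoking the block runs the body above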
Also, when findLargeImage returns, the resultblock won't have run yet because it's a callback, so largeImage won't be valid yet. largeImage needs to be an instance variable, not something scoped to the method.
I use this construct when calling the method, but you may find something more suitable for your use:
[node.view findLargeImage];
UIImage *thumb = node.view.largeImage;
if (thumb) { /* use the image */ }
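To make that work, largeImage has to live on the object itself. A minimal sketch of the declaration (the class name here is illustrative, not from the original code):

@interface NodeView : UIView
{
    UIImage *largeImage;   // set inside the result block, read after it fires
}
@property (nonatomic, readonly) UIImage *largeImage;
- (void)findLargeImage;
@end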
That's what I learned while trying to get this working, anyway.
iOS 5 update
The result block seems to fire a bit later on iOS 5 (and maybe on single-core devices), so I couldn't rely on the image being available directly after calling findLargeImage. So I changed it to call out to a delegate:
@protocol HiresImageDelegate <NSObject>
@optional
- (void)hiresImageAvailable:(UIImage *)aimage;
@end
and like so:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref) {
        UIImage *largeimage = [UIImage imageWithCGImage:iref];
        [delegate hiresImageAvailable:largeimage];
    }
};
Warren's answer worked well for me. One useful thing for some people is to include the image orientation and scale metadata at the same time. You do this in your result block like so:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    CGImageRef iref = [rep fullResolutionImage];
    if (iref)
    {
        UIImage *largeimage = [UIImage imageWithCGImage:iref scale:[rep scale] orientation:[rep orientation]];
        [delegate hiresImageAvailable:largeimage];
    }
};
The imageWithCGImage: call in that case has the scale and orientation applied when it creates the UIImage for you:
[UIImage imageWithCGImage:iref scale:[rep scale] orientation:[rep orientation]];
One trick to note: if you use [rep fullScreenImage] instead of [rep fullResolutionImage] on iOS 5, you get an image that is already rotated; it is, however, at the resolution of the iPhone screen, i.e. a lower resolution.
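For example, the result block above could use it like this (a sketch reusing the same delegate callback and names from the snippets above):

ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    // already rotated, but capped at the device screen resolution
    CGImageRef iref = [rep fullScreenImage];
    if (iref) {
        UIImage *screenImage = [UIImage imageWithCGImage:iref];
        [delegate hiresImageAvailable:screenImage];
    }
};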
Just to combine Warren's and oknox's answers into a shorter snippet:
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
[assetsLibrary assetForURL:self.selectedPhotos[i] resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    CGImageRef imageRef = [representation fullResolutionImage];
    if (imageRef) {
        UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
        imageView.image = [UIImage imageWithCGImage:imageRef scale:representation.scale orientation:representation.orientation];
        // ...
    }
} failureBlock:^(NSError *error) {
    // Handle failure.
}];
I personally like setting my failureBlock to nil.
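So the call above would simply become something like this (the same snippet, with the failure handling dropped):

[assetsLibrary assetForURL:self.selectedPhotos[i] resultBlock:^(ALAsset *asset) {
    // ... same as above ...
} failureBlock:nil];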
NSURL *aURL = [NSURL URLWithString:@"URL here"];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:aURL resultBlock:^(ALAsset *asset)
{
    UIImage *copyOfOriginalImage = [UIImage imageWithCGImage:[[asset defaultRepresentation] fullScreenImage] scale:0.5 orientation:UIImageOrientationUp];
    cell.backgroundView = [[UIImageView alloc] initWithImage:copyOfOriginalImage];
}
failureBlock:^(NSError *error)
{
    // error handling
    NSLog(@"failure-----");
}];
Just provide the reference URL you got for the image in the photo library, as shown above, and it works fine. I displayed it in a UICollectionView cell. If you just want to display it in a UIImageView, change

cell.backgroundView = [[UIImageView alloc] initWithImage:copyOfOriginalImage];

to

imageView.image = copyOfOriginalImage;
Related
Hi, I've got a problem with displaying an image in my scroll view.
First I create a new UIImageView with an asset URL:
-(void)findLargeImage:(NSNumber *)arrayIndex
{
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep;
        if ([myasset defaultRepresentation] == nil) {
            return;
        } else {
            rep = [myasset defaultRepresentation];
        }
        CGImageRef iref = [rep fullResolutionImage];
        itemToAdd = [[UIImageView alloc] initWithFrame:CGRectMake([arrayIndex intValue]*320, 0, 320, 320)];
        itemToAdd.image = [UIImage imageWithCGImage:iref];
        [self.scrollView addSubview:itemToAdd];
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Can't get image - %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:[self.photoPath objectAtIndex:[arrayIndex intValue]]];
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:asseturl
                   resultBlock:resultblock
                  failureBlock:failureblock];
}
Where itemToAdd is a UIImageView defined in the interface:
__block UIImageView *itemToAdd;
And scrollView is defined as a property:
@property (nonatomic, strong) __block UIScrollView *scrollView;
Then in my viewWillAppear I do this:
- (void)viewWillAppear:(BOOL)animated {
    self.scrollView.delegate = self;
    [self findLargeImage:self.actualPhotoIndex];
    [self.view addSubview:self.scrollView];
}
But the image doesn't appear. Should I refresh self.view after adding the image to the scroll view, or do something else?
The ALAssetsLibrary blocks execute on a separate thread, so I suggest doing the UI-related work on the main thread.
To do this, use either dispatch_sync(dispatch_get_main_queue(), ...) or performSelectorOnMainThread:.
Some important notes:
Use ALAsset's aspectRatioThumbnail instead of fullResolutionImage for better performance.
Example:
CGImageRef iref = [myasset aspectRatioThumbnail];
itemToAdd.image = [UIImage imageWithCGImage:iref];
Example:
-(void)findLargeImage:(NSNumber *)arrayIndex
{
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        CGImageRef iref = [myasset aspectRatioThumbnail];
        dispatch_sync(dispatch_get_main_queue(), ^{
            itemToAdd = [[UIImageView alloc] initWithFrame:CGRectMake([arrayIndex intValue]*320, 0, 320, 320)];
            itemToAdd.image = [UIImage imageWithCGImage:iref];
            [self.scrollView addSubview:itemToAdd];
        }); // end block
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Can't get image - %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:[self.photoPath objectAtIndex:[arrayIndex intValue]]];
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:asseturl
                   resultBlock:resultblock
                  failureBlock:failureblock];
}
Also change the order of the calls in viewWillAppear:
- (void)viewWillAppear:(BOOL)animated {
    self.scrollView.delegate = self;
    [self.view addSubview:self.scrollView];
    [self findLargeImage:self.actualPhotoIndex];
}
You are manipulating the view from another thread.
You must use the main thread for manipulating the view.
Add image to scrollView using:
dispatch_async(dispatch_get_main_queue(), ^{
    [self.scrollView addSubview:itemToAdd];
});
or using:
[self.scrollView performSelectorOnMainThread:@selector(addSubview:) withObject:itemToAdd waitUntilDone:NO];
Please refer:
NSObject Class Reference
GCD
I've saved an image to storage using this:
UIImageWriteToSavedPhotosAlbum(image, self,
                               @selector(image:didFinishSavingWithError:contextInfo:), nil);
Now, how can I load that image into an image view?
Image views have an image property. Assign the image to it.
Edit: If you want to save a persistent reference to the entry in the photo album, don't use that function to save it, use writeImageToSavedPhotosAlbum:metadata:completionBlock: instead. The completion block will be passed an assetURL parameter.
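A rough sketch of saving that way and keeping the URL (this assumes `image` is the UIImage being saved, and reuses the photo.photoURL property from the code below; error handling is kept minimal):

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                             metadata:nil
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error == nil) {
        // persist the URL string so the asset can be loaded again later
        // with assetForURL:resultBlock:failureBlock:
        photo.photoURL = [assetURL absoluteString];
    }
}];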
This is the code I now use:
// Load the asset URL from my photo object
NSURL *asseturl = [NSURL URLWithString:photo.photoURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

// If the image was found
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    ALAssetRepresentation *rep = [myasset defaultRepresentation];
    // Get the full resolution image
    CGImageRef iref = [rep fullResolutionImage];
    if (iref) {
        // Get the orientation of the image
        ALAssetOrientation orientation = [[myasset valueForProperty:@"ALAssetPropertyOrientation"] intValue];
        // Set the image view image
        [self.BigImageView setImage:[UIImage imageWithCGImage:iref scale:1.0 orientation:orientation]];
    }
};

// If the image isn't available
ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
{
    NSLog(@"Can't get the image - %@", [myerror localizedDescription]);
    [self.BigImageView setImage:nil];
};

[library assetForURL:asseturl
         resultBlock:resultblock
        failureBlock:failureblock];
I am having a bit of trouble attaching images from the Photo Gallery to an email.
Basically, one of the features of my application allows the user to take photos. When they snap the shot, I record the URL reference to the image in Core Data. I understand that you have to go through the ALAssetRepresentation to get to the image. I have this up and running within my application for when the user wants to review an image that they have taken.
I am now attempting to allow the user to attach all of the photos taken for an event to an email. While doing this, I iterate through the Core Data entity that stores the URL references, call a method that returns a UIImage from the ALAssetsLibrary, and then attach it using the NSData/UIImageJPEGRepresentation and MFMailComposeViewController/addAttachmentData methods.
Problem is: When the email is presented to the user, there are small blue squares representing the images and the image is not attached.
Here is the code:
- (void)sendReportReport
{
    if ([MFMailComposeViewController canSendMail])
    {
        MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
        mailer.mailComposeDelegate = self;
        [mailer setSubject:@"Log: Report"];
        NSArray *toRecipients = [NSArray arrayWithObjects:@"someone@someco.com", nil];
        [mailer setToRecipients:toRecipients];

        NSError *error = nil;
        NSFetchRequest *fetchPhotos = [[NSFetchRequest alloc] init];
        NSEntityDescription *entity = [NSEntityDescription entityForName:@"Photo" inManagedObjectContext:__managedObjectContext];
        [fetchPhotos setEntity:entity];
        NSArray *fetchedPhotos = [__managedObjectContext executeFetchRequest:fetchPhotos error:&error];

        int counter = 0;
        for (NSManagedObject *managedObject in fetchedPhotos) {
            Photo *photo = (Photo *)managedObject;
            // UIImage *myImage = [UIImage imageNamed:[NSString stringWithFormat:@"%d.png", counter++]];
            NSData *imageData = UIImageJPEGRepresentation([self getImage:photo.referenceURL], 0.5);
            // NSData *imageData = UIImagePNGRepresentation([self getImage:photo.referenceURL]);
            // [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"%i", counter]];
            [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"a.jpg"]];
            counter++;
        }

        NSString *emailBody = [self getEmailBody];
        [mailer setMessageBody:emailBody isHTML:NO];
        [self presentModalViewController:mailer animated:YES];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Failure"
                                                        message:@"Your device doesn't support the composer sheet"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
    }
}
and the method that returns the UIImage:
#pragma mark - Get Photo from Asset Library
+ (ALAssetsLibrary *)defaultAssetsLibrary {
    static dispatch_once_t pred = 0;
    static ALAssetsLibrary *library = nil;
    dispatch_once(&pred, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}
- (UIImage *)getImage:(NSString *)URLReference
{
    __block UIImage *xPhoto = nil;

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        UIImage *xImage = nil;
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            xImage = [UIImage imageWithCGImage:iref];
        }
        xPhoto = xImage;
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
    };

    NSURL *asseturl = [NSURL URLWithString:URLReference];

    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];

    return xPhoto;
}
NOTE: The above code does run; it simply does not attach the image. Also note that I am able to successfully attach images from the gallery within my application, as long as I have already set them into a UIImageView's image property (basically, I take the pointer to the image from the UIImageView and pass it to the addAttachmentData method). It is only when I attempt to iterate through Core Data and attach without first setting the image into a UIImageView that I have trouble.
Any tips would be greatly appreciated!
Thanks!
Jason
Oh, now I see it, sorry. You are using an asynchronous block to get the image from the asset library, but right after starting that operation you return xPhoto. The asynchronous operation will finish later, so you are returning nil.
You need to change your architecture to something like this:
In your .h file you need two new members:
NSMutableArray* mArrayForImages;
NSInteger mUnfinishedRequests;
In your .m file, do something like this:
- (void)sendReportReport
{
    // get fetchedPhotos
    [...]

    // save image count
    mUnfinishedRequests = [fetchedPhotos count];

    // first step: load images
    for (NSManagedObject *managedObject in fetchedPhotos)
    {
        Photo *photo = (Photo *)managedObject;
        [self loadImage:photo.referenceURL];
    }
}
Change your getImage method to loadImage:
- (void)loadImage:(NSString *)URLReference
{
    NSURL *asseturl = [NSURL URLWithString:URLReference];

    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            // mArrayForImages must have been initialized (e.g. in init) before this runs
            [mArrayForImages addObject:[UIImage imageWithCGImage:iref]];
        } else {
            // handle error
        }
        [self performSelectorOnMainThread:@selector(imageRequestFinished) withObject:nil waitUntilDone:NO];
    };

    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
        [self performSelectorOnMainThread:@selector(imageRequestFinished) withObject:nil waitUntilDone:NO];
    };

    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
}
Create a new callback method:
- (void)imageRequestFinished
{
    mUnfinishedRequests--;
    if (mUnfinishedRequests <= 0)
    {
        [self sendMail];
    }
}
And an extra method for finally sending the mail, after getting the images:
- (void)sendMail
{
    // create mail composer etc.
    [...]

    // attach images
    for (UIImage *photo in mArrayForImages)
    {
        NSData *imageData = UIImageJPEGRepresentation(photo, 0.5);
        [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"a.jpg"]];
    }

    // send mail
    [...]
}
This is a part of my code. I'm using asyncImageView, and everything works well. But now I want to save all the images to a path on the iPhone. I know I have to use NSFileManager, but where?
EDIT: I tried with the code below, but nothing is saved when I run it on my iPhone.
// Configure the cell.
NSDictionary *dico = [self.pseudoOnline objectAtIndex:indexPath.row];
cell.pseudo.text = [dico objectForKey:@"pseudo"];
cell.sexe.text = [dico objectForKey:@"sexe"];
cell.age.text = [dico objectForKey:@"age"];
UIImage *image = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:[dico objectForKey:@"photo"]]]];
// NOTE: a Mac desktop path like this does not exist on the device; use the app's Documents directory instead
NSString *deskTopDir = @"/Users/***/Desktop/imagesOnline";
NSString *nomPhoto = [[cell.pseudo text] stringByReplacingOccurrencesOfString:@"\n" withString:@""];
NSLog(@"pseudo %@", nomPhoto);
NSString *jpegFilePath = [NSString stringWithFormat:@"%@/%@.jpeg", deskTopDir, nomPhoto];
NSData *data2 = [NSData dataWithData:UIImageJPEGRepresentation(image, 0.5f)]; // 0.5f = JPEG quality
[data2 writeToFile:jpegFilePath atomically:YES];
NSLog(@"image %@", jpegFilePath);
[image release];
CGRect frame;
frame.size.width = 45; frame.size.height = 43;
frame.origin.x = -5; frame.origin.y = 0;
asyncImageView *asyncImage = [[[asyncImageView alloc] initWithFrame:frame] autorelease];
asyncImage.tag = 999;
[asyncImage loadImageFromURL:[NSURL URLWithString:[dico objectForKey:@"photo"]]];
[cell.contentView addSubview:asyncImage];
return cell;
So now it works and I can download all the pictures. But now I want to load them.
I'm not sure what asyncImageView is, but it appears to get an image.
If it gets the image the way your code implies, you can put your NSFileManager method call right after:
[cell.contentView addSubview:asyncImage];
If, on the other hand, the asyncImageView fetches the image asynchronously (as its name implies), then you should put your NSFileManager method call in asyncImageView's callback delegate.
Basically, as soon as you actually have the image, you can save the image.
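For reference, a minimal sketch of that save, writing into the app's Documents directory once you have the UIImage in hand (the file name is illustrative):

// assumes `image` is the UIImage you just received
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDir = [paths objectAtIndex:0];
NSString *filePath = [documentsDir stringByAppendingPathComponent:@"photo.jpg"];
NSData *jpegData = UIImageJPEGRepresentation(image, 0.5f);
[jpegData writeToFile:filePath atomically:YES];
// load it back later with:
// UIImage *saved = [UIImage imageWithContentsOfFile:filePath];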
I'm trying to create thumbnails (288x288) of selected photos from iPad photo library. I have an array of ALAsset objects presented in a UITableView and as I select a row, a larger preview (288x288) of that image is displayed. In order to prevent main thread blocking, I'm trying to create the thumbnail on a background thread and also cache a copy of the thumbnail to the file system.
In a view controller when a tableview row is selected, I call loadPreviewImage in background:
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    // get the upload object from an array that contains an ALAsset object
    upload = [uploads objectAtIndex:[indexPath row]];
    [self performSelectorInBackground:@selector(loadPreviewImage:)
                           withObject:upload];
}
I pass a custom upload object that contains an asseturl property:
- (void)loadPreviewImage:(MyUploadClass *)upload
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    UIImage *preview = [upload previewImage];
    [self performSelectorOnMainThread:@selector(setPreviewImage:)
                           withObject:preview
                        waitUntilDone:YES];
    [pool release];
}
This is called on main thread to display the thumbnail after it's loaded:
- (void)setPreviewImage:(UIImage *)image
{
    self.imageViewPreview.image = image;
    [self layoutSubviews];
}
This is a method of MyUploadClass:
- (UIImage *)previewImage
{
    __block UIImage *previewImage = [[UIImage imageWithContentsOfFile:[self uploadPreviewFilePath]] retain];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            previewImage = [UIImage imageWithCGImage:[rep fullScreenImage]];
            previewImage = [[previewImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                               bounds:CGSizeMake(288, 288)
                                                 interpolationQuality:kCGInterpolationHigh] retain];
            NSData *previewData = UIImageJPEGRepresentation(previewImage, 1.0);
            [previewData writeToFile:[self uploadPreviewFilePath] atomically:YES];
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    return [previewImage autorelease];
}
The problem is that I always get nil previewImage the first time and only after the thumbnail is cached I get an image object. What am I doing wrong? Is there a better approach to this problem?
I didn't clearly understand how the resultBlock of ALAssetsLibrary operates; my mistake was to think that the execution is linear. It turns out that in my case the resultBlock executes on the main thread while the rest of the code in previewImage executes on a background thread. I was getting nil because previewImage returned before the resultBlock had a chance to finish executing. I solved the problem by replacing previewImage with the following method:
- (void)loadPreviewImage:(CGSize)size withTarget:(id)target andCallback:(SEL)callback
{
    NSString *path = [self uploadPreviewFilePath];
    UIImage *previewImage = [UIImage imageWithContentsOfFile:path];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            if (asset) {
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                UIImage *img = [UIImage imageWithCGImage:[rep fullScreenImage]];
                img = [img resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                bounds:size
                                  interpolationQuality:kCGInterpolationHigh];
                NSData *previewData = UIImageJPEGRepresentation(img, 1.0);
                [previewData writeToFile:path atomically:YES];
                [target performSelectorOnMainThread:callback
                                         withObject:img
                                      waitUntilDone:YES];
            }
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    else {
        [target performSelectorOnMainThread:callback withObject:previewImage waitUntilDone:YES];
    }
}