iPhone: How to get a UIImage from a URL?

How do you download an image and turn it into a UIImage?

NSURL *url = [NSURL URLWithString:@"http://example.com/image.jpg"];
NSData *data = [NSData dataWithContentsOfURL:url];
UIImage *img = [[[UIImage alloc] initWithData:data] autorelease];
However, this isn't asynchronous.

You should keep in mind that loading the image data with the sample code you've provided in your own answer will block the main thread until the download has completed. This is a usability no-no: the users of your app will think it is unresponsive. If you want to download an image, prefer NSURLConnection so you can download it asynchronously on its own thread.
Read the Apple documentation about async downloads with NSURLConnection.
When your download completes, you can then instantiate your UIImage from the data object:
- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    if (requestData)
    {
        self.image = [[[UIImage alloc] initWithData:requestData] autorelease];
    }
}
Here requestData and image are instance variables of your containing class, and image is a retained property. Be sure to release image in the class's dealloc method, e.g. using self.image = nil;.
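For reference, a minimal sketch of the surrounding delegate code might look like this (ARC assumed for brevity; requestData is the NSMutableData instance variable mentioned above, imageURL is the NSURL you want to fetch, and -startImageDownload is just a hypothetical helper):
- (void)startImageDownload
{
    // Kicks off the asynchronous download; self must implement the delegate methods below.
    NSURLRequest *request = [NSURLRequest requestWithURL:imageURL];
    [NSURLConnection connectionWithRequest:request delegate:self];
}

- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    // A response arrived (possibly after a redirect); start with an empty buffer.
    requestData = [[NSMutableData alloc] init];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    // Chunks arrive incrementally; append them as they come in.
    [requestData appendData:data];
}

- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error
{
    // Give up, or surface the error to the user.
    requestData = nil;
}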

It is true that asynchronous loading is a must, but you can also achieve it with a background call if you just need a simple solution.
[self performSelectorInBackground:@selector(loadImage) withObject:nil];
- (void)loadImage
{
    NSURL *url = [NSURL URLWithString:@"http://.../....jpg"];
    NSData *data = [NSData dataWithContentsOfURL:url];
    UIImage *image = [UIImage imageWithData:data];
    if (image)
    {
        // Success: use the image
        ...
    }
    else
    {
        // Failed (load an error image?)
        ...
    }
}
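One caveat: -loadImage runs on a background thread here, so any UI update in the success branch should be pushed back to the main thread. A minimal sketch, where -didLoadImage: is a hypothetical method you add to the same class and imageView is whatever image view you are filling:
// In the success branch of -loadImage:
[self performSelectorOnMainThread:@selector(didLoadImage:)
                       withObject:image
                    waitUntilDone:NO];

// Elsewhere in the class:
- (void)didLoadImage:(UIImage *)image
{
    // Safe to touch UIKit here; we are back on the main thread.
    self.imageView.image = image;
}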

You can load an image into a UIImageView from a URL without blocking the UI thread simply by using dispatch_async:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSURL *url = [NSURL URLWithString:myURL];
    NSData *data = [NSData dataWithContentsOfURL:url];
    UIImage *image = [[UIImage alloc] initWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        [myImageView setImage:image];
    });
});
The first dispatch_async loads the image from the URL in the background (without blocking the UI). The second dispatch_async injects the freshly loaded image into the UIImageView; it is dispatched to the main queue because the main thread is the one that must update the UI.

There is a really good example here using blocks: http://ios-blog.co.uk/iphone-development-tutorials/uiimage-from-url-%E2%80%93-simplified-using-blocks/

Based on @philfreo's answer I created an NSURL extension in Swift using BrightFutures (for the asynchronous callback):
extension NSURL {
    func downloadImage() -> Future<UIImage, NSError> {
        let promise = Promise<UIImage, NSError>()
        Queue.global.async {
            if let data = NSData(contentsOfURL: self) {
                if let image = UIImage(data: data) {
                    promise.success(image)
                    return
                }
            }
            promise.failure(NSError())
        }
        return promise.future
    }
}
This then lets me do:
import BrightFutures
let someURL = ...;
someURL.downloadImage().onSuccess { image in
    // Do something with the UIImage
}
Update
A simple synchronous extension could look like this:
extension NSURL {
    func downloadImage() -> UIImage? {
        if let data = NSData(contentsOfURL: self) {
            return UIImage(data: data)
        }
        return nil
    }
}

This is what @philfreo's answer looks like in Swift:
let urlString = "http://lorempixel.com/100/100/cats/"
let url = NSURL(string: urlString)!
let data = NSData(contentsOfURL: url)
let image = UIImage(data: data!)
This works in a playground.

Related

SignalR: sometimes no data returns when trying to load images

Once in a while, when loading an image like this:
dispatch_async(dispatch_get_global_queue(0, 0), ^
{
    NSData *data = [[NSData alloc] initWithContentsOfURL:someImgUrl.jpg];
    if (data == nil)
    {
        NSLog(@"data is nil with img url: %@", imgUrl);
        return;
    }
    dispatch_async(dispatch_get_main_queue(), ^
    {
        img.image = [UIImage imageWithData:data];
    });
});
my data is nil.
I used Fiddler to sniff the traffic, and saw that every time it happened, no request showed up in Fiddler at all!
The only times it NEVER happens are:
When I don't use the SignalR client in my app.
When I download the image synchronously:
NSData * imageData = [[NSData alloc] initWithContentsOfURL:someImgUrl.jpg ];
img.image = [UIImage imageWithData: imageData];
The way I initialize SignalR is this:
NSString *listenurl = [NSString stringWithFormat:@"%@/%@", SERVICE_URL, @"/echo"];
mConnection = [SRConnection connectionWithURL:listenurl];
[mConnection setDelegate:self];
[mConnection start:[[SRLongPollingTransport alloc] init]];
Does anyone else use the SignalR client on iOS and experience this behaviour?
It seems the problem only happens when SignalR listens on the same domain as the server you are trying to load images from.
So the (lame) solution I've found so far is to buy a second domain and listen on that one.
I don't know why it happens, though...

NSOperation and NSOperationQueue do not work

I have the following code and it does not work. Is there something going on behind it?
[operationQueue addOperationWithBlock:^{
    imageData = [NSData dataWithContentsOfURL:imageURL];
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        UIImage *image = nil;
        if (imageData) {
            UIImage *image = [UIImage imageWithData:imageData];
            cell.imageView.image = image;
        }
    }];
}];
Even if I create a subclass of NSOperation and alloc/init it, it does not work the way I expect. I always have to call start on the NSOperation subclass to run it, but I believe sending start to an NSOperation runs it on the main thread rather than in the background.
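For what it's worth, when you add an operation to an NSOperationQueue you are not supposed to call start yourself; the queue calls it for you on one of its own threads. A minimal sketch, using a hypothetical ImageDownloadOperation subclass and the imageURL from the code above:
// The queue schedules and starts the operation off the main thread.
NSOperationQueue *operationQueue = [[NSOperationQueue alloc] init];
ImageDownloadOperation *operation = [[ImageDownloadOperation alloc] initWithURL:imageURL]; // hypothetical subclass
[operationQueue addOperation:operation];
// Do NOT call [operation start] yourself: that would run it synchronously
// on whatever thread you happen to be on (e.g. the main thread).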
I want to add an alternative solution using GCD:
backgroundQueue = dispatch_queue_create("com.razeware.imagegrabber.bgqueue", NULL);
dispatch_async(backgroundQueue, ^{
    /* put the code that makes the UI unresponsive (like reading from the network) here */
    imageData = [NSData dataWithContentsOfURL:imageURL];
    ..... ;
    dispatch_async(dispatch_get_main_queue(), ^{
        /* do the UI-related work on the main thread */
        UIImage *image = [UIImage imageWithData:imageData];
        cell.imageView.image = image;
        ......;
    });
});
dispatch_release(backgroundQueue);
Let me know whether this one helped you ;)

Lazy loading of images in a table view

I'm trying to lazy-load the images of my UITableViewCells.
I'm trying to do it in the simplest possible way; I saw a lot of examples, but they go further than my purpose requires.
This is what I'm doing currently, and it's not working:
// Configure the cell...
Info *info = [self.Array objectAtIndex:indexPath.row];
cell.textLabel.text = info.name;
cell.detailTextLabel.text = info.platform;
cell.accessoryType = UITableViewCellAccessoryDisclosureIndicator;

// The image is downloaded asynchronously
NSBlockOperation *downloadingImgBlock = [NSBlockOperation blockOperationWithBlock:^{
    NSString *imageURL = info.imgURL;
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:imageURL]];
    cell.imageView.image = [UIImage imageWithData:imageData];
}];
[self.queue addOperation:downloadingImgBlock];
Why is it not working, and how could I make it work?
Man, AsyncImageView is your friend! Just set the image URL and everything is handled for you: downloading, caching. It's awesome.
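Usage is roughly the following (a sketch based on the AsyncImageView README; double-check the property name against the version you install):
#import "AsyncImageView.h"

// AsyncImageView extends UIImageView with an imageURL property;
// setting it starts the asynchronous download and caches the result.
imageView.imageURL = [NSURL URLWithString:@"http://example.com/image.jpg"];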
Try the following code:
......
__block UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
__block UIImage *img;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    img = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:imageURL]]];
    dispatch_async(dispatch_get_main_queue(), ^{
        cell.imageView.image = img;
    });
});
......
Using a 3rd-party library would be easiest, but here's how you would do it in code. You are going to want to make an NSMutableDictionary, either as a property in your .h or at the top of your .m, like this:
@implementation viewController {
    NSMutableDictionary *picList;
}
and in -initWithCoder:, or whatever init you are overriding:
picList = [[NSMutableDictionary alloc] init];
In -tableView:cellForRowAtIndexPath: (fullPicURL is the URL):
if ([self imageExists:fullPicURL]) {
    shoePic = [picList objectForKey:fullPicURL];
} else {
    NSURL *picURL = [NSURL URLWithString:fullPicURL];
    shoePic = [UIImage imageWithData:[NSData dataWithContentsOfURL:picURL]];
    NSDictionary *thisImage = [NSDictionary dictionaryWithObject:shoePic forKey:fullPicURL];
    [picList addEntriesFromDictionary:thisImage];
}
where -imageExists: is a function you write which checks the dictionary for the URL (a minimal sketch is included at the end of this answer). The object is the picture itself, the key is the URL. Then you do cell.imageView.image = shoePic;, or whatever you want to call it, and you're done with caching. In order to load them asynchronously, do this:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *data = [NSData dataWithContentsOfURL:shoes];
    [self performSelectorOnMainThread:@selector(fetchedShoeData:) withObject:data waitUntilDone:YES];
});
and at the end of -fetchedShoeData::
[self.tableView reloadData];
Then just make sure you edit -numberOfSectionsInTableView: so it adjusts itself if it has to. There might be some other things you need to do to get it working 100%; let me know.
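The -imageExists: helper mentioned above could be as simple as this (a sketch, assuming picList is the NSMutableDictionary cache keyed by URL string):
- (BOOL)imageExists:(NSString *)urlString
{
    // The cache maps URL strings to the UIImage objects already downloaded.
    return [picList objectForKey:urlString] != nil;
}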
Try this one; I always use this scheme to load images asynchronously:
(I haven't found a simpler way.)
- (void)inAnyMethod {
    // ...
    void (^blockLoadPhoto)() = ^{
        UIImage *_imageTemporary = [UIImage imageNamed:@"..."];
        if (_imageTemporary) {
            [self performSelectorOnMainThread:@selector(setPhoto:) withObject:_imageTemporary waitUntilDone:true];
        }
    };
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), blockLoadPhoto);
    // ...
}

- (void)setPhoto:(UIImage *)photo {
    // do something with the loaded image
}
I finally managed to do it by setting the image with -performSelectorOnMainThread:withObject:waitUntilDone: after the image is downloaded.

iOS: Select a GIF from the photo library, convert to NSData for use in multipart/form-data

What's currently working in my code:
I select a JPG or PNG from the Photo Library (using standard ImagePicker methods), and convert that image to NSData using:
self.myImageData = UIImageJPEGRepresentation(myImage, 0.9);
which I then post to a server using multipart/form-data.
I now want to do the same for a GIF, while retaining the original GIF data (so that an animated GIF going into the library, comes back out still animating).
In didFinishPickingMediaWithInfo, I am able to get the URL of the original GIF using
self.myGIFURL = [info objectForKey:UIImagePickerControllerReferenceURL].
Here's one example of what that might get me:
assets-library://asset/asset.GIF?id=1000000034&ext=GIF
Here are two ways I've tried to push this GIF into NSData, and each time myImageData shows (null).
I've tried to use initWithContentsOfURL:
NSData *dataFromGIFURL = [[NSData alloc] initWithContentsOfURL: myGIFURL];
self.myImageData = dataFromGIFURL;
[dataFromGIFURL release];
Then I tried converting the NSURL to a string for initWithContentsOfFile:
NSString *stringFromURL = [NSString stringWithFormat:@"%@", myGIFURL];
NSData *dataFromGIFURL = [[NSData alloc] initWithContentsOfFile: stringFromURL];
self.myImageData = dataFromGIFURL;
[dataFromGIFURL release];
Any suggestions? Thanks.
The UIImagePickerControllerReferenceURL key doesn't appear until iOS 4.1. I therefore take it as implicit in your question that it's fine to use the AssetsLibrary framework, which appeared in iOS only at 4.0. In which case, you can use the following:
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:[info objectForKey:UIImagePickerControllerReferenceURL]
        resultBlock:^(ALAsset *asset)
        {
            ALAssetRepresentation *representation = [asset defaultRepresentation];
            NSLog(@"size of asset in bytes: %d", [representation size]);

            unsigned char bytes[4];
            [representation getBytes:bytes fromOffset:0 length:4 error:nil];
            NSLog(@"first four bytes: %02x (%c) %02x (%c) %02x (%c) %02x (%c)",
                  bytes[0], bytes[0],
                  bytes[1], bytes[1],
                  bytes[2], bytes[2],
                  bytes[3], bytes[3]);

            [library autorelease];
        }
        failureBlock:^(NSError *error)
        {
            NSLog(@"couldn't get asset: %@", error);
            [library autorelease];
        }
    ];
}
So, you create an ALAssetsLibrary, ask it to find you the asset with the URL specified (it understands the assets-library:// URL scheme), then when you get the asset you grab its default representation and use that to feed you the bytes. They'll be the actual on-disk bytes, the default representation for an asset from the library being its on-disk form.
For example, selecting a particular GIF I grabbed at random from Google images, from an image picker wired up to a delegate with that method in it gives me the output:
2011-03-03 23:17:37.451 IPTest[1199:307] size of asset in bytes: 174960
2011-03-03 23:17:37.459 IPTest[1199:307] first four bytes: 47 (G) 49 (I) 46 (F) 38 (8)
So that's the beginning of the standard GIF header. Picking PNGs or JPGs gives the recognisable first four bytes of the PNG and JPG headers.
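If you want to branch on the file type before uploading, a quick check of those signature bytes might look like this (a sketch; only the GIF case is shown):
// 'G', 'I', 'F' are the first three bytes of both GIF87a and GIF89a files.
BOOL looksLikeGIF = (bytes[0] == 'G' && bytes[1] == 'I' && bytes[2] == 'F');
if (looksLikeGIF)
{
    // Keep the raw asset bytes so the animation survives the round trip.
}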
EDIT: to finish the thought, you can obviously use ALAssetRepresentation to read all of the bytes describing the file into a suitably malloc'd C buffer, then use NSData's +dataWithBytes:length: (or, more likely, +dataWithBytesNoCopy:length:freeWhenDone:) to wrap that into an NSData.
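A minimal sketch of that idea, to be dropped inside the resultBlock above (error handling omitted):
ALAssetRepresentation *representation = [asset defaultRepresentation];
long long assetSize = [representation size];

// Copy the on-disk bytes of the asset into a malloc'd buffer...
uint8_t *buffer = malloc((size_t)assetSize);
NSUInteger bytesRead = [representation getBytes:buffer
                                     fromOffset:0
                                         length:(NSUInteger)assetSize
                                          error:nil];

// ...and hand the buffer to NSData, which frees it when the data object is deallocated.
NSData *assetData = [NSData dataWithBytesNoCopy:buffer
                                         length:bytesRead
                                   freeWhenDone:YES];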
Here's a version that uses the newer Photos framework:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSURL *refUrl = [info objectForKey:UIImagePickerControllerReferenceURL];
    if (refUrl) {
        PHAsset *asset = [[PHAsset fetchAssetsWithALAssetURLs:@[refUrl] options:nil] lastObject];
        if (asset) {
            PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
            options.synchronous = YES;
            options.networkAccessAllowed = NO;
            options.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
            [[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:^(NSData * _Nullable imageData, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info) {
                NSNumber *isError = [info objectForKey:PHImageErrorKey];
                NSNumber *isCloud = [info objectForKey:PHImageResultIsInCloudKey];
                if ([isError boolValue] || [isCloud boolValue] || !imageData) {
                    // fail
                } else {
                    // success, data is in imageData
                }
            }];
        }
    }
}
Here's Eli's version using Swift 3:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String: Any]) {
    guard let imageURL = info[UIImagePickerControllerReferenceURL] as? URL else { return }
    guard let asset = PHAsset.fetchAssets(withALAssetURLs: [imageURL], options: nil).lastObject else { return }
    if picker.sourceType == .photoLibrary || picker.sourceType == .savedPhotosAlbum {
        let options = PHImageRequestOptions()
        options.isSynchronous = true
        options.isNetworkAccessAllowed = false
        options.deliveryMode = .highQualityFormat
        PHImageManager.default().requestImageData(for: asset, options: options) { data, uti, orientation, info in
            guard let info = info else { return }
            if let error = info[PHImageErrorKey] as? Error {
                log.error("Cannot fetch data for GIF image: \(error)")
                return
            }
            if let isInCloud = info[PHImageResultIsInCloudKey] as? Bool, isInCloud {
                log.error("Cannot fetch data from cloud. Option for network access not set.")
                return
            }
            // do something with data (it is a Data object)
        }
    } else {
        // do something with media taken via camera
    }
}

Set UIImageView image using a url

Is it possible to read a url to an image and set a UIImageView to the image at this url?
For such a straightforward task, I would highly recommend against integrating projects such as Three20; the library is a monster and not very straightforward to get up and running.
I would instead recommend this approach:
NSString *imageUrl = @"http://www.foo.com/myImage.jpg";
[NSURLConnection sendAsynchronousRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:imageUrl]] queue:[NSOperationQueue mainQueue] completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    myImageView.image = [UIImage imageWithData:data];
}];
EDIT for Swift 3.0
let urlString = "http://www.foo.com/myImage.jpg"
guard let url = URL(string: urlString) else { return }
URLSession.shared.dataTask(with: url) { (data, response, error) in
    if error != nil {
        print("Failed fetching image:", error)
        return
    }
    guard let response = response as? HTTPURLResponse, response.statusCode == 200 else {
        print("Not a proper HTTPURLResponse or statusCode")
        return
    }
    DispatchQueue.main.async {
        self.myImageView.image = UIImage(data: data!)
    }
}.resume()
EDIT for Swift 2.0
let request = NSURLRequest(URL: NSURL(string: urlString)!)
NSURLSession.sharedSession().dataTaskWithRequest(request) { (data, response, error) -> Void in
    if error != nil {
        print("Failed to load image for url: \(urlString), error: \(error?.description)")
        return
    }
    guard let httpResponse = response as? NSHTTPURLResponse else {
        print("Not an NSHTTPURLResponse from loading url: \(urlString)")
        return
    }
    if httpResponse.statusCode != 200 {
        print("Bad response statusCode: \(httpResponse.statusCode) while loading url: \(urlString)")
        return
    }
    dispatch_async(dispatch_get_main_queue(), { () -> Void in
        self.myImageView.image = UIImage(data: data!)
    })
}.resume()
NSString *ImageURL = @"YourURLHere";
NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:ImageURL]];
imageView.image = [UIImage imageWithData:imageData];
It's possible to load an NSData from a URL and convert that to an image and stick that in the image view, but this is an extremely bad idea because it involves doing a synchronous URL download on the main thread. This will lock up the UI and possibly cause the user to think your app has crashed if the image doesn't download extremely fast.
Edit: To clarify, the original question looks to me like the poster wants to say something along the lines of
imageView.image = [UIImage imageWithContentsOfURL:theURL];
Hence my answer. It's possible to do the equivalent by going through NSData, but it's a bad idea. Instead, the image should be downloaded asynchronously using NSURLConnection, and only once it's fully downloaded should it be converted into a UIImage and assigned to the image view.
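For example, a minimal asynchronous sketch using NSURLSession (iOS 7 and later), where theURL and imageView stand in for your own URL and image view:
NSURLSessionDataTask *task = [[NSURLSession sharedSession]
        dataTaskWithURL:theURL
      completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error != nil || data == nil) {
        return; // handle the failure however suits your app
    }
    UIImage *downloadedImage = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        imageView.image = downloadedImage; // UIKit work goes back on the main thread
    });
}];
[task resume];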
I am using https://github.com/rs/SDWebImage, which is a beautifully designed library that gives you a placeholder image, memory and disk caching, and a downloader you can use independently of a UIImageView.
I was trying to do that by myself but the library took all the pain away.
All you need to do is write the code below for your case:
#import "UIImageView+WebCache.h"
[imageView sd_setImageWithURL:[NSURL URLWithString:@"http://www.domain.com/path/to/image.jpg"] placeholderImage:[UIImage imageNamed:@"placeholder.png"]];
Worked like a charm for me.
If you want to display an image from a URL in a UIImageView, and also want to cache the image to cut down on server round-trips, this may help. Just pass your UIImageView object and the URL string to this function:
- (void)downloadingServerImageFromUrl:(UIImageView *)imgView AndUrl:(NSString *)strUrl {
    strUrl = [strUrl encodeUrl];
    NSString *theFileName = [NSString stringWithFormat:@"%@.png", [[strUrl lastPathComponent] stringByDeletingPathExtension]];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSString *fileName = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"tmp/%@", theFileName]];

    imgView.backgroundColor = [UIColor darkGrayColor];
    UIActivityIndicatorView *actView = [[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleWhite];
    [imgView addSubview:actView];
    [actView startAnimating];

    CGSize boundsSize = imgView.bounds.size;
    CGRect frameToCenter = actView.frame;
    // center horizontally
    if (frameToCenter.size.width < boundsSize.width)
        frameToCenter.origin.x = (boundsSize.width - frameToCenter.size.width) / 2;
    else
        frameToCenter.origin.x = 0;
    // center vertically
    if (frameToCenter.size.height < boundsSize.height)
        frameToCenter.origin.y = (boundsSize.height - frameToCenter.size.height) / 2;
    else
        frameToCenter.origin.y = 0;
    actView.frame = frameToCenter;

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(queue, ^{
        NSData *dataFromFile = nil;
        NSData *dataFromUrl = nil;
        dataFromFile = [fileManager contentsAtPath:fileName];
        if (dataFromFile == nil) {
            dataFromUrl = [[[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:strUrl]] autorelease];
        }
        dispatch_sync(dispatch_get_main_queue(), ^{
            if (dataFromFile != nil) {
                imgView.image = [UIImage imageWithData:dataFromFile];
            } else if (dataFromUrl != nil) {
                imgView.image = [UIImage imageWithData:dataFromUrl];
                NSString *fileName = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"tmp/%@", theFileName]];
                BOOL filecreationSuccess = [fileManager createFileAtPath:fileName contents:dataFromUrl attributes:nil];
                if (filecreationSuccess == NO) {
                    NSLog(@"Failed to create the image file");
                }
            } else {
                imgView.image = [UIImage imageNamed:@"NO_Image.png"];
            }
            [actView removeFromSuperview];
            [actView release];
            [imgView setBackgroundColor:[UIColor clearColor]];
        });
    });
}
Here's an updated answer to the best answer, as imageWithContentsOfURL is no longer a method of UIImage. You'll have to use CIImage:
NSURL *url = [NSURL URLWithString:@"http://url_goes_here.com/logo.png"];
imageView.image = [UIImage imageWithCIImage:[CIImage imageWithContentsOfURL:url]];
Unfortunately this feature is not available as of this writing, so you will have to implement the functionality yourself by:
Downloading the data of the image
Saving or caching it somewhere (database or filesystem), and then
Setting the UIImageView to the saved data
Fortunately you don't have to rack your brains coming up with said functionality, as Apple provides an example that does exactly that as part of their code samples.
Follow the code and I'm sure you will be able to adapt it to your needs.
EGOImageLoading
EGOImageView *imageView = [[EGOImageView alloc] initWithPlaceholderImage:[UIImage imageNamed:@"placeholder.png"]];
imageView.frame = CGRectMake(0.0f, 0.0f, 36.0f, 36.0f);
// show the placeholder image instantly
[self.anyView addSubview:imageView];
[imageView release]; // if you want
// load the image from the url asynchronously, with caching, automagically
imageView.imageURL = [NSURL URLWithString:photoURL];
If you want more, there is a delegate to handle actions after loading
@protocol EGOImageViewDelegate <NSObject>
@optional
- (void)imageViewLoadedImage:(EGOImageView *)imageView;
- (void)imageViewFailedToLoadImage:(EGOImageView *)imageView error:(NSError *)error;
@end
Another option is TTImageView from the Three20 library which handles asynchronous loading of an image. The control works great. However, the downside is you have to deal with the nightmare of installing Three20 (which is then almost impossible to fully remove).
Use the AFNetworking category UIImageView+AFNetworking:
#import "UIImageView+AFNetworking.h"
[profileImageView setImageWithURL:[NSURL URLWithString:photoURL] placeholderImage:[UIImage imageNamed:@"placeholder"]];