Load an image from a URL using multiple threads - iPhone

I'm loading an image from a URL into my app. The image is large (around 1.5 MB). How can I use multiple threads (e.g. two) to load this image faster? With a single thread it takes around 5 seconds, and I want to reduce that.

You are correct that 1.5 MB is a big image, but the way to optimise is NOT to use many threads, although you are on the right track. The technique is called "slicing" and is heavily used on the web to load images faster. Take the image and slice it into 3 or 4 smaller pictures (and not more) on your server. When rendering, request these slices all at once: together they load faster than one big picture, and this also lessens the "perceived" latency for the end user.
Also, when you slice up an image, it becomes easier to reduce the number of colors needed to display each portion, which can reduce the file size (sometimes fairly significantly).
As an example, Google used to do this for the logo on its main search page; the logo was served as 4 split-up images.
The downside of slicing is that it increases maintenance costs. Someone has to maintain the image slices and make sure nothing goes amiss as the app keeps changing.
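To illustrate the rendering side, here is a minimal sketch that fetches four slices concurrently and assigns each to its own tile; the slice URLs and the sliceViews array are assumptions, not part of the original question:
- (void)loadSlicedImage
{
    // Hypothetical URLs of the four quarter-images produced on the server.
    NSArray *sliceURLs = @[@"https://example.com/image_slice_0.jpg",
                           @"https://example.com/image_slice_1.jpg",
                           @"https://example.com/image_slice_2.jpg",
                           @"https://example.com/image_slice_3.jpg"];
    for (NSUInteger i = 0; i < sliceURLs.count; i++) {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            // Each slice downloads on a background queue...
            NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:sliceURLs[i]]];
            UIImage *slice = [UIImage imageWithData:data];
            dispatch_async(dispatch_get_main_queue(), ^{
                // ...and is displayed as soon as it arrives. self.sliceViews is a
                // hypothetical array of four UIImageViews laid out in a 2x2 grid.
                ((UIImageView *)self.sliceViews[i]).image = slice;
            });
        });
    }
}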

Please try the following code:
// In the .h file, declare the following:
IBOutlet UIImageView *imgTest;
- (IBAction)buttonTapped:(id)sender;
- (void)loadImage:(NSString *)urlString;
- (void)setImage:(NSData *)imgData;

// In the .m file, write the following:
- (IBAction)buttonTapped:(id)sender
{
    // Do the download off the main thread so the UI stays responsive.
    [self performSelectorInBackground:@selector(loadImage:)
                           withObject:@"http://www.google.com/images/errors/logo_sm.gif"];
}

- (void)loadImage:(NSString *)urlString
{
    NSURL *imgURL = [NSURL URLWithString:urlString];
    NSData *imgData = [NSData dataWithContentsOfURL:imgURL];
    // UIKit must only be touched on the main thread, so hand the data back there.
    [self performSelectorOnMainThread:@selector(setImage:)
                           withObject:imgData
                        waitUntilDone:NO];
}

- (void)setImage:(NSData *)imgData
{
    imgTest.image = [UIImage imageWithData:imgData];
}
You can show an activity indicator while the image loads as well: start it in the buttonTapped: method and stop it in setImage:.
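For example, assuming an IBOutlet named activityIndicator (not part of the original code):
- (IBAction)buttonTapped:(id)sender
{
    [self.activityIndicator startAnimating];   // spinner visible while downloading
    [self performSelectorInBackground:@selector(loadImage:)
                           withObject:@"http://www.google.com/images/errors/logo_sm.gif"];
}

- (void)setImage:(NSData *)imgData
{
    imgTest.image = [UIImage imageWithData:imgData];
    [self.activityIndicator stopAnimating];    // called on the main thread, so this is safe
}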
I hope this helps.

Related

Showing accurate progress in a UIProgressView while downloading images on iPhone

I have a handful of URLs that point to images.
I'm downloading those images and saving them into the Documents folder.
Here is my code:
- (void)viewDidLoad
{
    [super viewDidLoad];
    NSMutableArray *myUrlsArray = [[NSMutableArray alloc] init];
    [myUrlsArray addObject:@"http://blogs.sfweekly.com/thesnitch/steve_jobs3.jpg"];
    [myUrlsArray addObject:@"http://www.droid-life.com/wp-content/uploads/2012/12/Steve-Jobs-Apple.jpg"];
    [myUrlsArray addObject:@"http://2.bp.blogspot.com/-T6nbl0rQoME/To0X5FccuCI/AAAAAAAAEZQ/ipUU7JfEzTs/s1600/steve-jobs-in-time-magazine-front-cover.png"];
    [myUrlsArray addObject:@"http://images.businessweek.com/ss/08/09/0929_most_influential/image/steve_jobs.jpg"];
    [myUrlsArray addObject:@"http://cdn.ndtv.com/tech/gadget/image/steve-jobs-face.jpg"];

    for (int i = 0; i < myUrlsArray.count; i++)
    {
        [self downloadImageFromURL:[myUrlsArray objectAtIndex:i]
                          withName:[NSString stringWithFormat:@"MyImage%i.jpeg", i]];
    }
}
#pragma mark - Downloading file
- (void)downloadImageFromURL:(NSString *)myURLString withName:(NSString *)fileName
{
    UIImage *image = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:myURLString]]];
    NSLog(@"%f,%f", image.size.width, image.size.height);

    // Save the file into the Documents folder.
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *jpegPath = [NSString stringWithFormat:@"%@/%@", documentsPath, fileName]; // keep this path if you want to store a reference in SQLite
    NSData *data2 = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0f)]; // 1.0f = 100% quality
    [data2 writeToFile:jpegPath atomically:YES];
}
Now I need to display a UIProgressView that accurately reflects the progress of these downloads.
How can I achieve this?
Can anyone provide some guidance?
Thanks in advance.
I'd suggest you use some asynchronous downloading technique (either AFNetworking, SDWebImage, or roll your own with delegate-based NSURLSession) rather than dataWithContentsOfURL so that (a) you don't block the main queue; and (b) you can get progress updates as the downloads proceed.
I'd also suggest creating an NSProgress for each download. When your delegate method gets updates about how many bytes have been downloaded, update that NSProgress object.
You can then assign each NSProgress to the observedProgress property of a UIProgressView, and when you update your NSProgress, the UI updates automatically.
Or, if you want a single UIProgressView to show the aggregate progress of all of the downloads, you can create a parent NSProgress, establish each download's NSProgress as a child of that parent, and then, as each download updates its respective NSProgress, the parent NSProgress is recalculated automatically. Again, you can tie that parent NSProgress to a master UIProgressView, and the UI will show the total progress just by having each download update its individual NSProgress.
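A minimal sketch of that parent/child wiring, assuming five downloads and a startDownloadOfURL:progress: helper of your own that updates the child's completedUnitCount from the session delegate callbacks:
// One parent NSProgress for all downloads, tied to a single UIProgressView
// (observedProgress requires iOS 9 or later).
self.overallProgress = [NSProgress progressWithTotalUnitCount:5];
self.progressView.observedProgress = self.overallProgress;

for (NSURL *url in imageURLs) {
    [self.overallProgress becomeCurrentWithPendingUnitCount:1];
    // Created while overallProgress is "current", so this becomes a child
    // worth one unit of the parent's total.
    NSProgress *childProgress = [NSProgress progressWithTotalUnitCount:1];
    [self.overallProgress resignCurrent];

    // Hypothetical helper: kicks off the download and updates childProgress
    // as bytes arrive.
    [self startDownloadOfURL:url progress:childProgress];
}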
There is a trick, though, insofar as some web services will not inform you of the number of bytes to be expected. They'll report an "expected number of bytes" of NSURLResponseUnknownLength, i.e. -1! (There are logical reasons why it does that which are probably beyond the scope of this question.) That obviously makes it hard to calculate what percentage has been downloaded.
In that case, there are a few approaches:
You can throw up your hands and just use an indeterminate progress indicator;
You can try changing the request such that web service will report meaningful "expected number of bytes" values (e.g. https://stackoverflow.com/a/22352294/1271826); or
You can use an "estimated download size" to estimate the percentage completion. For example, if you know your images are, on average, 100kb each, you can do something like the following to update the NSProgress associated with a particular download:
if (totalBytesExpectedToWrite >= totalBytesWritten) {
    self.progress.totalUnitCount = totalBytesExpectedToWrite;
} else {
    if (totalBytesWritten <= 0) {
        self.progress.totalUnitCount = kDefaultImageSize;
    } else {
        double written = (double)totalBytesWritten;
        double percent = tanh(written / (double)kDefaultImageSize);
        self.progress.totalUnitCount = written / percent;
    }
}
self.progress.completedUnitCount = totalBytesWritten;
This is a bit of sleight of hand that uses the tanh function to return a "percent complete" value that smoothly and asymptotically approaches 100%, using the kDefaultImageSize as the basis for the estimation.
It's not perfect, but it yields a pretty decent proxy for percent completion.
Your call to dataWithContentsOfURL is synchronous, meaning you don't get updates while the download is in progress.
You can use a library like AFNetworking (https://github.com/AFNetworking/AFNetworking) which has callbacks to the progress of the download.
Actually, a better solution is to use SDWebImage, which will load the images in the background for you and cache them; the next time you use that image it will be served from the cache. Google it.
That way the user also doesn't have to sit around and wait while you're downloading everything.
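A minimal sketch of that approach, using SDWebImage's UIImageView category (the placeholder name is an assumption, and the exact method prefix varies between SDWebImage versions):
#import <SDWebImage/UIImageView+WebCache.h>

// Downloads asynchronously, caches to memory and disk, and displays the image;
// repeat requests for the same URL are served from the cache.
[self.imageView sd_setImageWithURL:[NSURL URLWithString:@"http://cdn.ndtv.com/tech/gadget/image/steve-jobs-face.jpg"]
                  placeholderImage:[UIImage imageNamed:@"placeholder"]];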
Then look at this other question, which has some ideas on how to show loading status:
How to show an activity indicator in SDWebImage
Do not use dataWithContentsOfURL; it blocks the main thread until the data arrives.
Instead, create your own connection with NSURLConnection and listen to its delegate callbacks:
connection:didReceiveResponse: - get the total data size with [response expectedContentLength].
connection:didReceiveData: - this is where you do your calculations and update your UIProgressView, something like loadedBytes / total data size.
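A minimal sketch of that delegate wiring; the expectedLength, receivedData, and progressView properties are assumptions, not part of the question's code:
- (void)startDownloadForURLString:(NSString *)urlString
{
    NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:urlString]];
    (void)[[NSURLConnection alloc] initWithRequest:request delegate:self]; // starts loading immediately
}

- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    // May be NSURLResponseUnknownLength (-1) if the server doesn't report a size.
    self.expectedLength = [response expectedContentLength];
    self.receivedData = [NSMutableData data];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    [self.receivedData appendData:data];
    if (self.expectedLength > 0) {
        self.progressView.progress = (float)self.receivedData.length / (float)self.expectedLength;
    }
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection
{
    self.progressView.progress = 1.0f;
    // self.receivedData now holds the complete image; write it to the Documents
    // folder or create a UIImage from it here.
}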
Good luck.

Get amount of memory used by app in iOS

I'm working on an upload app that splits files before uploading. It splits the files to avoid being killed by iOS for using too much memory, since some of the files can be rather large. It would be great if, instead of setting a maximum "chunk" size, I could set a maximum memory usage and derive the chunk size from that.
Something like this:
#define MAX_MEM_USAGE 20000000 // 20MB
#define MIN_CHUNK_SIZE 5000    // 5KB
- (void)uploadAsset:(ALAsset *)asset
{
    long long totalBytesRead = 0;
    ALAssetRepresentation *representation = [asset defaultRepresentation];
    while (totalBytesRead < [representation size])
    {
        long long chunkSize = MAX_MEM_USAGE - [self getCurrentMemUsage];
        // If I can't get 5KB without getting killed, then I'm going to get killed anyway.
        chunkSize = MIN([representation size] - totalBytesRead, MAX(chunkSize, MIN_CHUNK_SIZE));
        uint8_t *buffer = malloc(chunkSize);
        // Read the file chunk in here, adding the result to totalBytesRead.
        // Upload the chunk here.
        free(buffer);
    }
}
Is essentially what I'm going for. I can't seem to find a way to get the current memory usage of my app specifically. I don't really care about the amount of system memory left.
The only way I've been able to think of is one I don't like much: grab the amount of system memory on the first line of main in my app, store it in a static variable in a globals class, and then have getCurrentMemUsage go something like this:
- (long)getCurrentMemUsage
{
    long sysUsage = [self getSystemMemoryUsed];
    return sysUsage - [Globals origSysUsage];
}
This has some serious drawbacks. The most obvious one is that another app might get killed in the middle of my upload, which could drop sysUsage below origSysUsage and produce a negative number even if my app is using 10 MB of memory; that could lead my app to use 40 MB for a request rather than the 20 MB maximum. I could clamp the value between MIN_CHUNK_SIZE and MAX_MEM_USAGE, but that would be a workaround rather than an actual solution.
I'd appreciate any suggestions, either for getting the amount of memory used by the app or for different ways of managing a dynamic chunk size.
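For reference, one common way to query the app's own resident memory (what a getSystemMemoryUsed-style helper might wrap; the question does not show its implementation) is the Mach task_info call:
#import <mach/mach.h>

// Returns the app's current resident memory in bytes, or -1 on failure.
static int64_t currentResidentMemory(void)
{
    struct mach_task_basic_info info;
    mach_msg_type_number_t count = MACH_TASK_BASIC_INFO_COUNT;
    kern_return_t kr = task_info(mach_task_self(),
                                 MACH_TASK_BASIC_INFO,
                                 (task_info_t)&info,
                                 &count);
    return (kr == KERN_SUCCESS) ? (int64_t)info.resident_size : -1;
}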
As with any virtual-memory operating system, "memory used" is not very well defined and is notoriously difficult to calculate.
Fortunately, thanks to the virtual memory manager, your problem can be solved quite easily: the mmap() C function. Basically, it allows your app to virtually load the file into memory, treating it as if it were in RAM, but it is actually swapped in from storage as it is accessed, and swapped out when iOS is low on memory.
This function is really easy to use in iOS with the Cocoa APIs for it:
- (void)uploadMyFile:(NSString *)fileName
{
    NSData *fileData = [NSData dataWithContentsOfMappedFile:fileName];
    // Work with the data as with any NSData* object. The iOS kernel
    // will take care of loading the file as needed.
}
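For what it's worth, dataWithContentsOfMappedFile: has since been deprecated; the same mapped-read behavior is available through the options-based initializer, roughly:
// Memory-mapped read: pages are faulted in from storage on access rather than
// loaded into RAM up front.
NSError *error = nil;
NSData *fileData = [NSData dataWithContentsOfFile:fileName
                                          options:NSDataReadingMappedIfSafe
                                            error:&error];
if (fileData == nil) {
    NSLog(@"Could not map file: %@", error);
}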

Ignoring iOS high-resolution @2x files

I have an app with a large number of images (1000+). I have a database table which contains all the filenames for these images. Images are loaded on demand based on this filename using UIImage* image = [UIImage imageNamed:...]
Due to the large number of images, I would like to programmatically test to ensure all image that are included in the database are in fact present in the project. To achieve this, I'm pulling all filenames from the table, looping over each filename, running the above code, and checking to see if image != nil. This works just fine.
The problem is that I would like to confirm that both normal-resolution and high-resolution (@2x) images are there. If the high-resolution file is present but the normal-resolution file is not, my code will not detect this.
Is there some way I can achieve this without having to run this process twice, once per resolution type? Can I force the SDK to ignore @2x files?
You should just use NSFileManager. It will probably be much quicker since it won't actually load the contents of the image files. For each fileName, use NSFileManager's fileExistsAtPath method to check for the image. Then, append "@2x" to the base name to check for the 2x image.
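A minimal sketch of that check, assuming the images live at the top level of the main bundle and fileName is one of the names pulled from the database (e.g. "MyImage.png"):
// Checks both the 1x and the @2x variant for a given file name.
NSString *resourcePath = [[NSBundle mainBundle] resourcePath];
NSFileManager *fileManager = [NSFileManager defaultManager];

NSString *normalPath = [resourcePath stringByAppendingPathComponent:fileName];
NSString *retinaName = [NSString stringWithFormat:@"%@@2x.%@",
                        [fileName stringByDeletingPathExtension],
                        [fileName pathExtension]];
NSString *retinaPath = [resourcePath stringByAppendingPathComponent:retinaName];

BOOL hasNormal = [fileManager fileExistsAtPath:normalPath];
BOOL hasRetina = [fileManager fileExistsAtPath:retinaPath];
if (!hasNormal || !hasRetina) {
    NSLog(@"Missing variant(s) for %@ (1x: %d, @2x: %d)", fileName, hasNormal, hasRetina);
}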
I would use NSBundle to locate the files so the UIImage doesn't have to be loaded into memory. You could use something like:
NSBundle *myBundle = [NSBundle mainBundle];
if ([myBundle pathForResource:@"MyImage" ofType:@"png"] == nil) {
    // low-res image isn't there
}
if ([myBundle pathForResource:@"MyImage@2x" ofType:@"png"] == nil) {
    // high-res image isn't there
}
Should be much faster...

TTImageView shows black image (OR: replacement for TTImageView)

I'm using Three20's TTImageView for its async image loading + caching.
I've noticed this issue a bunch of times where an image will show up as completely black, and never finish loading. Here's an example of what I'm seeing:
http://screencast.com/t/7O7fnedX5Z2
So...basically I'm wondering if this is a bug in three20, and if so, how I might go about fixing it (is there a patch out there that might fix it)...OR:
Is there a good TTImageView replacement that performs async image loading + caching?
Turns out I was seeing the following in my log:
TTRequestLoader connection:didReceiveResponse:: TTDASSERT failed: 0 == _queue.maxContentLength || contentLength <= _queue.maxContentLength
(one for each failed image)
After that, a little bit of googling rendered:
http://groups.google.com/group/three20/browse_thread/thread/8bfac3654a6d9674/caf797f265445971?pli=1
Jeff Verkoeyen:
The comment immediately before that assert should shed some light on the situation.
// If you hit this assertion it's because a massive file is about to be downloaded.
// If you're sure you want to do this, add the following line to your app delegate startup
// method. Setting the max content length to zero allows anything to go through. If you just
// want to raise the limit, set it to any positive byte size.
// [[TTURLRequestQueue mainQueue] setMaxContentLength:0]
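In practice that means adding the line from that comment to the app delegate's launch method, roughly:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // 0 removes Three20's content-length limit entirely; any positive byte
    // count raises the limit instead.
    [[TTURLRequestQueue mainQueue] setMaxContentLength:0];
    // ... the rest of the normal launch setup ...
    return YES;
}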

How to Load an array into OpenFlow

I'm trying to implement OpenFlow in my project, but I can't seem to get the images to show up in my UIView. What isn't clear to me is, once I have the dictionary of image links, how do I tell the AFOpenFlowView that I want to use that dictionary as my data source?
I've looked at the demo code and I see that when the Flickr request finishes, he saves a copy of the dictionary results, counts them, and tells the OpenFlowView that there are x number of images, but it is never clear how he tells the OpenFlowView to use the dictionary with the results:
- (void)flickrAPIRequest:(OFFlickrAPIRequest *)inRequest didCompleteWithResponse:(NSDictionary *)inResponseDictionary
{
    // Hold onto the response dictionary.
    interestingPhotosDictionary = [inResponseDictionary retain];
    int numberOfImages = [[inResponseDictionary valueForKeyPath:@"photos.photo"] count];
    [(AFOpenFlowView *)self.view setNumberOfImages:numberOfImages];
}
See here: http://blog.objectgraph.com/index.php/2010/04/09/how-to-add-coverflow-effect-on-your-iphone-app-openflow/
This tutorial seems to suggest that you have to call the view's setImage method multiple times, once per image.
This tells me that the implementation is confusing and weird, but for this you have to blame the component's author.
The images are loaded on demand in the 'updateCoverImage:' method of AFOpenFlowView.m
'updateCoverImage:' calls 'openFlowView:requestImageForIndex:' in AFOpenFlowViewController.m, which uses interestingPhotosDictionary.
So, it is called on demand whenever an image needs to be loaded. It wraps an operation queue so the images are loaded outside the main thread.
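A rough sketch of what that on-demand callback looks like in AFOpenFlowViewController.m; the URLForFlickrPhoto: helper is hypothetical, and the setImage:forIndex: setter (the "setImage method" the tutorial above refers to) may be named slightly differently in your copy of OpenFlow:
- (void)openFlowView:(AFOpenFlowView *)openFlowView requestImageForIndex:(int)index
{
    // Look up the photo metadata saved in flickrAPIRequest:didCompleteWithResponse:.
    NSDictionary *photo = [[interestingPhotosDictionary valueForKeyPath:@"photos.photo"] objectAtIndex:index];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Deriving the actual image URL from the photo dictionary is omitted;
        // URLForFlickrPhoto: is a stand-in for that step.
        NSURL *photoURL = [self URLForFlickrPhoto:photo];
        UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:photoURL]];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Hand the loaded image back to the view on the main thread.
            [(AFOpenFlowView *)self.view setImage:image forIndex:index];
        });
    });
}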