Cache to save images in dynamic memory (iPhone)

I am implementing a cache in my iOS app that keeps downloaded images in RAM.
I did some research and found some code, but most of it was for caching images to permanent storage.
I tried NSCache but couldn't make it work for my needs.
The requirements are:
A limit on the number of images stored, e.g. 100.
When the cache limit is reached, it should remove the oldest image before adding a new one.
I'm not sure about the exact term, but I think this is called a FIFO (first in, first out) cache.
After some research, I did the following implementation.
static NSMutableDictionary *thumbnailImagesCache = nil;

+ (UIImage *)imageWithURL:(NSString *)_imageURL
{
    if (thumbnailImagesCache == nil) {
        thumbnailImagesCache = [NSMutableDictionary dictionary];
    }

    UIImage *image = nil;
    if ((image = [thumbnailImagesCache objectForKey:_imageURL])) {
        DLog(@"image found in Cache");
        return image;
    }

    /* the image was not found in the cache - the object sending the request for the image
       is responsible for downloading it and saving it to the cache */
    DLog(@"image not found in cache");
    return nil;
}

+ (void)saveImageForURL:(UIImage *)_image URLString:(NSString *)_urlString
{
    if (thumbnailImagesCache == nil) {
        thumbnailImagesCache = [NSMutableDictionary dictionary];
    }

    if (_image && _urlString) {
        DLog(@"adding image to cache");
        if (thumbnailImagesCache.count > 100) {
            NSArray *keys = [thumbnailImagesCache allKeys];
            NSString *key0 = [keys objectAtIndex:0];
            [thumbnailImagesCache removeObjectForKey:key0];
        }
        [thumbnailImagesCache setObject:_image forKey:_urlString];
        DLog(@"images count in cache = %d", thumbnailImagesCache.count);
    }
}
Now the problem is that I'm not sure whether this is a correct/efficient solution. Does anyone have a better idea/solution?

Your assumption about the order of the keys is certainly incorrect. The order of the keys in an NSDictionary is unspecified; the key and value at index 0 need not be the oldest ones. You should store the creation date of each image in the method where you put it into the cache dictionary.
Apart from that, the rest of the code seems valid.
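One way to get deterministic FIFO eviction without storing dates is to keep a parallel array of keys in insertion order. The following is an illustrative sketch, not the answerer's code; the names imageCache, keyOrder, cacheImage:forURL: and the limit of 100 are invented for the example:

// Minimal FIFO image cache sketch (assumed names; not the asker's exact API).
static NSMutableDictionary *imageCache = nil; // key (URL string) -> UIImage
static NSMutableArray *keyOrder = nil;        // keys in insertion order, oldest first
static const NSUInteger kCacheLimit = 100;

+ (void)cacheImage:(UIImage *)image forURL:(NSString *)urlString
{
    if (image == nil || urlString == nil) return;

    if (imageCache == nil) {
        imageCache = [[NSMutableDictionary alloc] init];
        keyOrder = [[NSMutableArray alloc] init];
    }

    // Evict the oldest entry once the limit is reached.
    if (keyOrder.count >= kCacheLimit) {
        NSString *oldestKey = [keyOrder objectAtIndex:0];
        [imageCache removeObjectForKey:oldestKey];
        [keyOrder removeObjectAtIndex:0];
    }

    // Note: re-caching the same URL would add a duplicate key entry here; a
    // fuller implementation should remove any existing key first.
    [imageCache setObject:image forKey:urlString];
    [keyOrder addObject:urlString];
}

+ (UIImage *)cachedImageForURL:(NSString *)urlString
{
    return [imageCache objectForKey:urlString];
}

Alternatively, NSCache exposes a countLimit property, so setting countLimit to 100 gives a size-capped in-memory cache for free, though its eviction order is implementation-defined rather than strictly FIFO.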

Related

iOS NSArray inserting object at an index

I'm looking at the crash log for a device that's testing an app and I see the following lines...
objc_exception_throw + 33
[__NSArrayM insertObject:atIndex:] +187
The code where this happens is below. appData is an NSDictionary, and I'm expecting imageUrl to be a URL to a png file on the internet.
for (int i = 1; i <= [self getNumberOfScreenshots]; i++) {
    pathToUrl = @"screenshot_";
    pathToUrl = [pathToUrl stringByAppendingString:[[NSNumber numberWithInt:i] stringValue]];
    imageUrl = [self.appData valueForKey:pathToUrl];
    imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:imageUrl]];
    [NSMutableArrayObj addObject:imageData];
}
What would cause this type of error? The error happens very rarely. Could it be that imageData is sometimes nil because it fails to download the PNG image from the URL, and that throws the exception when I try to add it to the NSMutableArrayObj?
Thanks!
There may be two reasons, as you have not described the error more specifically:
1) You are not actually allocating memory for the array.
2) You are inserting a nil value into the array. To stop inserting nil, do this:
if (imageData)
{
    [NSMutableArrayObj addObject:imageData];
}
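Putting that advice together, a sketch of the loop with guards for both a missing URL string and a failed download; the local declarations and the stringWithFormat: construction are slight departures from the question's code:

for (int i = 1; i <= [self getNumberOfScreenshots]; i++) {
    NSString *pathToUrl = [NSString stringWithFormat:@"screenshot_%d", i];
    NSString *imageUrl = [self.appData valueForKey:pathToUrl];

    // valueForKey: may return nil if the key is missing from appData.
    if (imageUrl == nil) continue;

    // initWithContentsOfURL: returns nil if the download fails.
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:imageUrl]];
    if (imageData) {
        [NSMutableArrayObj addObject:imageData];
    }
}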

ALAsset Image Size

Given an ALAsset that represents a photo, is it possible to retrieve the size of the photo (height and width) without loading the image into a UIImageView and also without using the aspectRatioThumbnail method?
Just a note:
iOS 5.1 introduces a new property dimensions for an ALAssetRepresentation instance. This returns a CGSize structure with the dimensions of the original image and might be the best solution for this problem in the future.
Cheers,
Hendrik
float width = asset.defaultRepresentation.dimensions.width;
float height = asset.defaultRepresentation.dimensions.height;
It's fast, stable, and gives the actual dimensions. I've used it with ALAssets of videos.
A simpler way to access the image size is through [ALAssetRepresentation metadata]. On the images I tested on, this NSDictionary contained keys named PixelWidth and PixelHeight, which were NSNumber objects with the values you'd expect.
However, there seem to be no particular guarantees about the exact keys you'll find, so make sure your app can deal with cases where those keys aren't in the metadata. Also see iOS ALAsset image metadata for some cautions about speed and thread safety.
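As a hedged sketch of reading those metadata keys (the "PixelWidth"/"PixelHeight" keys are observed behaviour, as noted above, not a documented guarantee):

// Sketch: read the pixel size from the representation's metadata, if present.
ALAssetRepresentation *rep = [asset defaultRepresentation];
NSDictionary *metadata = [rep metadata];
NSNumber *pixelWidth  = [metadata objectForKey:@"PixelWidth"];
NSNumber *pixelHeight = [metadata objectForKey:@"PixelHeight"];

CGSize size = CGSizeZero;
if (pixelWidth != nil && pixelHeight != nil) {
    size = CGSizeMake(pixelWidth.floatValue, pixelHeight.floatValue);
} else {
    // Fall back to the iOS 5.1+ dimensions property, or to reading the image data.
    size = rep.dimensions;
}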
Comparison
I tested both methods - loading the image data in a CGImageSourceRef or reading the metadata - on my iPad's entire asset library. Both methods returned the same sizes to within FLT_EPSILON. Apart from 2 outliers which took double the time, the run times over 16 repetitions were very similar:
Method | Mean time +/- 95% confidence
Size from CGImageSourceRef | 0.1787 +/- 0.0004
Size from metadata | 0.1789 +/- 0.0015
So, neither method has a performance benefit. It is entirely possible that the metadata dictionary is constructed on demand by reading the image data.
Update
This didn't work as originally offered, as noted in the comments. I've fixed it, but it now loads all the image data, which the OP was trying to avoid. It still avoids the additional, and even more expensive, step of decompressing the data into an image.
1. Get the defaultRepresentation of the ALAsset.
2. Get the data for the ALAssetRepresentation.
3. Use an adaptation of this handy sizeOfImageAtURL function. Thank you, shpakovski.
The code below represents the steps above.
// This method requires the ImageIO.framework
// This requires memory for the size of the image in bytes, but does not decompress it.
- (CGSize)sizeOfImageWithData:(NSData *)data
{
    CGSize imageSize = CGSizeZero;
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    if (source)
    {
        NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO]
                                                             forKey:(NSString *)kCGImageSourceShouldCache];
        NSDictionary *properties = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, (__bridge CFDictionaryRef)options);
        if (properties)
        {
            NSNumber *width = [properties objectForKey:(NSString *)kCGImagePropertyPixelWidth];
            NSNumber *height = [properties objectForKey:(NSString *)kCGImagePropertyPixelHeight];
            if ((width != nil) && (height != nil))
                imageSize = CGSizeMake(width.floatValue, height.floatValue);
        }
        CFRelease(source);
    }
    return imageSize;
}

- (CGSize)sizeOfAssetRepresentation:(ALAssetRepresentation *)assetRepresentation
{
    // It may be more efficient to read [[[assetRepresentation metadata] objectForKey:@"PixelWidth"] integerValue]
    // and the corresponding height instead.

    // Read all the bytes for the image into NSData.
    long long imageDataSize = [assetRepresentation size];
    uint8_t *imageDataBytes = malloc(imageDataSize);
    [assetRepresentation getBytes:imageDataBytes fromOffset:0 length:imageDataSize error:nil];
    NSData *data = [NSData dataWithBytesNoCopy:imageDataBytes length:imageDataSize freeWhenDone:YES];

    return [self sizeOfImageWithData:data];
}

- (CGSize)sizeOfAsset:(ALAsset *)asset
{
    return [self sizeOfAssetRepresentation:[asset defaultRepresentation]];
}
float width = CGImageGetWidth(asset.defaultRepresentation.fullResolutionImage);
float height = CGImageGetHeight(asset.defaultRepresentation.fullResolutionImage);
or the same for asset.defaultRepresentation.fullScreenImage...

UIManagedDocument - Validating Core Data Entity

I have an app that uses Core Data and gets its managed object context from a UIManagedDocument. From reading, I see that I am not supposed to save the context directly; rather, I should rely on UIManagedDocument's autosaving or use saveToURL:... My issue is that I want to validate the data being stored in my entity. I have constraints on the entity that specify a minimum length of 1 for the string properties. However, I can create a new object, assign its properties empty strings, and save the file. In the completion handler of saveToURL:... the success value is always YES. I then created my own validator for the name property of my entity, using sample code from the Core Data Programming Guide:
- (BOOL)validateName:(id *)ioValue error:(__autoreleasing NSError **)outError
{
    if (*ioValue == nil)
    {
        if (outError != NULL)
        {
            NSString *errorStr = @"nil error";
            NSDictionary *userInfoDict = [NSDictionary dictionaryWithObject:errorStr
                                                                     forKey:NSLocalizedDescriptionKey];
            NSError __autoreleasing *error = [[NSError alloc] initWithDomain:@"domain"
                                                                        code:1
                                                                    userInfo:userInfoDict];
            *outError = error;
        }
        return NO;
    }
    else if ([*ioValue length] == 0)
    {
        if (outError != NULL)
        {
            NSString *errorStr = @"length error";
            NSDictionary *userInfoDict = [NSDictionary dictionaryWithObject:errorStr
                                                                     forKey:NSLocalizedDescriptionKey];
            NSError __autoreleasing *error = [[NSError alloc] initWithDomain:@"domain"
                                                                        code:1
                                                                    userInfo:userInfoDict];
            *outError = error;
        }
        return NO;
    }
    else
    {
        return YES;
    }
}
When this runs, I see that the ioValue has 0 length and that it returns NO, but then my completion handler is never called. Any help would be great.
Is there something I am missing about how to handle saving errors with UIManagedDocument - particularly, how do I notify the calling code that an error happened while saving its information?
As a rule, you should only call saveToURL to create a brand new file. Let auto-save do the rest.
Also, I'm not sure I follow your question. If you are asking how to know about save failures, the best you can do is register for notifications (since all saves happen on a background thread).
Directly from the documentation:
A UIDocument object has a specific state at any moment in its life cycle. You can check the current state by querying the documentState property. And you can be notified of changes in the state of a document by observing the UIDocumentStateChangedNotification notification.
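For illustration (not part of the original answer), a minimal sketch of watching for save failures via that notification; self.document, startObservingDocument, and documentStateChanged: are assumed names for the example:

// Sketch: observe document state changes to detect save problems.
- (void)startObservingDocument
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(documentStateChanged:)
                                                 name:UIDocumentStateChangedNotification
                                               object:self.document];
}

- (void)documentStateChanged:(NSNotification *)notification
{
    UIManagedDocument *document = notification.object;
    if (document.documentState & UIDocumentStateSavingError) {
        // A save (including an autosave) failed; react here, e.g. alert the user or log.
        NSLog(@"Document save failed, state = %lu", (unsigned long)document.documentState);
    }
}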
I guess I need to implement handleError:(NSError *)error userInteractionPermitted:(BOOL)userInteractionPermitted in a subclass of UIManagedDocument. I found that via this article - http://blog.stevex.net/2011/12/uimanageddocument-autosave-troubleshooting/
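A minimal sketch of that override, assuming a UIManagedDocument subclass named MyManagedDocument (the class name and the logging are illustrative only):

// Sketch: surface save/validation errors from a UIManagedDocument subclass.
@interface MyManagedDocument : UIManagedDocument
@end

@implementation MyManagedDocument

- (void)handleError:(NSError *)error userInteractionPermitted:(BOOL)userInteractionPermitted
{
    // Called when saving (including autosaving) fails, e.g. because validateName: returned NO.
    NSLog(@"UIManagedDocument error: %@, %@", error, error.userInfo);

    // You could also post your own notification here so calling code can react.
    [super handleError:error userInteractionPermitted:userInteractionPermitted];
}

@end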

Truncated Core Data NSData objects

I am saving arrays of doubles in an NSData* object that is persisted as a binary property in a Core Data (SQLite) data model. I am doing this to store sampled data for graphing in an iPhone app. Sometimes when there are more than 300 doubles in the binary object not all the doubles are getting saved to disk. When I quit and relaunch my app there may be as few as 25 data points that have persisted or as many as 300.
I am using NSSQLitePragmasOption with synchronous = FULL, and this may be making a difference. It is hard to tell, as the bug is intermittent.
Given the warnings about performance problems as a result of using synchronous = FULL, I am seeking advice and pointers.
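For reference (this snippet is not from the question), that pragma is typically passed in the options dictionary when the SQLite store is added; storeURL and coordinator are placeholder names:

// Illustrative only: enabling synchronous = FULL via NSSQLitePragmasOption.
NSDictionary *pragmas = [NSDictionary dictionaryWithObject:@"FULL" forKey:@"synchronous"];
NSDictionary *options = [NSDictionary dictionaryWithObject:pragmas forKey:NSSQLitePragmasOption];

NSError *error = nil;
[coordinator addPersistentStoreWithType:NSSQLiteStoreType
                          configuration:nil
                                    URL:storeURL
                                options:options
                                  error:&error];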
Thanks.
Edit: here is the code.
The (as yet unrealized) intent of -addToCache: is to add each new datum to the cache but only flush (fault?) the Data object periodically.
From Data.m
@dynamic dataSet; // NSData * attribute of the Data entity

- (void)addDatum:(double_t)datum
{
    DLog(@"-[Data addDatum:%f]", datum);
    [self addToCache:datum];
}

- (void)addToCache:(double_t)datum
{
    if (cache == nil)
    {
        cache = [NSMutableData dataWithData:[self dataSet]];
        [cache retain];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    DLog(@"-[Data addToCache:%f] ... [cache length] = %d; cache = %p", datum, [cache length], cache);
    [self flushCache];
}

- (void)wrapup
{
    DLog(@"-[Data wrapup]");
    [self flushCache];
    [cache release];
    cache = nil;
    DLog(@"[self isFault] = %@", [self isFault] ? @"YES" : @"NO"); // [self isFault] is always NO.
}

- (void)flushCache
{
    DLog(@"flushing cache to store");
    [self setDataSet:cache];
    DLog(@"-[Data flushCache:] [[self dataSet] length] = %d", [[self dataSet] length]);
}

- (double *)bytes
{
    return (double *)[[self dataSet] bytes];
}

- (NSInteger)count
{
    return [[self dataSet] length] / sizeof(double);
}

- (void)dump
{
    ALog(@"Dump Data");
    NSInteger numDataPoints = [self count];
    double *data = (double *)[self bytes];
    ALog(@"numDataPoints = %d", numDataPoints);
    for (int i = 0; i < numDataPoints; i++) {
        // ... (loop body truncated in the original post)
    }
}
I was trying to get behavior as if my Core Data entity could have an NSMutableData attribute. To do this, my NSManagedObject (called Data) had an NSData attribute and an NSMutableData ivar. My app takes sample data from a sensor and appends each data point to the data set - this is why I needed this design.
Each new data point was appended to the NSMutableData, and then the NSData attribute was set to the NSMutableData.
I suspect that because the NSData pointer wasn't changing (though its contents were), Core Data did not register the extent of the change. Calling -hasChanges on the NSManagedObjectContext showed that there had been changes, and calling -updatedObjects even listed the Data object as having changed. But the actual data that was being written seems to have been truncated (sometimes).
To work around this I changed things slightly. New data points are still appended to the NSMutableData, but the NSData attribute is only set when sampling is completed. This means that there is a chance a crash might result in truncated data, but for the most part this workaround seems to have solved the problem.
Caveat emptor: the bug was always intermittent, so it is possible that it is still there - but just harder to manifest.
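A sketch of that workaround, reusing the question's Data class and cache ivar (manual retain/release as in the original; this is illustrative, not the poster's exact code):

// Sketch of the workaround: accumulate samples in the NSMutableData ivar and
// only assign the Core Data attribute once, when sampling finishes.
- (void)addDatum:(double_t)datum
{
    if (cache == nil) {
        cache = [[NSMutableData alloc] initWithData:[self dataSet]];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    // Note: dataSet is deliberately NOT updated here.
}

- (void)wrapup
{
    // Assign a fresh immutable copy so Core Data sees a new object rather than
    // a mutated one, and persists the full contents.
    [self setDataSet:[NSData dataWithData:cache]];
    [cache release];
    cache = nil;
}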

Any way to get a Cached UIImage from my 'Documents' directory?

I know that the -imageNamed: method returns a Cached UIImage, but the problem is that my image file is stored in 'Documents', and the -imageNamed: method seems to only search the Bundle... I am currently (reluctantly) using -imageWithContentsOfFile: to get my image from 'Documents' but it is not the same...Scaling up/down a UIImageView containing the resulting image is choppy and awkward. Scaling the same UIImageView containing an image created with -imageNamed: however appears very smooth. So, again: How can I get a cached UIImage from my 'Documents' if I cannot use -imageNamed:?
I made an extension of the answer provided by rpetrich and overrode the imageNamed: method to make it more of a drop-in replacement. It searches the main bundle first and then looks in the Caches directory. You could of course change the Caches directory to the Documents directory.
// UIImage+CacheExtensions.h
@interface UIImage (CacheExtensions)
+ (UIImage *)imageNamed:(NSString *)name;
+ (void)clearCache;
@end

// UIImage+CacheExtensions.m
#import "UIImage+CacheExtensions.h"

static NSMutableDictionary *UIImageCache;

@implementation UIImage (CacheExtensions)

+ (UIImage *)imageNamed:(NSString *)name
{
    id result;
    if (!UIImageCache)
        UIImageCache = [[NSMutableDictionary alloc] init];
    else {
        result = [UIImageCache objectForKey:name];
        if (result) return result;
    }

    // First, check the main bundle for the image
    NSString *imagePath = [[NSBundle mainBundle] pathForResource:name ofType:nil];
    result = [UIImage imageWithContentsOfFile:imagePath];
    if (result) {
        [UIImageCache setObject:result forKey:name];
        return result;
    }

    // If not found, search for the image in the caches directory
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES);
    NSString *cachesImagePath = [[paths lastObject] stringByAppendingPathComponent:name];
    result = [UIImage imageWithContentsOfFile:cachesImagePath];
    if (result) {
        [UIImageCache setObject:result forKey:name];
        return result;
    }

    return nil;
}

+ (void)clearCache
{
    [UIImageCache removeAllObjects];
}

@end
The simplest way would be an NSMutableDictionary storing the cached images and a clear cache method:
@interface UIImage (CacheExtensions)
+ (id)cachedImageWithContentsOfFile:(NSString *)path;
+ (void)clearCache;
@end

static NSMutableDictionary *UIImageCache;

@implementation UIImage (CacheExtensions)

+ (id)cachedImageWithContentsOfFile:(NSString *)path
{
    id result;
    if (!UIImageCache)
        UIImageCache = [[NSMutableDictionary alloc] init];
    else {
        result = [UIImageCache objectForKey:path];
        if (result)
            return result;
    }

    result = [UIImage imageWithContentsOfFile:path];
    if (result) {
        // Guard against caching nil if the file could not be loaded;
        // inserting nil into an NSMutableDictionary would throw.
        [UIImageCache setObject:result forKey:path];
    }
    return result;
}

+ (void)clearCache
{
    [UIImageCache removeAllObjects];
}

@end
Note: you should call +[UIImage clearCache] from your didReceiveMemoryWarning method. Also, clearCache will invalidate all objects in the cache, not just unused items; a UIImage subclass and more complicated caching mechanism would be required to remedy this.
You can cache UIImages yourself just as -imageNamed: does. It just loads them and then holds onto them. You can hold onto them too, using an NSDictionary, and implement your own -imageNamed:.
But I'm more concerned about the trouble you're having with scaling. How are your images getting into Documents, how are you scaling them, and have you tested the same image file stored in the bundle? I doubt that -imageNamed: has anything to do with this. I would more suspect things like the fact that the bundle has some compression applied to it (though I don't yet have a theory on why this would matter in practice), differences in the file, or differences in how the rest of the program is behaving during scaling (causing contention on the disk or CPU). Caching is unlikely related to this issue.
I'd do some profiling w/ Instruments to try to find out where the choppiness is coming from. Are you maxing out the disk, CPU, memory? What's the bottleneck?
What about writing your own image cache? You have all the pieces in place, now you just need to encapsulate it and keep a record of images you've already loaded.
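If you go that route, one possible shape (illustrative only; DocumentsImageCache and its method names are not from the answers above) is a thin wrapper around NSCache, which is purged automatically under memory pressure and can be capped with countLimit:

// Sketch: a small image cache for files in Documents, built on NSCache.
@interface DocumentsImageCache : NSObject
+ (UIImage *)imageNamed:(NSString *)fileName;
@end

@implementation DocumentsImageCache

+ (NSCache *)sharedCache
{
    static NSCache *cache = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        cache = [[NSCache alloc] init];
        cache.countLimit = 100; // arbitrary cap for the example
    });
    return cache;
}

+ (UIImage *)imageNamed:(NSString *)fileName
{
    UIImage *image = [[self sharedCache] objectForKey:fileName];
    if (image) return image;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *path = [[paths lastObject] stringByAppendingPathComponent:fileName];
    image = [UIImage imageWithContentsOfFile:path];
    if (image) {
        [[self sharedCache] setObject:image forKey:fileName];
    }
    return image;
}

@end

Compared with a plain NSMutableDictionary, NSCache also evicts entries on its own when memory is tight, so you would not need to wire up didReceiveMemoryWarning yourself.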