Given an ALAsset that represents a photo, is it possible to retrieve the size of the photo (height and width) without loading the image into a UIImageView and also without using the aspectRatioThumbnail method?
Just a note:
iOS 5.1 introduces a new dimensions property on ALAssetRepresentation. It returns a CGSize structure with the dimensions of the original image and is probably the best solution for this problem going forward.
Cheers,
Hendrik
float width = asset.defaultRepresentation.dimensions.width;
float height = asset.defaultRepresentation.dimensions.height;
It's fast, stable, and gives the actual dimensions. I've also used it with ALAssets for videos.
A simpler way to access the image size is through [ALAssetRepresentation metadata]. On the images I tested, this NSDictionary contained keys named PixelWidth and PixelHeight, holding NSNumber objects with the values you'd expect.
However, there seem to be no firm guarantees about exactly which keys you'll find, so make sure your app can deal with cases where those keys aren't in the metadata. Also see iOS ALAsset image metadata for some cautions about speed and thread safety.
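For illustration, here is a minimal sketch of that approach (the helper name is mine; it assumes the PixelWidth and PixelHeight keys are present and falls back to CGSizeZero otherwise):
// Hypothetical helper: reads the pixel size from an ALAssetRepresentation's metadata.
// Returns CGSizeZero if the keys are missing, so callers can fall back to another method.
- (CGSize)sizeFromMetadataOfRepresentation:(ALAssetRepresentation *)representation
{
    NSDictionary *metadata = [representation metadata];
    NSNumber *width = [metadata objectForKey:@"PixelWidth"];
    NSNumber *height = [metadata objectForKey:@"PixelHeight"];
    if ((width != nil) && (height != nil))
        return CGSizeMake(width.floatValue, height.floatValue);
    return CGSizeZero;
}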
Comparison
I tested both methods - loading the image data into a CGImageSourceRef and reading the metadata - across my iPad's entire asset library. Both methods returned the same sizes to within FLT_EPSILON. Apart from 2 outliers which took double the time, the run times over 16 repetitions were very similar:
Method | Mean time (s) +/- 95% confidence
Size from CGImageSourceRef | 0.1787 +/- 0.0004
Size from metadata | 0.1789 +/- 0.0015
So, neither method has a performance benefit. It is entirely possible that the metadata dictionary is constructed on demand by reading the image data.
Update
This didn't work as originally posted, as noted in the comments. I've fixed it, but it now loads all the image data, which the OP was trying to avoid. It still avoids the additional, and even more expensive, step of decompressing that data into an image.
Get the defaultRepresentation of the ALAsset.
Get the data for the ALAssetRepresentation.
Use an adaptation of this handy sizeOfImageAtURL function. Thank you, shpakovski.
The code below represents the steps above.
// This method requires ImageIO.framework (#import <ImageIO/ImageIO.h>).
// It needs memory for the size of the image in bytes, but does not decompress it.
- (CGSize)sizeOfImageWithData:(NSData *)data
{
    CGSize imageSize = CGSizeZero;
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
    if (source)
    {
        // Ask ImageIO not to cache the decoded image; we only want the properties.
        NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO]
                                                            forKey:(NSString *)kCGImageSourceShouldCache];
        NSDictionary *properties = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, (__bridge CFDictionaryRef)options);
        if (properties)
        {
            NSNumber *width = [properties objectForKey:(NSString *)kCGImagePropertyPixelWidth];
            NSNumber *height = [properties objectForKey:(NSString *)kCGImagePropertyPixelHeight];
            if ((width != nil) && (height != nil))
                imageSize = CGSizeMake(width.floatValue, height.floatValue);
        }
        CFRelease(source);
    }
    return imageSize;
}
- (CGSize)sizeOfAssetRepresentation:(ALAssetRepresentation *)assetRepresentation
{
    // It may be more efficient to read
    // [[[assetRepresentation metadata] objectForKey:@"PixelWidth"] integerValue]
    // and the corresponding height instead.
    // Read all the bytes for the image into NSData.
    long long imageDataSize = [assetRepresentation size];
    uint8_t *imageDataBytes = malloc((size_t)imageDataSize);
    [assetRepresentation getBytes:imageDataBytes fromOffset:0 length:(NSUInteger)imageDataSize error:nil];
    NSData *data = [NSData dataWithBytesNoCopy:imageDataBytes length:(NSUInteger)imageDataSize freeWhenDone:YES];
    return [self sizeOfImageWithData:data];
}
- (CGSize)sizeOfAsset:(ALAsset *)asset
{
    return [self sizeOfAssetRepresentation:[asset defaultRepresentation]];
}
float width = CGImageGetWidth(asset.defaultRepresentation.fullResolutionImage);
float height = CGImageGetHeight(asset.defaultRepresentation.fullResolutionImage);
or the same for asset.defaultRepresentation.fullScreenImage... (note that both of these load the image, which the original question was trying to avoid).
Related
I am implementing a cache in my iOS app that keeps downloaded images in RAM.
I did some research and found some code, but most of it was for caching images to permanent storage.
I tried NSCache but couldn't work it around for my needs.
The requirements are:
A limit on the number of saved images, e.g. 100.
When the cache limit is reached, it should remove the oldest image before adding a new one.
I'm not sure of the exact term, but I think this is called a FIFO cache (first in, first out).
After some research, I did the following implementation.
static NSMutableDictionary *thumbnailImagesCache = nil;

+ (UIImage *)imageWithURL:(NSString *)_imageURL
{
    if (thumbnailImagesCache == nil) {
        thumbnailImagesCache = [[NSMutableDictionary alloc] init];
    }
    UIImage *image = nil;
    if ((image = [thumbnailImagesCache objectForKey:_imageURL])) {
        DLog(@"image found in cache");
        return image;
    }
    /* The image was not found in the cache - the object sending the request
       is responsible for downloading the image and saving it to the cache. */
    DLog(@"image not found in cache");
    return nil;
}

+ (void)saveImageForURL:(UIImage *)_image URLString:(NSString *)_urlString
{
    if (thumbnailImagesCache == nil) {
        thumbnailImagesCache = [[NSMutableDictionary alloc] init];
    }
    if (_image && _urlString) {
        DLog(@"adding image to cache");
        if (thumbnailImagesCache.count > 100) {
            NSArray *keys = [thumbnailImagesCache allKeys];
            NSString *key0 = [keys objectAtIndex:0];
            [thumbnailImagesCache removeObjectForKey:key0];
        }
        [thumbnailImagesCache setObject:_image forKey:_urlString];
        DLog(@"images count in cache = %lu", (unsigned long)thumbnailImagesCache.count);
    }
}
Now the problem is that I'm not sure whether this is a correct/efficient solution. Does anyone have a better idea/solution?
Your assumption about the order of the keys is certainly incorrect. The order of the keys in an NSDictionary is unspecified; the key and value at index 0 need not be the oldest ones. You should store the creation date of each image when you put it in the cache dictionary, or keep the keys in a separate ordered array, as sketched below.
Apart from that, the rest of the code seems valid.
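For illustration, here is a minimal sketch of true FIFO eviction using a parallel array of keys in insertion order (the names are mine, and this is a sketch rather than a definitive implementation):
static NSMutableDictionary *imagesCache = nil;
static NSMutableArray *keysInInsertionOrder = nil; // oldest key first

+ (void)cacheImage:(UIImage *)image forURLString:(NSString *)urlString
{
    if (imagesCache == nil) {
        imagesCache = [[NSMutableDictionary alloc] init];
        keysInInsertionOrder = [[NSMutableArray alloc] init];
    }
    if (image && urlString) {
        if (keysInInsertionOrder.count >= 100) {
            // Evict the oldest entry before inserting a new one.
            NSString *oldestKey = [keysInInsertionOrder objectAtIndex:0];
            [imagesCache removeObjectForKey:oldestKey];
            [keysInInsertionOrder removeObjectAtIndex:0];
        }
        [imagesCache setObject:image forKey:urlString];
        [keysInInsertionOrder addObject:urlString];
    }
}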
- (void)decode:(CVImageBufferRef)bufferRef
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CGImageRef videoFrameImage = [ZXCGImageLuminanceSource createImageFromBuffer:bufferRef];
    CGImageRef rotatedImage = [self rotateImage:videoFrameImage degrees:0.0f];
    [NSMakeCollectable(videoFrameImage) autorelease];

    // Decoding:
    ZXLuminanceSource *source = [[[ZXCGImageLuminanceSource alloc] initWithCGImage:rotatedImage] autorelease];
    [NSMakeCollectable(rotatedImage) autorelease];
    ZXBinaryBitmap *bitmap = [ZXBinaryBitmap binaryBitmapWithBinarizer:[ZXHybridBinarizer binarizerWithSource:source]];

    NSError *error = nil;

    // There are a number of hints we can give to the reader, including
    // possible formats, allowed lengths, and the string encoding.
    ZXDecodeHints *hints = [ZXDecodeHints hints];

    ZXMultiFormatReader *reader = [ZXMultiFormatReader reader];
    ZXResult *result = [reader decode:bitmap
                                hints:hints
                                error:&error];
    if (result)
    {
        // The coded result as a string. The raw data can be accessed with
        // result.rawBytes and result.length.
        NSString *contents = result.text;

        // The barcode format, such as a QR code or UPC-A
        ZXBarcodeFormat format = result.barcodeFormat;
    }
    else
    {
        // Use error to determine why we didn't get a result, such as a barcode
        // not being found, an invalid checksum, or a format inconsistency.
    }

    [pool drain];
}
I ran into this problem, and it is easily resolved by downsampling the image that the camera generates. Apparently the image is too high-resolution for the library to extract the barcode. Reducing my UIImage to 640x480 before calling [image CGImage] made everything work perfectly.
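A sketch of that downsampling step might look like this (the helper name is illustrative; the target size matches the one mentioned above):
// Illustrative helper: scale a UIImage down before handing its CGImage to the decoder.
- (UIImage *)downsampledImage:(UIImage *)image toSize:(CGSize)targetSize
{
    UIGraphicsBeginImageContextWithOptions(targetSize, NO, 1.0);
    [image drawInRect:CGRectMake(0.0, 0.0, targetSize.width, targetSize.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return scaled;
}

// Usage: UIImage *small = [self downsampledImage:cameraImage toSize:CGSizeMake(640, 480)];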
You've commented that you should use the error to determine why you didn't get a result, but you don't actually do that. Look at what's in error.
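For example, something as simple as this (a sketch against the decode call above; the exact error codes depend on the ZXing port you're using) would show why the read failed:
if (!result && error) {
    // Inspect the NSError instead of ignoring it.
    NSLog(@"Decode failed: %@ (domain: %@, code: %ld)",
          [error localizedDescription], [error domain], (long)[error code]);
}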
My problem actually seems rather silly... I am writing an iPhone application that uses MapKit. The app grabs the EXIF metadata from a provided geotagged photo. The problem is that the latitude and longitude coordinates that I retrieve, for example:
Lat: 34.25733333333334
Lon: 118.5373333333333
point to a location in China. Is there a regional setting that I am missing, or do I need to convert the lat/long coordinates before using them?
Thank you all in advance for any help you can provide.
Here is the code I am using to grab the GPS data. You'll notice I am logging everything to the console so I can see what the values are:
void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *) = ^(ALAsset *asset)
{
    NSDictionary *metadata = asset.defaultRepresentation.metadata;
    NSLog(@"Image Meta Data: %@", metadata);
    NSDictionary *gpsdata = [metadata objectForKey:@"{GPS}"];
    self.lat = [gpsdata valueForKey:@"Latitude"];
    self.lng = [gpsdata valueForKey:@"Longitude"];
    NSLog(@"\nLatitude: %@\nLongitude: %@", self.lat, self.lng);
};

NSURL *assetURL = [mediaInfo objectForKey:UIImagePickerControllerReferenceURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:assetURL
         resultBlock:ALAssetsLibraryAssetForURLResultBlock
        failureBlock:^(NSError *error) {
        }];

UIImage *img = [mediaInfo objectForKey:UIImagePickerControllerEditedImage];
previewImage.image = nil;
self.previewImage.image = img;

NSData *imageData = UIImagePNGRepresentation(img);
if ([imageData length] > 0) {
    self._havePictureData = YES;
}
I think you should grab the value using the following:
CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
Are you sure you're not missing a minus sign on that 118? 34.257, -118.5373 is nicely inside Los Angeles, California.
While you can get the location from the asset per @Allen's answer, it is also valid to get it from the GPS metadata as you were trying to do initially. I'm not 100% sure the asset library coordinate will be the same as the coordinate in the GPS metadata; it depends on how Apple stores it. For comparison, the asset library timestamp is different from the EXIF creation date (a different topic, admittedly).
In any case, the reason your coordinate is wrong is that you also need to read the direction info, as follows:
NSDictionary *metadata = asset.defaultRepresentation.metadata;
NSLog(@"Image Meta Data: %@", metadata);
NSDictionary *gpsdata = [metadata objectForKey:@"{GPS}"];
double lat = [[gpsdata valueForKey:@"Latitude"] doubleValue];
double lng = [[gpsdata valueForKey:@"Longitude"] doubleValue];
// lat is negative if the reference direction is south
if ([[gpsdata valueForKey:@"LatitudeRef"] isEqualToString:@"S"]) {
    lat = -lat;
}
// lng is negative if the reference direction is west
if ([[gpsdata valueForKey:@"LongitudeRef"] isEqualToString:@"W"]) {
    lng = -lng;
}
self.lat = [NSNumber numberWithDouble:lat];
self.lng = [NSNumber numberWithDouble:lng];
NSLog(@"\nLatitude: %@\nLongitude: %@", self.lat, self.lng);
This will also work:
void (^ALAssetsLibraryAssetForURLResultBlock)(ALAsset *) = ^(ALAsset *asset)
{
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSDictionary *metadata = rep.metadata;
    NSMutableDictionary *GPSDictionary = [[[metadata objectForKey:(NSString *)kCGImagePropertyGPSDictionary] mutableCopy] autorelease];
};
I believe the reason there isn't a negative sign is the metadata entry exif:GPSLongitudeRef: W, which (I believe) means there should be a negative sign in front of the longitude since it references the western hemisphere. The same applies to the latitude, with exif:GPSLatitudeRef: N distinguishing the northern and southern hemispheres. Hope this helps. Just realized this is exactly what @XJones said. (I extracted the metadata with ImageMagick.)
My question is about Core Data and memory usage. I have used Core Data before, but this time the amount of data is larger, which made me realise there was much more to know. I have seen several similar posts and got useful information from them, but after applying it my app still crashes. I have been dealing with this issue for a week now; somebody please help.
Basically I have three subsequent similar loops of 64, 15, and 17 iterations respectively. They work fine in the simulator. Tested on a couple of iPads, the app gets memory warnings and crashes at the same iteration (number 34 of the first loop). Tested on an iPad 2, it crashes at number 14 of the second loop. Instruments shows a memory usage of about 1.5 MB, both live and overall. There are leaks of a few KB.
The loops perform the following (code below):
Execute a fetch with Core Data
For every record, take a parameter stored as a property attribute (String) of the row
Call a procedure that takes that parameter and returns data (hundreds of KB)
Store that data in another property attribute (Transformable) of the same row
Pretty common task, isn't it?
Now, since I got into memory issues, I tried to use all the tools known to me, which are:
release owned objects asap
create autorelease pools and drain them asap for not owned objects
save context asap
turn objects into faults asap
After applying all these techniques, I got an exciting result: the app crashes at exactly the same point as before.
Here it is the code.
- (void)myMainProcedure
{
    [self performLoop1];
    [self performLoop2]; // Similar to loop1
    [self performLoop3]; // Similar to loop1
}

- (void)performLoop1
{
    NSError *error = nil;
    NSAutoreleasePool *myOuterPool;
    NSAutoreleasePool *myInnerPool;

    NSManagedObjectContext *applicationContext = [[[UIApplication sharedApplication] delegate] managedObjectContext];
    [applicationContext setUndoManager:nil];

    NSEntityDescription *myEntityDescription = [NSEntityDescription entityForName:@"EntityName"
                                                           inManagedObjectContext:applicationContext];
    NSFetchRequest *myFetchRequest = [[NSFetchRequest alloc] init];
    [myFetchRequest setEntity:myEntityDescription];

    NSString *column = @"columnName";
    NSPredicate *aWhereClause = [NSPredicate predicateWithFormat:@"(%K = %@)", column, [NSNumber numberWithInt:0]];
    [myFetchRequest setPredicate:aWhereClause];

    myOuterPool = [[NSAutoreleasePool alloc] init];
    NSArray *myRowsArray = [applicationContext executeFetchRequest:myFetchRequest error:&error];
    NSMutableArray *myRowsMutableArray = [[NSMutableArray alloc] initWithCapacity:0];
    [myRowsMutableArray addObjectsFromArray:myRowsArray];
    [myOuterPool drain];
    [myFetchRequest release];

    EntityName *myEntityRow;
    int totalNumberOfRows = (int)[myRowsMutableArray count];

    myOuterPool = [[NSAutoreleasePool alloc] init];
    for (int i = 0; i < totalNumberOfRows; i++) {
        myInnerPool = [[NSAutoreleasePool alloc] init];

        myEntityRow = [myRowsMutableArray objectAtIndex:0];
        NSString *storedSmallAttribute = myEntityRow.smallAttribute;
        UIImageView *largeData = [self myMethodUsingParameter:storedSmallAttribute];
        myEntityRow.largeAttribute = largeData;
        [myRowsMutableArray removeObjectAtIndex:0];

        [applicationContext save:&error];
        [applicationContext refreshObject:myEntityRow mergeChanges:NO];

        [myInnerPool drain];
        [largeData release];
    }
    [myOuterPool drain];
    [myRowsMutableArray release];
}
- (UIImageView *)myMethodUsingParameter:(NSString *)link
{
    UIImageView *toBeReturned = nil;

    NSURL *pdfURL = [NSURL fileURLWithPath:link];
    CGPDFDocumentRef pdf = CGPDFDocumentCreateWithURL((CFURLRef)pdfURL);
    CGPDFPageRef page = CGPDFDocumentGetPage(pdf, 1);
    CGRect pageRect = CGPDFPageGetBoxRect(page, kCGPDFMediaBox);

    UIGraphicsBeginImageContext(pageRect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetRGBFillColor(context, 1.0, 1.0, 1.0, 1.0);
    CGContextFillRect(context, pageRect);
    CGContextSaveGState(context);
    CGContextTranslateCTM(context, 0.0, pageRect.size.height);
    CGContextScaleCTM(context, 1, -1);
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGContextSetRenderingIntent(context, kCGRenderingIntentDefault);
    CGContextDrawPDFPage(context, page);
    CGContextRestoreGState(context);
    UIImage *imageToBeReturned = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext(); // end the image context exactly once
    CFRelease(pdf);

    toBeReturned = [[UIImageView alloc] initWithImage:imageToBeReturned];
    return toBeReturned;
}
Please note that:
The mutable array was introduced as an (apparently useless) strategy to have objects released sooner.
The pools were added as part of the same strategy.
The statement about interpolation quality was the only change that improved the situation (that is, it moved the crash a little further along).
The retain count for managed objects within the cycle ranges from 6 to 10 (?). I know the retain count is not reliable information, but still, I made a test and found I could send multiple release messages to managed objects before forcing the app to crash this way. But I am not supposed to release an object I don't own, am I?
The entity the fetch request targets also has some bidirectional relationships with other entities; but still, is this relevant?
Thank you.
Thank you.
I think you should take a fresh look at your problem.
There are many ways to deal with Core Data that are much simpler than your approach.
Creating class files for your Core Data entities may help you.
Also, when you are saving your data, you should seriously evaluate each object's necessity and whether the work can be done a better way.
In your case, I would offer two suggestions:
1. For each PDF URL:
a. Assign a unique identifier.
b. Save this unique identifier in your Core Data store.
c. Add the URL to a queue for a background process that creates your PDF. (A background process allows the user to keep working while the PDFs are being generated; you could update your delegate on progress, or create a temporary image that is replaced when the PDF is created.)
d. Save the image in your app directory (or the photo library), using the unique identifier as its name.
e. When needed, load the image from disk into a UIImageView or whatever is appropriate.
2. For each PDF URL (a sketch follows this list):
a. Draw the PDF.
b. Get a UIImage representation.
c. Convert it to PNG NSData (UIImagePNGRepresentation(image)).
d. Save the NSData in Core Data.
e. Load the NSData and convert it back to a UIImage when needed.
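For illustration, a minimal sketch of option 2 might look like this. It assumes a managed object pdfImage with a binary pngData attribute and an existing managedObjectContext; those names are mine, not part of the original question:
// Render the first page of the PDF into a UIImage.
CGPDFDocumentRef pdf = CGPDFDocumentCreateWithURL((CFURLRef)pdfURL);
CGPDFPageRef page = CGPDFDocumentGetPage(pdf, 1);
CGRect pageRect = CGPDFPageGetBoxRect(page, kCGPDFMediaBox);
UIGraphicsBeginImageContext(pageRect.size);
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(context, 0.0, pageRect.size.height);
CGContextScaleCTM(context, 1.0, -1.0);
CGContextDrawPDFPage(context, page);
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGPDFDocumentRelease(pdf);

// Convert to PNG data and persist it through Core Data.
pdfImage.pngData = UIImagePNGRepresentation(image); // hypothetical entity/attribute
NSError *error = nil;
[managedObjectContext save:&error];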
I am saving arrays of doubles in an NSData* object that is persisted as a binary property in a Core Data (SQLite) data model. I am doing this to store sampled data for graphing in an iPhone app. Sometimes when there are more than 300 doubles in the binary object, not all the doubles get saved to disk. When I quit and relaunch my app, there may be as few as 25 data points persisted, or as many as 300.
Using NSSQLitePragmasOption with synchronous = FULL may be making a difference; it is hard to tell, as the bug is intermittent.
Given the warnings about performance problems as a result of using synchronous = FULL, I am seeking advice and pointers.
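For reference, the pragma is passed as a store option when the persistent store is added (persistentStoreCoordinator and storeURL are assumed to exist in your Core Data stack):
NSDictionary *pragmaOptions = [NSDictionary dictionaryWithObject:@"FULL"
                                                          forKey:@"synchronous"];
NSDictionary *storeOptions = [NSDictionary dictionaryWithObject:pragmaOptions
                                                         forKey:NSSQLitePragmasOption];
NSError *error = nil;
[persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                         configuration:nil
                                                   URL:storeURL
                                               options:storeOptions
                                                 error:&error];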
Thanks.
[[Edit: here is code.]]
The (as yet unrealized) intent of -addToCache: is to append each new datum to the cache but only flush it to the Data object periodically.
From Data.m
@dynamic dataSet; // NSData * attribute of the Data entity

- (void)addDatum:(double_t)datum
{
    DLog(@"-[Data addDatum:%f]", datum);
    [self addToCache:datum];
}

- (void)addToCache:(double_t)datum
{
    if (cache == nil)
    {
        cache = [NSMutableData dataWithData:[self dataSet]];
        [cache retain];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    DLog(@"-[Data addToCache:%f] ... [cache length] = %d; cache = %p", datum, [cache length], cache);
    [self flushCache];
}

- (void)wrapup
{
    DLog(@"-[Data wrapup]");
    [self flushCache];
    [cache release];
    cache = nil;
    DLog(@"[self isFault] = %@", [self isFault] ? @"YES" : @"NO"); // [self isFault] is always NO.
}

- (void)flushCache
{
    DLog(@"flushing cache to store");
    [self setDataSet:cache];
    DLog(@"-[Data flushCache:] [[self dataSet] length] = %d", [[self dataSet] length]);
}

- (double *)bytes
{
    return (double *)[[self dataSet] bytes];
}

- (NSInteger)count
{
    return [[self dataSet] length] / sizeof(double);
}

- (void)dump
{
    ALog(@"Dump Data");
    NSInteger numDataPoints = [self count];
    double *data = (double *)[self bytes];
    ALog(@"numDataPoints = %d", numDataPoints);
    for (int i = 0; i < numDataPoints; i++)
        ALog(@"data[%d] = %f", i, data[i]); // loop body reconstructed; the original listing was cut off here
}
I was trying to get behavior as if my Core Data entity could have an NSMutableData attribute. To do this, my NSManagedObject (called Data) had an NSData attribute and an NSMutableData ivar. My app takes sample data from a sensor and appends each data point to the data set, which is why I needed this design.
Each new data point was appended to the NSMutableData, and then the NSData attribute was set to the NSMutableData.
I suspect that because the NSData pointer wasn't changing (though its content was), Core Data did not appreciate the amount of change. Calling -hasChanges on the NSManagedObjectContext showed that there had been changes, and -updatedObjects even listed the Data object as having changed. But the actual data being written seems to have been truncated (sometimes).
To work around this I changed things slightly. New data points are still appended to the NSMutableData, but the NSData attribute is only set when sampling is complete. This means there is a chance that a crash might result in truncated data, but for the most part this workaround seems to have solved the problem.
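In code, the change amounts to dropping the per-datum flush and setting the attribute once, with a fresh NSData object. This is a sketch of the workaround, not the final implementation:
- (void)addToCache:(double_t)datum
{
    if (cache == nil) {
        cache = [[NSMutableData alloc] initWithData:[self dataSet]];
    }
    [cache appendBytes:&datum length:sizeof(double_t)];
    // No flushCache here: the attribute is set once, in -wrapup.
}

- (void)wrapup
{
    // A fresh NSData instance, so Core Data sees a new value rather than
    // a mutated object behind the same pointer.
    [self setDataSet:[NSData dataWithData:cache]];
    [cache release];
    cache = nil;
}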
Caveat emptor: the bug was always intermittent, so it is possible that it is still there, just harder to reproduce.