I am working with ALAssetsLibrary.
When I get all the thumbnails, I put each one into a UIImageView and add it to a holder view.
The problem is that adding them is really slow: ten seconds or more, and the more photos there are, the slower it gets.
I would like to know the best practice for holding these thumbnails. (Many thanks!)
Use ALAsset's aspectRatioThumbnail instead of fullResolutionImage for better performance.
The class ALAsset has two methods to get thumbnails:
- (CGImageRef)thumbnail
- (CGImageRef)aspectRatioThumbnail
Example:
// The ALAssetsLibrary result block may run on a background thread,
// so do the UI-related work on the main thread.
dispatch_sync(dispatch_get_main_queue(), ^{
    CGImageRef iref = [myasset aspectRatioThumbnail];
    itemToAdd.image = [UIImage imageWithCGImage:iref];
}); // end block
I think you will find your answer in the JFImagePickerController project (https://github.com/johnil/JFImagePickerController), specifically in the two classes JFAssetHelper and JFImageManager. It uses NSCache to cache the photos and it is really quick.
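For reference, a minimal sketch of the same caching idea, keyed by the asset's URL. This is illustrative only, not code from that project; the method name and countLimit value are assumptions.
// Requires <AssetsLibrary/AssetsLibrary.h>.
- (UIImage *)thumbnailForAsset:(ALAsset *)asset
{
    static NSCache *thumbnailCache = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        thumbnailCache = [[NSCache alloc] init];
        thumbnailCache.countLimit = 200; // keep a bounded number of decoded thumbnails
    });

    NSURL *key = [asset valueForProperty:ALAssetPropertyAssetURL];
    UIImage *cached = [thumbnailCache objectForKey:key];
    if (cached) {
        return cached;
    }

    UIImage *thumb = [UIImage imageWithCGImage:[asset aspectRatioThumbnail]];
    if (thumb) {
        [thumbnailCache setObject:thumb forKey:key];
    }
    return thumb;
}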
Related
How can I delay the stream to a UIImageView using AVCaptureVideoPreviewLayer from the camera?
See below how I bind them; I just can't figure out how to delay it (I don't want it in real time).
AVCaptureVideoPreviewLayer* captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.imageView.bounds;
[self.imageView.layer addSublayer:captureVideoPreviewLayer];
First, you're going to want to remove the preview layer you have right now, as there is no out-of-the-box way to delay those preview frames.
You're going to want to create a buffer. If we're talking about a few frames, you could have an NSMutableArray that you fill with UIImages on one end while feeding your image view from the other end.
Your UIImage would come from the didOutputSampleBuffer: delegate method; use something like the approach in "UIImage created from CMSampleBufferRef not displayed in UIImageView?".
Now, a few challenges you will have to deal with:
You're talking about a delay of several seconds; 5 seconds would be about 150 frames. Storing 150 UIImages in memory isn't going to happen unless they're very small and you're on the latest devices.
You could solve that by saving the images to disk and having your array store only the paths of those images instead of the images themselves. Then you'll probably run into performance issues, since you're doing read/write operations in real time, and your frame rate will suffer.
Because of that reduced frame rate, you'll also have to make sure you don't lose synchronization between the recorded feed and the live feed, otherwise you'll start with a 5-second delay and end up with a much longer one.
Good luck with that; it can be done with some trade-offs (a slower frame rate, for example), but it can be done. I have done something very similar myself multiple times, but I can't share the code for IP reasons.
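As a rough illustration of the buffering idea (this is a sketch, not the code I mentioned above; it assumes an NSMutableArray frameBuffer property holding UIImages, an imageFromSampleBuffer: helper like the one in the linked question, and a placeholder delay of 90 frames):
// Keep the last N frames around and display the oldest one,
// which gives a fixed delay behind the live camera feed.
#define kDelayedFrameCount 90   // e.g. ~3 s at 30 fps; tune for your memory budget

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *frame = [self imageFromSampleBuffer:sampleBuffer];
    [self.frameBuffer addObject:frame];

    // Only start displaying once the buffer is full; the frame shown is
    // always kDelayedFrameCount frames behind the one just captured.
    if (self.frameBuffer.count > kDelayedFrameCount) {
        UIImage *delayedFrame = [self.frameBuffer objectAtIndex:0];
        [self.frameBuffer removeObjectAtIndex:0];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = delayedFrame;
        });
    }
}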
I am just using the AssetsLibrary framework to load images from Photos.
[UIImage imageWithCGImage:[asset defaultRepresentation].fullScreenImage scale:1.0 orientation:(UIImageOrientation)[asset defaultRepresentation].orientation];
It takes about 0.5–0.6 seconds to get one photo, and the photo is not that large (about 700×900).
Am I using the method in the wrong way, and can it be optimized?
(I want the photo at this size, not the thumbnail.)
Many thanks!
You are using the method correctly. One idea to optimize the user experience:
Load the thumbnail image first (ideally with dispatch_async); that should be really quick. Once it is on screen, load the full-screen image the way you did above.
This is what Apple does in the Photos app to provide a smooth user experience.
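A rough sketch of that two-step idea, assuming asset is your ALAsset and self.imageView is the view being filled (both names are placeholders, and the queue choice is just one option):
// Show the small thumbnail immediately, then swap in the full-screen
// image once it has been decoded on a background queue.
self.imageView.image = [UIImage imageWithCGImage:[asset thumbnail]];

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    UIImage *fullScreen = [UIImage imageWithCGImage:rep.fullScreenImage];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = fullScreen; // replace the thumbnail
    });
});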
Cheers,
Hendrik
I have created an app that has a library of images. I have imported the images into my resources folder in Xcode and currently load them into an array in the app delegate when the app launches. This happens on every launch. I am now worried that as the library grows the app will run out of memory. Is there a better way of loading these?
The structure is:
Library > category > image list > image
I have an NSMutableArray called mainLibrary, to which I then add another NSMutableArray for each category. For example:
NSMutableArray *arrHome = [[NSMutableArray alloc] initWithObjects: @"Home",
    [NSDictionary dictionaryWithObjectsAndKeys: [NSString stringWithFormat:@"%@/%@", resourcePath, @"alarm_clock.png"], @"image_path", @"Alarm Clock", @"title", nil],
    [NSDictionary dictionaryWithObjectsAndKeys: [NSString stringWithFormat:@"%@/%@", resourcePath, @"bathtub.png"], @"image_path", @"Bathtub", @"title", nil],
    nil];
[mainLibrary addObject: arrHome];
The above is for the "Home" category and contains only two images. For the other categories I have an NSDictionary line for each image; there can be up to 100 images in a category.
If I am loading 50 MB+ of images into an array, will this cause the app to become unresponsive, or, more likely, simply crash as it exceeds the roughly 24 MB allowance per app?
I am wondering whether I should run the above code and read into the array only once a category has been selected, but that might then cause a stall for the user.
Any thoughts?!
PS: I also need to think about Retina displays. With 600+ images, duplicating them would make my binary size rocket!
A while ago I ran some tests for an app I was working on. Loading 5000 images in succession:
[UIImage imageWithData:...] // 44.8 seconds
[UIImage imageWithContentsOfFile:...] // 52.3 seconds
[[UIImage alloc] initWithContentsOfFile:…] // 351.8 seconds
[UIImage imageNamed:...] // hung due to caching
Using paths to the files is obviously the way to go, and memory management will clearly be key in your case.
Compose your path dictionaries/arrays using strings, then load/unload images as needed (load lazily), if you need such a structure to interface with in your program.
600 high-quality images will exceed the memory limits, and it's wasteful: how many of those images will the user actually see at any given time? Resources are quite limited, and high-quality images take a lot of memory.
If you have to display a bunch of thumbnails, those can be shrunk and exported easily enough, so you can display all the images in a reasonable amount of time while using a rational amount of memory (again, load lazily where possible and release each image once it is no longer visible).
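A minimal sketch of that lazy approach, assuming the path-based structure from the question (the method name is made up for illustration, and entry is one of the NSDictionary items built above):
// Keep only path strings in the arrays; decode an image only when a view
// actually needs it, so memory stays bounded by what is on screen.
- (UIImage *)imageForEntry:(NSDictionary *)entry
{
    NSString *path = [entry objectForKey:@"image_path"];
    return [UIImage imageWithContentsOfFile:path]; // not cached, unlike imageNamed:
}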
Why not use [UIImage imageNamed:IMAGE_NAME] directly? It has good memory management built in.
What's the best way to save and retrieve an array of images across app restarts?
I'm implementing a caching feature for offline viewing of downloaded images and just want to make sure I'm using the right persistence methods.
Thanks!
The quickest & probably best solution would be to persist your images to disk, no question about it.
You could do something like this to save them as JPEG.
NSData *data = UIImageJPEGRepresentation(image, 1.0f);
[data writeToFile:imagePath atomically:YES];
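Reading it back on the next launch is just the reverse (imagePath being the same path used above):
// Recreate the UIImage from the file written earlier; returns nil if the file is missing.
UIImage *restored = [UIImage imageWithContentsOfFile:imagePath];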
So far I've managed to create an app for iPhone that takes multiple images with about a 3-second interval between each. I'm processing each image asynchronously in a separate thread, and everything is great until it comes time to save the image to the iPhone's disk. Then it takes about 12 seconds to save the image using a JPEG representation.
How does Apple do it? How do they manage to save a single image to disk so fast? Is there a trick they are using? I can see that the animations distract the user for a while, but even so, their save time is clearly well below 12 seconds!
Thanks in advance.
Actually, Apple uses its kernel driver AppleJPEGDriver. It is a hardware JPEG encoding API and is much faster than software encoding (UIImageJPEGRepresentation); some people use it in their jailbreak apps (Cycorder, the video recording application).
Apple should give the same functionality to its users, but they are Apple :)
I haven't tried this, but I wouldn't be so sure that Apple isn't using the same methods. A big part of Apple's design philosophy relies on hiding operational interruptions from the user. The Apple code may take as much time as yours but simply be adroit at hiding the entire save time from the user's perception.
If no one can tell you how Apple actually saves faster, I would suggest looking at ways to disguise the save time instead.
If you Google around a bit, you'll find a whole bunch of people with the same problem.
I didn't find an answer. The general conclusion seems to be that Apple either uses some internal API that bypasses the public API's overhead, or a hardware encoder.
Guess you are out of luck for fast image saving.
I was having this problem in my app: on saving it would hang, so I used Grand Central Dispatch.
Below is the setImage: method from my image cache class. If the UIImage argument is non-nil it saves it; otherwise it deletes the cached file. You can hopefully adapt this to suit your needs; it only works on iOS 4+. The code is ARC-enabled.
- (void)setImage:(UIImage *)image
{
    if (image == nil) {
        NSLog(@"Deleting Image");
        // Since we have no image, remove the cached file if it exists.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                       NSUserDomainMask, YES) objectAtIndex:0];
            [[NSFileManager defaultManager] removeItemAtPath:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] error:nil];
        });
    }
    else {
        NSLog(@"Saving Image");
        // We've got an image; save it to flash storage off the main thread.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                       NSUserDomainMask, YES) objectAtIndex:0];
            NSData *dataObj = UIImagePNGRepresentation(image); // note: PNG data, even though the file name ends in .jpg
            [dataObj writeToFile:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] atomically:NO];
        });
    }
    imageCache = image;
}
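For what it's worth, a hypothetical call site (imageCache and capturedImage are placeholder names, not part of the class above):
// Returns immediately; the disk write happens on a background queue inside setImage:.
[self.imageCache setImage:capturedImage];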