Currently I have an image on screen that is swapped out every 5 seconds for another image, using an animation to do so.
At the same time I have objects on screen that the user can pick up and drag around (using a pan gesture). During the 0.5 second animation, if I am moving an object around, the UI stutters. For example, I have a brush that I pick up and move around the screen. The 5 second timer ends and the background image updates, and while that animation occurs the brush stutters. I moved the image loading off the UI thread and force the image into memory by using NSData.
Is there a way I can prevent this stutter while the animation that changes the image is running? Here is how I swap the image.
// Dispatch to the queue, and do not wait for it to complete.
// Grab the image on a background queue so the UI is blocked as little as possible.
dispatch_async(imageGrabbingQueue, ^{
    curPos++;
    if (curPos > (self.values.count - 1)) curPos = 0;
    NSDictionary *curValue = self.values[curPos];
    NSString *imageName = curValue[KEY_IMAGE_NAME];

    // imageNamed: may defer decoding and stutter the UI later; round-trip
    // through NSData to try to force the image into memory up front.
    UIImage *imageHolder = [UIImage imageNamed:imageName];
    NSData *imageData = UIImagePNGRepresentation(imageHolder);
    UIImage *newImage = [[UIImage alloc] initWithData:imageData];

    dispatch_async(dispatch_get_main_queue(), ^{
        [UIView transitionWithView:self.view
                          duration:0.5
                           options:UIViewAnimationOptionTransitionCrossDissolve | UIViewAnimationOptionAllowUserInteraction | UIViewAnimationOptionAllowAnimatedContent
                        animations:^{
                            [self.image setImage:newImage];
                            // Temp clause to show the ad logo
                            if (curPos != 0) [self.imagePromotion setAlpha:1.0];
                            else [self.imagePromotion setAlpha:0];
                        }
                        completion:nil];
    });
});
Thanks,
DMan
The image-processing libraries on the iPhone are not magic; they take real CPU time to decode an image, and that is most likely what you are running into. Calling [UIImage imageNamed:] will likely cache the image, but caches can always be flushed, so that does not force the system to keep the image in memory. Your initWithData: round-trip is pointless, because the resulting image still has to decompress the PNG into memory when it is first drawn, and that decompression is the part causing the slowdown.

What you could do is render the image out as decoded pixels and save that to a file, then memory-map the file and wrap the mapped memory in a Core Graphics image. That would avoid the decode-and-render step that is likely causing the slowdown; anything less may not do what you are expecting. And you should not hold all the decoded bytes in memory: decoded image data is typically so large that it would take up too much room in device memory.
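A minimal sketch of the decode-forcing half of that idea (the helper name is mine, and it skips the file-plus-mmap part):

#import <UIKit/UIKit.h>

// Force-decode a UIImage by drawing it into a bitmap context, so the
// main thread only has to display pixels that are already decompressed.
UIImage *ForceDecodedImage(UIImage *source) {
    CGImageRef cgImage = source.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // BGRA, premultiplied alpha: a format Core Animation can display directly.
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, 0, colorSpace,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colorSpace);
    if (!ctx) return source; // fall back to the lazily decoded image
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);
    CGImageRef decoded = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    UIImage *result = [UIImage imageWithCGImage:decoded scale:source.scale orientation:source.imageOrientation];
    CGImageRelease(decoded);
    return result;
}

Calling this on imageGrabbingQueue in place of the NSData round-trip should move the expensive decompression off the main thread.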
UPDATE This piece of code is actually not where the problem is; commenting out all the CoreGraphics lines and returning the first image in the array as the result does not prevent the crashes from happening, so I must look farther upstream.
I am running this on a 75ms NSTimer. It works perfectly with 480x360 images, and will run all day long without crashing.
But when I send it images that are 1024x768, it will crash after about 20 seconds, having given several low memory warnings.
In both cases Instruments shows absolutely normal memory usage: a flat allocations graph, less than one megabyte of live bytes, no leaks the whole time.
So, what's going on? Is Core Graphics somehow using too much memory without showing it?
Also worth mentioning: there aren't that many images in (NSMutableArray*)imgs -- usually three, sometimes two or four. It crashes regardless, though slightly later when there are only two.
- (UIImage *)imagefromImages:(NSMutableArray *)imgs andFilterName:(NSString *)filterName {
    UIImage *tmpResultant = [imgs objectAtIndex:0];
    CGSize s = [tmpResultant size];
    UIGraphicsBeginImageContext(s);
    [tmpResultant drawInRect:CGRectMake(0, 0, s.width, s.height) blendMode:kCGBlendModeNormal alpha:1.0];
    for (int i = 1; i < [imgs count]; i++) {
        [[imgs objectAtIndex:i] drawInRect:CGRectMake(0, 0, s.width, s.height) blendMode:kCGBlendModeMultiply alpha:1.0];
    }
    tmpResultant = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return tmpResultant;
}
Sounds to me like the problem is outside the code you have shown. Images displayed on screen have a backing store outside your app's memory that is width * height * bytes_per_pixel in size, and you also get memory warnings and app termination if you have too many backing stores.
You might need to optimize there: either create smaller, display-sized versions of these images, or allow the backing stores to be released. Turning on rasterization for non-changing layers can also help, as can setting a layer's contents directly to the CGImage instead of working with UIImages, as in the sketch below.
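A hedged illustration of those last two suggestions (staticView and decodedImage are assumed names, not from the question):

#import <QuartzCore/QuartzCore.h>

// Hand the layer a CGImage directly and rasterize a non-changing layer.
staticView.layer.contents = (id)decodedImage.CGImage;
staticView.layer.shouldRasterize = YES; // cache the rendered layer bitmap
staticView.layer.rasterizationScale = [UIScreen mainScreen].scale;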
You should make a sample project that demonstrates the issue with no other code around it and see if you still run out of memory. I suspect you'll find that with just the code you have shown you cannot reproduce the issue, because it lies elsewhere.
I have a looping animation consisting of 120 frames at 512x512 resolution, saved as 32-bit PNG files. I want to play this sequence back in a UIView inside my application. Can anyone give me some pointers on how I might do this? I would prefer to use the standard API, but I could use Cocos2D if needed, or even OpenGL (though I am totally new to OpenGL at this point).
You can try this:
// Init a UIImageView
UIImageView *imageView = [[UIImageView alloc] initWithFrame:/* some frame */];
// Init an array with UIImage objects
NSArray *array = [NSArray arrayWithObjects:[UIImage imageNamed:@"image1.png"], [UIImage imageNamed:@"image2.png"], /* ... */ nil];
// Set the UIImageView's animationImages property
imageView.animationImages = array;
// Set the time interval
imageView.animationDuration = /* number of images x 1/30 gets you 30 FPS */;
// Set repeat count
imageView.animationRepeatCount = 0; /* 0 means infinite */
// Start animating
[imageView startAnimating];
// Add as subview
[self.view addSubview:imageView];
This is the easiest approach, but I can't say anything about the performance since I haven't tried it. I think it should be fine, though, with the images you have.
Uncompressed, that's about 90MB of images, and that might be as much as you're looking at if they're unpacked into UIImage format. Due to the length of the animation and the size of the images, I highly recommend storing them in a compressed movie format instead. Take a look at the reference for the MediaPlayer framework; you can remove the playback controls, embed an MPMoviePlayerController within your own view hierarchy, and set playback to loop. Note that 640x480 is the upper supported limit for H.264, so you might need to scale down the video anyway.
Do take note of the issues with looping video discussed in the question Smooth video looping in iOS.
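A minimal sketch of an embedded, looping player (movieURL and containerView are assumed names):

#import <MediaPlayer/MediaPlayer.h>

// Embed the movie player, hide its controls, and loop forever.
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
player.controlStyle = MPMovieControlStyleNone; // no playback controls
player.repeatMode = MPMovieRepeatModeOne;      // loop the movie
player.view.frame = containerView.bounds;
[containerView addSubview:player.view];
[player play];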
I have an "ImageManipulator" class that performs some cropping, resizing and rotating of camera images on the iPhone.
At the moment, everything works as expected but I keep getting a few huge spikes in memory consumption which occasionally cause the app to crash.
I have managed to isolate the problem to the part of the code where I check the image's current orientation property and rotate it to UIImageOrientationUp accordingly. I then get the image from the bitmap context and save it to disk.
This is currently what I am doing:
CGAffineTransform transform = CGAffineTransformIdentity;
// Check for orientation and set transform accordingly...
transform = CGAffineTransformTranslate(transform, self.size.width, 0);
transform = CGAffineTransformScale(transform, -1, 1);
// Create a bitmap context with the image that was passed so we can perform the rotation
CGContextRef ctx = CGBitmapContextCreate(NULL, self.size.width, self.size.height,
CGImageGetBitsPerComponent(self.CGImage), 0,
CGImageGetColorSpace(self.CGImage),
CGImageGetBitmapInfo(self.CGImage));
// Rotate the context
CGContextConcatCTM(ctx, transform);
// Draw the image into the context
CGContextDrawImage(ctx, CGRectMake(0,0,self.size.height,self.size.width), self.CGImage);
// Grab the bitmap context and save it to the disk...
Even after trying to scale the image down to half or even a quarter of its size, I am still seeing the spikes, so I am wondering if there is a different or more efficient way to get the rotation done than the above.
Thanks in advance for the replies.
Rog
If you are saving to JPEG, I guess an alternative approach is to save the image as-is and then set the rotation to whatever you'd like by manipulating the EXIF metadata? See for example this post. Simple but probably effective, even if you have to hold the image payload bytes in memory ;)
Things you can do:
1. Scale down the image even more (which you probably don't want)
2. Remember to release everything as soon as you finish with it
3. Live with it

I would choose options 2 and 3.
Image editing is very resource-intensive: it loads the entire raw, uncompressed image data into memory for processing, and there is essentially no way to modify an image without doing that. Memory consumption spikes don't really matter unless the app receives a memory warning; in that case, quickly get rid of everything before it crashes. In practice memory warnings are rare: my app regularly loads a single file of more than 10 MB into memory and I don't get a warning, even on older devices. So you'll be fine with the spikes.
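For the "get rid of everything" part, one hedged sketch (the cache and scratch properties are assumed, not from the answer):

// Drop expensive decoded data as soon as the system warns us.
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    [self.imageCache removeAllObjects]; // assumed NSCache of decoded images
    self.scratchImage = nil;            // assumed large intermediate result
}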
Have you tried checking for memory leaks and analyzing allocations?
If the image is still too big, try rotating the image in pieces instead of as a whole.
As Anomie mentioned, CGBitmapContextCreate creates a context. We should release that by using
CGContextRelease(ctx);
If you have any other objects created with Create or Copy functions, those should be released as well. If it is a CFData, then
CFRelease(cfdata);
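Applied to the rotation routine above, the end of the method might look like this (a sketch; the orientation switch stays elided):

// ... CGContextDrawImage as above ...
CGImageRef rotated = CGBitmapContextCreateImage(ctx);
UIImage *result = [UIImage imageWithCGImage:rotated];
// Release everything obtained from a Create function once UIImage has retained it.
CGContextRelease(ctx);
CGImageRelease(rotated);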
I have divergent needs for the image returned from the iPhone camera. My app scales the image down for upload and display and, recently, I added the ability to save the image to the Photos app.
At first I was assigning the returned value to two separate variables, but it turned out that they were sharing the same object, so I was getting two scaled-down images instead of having one at full scale.
After figuring out that you can't do UIImage *copyImage = [myImage copy];, I made a copy using imageWithCGImage:, per below. Unfortunately, this doesn't work, because the copy (here smallImage) ends up rotated 90° from the original.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Resize, crop, and correct orientation issues
    self.originalImage = [info valueForKey:@"UIImagePickerControllerOriginalImage"];
    UIImageWriteToSavedPhotosAlbum(originalImage, nil, nil, nil);
    UIImage *smallImage = [UIImage imageWithCGImage:[originalImage CGImage]]; // UIImage doesn't conform to NSCopying
    // This method is from a category on UIImage based on this discussion:
    // http://discussions.apple.com/message.jspa?messageID=7276709
    // It doesn't rotate smallImage, though: while imageWithCGImage returns
    // a rotated CGImage, the UIImageOrientation remains at UIImageOrientationUp!
    UIImage *fixedImage = [smallImage scaleAndRotateImageFromImagePickerWithLongestSide:480];
    ...
}
Is there a way to copy the UIImagePickerControllerOriginalImage image without modifying it in the process?
This seems to work but you might face some memory problems depending on what you do with newImage:
CGImageRef newCgIm = CGImageCreateCopy(oldImage.CGImage);
UIImage *newImage = [UIImage imageWithCGImage:newCgIm scale:oldImage.scale orientation:oldImage.imageOrientation];
// imageWithCGImage: retains the CGImage, so release the copy here to avoid a leak.
CGImageRelease(newCgIm);
This should work:
UIImage *newImage = [UIImage imageWithCGImage:oldImage.CGImage];
Copy backing data and rotate it
This question asks a common question about UIImage in a slightly different way. Essentially, you have two related problems - deep copying and rotation. A UIImage is just a container and has an orientation property that is used for display. A UIImage can contain its backing data as a CGImage or CIImage, but most often as a CGImage. The CGImage is a struct of information that includes a pointer to the underlying data and if you read the docs, copying the struct does not copy the data. So...
Deep copying
As I'll get to in the next paragraph, deep copying the data will leave the image rotated, because the image is rotated in the underlying data.
UIImage *newImage = [UIImage imageWithData:UIImagePNGRepresentation(oldImage)];
This will copy the data but will require setting the orientation property before handing it to something like UIImageView for proper display.
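Since orientation is a read-only property on UIImage, a hedged way to do that (oldImage being the original here) is to re-wrap the backing CGImage:

// Re-wrap the copied pixels with the original image's scale and orientation.
UIImage *oriented = [UIImage imageWithCGImage:newImage.CGImage scale:oldImage.scale orientation:oldImage.imageOrientation];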
Another way to deep copy would be to draw into the context and grab the result. Assume a zebra.
UIGraphicsBeginImageContext(zebra!.size)
zebra!.drawInRect(CGRectMake(0, 0, zebra!.size.width, zebra!.size.height))
let copy = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
Deep copy and Rotation
Rotating a CGImage has already been answered. It also happens that this rotated image is a new CGImage, which can be used to create a UIImage.
I think you need to create an image context (a CGContextRef), draw the UIImage's CGImage into the context with CGContextDrawImage(...), and then get the image back out of the context with CGBitmapContextCreateImage(...).
With such a routine, I'm sure you can get a real copy of the image you want. Hope it helps.
On an iPhone app, I need to send a JPEG by mail with a maximum size of 300KB (I don't know the maximum attachment size Mail.app allows, but that's another problem). To do that, I'm trying to decrease the quality until the image comes in under 300KB.
In order to find the quality value (compressionLevel) that gives me a JPEG under 300KB, I made the following loop.
It works, but each time the loop executes, memory grows by the original size of my JPEG (700KB), despite the [tmpImage release];.
float compressionLevel = 1.0f;
int size = 300001;
while (size > 300000) {
    UIImage *tmpImage = [[UIImage alloc] initWithContentsOfFile:[self fullDocumentsPathForTheFile:@"imageToAnalyse.jpg"]];
    size = [UIImageJPEGRepresentation(tmpImage, compressionLevel) length];
    [tmpImage release];
    // The 0.001f decrement is commented out here just to test the memory growth:
    // compressionLevel = compressionLevel - 0.001f;
    NSLog(@"Compression: %f", compressionLevel);
}
Any ideas about how I can get rid of this, or why it happens?
Thanks
At the very least, there's no point in allocating and releasing the image on every trip through the loop. It shouldn't leak memory, but it's unnecessary, so move the alloc/init and release out of the loop.
Also, the data returned by UIImageJPEGRepresentation is autoreleased, so it hangs around until the current autorelease pool drains (when you get back to the main event loop). Consider adding:
NSAutoreleasePool* p = [[NSAutoreleasePool alloc] init];
at the top of the loop, and
[p drain];
at the end. That way you'll not be leaking all of the intermediate memory.
And finally, doing a linear search for the optimal compression setting is probably pretty inefficient. Do a binary search instead.
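A hedged sketch of that binary search (the function name and fixed probe count are mine; under manual reference counting, wrap the loop body in an autorelease pool as above):

#import <UIKit/UIKit.h>

// Find a high JPEG quality whose encoded size fits under 'limit' bytes.
// Returns nil if even the lowest probed quality is too large.
NSData *JPEGDataUnderLimit(UIImage *image, NSUInteger limit) {
    CGFloat lo = 0.0f, hi = 1.0f;
    NSData *best = nil;
    for (int i = 0; i < 8; i++) {          // 8 halvings narrow quality to ~0.004
        CGFloat mid = (lo + hi) / 2.0f;
        NSData *candidate = UIImageJPEGRepresentation(image, mid);
        if ([candidate length] <= limit) {
            best = candidate;              // fits: try a higher quality
            lo = mid;
        } else {
            hi = mid;                      // too big: try a lower quality
        }
    }
    return best;
}

Eight probes replace the thousand-step walk from 1.0 down in 0.001 decrements.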