iPhone: Best way to detect the end of UIImageView image sequence animation

We know that UIImageView has a very nice support for image sequence animation. We can easily create an array of UIImage objects, set the animationImages property, configure animation duration, repeat count etc. and then just fire. But there seems to be no way to know when this animation has ended.
Say I have ten images and then I want to run an animation (repeat count = 1) with them. And when the animation is over, I want to run some other code. What is the best way to know that animation has ended?
I already understand that I can create an NSTimer and schedule it to fire after the animation duration, but you really can't rely on a timer if you need good precision.
So my question is: is there any better way to know that a UIImageView image sequence animation has ended, without using a timer?
The code is something like this:
myImageView.animationImages = images; // images is a NSArray of UIImages
myImageView.animationDuration = 2.0;
myImageView.animationRepeatCount = 1;
[myImageView startAnimating];

The isAnimating property on UIImageView should go to NO when it's done animating. It isn't a formal (KVO-observable) property, though, so you can't set up observation on it. You can poll it on a fine-grained timer (a CADisplayLink, for example).
There's no "animation completed" delegate for this, if that's the sort of thing you're looking for. The timing can vary based on image loading delays, etc., and no, there's no sure-fire way to know precisely when it's done.
The image animation stuff on UIImageView is a convenience, and not heavyweight enough to do serious animation work with. Consider rolling your own if you need that kind of precision.
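If polling is acceptable, a minimal sketch of the CADisplayLink idea might look like this (animationDidFinish is a hypothetical method on the owning class, not part of UIKit):

#import <QuartzCore/QuartzCore.h>

- (void)startAnimatingAndWatch
{
    [myImageView startAnimating];
    // Poll isAnimating once per display refresh.
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(checkAnimation:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)checkAnimation:(CADisplayLink *)link
{
    if (!myImageView.isAnimating) {
        [link invalidate];          // stop polling
        [self animationDidFinish];  // run the follow-up code (hypothetical)
    }
}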

I made this (a method of my UIImageView subclass):
typedef void (^UIImageViewAnimationCompletionBlock)(void);

- (void)startAnimatingWithCallback:(UIImageViewAnimationCompletionBlock)completionCallback
{
    [self startAnimating];
    dispatch_queue_t animatingQueue = dispatch_get_current_queue(); // the queue the caller is on
    dispatch_queue_t pollingQueue = dispatch_queue_create("pollingQueue", NULL);
    dispatch_async(pollingQueue, ^{
        // Poll isAnimating every 10 ms until the sequence finishes.
        while (self.isAnimating) { usleep(10000); }
        dispatch_async(animatingQueue, ^{ if (completionCallback) completionCallback(); });
    });
}
Simple usage:
[self.oneView fadeOut];
[self.otherView startAnimatingWithCallback:^{
    [self.oneView fadeIn];
}];

Furthermore, I'd recommend setting the UIImageView's default image (the image property) to the first frame of the animation, and changing it to the last frame right after calling startAnimating. This way we avoid the ugly flicker that can occur when the animation has finished but the callback has not yet been invoked.
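A rough sketch of that ordering (a sketch only, assuming images is the NSArray of frames from the question):

// First frame shows before the animation starts; last frame shows the instant it stops.
myImageView.image = [images objectAtIndex:0];
myImageView.animationImages = images;
myImageView.animationDuration = 2.0;
myImageView.animationRepeatCount = 1;
[myImageView startAnimating];
myImageView.image = [images lastObject];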

Related

UIImageView performance within UITableViewCell using ASIHTTPRequest

This problem has been vexing me for a little over a day. The gist is that I have a grouped UITableView set up so that there is only one entry per section. Each cell contains a UIImageView that takes up the entire width and height of the cell. Furthermore, each cell is at a fixed height and width.
Here is my init method:
- (id)initWithStyle:(UITableViewCellStyle)style reuseIdentifier:(NSString *)reuseIdentifier{
    self = [super initWithStyle:style reuseIdentifier:reuseIdentifier];
    if (self) {
        [[self layer] setMasksToBounds:NO];
        [[self layer] setShouldRasterize:YES];
        [[self layer] setRasterizationScale:[[UIScreen mainScreen] scale]];
        [[self layer] setCornerRadius:10];

        _bgCoverView = [[UIImageView alloc] initWithFrame:CGRectZero];
        [_bgCoverView setClipsToBounds:YES];
        [[_bgCoverView layer] setCornerRadius:10];
        [_bgCoverView setContentMode:UIViewContentModeScaleAspectFill];
        [self.contentView addSubview:_bgCoverView];
        [_bgCoverView release];

        //Init other parts of the cell, but they're not pertinent to the question
    }
    return self;
}
This is my layout subviews:
- (void)layoutSubviews{
    [super layoutSubviews];
    if (!self.editing){
        [_bgCoverView setFrame:CGRectMake(0, 0, tableViewCellWidth, self.contentView.frame.size.height)];
    }
}
This is my setter for the cell:
- (void)setGroup:(Group*)group{
    //Set background cover for the contentView
    NSSet *usableCovers = [[group memories] filteredSetUsingPredicate:[NSPredicate predicateWithFormat:@"pictureMedData != nil || pictureFullData != nil || (pictureMedUrl != nil && NOT pictureMedUrl contains[cd] '/missing.png')"]];
    Memory *mem = [usableCovers anyObject];

    [_bgCoverView setImage:[UIImage imageWithContentsOfFile:@"no_pics_yet.png"]];
    if (mem != nil){
        [Folio cacheImageAtStringURL:[mem pictureMedUrl] forManagedObject:mem withDataKey:@"pictureMedData" inDisplay:_bgCoverView];
    }
}
This is where I think I'm getting a performance ding. First, there's the filtered set call. If there are a lot of objects in the set, this could slow things down. However, there tend to be relatively few objects, and when I removed this line of code I saw no performance change, so it's unlikely to be the culprit.
So, that pretty much leaves my cache method (which is located in a helper class). Here's the method:
+(void)cacheImageAtStringURL:(NSString *)urlString forManagedObject:(NSManagedObject*)managedObject withDataKey:(NSString*)dataKey inDisplay:(NSObject *)obj{
    int objType = 0; //Assume UIButton
    if (obj == nil)
        objType = -1;
    else if ([obj isKindOfClass:[UIImageView class]])
        objType = 1; //Assign to UIImageView

    NSData *data = (NSData*)[managedObject valueForKey:dataKey];
    if ([data bytes]){
        if (objType == 0) [(UIButton*)obj setBackgroundImage:[UIImage imageWithData:data] forState:UIControlStateNormal];
        else if (objType == 1) [(UIImageView*)obj setImage:[UIImage imageWithData:data]];
        return;
    }
    //Otherwise, do an ASIHTTPRequest and set the image when it's returned, save the data into core data so that we get a cache hit the next time.
}
Regardless of whether the images are cached, there are major performance issues when scrolling down this view. If I comment out the cache method, it works pretty well. Additionally, other views that call this method are sluggish, so I'm pretty sure it's something here.
I will also say that I'm suspicious of the cornerRadius code; however, I heard that setting shouldRasterize to YES results in a performance speed-up. Still, I'm not sure whether that's 100% true or whether my implementation is off.
Any help would be greatly appreciated.
UPDATE
Still not 100% fixed yet, but we are getting there.
These variables must be set on the request:
[request setCachePolicy:ASIOnlyLoadIfNotCachedCachePolicy];
[request setCacheStoragePolicy:ASICachePermanentlyCacheStoragePolicy];
The first tells the cache to only ping the server for images if they are not cached. The second tells the cache to store the images permanently for the lifecycle of the app.
This got me part-way there. Some images loaded instantly whereas others seemed sluggish. I did more research and added these lines of code in my app delegate:
[[ASIDownloadCache sharedCache] setShouldRespectCacheControlHeaders:NO];
[ASIHTTPRequest setDefaultCache:[ASIDownloadCache sharedCache]];
The first line is crucial. ASIHTTPRequest's default action is to follow what the web server tells you. So, if it tells you in its headers not to cache a response, it won't. This line overrides that. Now, this worked really well on my iOS simulator. Then, I tried it on my 4S and it was slow again.
The reason was that I had set up a block of code on my request that, upon successful retrieval of an image, saved it to Core Data. This block was being triggered every time, even for cached requests.
This is because ASIHTTPRequest will still make a request regardless of whether or not it uses a cached copy of your data. If it has a cached copy, it won't send out a request to a server, but will populate its request object's response data with cached data instead. So after my block takes care of display work, I call:
if ([request didUseCachedResponse]) return;
This works very well on the 4S and the 4. However, the 3GS is still disastrously slow :-(.
This is because ASIHTTPRequest is using a cached copy of my NSData and not my UIImage. So, it needs to re-create the image each time it uses the cache. This makes sense because ASIHTTPRequest is only returning data, and doesn't know what you are going to do with the data afterwards, so it'll always call your code that's set up when the request is returned. And because my code block converts the image from data, that's what I get.
A WAY FORWARD
I think to get the performance speedup I want, it's necessary to implement a custom UIImage cache. If we find the file in the cache, we return the UIImage object. If not, then we create a request and save it in the cache on return.
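A minimal sketch of such an image cache, built on NSCache (the class and method names here are made up for illustration, not an existing library):

// Hypothetical in-memory UIImage cache keyed by URL string.
@interface FolioImageCache : NSObject
+ (UIImage *)cachedImageForURL:(NSString *)urlString;
+ (void)storeImage:(UIImage *)image forURL:(NSString *)urlString;
@end

@implementation FolioImageCache

+ (NSCache *)sharedCache
{
    static NSCache *cache = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ cache = [[NSCache alloc] init]; });
    return cache;
}

+ (UIImage *)cachedImageForURL:(NSString *)urlString
{
    return [[self sharedCache] objectForKey:urlString];
}

+ (void)storeImage:(UIImage *)image forURL:(NSString *)urlString
{
    if (image && urlString) [[self sharedCache] setObject:image forKey:urlString];
}

@end

The cacheImageAtStringURL:... helper would then check cachedImageForURL: first, and only fall back to creating an ASIHTTPRequest (storing the decoded UIImage on completion) when it misses.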
The tip about shouldRasterize is a valid one, though in my experience it decreases the quality of the graphics significantly. Also, since around iOS 4.3 (I can't remember exactly) Apple enormously improved the cornerRadius rendering, so it's been a long time since I last needed to activate rasterization for this kind of problem.
You could always deactivate cornerRadius and see how it performs.
What I really want to tell you however is that ASIHTTPRequest has built-in cache support, see
http://allseeing-i.com/ASIHTTPRequest/How-to-use#using_a_download_cache
Try it, it worked flawlessly for me. Think about which cache policies fit your needs best. Of course, if you need the images in Core Data you'll have to insert them, but I wouldn't try to cache them manually from Core Data.
Oh, and something else: ASIHTTPRequest is no longer under development, so you might want to consider switching to another framework (like AFNetworking). ASIHTTPRequest doesn't support ARC, for example.
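If you do switch, AFNetworking's UIImageView+AFNetworking category handles the asynchronous download and in-memory caching for you; a rough sketch (reusing _bgCoverView and the placeholder image name from the question):

#import "UIImageView+AFNetworking.h"

// In the cell's setter: loads asynchronously, caches in memory, shows a placeholder meanwhile.
[_bgCoverView setImageWithURL:[NSURL URLWithString:[mem pictureMedUrl]]
             placeholderImage:[UIImage imageNamed:@"no_pics_yet.png"]];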
Check out this post. I use blocks and GCD for image loading (especially useful in a UITableViewCell). This loads the images on the fly if you put the method call in the cellForRowAtIndexPath: method, and I see no sluggish UI.
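The pattern looks roughly like this (a sketch, not the exact code from that post; urlString and cell stand in for whatever you have in your cell code):

// In tableView:cellForRowAtIndexPath: — fetch and decode off the main thread, assign on it.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]];
    UIImage *image = [UIImage imageWithData:data];
    dispatch_async(dispatch_get_main_queue(), ^{
        // In real code, re-check that the cell is still showing this row before assigning.
        cell.imageView.image = image;
    });
});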

How to make a static image appear after 3 seconds?

How would I make an image appear after 3 seconds?
You can use:
[self performSelector:withObject:afterDelay:]
I'm a big fan of using GCD (iOS 4+) because you can simplify your code with inline blocks.
In your case, you should set the image to hidden in Interface Builder, then create an IBOutlet with a connection to an ivar in your class.
Then you can simply run this in viewDidLoad or similar:
dispatch_time_t delay = dispatch_time(DISPATCH_TIME_NOW, NSEC_PER_SEC * 3.0);
dispatch_after(delay, dispatch_get_main_queue(), ^(void){
yourImage.hidden = NO;
});
This assumes that you are calling performSelector:withObject:afterDelay: from the main thread, and that your UIImageView is initially hidden.
//assumes theImageView.hidden = YES
[self performSelector:@selector(showImage:) withObject:theImageView afterDelay:yourTimeInterval];

-(void)showImage:(UIImageView*)anImageView {
    anImageView.hidden = NO;
}
It is important that performSelector is called from the main thread because the selector that is called after the delay will run on the same thread, and you do not want to update UI from anything other than the main thread as a general rule.
I haven't used Xcode in a while, but I'll take a stab for ya.
In your Interface Builder, set the image's visibility to hidden.
When your app starts up, set some global variable to the current time in an init function.
In the main control loop for your UI, check whether that global variable holds a time more than 3 seconds ago; if so, set the image's visibility back to shown.
That's the best I can really say without taking a look, which isn't possible right now.
Good luck!

When are a methods GUI operations actually carried out?

I am working on a web-services data processing app and I am trying to make the app run as quickly as possible. When a certain 3 finger pan gesture is performed, I call a method that sends updated information off to the server to get a new batch of images to update the existing ones with.
So let's say there are 15 images in an array. I filter through them with a 2 finger gesture, and then if I want to change something about them, I do the 3 finger gesture and get that same set back, just tweaked a bit (contrast/brightness, etc.).
What I want, though, is to be able to update the imageView that is displaying the images as soon as the first image has been retrieved, to give the user a feel for what the rest in the series are going to look like. But no matter what I try, and no matter how many different threads I try to implement, I can't get the imageView to update before the entire download is complete. Once the batch download is done (which is handled on a separate thread), the imageView updates with the new images and everything is great.
The first step in the process is this:
if(UIGestureRecognizerStateEnded == [recognize state]){
[self preDownload:windowCounter Level:levelCounter ForPane:tagNumber];// Where this method is what gets the first image, and tries to set it to the imageView
[self downloadAllImagesWithWL:windowCounter Level:levelCounter ForPane:tagNumber]; //And this method goes and gets all the rest of the images
}
This is my preDownload method:
-(void)preDownload:(int)window Level:(int)level ForPane:(int)pane{
    int guidIndex = [[globalGuids objectAtIndex:pane] intValue];
    UIImage *img = [DATA_CONNECTION getImageWithSeriesGUID:[guids objectAtIndex:guidIndex] ImageID:counter Window:window Level:level];

    if(pane==0){
        NSLog(@"0");
        [imageView3 setImage:img];
    }else if(pane==1){
        NSLog(@"1");
        [imageView31 setImage:img];
    }else if(pane==2){
        NSLog(@"2");
        [imageView32 setImage:img];
    }else if(pane==3){
        NSLog(@"3");
        [imageView33 setImage:img];
    }
}
By separating this out into two different methods (there are no threads involved at this point; these methods are called before all that), I was thinking that after the preDownload method completed the imageView would update, and then control would continue down into the downloadAllImagesWithWL method. But that doesn't appear to be the case.
Am I missing something simple here? What can I do to update my GUI elements before that second method is through running?
You are right. However, the view won't refresh until your code returns to the run loop. You can do two things:
Make your downloadAllImagesWithWL method asynchronous, so that it returns right after you call it; your main thread then reaches the run loop, the GUI updates, and the download method tells your logic through a callback when it's done (a sketch of this option follows below).
OR
A simpler, hackier (and bad) solution would be to run the run loop for some time before you call your download method. Something like this: [[NSRunLoop currentRunLoop] runUntilDate:[NSDate dateWithTimeIntervalSinceNow:0.1]]; It will run the run loop for 0.1 seconds.
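Here is a minimal sketch of the first option using GCD (allImagesDidFinishDownloading is a hypothetical callback; it assumes downloadAllImagesWithWL:Level:ForPane: is safe to run off the main thread):

- (void)downloadAllImagesAsyncWithWL:(int)window Level:(int)level ForPane:(int)pane
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Runs in the background; the main thread is free to reach the run loop
        // and draw the image that preDownload already set.
        [self downloadAllImagesWithWL:window Level:level ForPane:pane];
        dispatch_async(dispatch_get_main_queue(), ^{
            [self allImagesDidFinishDownloading];   // hypothetical completion callback
        });
    });
}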
When the image is set, the image view will mark itself as needing display. The actual display won't occur until the beginning of the next run loop. In OS X, you can use -display to draw the view immediately, but I don't think Apple created a public method to do this on iOS. However, if the next method simply creates the background thread, then it will return quickly and the display update will probably occur before the thread finishes.

iphone - main thread freezes for half a second... why?

I have an app that draws lines in a Quartz context. The app starts drawing when the user moves his finger across the screen.
When touchesMoved fires, I save the Quartz context to a PNG file (I know saving a file is slow; I have tried doing this in memory but app memory usage skyrocketed, so I am trying to do it on disk).
To get the context saved, I do this in touchesMoved:
if (firstMove) { // first movement after touchesBegan
    [NSThread detachNewThreadSelector:@selector(newThreadUNDO)
                             toTarget:self
                           withObject:nil];
    firstMove = NO;
}
and then I have
- (void) newThreadUNDO {
    NSAutoreleasePool* p = [[NSAutoreleasePool alloc] init];
    [NSThread setThreadPriority:0.1];
    [NSThread sleepForTimeInterval:0.0];
    [self performSelectorOnMainThread:@selector(copyUNDOcontext) withObject:nil waitUntilDone:NO];
    [p release];
}
and
- (void) copyUNDOcontext {
    CGFloat w = board.image.size.width;
    CGFloat h = board.image.size.height;
    CGRect superRect = CGRectMake(0,0, w, h);
    CGSize size = CGSizeMake(w, h);

    UIGraphicsBeginImageContext(size);
    CGContextRef new = UIGraphicsGetCurrentContext();
    // lineLayer is the layer context I need to save
    CGContextDrawLayerInRect(new, superRect, lineLayer);
    UIImage *imagem = UIGraphicsGetImageFromCurrentImageContext();
    [self saveTempImage:imagem :@"UNDO.png"];
    UIGraphicsEndImageContext();
}
The problem is: as soon as the user starts moving, the new thread is fired, but even though this new thread has low priority, the main thread still freezes for about half a second (probably while the file is being saved).
Why is that?
How can I try to solve that?
thanks.
Have you tried:
performSelector:onThread:withObject:waitUntilDone:
With waitUntilDone set to NO.
If I recall correctly, performing the selector on the main thread always processes the selector in the main run loop of the application. I could be wrong; I have been using GCD for some time now.
If you try this I believe you will need to put the autorelease pool into the function, as it will serve as the entry and exit point of the thread.
First, a method named like saveTempImage:: is to be discouraged. Make it saveTempImage:fileName: or something.
Your guess is probably good; saving the file is probably where the pause is coming from. Could also be the rendering itself, if complicated, but doesn't look like it is.
However, guessing is generally an unproductive way to analyze performance problems. Use the provided tools. The CPU Sampler instrument could tell you what is really going on.
To fix? First confirm the problem. If it is the file I/O, move it off the main thread (I haven't looked at UIImage's documentation to know if it is thread safe in such a context).
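If it does turn out to be the file I/O, a rough sketch of moving just the PNG encode and write off the main thread (the rendering stays on the main thread; saveTempImageData:fileName: is a hypothetical helper that only writes the data):

- (void)copyUNDOcontext {
    CGFloat w = board.image.size.width;
    CGFloat h = board.image.size.height;
    UIGraphicsBeginImageContext(CGSizeMake(w, h));
    CGContextDrawLayerInRect(UIGraphicsGetCurrentContext(), CGRectMake(0, 0, w, h), lineLayer);
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW, 0), ^{
        NSData *png = UIImagePNGRepresentation(snapshot);     // the slow part
        [self saveTempImageData:png fileName:@"UNDO.png"];    // hypothetical helper
    });
}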

Cocos2D iPhone Effect

I'm trying to play around with the Cocos2D effects and created two methods to start and stop the Liquid action. My application, however, drops from 60fps to 30fps when the effect is applied, and the fps doesn't increase again when the scheduled stop method is called.
I originally thought that even though the action has completed, the effect is still being rendered, but after reading through EffectsTest.h/.m in the Cocos2D 0.8 zip I can't find any reference to how this is handled. Can anyone shed some light on this issue?
// effects
-(void)enableLiquidEffect
{
    id liquid = [Liquid actionWithWaves:6 amplitude:20 grid:ccg(15, 10) duration:3];
    [self schedule:@selector(disableLiquidEffect) interval:(3.0)];
    [self runAction:liquid];
}

-(void)disableLiquidEffect
{
    [self unschedule:@selector(disableLiquidEffect)];
    [self stopAllActions];
}
Cheers,
AntonMills
Just a little tip here. I know this was asked years ago, but someone might still come across it.
The code is a little bit of overkill; here's how to do it:
// effects
-(void)enableLiquidEffect
{
    id liquid = [Liquid actionWithWaves:6 amplitude:20 grid:ccg(15, 10) duration:3];
    // No need to unschedule after 3 seconds since you already set the duration above to 3 seconds.
    [self runAction:liquid];
}

-(void)disableLiquidEffect
{
    [self stopAllActions];
}
Besides that, the code is perfect.
Just a guess, but since the item will still have the liquid transform set, it may still be trying to apply a more complex transform than needed after it's done. Save off your transform prior to starting, and when the effect stops, set it back. You could also try just setting it to nil.