Blinking UIView - iPhone

For my iPhone app I'm creating some rotating gears with the help of some subclassed UIViews.
I have created subclasses that rotate themselves triggered by a timer.
In one place I have one of these subclasses nested inside another (so rotation within rotation; think of the moon orbiting the earth while spinning on its own axis). It all rotates fine and dandy, but sometimes, maybe once or twice a minute, I see a very quick white blink in the area of the UIViews. Sometimes it's in the upper half, sometimes in the lower half, and sometimes the whole area (which is only about 128 x 128 pixels).
I rotate by using CGAffineTransformMakeRotation.
I guessed it was a performance problem, but after simplifying the images (no more Photoshop-made drop shadows in the PNGs, for example) and reducing how often the timer fires (2 times per second instead of 5), I still have the problem. CPU load is now down to between 9 and 25% (from around 47%) when measured with Instruments on an iPhone 3G. Still blinking!
Any clues on where to begin troubleshooting or any better way to rotate images within a view?
All ideas appreciated!

Basically I had an animation in an animation. I'm not technically skilled enough to say why that caused a problem, but removing the second animation solved it. My animations were of the [UIView animateWithDuration:...] type, in which I did several CGAffineTransformMakeRotation calls.
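For what it's worth, a minimal sketch of the flattened approach, with a single timer callback updating both transforms directly instead of nesting animation blocks (the view and angle names are illustrative, not from my actual code):

- (void)gearTick:(NSTimer *)timer {
    // Advance both rotations in one place instead of nesting animation blocks.
    outerAngle += 0.05f;   // "earth" gear
    innerAngle += 0.15f;   // "moon" gear nested inside it
    outerGearView.transform = CGAffineTransformMakeRotation(outerAngle);
    innerGearView.transform = CGAffineTransformMakeRotation(innerAngle);
}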

Related

Animation like minimizing and maximizing screens on Mac OS X

I am developing an iPhone application where I need to apply, to a UIImageView, an animation like the one Mac OS X uses when minimizing and maximizing windows.
I also need an animation on an image which continuously gives a pulse effect (i.e. continuous zoom in/out). (PS: I have done this using two images, but I want to know whether it's possible with a single image.)
The image must come in at full size with alpha 0 and then go to the center of the screen, increasing alpha at each move until it reaches its original position.
Any help would be really appreciated.
Thanks
This is not predefined, but check out these WWDC 2010 videos for information on how to do it:
Session 424 - Core Animation in Practice, Part 1
Session 425 - Core Animation in Practice, Part 2
This animation is quite complex. I think it is done by manipulating the CGAffineTransform of the window. It can be implemented the old-school way, with a timer updating the transformation matrix manually, or you could maybe even use Core Animation. I think there is an animatable value function for this.
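For the pulse effect specifically, a minimal sketch using a single image view and a repeating, auto-reversing UIView animation (iOS 4+; the view name is illustrative):

imageView.transform = CGAffineTransformIdentity;
[UIView animateWithDuration:0.6
                      delay:0.0
                    options:UIViewAnimationOptionRepeat |
                            UIViewAnimationOptionAutoreverse |
                            UIViewAnimationOptionCurveEaseInOut
                 animations:^{
                     // Scale up slightly; the autoreverse option scales it back down.
                     imageView.transform = CGAffineTransformMakeScale(1.2f, 1.2f);
                 }
                 completion:nil];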

How many UIViews can be displayed simultaneously on iOS before running into performance problems?

I'm making an iPhone game and using UIView objects to draw sprites. Most of the time, I have no performance problems. However, once I have around 15 to 20 objects on the screen (and maybe 5 of them moving around), the game becomes considerably slower, especially on the iPhone 3G. The frame rate can drop to as low as a single frame per second.
Is this simply a limitation of using UIView objects, or should iOS be able to handle this many UIView objects on screen at the same time?
In order to isolate the problem, I've made drawing my views very simple — drawing a red rectangle. drawRect is only getting called once per view. The view hierarchy is very simple and shallow. I'm using CADisplayLink to update the UIView locations every frame.
There's very little else going on, so I'd like to hear if anyone else has had success using this number of UIView objects.
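For reference, a minimal sketch of how I'm driving the per-frame updates with CADisplayLink (the selector and ivar names are illustrative):

- (void)startGameLoop {
    displayLink = [CADisplayLink displayLinkWithTarget:self
                                              selector:@selector(updateFrame:)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop]
                      forMode:NSDefaultRunLoopMode];
}

- (void)updateFrame:(CADisplayLink *)link {
    // Move each sprite view; the actual per-frame movement is game logic.
    for (UIView *sprite in spriteViews) {
        CGPoint c = sprite.center;
        c.x += 1.0f;
        sprite.center = c;
    }
}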
The key to my problems ended up being that I had labels on top of my game content. The labels are not opaque, which likely was a large part of the problem, as phix23 suggested.
The first thing that made a big difference was removing a frames per second label that was on top of the content. Having a label that changed content on every frame caused a lot of slowdown.
I also had a large label that displayed on top of much of the game and changed shape when you level up. It turned out that drawing this label on top of everything caused a lot of slowdown as well.
In answer to my original question, I've found that on an iPhone 3G I can support about 30-40 opaque UIViews onscreen at the same time, with 2 or 3 non-opaque views as well. Non-opaque UIViews that change size, shape, or location are by far the worst, and even one of these that covers a significant amount of the screen can quickly cause problems.
If you're setting the opaque property of each view to NO, keep in mind that this seriously affects the speed of drawing the views. If your views aren't transparent, you should leave this set to YES, which is the default.
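In other words (a tiny sketch, with an illustrative view name): only mark a view opaque if it really has no transparent pixels, but when that's true it saves the compositor a blend every frame.

spriteView.opaque = YES;                             // the default; lets drawing skip blending
spriteView.backgroundColor = [UIColor blackColor];   // any fully opaque color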
For this type of application you should use Core Graphics / Quartz / OpenGL, but anyway I don't think there is a limitation at such a low count. For example, if I have a table view with 9 rows and each row has 5 subviews, it still displays acceptably fast. Have you tried using UIView animation to change the positions of the views?
good luck in learning OpenGL ;)

Big animation iPhone with CCSpriteFrameCache - plist

I have a problem when trying to load a big animation with about 54 images (320x480 each) into CCSpriteFrameCache; I can't use a plist for this. How can I make the animation work? At the moment, my animation doesn't work on the iPhone 2G, 3G, or iPod.
Thanks for your help,
John
You won't be able to do it...
Consider playing a video or just animating a small portion of the screen.
Your best bet is to determine why the animation has 54 images that are all the width/height of the screen. That's an unnecessarily large number of images.
Break the animation down:
- Is the background 'static' (does it move around, change constantly, etc.)? If it moves around a bit but is really part of a much larger "canvas", then simply export the entire background canvas and perform the movements yourself using the Cocos2D actions available to you (CCMoveTo, CCJumpTo, CCDelayTime, CCSequence, etc.).
- What in the animation moves around, and how does it move around? Can it be broken into much smaller bits, with the frames for the various "characters" or "movable objects" within the scene exported onto a sprite sheet (saved out via Zwoptex)?
A good animation sequence should be a series of much smaller images, all working together in unison to create the final "animation sequence".
If you break it down, I wouldn't be surprised if you were able to reduce your 54 images at 320x480 each down to a handful of 512x512 spritesheets (ala Zwoptex).
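As a rough sketch of where that gets you (file and frame names are illustrative, and the exact CCAnimation API differs slightly between cocos2d versions):

// Load the Zwoptex-generated sheet once; its frames go into the shared cache.
[[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"character.plist"];

NSMutableArray *frames = [NSMutableArray array];
for (int i = 1; i <= 12; i++) {
    NSString *name = [NSString stringWithFormat:@"character_%02d.png", i];
    [frames addObject:[[CCSpriteFrameCache sharedSpriteFrameCache] spriteFrameByName:name]];
}

CCSprite *character = [CCSprite spriteWithSpriteFrameName:@"character_01.png"];
CCAnimation *anim = [CCAnimation animationWithFrames:frames delay:1.0f / 12.0f];
[character runAction:[CCRepeatForever actionWithAction:
                         [CCAnimate actionWithAnimation:anim]]];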
If you're having trouble breaking it down, I would be available to look at the final animation and help you determine what could be minimized to reduce the overhead.

Infinite maps/scrolling question

I'm using cocos2d for the iPhone to create an infinitely scrolling horizontal tile map. To achieve this, I've generated a library of 'segments', which are basically horizontal chunks of levels that I randomly choose from and append to the end of that particular level's tile map. When tiles scroll off the left of the screen they are removed from the layer and released. This all works fine.
My question revolves around the legitimacy of the scrolling method I've chosen. Following guidance from this article, I've been scrolling my map by updating the layer's position at regular intervals (subtracting from the x axis to move the layer to the left). And while this works, I'm concerned that there's probably some finite limit to the positioning of a layer. Am I going to run into issues after a certain amount of time has passed (when the layer's x-axis position is considerably large)?
Any thoughts on my approach would be appreciated.
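For concreteness, a minimal sketch of the update I'm describing (the method and ivar names are illustrative):

- (void)update:(ccTime)dt {
    // Shift the whole tile layer left at a constant speed.
    CGPoint pos = tileLayer.position;
    pos.x -= scrollSpeed * dt;
    tileLayer.position = pos;

    // Collect and remove tiles whose right edge has left the screen.
    NSMutableArray *offscreen = [NSMutableArray array];
    for (CCSprite *tile in tileLayer.children) {
        CGPoint worldPos = [tileLayer convertToWorldSpace:tile.position];
        if (worldPos.x + tile.contentSize.width * 0.5f < 0) {
            [offscreen addObject:tile];
        }
    }
    for (CCSprite *tile in offscreen) {
        [tileLayer removeChild:tile cleanup:YES];
    }
}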
This is a good question. What I would do is run some tests on how far you can position the layer. I placed a sprite and focused the camera on ccp(1000000000000000, 1000000000000000) with no issues.
Do you really think this would be an issue in real gameplay? Seems like it would take a very long game to reach a position like that.

Core Animation with contentsRect jerkiness

In my (puzzle) game the pieces are drawn on-screen using a CALayer for each piece. There are 48 pieces (in an 8x6 grid), with each piece being 48x48 pixels. I'm not sure if this is just too many layers, but if this isn't the best solution I don't know what is, because redrawing the whole display using Quartz2D every frame doesn't seem like it would be any faster.
Anyway, the images for the pieces come from one big PNG file that has 24 frames of animation for 10 different states (so measures 1152 x 480 pixels) and the animation is done by setting the contentsRect property of each CALayer as I move it.
This actually seems to work pretty well with up to 7 pieces tracking a touch point in the window, but the weird thing is that when I initially start moving the pieces, for the first half-second or so it's very jerky, as if the CPU is doing something else, but after that it'll track and update the screen at 40+ FPS (according to Instruments).
So does anyone have any ideas what could account for that initial jerkiness?
The only theory I could come up with is it's decompressing bits of the PNG file into a temporary location and then discarding them after the animation has stopped, in which case is there a way to stop Core Animation doing that?
I could obviously split the PNG file up into 10 pieces, but I'm not convinced that would help as they'd all (potentially) still need to be in memory at once.
EDIT: OK, as described in the comment to the first answer, I've split the image up into ten pieces that are now 576 x 96, so as to fit in with the constraints of the hardware. It's still not as smooth as it should be though, so I've put a bounty on this.
EDIT2: I've linked one of the images below. Essentially the user's touch is tracked and the offset from the start of the tracking is calculated (pieces can only move horizontally or vertically, and only one place at a time). Then one of the images is selected as the content of the layer (depending on what type of piece it is and whether it's moving horizontally or vertically). Then the contentsRect property is set to choose one 48x48 frame from the larger image, with something like this:
layer.position = newPos;
layer.contents = (id)BallImg[imgNum];
// 12 columns x 2 rows of 48x48 frames in each 576x96 image
layer.contentsRect = CGRectMake((1.0/12.0)*(float)(frame % 12),
                                0.5 * (float)(frame / 12),
                                1.0/12.0, 0.5);
By the way, my theory about it decompressing the source image afresh each time wasn't right. I wrote some code to copy the raw pixels from the decoded PNG file into a fresh CGImage when the app loads, and it didn't make any difference.
The next thing I'll try is copying each frame into a separate CGImage, which will at least get rid of the ugly contentsRect calculation.
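Roughly what I mean (the array names are illustrative): slice each 576x96 sheet once at load time with CGImageCreateWithImageInRect, then just swap the layer's contents while tracking:

// Done once when the images are loaded:
for (int f = 0; f < 24; f++) {
    CGRect r = CGRectMake((f % 12) * 48.0f, (f / 12) * 48.0f, 48.0f, 48.0f);
    frameImages[imgNum][f] = CGImageCreateWithImageInRect(BallImg[imgNum], r);
}

// Then while tracking the touch:
layer.position = newPos;
layer.contents = (id)frameImages[imgNum][frame];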
EDIT3: Further back-to-basics investigation points to this being a problem with touch tracking and not a problem with Core Animation at all. I found a basic sample app that tracks touches, commented out the code that actually causes the screen to redraw, and the NSLog() output shows exactly the same problem I've been experiencing: a longish delay between the touchesBegan event and the first touchesMoved event.
2009-06-05 01:22:37.209 TouchDemo[234:207] Begin Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.432 TouchDemo[234:207] Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.448 TouchDemo[234:207] Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.464 TouchDemo[234:207] Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.480 TouchDemo[234:207] Touch ID 0 Tracking with image 2
The typical gap between touchesMoved events is 20 ms. The gap between touchesBegan and the first touchesMoved is ten times that. And that's with no computation or screen updating at all, just the NSLog call. Sigh. I guess I'll open this up as a separate question.
I don't think it's a memory issue; I'm thinking that it has to do with the inefficiency of having that large of an image in terms of Core Animation.
Core Animation can't use it natively, as it exceeds the maximum texture size on the GPU (1024x1024). I would break it up some; individual images might give you the best performance, but you'll have to test to find out.
IIRC, UIImageView does its animating by setting successive individual images, so if it's good enough for Apple….
When it's about performance, I definitely recommend using Shark (even over Instruments). In the 'Time Profile' you can see where the bottlenecks in your app are, even if it's something in Apple's code. I used it a lot while developing my iPhone app, which uses OpenGL and Core Animation.
Have you tried using CATiledLayer yet?
It is optimized for this type of work.