How to animate 120 images efficiently? - iPhone

There are 120 images from a 360-degree shoot of a product showcase, one every 3 degrees, each 320 x 320 pixels. Displaying the images in quick succession gives the effect of the product rotating in front of the user.
So far the performance is far from satisfactory: there is a delay when loading and when the user touches and slides to a particular image.
If OpenGL is not an option, what is the recommended way to handle this kind of animation efficiently? Thanks!

I don't really understand your UI but here is my guess:
Your UI is a circle containing a lot of images, all on the same screen. I think you can show thumbnail images (50x50-80x80) in the circle and store all of the thumbnails in memory without a big issue. At that size each of them is only around 8 KB, so 120 x 8 KB ≈ 1 MB of memory.
When you animate to an image, you can show the thumbnail first and load the big image in the background. The user will briefly see a blurry image that then sharpens. This also helps when the user touches and slides to a particular image: at least you have something to show.
You can save some memory and loading time by resizing the big images down to 75-80% of the 320x320 size. That shaves a little off the load time without hurting the image quality much; whether the trade-off is acceptable depends on your app.
You can also use eager loading: load images ahead of time, before you actually animate to them. Keep an array of 3-5 images (320x320 is not a big size, so 5 of them are only around 500-1000 KB), and when you show the first image, start loading the 4th beforehand, as in the sketch below.
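A minimal sketch of that kind of look-ahead cache (frameCache and the frame_%d.png naming are placeholder assumptions for illustration):
// Keep a small window of upcoming full-size frames in memory.
static const NSInteger kPreloadWindow = 5;
- (void)preloadFramesAroundIndex:(NSInteger)index
{
    [self.frameCache removeAllObjects];   // frameCache: NSMutableDictionary of index -> UIImage
    for (NSInteger offset = 0; offset < kPreloadWindow; offset++) {
        NSInteger i = (index + offset) % 120;
        NSString *name = [NSString stringWithFormat:@"frame_%d", (int)i];
        NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
        // imageWithContentsOfFile: skips the system-wide cache used by imageNamed:,
        // so memory use stays predictable.
        UIImage *image = [UIImage imageWithContentsOfFile:path];
        if (image) {
            [self.frameCache setObject:image forKey:[NSNumber numberWithInteger:i]];
        }
    }
}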

If you want to really compress this and get smooth playback on a mobile device, store it as a movie and use AVFoundation's AVPlayer for custom playback. You can seek to roughly any frame, change the playback speed forward or backward, and loop it so the rotation is seamless. You could, for example, change the playback position or rate based on a swipe. It is more work to set up than MPMoviePlayer but is far more customizable.
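A rough sketch of that approach with AVPlayer (product_turntable.mp4, the 4-second duration, and panOffset are assumptions for illustration; you would drive them from your own gesture handling):
#import <AVFoundation/AVFoundation.h>

// One continuous 360-degree clip, e.g. 120 frames at 30 fps = 4 seconds.
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"product_turntable"
                                          withExtension:@"mp4"];
AVPlayer *player = [AVPlayer playerWithURL:movieURL];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];

// Jump to the frame that corresponds to the current swipe offset.
CGFloat fraction = panOffset / self.view.bounds.size.width;   // 0.0 ... 1.0
CMTime target = CMTimeMakeWithSeconds(4.0 * fraction, 600);
[player seekToTime:target toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero];

// Or scrub by changing the playback rate based on swipe velocity.
player.rate = 2.0f;   // 2x forward; negative rates play backward if the asset supports it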

How about a UIWebView + HTML5?
Look at the apple showcase: http://www.apple.com/html5/showcase/threesixty/
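If you go that route, pointing a UIWebView at a local HTML5 spinner page is only a few lines (threesixty.html here is a hypothetical page you would build along the lines of that showcase):
UIWebView *webView = [[UIWebView alloc] initWithFrame:self.view.bounds];
webView.scalesPageToFit = NO;
[self.view addSubview:webView];
// Load the 360-degree viewer page bundled with the app.
NSURL *pageURL = [[NSBundle mainBundle] URLForResource:@"threesixty" withExtension:@"html"];
[webView loadRequest:[NSURLRequest requestWithURL:pageURL]];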

I would make low-res copies of all 120 images, maybe 80x80 in size, so that all 120 fit in the GPU's texture cache. Load all 120 into offscreen image views, zoomed up to 320x320, and animate using those views, showing one image at a time. Whenever the animation stops, quickly replace the last image with one more view holding the corresponding full-resolution 320x320 copy. Whenever the user starts the animation again, switch back to the low-res set of images.
This works because the eye can't resolve moving objects (the illusion of motion) at the same resolution as it can a static image, depending on the rate of animation of course.
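A sketch of that swap, assuming lowResImages and fullResImages are two prebuilt NSArrays of UIImage (hypothetical names):
// While the product is spinning, feed the (reused) image view from the low-res set.
- (void)showFrame:(NSInteger)index animating:(BOOL)animating
{
    NSArray *source = animating ? self.lowResImages : self.fullResImages;
    self.productImageView.image = [source objectAtIndex:index];   // 80x80 scaled up, or full 320x320
}
// When the rotation settles on a frame, drop in the sharp copy.
- (void)rotationDidStopAtFrame:(NSInteger)index
{
    [self showFrame:index animating:NO];
}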

Related

Webdesign: PNG-Image width limitations

I want to make a kind of slideshow based on scrolling the webpage.
My problem is that I have an image of 78720x1015px in PNG format.
The width comes from a single 1920px-wide image arranged 41 times side by side. It should work like a cartoon where the image shifts by 100% (margin: -100%) to create the feeling of a movie.
However, this results in an image width of 1920px x 41 pictures = 78720px.
That is an enormous width, but what puzzles me is that the file size is only 975 kB, which in my opinion is not that big. Still, it somehow takes a very long time to load the picture in the web browser, and the image is not of the same quality as in my image viewer on the desktop.
Question 1: What do I have to consider when dealing with such a big image width? What are the limits?
Question 2: Is there a better way to make this kind of slideshow? Keep in mind that the sliding itself shouldn't be visible; it should feel like a movie built from about 40 pictures.
Thanks in advance!
PNG compresses very well, especially for a cartoon like you describe, which may use only a limited number of colours.
However, to display it the browser must load all of that pixel data into RAM. That's almost 80 million pixels in your case, around 320 MB of uncompressed data. This is probably why the browser is struggling, especially if you're using margin to move it around, as that requires a full re-layout and redraw of the image.
You may have better results with transform, as this should use hardware acceleration and also avoids the reflow part of a redraw since transforms don't affect page layout.
But the better solution would be to break it down into individual images. Have your code load in the next image, scroll it across, then load the next while scrolling and unload the one that's now off-screen to provide a relatively seamless view.

Preloading a large image

I have an image with dimensions of 5534 × 3803 and a size of 2.4 MB. The UIView reference notes that:
"In iOS 3.0 and later, views are no longer restricted to this maximum
size but are still limited by the amount of memory they consume."
When the image loads, it lags for half a second, then slides in. The image sits in the UIImageView at 1024x704, but can be scaled up to 4x that size for the purpose of my app.
Are you able to preload the image in the AppDelegate? Or is there another way of working around having such a large image?
Thanks
EDIT: The scaling is done via UIPinchGestureRecognizer, and scales up and down (between x1 and x4) based on the image's center point. There is no panning of the image when zoomed in.
Personally, I would try to write a tile-based system (think Google Maps) that slices your big image into a grid of small images to avoid loading in that gigantic image all at once into RAM. I don't really know what your user interactions are for this image, or whether the images are changing or baked into your project, but I'd assume you can let users scroll around since that image is bigger than any iOS screen. With a tile-based system, you only load the images that are on-screen. CATiledLayer is an Apple class for doing just such a thing. That's probably what you want to look into.
See this StackOverflow question for some different approaches. The accepted answer uses code from Apple's sample PhotoScroller project, which may work for your needs and uses CATiledLayer.
This ScrollViewSuite Apple code might also get on your way (check out the Tiling code).
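A minimal CATiledLayer-backed view looks roughly like this (drawTileForRect: stands in for however you load and draw your pre-sliced tiles):
#import <QuartzCore/QuartzCore.h>

@interface TiledImageView : UIView
@end

@implementation TiledImageView
+ (Class)layerClass
{
    return [CATiledLayer class];   // back this view with a tiled layer
}
- (id)initWithFrame:(CGRect)frame
{
    if ((self = [super initWithFrame:frame])) {
        CATiledLayer *tiledLayer = (CATiledLayer *)self.layer;
        tiledLayer.tileSize = CGSizeMake(256.0, 256.0);
        tiledLayer.levelsOfDetail = 4;       // cached zoom-out levels
        tiledLayer.levelsOfDetailBias = 2;   // extra detail levels when zoomed in
    }
    return self;
}
// Called once per visible tile (and on background threads), so only
// the on-screen part of the huge image is ever decoded and drawn.
- (void)drawRect:(CGRect)rect
{
    [self drawTileForRect:rect];   // hypothetical helper that draws just this tile
}
@end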

Big UIImage scrolling performance

I have a big UIImage (2000x2000). The image is drawn each time the app starts and copied into a CALayer.
Currently I put a UIScrollView on the main view and add the CALayer with the drawn image.
Scrolling at higher zoom levels looks fine, but at minimum zoom, when the whole image is visible, scrolling slows down and stops responding quickly to touch moves.
So, the question: what can I do to improve scrolling performance?
The approach I would take is to use a lower resolution version of your image at lower zoom levels (lower = zoomed out).
First, see this post for resizing UIImages.
Respond to the scrollViewDidEndZooming:withView:atScale: method in UIScrollViewDelegate, and switch the images when a certain zoom level is reached. This will take some trial and error to find the correct balance. You may even want to render your image at several different resolutions. Be sure to generate the different sized UIImages in advance so there is no delay while zooming.
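Something along these lines, assuming fullImage and halfSizeImage were both rendered up front (the 0.5 threshold is just a starting point to tune):
- (void)scrollViewDidEndZooming:(UIScrollView *)scrollView
                       withView:(UIView *)view
                        atScale:(float)scale
{
    // Below the threshold the whole image is (nearly) visible, so a
    // half-resolution copy looks the same but is much cheaper to composite.
    self.imageView.image = (scale < 0.5f) ? self.halfSizeImage : self.fullImage;
}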

Big animation iPhone with CCSpriteFrameCache - plist

I have a problem when trying to load a big animation of about 54 images (320x480 each) into CCSpriteFrameCache; I can't use a plist for this. How can I make the animation work? At the moment the animation won't run on the iPhone 2G, 3G, or iPod touch.
Thanks for your help,
John
You won't be able to do it...
Consider playing a video or just animating a small portion of the screen.
Your best bet is to determine why the animation has 54 images that are all the width/height of the screen. This is an unnecessary number of images.
Break the animation down:
Is the background 'static' (does it move around, change constantly, etc.)?
If it moves around a bit but is really part of a much larger "canvas", simply export the entire background canvas and perform the movements yourself using the Cocos2D actions available to you (CCMoveTo, CCJumpTo, CCDelayTime, CCSequence, etc.)
What in the animation moves around, and how does it move around?
Can it be broken into much smaller bits, with the frames for the various "characters" or "movable objects" within the scene exported onto a sprite sheet (saved out via Zwoptex)?
A good animation sequence should be a series of much smaller images, all working together in unison to create the final "animation sequence".
If you break it down, I wouldn't be surprised if you were able to reduce your 54 images at 320x480 each down to a handful of 512x512 spritesheets (ala Zwoptex).
If you're having trouble breaking it down, I'd be happy to look at the final animation and help you figure out what could be minimized to reduce the overhead.
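For reference, once the moving part lives on a sprite sheet, the animation itself is only a few lines (API names are from cocos2d 1.x; character.plist and the frame names are hypothetical):
// Load the sheet produced by Zwoptex (texture + plist of frame rects).
[[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"character.plist"];

CCSprite *character = [CCSprite spriteWithSpriteFrameName:@"character_00.png"];
[self addChild:character];

NSMutableArray *frames = [NSMutableArray array];
for (int i = 0; i < 12; i++) {
    NSString *frameName = [NSString stringWithFormat:@"character_%02d.png", i];
    [frames addObject:[[CCSpriteFrameCache sharedSpriteFrameCache]
                          spriteFrameByName:frameName]];
}

CCAnimation *animation = [CCAnimation animationWithFrames:frames delay:1.0f / 24.0f];
id animate = [CCAnimate actionWithAnimation:animation restoreOriginalFrame:NO];
[character runAction:[CCRepeatForever actionWithAction:animate]];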

Core Animation with contentsRect jerkiness

In my (puzzle) game the pieces are drawn on-screen using a CALayer for each piece. There are 48 pieces (in an 8x6 grid), each 48x48 pixels. I'm not sure if this is just too many layers, but if this isn't the best solution I don't know what is, because redrawing the whole display using Quartz2D every frame doesn't seem like it would be any faster.
Anyway, the images for the pieces come from one big PNG file that has 24 frames of animation for 10 different states (so measures 1152 x 480 pixels) and the animation is done by setting the contentsRect property of each CALayer as I move it.
This actually seems to work pretty well with up to 7 pieces tracking a touch point in the window, but the weird thing is that when I initially start moving the pieces, it's very jerky for the first half second or so, as if the CPU is doing something else, but after that it tracks and updates the screen at 40+ FPS (according to Instruments).
So does anyone have any ideas what could account for that initial jerkiness?
The only theory I could come up with is it's decompressing bits of the PNG file into a temporary location and then discarding them after the animation has stopped, in which case is there a way to stop Core Animation doing that?
I could obviously split the PNG file up into 10 pieces, but I'm not convinced that would help as they'd all (potentially) still need to be in memory at once.
EDIT: OK, as described in the comment to the first answer, I've split the image up into ten pieces that are now 576 x 96, so as to fit in with the constraints of the hardware. It's still not as smooth as it should be though, so I've put a bounty on this.
EDIT2: I've linked one of the images below. Essentially the user's touch is tracked and the offset from the start of the tracking is calculated (pieces can only move horizontally or vertically, and only one place at a time). Then one of the images is selected as the content of the layer (depending on what type of piece it is and whether it's moving horizontally or vertically). Then the contentsRect property is set to choose one 48x48 frame from the larger image with something like this:-
layer.position = newPos;
layer.contents = (id)BallImg[imgNum];
layer.contentsRect = CGRectMake((1.0/12.0) * (float)(frame % 12),
                                0.5 * (float)(frame / 12),
                                1.0/12.0, 0.5);
By the way, my theory about it decompressing the source image afresh each time wasn't right. I wrote some code to copy the raw pixels from the decoded PNG file into a fresh CGImage when the app loads, and it made no difference.
Next thing I'll try is copying each frame into a separate CGImage which will get rid of the ugly contentsRect calculation at least.
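Probably something along these lines, cutting the 48x48 frames out of each 576x96 sheet with CGImageCreateWithImageInRect (a sketch only, assuming BallImg holds the sheet CGImageRefs, as the contents assignment above suggests):
NSMutableArray *frameImages = [NSMutableArray array];
CGImageRef sheet = BallImg[imgNum];   // the 576x96 sheet (12 x 2 frames of 48x48)
for (int f = 0; f < 24; f++) {
    CGRect frameRect = CGRectMake(48.0f * (f % 12), 48.0f * (f / 12), 48.0f, 48.0f);
    CGImageRef frameImage = CGImageCreateWithImageInRect(sheet, frameRect);
    [frameImages addObject:[UIImage imageWithCGImage:frameImage]];
    CGImageRelease(frameImage);
}
// then: layer.contents = (id)[[frameImages objectAtIndex:frame] CGImage];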
EDIT3: Further back-to-basics investigation points to this being a problem with touch tracking and not a problem with Core Animation at all. I found a basic sample app that tracks touches, commented out the code that actually causes the screen to redraw, and the NSLog() output shows exactly the same problem I've been experiencing: a longish delay between the touchesBegan event and the first touchesMoved event.
2009-06-05 01:22:37.209 TouchDemo[234:207] Begin Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.432 TouchDemo[234:207] Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.448 TouchDemo[234:207] Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.464 TouchDemo[234:207] Touch ID 0 Tracking with image 2
2009-06-05 01:22:37.480 TouchDemo[234:207] Touch ID 0 Tracking with image 2
The typical gap between touchesMoved events is 20 ms. The gap between touchesBegan and the first touchesMoved is ten times that, and that's with no computation or screen updating at all, just the NSLog call. Sigh. I guess I'll open this up as a separate question.
I don't think it's a memory issue; I'm thinking that it has to do with the inefficiency of having that large of an image in terms of Core Animation.
Core Animation can't use it natively, as it exceeds the maximum texture size on the GPU (1024x1024). I would break it up some; individual images might give you the best performance, but you'll have to test to find out.
IIRC, UIImageView does its animating by setting successive individual images, so if it's good enough for Apple….
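For what it's worth, the UIImageView route is only a few lines once you have per-frame UIImages (frameImages below being whatever array of 48x48 frames you end up with):
UIImageView *pieceView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 48, 48)];
pieceView.animationImages = frameImages;    // NSArray of UIImage, one per frame
pieceView.animationDuration = 24.0 / 40.0;  // 24 frames at roughly 40 fps
pieceView.animationRepeatCount = 0;         // 0 = repeat indefinitely
[self.view addSubview:pieceView];
[pieceView startAnimating];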
When it comes to performance, I definitely recommend using Shark (even over Instruments). In the 'Time Profile' you can see where the bottlenecks in your app are, even if they're in Apple's code. I used it a lot while developing my iPhone app, which uses OpenGL and Core Animation.
Have you tried using CATiledLayer yet?
It is optimized for this type of work.