I have a large PNG image that I need to zoom and move.
I therefore created a UIScrollView and embedded a UIImageView in it.
The app works fine in the simulator, but when running it on the device (8 GB iPod touch) it crashes as soon as the view is loaded.
A smaller test image (4 MB) works fine, so I suspect the iPod can't handle a 20 MB PNG. I also tried various other formats, such as JPG with different save settings, but that didn't help either.
Any clues how I can solve this?
Ouch, 20 MB is a large image. The first thought that comes to mind is: can you dice up the image? I.e. instead of one image, have a whole bunch of small images which together make up the larger one. Then you can load them on demand, the same way Google Maps downloads image squares.
Have a look at the ScrollViewSuite example from Apple. It sounds exactly like what you are trying to do.
3_Tiling demonstrates:
How to subclass UIScrollView to add content tiling
Reusing tiles to optimize performance and memory use
Changing the resolution of the content in response to zooming
I suggest the PhotoScroller example. It demonstrates CATiledLayer, which you can use to tile your image and even use smaller images as level-of-detail (LOD) images. It's much smaller than the complete ScrollViewSuite example but has everything you need to do what you want. It contains only two classes, which you should be able to use in your project with minor edits.
You might also want to check the WWDC 2010 session #104, "Designing Apps with Scroll Views"; it walks through and explains that example.
You will need to tile your image. I suggest ImageMagick for that :)
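To give a rough idea of what the tile-drawing side looks like, here is a minimal sketch of a CATiledLayer-backed view along the lines of PhotoScroller (the class name, tile size, and tile naming scheme are made up for illustration, not Apple's actual code):

// TiledImageView -- hypothetical class name; a minimal sketch of the
// CATiledLayer approach that PhotoScroller demonstrates.
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface TiledImageView : UIView
@end

@implementation TiledImageView

// Back this view with a CATiledLayer instead of a plain CALayer.
+ (Class)layerClass {
    return [CATiledLayer class];
}

- (id)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        CATiledLayer *tiledLayer = (CATiledLayer *)self.layer;
        tiledLayer.tileSize = CGSizeMake(256.0, 256.0);
        tiledLayer.levelsOfDetail = 4;      // levels kept while zooming out
        tiledLayer.levelsOfDetailBias = 2;  // extra detail levels while zooming in
    }
    return self;
}

// Called once per visible tile, so only the tiles currently on screen
// are ever decoded.
- (void)drawRect:(CGRect)rect {
    CGFloat scale = CGContextGetCTM(UIGraphicsGetCurrentContext()).a;
    int col = rect.origin.x * scale / 256.0;
    int row = rect.origin.y * scale / 256.0;
    // "Map_<scale>_<col>_<row>.png" is an assumed naming scheme for tiles
    // cut ahead of time (e.g. with ImageMagick).
    NSString *name = [NSString stringWithFormat:@"Map_%d_%d_%d", (int)(scale * 100), col, row];
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
    UIImage *tile = [UIImage imageWithContentsOfFile:path]; // avoids imageNamed's cache
    [tile drawInRect:rect];
}

@end

Embedded in a UIScrollView, this only ever decodes the tiles that are actually visible at the current zoom level.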
I have a huge map image (25 MB) that I want to use in my app. The entire point of the app is to pinch-zoom and scroll around this richly detailed map.
I am loading it into a UIImageView, with some subview layers that can be toggled on and off to overlay the map.
Everything is working fine code-wise and in my simulator, but my concern is that the actual iPhone/iPad may have problems rendering and manipulating a 25 MB PNG image.
Anyone have experience with huge image files in an iPhone/iPad app? Any recommendations or concerns about what I have described?
Someone suggested using a UIWebView to display the image. I'm not sure why that would be any better.
Thanks
The best way to handle very large images is to chop them into easily manageable square chunks and load them dynamically. Here you can see a very nice tutorial for a tiled UIScrollView; modifying the code a bit will surely suit your needs.
Check out the ADC Videos, specifically WWDC 2010 Session 104 "Designing Apps With Scroll Views". This shows how to use CATiledLayer.
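For the pinch-zoom and scrolling part, the scroll view setup itself is only a few lines. A rough sketch, where scrollView and mapView are placeholder properties for your scroll view and whatever view renders the map (tiled or not):

// In the view controller that owns the scroll view.
- (void)viewDidLoad {
    [super viewDidLoad];
    [self.scrollView addSubview:self.mapView];
    self.scrollView.contentSize = self.mapView.bounds.size;
    // Allow zooming out until the whole map width fits on screen.
    self.scrollView.minimumZoomScale =
        self.scrollView.bounds.size.width / self.mapView.bounds.size.width;
    self.scrollView.maximumZoomScale = 2.0;
    self.scrollView.delegate = self;
}

// UIScrollViewDelegate: tell the scroll view which subview to scale
// when the user pinches.
- (UIView *)viewForZoomingInScrollView:(UIScrollView *)scrollView {
    return self.mapView;
}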
I need to add a feature to an iPhone app that will let the user zoom in very far on an image and display high-quality detail, without loading the entire large image and without fetching it online.
The images are stored locally.
I've found an example here: developer.apple.com/library/ios/#samplecode/ScrollViewSuite/ but it doesn't seem to zoom in to full detail.
I have seen the Apple developer examples like PhotoScroller and ScrollViewSuite, but it feels like they work differently, just cutting the image into tiles.
Is it possible to access chunks of the data asynchronously and render them to the view at the same time, in the connection:didReceiveData: delegate method?
Please provide some tutorial or example.
Thanks
Avinash
Based on what you have said, PhotoScroller is really what you are describing. It is the only example I know of that can handle huge images and display only what is needed. I have used it for 100-megapixel images and it works great.
I have about 20 high-quality images (~3840x5800 px) that I need to load in a simple gallery-type app. The user taps a button and the next image is loaded into the UIImageView.
I currently use [UIImage imageWithContentsOfFile:], which takes about 6 seconds to load each image in the simulator :(
If I use [UIImage imageNamed:] it takes even longer to load, but it caches the images, which means it's quicker if the user wants to see the same images again. But all that caching may cause memory problems later and crash my app.
I want to know what the best practice is for loading these. I'm experimenting with reducing the image file size as much as possible, but I really need them to be high-quality images for the purpose of the app (zoomable, etc.).
Thanks for any advice
[EDIT]
Hey again guys,
Thanks for all your advice. The project's specs have changed a little. Now, as well as displaying the images, each one first has to be shown zoomed in to a particular spot, and when the user taps next it zooms out and then displays the next image. So I'm not sure if the proposed solutions fit?
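(For reference, a common way to use imageWithContentsOfFile: without blocking the UI is to load on a background queue and only touch the image view on the main thread. A sketch; nextImagePath and self.imageView are placeholder names:)

// Load the next image on a background queue so the multi-second read
// doesn't block the UI, then hand it to the image view on the main thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *image = [UIImage imageWithContentsOfFile:nextImagePath];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });
});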
Apple's docs recommend against trying to load single images that are larger than 1024x1024. You should look into using CATiledLayer instead, to load pieces of the images as needed.
You can have a look at this Apple sample:
http://developer.apple.com/library/ios/#samplecode/PhotoScroller/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010080
It shows how to load big images by breaking them into tiles for different zoom levels.
You can't see all those pixels at any given time, so there is no need to load them all. Load lower-res copies ("big thumbnails") to view the complete image, then load selected sub-tiles, maybe from 2 or more different resolution sets, after the user zooms in.
The CATiledLayer API may be able to handle some of the latter for you.
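A minimal sketch of building such a lower-res copy, where fullImage stands in for the full-resolution UIImage (in practice you would generate these reduced copies ahead of time rather than on the device, since this still decodes the full image once):

// Draw the source image into a quarter-size context to get a
// "big thumbnail" for the zoomed-out view.
CGFloat scale = 0.25; // keep 1/4 of the linear resolution
CGSize thumbSize = CGSizeMake(fullImage.size.width * scale,
                              fullImage.size.height * scale);
UIGraphicsBeginImageContext(thumbSize);
[fullImage drawInRect:CGRectMake(0, 0, thumbSize.width, thumbSize.height)];
UIImage *thumb = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Switch from the thumbnail to tiles (e.g. via CATiledLayer) once the user zooms past the thumbnail's resolution.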
I've run into the issue of using a UIBarButtonItem with a custom color. Everything out on the 'net seems to indicate that the only way around this lack of official API support revolves around the use of images. This is all fine and dandy when developing for pre-iOS 4 devices, but not when targeting the new iPhone 4. Creating an image for the iPad and pre-iOS 4 devices is straightforward enough, but the images developed for those devices look absolutely horrid on the iPhone 4. I suspect this problem will only be exacerbated by the introduction of next-generation devices.
Consider the example below. Notice how the default colored button is nice and smooth, but the iPhone 3GS image looks terrible. It does not seem very scalable (pun intended) to have to include multiple images for different resolution devices.
In the absence of an official API for changing the color of a UIBarButtonItem, what strategies are out there for creating images that scale well across devices with different resolutions? This problem is hardly unique to UIBarButtonItem; how is the community adapting other bitmapped UI elements? Is there a better solution for this particular case than using an image (such as using Quartz to draw it)?
If at all possible, please offer concrete code examples.
You can list any image as Image@2x.png along with Image.png and the system will select the appropriate image at runtime.
If you look at the source for Three20 you can see how they draw custom buttons and shapes that will scale well, regardless of resolution.
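Along those lines, you can skip shipping bitmaps entirely and draw the button background with Core Graphics at the screen's scale, so the same code stays sharp on Retina displays. A sketch; the size, corner radius, and color are arbitrary example values:

// Render a rounded, custom-colored background image for a bar button.
CGSize size = CGSizeMake(60.0, 30.0);
// A scale of 0.0 means "use the main screen's scale", so this yields a
// sharp @2x image on the iPhone 4 automatically (iOS 4+).
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);

CGRect rect = CGRectInset(CGRectMake(0.0, 0.0, size.width, size.height), 1.0, 1.0);
UIBezierPath *path = [UIBezierPath bezierPathWithRoundedRect:rect cornerRadius:6.0];
[[UIColor colorWithRed:0.2 green:0.4 blue:0.8 alpha:1.0] setFill];
[path fill];

UIImage *buttonImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

The resulting buttonImage can then be used the same way as a pre-rendered file.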
Give Opacity (for Mac) a try. Draw your button in it with vector elements and effects, and it'll spit out the necessary Quartz code to reproduce it, drawing natively in your iOS application. You get Retina (@2x) support automatically.
It's been over a year since I posted this question, but I recently ran into a use case where I wanted to be able to do this. Instead of having to draw or otherwise create the buttons, I decided to write an open-source application to create them. The application uses private APIs to change the colors of the UIBarButtonItem objects and then uses a graphics context to save them to a chosen location on your computer's file system. This way you can have pixel-perfect UIBarButtonItem images to use in your UIToolbars.
The app creates both standard and @2x resolution images.
UIBarButtonItem-Generator @ GitHub
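The graphics-context part it describes amounts to rendering a view into an image and writing it out as a PNG; roughly like this sketch, where sourceView and outputPath are placeholders (renderInContext: needs QuartzCore):

#import <QuartzCore/QuartzCore.h>

// Snapshot the styled view at 2x scale and save it as a @2x PNG.
UIGraphicsBeginImageContextWithOptions(sourceView.bounds.size, NO, 2.0);
[sourceView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSData *pngData = UIImagePNGRepresentation(snapshot);
[pngData writeToFile:outputPath atomically:YES];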
Any vector drawing app may work, but I would also consider povray, which lets you create images in a C-like 3D scripting language and then export them at any pixel size you choose.
http://povray.org
I had the same problem with the navigation bar and solved it as follows:
First, I subclassed the navigation bar, and inside this class:
- (void)drawRect:(CGRect)rect
{
    // Size the bar to the image and use the image as its background.
    UIImage *image = [UIImage imageNamed:@"MyImage.png"];
    self.frame = CGRectMake(0, 0, image.size.width, image.size.height);
    self.backgroundImage = image;
}
Finally, save the same image at the higher resolution with @2x at the end of its name.
When a PNG is added to an Xcode iPhone project, the compiler optimizes it using pngcrush. Once on the device, the image's rendering performance is very fast.
My problem is that my application downloads its PNGs from an external source at runtime (from Picasa Web Albums, using the Google Data APIs). Unfortunately, these images' performance is quite bad. When I do custom rendering on top of such an image, it seems 100x slower than with its internally stored counterparts. I strongly suspect this is because the downloaded images haven't been optimized.
Does anyone know how I can optimize an externally downloaded PNG at runtime on the iPhone? I'm hoping for a class that does this. I even considered adding pngcrush's source code to my app, which seems drastic. I haven't been able to find a decent answer myself. I'd be very grateful for any help.
Thanks!
Update:
Some folks have suggested that it may be due to the file's size, but it isn't. During my tests, I added a toggle button to switch between the embedded version and the downloaded version of exactly the same PNG. The only difference is that the embedded one was optimized by 'pngcrush' during compilation. This does some byte-swapping (from RGBA to BGRA) and pre-multiplication of the alpha. (http://iphonedevelopment.blogspot.com/2008/10/iphone-optimized-pngs.html)
Also, the performance I'm referring to isn't the downloading, but the rendering. I superimpose custom painting on top of the image (overriding the drawRect method of the UIView), and it's very choppy when the background is the downloaded version, and very smooth when it's the embedded (and therefore optimized) version. Again, it's exactly the same file. The only difference is the optimization, which I'm hoping I can perform on the image at runtime, on the device, after downloading it.
Thanks again for everyone's help!
That link you posted pretty much answers your question.
During the build process, Xcode pre-processes your PNGs so they're in a format that's friendlier to the graphics chip in the iPhone.
PNGs that have not been processed like this will likely use a slower rendering path, one that deals with the non-native format and the fact that the alpha must be computed separately for each color.
So you have two options:
1. Perform the same work that pngcrush does: swap the byte ordering and pre-multiply the alpha. The speed-up may be due to one or both of these.
2. After you have loaded your image, "create" a new image from it. This new image should be in the iPhone's native format and so should render faster. The downside is that it could take up a bit more memory.
E.g.
// Redraw the downloaded image into a new bitmap context; the result is
// in the device's preferred (premultiplied) format.
CGRect area = CGRectMake(0, 0, oldImage.size.width, oldImage.size.height);
UIGraphicsBeginImageContext(area.size);
[oldImage drawInRect:area];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
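If you'd rather take the first option and control the pixel layout explicitly, a sketch of redrawing into a BGRA, premultiplied-alpha bitmap context (the layout the linked article says iPhone-optimized PNGs use) would look roughly like this:

// Redraw the CGImage into an explicitly BGRA, premultiplied bitmap context.
CGImageRef source = oldImage.CGImage;
size_t width  = CGImageGetWidth(source);
size_t height = CGImageGetHeight(source);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                         colorSpace,
                                         kCGImageAlphaPremultipliedFirst |
                                         kCGBitmapByteOrder32Little);
CGColorSpaceRelease(colorSpace);

CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), source);
CGImageRef converted = CGBitmapContextCreateImage(ctx);
UIImage *fastImage = [UIImage imageWithCGImage:converted];
CGImageRelease(converted);
CGContextRelease(ctx);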
The fact that you say it "seems" 100x slower indicates that you have not performed any experimentation, but made a guess (it must be the PNG optimization), and are now going down a path based on a hunch.
You should spend time to confirm what the problem is before you try to solve it. My gut says that PNG optimization shouldn't be the issue: that mostly affects the loading of images, but once they are in memory it doesn't matter what file format they were originally in.
Anyway, you should try an A-B comparison, either get your code to load an optimized PNG from somewhere else and see how it compares, or make a test app that just does some drawing on the two PNG types. Once you've confirmed what the problem is, then you can figure out if you need to compile pngcrush into your app.
On the surface, it sounds like something else is at play here. Any additional image manipulation should only add time until it's displayed onscreen...
Would it be at all possible to get the server to gzip the images by sending the appropriate HTTP header? (If it even helps file size much, that is.)
Temporarily using the pngcrush source might be a good test as well, just to get some measurements.
Are you storing the PNG at its original downloaded size? If it's a large image, it'll take significantly longer to render.
Well, it seems that a good way to do it (since you can't run pngcrush on the iPhone and expect that to speed things up) would be to make your requests through a proxy that runs pngcrush. The proxy would have the horsepower to actually give you some gain over the 100x slowdown you're seeing.
Try pincrush to convert a normal PNG file into a crushed PNG file.
You say you are drawing on top of the image by overriding a UIView's drawRect: method. Are you trying to do some animation by repeatedly drawing the whole image with your custom stuff on top of it?
You might get better results if you put your custom stuff in a separate view or layer, and let the OS deal with compositing the result over the background. The OS will only update the parts of the screen that you actually change, and won't be repainting the entire image as often.
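Concretely, that separation might look something like this (MyOverlayView is a hypothetical UIView subclass that does your custom drawRect: painting):

// Show the downloaded photo in its own image view and put the custom
// drawing in a transparent view on top, so redrawing the overlay never
// forces the photo underneath to be re-rendered.
UIImageView *photoView = [[UIImageView alloc] initWithImage:downloadedImage];
[self.view addSubview:photoView];

MyOverlayView *overlay = [[MyOverlayView alloc] initWithFrame:photoView.frame];
overlay.opaque = NO;
overlay.backgroundColor = [UIColor clearColor]; // let the photo show through
[self.view addSubview:overlay];                 // composited above photoView

// When the custom content changes, only the overlay is redrawn:
[overlay setNeedsDisplay];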