I'm building an app which includes a number of image sequences (5 sequences with about 80 images each). It runs nicely in the iPhone simulator, but causes my iPhone to reboot when I test it on the device. By the way, each PNG is about 8K on disk.
Has anyone successfully built a similar app?
Am I using too many resources for the iPhone to handle?
Anyone?
UPDATE:
Thanks to all for your answers! I've modified my code to use [UIImage imageWithContentsOfFile:] instead of [UIImage imageNamed:].
However, I'm still unable to prevent the app from crashing my iPhone.
(Please note that my PNGs are not that big: about 400x400 px / 8K each.)
Does anyone have any suggestions?
Here's my code:
// code snippet:
myFrames = [[NSMutableArray alloc] initWithCapacity:maxFrames];
NSMutableString *curFrame;
num = 0;
// loop (maxFrames = 80)
for(int f = 1; f <= maxFrames; f++)
{
    curFrame = [NSMutableString stringWithString:tName];
    // zero-pad the frame number to three digits, e.g. "001" ... "080"
    if(f < 10) [curFrame appendString:[NSString stringWithFormat:@"00%i", f]];
    else if(f > 9 && f < 100) [curFrame appendString:[NSString stringWithFormat:@"0%i", f]];
    else [curFrame appendString:[NSString stringWithFormat:@"%i", f]];
    UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:curFrame ofType:@"png"]];
    if(img) [myFrames addObject:img];
    // note: no [img release] here; imageWithContentsOfFile: returns an autoreleased
    // object, so releasing it (as in the original code) would be an over-release
}
// animate the images!
self.animationImages = myFrames;
self.animationDuration = (maxFrames * .05); // Seconds
[self startAnimating];
The best way to find out is to run the application under Instruments using Leaks or Object Alloc. If memory keeps trending upward, you probably have a leak.
If you're using [UIImage imageNamed:], be aware that it pre-caches an optimized version, which takes up more memory than [UIImage imageWithContentsOfFile:]. Additionally, prior to iPhone OS 3.0 the cache created by [UIImage imageNamed:] didn't get released when there was a memory warning.
The current-gen iPhone only has 128MB of RAM, some of which is used by the OS itself. A 320x480 image, fully uncompressed with an alpha channel, takes about 614K. If you have 400 unique full-screen images, that's well over 128MB of RAM if they are all loaded and cached uncompressed.
The number one reason an app runs fine on the simulator but crashes on the phone is memory.
AFAIK memory on the iPhone simulator is not limited to 128MB, while on the iPhone the device restarts once that limit is reached. So check your memory usage on the simulator. You will have to change the way you are loading the images and/or check for leaks. Also check whether you're getting low-memory warnings by implementing the relevant methods (didReceiveMemoryWarning on your view controllers, applicationDidReceiveMemoryWarning: on the app delegate).
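A minimal sketch of such a handler, assuming the frames are owned by a view controller (the question's code appears to live in a UIImageView subclass, so adapt accordingly; imageView is a hypothetical outlet, myFrames is the array from the question):
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Drop the decompressed frames; they can be reloaded lazily when needed again.
    imageView.animationImages = nil;
    [myFrames removeAllObjects];
}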
I've seen apps run in the simulator and not on the phone because of improper PNG formatting (even a single improperly formatted image can cause this crash). Check to make sure that the format of your images matches those of PNG files provided by apple in their example apps.
That being said, 400 full-screen images would easily cause it to run out of memory, since in memory they occupy far more than 8KB each. Not sure how big those images are, but if they're all in memory at once they will need to be very, very small on the iPhone.
The first answer to your question states that while your PNGs may take up only 8K on disk, that is the compressed on-disk form. When it is loaded into memory, it is decompressed and is much larger than 8K. At 32-bits per pixel, a 400x400 image will be 640K.
Even without the alpha channel, you're looking at 480K per frame. At 80 frames per sequence, that is 38.4MB (and you have five sequences), which is definitely creeping into more memory than the iPhone has available to give your app at once. Here is an article about some of the troubles with obtaining a substantial amount of memory from iPhone OS.
I would like to render multiple H264 mp4 videos on multiple views at the same time. The goal is to read about 8 short videos, each 100x100 pixels in size, and have them display their content at multiple positions on the screen simultaneously.
Imagine 24 squares on the screen, each showing one video out of a pool of 8 videos.
MoviePlayer doesn't work, since it only shows one fullscreen video. An AVPlayer with multiple AVPlayerLayers is limited, because only the most-recently-created layer will show its content on screen (according to the documentation and my testing).
So, I wrote a short video class and created an instance for every .mp4 file in my bundle, using AVAssetReader to read its content. On each update, a video frame is retrieved, converted to a UIImage, and displayed according to the video's frame rate. These images are also cached for fast access when looping.
- (id) initWithAsset:(AVURLAsset*)asset withTrack:(AVAssetTrack*)track
{
    self = [super init];
    if (self)
    {
        // ask the reader for decoded frames as 32BGRA pixel buffers
        NSDictionary* settings = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                  (NSString*)kCVPixelBufferPixelFormatTypeKey, nil];
        mOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:track outputSettings:settings];
        mReader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
        [mReader addOutput:mOutput];
        BOOL status = [mReader startReading];
    }
    return self;
}
- (void) update:(double)elapsed
{
    CMSampleBufferRef buffer = [mOutput copyNextSampleBuffer];
    if (buffer)
    {
        UIImage* image = [self imageFromSampleBuffer:buffer];
        // copyNextSampleBuffer follows the Create rule, so release the buffer when done
        CFRelease(buffer);
    }
    [...]
}
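For reference, imageFromSampleBuffer: is not shown in the question; a typical implementation for the 32BGRA output requested above (essentially the standard Core Video / Core Graphics conversion pattern, offered here as an assumed sketch) would be:
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // requires CoreMedia, CoreVideo and CoreGraphics
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // BGRA pixel buffers map to a little-endian, premultiplied-first bitmap context
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return image;
}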
Actually this works pretty well, but only for 4 videos. The fifth one never shows up. First I thought of memory issues, but I tested it on the following devices:
iPhone 3GS
iPhone 4
iPad
iPad 2
I had the same behaviour on each device: 4 videos playing in a smooth loop, no differences.
If it were a memory issue, I would have expected at least some difference: the iPad 2 showing 5 or 6 videos (due to its better hardware), or the 3GS showing only 1, or a crash somewhere.
The simulator shows all videos, though.
Debugging on the device shows that
BOOL status = [mReader startReading];
returns NO for videos 5, 6, 7, and 8.
So, is there some kind of hardware setting (or restriction) that doesn't allow more than 4 simultaneous AVAssetReaders? Because I can't really explain this behaviour otherwise. I don't think that all devices have the exact same amount of video memory.
Yes, iOS has an upper limit on the number of videos that can be decoded at one time. While your approach is good, I don't know of any way to work around this upper limit as far as having that many h.264 decoders active at once. If you are interested, have a look at my solution to this problem, an Xcode project called Fireworks. Basically, this demo decodes a bunch of alpha-channel videos to disk, then plays each one by mapping a portion of the video file into memory. This approach makes it possible to decode more than 4 movies at the same time without using up all the system memory and without running into the hard limit on the number of h.264 decoder objects.
Have you tried creating separate AVPlayerItems based on the same AVAsset for each AVPlayerLayer?
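In code, that suggestion would look roughly like this (videoURL and videoSquares are hypothetical placeholders, and whether it sidesteps the decoder limit still needs testing on a device):
// requires AVFoundation; keep strong references to the players (e.g. in an array)
// so they are not deallocated while playing
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
for (UIView *square in videoSquares)
{
    // one AVPlayerItem/AVPlayer/AVPlayerLayer per on-screen square, all backed by the same asset
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = square.bounds;
    [square.layer addSublayer:layer];
    [player play];
}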
Here's my latest iteration of a perfectly smooth-scrolling collection view with real-time video previews (up to 16 at a time):
https://youtu.be/7QlaO7WxjGg
It even uses a cover flow custom layout and "reflection" view that mirrors the video preview perfectly. The source code is here:
http://www.mediafire.com/download/ivecygnlhqxwynr/VideoWallCollectionView.zip
I have created an app which has a library of images. I have imported the images into my resources folder in Xcode and am currently loading them into an array in the appDelegate when the app loads, which happens on every launch. I am now worried that as the library grows the app will run out of memory. Is there a better way of loading these?
the structure is
Library > category > image list > image
I have an NSMutableArray called mainLibrary, to which I then add another NSMutableArray for each category. For example:
NSMutableArray *arrHome = [[NSMutableArray alloc] initWithObjects: @"Home",
    [NSDictionary dictionaryWithObjectsAndKeys: [NSString stringWithFormat:@"%@/%@", resourcePath, @"alarm_clock.png"], @"image_path", @"Alarm Clock", @"title", nil],
    [NSDictionary dictionaryWithObjectsAndKeys: [NSString stringWithFormat:@"%@/%@", resourcePath, @"bathtub.png"], @"image_path", @"Bathtub", @"title", nil],
    nil];
[mainLibrary addObject: arrHome];
The above is for the "home" category and contains only two images. For other categories, I have the NSDictionary line for each image - there can be up to 100 images in each category.
If I am loading 50MB+ of images into an array, will this cause the app to become unresponsive, or, more than likely, simply crash as it exceeds the roughly 24MB allowance per app?
I am wondering whether I should keep the above code but only read a category into the array once that category has been selected, though that might then cause a stall for the user.
Any thoughts?!
PS: I also need to think about the Retina display. With 600+ images, duplicating them at @2x would make my binary size rocket!
A while ago I ran some tests for an app I was working on. Loading 5000 images in succession:
[UIImage imageWithData:...] // 44.8 seconds
[UIImage imageWithContentsOfFile:...] // 52.3 seconds
[[UIImage alloc] initWithContentsOfFile:…] // 351.8 seconds
[UIImage imageNamed:...] // hung due to caching
Using the paths to the files is clearly the way to go. Memory management will be key in your case.
Compose your path dictionaries/arrays using strings, then load/unload the images as needed (i.e. lazily), if you need such a structure to interface with in your program.
600 HQ images will exceed the memory limits, and it's wasteful: how many of those images can the user actually see at a given time? Resources are quite limited, and HQ images require a lot of space.
If you have to display a bunch of thumbnails, those can be shrunk and exported easily enough, so you can display all the images in a reasonable amount of time while using a rational amount of memory (again, load lazily where possible and release each image once it is no longer visible), as sketched below.
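A minimal sketch of that lazy-loading idea, reusing the mainLibrary structure from the question (categoryIndex and imageIndex are hypothetical):
NSArray *category = [mainLibrary objectAtIndex:categoryIndex];
// index 0 holds the category name; the image dictionaries start at index 1
NSDictionary *entry = [category objectAtIndex:imageIndex + 1];
UIImage *image = [UIImage imageWithContentsOfFile:[entry objectForKey:@"image_path"]];
// hand the image to the visible view, and drop it (set the view's image to nil)
// once it scrolls off screen so the memory can be reclaimed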
Why not use [UIImage imageNamed:IMAGE_NAME] directly? It has good memory management built in.
Hello, I'm working on an iPhone application which presents information with images and text. Every text has one image, which can be tapped and zoomed, shown with a UIImageView:
NSString* imgName = [imgPath substringToIndex:[imgPath rangeOfString:@".jpg"].location];
UIImage* img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:imgName ofType:@"jpg"]];
[imgView setImage:img];
As I go through the images, opening them one by one, the app crashes (debugging on the device) with an error in the console:
: Decompression error
my_app_name(1226,0x3e088868) malloc: *** mmap(size=32768) failed (error code=12)
*** error: can't allocate region
and then:
CoreAnimation: failed to allocate 2228352 bytes.
I don't have a leak in the code, and if I don't open the images I don't get the error. So does anyone have a clue where this problem could be?
Oh, I think I finally fixed it. And yes, my images are relatively large, about 700x600 pixels.
The problem seems to be in [imgView setImage:img]; the img was released but was somehow still kept in memory, I don't know why. One line of code, [imgView setImage:nil]; before releasing imgView in dealloc, fixes the problem.
Thanks for the help.
It seems you are using too much memory.
How many images do you open? Run with Instruments attached and watch the memory footprint.
Keep in mind that images take much more memory when loaded than they do compressed on disk.
Try wrapping your allocations/releases in a local autorelease pool.
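A minimal sketch of that, reusing imgName and imgView from the question and assuming pre-ARC manual memory management:
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
UIImage *img = [UIImage imageWithContentsOfFile:
    [[NSBundle mainBundle] pathForResource:imgName ofType:@"jpg"]];
[imgView setImage:img];  // imgView retains the image it needs
[pool drain];            // autoreleased objects from the load are freed here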
I am using the new Retina display feature of iOS 4.0 in my iPhone application.
I added the higher-resolution images to my existing image folder, following the image@2x.png naming convention.
E.g. I am adding the image in the following way:
UIImageView *toolBarBg = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 88)];
NSString *toolBarBgImgPath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"bg_btm_bar.png"];
[toolBarBg setImage:[UIImage imageWithContentsOfFile:toolBarBgImgPath]];
I have the image named "bg_btm_bar@2x.png" in my image folder as well.
But when I run my application it does not pick up the higher-res image.
I don't understand how to make the application use the higher-res image.
Please help me out!
Use imageNamed: instead of imageWithContentsOfFile:
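Building the path by hand with stringByAppendingPathComponent: bypasses the @2x lookup entirely, whereas imageNamed: picks the Retina variant (and sets the correct scale) for you. Reusing the view from the question, the sketch is simply:
// imageNamed: resolves bg_btm_bar@2x.png automatically on Retina devices
[toolBarBg setImage:[UIImage imageNamed:@"bg_btm_bar.png"]];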
So far I've managed to create an app for iPhone that takes multiple images with about a 3-second interval between each. I'm processing each image in a separate thread asynchronously, and everything is great until it comes time to save the image to the iPhone's disk. Then it takes about 12 seconds to save the image using the JPEG representation.
How does Apple do it? How do they manage to save a single image to disk so fast; is there a trick they are using? I see that the animations distract the user for a while, but still, the time needed is well below 12 seconds!
Thanks in advance.
Actually Apple uses its kernel driver AppleJPEGDriver. It is a hardware JPEG encoding API and is much faster than software encoding (UIImageJPEGRepresentation), and some people use it in their jailbreak apps (Cycorder, the video recording application).
Apple should give the same functionality to its users, but they are Apple :)
I haven't tried this, but I wouldn't be so sure that Apple isn't using the same methods. A big part of the Apple design philosophy relies on hiding operational interruptions from the user. The Apple code may take as much time as yours but simply be adroit at hiding the entire save time from the user's perception.
If someone can't tell you how Apple actually saves faster, I would suggest looking at ways to disguise the save time.
If you Google around a bit, you'll find a whole bunch of people with the same problem.
I didn't find an answer. The general conclusion seems to be that Apple either uses some internal API and bypasses the public API overhead, or uses a hardware encoder.
Guess you are out of luck for fast image saving.
I was having this problem in my app: saving would hang, so I used Grand Central Dispatch.
Below is the setImage: method from my image cache class. If the UIImage is non-nil it saves it, otherwise it deletes the cached file. You can hopefully adapt this to suit your needs; it will only work on iOS 4+. The code is ARC-enabled.
-(void)setImage:(UIImage *)image{
    if (image == nil){
        NSLog(@"Deleting Image");
        // Since we have no image, remove the cached image if it exists
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                       NSUserDomainMask, YES) objectAtIndex:0];
            [[NSFileManager defaultManager] removeItemAtPath:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] error:nil];
        });
    }
    else {
        NSLog(@"Saving Image");
        // We've got an image; save it to flash memory off the main thread.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath =
                [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                     NSUserDomainMask, YES) objectAtIndex:0];
            // Note: this writes PNG data to a file named .jpg; use
            // UIImageJPEGRepresentation(image, 0.8) if you actually want JPEG output.
            NSData *dataObj = UIImagePNGRepresentation(image);
            [dataObj writeToFile:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] atomically:NO];
        });
    }
    imageCache = image;
}