Understanding the memory consumption on iPhone

I am working on a 2D iPhone game using OpenGL ES and I keep hitting the 24 MB memory limit – my application keeps crashing with error code 101. I have tried really hard to find where the memory goes, but the numbers in Instruments are still much bigger than I would expect.
I ran the application with the Memory Monitor, Object Alloc, Leaks and OpenGL ES instruments. When the application gets loaded, free physical memory drops from 37 MB to 23 MB, Object Alloc settles around 7 MB, Leaks shows two or three leaks a few bytes in size, the Gart Object Size is about 5 MB, and Memory Monitor says the application takes up about 14 MB of real memory. I am perplexed as to where the memory went – when I dig into the Object Allocations, most of the memory is in textures, exactly as I would expect. But both my own texture allocation counter and the Gart Object Size agree that the textures should take up somewhere around 5 MB.
I am not aware of allocating anything else that would be worth mentioning, and the Object Alloc agrees. Where does the memory go? (I would be glad to supply more details if this is not enough.)
Update: I really tried to find where I could be allocating so much memory, but with no results. What drives me wild is the difference between the Object Allocations (~7 MB) and the real memory usage shown by Memory Monitor (~14 MB). Even if there were huge leaks or huge chunks of memory I forgot about, they should still show up in the Object Allocations, shouldn’t they?
I’ve already tried the usual suspects, i.e. UIImage with its caching, but that did not help. Is there a way to track memory usage “debugger-style”, line by line, watching each statement’s impact on memory usage?
What I have found so far:
I really am using that much memory. It is not easy to measure the real memory consumption, but after a lot of counting I think the memory consumption is really that high. My fault.
I found no easy way to measure the memory used. The Memory Monitor numbers are accurate (these are the numbers that really matter), but the Memory Monitor can’t tell you where exactly the memory goes. The Object Alloc tool is almost useless for tracking the real memory usage. When I create a texture, the allocated memory counter goes up for a while (reading the texture into the memory), then drops (passing the texture data to OpenGL, freeing). This is OK, but does not always happen – sometimes the memory usage stays high even after the texture has been passed on to OpenGL and freed from “my” memory. This means that the total amount of memory allocated as shown by the Object Alloc tool is smaller than the real total memory consumption, but bigger than the real consumption minus textures (real – textures < object alloc < real). Go figure.
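For illustration, the loading pattern I am describing looks roughly like this (a simplified sketch rather than my actual code; error handling omitted):

    #import <UIKit/UIKit.h>
    #import <OpenGLES/ES1/gl.h>

    // Simplified load-upload-free cycle. The CPU-side pixel buffer is what
    // Object Alloc sees; the expanded copy OpenGL keeps after glTexImage2D()
    // is what it often does not.
    GLuint loadTexture(NSString *name) {
        NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"png"];
        UIImage *image = [UIImage imageWithContentsOfFile:path];
        size_t w = CGImageGetWidth(image.CGImage);
        size_t h = CGImageGetHeight(image.CGImage);

        // Expand the compressed image into raw RGBA8888 -- four bytes per pixel.
        void *pixels = calloc(w * h, 4);
        CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(pixels, w, h, 8, w * 4, rgb,
                                                 kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(rgb);
        CGContextDrawImage(ctx, CGRectMake(0, 0, w, h), image.CGImage);
        CGContextRelease(ctx);

        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)w, (GLsizei)h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);

        // The driver now holds its own copy, so the CPU-side buffer must go.
        free(pixels);
        return tex;
    }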
I misread the Programming Guide. The memory limit of 24 MB applies to textures and surfaces, not the whole application. The actual red line lies a bit further, but I could not find any hard numbers. The consensus is that 25–30 MB is the ceiling.
When the system gets short on memory, it starts sending memory warnings. I have almost nothing to free, but other applications do release some memory back to the system, especially Safari (which seems to be caching websites). When the free memory shown in the Memory Monitor reaches zero, the system starts killing applications.
I had to bite the bullet and rewrite some parts of the code to be more memory-efficient, but I am probably still pushing it. If I were to design another game, I would certainly think about some resource paging. With the current game it’s quite hard, because everything is in motion all the time and loading textures gets in the way, even if done in another thread. I would be very interested in how other people solve this issue.
Please note that these are just my observations, which may not be entirely accurate. If I find out anything more to say on this topic, I will update the question. I’ll keep the question open in case somebody who understands the issue would care to answer, since all of this is more workarounds and guesses than anything else.

I highly doubt this is a bug in Instruments.
First, read this blog post by Jeff Lamarche about OpenGL textures. It has a simple example of how to load textures without causing leaks, and it gives an understanding of how much memory "small" images actually use once they are loaded into OpenGL.
Excerpt:
Textures, even if they're made from compressed images, use a lot of your application's memory heap because they have to be expanded in memory to be used. Every pixel takes up four bytes, so forgetting to release your texture image data can really eat up your memory quickly.
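To put numbers on that: a 512 × 512 PNG that might be only a few dozen kilobytes on disk still expands to 512 × 512 × 4 = 1 MB of RGBA data, and a 1024 × 1024 image to 4 MB, so a handful of large textures can dominate an app's footprint (the file sizes here are illustrative).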
Second, it is possible to debug texture memory with Instruments. There are two profiling configurations: OpenGL ES Analyzer and OpenGL ES Driver. You will need to run these on the device, as the simulator doesn't use OpenGL. Simply choose Product->Profile from Xcode and look for these profiles once Instruments launches.
Armed with that knowledge, here is what I would do:
Check that you're not leaking memory -- this will obviously cause this problem.
Ensure you're not accessing autoreleased memory -- a common cause of crashes.
Create a separate test app and play with loading textures individually (and in combination) to find out what texture (or combination thereof) is causing the problem.
UPDATE: After thinking about your question, I've been reading Apple's OpenGL ES Programming Guide and it has very good information. Highly recommended!

One way is to start commenting out code and checking to see if the bug still happens. Yes it is tedious and elementary, but it might help if you knew where the bug was.
Knowing where it is crashing goes a long way toward knowing why it is crashing.

Hrmm, that's not many details, but if Leaks doesn't show you where the leaks are, there are two important possibilities:
[i] Leaks missed a leak
[ii] The memory isn't actually being leaked
Fixing [i] is quite hard, but as Eric Albert said, filing a bug report with Apple will help. [ii] means that the memory you're using is still accessible somewhere, but perhaps you've forgotten about it. Are any lists growing without throwing out old entries? Are any buffers being realloc()ed a lot?
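A typical shape of case [ii] looks like the sketch below: Leaks reports nothing, because the pointer stays reachable, yet the memory only ever grows (the buffer here is a made-up example).

    #include <stdlib.h>

    // Reachable-but-forgotten memory: nothing is "leaked", since `samples`
    // still points at the block, but the buffer keeps doubling forever
    // unless it is capped or trimmed.
    static float  *samples  = NULL;
    static size_t  capacity = 0;
    static size_t  count    = 0;

    void appendSample(float s) {
        if (count == capacity) {
            capacity = capacity ? capacity * 2 : 256;
            samples  = realloc(samples, capacity * sizeof *samples);
        }
        samples[count++] = s;
    }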

For those seeing this after the year 2012:
The memory actually loaded into the device's physical memory is the Resident Memory in the VM Tracker instrument.
The Allocations instrument only tracks memory created by malloc/[NSObject alloc] and some framework buffers; for example, a decompressed image bitmap is not included in the Allocations instrument, yet it often takes up most of your memory.
Watch WWDC 2012 Session 242, "iOS App Performance: Memory", to get this information from Apple.

This doesn't specifically help you, but if you find that the memory tools don't provide all the data you need, please file a bug at bugreport.apple.com. Attach a copy of your app and a description of how the tools are falling short of your analysis and Apple will see if they can improve the tools. Thanks!

Related

iPhone Memory, What to trust?

I have been having some memory problems with an app. I'm getting to a stage now where nothing really gives me a solid answer in terms of memory.
At first I used the Allocations profiler, which I don't think works that well at all; I think this is because most of my code is in Objective-C++, which means it can't track the memory correctly.
The Allocations profiler tells me the app uses 32 MB of memory, and around this point it reports low memory and sometimes crashes. However, in other parts of the application it has used up to 40 MB and never crashed.
I found this code chunk:
http://landonf.bikemonkey.org/code/iphone/Determining_Available_Memory.20081203.html
That appears to tell me I'm using 70 MB of memory, and when I get the low-memory warning it says I have 2-4 MB of unused memory left. That seems more reasonable, but it's almost double what the profiler says!
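For reference, the kind of Mach query that code performs looks roughly like this (my own sketch, not the exact code from that page):

    #include <mach/mach.h>
    #include <mach/mach_host.h>

    // Free physical memory on the device -- roughly the "unused memory" figure.
    vm_size_t freeMemoryBytes(void) {
        mach_port_t host = mach_host_self();
        vm_size_t pageSize = 0;
        host_page_size(host, &pageSize);

        vm_statistics_data_t stats;
        mach_msg_type_number_t count = HOST_VM_INFO_COUNT;
        if (host_statistics(host, HOST_VM_INFO, (host_info_t)&stats, &count) != KERN_SUCCESS)
            return 0;
        return stats.free_count * pageSize;
    }

    // Resident size of the current process -- what the app itself occupies.
    vm_size_t residentMemoryBytes(void) {
        struct task_basic_info info;
        mach_msg_type_number_t size = TASK_BASIC_INFO_COUNT;
        if (task_info(mach_task_self(), TASK_BASIC_INFO, (task_info_t)&info, &size) != KERN_SUCCESS)
            return 0;
        return info.resident_size;
    }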
The only thing I can think of is just ignoring it all and reducing the amount of memory used by my app as much as possible.
Ignoring it all and reducing the amount of memory your app uses is actually a good way to proceed. Make sure that you're responding to memory warnings by purging anything in memory that you don't need. Remember that different devices have different amounts of memory, and you may need to use even less than you think, at least if you want to support those older devices.
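A minimal sketch of that purge-on-warning idea (the property names here are made up):

    // In a view controller: drop anything that can be rebuilt or reloaded later.
    - (void)didReceiveMemoryWarning {
        [super didReceiveMemoryWarning];          // may also release the off-screen view
        [self.thumbnailCache removeAllObjects];   // hypothetical NSMutableDictionary cache
        self.preloadedLevelData = nil;            // hypothetical preloaded resources
    }

Objects that are not view controllers can observe UIApplicationDidReceiveMemoryWarningNotification to do the same.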

If malloc() is considered to cause "dirty memory", does free() clean it up?

I've heard rumors that calling malloc leads to so called "dirty memory", which you can see in the VM Tracker instrument.
Now, rumors also say one must try to keep the amount of dirty memory as low as possible. But what they didn't talk much about was how to undirty it again.
Sometimes there's no other option than using malloc(). Heck, I love malloc(). For example when creating audio sources for OpenAL, one must malloc() a lot of data.
So: my app calls malloc() and free() all over the place, and I always believed that's fine. Am I creating a huge problem by doing that? Or will free() always "clean it up"? I'm a bit confused because some very big guys at a very big company warned that malloc() must be avoided as much as possible, because of this dirty memory problem.
Maybe someone can un-confuse me about that.
I seriously doubt this is true. All memory allocation in Cocoa is eventually done via malloc, so sayeth Apple's Memory Usage Performance Guidelines. Quoting from that document:
Because memory is such a fundamental resource, Mac OS X and iOS both provide several ways to allocate it. Which allocation techniques you use will depend mostly on your needs, but in the end all memory allocations eventually use the malloc library to create the memory. Even Cocoa objects are allocated using the malloc library eventually.
I don't know about your big guys at your big company, but I've known big guys at big companies that didn't know squat. Just sayin'. Documentation trumps rumors every time. :)
I don't know what they mean by "dirty" and "clean". Possibly they are referring to the problem of fragmentation. Doing lots of allocs and frees can cause fragmentation problems, but it really depends on the usage patterns and block sizes you are allocating. In general, don't worry about using malloc and free. If you have a real reason to avoid the standard allocator, you can use your own allocator. Then you just call malloc once for a huge block that you can use as the basis of your custom allocator.
If you malloc and free same size memory blocks multiple times, the memory will be reused instead of accumulating dirty VM pages. So it's perfectly safe as long as you know the max of all possible fragment sizes ever allocated by your app at any one time, and that keeps your app under the OS kill limit.
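For the OpenAL case from the question, the usual pattern looks roughly like this (a sketch; loadPCMData() stands in for whatever decoder fills the buffer):

    #import <Foundation/Foundation.h>
    #include <OpenAL/al.h>
    #include <stdlib.h>

    // Hypothetical decoder that malloc()s and fills a raw 16-bit mono PCM buffer.
    void *loadPCMData(NSString *path, ALsizei *size, ALsizei *freq);

    ALuint makeSoundBuffer(NSString *path) {
        ALsizei size = 0, freq = 0;
        void *pcm = loadPCMData(path, &size, &freq);

        ALuint buffer;
        alGenBuffers(1, &buffer);
        // alBufferData() copies the samples into OpenAL-owned memory...
        alBufferData(buffer, AL_FORMAT_MONO16, pcm, size, freq);
        // ...so the heap block can be freed right away and reused by later mallocs.
        free(pcm);
        return buffer;
    }

(The exception is Apple's alBufferDataStatic extension, which does not copy; with that one you must keep the block alive until the buffer is deleted.)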
You are correct - free() will simply clean it up again.

Memory issues with Cocoa Touch

First let me start off by saying I do not believe I am leaking, but I could be wrong. My issue is that after my app is done loading I have about 10-20 MB of live bytes according to Object Alloc, which I am fine with. However, according to Activity Monitor my process allocation is about 70-80 MB, which needless to say is a bit high. To make matters worse, when I go on to load the next screen of my app I need to pull more data to build it, which then sends my process allocation up to 100 MB+ or so; needless to say this is much too high, and the next action after this causes my app to crash due to low memory warnings. Is there any way to reduce the process allocation memory?
Belief can sometimes be validated with cold hard data. Have you tried using Leaks? Allocations is another helpful memory profiling performance tool provided by Instruments. The way to reduce memory allocation is to not use as much. ;-)
What are you allocating that takes up 100+ MB? Images? Video?
The question is, are you releasing memory when you're done with it. Instruments can help you answer that question. If you show some code, the community can try to help you troubleshoot as well.
Best regards.

Will Apple reject my iPhone application because of increasing memory usage?

I have an application with multiple views. It works fine without any leaks or crashes. But when I run it with the Leaks performance tool, I see that when I switch through multiple views and come back to the home screen, the overall size of the application has increased. For example, if it is 1.53 MB at the start, visiting 4-5 different views and getting back to the home screen increases the consumption to around 1.58 MB, sometimes less, but definitely greater than 1.53 MB.
I tried resolving this issue but was not able to figure out where I am going wrong, since there are no memory leaks.
Does anyone know what could be the problem?
Will Apple reject my application on this basis?
I would go back and forth between the screens many many many many times (at least one hundred times). If the memory continues to grow (linearly) during that time, you have a problem. If the memory stabilizes, you might be okay.
Definitely keep trying to fix your memory leaks. But if it's small, I doubt Apple will notice it. I mean, their own apps leak some too. You could get rejected for it, sure. But realistically, leaking a few bytes here and there shouldn't prevent an approval by itself.
(Source: 2 apps approved, one with the same issue, a tiny little memory leak I couldn't track down. I submitted it and it was approved. Shortly after, I found and fixed the leak and released the fix as part of an update.)
If an application has an increasing memory footprint in a known stable state (call it state A) after going into and coming back from a state B that should have no persistent effect on state A, and there are no memory leaks, this problem is called (as far as I know) lingering memory.
Checklist to be sure you have a lingering memory problem:
The app has no memory leaks, or at least no leaks in non-system code, when profiled with Instruments.
State A and state B are individually stable states, as in a state machine.
State B has no permanent effect on state A or its memory. State A could be a gateway, a menu leading to other states such as state B or state C, but the child states have no or limited information about state A and make no changes to it.
On state-change loops that start and end at the root state, for example A->B->A, A->C->A, A->B->C->A, you encounter increasing memory usage in state A. Memory usage in the other child states is not important.
To spot and solve this problem, profile your app with Instruments, but instead of monitoring leaks, monitor allocations and total memory. Every time your app gets to state A, including at launch, take a memory snapshot (there is a button for that :D). After the snapshot, go to state B, state C and use your application as it is supposed to be used. After coming back to the root state, in this example state A, take another snapshot. Instruments will show you the memory allocations and the difference (delta) in total memory between snapshots. It will also tell you, where possible, which object the memory was allocated for and when. If it was your code, you will probably see the class and the allocation point. Instruments cannot tell you when the object should have been released, but once you have found the lingering object or memory, figuring out the deallocation point should be much easier.
BUT! Do not forget:
OS and framework code can have leaks and lingering memory problems, like on every OS. If you are sure that it is not your code leaking or lingering in memory, then everything is fine. That was the case in my app, and it got approved (app: Tusudoku). System functions often use additional memory if it is available, but they release it immediately when they receive a memory warning. Although devices have limited memory, memory that sits unused is wasted, and using it does not make the memory chip draw measurably more current. Using memory up to the limits for performance and releasing it immediately when someone definitely needs it is the best possible practice. These caches do not tend to grow linearly over time, but you should force a memory warning every time the app gets back to the root state, in this example state A. That way you can be sure that any cache memory allocated by the system or frameworks has been deallocated before you take the snapshot.
Most of the apps on the App Store have memory leaks and other memory problems. The question is how this affects the user. Non-linear lingering memory whose growth slows down quickly generally won't be a reason for rejection. Say you calculated the memory usage of a perfectly working app as 15 MB; if it works and you can say it will never exceed a 20 MB maximum, you are good to go, and you can fix your memory problems later. But if your application has linearly (or worse) increasing memory usage and cannot release that memory when needed, that will be a critical problem.
For more information about memory usage, please consider reading the official documentation and watching the WWDC videos (that's where I learned all about memory fixes using Instruments).
There is no set in stone answer.
On the one hand, the fact that your application may have an obscure memory leak would be enough to reject it according to the posted policies.
On the other hand, documents submitted to the FCC by Apple (in the AT&T+Apple vs. Google monopoly fight) give enough detail to work out just how much goes into reviewing an app - unless Apple lied, the average app is reviewed by 2 people, and each of them spends around 5 minutes and 38 seconds (assuming Apple doesn't give breaks) to determine whether your app passes or fails.
So the answer largely depends on whether this memory leak can be discovered in the first 5 minutes of examination by some of the most overworked testers in the industry.
If you are using UIImageViews in your views, then part of the extra memory could be the caching that it does. See here.
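For reference, the cached and uncached ways of loading the same image (the file names are just examples):

    // Cached: UIKit may keep the decoded bitmap around for reuse, and that
    // cache is largely outside your control.
    UIImage *cached = [UIImage imageNamed:@"background.png"];

    // Uncached: the decoded bitmap lives only as long as your own references
    // to it -- usually the better choice for large, one-off images.
    NSString *path = [[NSBundle mainBundle] pathForResource:@"background" ofType:@"png"];
    UIImage *uncached = [UIImage imageWithContentsOfFile:path];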
Sometimes when we load views and then switch to another, we leave the old view around. For example, you might have a root view controller that keeps all the views as retained properties. Normally when you remove a subview it is released, but not if it is still retained by your view controller. As you can see, that adds up to memory consumed but not freed. It's not a leak, except that it gets released only when you release or remove the root view controller.
You could try to go through and find places where memory is tied up like this, or you could justify it based on the added speed of going through views without having to wait for them to reload.
In summary, it is good to know why your views and other objects consume and hold on to memory, but you may find that all those uses are justified and you want to keep things that way. Having said that, I don't think Apple will be rejecting your app for decisions like this. If your app crashes because of memory usage, then that would get it rejected.
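As a sketch of releasing such a view in the manual retain/release era (the property name is made up):

    // Removing a subview only releases the superview's retain; if a view
    // controller property still retains it, the memory stays around until
    // that reference is cleared as well.
    [self.scoreboardView removeFromSuperview];
    self.scoreboardView = nil;   // releases the last retain, so the view can be freed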
You're describing very typical memory usage.
If your app runs out of memory and crashes while they're testing it, they will reject it. Beyond that, you're fine.

iPhone Out of Memory WEIRD crashing

My app crashes after about 20 minutes with status 101 (Out of Memory, I believe)
Debugging using Instruments - ObjectAlloc and Leaks gives me no clues. The ObjectAlloc graph stays at a nice constant level of around 1 million bytes (1MB), as does the Net # of allocations. I have got rid of all leaks.
I thought it could be something to do with number of threads, but graphing these in ObjectAlloc also shows them to be constant.
Can anyone point me in the direction of another tool, or another avenue of investigation?
Fix everything Clang finds: run the LLVM/Clang static analyzer.
Remember that objects allocated by the system (and that includes things like images and sounds) don't get tracked in Instruments (although the top-level retain counts do, of course). So it's feasible that you're loading images, say, which won't contribute much to your memory usage as shown, but can drain a lot of actual memory!
If none of this strikes any chords, you could try the subtractive debugging approach - (take a copy of your project) cut out chunks of functionality until the problem goes away or you get the smallest possible thing that reproduces it. That should at least help you to find where the bottleneck is. Admittedly this will be hard (a) because you'll have to wait 20 minutes or so every time you test (but if you make this a background procedure it's not so bad) and (b) because the nature of memory problems is that there may not be one single cause, but a critical mass of smaller causes.
Good luck!
My experiences with Object Alloc have not been that great. It does not always give you the actual memory used by your application.
Instead, use Object Alloc with Activity Monitor. Make sure you use the "Physical Memory Free" and "Physical Memory used" options in the activity monitor. That will tell you exactly how much memory your application is using.
What do you mean by "nice constant level"? It does not rise over time at all? How much memory in total? It could just be that the phone needs some memory for another app and yours is a little too big to stay up.
The error code 101 means that iPhone OS force quit your app. If you're using UIImageViews in your application, be sure to manage the memory on them. I've found that once my application goes over 10/12 MB, the iPhone terminates it.
If you're not using any image views (or large images), then your backend code is eating up too much space.
All I can say is you need to look at your allocation more carefully and manage what views you keep in memory at any one time.
Run your application in Instruments (Run -> Start with Performance Tool -> Leaks) to see where your memory is getting allocated.
Hope this helps!