Behavior of "Created & Destroyed" Allocation Lifespan in Instruments

When using the Allocations instrument in Instruments, you can choose among three allocation lifespans. The first two seem obvious:
"All Objects Created" - Every object
"Created & Still Living" - Every object still in memory
What about the third one: "Created & Destroyed"?
Is this:
1) Every object that was both created and destroyed during the selected timespan
OR
2) Every object that was destroyed during the selected timespan, regardless of when it was created since the beginning of the run
The Instruments guide doesn't actually describe the behavior of any of these options. I assume it is option 1, but wonder if anybody knows for sure.

You can see the differences between the allocation lifespans by choosing Call Trees from the jump bar. The Bytes Used column illustrates the following formula:
All Objects Created = (Created & Still Living) + (Created & Destroyed)
From what I've seen, Instruments follows behavior #2 for the Created & Destroyed lifespan: it shows the objects that were destroyed during the selected timespan, regardless of when they were created.

Related

High Memory Allocation debugging with Apple Instruments

I have an app written in Swift that works fine initially, but over time it gets sluggish. I opened an Instruments profiling session using the Allocations and Leaks instruments.
What I found is that allocations increase dramatically during an operation that should only overwrite existing data.
The memory in question is in the group <non-object>.
Opening this group reveals hundreds of different allocations, with libvDSP listed as the responsible library for all of them. From this I conclude that a vDSP call is not releasing memory properly. However, double-clicking any of these entries does not show me any of my code, only raw assembly that I do not understand.
The function that calls vDSP is wrapped like this:
func outOfPlaceComplexFourierTransform(
    setup: FFTSetup,
    resultSize: Int,
    logSize: UInt,
    direction: FourierTransformDirection) -> ComplexFloatArray {

    let result = ComplexFloatArray.zeros(count: resultSize)
    self.useAsDSPSplitComplex { selfPointer in
        result.useAsDSPSplitComplex { resultPointer in
            vDSP_fft_zop(
                setup,
                &selfPointer,
                ComplexFloatArray.strideSize,
                &resultPointer,
                ComplexFloatArray.strideSize,
                logSize,
                direction.rawValue)
        }
    }
    return result
}
This is called from another function:
var mags1 = ComplexFloatArray.zeros(count: measurement.windowedImpulse!.count)
mags1 = (measurement.windowedImpulse?.outOfPlaceComplexFourierTransform(setup: fftSetup, resultSize: mags1.count, logSize: UInt(logSize), direction: ComplexFloatArray.FourierTransformDirection(rawValue: 1)!))!
Within this function, mags1 is manipulated and overwrites an existing array. My understanding was that mags1 would be deallocated once the function finished, since it is only visible inside it.
This function is called many times per second at times. What should only take about 5 MB very quickly grows by two hundred megabytes in a couple of seconds.
Any pointers, either to further investigate the source of the leak or to properly deallocate this memory once finished, would be appreciated.
I cannot believe I solved this so quickly after posting (I genuinely spent several hours pulling my hair out).
Not shown in the code above: I was creating a new FFTSetup every time this function was called. That is memory-intensive, and the memory was never reused.
In Instruments, looking at the call tree, I was able to see the function using this memory.
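As a hedged sketch of the fix (the wrapper class and names here are illustrative, not from the original code, but vDSP_create_fftsetup and vDSP_destroy_fftsetup are the actual Accelerate calls): create the FFTSetup once, reuse it for every transform, and destroy it when you are done:

```swift
import Accelerate

// Illustrative wrapper: owns a single FFTSetup for the lifetime of the
// object instead of creating a new one per transform.
final class FourierTransformer {
    let setup: FFTSetup
    let logSize: vDSP_Length

    init?(logSize: vDSP_Length) {
        // Allocate the twiddle-factor tables once, up front.
        guard let setup = vDSP_create_fftsetup(logSize, FFTRadix(kFFTRadix2)) else {
            return nil
        }
        self.setup = setup
        self.logSize = logSize
    }

    deinit {
        // Release the memory allocated by vDSP_create_fftsetup exactly once.
        vDSP_destroy_fftsetup(setup)
    }
}
```

Every call to outOfPlaceComplexFourierTransform would then receive this shared setup rather than allocating its own.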

Swift property loaded in memory before initialisation is called

In my app it looks like a property is loaded in memory before it's even initialised. Either there's something wrong with my understanding, or there's something I've overlooked. To illustrate, I've put a breakpoint shortly after signIn is tapped, as you can see in the image (line 226):
I'm expecting to initialise a property let user = User() on line 230. So I have no idea how user could already be loaded in memory, as seen at the bottom left of the attached image.
Why does this happen?
Is this some kind of code optimisation that happens under the hood?
I can confirm that user has not been loaded or initialised prior to that (it's not a class property). Additionally, commenting out line 230 and below results in user not being loaded, so line 230 appears to be the cause of user being loaded. The strange thing is that this happens before the line is executed, as I've paused execution on line 226 with a breakpoint.
It's important to be able to read the variables list. All local variables are always shown, even if they have not been initialized yet, because storage has already been set aside for them. Before initialization, therefore, the variable is shown, but its value may be bogus. My guess is that the value is bogus: it says it's a User because that is its type, but at the moment you are looking, it is pointing at garbage, and you should ignore it. After the execution path passes through the initialization, the value will change to the real one.
(However, if this is a Release build, there might indeed be optimization of some sort, because this is a constant whose value does not depend on preceding code, so it can be allocated at any time. But you should not be debugging a Release build.)
It hasn't been loaded into memory. The debugger sees the variable in scope and displays it but the number it's showing is whatever is lying around in memory. If you step past your let user... statement, you should see the value of the variable change to something more like the other object addresses in your picture.

How to clean the memory properly (conceptual, Swift)

I have a loop where I append objects to an array and remove them after a while in a background task (dispatch_async(backgroundQueue, { ... })).
During this loop my array theoretically holds at most a fixed number of objects, because I am removing the old ones.
What I observe is that memory consumption climbs (over 400 MB) until didReceiveMemoryWarning is called, and sometimes the app just crashes. So it looks like my cleanup doesn't work. BUT:
when I stop in the middle of the loop and wait a few seconds (say 10), I see the memory slowly drop to my expected value (60 MB).
So I guess I'm doing this conceptually wrong.
How do I do this properly? Is there a way to force memory cleanup, like Java's System.gc()?
When cleaning the objects on the main queue (dispatch_async(dispatch_get_main_queue(), { ... })) the memory doesn't reach the limit, but the GUI stutters.
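The "memory drops after waiting a few seconds" symptom is typical of autoreleased temporaries: objects created in a tight loop are often only freed when the autorelease pool drains, which on a background queue may not happen until much later. A minimal sketch (assuming the temporaries are Foundation objects; the loop body here is hypothetical) wraps each iteration in an explicit autoreleasepool so they are freed every pass:

```swift
import Foundation

// Bounded buffer of recent chunks; old entries are dropped each pass.
var recent: [Data] = []

for i in 0..<1_000 {
    // Draining a pool per iteration releases that iteration's autoreleased
    // temporaries immediately, instead of letting them pile up.
    autoreleasepool {
        let chunk = Data(repeating: UInt8(i % 256), count: 1_024)
        recent.append(chunk)
        if recent.count > 100 {
            recent.removeFirst()
        }
    }
}
```

This keeps peak memory proportional to the buffer size rather than the total number of iterations, without blocking the main queue.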

How to delete object using blueprints (not actor)

OK, I've created a class which inherits from UObject. I can create it in the level BP (using the Construct Object node) and store a reference in a BP variable. When creating the object I set Outer to self, so the level BP owns the newly created object. Now my question is: how do I delete this object from memory? I tried setting the BP variable to null, but it seems I need to destroy the level to release the object. Any idea how to do it without destroying the level?
I have no access to UE4 at this moment but I hope this can help/hint you to a right direction:
UObjects are managed by the garbage collector. To create a UObject appropriately, use NewObject(), NewNamedObject() or ConstructObject(). You can configure how the garbage collector handles a UObject at creation time via the Object Flags enumeration. (If you'd like to learn more about UObject instance creation, see: https://docs.unrealengine.com/latest/INT/Programming/UnrealArchitecture/Objects/Creation/index.html )
This also means you should not call new or delete on UObjects. If a UObject is no longer needed, that usually means there are no references to it (this may differ depending on the context and the garbage-collection flags used when the UObject was created). In that situation, you can run the ForceGarbageCollection() function:
GetWorld()->ForceGarbageCollection(true);
Please note that calling this method may cause crashes in some situations, particularly when the object is already being destroyed by the garbage collector or is null.
Also, if you'd like to learn more about Unreal object handling, see: https://docs.unrealengine.com/latest/INT/Programming/UnrealArchitecture/Objects/Optimizations/index.html
Credit : https://answers.unrealengine.com/questions/219430/explicitely-delete-a-uobject.html
I managed to resolve this; I also got some clues on the Unreal Answer Hub: https://answers.unrealengine.com/questions/337525/how-to-delete-object-using-blueprints.html
So basically the answer is: set the reference variable to null, and at some point the GC will release the object. But don't expect this to happen instantly.

Loading Next Level

I am making a game in Unity. There are currently two levels. I count down 30 seconds; when the timer reaches 0, I want to load the next level automatically. But when the next level is loaded, the game screen freezes and I cannot move anything. The function I use to load the next level is below (it lives in a script on an empty game object that is not destroyed when loading a new level):
function loadNextlvl() {
    var cur_level = Application.loadedLevel;
    yield WaitForSeconds(5.0);
    Application.LoadLevel(cur_level + 1);
}
What should I do?
My work with Unity has been hobby-driven only, but any time I've used Application.LoadLevel I passed it a string with the level name rather than an index. I see from the API that it's overloaded to take an int as well, but for testing purposes, try calling it by name to see if that works.
Also, you need to tell Unity the levels you're going to be using. You can do this in the Build Settings off the file menu.
Lastly, you can check Application.levelCount to make sure you're within the bounds of the level list.