Swift/Xcode - Get the value of random functions with a breakpoint, at the creation point / before usage?

I have a couple of functions in my game that, in this example, spawn enemies (monsters). At the very top of the function I have created a set of variables, e.g.:
let MonsterRandomType = arc4random_uniform(3)
let MonsterRandomRotation = arc4random_uniform(3)
And a dozen other variables that determine various properties of the enemies. The game started out as an endless runner, so basically everything was made up of randomization.
Now I have decided to create levels/maps that are repeatable or have hardcoded values instead of randomness. I figured I would reuse my existing functions for creating random game levels, run the game, save the random values that were created, and use them in my hardcoded levels. I did this by setting breakpoints on my vars and lets, iterating through them and saving them as JSON into a file from the log. It did not work because, for whatever reason, when using random functions like arc4random_uniform, the debugger and breakpoint don't actually set/get the value until the variable is used further down in my code in if or switch statements.
If I set a breakpoint at:
let MonsterRandomType = arc4random_uniform(3)
And then test it by hitting the "Print Description" button to get more info about the variable in the Xcode output window, I get this "error":
Printing description of MonsterRandomType:
(UInt32) MonsterRandomType = <variable not available>
Now I obviously know that I can achieve what I want (getting the value of every variable that uses a random function) by setting the breakpoint where these variables are used further down in the code, but it is very inefficient to go through many, many lines of code and many, many functions when all the variables are at the top of every function/file.
If I do the following:
print(arc4random_uniform(3))
The output window will show the value created by the random function as soon as it is created, but why can't the debugger and breakpoint show the value when it is created? It seems they can only show the value when the variable is first used further down in the code.
If I did:
let MonsterRandomType = 1
The debugger would identify the value; in the debugger it would say: "MonsterRandomType = (Int) 1". So the debugger obviously can get values as soon as a variable is created and the value is assigned instantly, unless that value is created by a random function like arc4random_uniform, which is very annoying. Is it possible to fix this behaviour?
Image of breakpoint example
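Since print() does show the value at creation time, one workaround is to route every random call through a small logging helper and dump the collected values as JSON after a run. A minimal sketch; the loggedRandom helper, the dictionary keys and the JSON dump are only illustrative, assuming Foundation is available:
import Foundation

// Collects every generated value under a label so the whole set can be saved afterwards.
var recordedRandomValues = [String: [UInt32]]()

func loggedRandom(_ upperBound: UInt32, label: String) -> UInt32 {
    let value = arc4random_uniform(upperBound)
    recordedRandomValues[label, default: []].append(value)
    print("\(label) = \(value)") // visible in the Xcode console as soon as the value exists
    return value
}

// Used at the top of the spawn function:
let MonsterRandomType = loggedRandom(3, label: "MonsterRandomType")
let MonsterRandomRotation = loggedRandom(3, label: "MonsterRandomRotation")

// After a run, dump everything that was generated as JSON:
if let data = try? JSONSerialization.data(withJSONObject: recordedRandomValues, options: .prettyPrinted),
   let json = String(data: data, encoding: .utf8) {
    print(json)
}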

Related

AnyLogic: Dynamically change source rate using variable/slider

I am trying to dynamically change the source Arrival rate using a variable "arrivalRate" linked to a slider (see image).
However, during the simulation the initial rate remains the same, even when I change the arrivalRate. I know that the arrivalRate variable is changing successfully (it is an int) - but this has no effect on the source rate during the simulation.
Anyone have an idea what the issue is - or how to fix it?
Whenever you see the = sign before a field, it means the field is not dynamic: it is only evaluated at the start of the model or at element creation and will not change throughout the simulation run unless you force it. In other words, the variable arrivalRate is checked only once, to assign the source's arrival rate, and that's it.
Now if you want to change it dynamically, in the slider's Action field, write the following:
source.set_rate( arrivalRate );

How to turn off/disable Metal depth testing

I know how to enable the depth test in Metal using swift.
Just call MTLRenderCommandEncoder.setDepthStencilState() with an appropriate MTLDepthStencilState object, like this, and it works fine.
renderCommandEncoder.setDepthStencilState( state )
To turn it off, I thought this would work, but it gives me an error at runtime.
renderCommandEncoder.setDepthStencilState( nil )
The error:
-[MTLDebugRenderCommandEncoder setDepthStencilState:]:3843: failed assertion `Set Depth Stencil State Validation
depthStencilState must not be nil.
It is weird because Apple's documentation says that the default value is nil and that setDepthStencilState() takes an optional value.
Any idea how to turn depth testing off, or am I doing something wrong?
environment:
Xcode 13.2
Swift 5
deployment target: MacOS 11.1
You can disable the depth test by creating an MTLDepthStencilState from an MTLDepthStencilDescriptor with depthCompareFunction set to .always.
let descriptor = MTLDepthStencilDescriptor()
descriptor.depthCompareFunction = .always
let depthStencilState = device.makeDepthStencilState(descriptor: descriptor)
renderCommandEncoder.setDepthStencilState(depthStencilState)
Update: just setting depthCompareFunction to .always will make the depth test always pass, but it will also still write out the depth for all fragments. If you want to keep the depth buffer in the same state, you can set isDepthWriteEnabled to false.
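A sketch of that variant, reusing the same device and renderCommandEncoder as above:
let descriptor = MTLDepthStencilDescriptor()
descriptor.depthCompareFunction = .always // every fragment passes the depth test
descriptor.isDepthWriteEnabled = false    // but nothing is written back to the depth buffer
let noDepthState = device.makeDepthStencilState(descriptor: descriptor)
renderCommandEncoder.setDepthStencilState(noDepthState)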
I don't use Swift, but you will understand my logic.
In Metal, you configure depth testing independently from the render pipeline, so you can mix and match combinations of render pipelines and depth tests. The depth test is represented by an MTLDepthStencilState object, and, like you do with a render pipeline, you can create multiple variations of this object.
Create a new MTLDepthStencilState object and configure it with the following settings:
MTL::DepthStencilDescriptor* depthStencilDescriptor = MTL::DepthStencilDescriptor::alloc()->init();
depthStencilDescriptor->setDepthWriteEnabled(false); /* disable depth write */
_depthStencilStateNoWrite = _device->newDepthStencilState(depthStencilDescriptor);
depthStencilDescriptor->release();
Use it whenever you want your object to be ignored by the depth test like so:
renderEncoder->setDepthStencilState(_depthStencilStateNoWrite);
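For reference, a rough Swift equivalent of that snippet (assuming the device and renderCommandEncoder from the question) would be:
let depthStencilDescriptor = MTLDepthStencilDescriptor()
depthStencilDescriptor.isDepthWriteEnabled = false // disable depth write; depthCompareFunction defaults to .always
let depthStencilStateNoWrite = device.makeDepthStencilState(descriptor: depthStencilDescriptor)
renderCommandEncoder.setDepthStencilState(depthStencilStateNoWrite)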

MIT Scratch: Sequential cloning without delay

I am just starting to play with this as an educational tool for a youngster and encountered strange behavior while attempting to clone sprites.
I set up a global variable for position x,y in sprite_1 and clone a sprite_2 object. This object immediately copies the global x,y to local x,y and exits. Later, sprite_2 renders using the stored local x,y.
sprite_1:
sprite_2:
I expect the four sprites to clone diagonally up/right on the screen according to this small reproducible example. Instead, I appear to get four sprite_2 objects all on top of each other:
If I add a delay of 1 second at the end of the clone(x,y) function, however, all is well:
As all four sprite_2 objects appear to be where the last clone was placed, I suspect that the clones are not created immediately but are instead created as a batch all at once at some later time, and are therefore all taking the last coordinates from the globals _clone_enemy_x/y.
Is this the case? Is there a way to circumvent this behavior, or what is the solution?
I have 2 possible solutions to this problem:
1. Go to the "define clone()()" block, right-click it, open up the advanced dropdown, and tick "run without screen refresh".
2. Get rid of the custom block altogether, and use the original source for that block in the actual code.
I hope this helps!

Unexpected frame appears on top of another frame after an event

I'm using a character application. On the first page, there is a frame f-selection where the search fields are entered. When I search for something, open some other frames during that search, and then press F10, which opens another frame, the new frame opens but f-selection also appears on top of it. I suspect this code makes it pop up again:
else assign ll-lgst-key1:SENSITIVE in frame f-selection = TRUE
ll-lgst-key2:SENSITIVE in frame f-selection = FALSE
because when I comment out these lines, the frame doesn't pop up. But then I can't use these fields in the first frame, where I need them. I don't know why this code is called again, but is there anything else I can do to fix this issue? I tried writing hide frame f-selection everywhere possible, but it doesn't work.
That snippet of code is making "key1" of your frame sensitive. In order to be sensitive it needs to pop up...
So the issue is why is that block of code executing? You say "I don't know why this code is called again". Neither will anyone else because you have shared such a tiny little bit of the overall code. Apparently the flow of control is taking you through that block so you should work on understanding why that is. You might try using the debugger to step through the code execution or you could insert some old fashioned MESSAGE statements to get to the bottom of it.
If you want to kludge around the problem you could wrap that bit of code in conditional logic. Define and set a variable that determines the desired state of the f-selection frame and use that to control the sensitivity logic:
define variable f-shouldBeVisible as logical no-undo.
if .... then
    f-shouldBeVisible = yes.
else
    f-shouldBeVisible = no.
...
else
do:
    if f-shouldBeVisible then
        assign ll-lgst-key1:SENSITIVE in frame f-selection = TRUE
               ll-lgst-key2:SENSITIVE in frame f-selection = FALSE
               .
end.
Of course that looks kind of silly -- but it is just an example with grossly over-simplified logic.
OTOH if you know enough to set the variable you ought to be able to figure out why the ELSE branch is executing. But maybe it is a useful first step.

How to reset FFmpeg static global variables?

I'm trying to create a movie from a set of PNG images using FFmpeg on the iPhone, and later merging the created video with audio that is recorded separately. I can call these the two phases of my first pass. But when I start my second pass, FFmpeg crashes in the first phase. I know that this is because the global variables set in the first pass are not reset during the second pass. Is there any way to reset the static global variables used by FFmpeg?
In my case I am getting an error like "frame size changed to 320x400, bgra", even though the images are set to PNG before I start my second pass.
This issue is resolved now. After debugging the FFmpeg code, I found that the pixel format was not reset and was retaining the previously set value. The fix is to reset "frame_pix_fmt = PIX_FMT_NONE" before you start the actual encoding. "frame_pix_fmt" is declared as a static global variable in ffmpeg.c.