Open Edge v11 Drill down button bug - progress-4gl

I have a problem and I am not sure whether it is a Progress (OpenEdge) bug or whether something is wrong with my code.
I have a main container form from which I call child forms to display within the MDI parent. Each child can also call a subsequent form, which is parented to the calling child form and overlaps its frame.
The problem: when I open the same child form twice, drill down on both, and then return to the calling child form of the first instance, the button I used to drill down to the sub-form no longer triggers on the first child form, although it still works on the second one.
I cannot supply a code example for this, so I hope my explanation is understandable.
Could someone please tell me what is causing this and how to fix it?

You probably define the trigger for your button in the child procedure on the child object. Make sure that the trigger is defined only once.

Related

Are there any ways to loop in a button press callback method until the button is released?

I'm using gtkmm specifically here.
What I'm working on:
I'm implementing a feature where, when I press the mouse button in a certain window, a parameter is increased or decreased depending on whether the mouse moves left or right.
What needs to be done:
I need to somehow loop in the button-press callback until the button is released; each iteration would store the new cursor position and compare it to the initial position to determine whether to increase or decrease the value. However, the event object passed to the callback never changes, so I can't simply check whether its type becomes GDK_BUTTON_RELEASE. I have also tried using a flag that is set to false in the button-release callback, but the release callback is never called, because execution is stuck in the now-infinite loop.
Thoughts
I'm thinking the solution might have something to do with signals.
Update:
GTK has two functions for doing exactly what I described here: gtk_events_pending() and gtk_main_iteration(). However, it turns out that running an unbounded loop inside an event handler is considered bad practice, and it's better to use other approaches, for example handling GdkEventMotion events instead.
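The event-driven alternative can be sketched in plain Python. The class below is a toy stand-in, not real gtkmm or GTK API; it only models the press/motion/release state machine that the GdkEventMotion approach relies on:

```python
class DragAdjuster:
    """Models the press/motion/release pattern: instead of looping
    inside the button-press handler, keep state between callbacks
    and let each event do one small step of work."""

    def __init__(self, value=0):
        self.value = value
        self.dragging = False
        self.start_x = None

    def on_button_press(self, x):
        # Remember where the drag started; do NOT loop here.
        self.dragging = True
        self.start_x = x

    def on_motion(self, x):
        # Called once per motion event while the button is held.
        if not self.dragging:
            return
        if x > self.start_x:
            self.value += 1      # moved right: increase
        elif x < self.start_x:
            self.value -= 1      # moved left: decrease
        self.start_x = x         # compare against the latest position

    def on_button_release(self, x):
        self.dragging = False
        self.start_x = None
```

In gtkmm you would connect the three methods to signal_button_press_event, signal_motion_notify_event, and signal_button_release_event respectively (and enable the corresponding Gdk event masks on the widget); no loop is ever needed, because the main loop delivers each motion event to you.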

How to run Unreal blueprint nodes on BeginPlay that depend on another blueprint

I need blueprint A to run nodes in BeginPlay which rely on a variable in blueprint B, but that variable is null until it is set in B's BeginPlay. Of course, A's BeginPlay could run before B's, and I would run into errors. I can think of two ways to get around this, but neither feels like a proper approach:
In A's BeginPlay, add a Delay node with a duration of a second or less, in the hope that B's variable has been initialized by then. It seems like this could easily break things and isn't clean.
Have an Event Dispatcher in B called "VariableSet". A binds an event to it in BeginPlay and that event runs the dependent code. This usually works but I haven't heard of anyone doing this.
Is there a proven, documented method to avoid null pointers in BeginPlay?
I've faced a similar problem in the past.
In my case I just used Event Dispatchers. I'm not too proud of that approach, but it did the job.
The Parent calls the ED at the end of its BeginPlay execution flow.
The Child binds the ED to a CustomEvent at the beginning of its own BeginPlay execution flow.
When the Parent's BeginPlay finishes (which calls the ED), the logic in the Child's CustomEvent runs.
For me it worked really well, because one Parent had a lot of Children, those Children had their own Children, and so on,
and thanks to the Dispatchers and Events I was able to easily track what was going on.
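The dispatcher pattern above can be sketched outside Unreal in plain Python. The class and method names here are invented for illustration; in Blueprints, the bind step corresponds to a Bind Event to VariableSet node, and the broadcast corresponds to a Call VariableSet node at the end of B's BeginPlay:

```python
class BlueprintB:
    """Stand-in for the blueprint that owns the variable."""

    def __init__(self):
        self.variable = None
        self._variable_set_listeners = []   # the Event Dispatcher

    def bind_variable_set(self, callback):
        self._variable_set_listeners.append(callback)

    def begin_play(self):
        self.variable = 42                  # initialize the variable
        # Broadcast at the END of BeginPlay, once the variable is valid.
        for cb in self._variable_set_listeners:
            cb()


class BlueprintA:
    """Stand-in for the blueprint that depends on B's variable."""

    def __init__(self, b):
        self.b = b
        self.result = None

    def begin_play(self):
        # Bind first; the dependent logic only runs after B broadcasts.
        self.b.bind_variable_set(self.on_variable_set)

    def on_variable_set(self):
        self.result = self.b.variable * 2   # safe: variable is set
```

With this shape, A's BeginPlay can run before B's without any null access, because A only binds; the dependent code waits for the broadcast. The remaining caveat (in Unreal as in the sketch) is the reverse order: if B broadcasts before A has bound, A misses the event, so A may also want to check whether the variable is already valid before binding.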

runBlock not working in sequence?

I am trying to create a basic spawn sequence: the block must be created, then moveDownLeft, and then removeLeft. moveDownLeft and removeLeft work fine on their own when the block has previously been added using self.addChild(block1); however, I need the self.addChild call to happen within the sequence.
The only way that I can see to do this is use runBlock, and I looked at this question when I got an error using that: Swift: SKAction.runBlock -> Missing argument for parameter 'completion' in call BUT WHY?
So now I am left with this:
block1.runAction(SKAction.sequence([SKAction.runBlock({ self.addChild(self.block1) }), moveDownLeft, removeLeft]))
And nothing in the sequence works because the block is not created in the first place. Why is this happening?
Your code fragment is too short, but it looks like a typical chicken-and-egg problem:
a node can only run actions once it has been added as a child and thus has become part of the scene graph;
your node is supposed to run an action that would eventually add it to the scene graph, but since it is not in the scene graph yet, it will not run that action.
Add the node as a child first, then run the action. If you need the node to be inactive for some time, simply set its hidden property to true for the duration. You may also need to change other properties, e.g. postpone creation of the physics body.
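The chicken-and-egg rule can be modeled abstractly in plain Python. Node here is a toy stand-in, not real SpriteKit API; it only captures the rule that a node outside the scene graph silently does not process its actions:

```python
class Node:
    """Toy scene-graph node: actions run only once the node
    is part of the scene graph."""

    def __init__(self, in_scene=False):
        self.in_scene = in_scene
        self.ran = []            # names of actions that actually ran

    def add_child(self, node):
        node.in_scene = self.in_scene   # child joins the parent's graph

    def run_action(self, name):
        if not self.in_scene:
            return               # outside the graph: the action is a no-op
        self.ran.append(name)


scene = Node(in_scene=True)
block = Node()
block.run_action("moveDownLeft")     # ignored: not added yet
scene.add_child(block)               # add as child FIRST...
block.run_action("moveDownLeft")     # ...then the action runs
```

This is exactly why the sequence in the question does nothing: the runBlock that would add the node is itself an action on a node that is not yet in the graph.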

Can two panels share a uicontrol in a MATLAB GUI?

I've got a MATLAB GUI that has different aspects of functionality, each with their own panel of uicontrols. When one panel is selected, the other one is set to invisible, and vice-versa. However, they share some of the same inputs in the form of a popup menu. Can I include a 'clone' instance of the menu on the second panel somehow? I'd like to avoid as many redundant callbacks and uicontrols as possible.
I guess if the uicontrol were a direct child of the figure, you might be able to put it in front of everything.
A much simpler solution is to use the same callback for multiple uicontrols. In the property editor, you can modify the callback name and set it to a common callback function. Additionally, you can create a field (e.g. myPopupH) in the OpeningFcn of the GUI, in which you store the handles of the popups that should behave the same way. Then, in the callback, you'd use hObject, i.e. the first input argument, for all the get calls (to access the modified state of the popup-menu), but you'd use handles.myPopupH in all the set calls, so that you can ensure that both popups always have the same state. Thus, the ui-object may be redundant, but all the code (which is much more critical) only exists in a single copy.
One place where I routinely use a single callback for multiple ui elements is the close request function which is accessed from the "Cancel"-button as well as from the "X" that closes the figure, and possibly from one of the "File"-menu items.
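The shared-callback idea is language-agnostic and can be sketched in plain Python. Popup here is a stand-in for a popup-menu uicontrol handle; the real MATLAB version would use get on hObject and set on the stored handles.myPopupH array, as described above:

```python
class Popup:
    """Stand-in for a popup-menu uicontrol: just holds a selection."""

    def __init__(self):
        self.value = 1

def make_shared_callback(popups):
    """Return one callback shared by every clone: read the state from
    whichever popup fired (hObject), then push it to all of them."""
    def on_change(h_object):
        selected = h_object.value           # like get(hObject, 'Value')
        for p in popups:                    # like set(handles.myPopupH, ...)
            p.value = selected
    return on_change

popup_panel1 = Popup()
popup_panel2 = Popup()
callback = make_shared_callback([popup_panel1, popup_panel2])

popup_panel1.value = 3      # user picks item 3 on panel 1
callback(popup_panel1)      # the one shared callback fires
```

The uicontrol objects stay redundant, but the logic exists in a single copy, which is the part that matters for maintenance.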

gtk.fixed layout laid out events?

I have a gtk.Fixed. I move components inside it around using:
myFixed.move( myEventBox, new_x, new_y )
What event do I listen for to know when myEventBox has been rendered at its new position?
Do I connect to the fixed or the eventbox?
MORE INFO:
I need this information so I know when it is safe to queue a video under the eventbox... if I do it too soon (e.g. right after calling myFixed.move) I can see the glitch. Currently getting around this with a gobject.idle_add.
To be honest, I am not aware of any such event. The object should move immediately and redraw the screen, but I don't think any signal is emitted when that happens.
The PyGTK documentation is very comprehensive, and it lists all of the functions and events of every object in the library. Searching through both the gtk.Container (for the fixed) and gtk.Widget (for both the fixed and the eventbox) signal lists, I can't find any such event. The closest thing is the "add" signal in gtk.Container, but I don't think that's what you're looking for.
If the object is not moving, please post your code, because there is probably a subtle error.
If the object is moving just fine and you just want the event/signal, you may have to simulate it yourself. Write the function you want called as soon as the object is moved as a method (def) inside __init__, and then call that function on the line right after myFixed.move.
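The gobject.idle_add workaround mentioned in the question can be modeled in plain Python as a deferred-call queue: an idle callback runs only after the main loop has drained its pending work (such as the redraw triggered by myFixed.move), which is why it avoids the glitch. ToyMainLoop below is a toy stand-in, not real GTK API:

```python
class ToyMainLoop:
    """Toy model of a GTK-style main loop with an idle queue."""

    def __init__(self):
        self._pending = []    # work the loop must do first (e.g. redraw)
        self._idle = []       # callbacks deferred via idle_add

    def queue(self, fn):
        self._pending.append(fn)

    def idle_add(self, fn):
        # Like gobject.idle_add: run fn once the loop is otherwise idle.
        self._idle.append(fn)

    def run_once(self):
        # Drain pending work (the redraw after the move) first...
        while self._pending:
            self._pending.pop(0)()
        # ...then run idle callbacks, so they see the new position.
        while self._idle:
            self._idle.pop(0)()


log = []
loop = ToyMainLoop()
loop.queue(lambda: log.append("redraw at new position"))
loop.idle_add(lambda: log.append("queue video under eventbox"))
loop.run_once()
```

Calling the follow-up code directly on the line after myFixed.move runs it before the redraw; routing it through idle_add runs it after, which matches the behavior the questioner observed.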