My simplified job looks like this:
tSetGlobalVar--->(onSubJobOK)--->tRunJob--->(onSubJobOK)--->tJava
tSetGlobalVar defines a global variable and its initial value, let's say: myKey: "firstValue"
tRunJob runs a subjob which contains only a second tSetGlobalVar component that is supposed to set a new value for the global variable defined in the master job:
((String)globalMap.get("myKey")): "newValue"
I also tried this:
"myKey": "newValue"
tJava is used just for debugging; its code is as follows:
System.out.println(((String)globalMap.get("myKey")));
Actual output: firstValue
Expected output: newValue
Is there any other way to modify the value of a global variable in a subjob and get the updated value in the master job?
You should be passing data to your child job by using context variables rather than the globalMap. You then pass data back up to the parent job using a tBufferOutput in the child job.
As an example, here's a very basic setup that takes an id and a date and passes it to a child job which simply prints it to the console/logs and then passes some data back up to the parent that is also just printed to the console/logs.
Parent job:
The data in the tFixedFlowInput in the parent job is as follows:
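key  | value
-----+-----------
id   | 12345
date | 2014-10-30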
Notice how you must use a key/value pair to pass the data to the tContextLoad component, which will then create a context variable named by the key and holding the defined value.
In the parent job we set the id and date context variables and then straight away print the current contexts for the job (which will include the context variables just set).
After this we then call the child job using a tRunJob component set to pass the entire context:
Alternatively you can specify which context variables you pass to the child job by unticking the Transmit whole context option and explicitly defining which contexts to send. It is also possible to define the value of the context variables here but it typically makes more sense to generate the context variable values in the main flow of your job and pass them as a key value pair to the tContextLoad component.
Child job:
In the child job we simply print the contexts that have been sent, dumping them to a tLogRow with a tContextDump component, and then we use another tFixedFlowInput to hard-code some data, in this case:
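id    | date
------+-----------
12346 | 2014-10-31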
We then pass this to a tBufferOutput component, which allows us to read the data back in the parent job.
Going back to the parent job we then link a tLogRow to the tRunJob with a main link and provide the schema that is in the child job's tBufferOutput. This then prints the data from the child job.
The example job's output is then:
.----+----------.
|Parent Contexts|
|=---+---------=|
|key |value     |
|=---+---------=|
|date|2014-10-30|
|id  |12345     |
'----+----------'
.----+----------.
|Child Contexts |
|=---+---------=|
|key |value     |
|=---+---------=|
|date|2014-10-30|
|id  |12345     |
'----+----------'
.-----+----------.
|Child Data to be Passed to Parent|
|=----+---------=|
|id   |date      |
|=----+---------=|
|12346|2014-10-31|
'-----+----------'
.-----+----------.
|Output from Child1|
|=----+---------=|
|id   |date      |
|=----+---------=|
|12346|2014-10-31|
'-----+----------'
The globalMap stores data in the job itself and is not shared at all with parent or child jobs, so it cannot be used for this purpose.
In Talend you can pass context variables to subjobs, and they behave like standard Java variables: if you pass something immutable (like Strings or primitive types) you won't get any change back, but if you pass a "by reference" type the subjob can modify the object, and the parent job will see those changes because it still holds the reference to the modified object.
So the solution to your problem is simply to pass the globalMap from the parent job to the child job, so the child can read and change that map and the parent job will see every change.
The map you pass is a different map than the child job's own globalMap and is only used where you explicitly reference it.
Unfortunately Talend has only one "by reference" type, the Object type, so you will need to pass everything as an Object and cast it back where you use it, but I think that is not such a big problem.
Here are the images for the job:
Running the father
Passing the map to the child
The child retrieves the map from the context variable and uses it
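Conceptually (a minimal sketch, not the exact generated code; the Object-typed context variable name parentMap is just an example), the parent's tRunJob passes globalMap in its Context Param table, and the child's tJava then does something like:

// cast the Object context variable back to a Map and modify it;
// the parent job holds a reference to the same map and will see the change
java.util.Map<String, Object> parentMap = (java.util.Map<String, Object>) context.parentMap;
parentMap.put("myKey", "newValue");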
I have a fixture that I want to apply to every test function, where I extract metadata from the tests. Something like
@pytest.fixture(autouse=True)
def extract_metadata(request):
    func_name = request.function.__name__
    # etc.
    ...
I also want to extract the parametrize values here. But I can't figure out how to extract the current parameter values from the request object. The only place I see them indicated at all is in the test id inside request.node.name, but I'd prefer to extract the actual values rather than parsing them out of the id string.
The parameters can be accessed with request.node.callspec.params, which is a dict mapping parameter name to parameter value.
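For example, a minimal sketch extending the fixture above (the getattr guard is there because non-parametrized tests have no callspec):

import pytest

@pytest.fixture(autouse=True)
def extract_metadata(request):
    func_name = request.function.__name__
    # callspec exists only for parametrized tests, so guard the access
    callspec = getattr(request.node, "callspec", None)
    params = callspec.params if callspec is not None else {}
    print(f"{func_name}: {params}")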
I am trying to seize a given number of resources dynamically, but I can't figure out the syntax. In the Resource Sets Dynamic Assignment, each unit is represented by the name of the resource set it belongs to. In the picture, the seize block will seize 3 resources of the set "resourcePool".
I need to seize a specific number of resources for each individual agent. So I tried creating an ArrayList of ResourcePool objects and passing it in the dynamic assignment, but it doesn't work as the type doesn't match.
For example, let's say I have an agent which requires 4 resources, so the expression needed is: { resourcePool, resourcePool, resourcePool, resourcePool }. How can I assign this expression in a variable or collection of the agent such that it can be used in the Resource Sets Dynamic Assignment box? I think I should finally get something like:
{{agent.resourceSetsToSeize}}
So how to define "resourceSetsToSeize"?
You were so close. The only issue is that the parameter inside the agent must be of type ResourcePool[][], an array of arrays. To convert an ArrayList, in your case resourceSetsToSeize, to an array you need to call toArray() with an argument of the specific array type you want to convert it to.
So your code should have looked like
{agent.resourceSetsToSeize.toArray(new ResourcePool[agent.resourceSetsToSeize.size()])}
(assuming that resourceSetsToSeize is a List object)
The code can be a bit confusing; see another example below of how to rather use an array as the parameter and then use that directly without converting.
Here is an agent with the parameter of type ResourcePool[][]
When you create the agent you then create this array and put it in the constructor. As you can see, you don't need to use the empty constructor and then assign it; you can make use of your parameterized constructor.
And then in the seize object you can simply access the parameter.
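As a rough sketch (resourcePool and resourceSetsToSeize are the names from above; everything else is illustrative), building the parameter value for an agent that needs 4 units could look like:

// one inner array = one resource set listing the 4 units to seize from the pool
ResourcePool[][] setsToSeize = { { resourcePool, resourcePool, resourcePool, resourcePool } };
// hand this to the agent's parameterized constructor (or assign it to the
// ResourcePool[][] parameter), then in the Seize block's dynamic assignment use:
// agent.resourceSetsToSeize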
I am trying to pass text with dynamic content as a parameter into a pipeline (execute pipeline activity).
As a super simple example, I want the input to my pipeline to be a timestamp, utcnow(). Here are my results:
I've noticed:
If I put @utcnow() in a Set Variable activity and set the Execute Pipeline parameter to that variable, it works.
If I put @utcnow() (or @{utcnow()}) in the main parameter and set the Execute Pipeline parameter to that parameter, it does not work. I get the string "utcnow()" as the result.
Is there anything that I am missing here? I definitely feel like I've done this successfully before.
If I understand your question correctly, the issue is caused by the fact that the main parameter (pipeline parameter) doesn't support expressions or functions.
For example, we can pass the value from a variable to the Execute Pipeline activity parameter, and it works well, because variables support expressions/functions:
When the main pipeline only contains an Execute Pipeline activity, we pass the value from the main parameter (pipeline parameter) to the Execute Pipeline parameter:
When we debug the pipeline, we need to pass the value of the main parameter:
The value of a pipeline parameter only supports String values, so the function @utcnow() or @{utcnow()} will be considered a String.
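In other words, the pattern that works looks like this (the variable name timestamp is just an example):

Set Variable activity, value (dynamic content):
    @utcnow()
Execute Pipeline activity, parameter value (dynamic content):
    @variables('timestamp')

whereas typing @utcnow() into the pipeline parameter's value box at debug/trigger time just passes the literal text.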
Suppose I have a ForEach inside of a pipeline:
I'd like to iterate through the following:
@split(split(item().name,'_')[4],'-')[1]
However, I'd like to pass this formula in through a parameter.
I've defined a parameter myExpression with the desired value in the pipeline, and I am attempting to reference it like so:
Note that the full expression would be: {@pipeline().parameters.myExpression}
However, Data Factory does not execute that expression; rather, it just accepts it as a verbatim string:
{@pipeline().parameters.myExpression}
How do we pass in an expression from parameters from within the pipeline?
When you define a parameter like this and pass the value, what you are doing is sending a string input, as the textbox doesn't accept expressions. The only way to pass an expression to a parameter is to pass it from another pipeline. Another issue we have is an ADF limitation: there cannot be nested iterations. Calling a second pipeline solves both issues.
Split your flow in two pipelines.
First (parent) pipeline - Keep all steps up until generating the array over which iteration has to happen.
@split(split(item().name,'_')[4],'-')[1]
Then, inside a ForEach loop, invoke an Execute Pipeline activity. In there, pass the expression that you desire in a similar fashion:
In the child pipeline, define a string parameter to absorb the passed value. Then use @pipeline().parameters.ParamName to reference it there.
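For illustration (the parameter name splitValue is just an example):

Parent pipeline, ForEach > Execute Pipeline, parameter splitValue (dynamic content):
    @split(split(item().name,'_')[4],'-')[1]
Child pipeline, wherever the value is needed:
    @pipeline().parameters.splitValue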
HTH
Your description lacks a lot of context about what you are trying to do. I can only presume that you generate an array in one pipeline and want to iterate over it in another. Looking at your screenshot, it looks like you typed in your value, therefore the output is plain text. You should use the dynamic content option,
so it would look like this:
Is there a way to pipe a whole object through a pipeline and process that object in one step? Put simply, the $PSItem variable on the other side of my pipeline should have the same value as the whole object that was put through the pipe.
I've found the following method for getting a sort of anonymous function in PowerShell, though this processes every item in the input object separately (as this is what the process block in advanced functions is meant for).
Therefore the code:
Get-Service | & {process {return $_.length}}
Returns:
1 1 1 1 1 1 1..
What I'm looking for is a way to access the full object with the $_/$PSItem variable after the pipeline and process it further / return properties of this object.
The process block in PowerShell can take a single-member array as its input, which leads it to process that whole member rather than each element of the object separately.
Using the comma operator one can create a single-member array in a simple fashion.
Further information about Operators
The following code uses the comma operator to put the object array which is returned by Get-Process into a single member array.
,(Get-Process)
You are now free to use the object in the pipeline and access properties of it.
,(Get-Process) | & {process {if($_.length -ge 10) {return "Greater / equals 10"}else{return "Smaller than 10"}}}
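For instance, with the Get-Service call from the question (a minimal sketch), $_ is now the whole collection, so its Count reflects the total number of services rather than 1 per item:

# wrap the whole result in a one-element array so the process block sees it as a single item
,(Get-Service) | & {process { "Total services: $($_.Count)" }}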