How can I pass complex objects as input when using the Cadence CLI tool? - cadence-workflow

Here is a sample workflow and its input struct:
func MyWorkflow(ctx cadence.Context, input MyWorkflowParameters) error {
    ...
}

type MyWorkflowParameters struct {
    SomeString  string
    SomeInteger int32
}
What’s the best way to pass the complex struct above as the input parameter to the Cadence CLI tool when starting or signaling a workflow?

For multiple params, use an array:
--input '["a","b",123]'

The input parameter of the Cadence command-line tool accepts values in a few different formats, depending on what your workflow expects. Here are examples for three cases:
1. Single integers or strings:
--input 12345
--input "my-string"
2. Complex objects:
When the parameter is a struct, as in your example, you need to pass a valid JSON-encoded object, as in the following example:
--input '{"SomeString":"my-string","SomeInteger":12345}'
3. Multiple parameters:
If you have a workflow that expects multiple parameters, you need to pass a single space-delimited string where each part of the string corresponds to one parameter expected by the workflow. The example below shows how to pass one integer, one string, and one struct parameter, in that order:
--input '12345 "second param" {"SomeString":"my-string","SomeInteger":12345}'
On a related note, the recommended way to accept input parameters in a workflow is to use a single struct parameter. Even though the JSON syntax makes the CLI command a little more verbose, especially when all you need to pass is a single value, it pays off once you start passing more parameters to the workflow.
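For reference, a full start command using the struct input above might look like the following; the domain, task list, timeout, and workflow type values here are placeholders for illustration, not values from the original question:
cadence --do my-domain workflow start --tl my-tasklist --wt MyWorkflow --et 60 --input '{"SomeString":"my-string","SomeInteger":12345}'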

Related

Apache AGE - Creating Functions With Multiple Parameters

I was looking inside the create_vlabel function and noticed that graph_name = PG_GETARG_NAME(0) and label_name = PG_GETARG_NAME(1) are used to get the graph_name and label_name. Since these two variables are also passed as parameters, I was thinking that, if I wanted to add one more parameter to this function, I would need to use PG_GETARG_NAME(2) to get it and use it in the function's logic. Is my assumption correct, or do I need to do more tweaks?
You are correct, but you also need to change the function signature in the "age--1.2.0.sql" file, updating the arguments:
CREATE FUNCTION ag_catalog.create_vlabel(graph_name name, label_name name, new_argument new_argument_type)
RETURNS void
LANGUAGE c
AS 'MODULE_PATHNAME';
Note that all arguments come in as a "Datum", and PG_GETARG_NAME automatically converts it to a "Name". If you need an argument as an int32, for example, you should use PG_GETARG_INT32(index_of_the_argument); for strings, PG_GETARG_CSTRING(n); and so on.
Yes, your assumption is correct. If you want to add an additional parameter to the create_vlabel function in PostgreSQL, you can retrieve the value of the third argument using PG_GETARG_NAME(2). Keep in mind that you may need to make additional modifications to the function's logic to handle the new parameter correctly.
The answers given by Fahad Zaheer and Marco Souza are correct, but you can also create a variadic function, with which you could have any number of arguments; one drawback is that you would have to check the types yourself. You can find more information here. You can also check the many Apache AGE functions made this way, e.g. agtype_to_int2.

Call a kdb function passing another function as argument using sendSync method of qpython(kdb)

In the KDB server, we have two functions defined as
q)t:{0N!x[`min]; 0N!x[`max];}
q).up.map:{[keyList; valueList] keyList!valueList}
The KDB server does not allow passing dict()!() as an argument directly to a function; rather, one has to use .up.map.
Calling the t function from kdb would look like:
q)t[.up.map[`min`max;10 20]]
I want to call the t function from the qpython sendSync() method, passing another function, .up.map[`min`max;10 20], as an argument to t.
Unfortunately, I cannot find a solution in the qpython doc - https://qpython.readthedocs.io/en/latest/qpython.html#qpython.qconnection.QConnection.sendSync
Error -
When I tried the sendSync() method, the error below was raised -
qpython.qtype.QException: b'['
The KDB server does not allow passing dict()!() as an argument directly to a function; rather, one has to use .up.map.
May I know why this is so? It's not a bad idea to challenge the original design before looking for workarounds. If a dictionary were allowed as the parameter, it could be as simple as:
params = QDictionary(qlist(numpy.array(["min", "max"], dtype=numpy.string_), qtype=QSYMBOL_LIST),
                     qlist(numpy.array([10, 20], dtype=numpy.int64), qtype=QLONG_LIST))

with qconnection.QConnection(host='localhost', port=5000) as q:
    q.sendSync("t", params)
If you want to do via qpython what you can do in the q console, it's actually also simple: you pass the same string over. Effectively it's the same mechanism as a q client passing a string via IPC to the server, where the string is parsed and evaluated. Here you need to build that string in your Python code, so it's not as clean as the approach above (although that one looks more verbose).
with qconnection.QConnection(host='localhost', port=5000) as q:
    q.sendSync("t[.up.map[`min`max;10 20]]")
Maybe you can use a lambda for this. That way, it's just the arguments that need to be serialized:
q.sendSync("{t[.up.map[x;y]]}", qlist(["min", "max"], qtype=QSYMBOL_LIST), [10, 20])
If that's not permitted, you could create it as a named wrapper function on the kdb side, which might be allowed.
Alternatively, you could format your call, arguments included, as a string. A bit hacky, but workable for simple inputs:
q.sendSync(f"t[.up.map[`{'`'.join(['min', 'max'])};{' '.join(['10', '20'])}]]")

How to return integer value from notebook in adf pipeline

I have a use case where I need to return an integer as output from a Synapse notebook in a pipeline and pass this output to the next stage of the pipeline.
Currently mssparkutils.notebook.exit() takes only string values. Are there any utility methods available for this?
I know we can cast the integer to string type and send it to the exit("") method. I wanted to know if I could achieve this without casting.
The cast() function is the standard and official method suggested by Spark itself. AFAIK, there is no other method; otherwise, you need to manage it programmatically.
You can also try equals in dynamic content to check whether the exitValue fetched from the notebook activity output equals some specific value:
@equals(activity('Notebook').output.status.Output.result.exitValue, '<value>')
Refer: Spark Cast String Type to Integer Type (int), Transform data by running a Synapse notebook
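On the notebook side, the workaround is simply to stringify the value before exiting. A minimal sketch, assuming a hypothetical integer result named row_count:
from notebookutils import mssparkutils  # pre-installed in Synapse notebooks

row_count = 42  # hypothetical integer computed earlier in the notebook
# exit() only accepts a string, so convert the integer before returning it
mssparkutils.notebook.exit(str(row_count))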
Instead, you can convert the string number to an integer in dynamic content, like this:
@equals(
    int(activity('Notebook').output.status.Output.result.exitValue)
, 1)
Or add an activity that sets the string value to a variable that is an int.

ADF: Dynamic Content in parameters

I am trying to pass text with dynamic content as a parameter into a pipeline (execute pipeline activity).
As a super simple example, I want the input to my pipeline to be a timestamp, utcnow(). Here are my results:
I've noticed:
If I put @utcnow() in a set variable activity and set the execute pipeline parameter to that variable, it works.
If I put @utcnow() (or @{utcnow()}) in the main parameter and set the execute pipeline parameter to that parameter, it does not work. I get the string "utcnow()" as the result.
Is there anything that I am missing here? I definitely feel like I've done this successfully before.
If I understand your question correctly, the issue is caused by the fact that the main parameter (pipeline parameter) doesn't support expressions or functions.
For example, we can pass the value from a variable to the Execute Pipeline activity parameter, and it works well, because variables support expressions/functions:
When the main pipeline only contains an Execute Pipeline activity, we pass the value from the main parameter (pipeline parameter) to the Execute Pipeline parameter:
When we debug the pipeline, we need to pass the value of the main parameter:
The value of a pipeline parameter only supports String values, so the function utcNow() or @{utcnow()} will be treated as a string.

how to pass in an expression through a parameter

Suppose I have a ForEach inside of a pipeline:
I'd like to iterate through the following:
@split(split(item().name,'_')[4],'-')[1]
However, I'd like to pass this formula in through a parameter.
I've defined a parameter myExpression with the desired value in the pipeline, and I am attempting to reference it like so:
Note that the full expression would be: @{pipeline().parameters.myExpression}
However, Data Factory does not execute that expression; rather, it just accepts it as a verbatim string:
@{pipeline().parameters.myExpression}
How do we pass in an expression from parameters from within the pipeline?
When you define a parameter like this and pass the value, what you are doing is sending a string input, as the textbox doesn't accept expressions. The only way to pass an expression to a parameter is to pass it from another pipeline. Another issue we have is an ADF limitation - there cannot be nested iterations. Calling a second pipeline solves both issues.
Split your flow into two pipelines.
First (parent) pipeline - keep all the steps up until generating the array over which iteration has to happen.
@split(split(item().name,'_')[4],'-')[1]
Then, inside a ForEach loop, invoke an "Execute Pipeline" activity. In there, pass the expression that you desire in a similar fashion-
In the child pipeline, define a string parameter to absorb the passed value. Then, use @pipeline().parameters.ParamName to use it in there.
HTH
Your description lacks a lot of context about what you are trying to do. I can only presume that you generate an array in one pipeline and want to iterate over it in another. Looking at your screenshot, it looks like you typed in your value, therefore the output is plain text. You should use the dynamic content editor instead, so that the value is evaluated as an expression rather than treated as a literal string.