I have a data flow that takes startTime and endTime as parameters. I am passing values to these parameters from a pipeline using a pipeline expression, but the expression is not evaluated; it is treated as a literal string.
I found out the root cause:
long() is not supported in the expression builder.
@ was missing at the beginning of the expression.
So the working expression, which returns the epoch time in seconds at the start of the current UTC day, was:
@div(sub(ticks(startOfDay(utcNow())),ticks('1970-01-01T00:00:00.0000000Z')),10000000)
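As a cross-check, the same arithmetic can be reproduced outside ADF. This is a minimal Python sketch; the helper dotnet_ticks is my own name, not an ADF or .NET function, but it computes the same quantity ADF's ticks() returns:

```python
from datetime import datetime, timezone

def dotnet_ticks(dt):
    """Number of 100-nanosecond intervals since 0001-01-01T00:00:00Z,
    which is what ADF's ticks() function returns."""
    delta = dt - datetime(1, 1, 1, tzinfo=timezone.utc)
    # Build the tick count from integer components to avoid float
    # rounding; tick counts exceed float64's exact integer range.
    return (delta.days * 86_400 + delta.seconds) * 10_000_000 + delta.microseconds * 10

now = datetime.now(timezone.utc)
start_of_day = now.replace(hour=0, minute=0, second=0, microsecond=0)
unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Mirrors div(sub(ticks(startOfDay(utcNow())), ticks('1970-01-01T00:00:00.0000000Z')), 10000000):
# tick difference divided by 10^7 gives whole seconds since the Unix epoch.
epoch_seconds = (dotnet_ticks(start_of_day) - dotnet_ticks(unix_epoch)) // 10_000_000
print(epoch_seconds)
```

Because the expression uses startOfDay, the result is always a midnight timestamp, i.e. a multiple of 86400 seconds.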
Related
I have a use case where I need to return an integer as output from a Synapse notebook in a pipeline and pass this output to the next stage of the pipeline.
Currently mssparkutils.notebook.exit() takes only string values. Are there any utility methods available for this?
I know we can cast the integer to a string and pass it to the exit("") method. I wanted to know whether I could achieve this without casting.
The cast() function is the standard and official method suggested by Spark itself. AFAIK there is no other method; otherwise, you need to manage it programmatically.
You can also try @equals in dynamic content to check whether the exitValue fetched from the notebook activity output equals some specific value:
@equals(activity('Notebook').output.status.Output.result.exitValue, '<value>')
Refer: Spark Cast String Type to Integer Type (int), Transform data by running a Synapse notebook
Instead, you can convert the string number to an integer in dynamic content, like this:
@equals(
int(activity('Notebook').output.status.Output.result.exitValue)
,1)
Or add an activity that converts the string value into a variable before it is used.
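The round trip is easy to sketch in plain Python. Note that mssparkutils itself is only available inside Synapse, so the exit call is shown as a comment; the variable names are mine:

```python
result = 42

# Inside the Synapse notebook: exit() only accepts strings, so cast first.
exit_value = str(result)
# mssparkutils.notebook.exit(exit_value)  # only runs inside Synapse

# On the pipeline side, int(...) in dynamic content undoes the cast,
# so comparisons behave numerically again.
assert int(exit_value) == 42
```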
I am trying to pass text with dynamic content as a parameter into a pipeline (execute pipeline activity).
As a super simple example, I want the input to my pipeline to be a timestamp, utcnow(). Here are my results:
I've noticed:
If I put @utcnow() in a Set Variable activity and set the execute pipeline parameter to that variable, it works.
If I put @utcnow() (or @{utcnow()}) in the main parameter and set the execute pipeline parameter to that parameter, it does not work. I get the literal string "utcnow()" as the result.
Is there anything that I am missing here? I definitely feel like I've done this successfully before.
If I understand your question correctly, the issue is that the main parameter (pipeline parameter) doesn't support expressions or functions.
For example, we can pass the value from a variable to the Execute Pipeline activity's parameter, and it works well, because variables support expressions/functions:
When the main pipeline contains only an Execute Pipeline activity, we pass the value from the main parameter (pipeline parameter) to the Execute Pipeline parameter:
When we debug the pipeline, we need to pass the value of the main parameter:
The value of a pipeline parameter only supports string values, so the function utcNow() or @{utcnow()} will be treated as a string.
When I create a custom type summary using a Python script, it is possible to access ivars using value.GetChildMemberByName("<child-name>"). However, this does not work for computed properties or functions.
With the frame variable command, the script that generates the summary can evaluate expressions in the current frame (e.g. value.GetFrame().EvaluateExpression(value.GetName() + ".description"))
However, this will not work when using p <some-expression> / expression -- <some-expression>, as there is no frame in that case, so the statement above will fail to produce any results.
Is there a way to call functions or evaluate computed properties in a type summary when using p (expression --)?
You might want to use SBValue.CreateValueFromExpression instead of either the frame or the target EvaluateExpression calls for data formatters.
SBValues remember the context they were defined in, and SBValue.CreateValueFromExpression funnels that context back to the expression evaluator. Since variable formatters always receive the SBValue they are acting on, CreateValueFromExpression gives you a simple way to forward that context to the new expression.
The EvaluateExpression function is available on the target as well as on frames. Try value.GetTarget().EvaluateExpression(...).
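Putting that together, a summary provider can be sketched like this. The provider name description_summary is hypothetical, and it assumes the formatted type responds to description, as in the Objective-C example above:

```python
# Hypothetical LLDB type summary provider. LLDB hands us the SBValue
# directly, so no frame lookup is needed: CreateValueFromExpression
# evaluates in the value's own context, which works under both
# `frame variable` and `p` / `expression --`.
def description_summary(value, internal_dict):
    expr = "(const char *)[" + value.GetName() + " description]"
    result = value.CreateValueFromExpression("desc", expr)
    summary = result.GetSummary() if result is not None else None
    return summary if summary is not None else "<no summary>"
```

You would register it with something like `type summary add -F yourmodule.description_summary YourType` (adjust the module and type names to your setup).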
Here is a sample workflow and its input struct:
func MyWorkflow(ctx cadence.Context, input MyWorkflowParameters) error {
...
}
type MyWorkflowParameters struct {
    SomeString  string
    SomeInteger int32
}
What’s the best way to pass the complex struct above as the input parameter to the Cadence CLI tool when starting or signaling a workflow?
For multiple params, pass one JSON value per parameter, separated by spaces:
--input '"a" "b" 123'
The input parameter of the Cadence command-line tool accepts values in a few different formats depending on what your workflow expects. Here are examples for three cases:
1. Single integers or strings:
--input 12345
--input '"my-string"'
2. Complex objects:
When the parameter is a struct as in your example, you need to pass a valid JSON encoded object as in the following example:
--input '{"SomeString":"my-string","SomeInteger":12345}'
3. Multiple parameters:
If you have a workflow that expects multiple parameters, you need to pass a single space-delimited string where each part of the string corresponds to a particular parameter expected by the workflow. The example below shows how you can pass one integer, one string, and one struct parameter, respectively:
--input '12345 "second param" {"SomeString":"my-string","SomeInteger":12345}'
On a related note, the recommended way to accept input parameters in a workflow is a single struct parameter. Even though the JSON syntax clutters the CLI command a little, especially when all you need to pass is a single value, it pays off when you start passing more parameters to the workflow.
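To avoid hand-quoting mistakes, the --input string can also be generated programmatically. A small Python sketch; the params list and the join logic are mine, not part of the Cadence CLI:

```python
import json

# One entry per workflow parameter: an int, a string, and a struct,
# matching the three-parameter example above.
params = [12345, "second param", {"SomeString": "my-string", "SomeInteger": 12345}]

# Cadence expects one space-delimited string of JSON-encoded values.
# Compact separators keep each JSON value free of internal spaces.
input_arg = " ".join(json.dumps(p, separators=(",", ":")) for p in params)
print(input_arg)
# Then invoke: cadence ... start ... --input '<input_arg>'
```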
Suppose I have a foreach inside of a pipe:
I'd like to iterate through the following:
@split(split(item().name,'_')[4],'-')[1]
However, I'd like to pass this formula in through a parameter.
I've defined a parameter myExpression with the desired value in the pipeline, and I am attempting to reference it like so:
Note that the full expression would be: @{pipeline().parameters.myExpression}
However, Data Factory does not evaluate that expression; it just passes it through as a verbatim string:
@{pipeline().parameters.myExpression}
How do we pass in an expression from parameters from within the pipeline?
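For reference, the expression in question is just nested string splitting. Here is the equivalent in Python, with a made-up item name (the real names follow whatever pattern your data set uses):

```python
# Hypothetical file name with a date in the fifth underscore-delimited field.
name = "sales_eu_daily_v1_2024-01-15"

# Mirrors split(split(item().name,'_')[4],'-')[1]:
# take the fifth '_'-field, then the second '-'-field within it.
value = name.split("_")[4].split("-")[1]
print(value)  # "01"
```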
When you define a parameter like this and pass the value, you are sending a string input, because the textbox doesn't accept expressions. The only way to pass an expression to a parameter is to pass it from another pipeline. Another issue is an ADF limitation: there cannot be nested iterations. Calling a second pipeline solves both problems.
Split your flow into two pipelines.
First (parent) pipeline: keep all steps up until generating the array over which the iteration has to happen.
@split(split(item().name,'_')[4],'-')[1]
Then, inside a ForEach loop, invoke an Execute Pipeline activity and pass the expression you want in a similar fashion.
In the child pipeline, define a string parameter to absorb the passed value. Then use @pipeline().parameters.ParamName to reference it there.
HTH
Your description lacks a lot of context about what you are trying to do. I can only presume that you generate an array in one pipeline and want to iterate over it in another. Looking at your screenshot, it looks like you typed the value in directly, so the output is plain text; you should use the "Add dynamic content" option instead, so the value is evaluated as an expression.