ADF: Dynamic Content in parameters - azure-data-factory

I am trying to pass text with dynamic content as a parameter into a pipeline (execute pipeline activity).
As a super simple example, I want the input to my pipeline to be a timestamp, utcnow(). Here are my results:
I've noticed:
If I put @utcnow() in a Set Variable activity and set the Execute Pipeline parameter to that variable, it works.
If I put @utcnow() (or @{utcnow()}) in the main parameter and set the Execute Pipeline parameter to that parameter, it does not work. I get the string "utcnow()" as the result.
Is there anything that I am missing here? I definitely feel like I've done this successfully before.

If I understand your question correctly, the issue is that the main parameter (the pipeline parameter) doesn't support expressions or functions.
For example, we can pass a value from a variable to the Execute Pipeline activity's parameter, and it works well, because variables support expressions/functions:
When the main pipeline contains only an Execute Pipeline activity, we pass the value from the main parameter (pipeline parameter) to the Execute Pipeline parameter:
When we debug the pipeline, we need to pass the value of the main parameter:
The value of a pipeline parameter only supports string values, so utcNow() or @{utcnow()} is treated as a literal string.
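The working pattern above can be sketched as follows (the variable and parameter names are assumed for illustration):

```
// In the main pipeline:
Set Variable "ts"              value: @utcnow()         → evaluated to a timestamp
Execute Pipeline parameter     value: @variables('ts')  → receives the evaluated timestamp

// Whereas a pipeline parameter's value box accepts only literals:
Pipeline parameter "inputTs"   value: utcnow()          → stays the literal string "utcnow()"
```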

Related

Passing Parameter to Data Flow in Azure Data Factory

I have a data flow which takes startTime and endTime as parameters, and I am passing values to these parameters from a pipeline using a pipeline expression, but the expression is not evaluated; it is treated as a string. Attached are the images.
I found the root cause:
long() is not supported in the expression builder.
@ was missing at the beginning of the expression.
So the working expression, which gives the current epoch time, was:
@div(sub(ticks(startOfDay(utcNow())),ticks('1970-01-01T00:00:00.0000000Z')),10000000)
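The arithmetic in that expression can be sanity-checked outside ADF. Here is a minimal Python sketch (not ADF code) of the same calculation: a .NET tick is 100 nanoseconds, so dividing the tick difference by 10,000,000 gives whole seconds since the Unix epoch at the start of the current UTC day.

```python
from datetime import datetime, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def ticks(dt: datetime) -> int:
    """Ticks (100-nanosecond intervals) counted from the Unix epoch."""
    return int((dt - EPOCH).total_seconds()) * 10_000_000

def start_of_day_epoch_seconds(now: datetime) -> int:
    """Mirrors div(sub(ticks(startOfDay(utcNow())), ticks(epoch)), 10000000)."""
    start_of_day = now.replace(hour=0, minute=0, second=0, microsecond=0)
    return (ticks(start_of_day) - ticks(EPOCH)) // 10_000_000

print(start_of_day_epoch_seconds(datetime(2024, 1, 2, 15, 30, tzinfo=timezone.utc)))
```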

How to return integer value from notebook in adf pipeline

I have a use case where I need to return an integer as output from a Synapse notebook in a pipeline and pass this output to the next stage of my pipeline.
Currently mssparkutils.notebook.exit() takes only string values. Are there any utility methods available for this?
I know we can cast the integer to a string and send it to the exit("") method. I wanted to know if I could achieve this without casting.
The cast() function is the standard and official method suggested by Spark itself. AFAIK, there is no other method; otherwise, you need to manage it programmatically.
You can also try @equals in dynamic content to check whether the exitValue fetched from the Notebook activity output equals some specific value.
@equals(activity('Notebook').output.status.Output.result.exitValue, '<value>')
Refer: Spark Cast String Type to Integer Type (int), Transform data by running a Synapse notebook
Instead, you can convert the string number to an integer in dynamic content, like this:
@equals(int(activity('Notebook').output.status.Output.result.exitValue), 1)
Or add an activity that sets the string value to a variable of an integer type.
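The cast-and-parse round trip can be sketched in plain Python (the mssparkutils call is shown only as a comment, since it exists only inside Synapse; the value 7 is an assumed example):

```python
# Notebook side: exit() only accepts strings, so cast the integer result.
result = 7                       # integer computed by the notebook
exit_value = str(result)         # i.e. mssparkutils.notebook.exit(str(result))

# Pipeline side: dynamic content parses the string back to an integer,
# as in @equals(int(activity('Notebook').output...exitValue), 7),
# so the comparison happens on integers rather than strings.
parsed = int(exit_value)
print(parsed == result)
```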

passing a variable into a templates parameter

I have a template which has a parameter 'enableVM1' of type boolean.
Simply, I want this parameter to be set by an expression. I want this expression to resolve at runtime since the data is retrieved by an earlier step.
- stage: Build_Tenant_Refresh
  displayName: "Destroying Tenant VM"
  variables:
    vm1ActiveFlip: $[ not(eq(stageDependencies.Shutdown_Tenant.Setup.outputs['Identify_built_VM.vm1Active'],'True')) ]
  jobs:
  - template: tenant-infrastructure-plan.yml
    parameters:
      enableVM1: <<ANY EXPRESSION WHICH I'D EXPECT TO RESOLVE TO A BOOL>>
When I press the Run button on the pipeline, I am immediately told that enableVM1's value is not a boolean.
This suggests that template parameters are evaluated at parse/compile time rather than at runtime. Is this true?
I was intending for the expression to be: $[vm1ActiveFlip] (referencing the variable defined at the stage).
I tried lots of variants for the expression including:
$[eq('vm1ActiveFlip','True')]
$[eq('True','True')]
Is it possible to achieve what I need?
I tested enableVM1: $[eq(variables['Build.SourceBranch'],'refs/heads/main')] and reproduced your issue:
To solve this, you need to use compile-time expressions (${{ <expression> }}). If you use a runtime expression, then when you click the Run button the expression has not yet been evaluated to a boolean and is judged as a string.
In a compile-time expression (${{ <expression> }}), you have access to parameters and statically defined variables. In a runtime expression ($[ <expression> ]), you have access to more variables but no parameters.
This is stated in this document; please refer to it.
Update:
As a workaround, use job output variables and introduce a dependsOn in the template. For details, please refer to this document.
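A minimal sketch of the compile-time form of the template call (the branch condition is an assumed example; any value known when the pipeline is compiled would work):

```yaml
- stage: Build_Tenant_Refresh
  displayName: "Destroying Tenant VM"
  jobs:
  - template: tenant-infrastructure-plan.yml
    parameters:
      # compile-time expression: evaluated when you press Run,
      # so it can be validated against the boolean parameter type
      enableVM1: ${{ eq(variables['Build.SourceBranch'], 'refs/heads/main') }}
```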

how to pass in an expression through a parameter

Suppose I have a ForEach inside of a pipeline:
I'd like to iterate through the following:
@split(split(item().name,'_')[4],'-')[1]
However, I'd like to pass this formula in through a parameter.
I've defined a parameter myExpression with the desired value in the pipeline, and attempting to reference it like so:
Note that the full expression would be: @{pipeline().parameters.myExpression}
However, Data Factory does not evaluate that expression; it just accepts it as a verbatim string:
@{pipeline().parameters.myExpression}
How do we pass in an expression from parameters from within the pipeline?
When you define a parameter like this and pass the value, what you are doing is sending a string input, as the textbox doesn't accept expressions. The only way to pass an expression to a parameter is to pass it from another pipeline. Another issue is an ADF limitation: there cannot be nested iterations. Calling a second pipeline solves both problems.
Split your flow into two pipelines.
First (parent) pipeline: keep all the steps up until generating the array over which iteration has to happen.
@split(split(item().name,'_')[4],'-')[1]
Then, inside a ForEach loop, invoke an Execute Pipeline activity. In there, pass the expression that you desire in a similar fashion.
In the child pipeline, define a string parameter to absorb the passed value. Then use @pipeline().parameters.ParamName to use it there.
HTH
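The Execute Pipeline step in the parent can be sketched as JSON (the pipeline and parameter names are assumed for illustration):

```json
{
  "name": "Run child per item",
  "type": "ExecutePipeline",
  "typeProperties": {
    "pipeline": { "referenceName": "ChildPipeline", "type": "PipelineReference" },
    "parameters": {
      "ParamName": "@{split(split(item().name,'_')[4],'-')[1]}"
    }
  }
}
```

Because the parent evaluates the @{...} interpolation before invoking the child, the child receives the computed value as a plain string parameter.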
Your description lacks a lot of context about what you are trying to do. I can only presume that you generate an array in one pipeline and want to iterate over it in another. Looking at your screenshot, it looks like you typed the value in directly, so the output is plain text; you should use the dynamic content editor instead, so the value is entered as an expression rather than a literal.

Dynamic T-SQL input to T-SQL Task in SSIS

My SSIS package includes a T-SQL task. I have a package parameter that I want to pass into my T-SQL task but I can't find the correct syntax to do this:
DECLARE @myVariable int;
SET @myVariable = $Package::myParameter --not working
SET @myVariable = @[$Package::myParameter] -- also not working
What is the correct way to pass parameters to a T-SQL task?
I'd recommend using an Execute SQL Task, as it provides more functionality than an Execute T-SQL Statement Task. However, if you're looking to use the T-SQL task with a parameter, this can be done by creating a string variable with an expression that includes the parameter. An example of this is below.
To set this as the statement for the T-SQL task, go to the Properties window of the task (press F4), click the ellipsis next to the Expressions field, select the SqlStatementSource property, and add the string variable containing the T-SQL as the Expression.
Since the variable in your SQL is of the INT data type, I'm assuming the package parameter is as well, so it needs to be cast to a string to be included as part of the expression in the string variable. It will still be parsed as a numeric data type and submitted to SQL Server as such. This casting is done with the (DT_STR, length, code page) function below, which here uses an example length of 10. As a side note, the (DT_WSTR, length) function would be used for Unicode data.
Make sure to enclose the SQL text in quotes as done below. Also be aware that parameter names are case sensitive within an expression; for example, @[$Package::MyParameter] would return an error if the parameter name was @[$Package::myParameter], starting with a lowercase m.
"DECLARE @myVariable INT;
SET @myVariable = " + (DT_STR, 10, 1252)@[$Package::myParameter] + "
UPDATE Database.Schema.Table
SET NAME = 'TW'
WHERE ID = @myVariable"
You can't pass parameters to a T-SQL task.
According to the documentation:
If you need to run parameterized queries, save the query results to variables, or use property expressions, you should use the Execute SQL task instead of the Execute T-SQL Statement task. For more information, see Execute SQL Task.
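For reference, a sketch of the same statement rewritten for an Execute SQL Task (an OLE DB connection is assumed, where ? placeholders are mapped to package parameters on the task's Parameter Mapping page):

```sql
-- Parameter Mapping (assumed): $Package::myParameter → parameter name "0", Input
DECLARE @myVariable int;
SET @myVariable = ?;             -- ? is the mapped package parameter

UPDATE Database.Schema.Table
SET NAME = 'TW'
WHERE ID = @myVariable;
```

This avoids the string-building and (DT_STR, ...) cast entirely, since the task passes the parameter as a typed value.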