I have a pipeline with a copy activity that reads from storage.
I'm using the concat function to combine a number of parameters to create the folder path in the storage account.
I have a wildcardFolderPath field which gets its data from the parameters file.
Part of the data is a string and the other part is a pipeline parameter:
"wildcardFolderPath": {
"value": "[concat(parameters('folderPath'), '/', parameters('folderTime')]",
"type": "Expression"
}
When the pipeline runs, the string parameter folderPath is retrieved as-is, but the value of folderTime is not evaluated; instead of the datetime string, this is what I see:
formatDateTime(pipeline().parameters.currentScheduleDateTime)
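For context, the relevant entry of the parameters file presumably looks something like this (the folderPath value "raw/sales" is purely illustrative; the folderTime value is the literal string from the question):
{
    "parameters": {
        "folderPath": { "value": "raw/sales" },
        "folderTime": { "value": "formatDateTime(pipeline().parameters.currentScheduleDateTime)" }
    }
}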
I also tried using:
@concat(parameters('folderPath'), '/', parameters('folderTime'))
and
@{concat(parameters('folderPath'), '/', parameters('folderTime'))}
but I get: The workflow parameter 'folderPath' is not found.
Anyone encountered such an issue?
Create a parameter at the pipeline level and reference it in the expression builder with the following syntax:
@pipeline().parameters.parametername
Example:
You can add the parameter inside Add dynamic content if it hasn't been created before, and select the created parameters to build an expression.
@concat(pipeline().parameters.Folderpath, '/', pipeline().parameters.Filedate)
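A minimal sketch of how this can end up in the copy activity's source, reusing the wildcardFolderPath field from the question (the storeSettings shape follows the standard copy-activity schema; the source type and wildcardFileName are illustrative assumptions):
"source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "wildcardFolderPath": {
            "value": "@concat(pipeline().parameters.Folderpath, '/', pipeline().parameters.Filedate)",
            "type": "Expression"
        },
        "wildcardFileName": "*.csv"
    }
}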
I'm trying to parametrize a pipeline in Azure Data Factory in order to enable certain functionality for multiple environments. The idea is that the current environment is always available through a global parameter. I'd like to use this parameter to look up an array of environments to process data to. Example:
targetEnvs = [{ "dev": ["dev"], "test": ["dev", "test"], "acc": [], "prod": ["acc", "prod"] }]
Then one should be able to select the targetEnv array with something like targetEnvs[environment] or targetEnvs.environment. Subsequently a ForEach is used to execute some logic on these target environments.
I tried setting this up with targetEnvs as a pipeline parameter (with a default value mapping each env directly to targetEnv, as follows: {"dev": ["dev"], "test": ["test"]}). Then I have a Set Variable step that takes its value from the targetEnvs parameter.
I'm now looking for a way to use the current environment (stored in a global parameter) instead of hardcoding "dev" in the Set Variable expression, but I'm not sure how to do this.
Using this expression won't even start the pipeline.
Question: how do I select this attribute of the object? Any other suggestions on how to tackle this problem are welcome as well!
(Python analogy would be to have a dictionary target_envs and taking a value from it by using the key "current_env": target_envs[current_env].)
When I tried to access the object the same way as you, the same error occurred. I have taken the parameter targetEnv (the given array) and a global parameter environment with the value dev.
You can use the following dynamic content to access the key value.
@pipeline().parameters.targetEnv[0][pipeline().globalParameters.environment]
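A sketch of this wired into a Set Variable activity, assuming an array-type variable (the variable and activity names are illustrative; the JSON shape follows the standard ADF activity schema):
{
    "name": "Set target environments",
    "type": "SetVariable",
    "typeProperties": {
        "variableName": "targetEnvArray",
        "value": {
            "value": "@pipeline().parameters.targetEnv[0][pipeline().globalParameters.environment]",
            "type": "Expression"
        }
    }
}
With environment set to dev, this yields ["dev"], which the ForEach can then iterate over. The [0] is needed because the targetEnv default value wraps the mapping object in an array.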
I have the following REST configuration in Azure Data Factory
As you can see, I'm getting the error:
'item' is not a recognized function
The full configuration is
convert?q=USD_@{item().Currency}&compact=ultra&apiKey=xxxxxxxxxxxxxxxxxxx
Do I need to configure @item in Parameters?
The guide suggests I need to follow these steps.
Based on your code in the dynamic content, you are using this REST resource inside a ForEach, as the expression above uses the item() function. You can get item().<value> inside a ForEach by feeding it from a Lookup.
item() is a ForEach function and can only be used inside a ForEach in an ADF pipeline. You are using this ForEach function inside a dataset, where it is not known; that's why it gives a warning. When you use that dataset only for that pipeline, it will give you the result without any error, but for any other pipeline the warning becomes an error.
To use a pipeline function in the dataset, the best practice is to create a dataset parameter and supply its value from the pipeline, as below.
Create a dataset parameter of string type with a default value.
Reference this parameter in the dataset's dynamic content.
Now you can pass pipeline-function values to this parameter inside the ForEach or the pipeline.
Here I have used a Copy activity as a sample and given the value as per my URL. You can give your relative URL with the item() function in the dynamic content.
Based on the item().Currency value, it will build the REST page URL in each iteration.
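A sketch of the two pieces, assuming a dataset parameter named RelativeUrl (the names and default value are illustrative; the RestResource and DatasetReference shapes follow the standard ADF schema, and the linked service is omitted):
In the dataset:
"parameters": {
    "RelativeUrl": { "type": "string", "defaultValue": "convert?q=USD_EUR&compact=ultra" }
},
"typeProperties": {
    "relativeUrl": {
        "value": "@dataset().RelativeUrl",
        "type": "Expression"
    }
}
In the Copy activity inside the ForEach:
"inputs": [{
    "referenceName": "RestDataset",
    "type": "DatasetReference",
    "parameters": {
        "RelativeUrl": "convert?q=USD_@{item().Currency}&compact=ultra&apiKey=xxxxxxxxxxxxxxxxxxx"
    }
}]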
I have my ADF pipeline, where the final output from a Set Variable activity is something like this: {name: test, value: 1234}.
The input coming to this variable is
{
"variableName": "test",
"value": "test:1234"
}
The expression provided in the Set Variable item column is @item().ColumnName, and the ColumnName in my JSON file is something like this: "ColumnName":"test:1234".
How can I change it so that I get only 1234? I am only interested in the value coming in here.
It looks like you need to split the value on the colon, which you can do using Azure Data Factory (ADF) expressions and functions: the split function, which splits a string into an array, and the last function, which gets the last item of the array. This works quite neatly in this case:
@last(split(variables('varWorking'), ':'))
Sample result: with varWorking set to test:1234, the expression returns 1234.
Change the variable name to suit your case. You can also use string methods like lastIndexOf to locate the colon, and grab the rest of the string from there. A sample expression would be something like this:
@substring(variables('varWorking'), add(indexof(variables('varWorking'), ':'), 1), 4)
It's a bit more complicated but may work for you, depending on the requirement.
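For completeness, a sketch of the split/last approach wired into a Set Variable activity (the variable names varWorking and varResult are illustrative; the JSON shape follows the standard ADF activity schema):
{
    "name": "Set varResult",
    "type": "SetVariable",
    "typeProperties": {
        "variableName": "varResult",
        "value": {
            "value": "@last(split(variables('varWorking'), ':'))",
            "type": "Expression"
        }
    }
}
Note that the substring variant hard-codes a length of 4, so split/last is the more robust choice when the value length varies.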
It seems like you are using it inside an iterator, since you have item(). However, I tried it with a simple JSON lookup value:
@last(split(activity('Lookup').output.value[0].ColumnName, ':'))
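Assuming the Lookup returns the JSON from the question (with firstRowOnly set to false, which is why .value[0] is indexed), its output would look something like this:
{
    "count": 1,
    "value": [
        { "ColumnName": "test:1234" }
    ]
}
so the expression above evaluates to 1234.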
I have a concat expression defined in the Function Name setting of an Azure Function activity in my pipeline, where it concatenates the API query with the current filename that I want to run the function on. When I debug the pipeline, it fails without giving me any feedback; it just says "AzureFunction failed:".
If I manually insert the string, it works fine.
The concat expression is:
@concat('HttpTrigger?filename=', variables('filename'))
I'm new to Azure, any way I can debug this?
Try this way:
@concat(variables('FirstName'), variables('LastName'))
You could use a Set Variable activity together with your Azure Function activity.
In the Set Variable activity, set the value of the variable.
Then refer to the variable in the Azure Function activity:
@concat('HttpTriggerJS1?name=', variables('name'))
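A sketch of how the expression sits in the Azure Function activity's functionName (the JSON shape follows the standard ADF activity schema; the activity name and method are assumptions):
{
    "name": "Azure Function1",
    "type": "AzureFunctionActivity",
    "typeProperties": {
        "functionName": {
            "value": "@concat('HttpTriggerJS1?name=', variables('name'))",
            "type": "Expression"
        },
        "method": "GET"
    }
}
Setting the value through a variable first also lets you inspect the resolved string in the debug output, which helps when the activity fails with no message.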
I have a pipeline with a ForEach loop that gets a list of cities to loop through and make a web service (HTTP source) call and writes to a database (sink).
It works great; however, the API does not return the city as part of the response, so I wanted to pass the ForEach parameter (@{item().LocationName}) into the Copy task as part of the column mapping, i.e.:
"columnMappings": {
"#{item().LocationName}": "location",
"blah": "blah",
"blah": "blah",
The 'problem' is that instead of mapping the value of @{item().LocationName} into the written results, it uses the value of @{item().LocationName} as a column-mapping name, i.e. it tries to find a column named "Vancouver" instead of passing the value "Vancouver" in to location.
I would say the current behavior is not 'wrong', but I'm looking for a way to pass a variable/parameter in as part of the Copy task.
I've tried setting this at the source dataset level with no luck, and it just seems like the wrong location to be setting a variable anyways.
EDIT: I've also tried setting the @{item().LocationName} value in the dataset's connection, in the JSONPath Expression section; it runs, but does not appear to set a value for location.
"typeProperties": {
"format": {
"type": "JsonFormat",
"filePattern": "setOfObjects",
"jsonPathDefinition": {
"location": "#{item().LocationName}"