Azure Data Factory pass parameters into copy task

I have a pipeline with a ForEach loop that gets a list of cities, loops through them making a web service (HTTP source) call for each, and writes to a database (sink).
It works great. However, the API does not return the city as part of the response, so I wanted to pass the ForEach parameter (@{item().LocationName}) into the Copy task as part of the column mapping, i.e.:
"columnMappings": {
"#{item().LocationName}": "location",
"blah": "blah",
"blah": "blah",
The 'problem' is that instead of mapping the value of @{item().LocationName} into the written results, it uses the value of @{item().LocationName} as a column mapping name, i.e. it tries to find a column named "Vancouver" instead of passing the value "Vancouver" into location.
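For illustration, with item().LocationName = "Vancouver", the interpolated mapping the Copy task actually receives would be (a sketch, not output from the original pipeline):
"columnMappings": {
    "Vancouver": "location"
}
ADF then looks for a source column literally named Vancouver, which is why the value never lands in location.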
I would say the current behavior is not 'wrong', but I'm looking for a way to pass a variable/parameter in as part of the Copy task.
I've tried setting this at the source dataset level with no luck, and it just seems like the wrong place to be setting a variable anyway.
EDIT: I've also tried setting the @{item().LocationName} value in the dataset, in the connection's JSONPath Expression section; it runs, but does not appear to set a value for location.
"typeProperties": {
"format": {
"type": "JsonFormat",
"filePattern": "setOfObjects",
"jsonPathDefinition": {
"location": "#{item().LocationName}"

Related

How to select object attribute in ADF using variable

I'm trying to parametrize a pipeline in Azure Data Factory in order to enable certain functionality for multiple environments. The idea is that the current environment is always available through a global parameter. I'd like to use this parameter to look up an array of environments to process data to. Example:
targetEnvs = [{ "dev": ["dev"], "test": ["dev", "test"], "acc": [], "prod": ["acc", "prod"] }]
Then one should be able to select the targetEnv array with something like targetEnvs[environment] or targetEnvs.environment. Subsequently a ForEach is used to execute some logic on these target environments.
I tried setting this up with targetEnvs as a pipeline parameter, with a default value mapping each env directly to a targetEnv, as follows: {"dev": ["dev"], "test": ["test"]}. Then I have a Set variable step that takes its value from the targetEnvs parameter.
I'm now looking for a way to use the current environment (stored in a global parameter) instead of hardcoding "dev" in the Set Variable expression, but I'm not sure how to do this.
The expression I tried won't even start the pipeline.
Question: how do I select this attribute of the object? Any other suggestions on how to tackle this problem are welcome as well!
(A Python analogy would be having a dictionary target_envs and taking a value from it by the key current_env: target_envs[current_env].)
When I tried to access the object the same way you did, the same error occurred. I set up the parameter targetEnv (the given array) and a global parameter environment with the value dev.
You can use the following dynamic content to access the key value.
@pipeline().parameters.targetEnv[0][pipeline().globalParameters.environment]
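A minimal sketch of how this resolves, assuming the default array from the question and environment set to dev: targetEnv[0] selects the single object inside the array, and the bracket then indexes that object by the key dev, so the expression returns ["dev"]. That result can be wired straight into a ForEach:
"items": {
    "value": "@pipeline().parameters.targetEnv[0][pipeline().globalParameters.environment]",
    "type": "Expression"
}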

How to extract the value from a json object in Azure Data Factory

I have my ADF pipeline, where the final output from a Set variable activity is something like this: {name: test, value: 1234}.
The input coming to this variable is
{
    "variableName": "test",
    "value": "test:1234"
}
The expression provided in the Set variable Item column is @item().ColumnName, and the ColumnName in my JSON file is something like this: "ColumnName": "test:1234".
How can I change it so that I get only 1234? I am only interested in the value coming here.
It looks like you need to split the value on the colon, which you can do using Azure Data Factory (ADF) expressions and functions: the split function, which splits a string into an array, and the last function, which gets the last item from the array. This works quite neatly in this case:
@last(split(variables('varWorking'), ':'))
Sample result: with varWorking set to test:1234, the expression returns 1234.
Change the variable name to suit your case. You can also use string methods like lastIndexOf to locate the colon, and grab the rest of the string from there. A sample expression would be something like this:
@substring(variables('varWorking'), add(indexOf(variables('varWorking'), ':'), 1), 4)
It's a bit more complicated but may work for you, depending on the requirement.
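To trace the substring version against the sample value test:1234: indexOf finds the colon at position 4, add(..., 1) moves past it to 5, and substring takes 4 characters from there, yielding 1234. Note that the hardcoded length 4 assumes the value after the colon is always four characters long; the last(split(...)) form has no such limitation.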
It seems you are using this inside an iterator, since you reference item(); however, I tried it with a simple JSON lookup value:
@last(split(activity('Lookup').output.value[0].ColumnName, ':'))

using concat in ADF with a pipeline parameter value

I have a pipeline with a copy activity from storage.
I'm using the concat method to combine a number of parameters into the folder path in the storage account.
I have a wildcardFolderPath field which gets its data from the parameters file.
Part of the data is a string and the other part is a pipeline parameter:
"wildcardFolderPath": {
"value": "[concat(parameters('folderPath'), '/', parameters('folderTime')]",
"type": "Expression"
}
When the pipeline runs, the string param folderPath is retrieved as-is, but the value of folderTime is not evaluated; this is what I see:
formatDateTime(pipeline().parameters.currentScheduleDateTime) instead of the datetime string.
I also tried using:
@concat(parameters('folderPath'), '/', parameters('folderTime'))
and
@{concat(parameters('folderPath'), '/', parameters('folderTime'))}
but I get: The workflow parameter 'folderPath' is not found.
Anyone encountered such an issue?
Create a parameter at the pipeline level and reference it in the expression builder with the following syntax:
@pipeline().parameters.parametername
Example:
You can add the parameter inside Add dynamic content if it hasn't been created before, and select the created parameters to build an expression.
@concat(pipeline().parameters.Folderpath, '/', pipeline().parameters.Filedate)
Code:
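A minimal sketch of the resulting copy activity source property, assuming the same wildcardFolderPath field as in the question:
"wildcardFolderPath": {
    "value": "@concat(pipeline().parameters.Folderpath, '/', pipeline().parameters.Filedate)",
    "type": "Expression"
}
The key difference from the original attempt is @pipeline().parameters.name in place of the ARM-template-style parameters('name'), which is what triggered the "workflow parameter 'folderPath' is not found" error.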

Data factory lookup (dot) in the item() name

I have a Lookup activity that runs a Salesforce query, and I use its elements (item()) in subsequent activities. Until now I had item().name or item().email, but now I have item().NVMStatsSF__Related_Lead__r.FirstName, which has a dot in the field name.
How should I reference it in the body so that it reads correctly?
So I have the following data in item():
{
    "NVMStatsSF__Related_Lead__c": "00QE000egrtgrAK",
    "NVMStatsSF__Agent__r.Name": "ABC",
    "NVMStatsSF__Related_Lead__r.Email": "geggegg@gmail.com",
    "NVMStatsSF__Related_Lead__r.FirstName": "ABC",
    "NVMStatsSF__Related_Lead__r.OwnerId": "0025434535IIAW"
}
Now when I use item().NVMStatsSF__Agent__r.Name, it will not parse because of the dot after NVMStatsSF__Agent__r, and it gives me the following error:
'item().NVMStatsSF__Related_Lead__r.Email' cannot be evaluated because property 'NVMStatsSF__Related_Lead__r' doesn't exist, available properties are 'NVMStatsSF__Related_Lead__c, NVMStatsSF__Agent__r.Name, NVMStatsSF__Related_Lead__r.Email, NVMStatsSF__Related_Lead__r.FirstName, NVMStatsSF__Related_Lead__r.OwnerId'.
"failureType": "UserError",
"target": "WebActivityToAddPerson"
This is because ADF uses '.' for object property access.
Could you find a way to rename the field so that it doesn't contain '.'?
It seems like you need a built-in function that gets the value of an object by key, like getValue(item(), 'key.nestkey'). But unfortunately, there doesn't seem to be such a function. You may need to handle your key first.
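One thing that may be worth trying (an assumption based on ADF sharing its expression language with Logic Apps, not something verified in this thread) is bracket indexing with the literal key, which keeps the dot from being parsed as a property separator:
@item()['NVMStatsSF__Related_Lead__r.Email']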
Finally, it worked. I was being silly.
Instead of taking the value from the child table with the dot operator, I just used a subquery. Silly me.
And it worked.

Cast values to string in Json Path Expression in Azure Data Factory copy activity

I have an input JSON file where the actual value of a property could be either a numeric value or a string. I extract the value by specifying a JSON path expression like
"fieldValue": "values[*].value"
in the Azure Data Factory copy activity, on the connection tab for the source.
Since the actual field value in the JSON could be something like "X" or 2.34, it is not able to parse them all into strings, even though in the schema I specify fieldValue as a string.
So is there a way I could cast it so that it takes the string as-is when the value is "X", and if it's 2.34, converts it to "2.34"?
"fields" : "[{"fieldId":"fieldName", "values": [{value: 2.34}]},....}]"
You can use expressions in the value field.
See expressions and functions in the ADF documentation.
Example usage:
"field": {
"value": "#string(your_value)",
"type": "Expression"
}
And in the ADF visual tool, there's an "Add dynamic content" link below each field. Expressions, functions, and system variables can be added there.
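Applied to this question's sample values, @string(2.34) returns "2.34" while @string('X') returns "X" unchanged, so routing the extracted value through string() normalizes both cases. A sketch, assuming a hypothetical item().value carrying the extracted value (substitute whatever expression actually holds it in your pipeline):
"fieldValue": {
    "value": "@string(item().value)",
    "type": "Expression"
}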