How to provide LinkedService with a parameter value from Mapping Dataflow? - azure-data-factory

I am using an Azure SQL Database as a data source in a mapping data flow. The parameter KeyVaultConnectionString should be dynamic. See picture 1.
My problem is that I am unable to figure out how to provide the value from the mapping data flow. I have tried different things like @pipeline.parameters.ConnString, but that's just crap.
Thanks.
***** UPDATE (28-09-2022) *****
I forgot to explain that the flow is a Flowlet. The picture below shows the Flowlet that uses a SQL Server as a source. I chose an inline dataset, but I am not able to pass a variable from the "outer" mapping data flow to the data source.

For an inline dataset as a source in data flows, the parameters option does not show up as a separate section. I tried the following linked service, where a linked service parameter supplies my blob storage name.
Now when I use this as an inline dataset in the data flow source and test the connection, it fails.
However, finish configuring the data flow with the required transformations and use a Data flow activity in an Azure Data Factory pipeline. The activity then shows the parameter so the necessary value can be passed through to the linked service.
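As a rough sketch (the linked service name, parameter name, and endpoint format below are placeholders, and the exact type properties depend on the authentication method you chose), a parameterized Blob Storage linked service looks something like this:

    {
      "name": "LS_BlobStorage",
      "properties": {
        "type": "AzureBlobStorage",
        "parameters": {
          "storageAccountName": { "type": "String" }
        },
        "typeProperties": {
          "serviceEndpoint": "https://@{linkedService().storageAccountName}.blob.core.windows.net/"
        }
      }
    }

When the data flow that uses this linked service inline is run from a Data flow activity, the activity surfaces storageAccountName so you can supply a value there, either a fixed account name or pipeline dynamic content.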

When you are in design/debug mode in the data flow designer, parameters will show in the right-hand panel when you click "Debug Settings" at the top of the designer window.

Related

Copy subdirs + files from storage acct to ADX using Data Factory

I'm trying to copy files from Azure Storage blobs to ADX using Data Factory, but I can't find a solution that uses JSON datasets (not binary) so I can map the schema. I would love to have a template for this.
I have tried to follow the resolution mentioned here (Get metadata from Blob storage with "folder like structure" using Azure Data Factory pipeline), but I'm lacking some more guidance (this is my first project using ADF).
Now I am getting another error.
I'm actually also looking for a COMPLETE guide to setting this up.
Here is my overall use-case target: https://learn.microsoft.com/en-us/azure/sentinel/store-logs-in-azure-data-explorer?tabs=azure-storage-azure-data-factory. The documentation is missing the detailed steps; step 6 only says "Create a data pipeline with a copy activity, based on when the blob properties were last modified. This step requires an extra understanding of Azure Data Factory. For more information, see Copy activity in Azure Data Factory and Azure Synapse Analytics."
It seems I was going in the wrong direction.
I have actually found a simple solution for setting this up. I have tried the copy data tool, and it seems to be doing what I want. So case closed :-)
From the error message you have shared, it seems like your dynamic expression for passing the childItems of your Get Metadata activity to the ForEach activity's items is causing this problem.
You might be using Items = @activity('GetMetadataActivity').output instead of @activity('GetMetadataActivity').output.childItems.
Please use Items = @activity('GetMetadataActivity').output.childItems in your ForEach activity, which should resolve your error.
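For reference, a minimal sketch of the corrected ForEach definition (the activity names are whatever you used in your pipeline, and the inner activities are omitted):

    {
      "name": "ForEachChildItem",
      "type": "ForEach",
      "dependsOn": [
        { "activity": "GetMetadataActivity", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "items": {
          "value": "@activity('GetMetadataActivity').output.childItems",
          "type": "Expression"
        },
        "activities": []
      }
    }

The key point is that items must resolve to an array, which is exactly what childItems is; passing the whole output object is what triggers the "Length expects its parameter to be array or string" error.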
Here is a video demonstration by a community volunteer where this error has been explained in detail: ADF Error Function Length expects its parameter to be array or string

Azure Data Factory propagating parameters

Using Debug on the pipeline with parameters throws a connection error, but with Az PowerShell the pipeline executes fine:
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "name" -DataFactoryName "name" -PipelineName "name" -ParameterFile "json"
The pipeline uses a Lookup activity that queries Cosmos DB for document identifiers, which are handed over to a ForEach activity with a single Copy activity that uses the identifier to get the entire document and then writes it to Blob Storage. Is there any way to supply the linked service parameters when debugging?
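For what it's worth, -ParameterFile just points at a plain JSON file of pipeline parameter names and values, something along these lines (the parameter names here are made up):

    {
      "cosmosDatabase": "mydb",
      "outputContainer": "documents"
    }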
I reproduced the above scenario with an ADLS Gen2 linked service and a Lookup activity.
When I test the connection of the linked service from the dataset, it gives a connection error.
If we debug the pipeline at this point, it gives a linked service parameter propagation error like the one below.
At debug time, the pipeline only prompts for pipeline parameter values; it offers no option to supply values for dataset parameters or linked service parameters.
If you want to supply the linked service parameter value using pipeline dynamic content, you can use dataset parameters as a bridge.
If the dataset parameter has no default value, the pipeline will only validate once that value is given in the pipeline.
First, create a dataset parameter.
Now assign the dataset parameter to the linked service parameter in the dynamic content.
Now you can pass pipeline dynamic content to the linked service parameter; for example, I pass the storage account name to the dataset parameter, which then also becomes the linked service parameter value.
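A hypothetical sketch of how the pieces line up in JSON (all names are placeholders): the dataset declares a parameter, forwards it to the linked service parameter, and the pipeline activity supplies the actual value.

Dataset:

    {
      "name": "DS_Adls",
      "properties": {
        "type": "DelimitedText",
        "parameters": {
          "accountName": { "type": "String" }
        },
        "linkedServiceName": {
          "referenceName": "LS_AdlsGen2",
          "type": "LinkedServiceReference",
          "parameters": {
            "storageAccountName": "@dataset().accountName"
          }
        },
        "typeProperties": {
          "location": { "type": "AzureBlobFSLocation", "fileSystem": "data", "fileName": "sample.csv" }
        }
      }
    }

Lookup activity in the pipeline:

    {
      "name": "Lookup1",
      "type": "Lookup",
      "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "dataset": {
          "referenceName": "DS_Adls",
          "type": "DatasetReference",
          "parameters": {
            "accountName": "@pipeline().parameters.storageAccount"
          }
        },
        "firstRowOnly": true
      }
    }

Because the dataset parameter value is pipeline dynamic content, debug now prompts for the pipeline parameter and the value flows through to the linked service.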
You can see my pipeline executed successfully.

Passing a variable from pipeline to a dataflow in azure data factory

I have the following setup in ADF and I would like to pass a variable, set in a Set Variable activity, to my Data flow activity. I tried creating a parameter, but I am unable to do so. Can someone explain how it is done?
As you can see, I am unable to get the $token1 value in my source options in the data flow.
You have done it the correct way; this is how we add pipeline variables to the data flow.
I have tried from my end the same way and it worked fine for me.
Create a parameter in the data flow.
Pass the value of the data flow parameter from the pipeline's Data flow activity settings as below:
In the data flow source options, open the expression builder to add dynamic content and select the data flow parameter you created.
I created a string variable at the pipeline level and passed it to the data flow string parameter. In the Authorization value it is shown as 'abc' to indicate that it is a string.
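In case it helps, the value typed for the data flow parameter in the Data flow activity is a pipeline expression wrapped in single quotes, which is what makes the data flow treat it as a string literal (the names here are just the ones from this example):

    '@{variables('token1')}'

Inside the data flow, the source options expression builder then references it simply as:

    $token1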
Data preview of source:
Note: Make sure you are selecting the correct parameter from the expression builder. I see that you created the parameter as a string, but when the parameter is added in the value it shows as 'Any'. Delete and recreate the parameter/header to pick up any changes.

Azure Copy Activity Rest Results Unexpected

I'm attempting to pull data from the Square Connect v1 API using ADF. I'm using a Copy activity with a REST source. I am successfully pulling back data; however, the results are unexpected.
The endpoint is /v1/{location_id}/payments. I have three parameters, shown below.
I can successfully pull this data via Postman.
The results are stored in a Blob and are as if I did not specify any parameters whatsoever.
Only when I hardcode the parameters into the relative path do I get correct results.
I feel I must be missing a setting somewhere, but which one?
You can try setting the values you want in a Set Variable activity, and then have your Copy activity reference those variables. This will tell you whether or not it is an issue with the dynamic content. I have run into some unexpected behavior myself. The benefit of the intermediate Set Variable activity is twofold: firstly, it coerces the data type; secondly, it lets you see what the value is.
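A rough sketch under assumed names (the variable, the parameter names, and the query-string fields are illustrative, not necessarily the Square API's exact parameters): build the query string in a Set Variable activity, then pass that variable into a dataset parameter from the Copy activity so the REST dataset's relative URL can use it.

Set Variable activity:

    {
      "name": "SetQueryString",
      "type": "SetVariable",
      "typeProperties": {
        "variableName": "paymentsPath",
        "value": {
          "value": "@concat('v1/', pipeline().parameters.locationId, '/payments?begin_time=', pipeline().parameters.beginTime, '&end_time=', pipeline().parameters.endTime)",
          "type": "Expression"
        }
      }
    }

In the Copy activity's dataset reference, set the (assumed) dataset parameter relativeUrl to @variables('paymentsPath'), and in the REST dataset set the Relative URL to @dataset().relativeUrl. Watching the value land in the variable during a debug run tells you quickly whether the dynamic content is the problem.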
My apologies for not using comments. I do not yet have enough points to comment.

ADF - Using parameter from loop in a copy data flow

I've created a loop that loops over URLs to fetch OData. Then, a copy flow is created for every path in the OData service. However, I need to be able to pass the URL into these tables as well.
How can I add the URL (@pipeline().parameters.ProjectUrl) to my sink when I am unable to import schemas because I'm working with parameters? Note that my query is a select, like so:
$select=Field1,Field2,Field3
I'd like to add my parameter here, so it gets added to the tables.
Thanks!
By copy flow, do you mean a mapping data flow that is used just to copy?
If that is the case, go into the flow and add a parameter. Keep the type as string (not all of the data flow parameter types have pipeline equivalents). Go back to the pipeline and look at the Execute Data Flow activity. The parameter will now be visible in the activity. When you click on the 'value' field, you can choose between 'Pipeline expression' and 'Data flow expression'. Choose 'Pipeline expression' and place @pipeline().parameters.ProjectUrl there.
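A sketch of what the resulting Execute Data Flow activity JSON might look like, assuming the data flow parameter is named projectUrl (the single quotes around the pipeline expression are what mark it as a string literal for the data flow):

    {
      "name": "Copy flow",
      "type": "ExecuteDataFlow",
      "typeProperties": {
        "dataflow": {
          "referenceName": "CopyODataFlow",
          "type": "DataFlowReference",
          "parameters": {
            "projectUrl": {
              "value": "'@{pipeline().parameters.ProjectUrl}'",
              "type": "Expression"
            }
          }
        }
      }
    }

Inside the data flow you then reference it as $projectUrl, for example in a derived column that stamps the URL onto every row before the sink, so it ends up in the target tables alongside the selected fields.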