Using Debug on the pipeline with parameters throws a connection error, but with Az PowerShell the pipeline executes fine:
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "name" -DataFactoryName "name" -PipelineName "name" -ParameterFile "json"
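(For reference, the file passed to -ParameterFile is a plain JSON object that maps pipeline parameter names to values; the names below are placeholders:)

{
    "sourceContainer": "input",
    "sinkContainer": "output"
}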
The pipeline uses a Lookup activity that queries Cosmos DB for document identifiers, which are handed over to a ForEach activity containing a single Copy activity that uses each identifier to fetch the whole document and write it to Blob Storage. Is there any way to supply the linked service parameters when debugging?
I reproduced the above scenario with an ADLS Gen2 linked service feeding a Lookup activity.
When I check the connection of the linked service from the dataset, it gives a connection error.
If I debug the pipeline at this point, it gives a linked service propagation error like the one below.
At debug time, the pipeline only prompts for values for pipeline parameters; it offers no option to supply values for dataset parameters or linked service parameters.
If you want to supply a linked service parameter value using pipeline dynamic content, you can route it through a dataset parameter.
If the dataset parameter has no default value, the pipeline will only validate once that value is supplied in the pipeline.
First, create a dataset parameter.
Then bind the dataset parameter to the linked service parameter using dynamic content.
Now you can pass pipeline dynamic content into the linked service parameter: here I am giving the storage account name to the dataset parameter, which in turn becomes the linked service parameter value.
You can see my pipeline executed successfully.
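For reference, this is roughly what the dataset JSON looks like once the dataset parameter is bound to the linked service parameter. It is a minimal sketch: the names ADLSDataset, ADLSLinkedService, StorageName, and accountName are placeholders rather than the names used above.

{
    "name": "ADLSDataset",
    "properties": {
        "type": "DelimitedText",
        "parameters": {
            "StorageName": { "type": "string" }
        },
        "linkedServiceName": {
            "referenceName": "ADLSLinkedService",
            "type": "LinkedServiceReference",
            "parameters": {
                "accountName": {
                    "value": "@dataset().StorageName",
                    "type": "Expression"
                }
            }
        }
    }
}

The activity that uses this dataset can then supply pipeline dynamic content, for example @pipeline().parameters.storage, as the value of StorageName.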
I am using an Azure SQL Database as a data source in a mapping data flow. The parameter KeyVaultConnectionString should be dynamic. See picture 1.
My problem is that I am unable to figure out how to provide the value from the mapping data flow. I have tried different things like @pipeline().parameters.ConnString, but that just doesn't work.
Thanks.
***** UPDATE (28-09-2022) *****
I forgot to explain that the flow is a Flowlet. The picture below shows the Flowlet, which uses a SQL Server source. I chose an inline dataset, but I am not able to pass a variable from the "outer" mapping data flow to the data source.
For an inline dataset used as a source in data flows, the parameter option does not appear as a separate section. I tried the following linked service, where I use a linked service parameter to supply my blob storage name.
Now, when I use this in the data flow source as an inline dataset and try to test the connection, it fails.
However, if you finish configuring the data flow with the required transformations and then use a Data flow activity in an Azure Data Factory pipeline, the activity exposes the parameter so the necessary value can be passed through to the linked service.
When you are in design/debug mode in the data flow designer, parameters will show in the right panel when you click "Debug Settings" at the top of your designer window.
I have a set of JSON files that I want to iterate over; each file contains a field holding a list of links that point to images. The goal is to download each image from its link using a binary format (I tested with several links and it already works).
My problem is building the nested ForEach: I manage to iterate over all the JSON files, but when I add a second ForEach to iterate over the links and run a Copy data activity to download the images using an Execute Pipeline, I get this error:
"ErrorCode=InvalidTemplate, ErrorMessage=cannot reference action 'Copy data1'. Action 'Copy data1' must either be in 'runAfter' path, or be a Trigger"
Example of file:
t1.json
{
    "type": "jean",
    "image": [
        "pngmart.com/files/7/Denim-Jean-PNG-Transparent-Image.png",
        "https://img2.freepng.fr/20171218/882/men-s-jeans-png-image-5a387658387590.0344736015136497522313.jpg",
        "https://img2.freepng.fr/20171201/ed5/blue-jeans-png-image-5a21ed9dc7f436.281334271512172957819.jpg"
    ]
}
t2.json
{
    "type": "socks",
    "image": [
        "https://upload.wikimedia.org/wikipedia/commons/thumb/5/52/Fun_socks.png/667px-Fun_socks.png",
        "https://upload.wikimedia.org/wikipedia/commons/e/ed/Bulk_tube_socks.png",
        "https://cdn.picpng.com/socks/socks-face-30640.png"
    ]
}
Do you have a solution?
Thanks
As per the documentation you cannot nest For Each activities in Azure Data Factory (ADF) or Synapse Pipelines, but you can use the Execute Pipeline activity to create nested pipelines, where the parent has a For Each activity and the child pipeline does too. You can also chain For Each activities one after the other, but not nest them.
Excerpt from the documentation:
Limitation: You can't nest a ForEach loop inside another ForEach loop (or an Until loop).
Workaround: Design a two-level pipeline where the outer pipeline with the outer ForEach loop iterates over an inner pipeline with the nested loop.
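In pipeline JSON terms, the two-level pattern looks roughly like this (a minimal sketch; the pipeline, activity, and parameter names are invented):

{
    "name": "OuterPipeline",
    "properties": {
        "parameters": {
            "fileList": { "type": "array" }
        },
        "activities": [
            {
                "name": "OuterForEach",
                "type": "ForEach",
                "typeProperties": {
                    "items": {
                        "value": "@pipeline().parameters.fileList",
                        "type": "Expression"
                    },
                    "activities": [
                        {
                            "name": "CallInnerPipeline",
                            "type": "ExecutePipeline",
                            "typeProperties": {
                                "pipeline": {
                                    "referenceName": "InnerPipeline",
                                    "type": "PipelineReference"
                                },
                                "waitOnCompletion": true,
                                "parameters": {
                                    "fileName": {
                                        "value": "@item()",
                                        "type": "Expression"
                                    }
                                }
                            }
                        }
                    ]
                }
            }
        ]
    }
}

The inner pipeline declares a matching fileName parameter and runs its own ForEach over the contents of that file.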
Or visually:
It may be that multiple nested pipelines are not what you want, in which case you could hand this looping off to another activity, e.g. a Stored Procedure, a Databricks notebook, or a Synapse notebook (if you're in Azure Synapse Analytics). One option here might be to load the JSON files into a table (or dataframe), extract the filenames once, and then loop through that list rather than through each file. Just an idea.
I reproduced this and was able to copy all the links by looping the Copy data activity inside a ForEach activity and using the Execute Pipeline activity.
Parent pipeline:
If you have multiple JSON files, get the file list using the Get Metadata activity.
Loop over the child items using the ForEach activity and add the Execute Pipeline activity to get the data from each file, passing the current item as a parameter (@item().name).
Child pipeline:
Create a parameter to store the file name from the parent pipeline.
Using the lookup activity, get the data from the current JSON file.
Filename property: @pipeline().parameters.filename
Here I have added https:// to your first image link, as otherwise it does not validate in the copy activity and gives an error.
Pass the output to the ForEach activity and loop through each image value.
@activity('Lookup1').output.value[0].image
Add a Copy data activity inside the ForEach activity to copy each link from source to sink.
I have created a binary dataset with the HttpServer linked service and created a parameter for the Base URL in the linked service.
Passing the linked service parameter value from the dataset.
Pass the dataset parameter value from the copy activity source to use the current item (link) in the linked service.
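To make the parameter chain concrete, here is a minimal sketch of the parameterized HttpServer linked service and the binary dataset that passes a value through to it (the names are placeholders, and the authentication type is an assumption):

{
    "name": "HttpLinkedService",
    "properties": {
        "type": "HttpServer",
        "parameters": {
            "BaseUrl": { "type": "string" }
        },
        "typeProperties": {
            "url": "@{linkedService().BaseUrl}",
            "authenticationType": "Anonymous"
        }
    }
}

{
    "name": "BinaryHttpDataset",
    "properties": {
        "type": "Binary",
        "parameters": {
            "SourceUrl": { "type": "string" }
        },
        "linkedServiceName": {
            "referenceName": "HttpLinkedService",
            "type": "LinkedServiceReference",
            "parameters": {
                "BaseUrl": {
                    "value": "@dataset().SourceUrl",
                    "type": "Expression"
                }
            }
        },
        "typeProperties": {
            "location": { "type": "HttpServerLocation" }
        }
    }
}

In the copy activity source, SourceUrl is then set to @item() so each iteration downloads the current link.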
I have the following setup in ADF, and I would like to pass a variable that is set in a Set variable activity to my Data flow activity. I tried creating a parameter, but I am unable to do so. Can someone explain how it is done?
As you can see, I am unable to get the $token1 value in my source options in the data flow.
You have done it the correct way; this is how we pass pipeline variables to the data flow.
I tried the same approach on my end and it worked fine.
Create a parameter in the data flow.
Pass the value of the data flow parameter from the pipeline's Data flow settings as below:
In the data flow source options, open the expression builder to add dynamic content and select the data flow parameter you created.
I created a string variable at the pipeline level and passed it to the data flow's string parameter. In the Authorization value it is therefore shown as 'abc', indicating it is a string.
Data preview of source:
Note: Make sure you select the correct parameter from the expression builder. I see that you created the parameter as a string, but when it is added in the value it shows as 'Any'. Delete and recreate the parameter/header to pick up any changes.
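To summarise the handoff (a sketch; token1 stands for the variable and parameter shown in the screenshots): the pipeline passes the variable into the data flow parameter, and the data flow references the parameter with the $ prefix.

Data flow activity > Parameters > token1 (pipeline expression builder):
    @variables('token1')

Data flow source options (data flow expression builder):
    $token1

String values passed this way end up wrapped in single quotes as data flow expressions, which is why the value shows as 'abc' above.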
I have created a pipeline (LogPipeline) that logs other pipelines' status to a database. The idea is that every pipeline will call the LogPipeline at the start and at the end by passing pipeline name and pipeline ID along with other parameters like started/ended/failed.
The last parameter is "Reason" where I want to capture the error message of why a pipeline may have failed.
However, in a given pipeline there are multiple activities that could fail. So I want to direct any and all failed activities to my Execute Pipeline activity and pass the error message.
But when filling out the parameters on the Execute Pipeline activity, I can only reference an activity by its name, e.g. Reason = @activity('Caller Activity').error.message.
However, since multiple activities are calling this Execute Pipeline, is there a way to say
Reason = @activity(activityThatCalledExecutePipeline).error.message?
If my understanding is correct, there are multiple activities calling the LogPipeline and you want to get the failed activities' names so that you know them inside LogPipeline. To my knowledge, that requirement is not supported in ADF.
I'm not sure why you need to construct such a complex scenario when you just want to log the specific failed activities and error messages, which is a common requirement. ADF supports many monitoring options; please follow the links below:
1.https://learn.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#alerts
2.https://learn.microsoft.com/en-us/azure/data-factory/monitor-programmatically
I would suggest getting familiar with Alerts & Metrics in the ADF portal:
There you can set the Target Criteria; the available options are shown in the portal.
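As a concrete illustration of the programmatic route linked above, here is a minimal Az PowerShell sketch that pulls the failed activities and their error messages for one pipeline run. The resource names and run ID are placeholders, and it assumes the Az.DataFactory module and an authenticated session (Connect-AzAccount):

# Query the activity runs of a single pipeline run.
$runId = "<pipeline-run-id>"
$activityRuns = Get-AzDataFactoryV2ActivityRun `
    -ResourceGroupName "name" `
    -DataFactoryName "name" `
    -PipelineRunId $runId `
    -RunStartedAfter (Get-Date).AddDays(-1) `
    -RunStartedBefore (Get-Date)

# Keep only the failures and show each activity's name and error details.
$activityRuns |
    Where-Object { $_.Status -eq "Failed" } |
    Select-Object ActivityName, Status, Error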
I am trying to collect some metrics on releases in Azure Devops via a Powershell script.
I have very limited dev experience and am new to PowerShell, and this is the first time I have worked with an API. So far I have been able to authenticate, return a list of releases, loop through them, and export the data to a file. Now I need to filter the releases based on a substring of the release name. For the record, I have been doing my initial testing in Postman to make sure my syntax and results are correct, then migrating working syntax to PowerShell.
https://{{organization}}.vsrm.visualstudio.com/{{project}}/_apis/release/releases?api-version=5.0
If I add the id filter as shown here:
https://{{organization}}.vsrm.visualstudio.com/{{project}}/_apis/release/releases?api-version=5.0&releaseId=34567
I get this result:
"id": 34567,
"name": "Test-Release-MyService",
But if I use the same filter format for the release name,
https://{{organization}}.vsrm.visualstudio.com/{{project}}/_apis/release/releases?api-version=5.0&releaseName="Test-Release-MyService"
I get back 50 results, none of which match that criterion, whether I wrap the string in quotes or not. Furthermore, what I really want is for the response to include only records where the release name contains "XYZ".
So the question: Is there a filter operator for "contains" so I only get back records where the release name contains the "XYZ" substring?
Thanks in advance for your advice.
Every parameter you use with the Azure DevOps REST API needs to match the description in the documentation; the API does not support custom parameters. For your question, the searchText parameter filters the results to releases whose names contain the keyword. I have tested calling the API with Postman and it works fine. In addition, the value of searchText is not case-sensitive.
If you want to do more filtering, you can use PowerShell or another client library to deserialize the JSON response into an object and then convert or filter it. The following documents may be helpful:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/convertfrom-json?view=powershell-6
https://devblogs.microsoft.com/scripting/playing-with-json-and-powershell/
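Putting both together, a minimal PowerShell sketch; the organization/project segments of the URL and the PAT are placeholders. Note that Invoke-RestMethod deserializes the JSON response for you, so ConvertFrom-Json is only needed if you fetch the raw text with Invoke-WebRequest:

# Authenticate with a personal access token (PAT).
$pat = "<personal-access-token>"
$headers = @{
    Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
}

# Server-side filter: searchText matches releases whose name contains the keyword.
$url = "https://myorg.vsrm.visualstudio.com/myproject/_apis/release/releases?api-version=5.0&searchText=XYZ"
$releases = Invoke-RestMethod -Uri $url -Headers $headers -Method Get

# Optional client-side filtering on the deserialized objects.
$releases.value |
    Where-Object { $_.name -like "*XYZ*" } |
    Select-Object id, name, createdOn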