Get Output of Copy Activity present Inside the Switch in Azure Data Factory

I have a Copy activity inside a Switch activity. I need to pass its rowsWritten value to a Notebook activity outside the Switch.
How can I pass the output of the Copy activity to the notebook outside the Switch activity?
I have already achieved this with pipeline variables; I am exploring an alternative solution that does not use pipeline variables.

AFAIK, we cannot reference the outputs of activities that are inside a Switch from outside it.
Because ADF does not know at design time which case inside the Switch will execute, referencing such an output outside it gives the error: The output of activity 'Lookup1' can't be referenced since it is either not an ancestor to the current activity or does not exist.
Here I used two Lookup activities inside two Switch cases, and when I reference the output of one Lookup activity outside the Switch, you can see the error.
The pipeline will also fail validation.
As you said, the workaround is a pipeline variable: add a Set Variable activity inside every case to store the activity output (rowsWritten), and use that variable outside the Switch activity.
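A minimal sketch of that workaround (the activity and variable names here are illustrative): inside each Switch case, add a Set Variable activity that runs after the Copy activity and stores its rowsWritten output in a pipeline variable, for example:

{
    "name": "Set rowsWritten",
    "type": "SetVariable",
    "dependsOn": [
        { "activity": "Copy data1", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "variableName": "rowsWritten",
        "value": {
            "value": "@string(activity('Copy data1').output.rowsWritten)",
            "type": "Expression"
        }
    }
}

Note the @string() conversion, since pipeline variables are typed as String, Boolean, or Array. Outside the Switch, the Notebook activity can then read the value with @variables('rowsWritten'), for example as one of its base parameters.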

Related

Data Fusion: Pass runtime argument from one pipeline to another

I have a runtime argument set at the namespace level: business_date: ${logicalStartTime(yyyy-MM-dd)}. I use this argument in my pipeline and want to use the same value in another pipeline.
There are many pipelines that run back to back, and I want the value to stay the same throughout the pipelines once it is calculated in the first one.
Suppose the value is calculated as '2020-08-20 20:14:11'; once pipeline one succeeds I pass this argument to pipeline 2, but because these arguments are defined at the namespace level, the value gets overridden when pipeline 2 starts.
How can I prevent this value from being calculated again?
As was commented before, you can set up one pipeline to trigger another; you can set a runtime variable in the first pipeline and this variable will be set in the triggered pipelines. You can create an inbound trigger by following these steps:
Once you have created your pipelines, select the last pipeline you want to be run. In my case I have the DataFusionQuickstart2 pipeline.
In the pipeline application, on the left side, click "Inbound triggers" -> "Set pipeline triggers" and you will see the pipelines that can trigger it. Check the event that will trigger the DataFusionQuickstart2 pipeline from DataFusionQuickstart and enable it.
If you take a look at the previous pipeline, DataFusionQuickstart, you will see, under the outbound trigger option (right side), the pipelines that will be triggered by it.
Finally, set your runtime argument.
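If you prefer to pin the value explicitly rather than let it be re-evaluated, one option (a sketch, not verified against your setup; the host, port, namespace, and app name are illustrative) is to start the downstream pipeline through the CDAP REST API and pass business_date as a runtime argument in the request body:

$runtimeArgs = @{ business_date = '2020-08-20 20:14:11' } | ConvertTo-Json

Invoke-RestMethod `
    -Uri 'http://<cdap-host>:11015/v3/namespaces/default/apps/DataFusionQuickstart2/workflows/DataPipelineWorkflow/start' `
    -Method Post -ContentType 'application/json' -Body $runtimeArgs

Runtime arguments passed this way take precedence over the namespace-level preference for that run.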
Additional information
In this post, it was mentioned that there are three ways you can set the runtime argument of a pipeline:
Argument Setter plugin (you can write that value to a file in the first pipeline; in all subsequent pipelines, create a parameter to read that file)
Passing a runtime argument when starting a pipeline (the one described above)
Setting Preferences (provides the ability to save configuration information at various levels of the system, including the CDAP instance, namespace, application, and program levels)
You can write that value to a file in the first pipeline. In all subsequent pipelines, create a parameter to read that file. That way, the objective should be achieved.
@Sudhir, you may explore Preferences: https://cdap.atlassian.net/wiki/spaces/DOCS/pages/477561058/Preferences+HTTP+RESTful+API
You have set the variable at the namespace level, and per your finding it is evaluated each time it is used.
Can you try setting it at the application level?
Then pass it to the next pipeline. I believe in that case it should be evaluated only once in that specific application (pipeline), and thereafter the value would be passed on.
Preferences are also available at the program level.
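As a sketch of the preferences approach (the host, port, namespace, application name, and the lack of authentication are all placeholders; adapt to your environment), the first pipeline could persist the computed value as an application-level preference via the CDAP REST API, and later pipelines could read it back:

# Write the computed business_date as an application-level preference
$prefs = @{ business_date = '2020-08-20 20:14:11' } | ConvertTo-Json

Invoke-RestMethod `
    -Uri 'http://<cdap-host>:11015/v3/namespaces/default/apps/DataFusionQuickstart2/preferences' `
    -Method Put -ContentType 'application/json' -Body $prefs

# Read the stored preferences back
Invoke-RestMethod `
    -Uri 'http://<cdap-host>:11015/v3/namespaces/default/apps/DataFusionQuickstart2/preferences' `
    -Method Get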

Azure Data Factory - 'variables' is not a recognized function

We are using an Azure SQL Database sink in a Copy Activity.
The requirement is for us to execute a stored procedure here via the "Pre-Copy Script" property of the sink. We are using dynamic content, passing in a "ProcessName" value. We have a ProcessName variable, and it is used in a call to @concat() to build the stored procedure string for this sink property.
However, any time we use the variables collection in dynamic content, we get this warning:
'variables' is not a recognized function
Is there a way to avoid having this warning in the UI? It works fine, but it looks terrible, and it appears everywhere we use variables, not just in this case.
Try using pipeline parameters instead of variables and calling them as explained here: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-expression-language-functions
You will most likely have to change your logic to do this, but even though it works fine as it is, I wouldn't mind changing logic or code just to stop seeing the warning in the UI.
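For example, if ProcessName becomes a pipeline parameter, the Pre-Copy Script dynamic content might look like this (the stored procedure name is illustrative):

@concat('EXEC dbo.usp_PreCopyProcess @ProcessName = ''', pipeline().parameters.ProcessName, '''')

Since pipeline() is a recognized function in the expression language, the editor should not flag it the way it flags variables().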
Hope this helped!

How to pass nested Stack outputs to another step in Octopus Deploy

In my Octopus project, the first step launches a bunch of nested stacks implemented with cloudformation.
I need to share the outputs of the master stack launched from Octopus with later steps. How can I do that?
Thanks.
The output variables from the CloudFormation template will be available to later steps the same as any other Octopus output variable; this is mentioned in the first paragraph of the documentation page.
Output variables can be accessed in a number of different ways depending on where you are accessing them. For example, in PowerShell they can be accessed via the parameters dictionary: $OctopusParameters["Octopus.Action[Step Name].Output.VariableName"].
You can also access them using the variable binding syntax: #{Octopus.Action[Step Name].Output.VariableName}
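For example, in a PowerShell script step that runs after the CloudFormation step (the step name "Launch Stacks" and output name "VpcId" are illustrative; Octopus exposes CloudFormation stack outputs under AwsOutputs):

# Read an output declared in the master stack's template from the earlier step
$vpcId = $OctopusParameters['Octopus.Action[Launch Stacks].Output.AwsOutputs[VpcId]']
Write-Host "Master stack VpcId: $vpcId"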
More information about output variables is available in the docs.

Saving an Environment Variable back to Team City from Powershell

We have a need to periodically run a build configuration that, among other things, recreates tokens/logins etc. We want to save these back to TeamCity as environment variables. Builds that we subsequently run will look at this environment variable store and do a string replace within our configurations as required.
I've taken a look at :
##teamcity[setParameter name='env.TEST' value='test']
But from reading the documentation, this is only used to pass variables between build steps within the same build. It doesn't actually save the variable back to TeamCity.
Is there any way (preferably from a PowerShell script) to call TeamCity and tell it to add an environment variable (or any other parameter)?
In order to persist a value back to a parameter you have to call the REST API.
I use a PowerShell script that acts as a wrapper around the Invoke-RestMethod cmdlet in PowerShell 3+ and that can be reused in a build step to achieve what you want.
Step 1.
Save the script to a PowerShell file and add it to your source control rest-api-wrapper.ps1
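A minimal sketch of what such a wrapper might look like (argument handling kept deliberately simple; adapt as needed):

param(
    [Parameter(Mandatory = $true)][string]$Url,
    [Parameter(Mandatory = $true)][string]$Username,
    [Parameter(Mandatory = $true)][string]$Password,
    [string]$Method = 'PUT',
    [string]$Body
)

# Build a Basic authentication header for TeamCity's httpAuth endpoints
$pair    = '{0}:{1}' -f $Username, $Password
$token   = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
$headers = @{ Authorization = "Basic $token" }

# TeamCity expects a text/plain body when writing a parameter value
Invoke-RestMethod -Uri $Url -Method $Method -Headers $headers -ContentType 'text/plain' -Body $Body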
Step 2.
Create a PowerShell build step referencing the script and pass in the following arguments, tailored to your situation (a sample invocation follows the list):
%teamcity.serverUrl%/httpAuth/app/rest/projects/project_id/parameters/parameter_name
"Username"
"Password"
"PUT"
"TheValueToSave"
More details can be found here - TeamCity Documentation
Hope this helps

PowerShell InitializeDefaultDrive

In my custom PowerShell provider I want the user to be able to skip the internal call to InitializeDefaultDrives.
The InitializeDefaultDrives method is called when the provider starts; I guess this is when I use the Add-PSSnapin cmdlet to load my provider. So it looks like I'm searching for a way to add dynamic parameters to the Add-PSSnapin cmdlet.
I know I can just skip the implementation of InitializeDefaultDrives and have the user call New-PSDrive to add the drive manually if desired. That is not what I want. I want to always create a default drive except in the case where the user wants to skip it.
Thanks!
AFAIK dynamic parameters only work when you control the source code, i.e. in your code you can choose to expose a dynamic parameter based on the value of another parameter (like a path); your code then does something different based on the value of the dynamic parameter. However, you should be able to accomplish what you want via a session preference variable:
$SuppressDriveInit = $true
Add-PSSnapin Acme
If the variable is not defined or is set to $false, then go ahead and initialize the drive. It is low tech but should work fine.
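Inside the provider itself (which is .NET code), InitializeDefaultDrives could honor that preference variable. A rough C# sketch, with the drive and snap-in names invented for illustration:

// Requires: using System.Collections.ObjectModel; using System.Management.Automation;
protected override Collection<PSDriveInfo> InitializeDefaultDrives()
{
    // Honor the session preference variable set before Add-PSSnapin
    object suppress = SessionState.PSVariable.GetValue("SuppressDriveInit");
    if (suppress is bool && (bool)suppress)
    {
        // User asked to skip default drive creation
        return new Collection<PSDriveInfo>();
    }

    // Default behavior: create the provider's default drive
    return new Collection<PSDriveInfo>
    {
        new PSDriveInfo("Acme", ProviderInfo, @"\", "Default Acme drive", null)
    };
}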