global parameters in ADF - azure-data-factory

I have defined a global parameter, but it is not passed correctly. When I try to debug the pipeline it gives the below error:
This is how I defined the parameters:
Then I triggered the pipeline to see whether it was picking up all the parameters, and it picked up everything except the global parameter.
What could be the reason, and how can I fix it?

Related

Azure DevOps YAML: Passing a variable from a variable group to a parameter in a template file throws type error

I am attempting to read a variable from a Variable Group in my YAML file and pass it on to a template file that consumes that value. The template file defines this parameter as a number, and the variable group does indeed store it as a number.
Here is my pipeline, which invokes the template stage-run-tests with 4 parameters, 3 of which are read from a variable group.
The template accepts the 4 parameters and has them typed as number.
However, when I try running the pipeline, I get an error stating that the parameter was expecting a number and what was passed in was not a number.
If I remove the strong typing and simply use the parameters as-is, then everything works fine.
Can anyone help me with what's happening here? What should I do to ensure that the parameters stay typed while the pipeline still passes in the value from the variable group?
Change type: number to type: string, as there is no number type in a variable group; variable-group values always arrive as strings.
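A minimal sketch of the fix, with illustrative names (the question's actual pipeline and variable group are not shown):

```yaml
# azure-pipelines.yml (sketch)
variables:
- group: my-variable-group     # contains, e.g., maxRetries

stages:
- template: stage-run-tests.yml
  parameters:
    maxRetries: $(maxRetries)  # variable-group values always arrive as strings
```

```yaml
# stage-run-tests.yml (sketch)
parameters:
- name: maxRetries
  type: string   # string, not number, so the type check passes
```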

Data Fusion: Pass runtime argument from one pipeline to another

I have a runtime argument set at the namespace level: business_date: ${logicalStartTime(yyyy-MM-dd)}. I am using this argument in my pipeline and want to use the same value in other pipelines.
There are many pipelines running back to back, and I want the value to remain the same throughout the pipelines once it is calculated in the first one.
Suppose the value is calculated as '2020-08-20 20:14:11'; once pipeline one succeeds, I pass this argument to pipeline two. But because the argument is defined at the namespace level, it gets overridden when pipeline two starts.
How can I prevent this value from being recalculated?
As was mentioned in the comments, you can set up one pipeline to trigger another; you can set a runtime variable in the first pipeline and this variable will be set in the triggered pipelines. You can create an inbound trigger by following these steps:
Once you have created your pipelines, select the last pipeline you want to run. In my case it is the DataFusionQuickstart2 pipeline.
In the pipeline application, on the left side, click "Inbound triggers" -> "Set pipeline triggers" and you will see the pipelines that can act as triggers. Check the event that will trigger the DataFusionQuickstart2 pipeline from DataFusionQuickstart and enable it.
If you look at the previous pipeline, DataFusionQuickstart, you will see, in the outbound trigger option (right side), the pipelines that will be triggered by it.
Finally, set your runtime argument.
Additional information
In this post, it was mentioned that there are three ways you can set the runtime argument of a pipeline:
Argument Setter plugin (You can write down that value in a file into the first pipeline. In all subsequent pipelines, create a parameter to read that file.)
Passing runtime argument when starting a pipeline (The one described above)
Setting Preferences (It provides the ability to save configuration information at various levels of the system, including the CDAP instance, namespace, application, and program levels.)
You can write that value to a file in the first pipeline. In all subsequent pipelines, create a parameter to read that file. That way, the objective should be achieved.
@Sudhir, you may explore Preferences: https://cdap.atlassian.net/wiki/spaces/DOCS/pages/477561058/Preferences+HTTP+RESTful+API
You have set the variable at the namespace level and, per your finding, it is getting evaluated each time it is used.
Can you try setting it at the application level?
Then pass it to the next pipeline. I believe in that case it should be evaluated only once in that specific application (pipeline), and thereafter the value would be passed along.
Preferences are also available at the program level.
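For reference, the Preferences REST API linked above takes a JSON map; a sketch against a hypothetical CDAP instance (host, port, namespace, and application name are all placeholders) might look like:

```shell
# Sketch: pin business_date once at the application level so it is not
# recalculated from the namespace-level expression on each run.
curl -X PUT \
  "http://localhost:11015/v3/namespaces/default/apps/Pipeline1/preferences" \
  -H "Content-Type: application/json" \
  -d '{"business_date": "2020-08-20 20:14:11"}'
```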

Data Fusion - Argument defined in Argument Setter Plugin Supersedes Runtime Arguments intermittently

Using the Data Fusion Argument Setter, I've defined all parameters for a reusable pipeline in it. While executing it, I provide runtime arguments for some parameters that differ from the default arguments served by the JSON URL embedded in the Argument Setter.
But a number of times, the pipeline ends up taking the default values from the Argument Setter URL instead of the runtime arguments, causing failures.
This behavior is not consistent across the pipelines I create, which suggests that runtime arguments are indeed supposed to supersede any previously defined value for an argument.
The workaround I use is deleting the plugin and re-adding it for every new pipeline, but that defeats the purpose of creating a reusable pipeline.
Has anyone experienced this issue?
This tutorial, https://cloud.google.com/data-fusion/docs/tutorials/reusable-pipeline, shows how to create a reusable pipeline using the Argument Setter. There, the runtime arguments are used to tell the Data Fusion pipeline to use the macro from the Argument Setter URL. The Argument Setter is a type of Action plugin that allows one to create reusable pipelines by dynamically substituting configurations served by an HTTP server. It looks like no matter how you change the runtime arguments, as long as the same macro can be read when the pipeline runs, the arguments will be overridden.

Azure DevOps Release Pipeline Variable Nested/Composed/Expression

I have a release pipeline with a variable, but there doesn't seem to be any way to set the value of that variable to something that's evaluated at release time. For example, another variable.
Here's a real example:
All I want to do is set the value of MyExpressionBasedVariable to the value of MyOtherVariable.
All the docs and examples online seem to suggest it's possible, but I can't get it to work. I always end up with the literal string rather than the evaluated value.
I've tried using these different syntaxes:
$(MyOtherVariable)
$[variables['MyOtherVariable']]
${{variables['MyOtherVariable']}}
I've seen that you can define custom tasks to set variable names as part of the pipeline but this seems massively overkill.
Essentially all I want to do is rename a key vault secret to a different variable name for convention-based XML variable replacement in config files.
E.g. I have a secret called this-is-a-secret-name-which-is-a-different-naming-convention-to-my-connectionstrings but I need it in a variable called MySecret-ConnectionString.
How do I use the value of another variable in a release pipeline variable?
From my test, what you set should work. You can follow the steps below to check whether you still have this issue:
Create a new release pipeline without link any variable group.
Set the variable like the following:
Add a Run Inline Powershell task to output the value of the Variable:
Write-Output 'The value of MyExpressionBasedVariable is $(MyExpressionBasedVariable)'
Write-Output 'The value of $(MyOtherVariable) is $(MyOtherVariable)'
Then we could get the log:
So, what you set should work; if it still does not work for you, make sure that the variable you describe in the question is the variable you actually tested.
Besides, at this moment the values of nested variables (like $(TestVar_$(Release.Reason))) are not yet supported in build/release pipelines; check this thread for details, and make sure there are no such nested variables in your project.
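For the secret-renaming goal in the question specifically, one common alternative (a sketch, not part of the steps above) is an early PowerShell task that re-publishes the key vault secret under the conventional name using the task.setvariable logging command, so later tasks see $(MySecret-ConnectionString):

```powershell
# Sketch: expose the key vault secret under the name the convention-based
# XML variable replacement expects; issecret=true keeps it masked in logs.
$value = "$(this-is-a-secret-name-which-is-a-different-naming-convention-to-my-connectionstrings)"
Write-Host "##vso[task.setvariable variable=MySecret-ConnectionString;issecret=true]$value"
```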
Hope this helps.

How do I get around PowerShell not binding pipeline parameters until after BeginProcessing is called?

I'm writing a Cmdlet that can be called in the middle of a pipeline. With this Cmdlet, there are parameters that have the ValueFromPipelineByPropertyName attribute defined so that the Cmdlet can use parameters with the same names that are defined earlier in the pipeline.
The paradox that I've run into is that in the overridden BeginProcessing() method, I use one of the parameters that can get its value bound from the pipeline. According to the Cmdlet Processing Lifecycle, the binding of pipeline parameters does not occur until after BeginProcessing() is called. Therefore, it seems that I'm unable to rely on pipeline-bound parameters if they're used in BeginProcessing().
I've thought about moving things to the ProcessRecord() method. Unfortunately, there is a one time, relatively expensive operation that needs to occur. The best place for this to happen seems to be in the BeginProcessing() method to help ensure that it only happens once in the pipeline.
A few questions surrounding this:
Is there a good way around this?
These same parameters also have the Mandatory attribute set on them. How can I even get this far without PowerShell complaining about not having these required parameters?
Thanks in advance for your thoughts.
Update
I took out the second part of the question as I realized I just didn't understand pipeline-bound parameters well enough. I mistakenly thought that pipeline-bound parameters came from the previous Cmdlet that executed in the pipeline. They actually come from the object being passed through the pipeline! I referenced a post by Keith Hill to help understand this.
You could set an instance field bool (Init) to false in BeginProcessing. Then check whether the parameter is set in BeginProcessing; if it is, call a method that does the one-time init (InitMe). In ProcessRecord, check the value of Init and, if it is false, call InitMe. InitMe should set Init to true before returning.
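A minimal C# sketch of that Init/InitMe pattern (the cmdlet name, parameter, and output are illustrative):

```csharp
using System.Management.Automation;

[Cmdlet(VerbsCommon.Get, "Widget")]
public class GetWidgetCommand : Cmdlet
{
    [Parameter(Mandatory = true, ValueFromPipelineByPropertyName = true)]
    public string Name { get; set; }

    private bool _init;  // has the one-time setup run yet?

    protected override void BeginProcessing()
    {
        // Pipeline binding has not happened yet, so Name is only set here
        // if it was supplied directly on the command line.
        if (Name != null) { InitMe(); }
    }

    protected override void ProcessRecord()
    {
        // By now Name has been bound from the current pipeline object.
        if (!_init) { InitMe(); }
        WriteObject($"Processing {Name}");
    }

    private void InitMe()
    {
        // ...relatively expensive one-time setup goes here...
        _init = true;
    }
}
```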
Regarding your second question: if you've marked the parameter as mandatory, then it must be supplied either as a parameter or via the pipeline. Are you using multiple parameter sets? If so, even if a parameter is marked as mandatory, it is only mandatory when its associated parameter set is the one PowerShell determines to be in use for a particular invocation of the cmdlet.