Passing Generated Value to Downstream Job - powershell

I'm struggling to figure out a way to populate a parameter for a downstream, freestyle project based on a value generated during my pipeline run.
A simplified example would probably best serve to illustrate the issue:
//other stuff...
stage('Environment Creation') {
  steps {
    dir(path: "${MY_DIR}") {
      powershell '''
        $generatedProps = New-Instance #instancePropsSplat
        $generatedProps | Export-Clixml -Depth 99 -Path .\\props.xml
      '''
      stash includes: 'props.xml', name: 'props'
    }
  }
}
//...later
stage('Cleanup') {
  steps {
    unstash 'props'
    // either pass props.xml
    build job: 'EnvironmentCleanup', wait: false, parameters: [file: ???]
    // or I could read the xml and then pass it as a string
    powershell '''
      $props = Import-Clixml -Path props.xml
      # how to get this out of this powershell script???
    '''
  }
}
I create some resources in one stage, and then in a subsequent stage I want to kick off a job using those resources as a parameter. I can modify the downstream job however I want, but I am struggling to figure out how to do this.
Things I've tried:
File parameters (just unstashing and passing the file through): apparently these do not work with pipelines.
Potential paths:
EnvInject: may not be safe to use, and apparently doesn't work with pipelines either?
Defining a "global" variable, but I'm not sure how the PowerShell step changes that.
So, what's the best way of accomplishing this? I have a value that is generated in one stage of a pipeline, and I then need to pass that value (or a file containing that value) as a parameter to a downstream job.

So here's the approach I've found that works for me.
In my stage that depends on the file created in a previous stage, I am doing the following:
stage ("Environment Cleanup') {
unstash props
archiveArtifacts "props.xml"
build job: 'EnvironmentCleanup', parameters: [string(name: "PROJECT", value: "${JOB_NAME}")], wait: false
}
Then, in my dependent (freestyle) job, I copy the props.xml file from the triggering build, using the job name passed in as the PROJECT parameter, and then run PowerShell to deserialize the XML back into an object and read the properties I need.
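That PowerShell step in the freestyle job ends up being very short; a minimal sketch, assuming props.xml has already been copied into the job's workspace (InstanceId is just a hypothetical example of a property generated upstream):
$props = Import-Clixml -Path .\props.xml
# InstanceId is a hypothetical property produced by the upstream New-Instance call
$instanceId = $props.InstanceId
Write-Host "Cleaning up instance $instanceId"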
The last, and most confusing, piece I was missing: in the options of my triggering pipeline project I needed to grant copy permission to the downstream job:
options {
  copyArtifactPermission('EnvironmentCleanup') // or just '*' if you want
}
This now works like a charm and I will be able to use it across my pipelines that follow this same workflow.

Related

How can I pass a PowerShell variable to Terraform

I need to pass a PowerShell/DevOps variable to Terraform; is there a way of doing this? As in the example below, I want the PR number to be used as a variable in Terraform.
testvalue-$(System.PullRequest.PullRequestNumber)
As far as I know, there is no way to define a Terraform variable from the output of a command (shell, etc.), but you can take a look at the external data source:
the idea is that you define a bash script (or any program) and use its output as parameters for other resources.
Example
data "external" "PullRequest" {
program = [
"${path.module}/scriptWhichReturnsPullRequestName.sh",
]
result {
}
}
...
resource ... {
value = data.external.PullRequest.property
}
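If the build agent has PowerShell available, the referenced program can just as well be a PowerShell script; a minimal sketch (the script name, the pr_number key, and the use of the predefined SYSTEM_PULLREQUEST_PULLREQUESTNUMBER environment variable are assumptions), keeping in mind that the external data source expects a single flat JSON object of strings on stdout:
# getPullRequest.ps1 (hypothetical), referenced as program = ["pwsh", "${path.module}/getPullRequest.ps1"]
$prNumber = $env:SYSTEM_PULLREQUEST_PULLREQUESTNUMBER
@{ pr_number = "testvalue-$prNumber" } | ConvertTo-Json -Compress
The value would then be available to other resources as data.external.PullRequest.result.pr_number.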
I put my variables in a variables.tf and trigger the Terraform execution from a PowerShell script. Prior to that execution I just replace certain strings in variables.tf.
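A lighter variation on the same idea, sketched under the assumption that the value only needs to reach a variable already declared in variables.tf (pr_suffix is a hypothetical name), is to pass it on the Terraform command line instead of rewriting the file:
# pr_suffix is a hypothetical variable declared in variables.tf
$pr = $env:SYSTEM_PULLREQUEST_PULLREQUESTNUMBER
terraform apply -auto-approve -var "pr_suffix=testvalue-$pr"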

Terraform Azure Devops Provider

We are trying to automate the Azure DevOps functions using Terraform. We are able to create Projects and Repos using Terraform. But we need to create multiple projects and repos specific to each project.
I have my terraform.tfvars file as given below
Proj1_Repos = ["Repo1","Repo2","Repo3"]
Proj2_Repos = ["Repo4","Repo5","Repo7"]
Project_Name = ["Proj1","Proj2"]
How can I write my Terraform configuration file to create Proj1_Repos in Proj1 and Proj2_Repos in Proj2?
I think you will have an easier time restructuring the variables to look something like:
"Projects" = {
"Proj1" = {
"repos" = ["Repo1","Repo2","Repo3"]
},
"Proj2" = {
"repos" = ["Repo4","Repo5","Repo6"]
}
}
This way you can more cleanly iterate over your declarations using the for_each meta-argument on your DevOps repo resources.
Alternatively, if restructuring the input variables isn't an option, you can use a locals block to construct an association map from your existing variables.
If you are looking for a way to use one variable's value to reference another variable, you will not be able to do so without constructing a custom data object from the keys and values of your variables; that route can get pretty wonky and is not recommended.

How to make the output in the pipeline appear on the console when running Pester tests?

By default, the output in the pipeline is hidden, but sometimes I really want to see that output.
Of course, I know I could add extra commands, such as Write-Host or Out-Default.
But does Pester itself have a mechanism to make the output display properly?
I checked the help documentation and didn't find anything relevant, so I came here for help.
It is possible to write a custom Should wrapper (a proxy command) that overrides Pester's Should and writes the pipeline objects to the console before forwarding them to the original command.
To apply it to your case, edit the process {} block of the wrapper like this:
process {
  try {
    # Here $_ is the current pipeline object
    Write-Host "Current test case input:`n$( $_ | Out-String )"
    # forward it to "process" block of original "Should"
    $steppablePipeline.Process( $_ )
  } catch {
    throw
  }
}
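If you need to build such a wrapper from scratch, PowerShell can generate the proxy skeleton for you; a minimal sketch, assuming Pester is already imported so that Should resolves (the ShouldProxy.ps1 file name is arbitrary):
# generate a proxy function body for Pester's Should
$metadata  = [System.Management.Automation.CommandMetadata]::new((Get-Command Should))
$proxyBody = [System.Management.Automation.ProxyCommand]::Create($metadata)
# save it, then edit its process {} block as shown above
$proxyBody | Set-Content .\ShouldProxy.ps1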

Default or prevent ADF Pipeline Activity Parameters

How do you specify that an activity should not be parameterised in an exported ARM template, or ensure the parameter default value is whatever is already specified?
I have an ADF pipeline which contains a WebActivity. This WebActivity URL is set by an expression which concatenates some text with some pipeline parameters:
#concat(pipeline().parameters.URL,'path/',pipeline().parameters.ANOTHER_PARAMETER,'/morePath/', pipeline().parameters.YET_ANOTHER_PARAMETER,'/lastBitOfPath')
When I export the ADF template through the UI, some parameters are added which look like PIPELINE_NAME_properties_0_typeProperties; they are of type String but are blank. These appear to correspond to the WebActivity URL fields in various activities.
If I then import that ARM template and parameter file into a new Data Factory, the WebActivity URL is blank. This means I need to override the parameter as normal, fine, but why....? I don't need a new parameter to specify a value that is already set by parameters... how do I ensure that this activity is imported with the same expression? It seems mad that to use a WebActivity means you have to parameterise the expression. I want the ARM Template > Export ARM Template to export what I've got, not add redundant parameters that I do not need.
I have also tried editing the pipeline JSON to add a default and defaultValue attribute for the URL activity, but they are removed and have no effect.
It seems the reason for this is that the parameterization template has been modified to include:
"Microsoft.DataFactory/factories/pipelines": {
"properties": {
...
"activities": [{
"typeProperties": {
"url": "-::string"
}
}
]
}
},
This removes the default for the url property of all activities.
https://learn.microsoft.com/en-gb/azure/data-factory/continuous-integration-deployment#use-custom-parameters-with-the-resource-manager-template
This applies to all activities, so it seems the only alternative is to specify
"url": "=::string"
which will parameterise the URL (so any existing parameterization will continue to function) but keep the original value as the default. Care must then be taken to override any other activity url properties that I do not wish to move.

Azure devops custom extension output variables

I am trying to create an extension using the Node API which publishes a path variable on completion.
I set outputVariables in task.json and tried both
tl.setVariable('outVar1', 'outVal1')
tl.setTaskVariable('outVar1', 'outVal1')
task.json (output variables section only):
"OutputVariables": [
{
"name": "outVar1",
"description": "This publish a output variable."
}
],
I tried printing it in the subsequent steps of the same job using all the recommended constructs:
$(taskName.outVar1)
$taskName.outVar1
$outVar1
$(outVar1)
But the variable is not visible. I also printed all the environment variables and the variable is not present there.
Is someone able to create an extension which outputs a variable successfully?
You don't need to declare an output variable for this purpose.
Just set the variable:
tl.setVariable("varName", "varValue", false);
The false indicates that it is not a secret variable.
In the next steps you can use the variable with $(varName).
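For a quick sanity check from a later step in the same job, a sketch assuming the variable was set (non-secret) by an earlier task as above:
# inline PowerShell in a subsequent step of the same job
# non-secret variables are also exposed to later steps as environment variables,
# with the name upper-cased and dots replaced by underscores
Write-Host "varName is: $env:VARNAME"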