I need to pass a PowerShell/Azure DevOps variable to Terraform, is there a way of doing this? As in the example below, I want the PR number to be used as a variable in Terraform.
testvalue-$(System.PullRequest.PullRequestNumber)
As far as I know, there is no way to define a variable from the output of a command (shell, etc.), but you can take a look at the external data source:
the idea is that you define a bash script or any program and use its output as parameters for other resources.
Example
data "external" "PullRequest" {
program = [
"${path.module}/scriptWhichReturnsPullRequestName.sh",
]
result {
}
}
...
resource ... {
value = data.external.PullRequest.property
}
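For reference, the script has to print a single JSON object on stdout. A rough sketch of scriptWhichReturnsPullRequestName.sh, assuming the agent exposes the predefined variable as the SYSTEM_PULLREQUEST_PULLREQUESTNUMBER environment variable:
#!/usr/bin/env bash
# Print one JSON object; its keys become available under the data source's result map.
echo "{\"property\": \"testvalue-${SYSTEM_PULLREQUEST_PULLREQUESTNUMBER}\"}"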
I put my variables in a variables.tf and trigger the Terraform execution from a PowerShell script. Prior to that execution I just replace certain strings in variables.tf.
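A rough sketch of that replacement step; the __PR_NUMBER__ placeholder token is illustrative, and the PR number is read from the environment variable Azure DevOps derives from System.PullRequest.PullRequestNumber:
# Substitute a placeholder in variables.tf with the PR number, then run Terraform.
(Get-Content variables.tf) -replace '__PR_NUMBER__', $env:SYSTEM_PULLREQUEST_PULLREQUESTNUMBER |
    Set-Content variables.tf
terraform apply -auto-approve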
We are trying to automate the Azure DevOps functions using Terraform. We are able to create Projects and Repos using Terraform. But we need to create multiple projects and repos specific to each project.
I have my terraform.tfvars file as given below
Proj1_Repos = ["Repo1","Repo2","Repo3"]
Proj2_Repos = ["Repo4","Repo5","Repo7"]
Project_Name = ["Proj1","Proj2"]
How can I write my Terraform configuration file to create Proj1_Repos in Proj1 and Proj2_Repos in Proj2?
I think you will have an easier time restructuring the variables to look something like:
"Projects" = {
"Proj1" = {
"repos" = ["Repo1","Repo2","Repo3"]
},
"Proj2" = {
"repos" = ["Repo4","Repo5","Repo6"]
}
}
This way you can more cleanly iterate over your declarations using for_each on your DevOps repo resources.
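A sketch of that iteration, assuming the azuredevops provider's azuredevops_project and azuredevops_git_repository resources (the flattening into local.project_repos is just one way to get one for_each entry per repo):
variable "Projects" {
  type = map(object({ repos = list(string) }))
}

locals {
  # one entry per project/repo pair so the repos can also be created with for_each
  project_repos = merge([
    for proj, cfg in var.Projects : {
      for repo in cfg.repos : "${proj}/${repo}" => { project = proj, repo = repo }
    }
  ]...)
}

resource "azuredevops_project" "this" {
  for_each = var.Projects
  name     = each.key
}

resource "azuredevops_git_repository" "this" {
  for_each   = local.project_repos
  project_id = azuredevops_project.this[each.value.project].id
  name       = each.value.repo

  initialization {
    init_type = "Clean"
  }
}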
Alternatively, if restructuring the input variables isn't an option, you can use a locals block to construct an association map from your existing variables.
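Something like this (a sketch, reusing the variable names from the question):
locals {
  # associate each project name with its repo list
  project_repos = {
    Proj1 = var.Proj1_Repos
    Proj2 = var.Proj2_Repos
  }
}
From there, the same flattening and for_each pattern as in the sketch above applies.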
If you are looking for a way to use one variable's value to reference another variable, you will not be able to do so without constructing a custom data object from the keys and values of your variables. That route can get pretty wonky and is not recommended.
By default, the output in the pipeline is hidden, but sometimes I really want to see the output at that point.
Of course, I know I could add additional commands such as Write-Host or Out-Default.
But does Pester itself have a mechanism to make the output display properly?
I checked the help documentation and didn't find anything relevant, so I came here for help.
It is possible to write a custom Should wrapper (proxy) using this technique. The wrapper can write the pipeline objects to the console.
Here is a complete example of such a Should wrapper and how to override Pester's Should.
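If you want to generate the wrapper skeleton yourself, PowerShell can emit proxy-command source for an existing command; a minimal sketch, assuming Pester is already imported:
# Emit steppable-pipeline proxy source for Pester's Should command
$command  = Get-Command Should -Module Pester
$metadata = [System.Management.Automation.CommandMetadata]::new($command)
[System.Management.Automation.ProxyCommand]::Create($metadata)
# Paste the emitted source into your own 'function Should { ... }' definition so it
# shadows Pester's Should, then edit its process block.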
To apply it to your case, edit the process{} block of the wrapper like this:
process {
    try {
        # Here $_ is the current pipeline object
        Write-Host "Current test case input:`n$( $_ | Out-String )"
        # forward it to "process" block of original "Should"
        $steppablePipeline.Process( $_ )
    } catch {
        throw
    }
}
I am trying to create an extension using the Node API which publishes a path variable on completion.
I did set outputVariables in task.json and tried to use both
tl.setVariable('outVar1', 'outVal1')
tl.setTaskVariable('outVar1', 'outVal1')
task.json (only outvariable section):
"OutputVariables": [
{
"name": "outVar1",
"description": "This publish a output variable."
}
],
I tried printing it in the subsequent steps in the same job using all the recommended constructs:
$(taskName.outVar1)
$taskName.outVar1
$outVar1
$(outVar1)
But the variable is not visible. I also printed all the environment variables and the variable is not present there.
Has anyone been able to create an extension which outputs a variable successfully?
You don't need to declare an output variable for this purpose.
Just set the variable:
tl.setVariable("varName", "varValue", false);
The false indicates that it is not a secret variable.
In the next steps you can use the variable with $(varName).
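For completeness, a minimal sketch of the task code that sets it, assuming the azure-pipelines-task-lib package (which provides the tl object used above):
import tl = require('azure-pipelines-task-lib/task');

// Set a non-secret pipeline variable; later steps in the same job can read it as $(varName).
tl.setVariable('varName', 'varValue', false);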
I'm struggling to figure out a way to populate a parameter for a downstream, freestyle project based on a value generated during my pipeline run.
A simplified example would probably best serve to illustrate the issue:
//other stuff...
stage('Environment Creation') {
    steps {
        dir(path: "${MY_DIR}") {
            powershell '''
                $generatedProps = New-Instance @instancePropsSplat
                $generatedProps | Export-Clixml -Depth 99 -Path .\\props.xml
            '''
            stash includes: 'props.xml', name: 'props'
        }
    }
}
//...later
stage('Cleanup') {
    steps {
        unstash 'props'
        // either pass props.xml
        build job: 'EnvironmentCleanup', wait: false, parameters: [file: ???]
        // or I could read the XML and then pass it as a string
        powershell '''
            $props = Import-Clixml -Path props.xml
            # how to get this out of this powershell script???
        '''
    }
}
I create some resources in one stage, and then in a subsequent stage I want to kick off a job using those resources as a parameter. I can modify the downstream job however I want, but I am struggling to figure out how to do this.
Things I've tried:
File Parameter (just unstashing and passing it through): these apparently do not work with pipelines.
Potential paths:
EnvInject: may not be safe to use, and apparently doesn't work with pipelines either.
Defining a "global" variable as suggested elsewhere, but I'm not sure how the PowerShell step changes that.
So, what's the best way of accomplishing this? I have a value that is generated in one stage of a pipeline, and I then need to pass that value (or a file containing that value) as a parameter to a downstream job.
So here's the approach I've found that works for me.
In my stage that depends on the file created in a previous stage, I am doing the following:
stage ("Environment Cleanup') {
unstash props
archiveArtifacts "props.xml"
build job: 'EnvironmentCleanup', parameters: [string(name: "PROJECT", value: "${JOB_NAME}")], wait: false
}
Then in my dependent job (freestyle), I copy the props.xml file from the triggering build, using the job name passed in as a parameter, and then execute my PowerShell to deserialize the XML into an object and read the properties I need.
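A sketch of that downstream PowerShell build step (the property name is just an example):
# Copy Artifact has already placed props.xml from the triggering build into the workspace
$props = Import-Clixml -Path .\props.xml
# read whichever properties the upstream stage serialized, e.g.
$props.SomeProperty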
The last, and most confusing, part I was missing was that in the options for my triggering pipeline project I needed to grant copy permission to the downstream job:
options {
    copyArtifactPermission('EnvironmentCleanup') // or just '*' if you want
}
This now works like a charm and I will be able to use it across my pipelines that follow this same workflow.
I've created a Windows service using install4j and everything works, but now I need to pass command line arguments to the service. I know I can configure them at service creation time in the new service wizard, but I was hoping to either pass the arguments to the register service command, i.e.:
myservice.exe --install --arg arg1=val1 --arg arg2=val2 "My Service Name1"
or by putting them in the .vmoptions file like:
-Xmx256m
arg1=val1
arg2=val2
It seems like the only way to do this is to modify my code to pick up the service name via exe4j.launchName and then load some other file or environment variables that have the necessary configuration for that particular service. I've used other service creation tools for Java in the past and they all had straightforward support for command line arguments registered by the user.
I know you asked this back in January, but did you ever figure this out?
I don't know where you're sourcing val1, val2 etc from. Are they entered by the user into fields in a form in the installation process? Assuming they are, then this is a similar problem to one I faced a while back.
My approach for this was to have a Configurable Form with the necessary fields (as Text Field objects), and obviously have variables assigned to the values of the text fields (under the 'User Input/Variable Name' category of the text field).
Later in the installation process I had a Display Progress screen with a Run Script action attached to it with some java to achieve what I wanted to do.
There are two 'gotchas' when optionally setting variables in install4j this way. Firstly, the variable HAS to be set no matter what, even if it's just to the empty string. So, if the user leaves a field blank (i.e. they don't want to pass that argument to the service), you'll still need to provide an empty string to the Run executable or Launch Service task (more on that in a moment). Secondly, arguments can't have spaces: every space-separated argument has to be on its own line.
With that in mind, here's a Run script code snippet that might achieve what you want:
final String[] argumentNames = {"arg1", "arg2", "arg3"};
// For each argument this method creates two variables. For example for arg1 it creates
// arg1ArgumentIdentifierOptional and arg1ArgumentAssignmentOptional.
// If the value of the variable set from the previous form (in this case, arg1) is not empty, then it will
// set 'arg1ArgumentIdentifierOptional' to '--arg', and 'arg1ArgumentAssignmentOptional' to the string arg1=val1 (where val1
// was the value the user entered in the form for the variable).
// Otherwise, both arg1ArgumentIdentifierOptional and arg1ArgumentAssignmentOptional will be set to empty.
//
// This allows the installer to pass both parameters in a later Run executable task without worrying about if they're
// set or not.
for (String argumentName : argumentNames) {
    String argumentValue = context.getVariable(argumentName) == null ? null : context.getVariable(argumentName) + "";
    boolean valueNonEmpty = (argumentValue != null && argumentValue.length() > 0);
    context.setVariable(
        argumentName + "ArgumentIdentifierOptional",
        valueNonEmpty ? "--arg" : ""
    );
    context.setVariable(
        argumentName + "ArgumentAssignmentOptional",
        valueNonEmpty ? argumentName + "=" + argumentValue : ""
    );
}
return true;
The final step is to launch the service or executable. I'm not too sure how services work, but with the executable, you create the task then edit the 'Arguments' field, giving it a line-separated list of values.
So in your case, it might look like this:
--install
${installer:arg1ArgumentIdentifierOptional}
${installer:arg1ArgumentAssignmentOptional}
${installer:arg2ArgumentIdentifierOptional}
${installer:arg2ArgumentAssignmentOptional}
${installer:arg3ArgumentIdentifierOptional}
${installer:arg3ArgumentAssignmentOptional}
"My Service Name1"
And that's it. If anyone else knows how to do this better feel free to improve on this method (this is for install4j 4.2.8, btw).