I'm trying to set up a variable in the Azure Build Pipelines Classic Editor to use conditional functions to determine its value. YAML is not an option (unless there's some way to inject YAML as part of a Classic build...?).
In my current scenario, the idea is that the variable would return one of a few possible string values (or empty string) depending on the branch that triggered the build.
I want something along the lines of this:
I fear this may be a YAML-only thing, but hopefully someone can tell me I'm wrong about that.
As Matt mentioned in the comments, the best approach for this is to use a script task (PowerShell or Bash) that contains the logic and sets the variable.
For more details about how to set a variable, have a look at this documentation.
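For illustration, here is a minimal PowerShell sketch of such a script task (the variable name MyConditionalVar and the branch names are just placeholders); it uses the task.setvariable logging command so that later tasks in the Classic pipeline can reference the value as $(MyConditionalVar):

# Decide the value based on the branch that triggered the build (placeholder branch names).
$branch = $env:BUILD_SOURCEBRANCH   # e.g. refs/heads/main

$value = switch -Wildcard ($branch) {
    'refs/heads/main'      { 'prod' }
    'refs/heads/release/*' { 'release' }
    'refs/heads/feature/*' { 'dev' }
    default                { '' }
}

# Publish the value as a pipeline variable for subsequent tasks.
Write-Host "##vso[task.setvariable variable=MyConditionalVar]$value"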
I am creating a build pipeline (YAML) with some PowerShell tasks, none of which are inline scripts. I need a PSCustomObject that can be passed around as a global variable, i.e. accessible across the pipeline. For example, it needs to be used like this: $myPSObject.value1 in the first task, $myPSObject.value2 in the second task, and so on. Is that possible?
Any leads would be much appreciated.
No, this is definitely not possible. What you can do, however, is convert the object to JSON (compressed probably makes more sense), and then in your steps read that string and convert it back to an object to work with as you normally would. You'd need to use an Azure DevOps pipeline variable for that, not a PowerShell variable.
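As a rough sketch (the object's properties and the variable name myPSObjectJson are placeholders), the first task could serialize and publish the object, and a later task could read it back from the corresponding environment variable:

# Task 1: serialize the object (compressed, to keep it on one line) and publish it as a pipeline variable.
$myPSObject = [PSCustomObject]@{ value1 = 'foo'; value2 = 'bar' }
$json = $myPSObject | ConvertTo-Json -Compress
Write-Host "##vso[task.setvariable variable=myPSObjectJson]$json"

# Task 2: read the pipeline variable (exposed as an environment variable) and rehydrate the object.
$myPSObject = $env:MYPSOBJECTJSON | ConvertFrom-Json
Write-Host $myPSObject.value2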
In my tests I'm using an environment variable. For security reasons, when setting it in the Azure DevOps pipeline I need to mark it as a secret. Is it possible to pass it as an argument within the VSTest task?
Apparently lacking enough reputation to comment on the answer by @daniel-mann, I need to follow up through an answer myself.
Regarding the use of a runsettings file;
Yes, I'm sure you can do it like this, but there's a much simpler way. In the task definition, you have the option to override test run parameters.
For a non-YAML pipeline, you do this in the "Override test run parameters" option (the tooltip says "Override parameters defined in the TestRunParameters section of runsettings file or Properties section of testsettings file. For example: -key1 value1 -key2 value2."), so like this then:
-SomeSecret $(SomeSecret)
For a YAML-pipeline, you can do this:
overrideTestrunParameters: '-SomeSecret $(SomeSecret)'
The nice thing is that the test run parameter that you'd like to override DOES NOT need to exist in your runsettings file. In fact, there's no need for a runsettings file at all!
The bad thing about this approach in general is that you'll have to access your secrets through the "TestContext.Properties" collection (for NUnit), which really sucks if you want a transparent solution that works equally well both for local development (using "user secrets") and in an Azure pipeline.
Another potentially bad thing about this is that these "overridden" test run parameters are just that - parameters for THAT test run only. If you happen to have a mixture of .NET Framework and .NET Core tests, you would want to execute those in two different test runs (for it to work at all...), and then you'd need to duplicate those overrides (given that some of the same secrets are needed, that is).
Regarding adding an additional task in your pipeline to set the appropriate environment variables;
Absolutely. Using the "Batch script" task is well suited for this, where you just pass your secret-based variables as parameters to the script and pick them up inside the script file.
For this to work as expected, though, you will need to allow the task to modify the environment.
For a non-YAML pipeline, this is done by ticking off the "Modify Environment" checkbox.
For a YAML pipeline, you could do it like this:
- task: BatchScript@1
  displayName: 'Export key vault vars as env. vars (for tests)'
  inputs:
    filename: 'ExportKeyVaultEnvironmentVariables.cmd'
    arguments: '$(SomeSecret)'
    modifyEnvironment: true
Where in the "ExportKeyVaultEnvironmentVariables.cmd" script, you just do this:
set SomeSecret=%1
Note: If your secret happens to contain some funny characters, especially a trailing "=" character, you might find that what you get when collecting the parameter inside the script is NOT what you sent in.
You can avoid this problem by enclosing the parameters in double quotes, like this:
arguments: '"$(SomeSecret)"'
And then collect the parameter by removing those surrounding quotes by using the "~" parameter modifier, like this:
set SomeSecret=%~1
A nice bonus effect of this approach is that your shiny new full-blown environment variables persist for the remainder of your pipeline. Referencing back to my "bad thing" about having to potentially duplicate the test run parameter overrides, that would not be needed here.
Regarding the additional option mentioned by the OP;
Absolutely, in which case you wouldn't need a "Batch script" task (that needs to call a script file), but just a "Command line" task.
BUT, be aware of a possible gotcha! Yes, this will create an environment variable for your test code to pick up, if you access it through "Environment.GetEnvironmentVariable".
In my case, I was building an "IConfigurationRoot" instance, like this:
/// <summary>
/// Gets a configuration instance, based on user secrets (for local test execution) and environment variables (for Azure pipeline execution).
/// </summary>
/// <typeparam name="T">The type of the class that represents the "runtime" (and thus is able to get hold of any configuration).</typeparam>
/// <returns>An <see cref="IConfigurationRoot"/> instance.</returns>
public static IConfigurationRoot GetConfigurationRoot<T>()
    where T : class =>
    new ConfigurationBuilder()
        //// Note: The "AddUserSecrets" method requires the "Microsoft.Extensions.Configuration.UserSecrets" package
        .AddUserSecrets<T>()
        //// Note: The "AddEnvironmentVariables" method requires the "Microsoft.Extensions.Configuration.EnvironmentVariables" package
        .AddEnvironmentVariables()
        .Build();
This works by adding various "configuration providers", which eventually allows you to access them all seamlessly through "configuration["SomeSecret"]".
What I found, though, if I'm not seriously mistaken, is that "SomeSecret" was still not available in the "EnvironmentVariablesConfigurationProvider" that's added, even though I could access it perfectly fine directly with the above-mentioned method. Go figure (but I might be mistaken...).
Possible alternative approach (but for YAML pipelines only?);
It seems that for a YAML pipeline, you can explicitly set environment variables for a task, like this:
- task: VSTest@2
  env:
    SomeSecret: $(SomeSecret)
  [...]
I haven't tested this myself, but I've seen a colleague do it (I currently don't have a YAML pipeline myself). At least the variable can be picked up with "Environment.GetEnvironmentVariable", but I don't know if it can be picked up through an "IConfigurationRoot" instance.
I haven't seen any option to achieve the same in a non-YAML pipeline.
But again, this also suffers from the same possible "having to duplicate the environment variables across several test runs" problem.
See also my solution to my own question over at the Azure DevOps guys.
I've been through this. There's no way to pass variables to VSTest on the command line, which means you have to jump through a few hoops.
You have a few options:
Use a runsettings file with a TestRunParameters section, then access it via TestContext.Properties["variableName"] within the tests themselves. You can use standard token replacement patterns to transform the XML file.
Use an app.config or appsettings.json (depending on your platform). This works pretty much the same as above, except, of course, you use the standard configuration classes to retrieve the values.
Add a step to your pipeline that sets the appropriate environment variables (see the sketch after this list). Secrets don't get automatically mapped to environment variables for security purposes, but there's nothing stopping you from doing it yourself.
Move the secret values into a keyvault or some other sort of external secret storage and configure the test to pull the secrets at runtime.
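For the third option, a minimal sketch of such a step (assuming a secret pipeline variable named MySecret, which you must pass to the script explicitly, e.g. via the task's arguments, since secrets are not mapped to environment variables automatically):

# MapSecret.ps1 (hypothetical script name) - re-publish the secret as a normal pipeline variable,
# which subsequent tasks (including the test run) will then see as an environment variable.
param([string]$Secret)

Write-Host "##vso[task.setvariable variable=SomeSecret]$Secret"

Note that the re-published variable is no longer treated as a secret, which is exactly what makes it visible as an environment variable in later tasks.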
I'm not sure if this is possible; I've searched and couldn't find anything related, but I also didn't find anything saying it's not!
In my Azure DevOps build pipeline I have a variable which holds a JSON value, e.g.:
Is there any way of using a value from a JSON variable within a task? I've tried (on the off chance) $(myJson.message) and $(myJson).message, but these didn't work.
So do pipeline variables only support simple types? Or is there a way round this?
You could always use something like PowerShell's ConvertFrom-Json to create an object from the JSON, but there's no way to tell a random task "treat this variable value as an object, rather than a raw string". You'll have to parse it yourself, for example as sketched below.
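A rough sketch inside a PowerShell task (assuming the variable is named myJson, holds single-line JSON with a message property, and doesn't itself contain single quotes):

# Parse the pipeline variable's raw string value into an object.
$myJson = '$(myJson)' | ConvertFrom-Json
Write-Host $myJson.message

# Optionally re-publish a single property so later tasks can use it as $(myJsonMessage).
Write-Host "##vso[task.setvariable variable=myJsonMessage]$($myJson.message)"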
Is it possible to get the values of the custom variables being used in the build? I know they can be dumped to the console output as this example describes, but I still want to find an easier way to achieve it.
http://www.codewrecks.com/blog/index.php/2017/08/04/dump-all-environment-variables-during-a-tfs-vsts-build/
There isn't an easier way than the one you provided to retrieve build variables. The value of a variable can be changed during the build (via logging commands), so it's better to retrieve the variables at the end of the build (the way you provided).
Note that secret variables can't be output as plain text.
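If you just want a quick dump, a PowerShell step at the end of the build along these lines works, since pipeline variables are exposed to scripts as environment variables (dots replaced by underscores); secret variables are excluded:

# List every environment variable the agent exposes to this step.
Get-ChildItem Env: | Sort-Object Name | ForEach-Object { Write-Host "$($_.Name) = $($_.Value)" }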
I'm looking through the Octopus PowerShell library and trying to identify a way to output all the variable names and their values used in a deployment - not for the project overall, but only for a specific deployment.
So say I have 3 variables like the below
VariableOne Value1
VariableTwo Value2
VariableThree Value3
And I only use the first and third and want those printed with their names (VariableOne, VariableThree) and their values (Value1, Value3).
There is an option for outputting all the variables into the deployment log for debugging purposes.
Set one (or both) of the following in your project variables list:
OctopusPrintVariables True
OctopusPrintEvaluatedVariables True
I find that the latter of the two is generally sufficient.
This feature is written up at https://octopus.com/docs/how-to/debug-problems-with-octopus-variables
<TL;DR>
No. It can't.
It's something we tried as well, but Octopus Deploy has so many ways in which variables can be used: XPath into .config files, JSONPath into JSON files, direct references and inline scripts in the workflows, as well as direct references using the #{var} syntax.
None of these options track which variables were actually transformed or referenced; plus, some optional expansions may short-circuit.
I've asked Octopus whether they could extend the object model to detect requests for the values of a variable, so we can see which values have actually been found. But that is currently not in place.
And they came back with the problem that step scripts may change or override the values of variables between steps, so a value may actually change during the workflow, making tracking even harder.