Azure DevOps - Can we reuse the value of a key in the same variable group? - azure-devops

I have lots of URL values and their keys, but there is no way to batch-import the variables, and the "value" controls on the Variable Groups page are not plain text boxes, so browser-extension-assisted find-and-replace in Chrome doesn't work either.
If this is possible, what is the syntax to refer to the key?
As in, I have a variable App.URL : www.contoso.com.
I am using the key to substitute its value into my next variable, like this: Login.URL : $(App.URL)\Login, but this doesn't work.
GitHub link : https://github.com/MicrosoftDocs/vsts-docs/issues/3902#issuecomment-489694654

This isn't currently available, and I'm not sure it ever will be. Can you create a task early in your pipeline that sets the variables you need in subsequent tasks/steps? This gives you more control, since you can store the script along with your source. You could then use a pipeline variable for the environment you're in and let your script use that to set values appropriately.
See Set variables in scripts in the MS docs.
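For example, a minimal YAML sketch of that approach (the step names are illustrative, and App.URL is assumed to come from a linked variable group):
steps:
# Derive Login.URL early so later steps can consume it as $(Login.URL).
- bash: echo "##vso[task.setvariable variable=Login.URL]$(App.URL)/Login"
  displayName: Set derived variables
# Any subsequent step now sees the derived value.
- bash: echo "Login URL is $(Login.URL)"
  displayName: Use derived variable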

If it's not possible to re-architect your app to concatenate the URL strings in the application itself, then the previous answer's suggestion of a simple script to do it for you is the way to go, e.g.:
#!/bin/bash
# Pipeline variables are exposed to scripts as environment variables,
# with dots replaced by underscores (App.URL becomes APP_URL).
fullLoginUrl="${APP_URL}/${LOGINSUFFIX}"
# This logging command sets Login.URL for all subsequent steps.
echo "##vso[task.setvariable variable=Login.URL]$fullLoginUrl"
Otherwise, playing around with run-time vs. compile-time variables in YAML pipelines might be worth trying:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#understand-variable-syntax
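For illustration, a minimal sketch of the run-time behaviour (this chaining works for variables defined in the YAML itself, though still not for variables inside a variable group, which is the original limitation):
variables:
  App.URL: 'www.contoso.com'
  Login.URL: '$(App.URL)/Login'  # macro syntax, resolved at run time when a task reads it
steps:
- script: echo "$(Login.URL)"    # prints www.contoso.com/Login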

Related

Question regarding parameter store and Cloud formation integration

I was trying this scenario but could not figure it out.
I have a name and value in Parameter Store in SSM. I am running the CloudFormation template from the CLI using CodePipeline, and I want the template to take the value directly from Parameter Store instead of prompting me on screen for it.
I tried this, but it still prompts me:
AWS::SSM::Parameter::Value<String>
It prompts when I upload the template in the console. How do I avoid that and make the stack take the value from Parameter Store directly?
You have two choices for that:
Provide a default value for the parameter.
Use parameter overrides in your CodePipeline to provide the required value for the parameter.
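A minimal sketch of the first option (the parameter and SSM names are hypothetical). With a Default that names the Parameter Store entry, nothing prompts you, and CloudFormation resolves the actual value at deploy time:
Parameters:
  DbPassword:
    Type: AWS::SSM::Parameter::Value<String>
    Default: /myapp/db-password  # name of the parameter in SSM Parameter Store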

Can I enumerate a variable group in VSTS?

TL;DR: Search and replace placeholders in a text file with the decrypted values of secrets in a variable group.
I would like to use a PowerShell script to receive a variable group in a release pipeline and then iterate through the list, performing search-and-replace on a file being released.
The variables in the variable list are secrets so I want to overwrite the placeholders in the file with the decrypted value of the variables.
The values of the variables are environment specific, so I do not want to provide the values at build time and do not want to include the decrypted values in a stored artifact.
The file being search-replaced will be used in an execution at release time but will not be deployed to a host, so will be destroyed upon completion of the pipeline execution.
The Tokenization task from the Visual Studio Marketplace does this job well. You'll need to install it into your Azure DevOps organisation; it's available at https://marketplace.visualstudio.com/items?itemName=TotalALM.totalalm-tokenization
By default the Tokenization task uses double underscores to identify placeholders. It will replace any text that matches the name of a variable in your release definition, as long as it's surrounded by double underscores.
So if you want to write the value of a variable called MySecretVariable into your file, you'll need to add a placeholder like __MySecretVariable__ to the file wherever you want that value written.
The Tokenization task will write any encrypted values into the file in plain text, but in the release logs they will be obfuscated.
If you're storing your variables in a variable group just link that to the release definition and set the scope to the appropriate environment.
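For illustration, a config file in the artifact might contain a line like this before the task runs (MySecretVariable is a hypothetical variable name):
ConnectionString=__MySecretVariable__
After the Tokenization task runs, the placeholder is replaced with the variable's decrypted value.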
So I've found that, as long as I have a PAT with the Variable Groups scope (click 'Show all scopes' when creating a new PAT) and pass it as a Base64-encoded basic-auth token, I can call GET https://dev.azure.com/{organization}/{project}/_apis/distributedtask/variablegroups/{groupId}?api-version=5.1-preview.1 to get the variable group I need.
The above, however, will not return values for secrets, although there is a hack/workaround for this (involving multiple pipeline steps).
The advisable route is to create a Key Vault in Azure and perform the processing either in PowerShell or in code.
As I need the decrypted key values passed into my application via a repeated find-and-replace, I implemented a PowerShell script in one release pipeline step and consume its output in the next step.
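A minimal PowerShell sketch of that approach (the organization, project, group id, file path, and placeholder convention are all illustrative, and secrets come back without values, as noted above):
$pat   = $env:AZDO_PAT  # PAT with the Variable Groups (read) scope
$token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat"))
$url   = "https://dev.azure.com/MyOrg/MyProject/_apis/distributedtask/variablegroups/42?api-version=5.1-preview.1"
$group = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Basic $token" }
$file  = "C:\release\template.config"
# Replace each __Name__ placeholder in the file with the variable's value.
foreach ($name in $group.variables.PSObject.Properties.Name) {
    $value = $group.variables.$name.value
    (Get-Content $file -Raw) -replace "__$([regex]::Escape($name))__", $value |
        Set-Content $file
}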

Using global variables in a ps1

I can't seem to find a good enough solution to my problem. Is there a good way of grouping variables in some kind of file so that multiple scripts can access them?
I've been doing some work with Desired State Configuration, but the work that needs to be done cannot be implemented efficiently that way. The point is to install the Azure Build Agent on a server and then configure it. Some variables, such as a Personal Access Token, really should not sit copy-pasted inside a script file. I just want to be able to change it easily without having to go into every script that uses it. In DSC you can just make a .psd1 file and access the variables like, for example, AllNodes.NodeName. The config-file invocation and parameters look like this:
.\config.cmd --unattended --url $myUrl --auth PAT --token $myToken --pool default --agent "$env:COMPUTERNAME" --acceptTeeEula --work $workDir
I want to make the variable $myToken accessible from an outside file, both for better security and to have a centralized place from which I can change values. Access to $myUrl is also important, since it changes with each new Build Agent update.
Thank you in advance for your effort. If anything is not clear please let me know.
I have two very different answers to your question, although either one of them may miss your point.
First, it's possible to define variables inside your profile script. Most people only use the profile script to define a library of functions or classes, but a variable can be made global the same way.
I have a variable named $myps that identifies the folder where I keep my PS scripts (in subfolders). When I start a session I generally switch to this directory.
The second way involves storing the values of variables in a CSV file, with the names stored in the CSV header. I then have a quick little cmdlet that steps through the CSV file, record by record, generating a different expansion of a template each time through.
These values are not quite global, but they can be used in more than one context.
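A quick sketch of both ideas (the paths and CSV column names are made up):
# In your profile script ($PROFILE), a variable can be made global:
$global:myps = 'C:\ps'

# CSV approach: variable names live in the header, values in each record.
foreach ($row in Import-Csv (Join-Path $global:myps 'values.csv')) {
    # Expand a template from the record's fields, once per record.
    "url=$($row.Url) token=$($row.Token)"
}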
Thank you for the help. Those are very useful solutions in some cases, but I dug a bit deeper and found a solution that suits my purpose. Basically, if you have a psd1 file suited for DSC use, you can also access its content from a normal ps1 file. For example:
@{
    NonNodeData = @{
        Pat = 'somePAT'
    }
}
Let's say this psd1 file, called ENV.psd1, is on your local machine in C:\Configuration.
To access the content of this file, make a variable inside your script and use Import-PowerShellDataFile, like so:
$configData = Import-PowerShellDataFile -Path "C:\Configuration\ENV.psd1"
And now you are free to use anything stored inside ENV.psd1. For example, to extract my PAT from the config file and store it in a variable in the script:
$myPat = $configData.NonNodeData.Pat
Thanks to that, I can just pass $myPat as a parameter when invoking config.cmd, like so:
.\config.cmd --unattended --auth PAT --token $myPat
This keeps my code cleaner and makes any future updates easier.

Unable to run experiment on Azure ML Studio after copying from different workspace

My simple experiment reads from an Azure Storage Table, selects a few columns, and writes to another Azure Storage Table. This experiment runs fine in its workspace (let's call it WorkSpace1).
Now I need to move this experiment as-is to another workspace (call it WorkSpace2) using PowerShell and need to be able to run the experiment there.
I am currently using this Library - https://github.com/hning86/azuremlps
Problem :
When I copy the experiment using 'Copy-AmlExperiment' from WorkSpace1 to WorkSpace2, the experiment and all its properties get copied except the Azure Table account key.
Now, this experiment runs fine if I manually enter the account key for the Import/Export modules on studio.azureml.net.
But I am unable to do this via PowerShell. If I export (Export-AmlExperimentGraph) the copied experiment from WorkSpace2 as JSON, insert the AccountKey into the JSON file, and import it (Import-AmlExperiment) back into WorkSpace2, the experiment fails to run.
On PowerShell I get an "Internal Server Error : 500".
While running on studio.azureml.net, I get the notification as "Your experiment cannot be run because it has been updated in another session. Please re-open this experiment to see the latest version."
Is there anyway to move an experiment with external dependencies to another workspace and run it?
Edit: I think the problem has something to do with how the experiment handles the AccountKey. When I enter it manually, it's converted into a JSON array comprising RecordKey and IndexInRecord. But when I upload the JSON experiment with the AccountKey, it remains unchanged and does not get resolved into RecordKey and IndexInRecord.
For me, publishing the experiment as a private experiment in the Cortana Gallery is one of the most useful options. Only people with the link can see and add the experiment from the gallery. At the link below I've explained the steps I followed.
https://naadispeaks.wordpress.com/2017/08/14/copying-migrating-azureml-experiments/
When the experiment is copied, the password is wiped for security reasons. If you want to programmatically inject it back, you have to set another metadata field to signal that you are setting a plain-text password, not an encrypted one. If you export the experiment in JSON format, you can easily figure this out.
I think I found out why you are unable to export the credentials back.
Export the JSON graph to your local disk, then update whatever parameter has to be updated.
You will also notice that the credentials are stored as 'Placeholders' instead of 'Literals', so it makes sense to change them to Literals.
You can do this by traversing the JSON to find the relevant parameters you need to update.
Here is a brief illustration of changing the Placeholder to a Literal:
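Conceptually, the edit looks something like this in the exported JSON (the exact field names vary by module and version, so treat this as a sketch rather than the exact schema):
"AccountKey": { "ValueType": "Placeholder" }
becomes
"AccountKey": { "ValueType": "Literal", "Value": "<your-storage-account-key>" }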

Jenkins How can i upload a text file and use it as a parameter

I have a txt file holding a string, and I want to be able to use that string in one of my scripts. So I'm wondering if there is a way to set the content of the file as a build property or parameter that I can use in my scripts, just like one of the build environment properties.
For example: ${JOB_NAME} holds the job name; in the same way, I want to access the content of the file, which holds some value.
Is it possible?
You can upload a file from your computer to the workspace through the File parameter of the job (see the sketch below).
You can use Extended Choice plugin parameter, to read value(s) from a file and display them in a dropdown/radio-button/checkbox for the user to select, dynamically, every time the build is triggered.
You can use EnvInject plugin to read value(s) from a file and inject them into the build as environment variables, so that they can be used by the rest of the build steps/scripts.
Your question is very unclear about what you are trying to do. Pick one of the three methods above based on what you need, or clarify your question.
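For the File parameter route, a minimal sketch (assuming the parameter's file location is set to config.txt, so Jenkins saves the upload into the workspace under that name):
# In an "Execute shell" build step:
MY_VALUE=$(cat "$WORKSPACE/config.txt")
echo "Value from uploaded file: $MY_VALUE"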