PowerShell: Access Azure DevOps Secret Variables

I am attempting to read an Azure DevOps secret variable from a PowerShell pipeline script. The variable looks like this within Azure:
I've attempted to access the secret variable both as a param, such as:
[CmdletBinding()]
Param (
    $SecurePassword = $env:Password
)
and simply as an environment variable, such as:
$SecurePassword = $env:Password
Unfortunately the variable continues to appear null using either method.
I have no issue accessing non-secret variables. Any help would be greatly appreciated.
---------------------------------------- EDIT ----------------------------------------
I found documentation here stating that secrets are available to scripts within the pipeline if explicitly mapped in the environment section of the task.
I've updated my PowerShell task and attempted to map the variable as both $(Password) and Password, at first without any luck.
Mapping $(Password) as above turned out to work: it reveals the string hidden behind the asterisks.
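In other words, once the secret is explicitly mapped into the task's environment, an ordinary lookup works. A rough sketch (the mapped name "Password" is an assumption):
# Only populated because of the explicit mapping in the task's environment section
$PlainPassword = $env:Password
$SecurePassword = ConvertTo-SecureString -String $PlainPassword -AsPlainText -Force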

We had a requirement to create a new project in Azure DevOps and needed to migrate all of the pipelines to the new project. Lo and behold, no one knew all of the secrets, and export/import doesn't carry the secret values across.
I wrote a script to output all environment variables into an "Extensions" tab next to the build summary. It's formatted and everything.
The key to outputting a secret is altering the string so the agent's masking no longer matches it: insert the '<-eliminate->' marker inside the secret value and save the result to a file. Once the file is created, we then remove all instances of the string '<-eliminate->', save the file, and there it sits as an extension page next to the build summary.
I would like to somehow find all secrets dynamically, but for now manually defining the variable name does the trick.
I re-formatted for this post and removed proprietary info, please let me know if it's broken :)
function GetSecretLength ($secretVar) {
    # Probe with Substring until it throws; the first failing index reveals the length.
    $i = 0
    while ($true) {
        try {
            $secretVar.Substring(0, $i) | Out-Null
        } catch {
            break
        }
        $i++
    }
    if ($i -le 1) { return 1 }
    else { return $i - 1 }
}
function GetSecret ($secret) {
    # Wedge the marker in before the last character so the written value
    # no longer matches the secret string the agent knows to mask.
    $length = GetSecretLength($secret)
    if ($length -ge 2) {
        return $secret.Substring(0, $length - 1) + "<-eliminate->" + $secret.Substring($length - 1, 1)
    } elseif ($length -eq 1) {
        return $secret + "<-eliminate->"
    } else {
        return ""
    }
}
# Dump every environment variable (markers included), attach the file to the
# build summary, then strip the markers back out of the file on disk.
$var = (gci env:*).GetEnumerator() | Sort-Object Name
$out = ""
foreach ($v in $var) { $out = $out + ("`t{0,-28} = {1,-28}`n" -f $v.Name, (GetSecret($v.Value))) }
$fileName = "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\build-variables.md"
Write-Output "dump variables on $fileName"
Set-Content $fileName $out
Write-Output "##vso[task.addattachment type=Distributedtask.Core.Summary;name=Environment Variables;]$fileName"
((Get-Content -Path $fileName -Raw) -replace '<-eliminate->', '') | Set-Content -Path $fileName
You have to add the secret variables that you want into the "Environment Variables" of the PowerShell task:
You end up with this pretty tab:

Why not just store the password in an Azure Key Vault as a secret? There are Azure PowerShell commands for accessing secrets, which avoids all of this. We generate random passwords, store them in Key Vaults, and then access the contents from the appropriate resource without ever needing to expose the decrypted secret in a PowerShell command, for example in an ARM template for an Azure SQL Server database.
I know this doesn't solve your initial question, but it is a workaround that does work.
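For what it's worth, a minimal sketch of that route with the Az module (the vault and secret names here are invented):
# Assumes Az.KeyVault is installed and the session is authenticated (Connect-AzAccount);
# 'my-vault' and 'SqlAdminPassword' are placeholder names.
$plain = Get-AzKeyVaultSecret -VaultName 'my-vault' -Name 'SqlAdminPassword' -AsPlainText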


How to create a PowerShell script to retry on error

I am writing a PowerShell script to copy unencrypted EBS snapshots in AWS to encrypted snapshots. In AWS the maximum number of concurrent copies is currently 20 per destination region, but I have 1400 snapshots to copy. I wrote a script using a foreach loop to iterate over the snapshot IDs stored in a text file, and it works as expected until 20 snapshots are being copied at once. Then it throws the following error and fails:
An error occurred (ResourceLimitExceeded) when calling the CopySnapshot operation: Too many snapshot copies in progress. The limit is 20 for this destination region.
I have tried to use a do/while loop, but I believe I am missing something. The script is listed below. Essentially, if the script reaches 20 concurrent copies, I want it to retry the current snapshot until a slot frees up and then move on to the next. Ideally I would like to just have this run in the background for a day or so. See the current script below:
function Get-TimeStamp {
    return "[{0:MM/dd/yy} {0:HH:mm:ss}]" -f (Get-Date)
}
$kmsID = "blah"
$region = "us-east-1"
$stoploop = $false
[int]$Retrycount = "0"
foreach ($line in Get-Content C:\snaps4.txt) {
    do {
        $desc = aws ec2 describe-snapshots --snapshot-ids $line | ConvertFrom-Json
        $description = $desc.Snapshots.Description
        Write-Output "$description"
        $snap = aws ec2 copy-snapshot --description "[Copied $line from us-east-1] $description" --source-region $region --source-snapshot-id $line --encrypted --kms-key-id $kmsID | ConvertFrom-Json
        $newsnap = $snap.SnapshotId
        Write-Output "$(Get-TimeStamp) Created copy of $line $description with NEW SnapshotID $newsnap" >> C:\log.txt
        $stoploop = $true
    } while ($stoploop -eq $false)
}
Please let me know if you have any questions, and I appreciate any help in advance.
Thanks!
You can put the copy command inside a try/catch block.
Something like this:
try {
    # copy command
    # mark as complete
}
catch {
    # mark as failed
}
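Fleshed out for the aws CLI, it might look something like the sketch below. Note that a failing native command does not throw a PowerShell exception, so the test here is on $LASTEXITCODE and the captured error text rather than try/catch:
foreach ($line in Get-Content C:\snaps4.txt) {
    do {
        # 2>&1 captures the CLI's error text so it can be inspected
        $result = aws ec2 copy-snapshot --description "[Copied $line from us-east-1]" --source-region $region --source-snapshot-id $line --encrypted --kms-key-id $kmsID 2>&1
        if ($LASTEXITCODE -eq 0) {
            $copied = $true
        } elseif ("$result" -match 'ResourceLimitExceeded') {
            # All 20 copy slots are busy; wait for one to free up, then retry this snapshot
            Start-Sleep -Seconds 60
            $copied = $false
        } else {
            throw "Unexpected error copying ${line}: $result"
        }
    } until ($copied)
}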
One approach is to make the content file a CSV with snapshot-id and complete columns. Use Import-Csv to read it; iterate over the imported list where $_.complete -ne "Y", and set complete to "Y" when a copy succeeds. Export the file at the end and re-run as needed; a sketch follows.
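A rough sketch of that checkpointing idea (the CSV path and column names are invented):
# Assumes C:\snaps.csv with columns: SnapshotId,Complete
$rows = Import-Csv C:\snaps.csv
foreach ($row in $rows) {
    if ($row.Complete -eq 'Y') { continue }   # already copied on a previous run
    aws ec2 copy-snapshot --source-region $region --source-snapshot-id $row.SnapshotId --encrypted --kms-key-id $kmsID | Out-Null
    if ($LASTEXITCODE -eq 0) { $row.Complete = 'Y' }   # mark as complete
}
# Persist progress so the next run skips completed rows
$rows | Export-Csv C:\snaps.csv -NoTypeInformation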

PowerShell workflow function issue

I'm trying to reuse code in my SMA runbooks, but everything I try to put inside a function doesn't seem to work as expected.
For example, If I do this it works and returns the username of the credential:
workflow RB_Test
{
    $credent = Get-AutomationPSCredential -Name "CRED_TESTE"
    $var = $credent.Username
    "result = ${var}"
}
Output:
But if I turn into this it doesn't work anymore (returns null):
workflow RB_Test
{
    function FN_Test
    {
        $credent = Get-AutomationPSCredential -Name "CRED_TESTE"
        $var = $credent.Username
        "result = ${var}"
    }
    FN_Test
}
Output:
I've tried different things without success; the debug/verbose output doesn't show anything different. This also doesn't work:
InlineScript {
    . FN_Test
}
My goal is to put several functions into a separate module and then import it in my runbooks for reusability, but this really doesn't seem to work.
This is a runbook (powershell workflow) created in the Service Management Automation (SMA).
I've read that there are some restrictions in PowerShell workflow compared to pure PowerShell, but I am not sure if I am hitting one of them:
https://blogs.technet.microsoft.com/heyscriptingguy/2013/01/02/powershell-workflows-restrictions/
Thanks
Here's what I've had to do to get functions to work:
workflow FunctionTest {
    function log {
        param(
            [string]$Message
        )
        Write-Output $Message
        Write-Output "Filename: $Filename"
        Write-Output "using:Filename: $using:Filename"
        Write-Output "workflow:Filename: $workflow:Filename"
        Write-Output "----"
        ## Under what conditions is 'global' used? Can't be used in a workflow...Hey Scripting Guy!
    }
    workflow DoSomething {
        param(
            [string]$Filename
        )
        log "Starting DoSomething"
    }
    $Filename = "LogFile_2017.csv"
    log "Starting workflow"
    ## Variables need to be passed into a child workflow from the parent workflow
    DoSomething -Filename $Filename
    log "End workflow"
}
FunctionTest
I found you need to define your functions before using them. The tricky part was discovering that you have to pass your variables explicitly into the child workflow; the scoping of variables takes some getting used to.
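Another pattern worth trying in SMA is to keep the function inside an InlineScript block, where normal PowerShell scoping applies, and hand workflow variables in with $using:. A sketch based on the poster's example:
workflow RB_Test
{
    $credent = Get-AutomationPSCredential -Name "CRED_TESTE"
    InlineScript {
        # Ordinary function semantics apply inside InlineScript
        function FN_Test ($cred) {
            "result = $($cred.Username)"
        }
        FN_Test -cred $using:credent
    }
}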

PowerShell: Using DacServices With SQLCMD Variables To Deploy A DACPAC

In PowerShell I'm using Microsoft.SqlServer.Dac.DacServices and Microsoft.SqlServer.Dac.DacDeployOptions to deploy/update a database DACPAC. The problem I am having is finding where to set the SQLCMD Variables the package requires.
Abbreviated Sample
# Create a DacServices object, which needs a connection string
$dacsvcs = New-Object Microsoft.SqlServer.Dac.DacServices "server=$sqlserver"
# Load dacpac from file
$dp = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpac)
# Deploy options
$deployOptions = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
$deployOptions.IncludeCompositeObjects = $true
I know I can pass these just fine with SqlPackage.exe, and maybe that's what I should do. But nowhere in the documentation or on the web can I find an example of DacServices usage with SQLCMD variables as an option, and SQLCMD variables are required parameters for my project's DACPAC.
You should set the values in the $deployOptions.SqlCommandVariableValues property. This is an updatable Dictionary: you can't assign a new dictionary, but you can update the key/value pairs inside it. For example, to set a variable "MyDatabaseRef" to "Database123", use:
$deployOptions.SqlCommandVariableValues.Add("MyDatabaseRef", "Database123");
The API reference is here.
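Putting it together with the sample above, the deploy call itself would look roughly like this (a sketch; the target database name is invented):
# Set the SQLCMD variable, then deploy: Deploy(package, targetDatabaseName, upgradeExisting, options)
$deployOptions.SqlCommandVariableValues.Add("MyDatabaseRef", "Database123")
$dacsvcs.Deploy($dp, "MyTargetDatabase", $true, $deployOptions)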
I have another code snippet to share in relation to this: a method of processing multiple variables from a PowerShell script argument.
param(
    [hashtable] $SqlCmdVar
)
$deployOptions = New-Object Microsoft.SqlServer.Dac.DacDeployOptions
# Process the SQL command variables
if ($SqlCmdVar -ne $null)
{
    foreach ($key in $SqlCmdVar.Keys)
    {
        Write-Verbose -Message "Adding Sql Command Variable ""$key""..."
        $deployOptions.SqlCommandVariableValues.Add($key, $SqlCmdVar[$key])
    }
}
You would call the script like this:
myscript.ps1 -SqlCmdVar @{ variable1 = "my first value"; variable2 = "my second value"; variableetc = "more values" }

PowerShell DSC - how to pass configuration parameters to ScriptResources?

I'm having a lot of trouble trying to get a PowerShell Desired State Configuration script working to configure an in-house application. The root of the problem is that I can't seem to pass my configuration data down into a ScriptResource (at least not with the way I'm trying to do it).
My script is supposed to create a config folder for our in-house application, and then write some settings into a file:
configuration MyApp {
    param (
        [string[]] $ComputerName = $env:ComputerName
    )
    node $ComputerName {
        File ConfigurationFolder {
            Type = "Directory"
            DestinationPath = $Node.ConfigFolder
            Ensure = "Present"
        }
        Script ConfigurationFile {
            SetScript = {
                write-verbose "running ConfigurationFile.SetScript";
                write-verbose "folder = $($Node.ConfigFolder)";
                write-verbose "filename = $($Node.ConfigFile)";
                [System.IO.File]::WriteAllText($Node.ConfigFile, "enabled=" + $Node.Enabled);
            }
            TestScript = {
                write-verbose "running ConfigurationFile.TestScript";
                write-verbose "folder = $($Node.ConfigFolder)";
                write-verbose "filename = $($Node.ConfigFile)";
                return (Test-Path $Node.ConfigFile);
            }
            GetScript = { @{Configured = (Test-Path $Node.ConfigFile)} }
            DependsOn = "[File]ConfigurationFolder"
        }
    }
}
For reference, my configuration data looks like this:
$config = @{
    AllNodes = @(
        @{
            NodeName = "*"
            ConfigFolder = "C:\myapp\config"
            ConfigFile = "C:\myapp\config\config.txt"
        }
        @{
            NodeName = "ServerA"
            Enabled = "true"
        }
        @{
            NodeName = "ServerB"
            Enabled = "false"
        }
    )
}
And I'm applying DSC with the following:
$mof = MyApp -ConfigurationData $config;
Start-DscConfiguration MyApp -Wait -Verbose;
When I apply this configuration it happily creates the folder, but fails to do anything with the config file. Looking at the output below, it's obvious that it's because the $Node variable is null inside the scope of ConfigurationFile / TestScript, but I've got no idea how to reference it from within that block.
LCM: [ Start Resource ] [[Script]ConfigurationFile]
LCM: [ Start Test ] [[Script]ConfigurationFile]
[[Script]ConfigurationFile] running ConfigurationFile.TestScript
[[Script]ConfigurationFile] node is null = True
[[Script]ConfigurationFile] folder =
[[Script]ConfigurationFile] filename =
LCM: [ End Test ] [[Script]ConfigurationFile] in 0.4850 seconds.
I've burnt an entire day searching online for this specific problem, but all the examples of variables, parameters and configuration data use File and Registry resources or other non-script resources, which I've already got working in the "ConfigurationFolder" block in my script. The thing I'm stuck on is how to reference the configuration data from within a Script resource like my "ConfigurationFile".
I've drawn a complete blank so any help would be greatly appreciated. If all else fails I may have to create a separate "configuration" script per server and hard-code the values, which I really don't want to do if at all possible.
Cheers,
Mike
Change this: $Node.ConfigFolder to $using:Node.ConfigFolder.
If you have a variable called $Foo and you want it to be passed to a script DSC resource, then use $using:Foo
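Applied to the Script resource from the question, that looks roughly like this (a sketch; TestScript and GetScript follow the same pattern):
Script ConfigurationFile {
    SetScript = {
        $node = $using:Node   # value is serialized into the MOF at compile time
        [System.IO.File]::WriteAllText($node.ConfigFile, "enabled=" + $node.Enabled)
    }
    TestScript = {
        $node = $using:Node
        Test-Path $node.ConfigFile
    }
    GetScript = {
        $node = $using:Node
        @{ Configured = (Test-Path $node.ConfigFile) }
    }
    DependsOn = "[File]ConfigurationFolder"
}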
Based on David's answer, I've written a utility function which converts my script block to a string and then performs a very naive search and replace to expand out references to the configuration data as follows.
function Format-DscScriptBlock()
{
    param(
        [parameter(Mandatory=$true)]
        [System.Collections.Hashtable] $node,
        [parameter(Mandatory=$true)]
        [System.Management.Automation.ScriptBlock] $scriptBlock
    )
    $result = $scriptBlock.ToString();
    foreach ($key in $node.Keys)
    {
        $result = $result.Replace("`$Node.$key", $node[$key]);
    }
    return $result;
}
My SetScript then becomes:
SetScript = Format-DscScriptBlock -Node $Node -ScriptBlock {
    write-verbose "running ConfigurationFile.SetScript";
    write-verbose "folder = $Node.ConfigFolder";
    write-verbose "filename = $Node.ConfigFile";
    [System.IO.File]::WriteAllText("$Node.ConfigFile", "enabled=" + $Node.Enabled);
}
You have to be mindful of quotes and escapes in your configuration data because Format-DscScriptBlock only performs literal substitution, but this was good enough for my purposes.
A quite elegant way to solve this problem is to work with regular {0} placeholders. By applying the -f operator, the placeholders can be replaced with their actual values.
The only downside of this method is that you cannot use curly braces { } for anything other than placeholders (e.g. a hashtable or a for-loop), because the -f operator requires the braces to contain an integer.
Your code then looks like this:
SetScript = ({
    Set-ItemProperty "IIS:\AppPools\{0}" "managedRuntimeVersion" "v4.0"
    Set-ItemProperty "IIS:\AppPools\{0}" "managedPipelineMode" 1 # 0 = Integrated, 1 = Classic
} -f @($ApplicationPoolName))
Also, a good way to find out if you're doing it right is by simply viewing the generated .mof file with a text editor; if you look at the generated TestScript / GetScript / SetScript members, you'll see that the code fragment really is a string. The $placeholder values should already have been replaced there.
ConfigurationData only exists at the time the MOF files are compiled, not at runtime when the DSC engine applies your scripts. The SetScript, GetScript, and TestScript attributes of the Script resource are actually strings, not script blocks.
It's possible to generate those script strings (with all of the required data from your ConfigurationData already expanded), but you have to be careful to use escapes, subexpressions and quotation marks correctly.
I posted a brief example of this over on the original TechNet thread at http://social.technet.microsoft.com/Forums/en-US/2eb97d67-f1fb-4857-8840-de9c4cb9cae0/dsc-configuration-data-for-script-resources?forum=winserverpowershell
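Roughly, the technique from that thread is to build the script as an expandable string so the node values are baked in when the MOF is compiled. A minimal sketch (note the single quotes protecting the runtime string literals):
SetScript = @"
[System.IO.File]::WriteAllText('$($Node.ConfigFile)', 'enabled=$($Node.Enabled)')
"@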

PowerShell script: prompt for a file (by mask) and use that file in a command line

Disclaimer: I don't know enough about PowerShell to accomplish this in a reasonable amount of time, so yes, I am asking someone else to do my dirty job.
I want to be able to run a web.config transformation without opening a command line.
I have following files in a folder:
web.config - actual web config
web.qa.config - web config transformation for qa env
web.production.config - web config transformation for production env
transform.ps1 - powershell script I want to use to run transformation
Here is what I want:
The PS file shall enumerate the current directory using .*\.(?<env>.*?)\.config and let me choose which <env> I want to generate web.config for. In my example I would be presented with two options: "qa" and "production".
After I (the user) select the environment (say it is "qa"; the selected environment is stored as $env and the corresponding filename as $transformation), the script shall do the following:
back up the original web.config as web.config.bak
execute the following command:
echo applying $transformation...
ctt.exe source:web.config transformation:$transformation destination:web.config preservewhitespaces verbose
echo done.
ctt.exe is a tool based on XDT that runs web.config transformations from the command line.
Okay, looks simple enough, I'll do your dirty job for you. ;)
Save the following as transform.ps1:
$environments = @()
gci | % { if ($_ -match '.*\.(?<env>.*?)\.config') { $environments += $matches.env } }
Write-Host "`nEnvironments:"
for ($i = 0; $i -lt $environments.Length; $i++) { Write-Host "[$($i + 1)] $($environments[$i])" }
Write-Host
do {
    $selection = [int](Read-Host "Please select an environment")
    if ($selection -gt 0 -and $selection -le $environments.Length) {
        $continue = $true
    } else {
        Write-Host "Invalid selection. Please enter the number of the environment you would like to select from the list."
    }
} until ($continue)
$transformation = "web.$($environments[$selection - 1]).config"
if (Test-Path .\web.config) {
    cpi .\web.config .\web.config.bak
} else {
    Write-Warning "web.config does not exist. No backup will be created."
}
if ($?) {
    Write-Host "`nApplying $transformation..."
    ctt.exe source:web.config transformation:$transformation destination:web.config preservewhitespaces verbose
    Write-Host "Done.`n"
} else {
    Write-Error "Failed to create a backup of web.config. Transformation aborted."
}