Passing parameters using AZ CLI for ARM template deployment - azure-devops

I am trying to use az group deployment create to perform an ARM template deployment and I want to pass in parameters where the values are defined in variables. I can do a single parameter with no issues using the syntax below:
--parameters parameter1=$var1
But when I try to add additional parameters using the syntax below, it fails:
--parameters parameter1=$var1, parameter2=$var2
The syntax below fails as well since it will not use the values of the variables:
--parameters '{
    "parameter1": { "value": "$var1" },
    "parameter2": { "value": "$var2" }
}'
Does anyone know if what I am trying to do is possible and what the correct syntax would be?

I was fighting a combination of a corrupt shell and slightly incorrect syntax. The correct syntax for what I was trying to do is listed below:
--parameters parameter1=$var1 parameter2=$var2
Or, for a cleaner view when several parameters are involved:
--parameters parameter1=$var1 `
parameter2=$var2 `
parameter3=$var3
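The trailing backtick is PowerShell's line-continuation character; in Bash the equivalent is a trailing backslash. As a fuller sketch, where the resource group and template file names are assumed placeholders:
az group deployment create `
    --resource-group MyResourceGroup `
    --template-file azuredeploy.json `
    --parameters parameter1=$var1 parameter2=$var2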

Related

Bicep/ARM XML parameter as string not parsed correctly

In my Bicep file, I have a parameter:
@secure()
param securityToken string
This parameter is an XML value provided as a string from an environment variable, not read from a file. I have trouble correctly setting this parameter from my YML pipeline. I have this piece of code:
az deployment sub create --template-file main.bicep --parameters securityToken='<xml>'
Unfortunately, this returns an error:
< was unexpected at this time.
Ok, so I somehow have to escape this parameter? I also tried:
az deployment sub create --template-file main.bicep --parameters securityToken='<xml>'
But this gives the following error:
The system cannot find the file specified.
So my question is:
How can I provide an XML string parameter to my Bicep / ARM deployment?
Thanks!
Deployments to subscriptions need the --location <location> parameter, which is missing in the examples above.
When I run the following on Windows in Git Bash, the deployment succeeds.
az deployment sub create --location eastus --template-file main.bicep --parameters securityToken='<xml>'
Result: the deployment succeeds.
main.bicep:
targetScope = 'subscription'
@secure()
param securityToken string
output exposedSecret string = securityToken
So I found the problem. Apparently, the < and > characters need to be escaped, and I finally found out how: by wrapping each one in double quotes. So the value "<"xml">" did the trick. Now my working code is this:
$token = '<xml></xml>'
$token = $token.Replace('<', '"<"')   # = "<"xml>"<"/xml>
$token = $token.Replace('>', '">"')   # = "<"xml">""<"/xml">"
$token = $token.Replace('""', '" "')  # the double "" only escapes the " itself, so add an extra space: "<"xml">" "<"/xml">"
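With the token escaped this way, the deployment call might look like the following sketch (the location and file name reuse the examples above):
$token = '<xml></xml>'
$token = $token.Replace('<', '"<"').Replace('>', '">"').Replace('""', '" "')
az deployment sub create --location eastus --template-file main.bicep --parameters securityToken=$token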

Submit command line arguments to a pyspark job on airflow

I have a pyspark job available on GCP Dataproc to be triggered on airflow as shown below:
config = help.loadJSON("batch/config_file")
MY_PYSPARK_JOB = {
    "reference": {"project_id": "my_project_id"},
    "placement": {"cluster_name": "my_cluster_name"},
    "pyspark_job": {
        "main_python_file_uri": "gs://file/loc/my_spark_file.py",
        "properties": config["spark_properties"],
        "args": <TO_BE_ADDED>
    },
}
I need to supply command line arguments to this pyspark job as shown below [this is how I run my pyspark job from the command line]:
spark-submit gs://file/loc/my_spark_file.py --arg1 val1 --arg2 val2
I am providing the arguments to my pyspark job using "configparser". Therefore, arg1 is the key and val1 is the value from my spark-submit command above.
How do I define the "args" param in the "MY_PYSPARK_JOB" defined above [equivalent to my command line arguments]?
I finally managed to solve this conundrum.
If we are making use of ConfigParser, the key has to be specified as below [irrespective of whether the argument is passed on the command line or via Airflow]:
--arg1
In airflow, the configs are passed as a Sequence[str] (as mentioned by @Betjens below) and each argument is defined as follows:
arg1=val1
Therefore, as per my requirement, command line arguments are defined as depicted below:
"args": ["--arg1=val1",
"--arg2=val2"]
PS: Thank you @Betjens for all your suggestions.
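Putting it together, the complete job definition from the question becomes the following (project, cluster, and file names are the placeholders from above):
MY_PYSPARK_JOB = {
    "reference": {"project_id": "my_project_id"},
    "placement": {"cluster_name": "my_cluster_name"},
    "pyspark_job": {
        "main_python_file_uri": "gs://file/loc/my_spark_file.py",
        "properties": config["spark_properties"],
        "args": ["--arg1=val1", "--arg2=val2"],
    },
}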
You have to pass a Sequence[str]. If you check DataprocSubmitJobOperator you will see that the job param implements the class google.cloud.dataproc_v1.types.Job.
class DataprocSubmitJobOperator(BaseOperator):
...
:param job: Required. The job resource. If a dict is provided, it must be of the same form as the protobuf message.
:class:`~google.cloud.dataproc_v1.types.Job`
So, in the section about the pySpark job type, which is google.cloud.dataproc_v1.types.PySparkJob:
args Sequence[str]
Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
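For completeness, a minimal sketch of passing the job dict to the operator; the task_id and region values here are assumptions:
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

submit_pyspark = DataprocSubmitJobOperator(
    task_id="submit_pyspark",   # assumed task id
    job=MY_PYSPARK_JOB,         # the dict defined above, with "args" filled in
    region="us-central1",       # assumed region
    project_id="my_project_id",
)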

Unexpected error while passing variable group variables (Azure DevOps) to YAML pipeline

I'm a newbie to both Azure DevOps and Terraform, but I'm trying to deploy a pipeline using a YAML file.
I have tried to run a terraform plan using a YAML file, passing variables from Azure DevOps, but I got the following error:
2021-11-24T18:39:46.4604561Z Error: "name" may only contain alphanumeric characters, dash, underscores, parentheses and periods
2021-11-24T18:39:46.4604832Z
2021-11-24T18:39:46.4605940Z on modules/aks/main.tf line 2, in resource "azurerm_resource_group" "aks-resource-group":
2021-11-24T18:39:46.4606436Z 2: name = var.resource_group_name
2021-11-24T18:39:46.4606609Z
2021-11-24T18:39:46.4606722Z
2021-11-24T18:39:46.4606818Z
2021-11-24T18:39:46.4607525Z Error: Error: Subnet: (Name "#{vnet_subnet_name}#" / Virtual Network Name "#{vnet_name}#" / Resource Group "RG-XX-XXXX-XXXXX-001") was not found
2021-11-24T18:39:46.4608006Z
2021-11-24T18:39:46.4608580Z on modules/aks/main.tf line 16, in data "azurerm_subnet" "subnet-project":
2021-11-24T18:39:46.4609335Z 16: data "azurerm_subnet" "subnet-project" {
The 'name' has the following format in the variable group in the Azure DevOps UI:
RG-XX-XXXX-XXXXX-001
This is the snippet where I included the replace tokens step in the YAML file:
  displayName: 'Replace Secrets'
  inputs:
    targetFiles: |
      variables.tfvars
    encoding: 'utf-8'
    actionOnMissing: fail
    tokenPattern: #{MyVar}#
And this is a sample of the variables I have in a variable group:
[screenshot: variable-group-sample]
Also, I replace the terraform.tfvars file with something like this:
resource_group_name = "#{resource_group_name}#"
I have checked the name entered in the UI several times, but I feel the error is pointing to something else I cannot see.
Has anyone experienced something related to this error?
Thank you in advance!
tokenPattern: #{MyVar}#
It is looking for the literal pattern #{MyVar}# to replace: not "anything contained between #{ and }#", but the actual value #{MyVar}#. I'm guessing it's expecting a regular expression, but I'm not familiar with that task.
So the end result is that your #{token values}# aren't getting replaced.
Assuming you're using https://marketplace.visualstudio.com/items?itemName=qetza.replacetokens, you probably want to specify tokenPrefix: #{ and tokenSuffix: }# instead of using tokenPattern.
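A sketch of the adjusted step; the task version is an assumption, so check the extension's documentation:
- task: replacetokens@3          # version is an assumption
  displayName: 'Replace Secrets'
  inputs:
    targetFiles: |
      variables.tfvars
    encoding: 'utf-8'
    actionOnMissing: fail
    tokenPrefix: '#{'
    tokenSuffix: '}#'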
Now, having said that...
There is no reason for you to be using token replacement on a tfvars file. You should create different tfvars files for each environment, then pass in a tfvars file via the -var-file argument to Terraform. Secrets can be passed in on the command line via -var 'foo=bar'.
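For example (a sketch; the tfvars path and variable name are placeholders):
terraform plan -var-file="env/dev.tfvars" -var "resource_group_name=RG-XX-XXXX-XXXXX-001"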
Storing variables that represent application or deployment configuration in Azure DevOps (or GitHub, or any other CI system) is a big, big anti-pattern, because it's tightly coupling your deployment process to a particular platform. If you're sourcing all of your variables from Azure DevOps, you can't easily test locally or migrate to a different CI/CD provider like GitHub Actions in the future.
For values that shouldn't be in source control, such as secrets, you should use a secret provider like Azure KeyVault and integrate it with your application (or, in this case, use a data resource in Terraform to pull the necessary secrets automatically at deployment time).

How to use predefined variables in Azure DevOps PowerShell Task?

I'm trying to create a simple Azure DevOps Extension task which consists of a simple PowerShell script installing a dotnet tool and then running it like this:
dotnet tool install dotnet-stryker --tool-path $(Agent.BuildDirectory)/tools
$(Agent.BuildDirectory)/tools/dotnet-stryker
But when I try to run my task I get the following error:
##[error]System.Management.Automation.ParseException: At D:\a\_tasks\run-stryker_400ea42f-b258-4da4-9a55-68b174cae84c\0.14.0\run-stryker.ps1:17 char:25
##[debug]Processed: ##vso[task.logissue type=error;]System.Management.Automation.ParseException: At D:\a\_tasks\run-stryker_400ea42f-b258-4da4-9a55-68b174cae84c\0.14.0\run-stryker.ps1:17 char:25
+ $(Agent.BuildDirectory)/tools/dotnet-stryker
+ ~
You must provide a value expression following the '/' operator.
At D:\a\_tasks\run-stryker_400ea42f-b258-4da4-9a55-68b174cae84c\0.14.0\run-stryker.ps1:17 char:25
+ $(Agent.BuildDirectory)/tools/dotnet-stryker
+ ~~~~~~~~~~~~~~~~~~~~
Unexpected token 'tools/dotnet-stryker' in expression or statement.
at System.Management.Automation.Runspaces.PipelineBase.Invoke(IEnumerable input)
at System.Management.Automation.PowerShell.Worker.ConstructPipelineAndDoWork(Runspace rs, Boolean performSyncInvoke)
at System.Management.Automation.PowerShell.Worker.CreateRunspaceIfNeededAndDoWork(Runspace rsToUse, Boolean isSync)
at System.Management.Automation.PowerShell.CoreInvokeHelper[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at System.Management.Automation.PowerShell.CoreInvoke[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
at Microsoft.TeamFoundation.DistributedTask.Handlers.LegacyVSTSPowerShellHost.VSTSPowerShellHost.Main(String[] args)
##[error]LegacyVSTSPowerShellHost.exe completed with return code: -1.
I've tried small changes to my code, like exchanging \ for / and wrapping it in "", but I always end up with the same result.
Note that this same code works just fine if I run it inside an Azure Pipeline as inline PowerShell.
What am I missing here?
In a PowerShell script (not inline) you need to use the following syntax:
$env:Agent_BuildDirectory
Reference: https://learn.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azure-devops&tabs=batch#using-default-variables
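Applied to the first line of the script from the question, that would look like this sketch (the dot in the variable name becomes an underscore):
dotnet tool install dotnet-stryker --tool-path "$env:Agent_BuildDirectory/tools"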
How to use predefined variables in Azure DevOps PowerShell Task?
You could try to pass this predefined variable as an input in the task.json file, like:
"inputs": [
{
"name": "BuildDirectory",
"type": "string",
"label": "The Build Directory",
"defaultValue": "$(Agent.BuildDirectory)",
"required": true,
"helpMarkDown": "The folder of the test project to execute(e.g. Sample.Tests.csproj)."
},
Then in the run-stryker.ps1 file:
param(
    [string] $BuildDirectory
)
It ended up being that my issue was not related to Azure DevOps variables, but to the fact that I was trying to run a string as if it were a PowerShell command.
Solved it with a simple Invoke-Expression.
Invoke-Expression "$(Agent.BuildDirectory)/tools/dotnet-stryker"
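An alternative sketch that avoids parsing a string as a command is PowerShell's call operator, combined with the environment-variable form from the first answer:
& "$env:Agent_BuildDirectory/tools/dotnet-stryker"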

Getting Outputs from aws cloudformation describe-stacks

I am using the below to get the stack information I want via the AWS CLI:
aws cloudformation --region ap-southeast-2 describe-stacks --stack-name mystack
It's returning result OK:
{
    "Stacks": [
        {
            "StackId": "arn:aws:mystackid",
            "LastUpdatedTime": "2017-01-13T04:59:17.472Z",
            "Tags": [],
            "Outputs": [
                {
                    "OutputKey": "Ec2Sg",
                    "OutputValue": "sg-97e13dff"
                },
                {
                    "OutputKey": "DbUrl",
                    "OutputValue": "myUrl"
                }
            ],
            "CreationTime": "2017-01-13T03:27:18.893Z",
            "StackName": "mystack",
            "NotificationARNs": [],
            "StackStatus": "UPDATE_ROLLBACK_COMPLETE",
            "DisableRollback": false
        }
    ]
}
But I do not know how to return only the value of OutputValue, which is myUrl, as I do not need the rest.
Is that possible via aws cloudformation describe-stacks?
Edit
I just realize I can use --query:
--query "Stacks[0].Outputs[1].OutputValue"
will get exactly what I want, but I would rather select by the DbUrl key; otherwise, if the number of Outputs changes, my result will be unexpected.
I got the answer, use the below:
--query 'Stacks[0].Outputs[?OutputKey==`DbUrl`].OutputValue' --output text
Or
--query 'Stacks[?StackName==`mystack`][].Outputs[?OutputKey==`DbUrl`].OutputValue' --output text
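The value can then be captured in a shell variable, as in this Bash sketch reusing the stack from the question:
DB_URL=$(aws cloudformation describe-stacks \
    --region ap-southeast-2 \
    --stack-name mystack \
    --query 'Stacks[0].Outputs[?OutputKey==`DbUrl`].OutputValue' \
    --output text)
echo "$DB_URL"   # myUrl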
While querying works, it may prove problematic if you have multiple stacks. Realistically, you should probably be leveraging exports for things that are distinct and authoritative.
By way of example - if you modified your CloudFormation snippet to look like this:
"Outputs" : {
"DbUrl" : {
"Description" : "My Database Url",
"Value" : "myUrl",
"Export" : {
"Name" : "DbUrl"
}
}
}
Then you could use:
aws cloudformation list-exports --query "Exports[?Name==\`DbUrl\`].Value" --no-paginate --output text
to retrieve it. Exports are required to be unique - only one stack can export any given name. This way, you're assured that you get the right value, every time. If you attempt to create a new stack that exports a name that already exists elsewhere, that stack creation will fail.
Avoid using hardcoded indexes such as [0]. This will lead to unpredicted query returns when you have multiple stacks. Try a more dynamic query such as the following:
aws cloudformation describe-stacks --region my_region --query "Stacks[?StackName=='my_stack_name'][].Outputs[?OutputKey=='my_output_key'].OutputValue" --output text
For your example this would be:
aws cloudformation describe-stacks --region ap-southeast-2 --query "Stacks[?StackName=='mystack'][].Outputs[?OutputKey=='Ec2Sg'].OutputValue" --output text
Take note of the [] after the stack filter. Without it, your query will not return anything.
To clarify the correct way of using list-exports:
There was a small issue with the post above; on Windows we cannot use \' or \` around the export name. The correct syntax is to use plain single quotes ' without an escape character.
Example:
Using an escaped backtick:
C:\>aws cloudformation list-exports --query "Exports[?Name==\`my-output-name4\`].Value" --no-paginate --output text
Bad value for --query Exports[?Name==\`my-output-name4\`].Value: Bad jmespath expression: Unknown token \:
Exports[?Name==\`my-output-name4\`].Value
Using an escaped single quote:
C:\>aws cloudformation list-exports --query "Exports[?Name==\'my-output-name4\'].Value" --no-paginate --output text
Bad value for --query Exports[?Name==\'my-output-name4\'].Value: Bad jmespath expression: Unknown token \:
Exports[?Name==\'my-output-name4\'].Value
And finally, the correct syntax:
C:\>aws cloudformation list-exports --query "Exports[?Name=='myexportname'].Value" --no-paginate --output text
If you get errors or unexpectedly empty output, be aware that the CLI is case sensitive. For example, if you used
--query "Exports[?Name=='myexportname'].value"
(lowercase value instead of Value), the output would be empty.
Using the Windows AWS CLI, I had to ensure the --query param was double-quoted.
aws cloudformation describe-stacks --stack-name <stack_name> --query "Stacks[0].Outputs[?OutputKey==`<key_we_want>`].OutputValue" --output text
Failing to use double quotes resulted in the query returning:
Stacks[0].Outputs[?OutputKey==].OutputValue
Not so helpful.