Terraform outputs 'Error: Variables not allowed' when passing list variable by command line

My variable.tf file is:
variable "users" {
  type = list
}
My output-list.tf file is:
output "print-users" {
  value = "First user is ${var.users[0]}\nSecond user is ${var.users[1]}"
}
When I run terraform plan and provide the input interactively as ["Abhishek","Arya"], it gives the expected output.
But when I run terraform plan -var 'users=["Abhishek","Arya"]', it gives the following error.
Maybe I'm making a silly mistake but I can't find it; any help would be appreciated.
Thank you :)

terraform plan -var "users=[\"Abhishek\",\"Arya\"]" works. So it comes down to the difference in how Windows interprets quoting and escaping on the command line.
Thank you @martin-atkins
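If the shell escaping gets unwieldy, another option (my own suggestion, not from the answer above) is to sidestep the shell entirely with a variables file, which Terraform reads without any quoting involved:

```hcl
# terraform.tfvars -- picked up automatically by `terraform plan`,
# so no command-line quoting or escaping is needed
users = ["Abhishek", "Arya"]
```

A file named terraform.tfvars (or *.auto.tfvars) in the working directory is loaded automatically; a differently named file can be passed with -var-file.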

Related

Azure CLI PowerShell - how to pass list query results (array) as inline parameters?

In Azure CLI with PowerShell, I'm trying to list all the existing Key Vault secret names and then pass the array in as an inline parameter to the az deployment group create command.
However, the command doesn't seem to like the format of a JSON array, so I get this error: ERROR: Failed to parse string as JSON.
Checking the docs, it seems to want something like "['value1','value2']", but my query result is ["t1","t2"], so it throws the error.
This surprised me, since the query result format isn't supported natively by Azure CLI. How can I convert my query results to the right format?
$formattedOutput = @()
foreach ($line in $secretNamesArray) {
    $formattedOutput += "'" + $line + "'"
}
$existingSecretNames = "[" + ($formattedOutput -join ",") + "]"
Figured it out: you have to transform it... what a wonderful job, MS...
Also remember to wrap the result in double quotes, like:
--parameters existingSecretNames="$existingSecretNames"
so it passes in something like "['t1','t2','t3']"
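For reference, the reformatting the loop above performs can be sketched in Python (the function name is my own, not part of Azure CLI):

```python
import json

def to_arm_array_literal(names):
    """Turn a list of secret names into the "['t1','t2']"-style
    string that `az deployment group create` accepts inline."""
    return "[" + ",".join("'" + n + "'" for n in names) + "]"

# A JSON query result like the one in the question
secret_names = json.loads('["t1", "t2", "t3"]')
print(to_arm_array_literal(secret_names))  # → ['t1','t2','t3']
```

The essential change is swapping the JSON double quotes for single quotes around each element; the surrounding double quotes are then added by the shell when the value is passed to --parameters.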

Power Automate Cloud - OneDrive When a file is created Trigger Condition

I'm using the OneDrive when a file is created trigger in my flow. I want to set a trigger condition that will only trigger if the file type is .xlsx
I tried the following condition:
@contains(triggerOutputs()?['headers/x-ms-file-name-encoded'], 'xlsx')
I get the following error:
InvalidTemplate. Unable to process template language expressions for trigger 'When_a_file_is_created' at line '1' and column '37225': 'The template language function 'endswith' expects its first parameter to be of type string. The provided value is of type 'Null'. Please see https://aka.ms/logicexpressions#endswith for usage details.'.
I did some research online and it appears to be due to the OneDrive "File name" dynamic content being base64-encoded, but I wasn't able to find any solutions for working around this issue.
Any help is greatly appreciated.
Thanks
You can use the base64ToString function to turn it into a readable string.
Try a trigger condition expression like the one below:
@endswith(base64ToString(triggerOutputs()?['headers/x-ms-file-name-encoded']), '.xlsx')
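What that expression does, illustrated in Python (the file name here is an invented example):

```python
import base64

# The x-ms-file-name-encoded header carries the file name base64-encoded,
# which is why a plain endswith() on it fails.
encoded = base64.b64encode("Report Q3.xlsx".encode("utf-8")).decode("ascii")

# Equivalent of base64ToString(...) followed by endswith(..., '.xlsx')
decoded = base64.b64decode(encoded).decode("utf-8")
print(decoded.endswith(".xlsx"))  # → True
```

Decoding first restores the human-readable file name, after which the extension check behaves as expected.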

How to set a cloudformation parameter to a powershell variable

I'm trying to simply run some PowerShell in my CloudFormation template based on a user input parameter.
This works:
write-host ${CFParameter} >> C:\temp\log.txt
but this does not:
$PSVariable = ${CFParameter}
write-host $PSVariable >> C:\temp\log.txt
The second one just returns a blank line, but the first one returns the correct information.
If your PowerShell is being run from UserData, you can use the Ref function to refer to the parameter. I would also recommend cloudkast, an online CloudFormation template generator that makes it easy to generate templates.
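One thing worth checking (my own suggestion, not from the answer above): Fn::Sub performs plain text substitution before PowerShell ever runs, so once the value is dropped in, $PSVariable = somevalue may no longer be valid PowerShell unless the value is quoted. A sketch, assuming the script lives in UserData under !Sub:

```yaml
# Hypothetical UserData fragment: quoting the substitution keeps the
# assignment valid PowerShell after ${CFParameter} is replaced.
UserData:
  Fn::Base64: !Sub |
    <powershell>
    $PSVariable = "${CFParameter}"
    Write-Host $PSVariable >> C:\temp\log.txt
    </powershell>
```

Without the quotes, a parameter value containing spaces or special characters is parsed as a command rather than a string, which would explain the blank line.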

How to reference a DAG's execution date inside of a `KubernetesPodOperator`?

I am writing an Airflow DAG to pull data from an API and store it in a database I own. Following best practices outlined in We're All Using Airflow Wrong, I'm writing the DAG as a sequence of KubernetesPodOperators that run pretty simple Python functions as the entry point to the Docker image.
The problem I'm trying to solve is that this DAG should only pull data for the execution_date.
If I were using a PythonOperator (doc), I could use the provide_context argument to make the execution date available to the function. But judging from the KubernetesPodOperator's documentation, the Kubernetes operator has no argument that does what provide_context does.
My best guess is that you could use the arguments parameter to pass in the date, and since it's templated, you can reference it like this:
my_pod_operator = KubernetesPodOperator(
    # ... other args here
    arguments=['python', 'my_script.py', '{{ ds }}'],
)
You'd then read the date like any other argument provided to a Python script, via sys.argv.
Is this the right way of doing it?
Thanks for the help.
Yes, that is the correct way of doing it.
Each operator has template_fields; every parameter listed in template_fields can render Jinja2 templates and Airflow macros.
For KubernetesPodOperator, if you check the docs, you'll find:
template_fields = ['cmds', 'arguments', 'env_vars', 'config_file']
which means you can pass '{{ ds }}' to any of the four params listed above.
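Putting it together, the entry-point script can parse the rendered date from sys.argv. A minimal sketch (my_script.py comes from the example above; the function name is my own):

```python
import sys
from datetime import date, datetime

def parse_execution_date(argv):
    """Return the execution date passed as the first CLI argument.

    Airflow renders '{{ ds }}' to a YYYY-MM-DD string before the pod
    starts, so by the time this runs, argv[1] is a plain date string."""
    return datetime.strptime(argv[1], "%Y-%m-%d").date()

# Inside the pod, sys.argv looks like ['my_script.py', '2023-05-04']
print(parse_execution_date(["my_script.py", "2023-05-04"]))  # → 2023-05-04
```

Parsing to a datetime.date early gives a clear error if the argument is missing or malformed, rather than passing a raw string deeper into the pipeline.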

Why is my PowerShell Base64 string truncated to 36 characters?

I'm using PowerShell inside a VSTS build for querying the VSTS API, using a PAT for authentication.
However, I see it failing when I generate the auth string. Here is my code:
$VstsAccessEmail = $Env:VstsAccessEmail
$VstsAccessToken = $Env:VstsAccessToken
$pair = "${VstsAccessEmail}:${VstsAccessToken}"
$base64 = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($pair))
Write-Host $base64
Looking at the output, I see only the first 36 characters of the actual auth string. When I generate the string using PowerShell on my machine, I get the entire 108 characters. For the time being I have hard-coded it, to test it in the steps that follow the above code.
$base641 = "a2FuZ2thbi5nb3N3YW1pQHVuaXN5cy5jb206cHh4b25oaHBlNmtjb3g3aTRhdHZxMzdoNms2ZnpuNHhyaWhyZ2ozdGZ3ejRlNmxxxxXXXX=="
if ($base64.Length -ne 108) {
    $base64 = $base641
}
Write-Host "base64 is: $base64 "
This works correctly. Initially I thought it might be an issue with writing to the host. However, if I invoke the REST method without swapping in the hard-coded string, I get 401 Unauthorized.
Please help.
UPDATE:
I found the issue: I set VstsAccessToken as a secret in the build variables, so the value was not coming through. Making it unsecured works fine.
Can someone help with how this can be done while keeping the token a secret?
SOLVED
Using $Env:variable does not give access to the value when the variable is secret. However, passing it as a parameter to the PowerShell script lets the code read it, though the output in the log is masked.
I see one thing here that's definitely wrong, and another that's possibly wrong.
$VstsAccessEmail = $Env:VstsAccessEmail
$VstsAccessToken = $Env:VstsAccessToken
$pair = "${VstsAccessEmail}:${VstsAccessToken}"
$base64 = [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($pair))
$pair = "${VstsAccessEmail}:${VstsAccessToken}" should be $pair = ":${VstsAccessToken}". No email address is necessary, just a colon and then the auth token.
UTF8.GetBytes may be wrong. I always use ASCII.GetBytes.
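The suggested :${token} pairing, sketched in Python for reference (the token here is a placeholder, not a real PAT):

```python
import base64

def basic_auth_header(token):
    """Build the Authorization header value for a VSTS/Azure DevOps PAT.

    The username is left empty: only ':' + token is base64-encoded."""
    pair = ":" + token
    encoded = base64.b64encode(pair.encode("ascii")).decode("ascii")
    return "Basic " + encoded

# Placeholder token for illustration
print(basic_auth_header("pxxonhhpe6kcox7i4atvq"))
```

The resulting string goes into the Authorization request header when calling the API.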