I am trying to query the output of an AWS CLI command using an environment variable as the query string. This works fine for me in Linux, but in PowerShell I am having trouble getting the CLI to use the variable.
For example, this works for me in Linux:
SECGRP="RDP from Home"
aws ec2 describe-security-groups --query \
'SecurityGroups[?GroupName==`'"$SECGRP"'`].GroupId' --output text
If I run this in PowerShell:
$SECGRP="RDP from Home"
aws ec2 describe-security-groups --query \
'SecurityGroups[?GroupName==`'"$SECGRP"'`].GroupId' --output text
Error Details:
Bad value for --query SecurityGroups[?GroupName==`: Bad jmespath
expression: Unclosed ` delimiter:
SecurityGroups[?GroupName==`
^
I have tried a few combinations of quotes inside the query expression, but I either get errors or no output.
I have also run the following to demonstrate that I can get the correct output in PowerShell (just not with a variable):
aws ec2 describe-security-groups --query \
'SecurityGroups[?GroupName==`RDP from Home`].GroupId' --output text
In PowerShell the backtick is the escape character, so both the backtick-delimited JMESPath literal and the bash-style quote splicing fall apart. Instead, put the whole query in double quotes (so $SECGRP expands) and use JMESPath's single-quoted raw string literal for the value. Try this:
$SECGRP="RDP from Home"
aws ec2 describe-security-groups --query "SecurityGroups[?GroupName=='$SECGRP'].GroupId" --output text
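A quick way to see why this works is to print the string the CLI actually receives. Double quotes expand $SECGRP in both PowerShell and bash; the bash version of the check is shown here:

```shell
# The variable expands inside the double quotes, and the single quotes
# around it become part of the JMESPath raw string literal.
SECGRP="RDP from Home"
echo "SecurityGroups[?GroupName=='$SECGRP'].GroupId"
# prints: SecurityGroups[?GroupName=='RDP from Home'].GroupId
```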
I'm using a GitHub workflow to automate some actions for AWS. I haven't changed anything for a while, as the script had been working nicely for me. Recently I've been getting this error whenever the workflow runs: Unable to process file command 'env' successfully. I've got no idea why this is happening; any help or pointers would be greatly appreciated. Thanks. Here's the workflow which is outputting the error:
- name: "Get AWS Resource values"
id: get_aws_resource_values
env:
SHARED_RESOURCES_ENV: ${{ github.event.inputs.shared_resources_workspace }}
run: |
BASTION_INSTANCE_ID=$(aws ec2 describe-instances \
--filters "Name=tag:env,Values=$SHARED_RESOURCES_ENV" \
--query "Reservations[*].Instances[*].InstanceId" \
--output text)
RDS_ENDPOINT=$(aws rds describe-db-instances \
--db-instance-identifier $SHARED_RESOURCES_ENV-rds \
--query "DBInstances[0].Endpoint.Address" \
--output text)
echo "rds_endpoint=$RDS_ENDPOINT" >> $GITHUB_ENV
echo "bastion_instance_id=$BASTION_INSTANCE_ID" >> $GITHUB_ENV
From the query expression in your first aws cli command (Reservations[*].Instances[*].InstanceId), it looks like BASTION_INSTANCE_ID can be a multiline string: the query can match more than one instance, and with --output text each match ends up on its own line. It could also be that before you started to receive this error the command was producing a single-line string, and that changed at some point.
In GitHub Actions, multiline strings for environment variables and outputs need to be created with a different (heredoc-style) syntax.
For the bastion instance id you should set the environment variable like this:
echo "bastion_instance_id<<EOF" >> $GITHUB_ENV
echo "$BASTION_INSTANCE_ID" >> $GITHUB_ENV
echo "EOF" >> $GITHUB_ENV
The RDS endpoint should not be a problem, since DBInstances[0].Endpoint.Address selects a single value and therefore yields a single-line string.
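The heredoc syntax can be sanity-checked locally with a temp file standing in for $GITHUB_ENV (the variable name and instance ids below are made up for the demo):

```shell
# Simulate the runner: write a multiline value using the
# GITHUB_ENV heredoc syntax, then inspect the file.
GITHUB_ENV=$(mktemp)
BASTION_INSTANCE_ID="$(printf 'i-0abc123\ni-0def456')"  # two lines

{
  echo "bastion_instance_id<<EOF"
  echo "$BASTION_INSTANCE_ID"
  echo "EOF"
} >> "$GITHUB_ENV"

cat "$GITHUB_ENV"
# bastion_instance_id<<EOF
# i-0abc123
# i-0def456
# EOF
```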
I am using this CloudFormation template
The List parameter I'm trying to pass values to is:
"Subnets" : {
"Type" : "List<AWS::EC2::Subnet::Id>",
"Description" : "The list of SubnetIds in your Virtual Private Cloud (VPC)",
"ConstraintDescription" : "must be a list of at least two existing subnets associated with at least two different availability zones. They should be residing in the selected Virtual Private Cloud."
},
I've written a utility script that looks like this:
#!/bin/bash
SUBNET1=subnet-abcdef
SUBNET2=subnet-ghijlm
echo -e "\n==Deploying stack.cf.yaml===\n"
aws cloudformation deploy \
--region $REGION \
--profile $CLI_PROFILE \
--stack-name $STACK_NAME \
--template-file stack.cf.json \
--no-fail-on-empty-changeset \
--capabilities CAPABILITY_NAMED_IAM \
--parameter-overrides \
VpcId=$VPC_ID \
Subnets="$SUBNET1 $SUBNET2" \ #<---------------this fails
InstanceType=$EC2_INSTANCE_TYPE \
OperatorEMail=$OPERATOR_EMAIL \
KeyName=$KEY_NAME \
If I deploy this, after a while my stack fails to deploy saying that a Subnet with the value "subnet-abcdef subnet-ghijlmn" does not exist.
The correct way to pass values to a List parameter is to comma-separate them.
So:
#!/bin/bash
SUBNET1=subnet-abcdef
SUBNET2=subnet-ghijlm
aws cloudformation deploy --parameter-overrides Subnets="$SUBNET1,$SUBNET2"
will work
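The join itself can be checked without calling AWS at all (the subnet ids are placeholders):

```shell
# Build the comma-separated value that a List<AWS::EC2::Subnet::Id>
# parameter expects; note the $ on both variables.
SUBNET1=subnet-abcdef
SUBNET2=subnet-ghijlm
SUBNETS="$SUBNET1,$SUBNET2"
echo "$SUBNETS"
# prints: subnet-abcdef,subnet-ghijlm
```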
Tried every possible solution found online, none worked.
According to the documentation below, you should escape the comma with double backslashes. Tried that; it didn't work either.
https://docs.aws.amazon.com/cli/latest/reference/cloudformation/create-stack.html
What worked FOR ME (apparently this is very environment-dependent) was the command below, escaping the comma with just one backslash.
aws cloudformation create-stack --stack-name teste-memdb --template-body file://memorydb.yml --parameters ParameterKey=VpcId,ParameterValue=vpc-xxxx ParameterKey=SubnetIDs,ParameterValue=subnet-xxxxx\,subnet-yyyy --profile whatever
From the Documentation here
List/array values can be passed just like Python lists:
'["value1", "value2", "value3"]'
Also note that CloudFormation's CLI tooling internally uses Python.
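Whether a backslash survives depends on your shell's quoting, which may explain why results differ between environments. You can inspect what is actually handed to the CLI with printf (placeholder ids):

```shell
# Unquoted: the shell consumes the single backslash, so the CLI
# receives a plain comma.
printf '%s\n' subnet-xxxxx\,subnet-yyyy
# prints: subnet-xxxxx,subnet-yyyy

# Single-quoted: the backslash survives and reaches the CLI.
printf '%s\n' 'subnet-xxxxx\,subnet-yyyy'
# prints: subnet-xxxxx\,subnet-yyyy
```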
I'm executing gcloud composer commands:
gcloud composer environments run airflow-composer \
--location europe-west1 --user-output-enabled=true \
backfill -- -s 20171201 -e 20171208 dags.my_dag_name
which outputs: kubeconfig entry generated for europe-west1-airflow-compos-007-gke.
It's a regular Airflow backfill. The command above only prints the results at the end of the whole backfill range; is there any way to get the output in a streaming manner, so that each time a DAG run is backfilled it is printed to standard output, like with the regular Airflow CLI?
I'm trying to deploy a Cloud Function from my local terminal. For this I use the following code:
gcloud beta functions deploy networkcheck \
--region=europe-west1 \
--project=project-id \
--entry-point functionName \
--trigger-event providers/cloud.firestore/eventTypes/document.write \
--trigger-resource projects/project-id/databases/(default)/documents/test/test_id \
--runtime nodejs8
This will result in the following error:
deploy.sh: line 7: syntax error near unexpected token `('
deploy.sh: line 7: ` --trigger-resource projects/project-id/databases/(default)/documents/test/test_id \'
The script executes perfectly fine when I change '(default)' to 'default' or any other string. But then the Cloud Function will not work, because the only id that can be used for a Firestore database is '(default)', as mentioned in this post: How to find the database id of a cloud firestore project?
Is this a bug, or can I fix it somehow?
Parentheses are special characters in the bash command shell. You need to quote or escape them so they are taken literally instead of being interpreted by your shell. Here, putting the --trigger-resource value in quotes is enough to remove the parentheses' special meaning:
--trigger-resource "projects/project-id/databases/(default)/documents/test/test_id"
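You can reproduce both the failure and the fix with a plain echo; nothing here is gcloud-specific:

```shell
# Unquoted parentheses are shell syntax and trigger a syntax error:
#   echo projects/project-id/databases/(default)/documents/test/test_id
# Quoted, they are passed through literally:
echo "projects/project-id/databases/(default)/documents/test/test_id"
# prints: projects/project-id/databases/(default)/documents/test/test_id
```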
I am having problems running commands on an EC2 Instance from my Bamboo server.
I have a command generated from the Run Command feature in the AWS Console. I place that command in a script on my Bamboo server and run it:
aws ssm send-command --document-name "AWS-RunPowerShellScript" --targets '{\"Key\":\"tag:Name\",\"Values\":[\"Auto-Scaling-Group\"]}' --parameters '{\"commands\":[\"$fileEXE = \\\"C:\\\\Program Files (x86)\\\\NUnit\\\\NUnit.ConsoleRunner.3.7.0\\\\tools\\\\nunit3-console.exe\\\\\\\"\",\"$testDll = \\\"C:\\\\TestFramework\\\\TestFramework\\\\Tests\\\\bin\\\\Debug\\\\TESTS.dll\\\"\",\"[System.Diagnostics.Process]::Start($fileEXE,$testDll)\"]}' --comment "Run Test UI Testing" --timeout-seconds 600 --region us-east-1
It does run the tests. But it runs the Chrome.exe browser AND the chromedriver.exe as background processes. This throws a NoSuchWindowException because there is no browser showing up...
I can run the same command in PowerShell on the instance locally: (*Note that this is the same command I pasted into the Run Command console to generate the code mentioned above.)
$fileEXE = "C:\Program Files (x86)\NUnit\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe\"
$testDll = "C:\TestFramework\TestFramework\Tests\bin\Debug\TESTS.dll"
[System.Diagnostics.Process]::Start($fileEXE,$testDll)
It works just fine. chromedriver.exe is a background process and chrome.exe (the browser) is a regular app that works like normal.
I believe my problem is how Run Command is running my test program.
What is the difference between Run Command (send-command) and running the PowerShell commands locally? Shouldn't it do the same thing?
I think the quotes, and the way they're escaped, got mixed up along the way.
See: How to escape a double quote inside double quotes?
This version should look much simpler:
CMD='$fileEXE = "C:\Program Files (x86)\NUnit\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe";'
CMD+='$testDll = "C:\TestFramework\TestFramework\Tests\bin\Debug\TESTS.dll";'
CMD+='[System.Diagnostics.Process]::Start($fileEXE,$testDll);'
aws ssm send-command --document-name "AWS-RunPowerShellScript" \
--targets "Key=tag:Name,Values=Auto-Scaling-Group" \
--comment "Run Test UI Testing" --timeout-seconds 600 --region us-east-1 \
--parameters commands="'$CMD'"
Note: Run it in the Bash shell.
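Before sending anything to SSM, you can verify the quoting by printing what $CMD expands to (the paths below are shortened for illustration):

```shell
# Build the PowerShell script in a bash variable: the single quotes
# keep the inner double quotes and $names literal.
CMD='$fileEXE = "C:\Tools\nunit3-console.exe";'
CMD+='$testDll = "C:\Tests\TESTS.dll";'
CMD+='[System.Diagnostics.Process]::Start($fileEXE,$testDll);'

# One line, with quotes and backslashes intact, exactly as the
# remote PowerShell should receive it.
printf '%s\n' "$CMD"
```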