How to update an ECS task definition using the AWS CLI - amazon-ecs

I need to change the environment variable under container definition in the ecs Task Definition.
TASK_DEFINITION=$(aws ecs describe-task-definition --task-definition $TASKDEFINITION_ARN --output json)
echo $TASK_DEFINITION | jq '.taskDefinition.containerDefinitions[0] | ( .environment[] |= if
.name == "ES_PORT_9200_TCP_ADDR" then .value = "vpc-kkslke-shared-3-abcdkalssdfy.us-east-1.es.amazonaws.com"
else . end)' | jq -s . >container-definition.json
CONTAINER_DEF=$(<$container-definition.json)
aws ecs register-task-definition --family $FAMILY_NAME --container-definitions $CONTAINER_DEF
Error Message:
Error parsing parameter '--container-definitions': Invalid JSON:
Expecting property name enclosed in double quotes: line 1 column 2
(char 1) JSON received: {
One observation, not sure if it is related to a bug in VS Code or not: when I try to view the variable value in debug mode, I only get partial text. But when I echo the same variable, I do see the full JSON. I am not sure whether the whole value is being passed to the container definition.
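For reference, this parse error typically comes from the shell word-splitting the JSON when the variable is passed unquoted, so the CLI only receives the leading {. A minimal sketch of the register step with the variable quoted (assuming container-definition.json holds the JSON array produced by the jq filter above):
CONTAINER_DEF=$(<container-definition.json)   # read the file contents; note there is no $ in front of the file name
aws ecs register-task-definition --family "$FAMILY_NAME" --container-definitions "$CONTAINER_DEF"   # quotes keep the JSON as one argument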

Related

How to pass values from csv into az cli deployment as parameters

I don't have much experience with PowerShell and this simple issue has been driving me up the wall. I'm hoping someone can point me in the right direction.
I have a CSV-file with IP-range values
I wish to pass these IP values as a parameter to a Bicep template
The parameter is of type array, see code snippets below
CSV-file:
IP,Comment
10.0.0.1, Comment blabla
10.0.0.52, Comment more blabla
I wish to pass the IP values into an Azure Bicep template with the following parameter:
param ipArray array
The cli command is as follows:
az deployment group validate -g test-rg -f .\main.bicep -p ipArray=$ipRange
I am unable to populate $ipRange properly. I have tested the following and know it works:
az deployment group validate -g test-rg -f .\main.bicep -p ipArray="['10.0.0.1','10.0.0.52']"
So I need to figure out how to build my PowerShell variable according to the above syntax.
$ipRange = ((Get-Content .\ip_list.csv) | ConvertFrom-Csv).IP
Failed to parse string as JSON:
10.0.0.1 10.0.0.52
Error detail: Extra data: line 1 column 6 (char 5)
Any nudge in the right direction will be greatly appreciated
Thanks!
This code will convert the IP range as you asked for:
$ipRange = ((Get-Content C:\Temp\ip.csv) | ConvertFrom-Csv).IP | ConvertTo-Json
$ipRange = $ipRange.ToString() -replace '"',"'"
$ipRange
One final thing: in your param it is mentioned as vlkIpArray, and in the deployment it is mentioned as ipArray. Is this a typo?
param vlkIpArray array and
az deployment group validate -g test-rg -f .\main.bicep -p ipArray="['10.0.0.1','10.0.0.52']"
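Putting the pieces together, a sketch of the full flow (assuming the CSV sits at .\ip_list.csv as in the question; ConvertTo-Json -Compress keeps the array on one line):
$ipRange = ((Get-Content .\ip_list.csv) | ConvertFrom-Csv).IP | ConvertTo-Json -Compress   # ["10.0.0.1","10.0.0.52"]
$ipRange = $ipRange -replace '"', "'"                                                       # ['10.0.0.1','10.0.0.52']
az deployment group validate -g test-rg -f .\main.bicep -p ipArray="$ipRange"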

Not able to set a github environment variable in github actions

My workflow is using a Windows runner.
The workflow runs an AWS CLI command to get instance IDs. I then try to set the returned IDs as a GitHub environment variable to use in future steps.
Here is the snippet from my workflow file:
- name: Get IDs
id: get_instances
run: |
$getinfo = aws ec2 describe-instances --filters 'Name=tag:Name,Values=project-bell' --output text --query 'Reservations[*].Instances[*].InstanceId' --region "eu-west-1"
$instances = ($getinfo -Join " ")
echo $instances
echo "INSTANCE_ID=$instances" >> $GITHUB_ENV
- name: Print IDs
run: |
echo {{ env.INSTANCE_ID }}
echo "{{ env.INSTANCE_ID }}"
The first line in the run command of the "Get IDs" step returns the IDs as an array, so $getinfo looks like this:
instanceID 1
instanceID 2
On the second line I use -Join to get the IDs returned as a single line: instanceID 1 instanceID 2
I then try to set this single line as a GitHub environment variable, but it isn't working and I am not sure why.
When I echo $instances I can see that the -Join has worked and the IDs are printed as a single line: instanceID 1 instanceID 2.
But in the "Print IDs" step, {{ env.INSTANCE_ID }} is returned rather than the instanceID 1 instanceID 2 which I tried to set as a GitHub environment variable in the previous step.
Notes:
I have two echos in the "Print IDs" step as I was just testing whether I needed quotation marks for it to work.
I have the -Join because when I tried to set $getinfo (from the 1st run command) as a GitHub env var, I got an error saying it was an array.
Question
Am I writing the command wrong to set github env?
Or is it something to do with the values themselves - like does it need to be a string?
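For what it's worth, two things commonly trip this up on a Windows runner, where the run steps execute in PowerShell: the env file is exposed as $env:GITHUB_ENV rather than $GITHUB_ENV, and workflow expressions need a leading $. A sketch of the usual pattern (not tested against this exact workflow):
echo "INSTANCE_ID=$instances" >> $env:GITHUB_ENV   # PowerShell: $env:GITHUB_ENV, not $GITHUB_ENV
echo "${{ env.INSTANCE_ID }}"                      # in a later step; note the leading $ before {{ ... }}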

Bicep/ARM XML parameter as string not parsed correctly

In my Bicep file, I have a parameter:
@secure()
param securityToken string
This parameter is an XML value provided as a string from an environment variable, not read from a file. I have trouble correctly setting this parameter from my YML pipeline. I have this piece of code:
az deployment sub create --template-file main.bicep --parameters securityToken='<xml>'
Unfortunately, this returns an error:
< was unexpected at this time.
Ok, so I somehow have to escape this parameter? I also tried:
az deployment sub create --template-file main.bicep --parameters securityToken='<xml>'
But this gives the following error:
The system cannot find the file specified.
So my question is:
How can I provide an XML string parameter to my Bicep / ARM deployment?
Thanks!
Deployments to subscriptions need the --location <location> parameter, which is missing in the examples above.
When I run the following on Windows, Git Bash, the deployment succeeds.
az deployment sub create --location eastus --template-file main.bicep --parameters securityToken='<xml>'
Result
main.bicep
targetScope = 'subscription'
@secure()
param securityToken string
output exposedSecret string = securityToken
So I found the problem. Apparently, the < and > need to be escaped, and I finally found out how: putting them between " characters. So the value "<"xml">" did the trick. Now my working code is this:
$token = '<xml></xml>'
$token = $token.Replace('<', '"<"')   # = "<"xml>"<"/xml>
$token = $token.Replace('>', '">"')   # = "<"xml">""<"/xml">"
$token = $token.Replace('""', '" "')  # the double "" only escapes the " itself, so add an extra space: "<"xml">" "<"/xml">"
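The escaped token is then passed the same way as before (a sketch; the exact quoting needed depends on which shell the pipeline step uses):
az deployment sub create --location eastus --template-file main.bicep --parameters securityToken=$token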

Cloud Foundry: How do I get the contents of the VCAP_SERVICES environment variable? (and only this variable!)

When I deploy an app to Cloud Foundry and attach it to instances of Cloud Foundry services,
and I use the Cloud Foundry CLI to get the environment variables: cf env my-app,
then I get an output like:
Getting env variables for app my-app in org my-org / space my-space as user@company.com...
System-Provided:
VCAP_SERVICES: {
"service1": [
// ...
],
"service2": [
// ...
]
}
VCAP_APPLICATION: {
// ...
}
User-Provided:
VARIABLE1: value
VARIABLE2: value
Running Environment Variable Groups:
CREDHUB_API: https://credhub.company.com
No staging env variables have been set
How do I filter this output to get only the contents of the environment variable VCAP_SERVICES, so that when I test/debug my app locally, it behaves as if it was attached to the instances of the Cloud Foundry services?
My goal is to write a file named default-env.json containing only:
{
VCAP_SERVICES: {
"service1": [
// ...
],
"service2": [
// ...
]
}
}
Ideally, the command to produce this output should be a zsh one-liner.
cf env my-app | sed -n '/VCAP_SERVICES/,/VCAP_APPLICATION/p' | sed '$d' | sed '1s;^;{\n;' | sed '$s/$/}/' > default-env.json
Explanation
sed -n '/VCAP_SERVICES/,/VCAP_APPLICATION/p'
keeps only the section between the regular expressions VCAP_SERVICES and VCAP_APPLICATION.
sed '$d' deletes the last line (the line containing VCAP_APPLICATION).
sed '1s;^;{\n;' prepends {\n to the first line.
sed '$s/$/}/' appends } to the end of the file.
Credits
Handy one-liners for SED
BASH Prepend A Text / Lines To a File
SED: insert text after the last line?
Another option would be:
cf curl "/v2/apps/$(cf app --guid my-super-cool-app)/env" | jq -r '.system_env_json.VCAP_SERVICES'
Explanation:
$(cf app --guid <your-app-name>) will run in a subshell and get the app guid for your app. You could alternatively just replace that bit with the guid for your app, if you know it already (it'll make the command faster).
cf curl "/v2/apps/<guid>/env" will return all of the env variables for your app.
jq -r '.system_env_json.VCAP_SERVICES' picks out the bit you want.
You could optionally redirect output to a file.
Other interesting bits from that API:
.application_env_json.VCAP_APPLICATION would give you VCAP_APPLICATION.
'.environment_json' would give you any env variables you've set.
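To get exactly the default-env.json shape asked for in the question (a top-level VCAP_SERVICES key), the same call can wrap the value with jq's object construction, e.g. this one-liner sketch:
cf curl "/v2/apps/$(cf app --guid my-app)/env" | jq '{VCAP_SERVICES: .system_env_json.VCAP_SERVICES}' > default-env.json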

Getting Outputs from aws cloudformation describe-stacks

I am using the below to get the stack information I want via the AWS CLI:
aws cloudformation --region ap-southeast-2 describe-stacks --stack-name mystack
It's returning result OK:
{
"Stacks": [
{
"StackId": "arn:aws:mystackid",
"LastUpdatedTime": "2017-01-13T04:59:17.472Z",
"Tags": [],
"Outputs": [
{
"OutputKey": "Ec2Sg",
"OutputValue": "sg-97e13dff"
},
{
"OutputKey": "DbUrl",
"OutputValue": "myUrl"
}
],
"CreationTime": "2017-01-13T03:27:18.893Z",
"StackName": "mystack",
"NotificationARNs": [],
"StackStatus": "UPDATE_ROLLBACK_COMPLETE",
"DisableRollback": false
}
]
}
But I do not know how to return only the value of OutputValue, which is myUrl, as I do not need the rest.
Is that possible via aws cloudformation describe-stacks?
Edit
I just realized I can use --query:
--query "Stacks[0].Outputs[1].OutputValue"
will get exactly what I want, but I would like to select by DbUrl instead; otherwise, if the number of Outputs changes, my result will be unexpected.
I got the answer, use the below:
--query 'Stacks[0].Outputs[?OutputKey==`DbUrl`].OutputValue' --output text
Or
--query 'Stacks[?StackName==`mystack`][].Outputs[?OutputKey==`DbUrl`].OutputValue' --output text
While querying works, it may prove problematic if you have multiple stacks. Realistically, you should probably be leveraging exports for things that are distinct and authoritative.
By way of example - if you modified your CloudFormation snippet to look like this:
"Outputs" : {
"DbUrl" : {
"Description" : "My Database Url",
"Value" : "myUrl",
"Export" : {
"Name" : "DbUrl"
}
}
}
Then you could use:
aws cloudformation list-exports --query "Exports[?Name==\`DbUrl\`].Value" --no-paginate --output text
to retrieve it. Exports are required to be unique - only one stack can export any given name. This way, you're assured that you get the right value, every time. If you attempt to create a new stack that exports a name that already exists elsewhere, that stack creation will fail.
Avoid using hardcoded indexes such as [0]. This will lead to unpredictable query results when you have multiple stacks. Try a more dynamic query such as the following:
aws cloudformation describe-stacks --region my_region --query "Stacks[?StackName=='my_stack_name'][].Outputs[?OutputKey=='my_output_key'].OutputValue" --output text
For your example this would be:
aws cloudformation describe-stacks --region ap-southeast-2 --query "Stacks[?StackName=='mystack'][].Outputs[?OutputKey=='Ec2Sg'].OutputValue" --output text
Take note of []. Without it, your query will not return anything.
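As a usage note, the same query drops straight into a shell variable for later steps (a sketch, using the DbUrl output from the question):
DB_URL=$(aws cloudformation describe-stacks --region ap-southeast-2 --query "Stacks[?StackName=='mystack'][].Outputs[?OutputKey=='DbUrl'].OutputValue" --output text)
echo "$DB_URL"   # myUrl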
To clarify the correct way of using list-exports:
There was a small issue in the post above: on Windows we cannot use \` or \' around the output name. The correct syntax is to use single quotes ' without an escape character.
Example:
Using an escaped backtick:
C:\>aws cloudformation list-exports --query "Exports[?Name==\`my-output-name4\`].Value" --no-paginate --output text
Bad value for --query Exports[?Name==\`my-output-name4\`].Value: Bad jmespath expression: Unknown token \:
Exports[?Name==\`my-output-name4\`].Value
Using an escaped single quote:
C:\>aws cloudformation list-exports --query "Exports[?Name==\'my-output-name4\'].Value" --no-paginate --output text
Bad value for --query Exports[?Name==\'my-output-name4\'].Value: Bad jmespath expression: Unknown token \:
Exports[?Name==\'my-output-name4\'].Value
And finally, the correct syntax:
C:\>aws cloudformation list-exports --query "Exports[?Name=='myexportname'].Value" --no-paginate --output text
If you get errors or unexpectedly empty output, be aware that the CLI query is case sensitive. For example, if you used
--query "Exports[?Name=='myexportname'].value"
The output would be empty.
Using the Windows AWS CLI, I had to ensure the --query param was double quoted.
aws cloudformation describe-stacks --stack-name <stack_name> --query "Stacks[0].Outputs[?OutputKey==`<key_we_want>`].OutputValue" --output text
Failing to use double quotes resulted in the query returning:
Stacks[0].Outputs[?OutputKey==].OutputValue
Not so helpful.