AWS Run Command acts differently than running on the server locally - PowerShell

I am having problems running commands on an EC2 Instance from my Bamboo server.
I have a command generated by Run Command in the AWS Console. I place that command in a script on my Bamboo server and run it:
aws ssm send-command --document-name "AWS-RunPowerShellScript" --targets '{\"Key\":\"tag:Name\",\"Values\":[\"Auto-Scaling-Group\"]}' --parameters '{\"commands\":[\"$fileEXE = \\\"C:\\\\Program Files (x86)\\\\NUnit\\\\NUnit.ConsoleRunner.3.7.0\\\\tools\\\\nunit3-console.exe\\\\\\\"\",\"$testDll = \\\"C:\\\\TestFramework\\\\TestFramework\\\\Tests\\\\bin\\\\Debug\\\\TESTS.dll\\\"\",\"[System.Diagnostics.Process]::Start($fileEXE,$testDll)\"]}' --comment "Run Test UI Testing" --timeout-seconds 600 --region us-east-1
It does run the tests, but it starts chrome.exe AND chromedriver.exe as background processes. This throws a NoSuchWindowException because no browser window ever shows up...
I can run the same command in PowerShell on the instance locally. (Note: this is the same command I pasted into the Run Command console to generate the code mentioned above.)
$fileEXE = "C:\Program Files (x86)\NUnit\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe\"
$testDll = "C:\TestFramework\TestFramework\Tests\bin\Debug\TESTS.dll"
[System.Diagnostics.Process]::Start($fileEXE,$testDll)
It works just fine. chromedriver.exe is a background process and chrome.exe (the browser) is a regular app that works like normal.
I believe my problem is how Run Command is running my test program.
What is the difference between Run Command (send-command) and running the PowerShell commands locally? Shouldn't it do the same thing?

I think the problem is a mix-up with the quotes and the way they're escaped.
See: How to escape a double quote inside double quotes?
This version should be much simpler:
CMD='$fileEXE = "C:\Program Files (x86)\NUnit\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe";'
CMD+='$testDll = "C:\TestFramework\TestFramework\Tests\bin\Debug\TESTS.dll";'
CMD+='[System.Diagnostics.Process]::Start($fileEXE,$testDll);'
aws ssm send-command --document-name "AWS-RunPowerShellScript" \
--filters "Name=tag:Name,Values=Auto-Scaling-Group" \
--comment "Run Test UI Testing" --timeout-seconds 600 --region us-east-1 \
--parameters commands="'$CMD'"
Note: Run it in the Bash shell.
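To check what actually reached the instance, you can inspect the invocation afterwards (a sketch; the command ID comes from the send-command output):
aws ssm list-command-invocations --command-id "<command-id>" --details --region us-east-1
The Output field of the invocation shows exactly what PowerShell received and printed, which makes quoting problems like this much easier to spot.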

gcloud compute ssh port forwarding in powershell

I would like to use gcloud compute ssh with port-forwarding options on Windows. When I executed the following command in the "Google Cloud SDK Shell" desktop shortcut, it worked.
Welcome to the Google Cloud CLI! Run "gcloud -h" to get the list of available commands.
---
C:\Program Files (x86)\Google\Cloud SDK>gcloud compute ssh instance-name --tunnel-through-iap -- -L xxxx:localhost:vvvv
C:\Program Files (x86)\Google\Cloud SDK>
But when I executed the same command in PowerShell, it failed.
PS C:\Users\xxxx> gcloud compute ssh instance-name --tunnel-through-iap -- -L xxxx:localhost:vvvv
ERROR: (gcloud.compute.ssh) unrecognized arguments:
-L
xxxx:localhost:vvvv
To search the help text of gcloud commands, run:
gcloud help -- SEARCH_TERMS
How can I pass SSH args to the gcloud command in PowerShell? I could not use past command history in the "Google Cloud SDK Shell" shortcut, so I would like to use PowerShell (where command history works). Thanks.
Quoting the two dashes as '--' fixes it when executing commands in PowerShell. The following command works in PowerShell, according to this link.
gcloud compute ssh instance-name --tunnel-through-iap '--' -L xxxx:localhost:vvvv
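Another option that should work (an untested sketch; --% is PowerShell's stop-parsing token, which passes everything after it to the native command verbatim):
gcloud compute ssh instance-name --tunnel-through-iap --% -- -L xxxx:localhost:vvvv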

postCreateCommand in VSCode Remote Container Development doesn't show Azure Cli output

I'm running a bash script as postCreateCommand when building a VS Code remote development container. In this bash script the command is az login. The problem is that there is no output from this command, and therefore I cannot log in to the Azure CLI. When run manually in the terminal, it works.
devcontainer.json has this command:
"postCreateCommand": "bash .devcontainer/install.sh",
install.sh has only one line:
az login
The expected behavior is for the Azure CLI to ask me to log in at microsoft.com/devicelogin using a code.
The VS Code output doesn't show any Azure CLI output; it only says that the install.sh script is being executed, then waits until the az login command times out and finishes.
The CLI command az login is interactive. If you want to use it in a script without input, you need to change it into a non-interactive command. I recommend using a service principal to achieve this:
az login --service-principal -u username -p password --tenant tenantId
This command executes directly, without prompting for input, and gives the output as long as the service principal is valid.
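In the postCreateCommand scenario, install.sh could then look like this (a sketch; AZ_APP_ID, AZ_PASSWORD and AZ_TENANT_ID are placeholder names for values you would inject into the container's environment, not standard variables):
#!/usr/bin/env bash
# Log in non-interactively; the three variables must be set in the container's environment.
az login --service-principal -u "$AZ_APP_ID" -p "$AZ_PASSWORD" --tenant "$AZ_TENANT_ID"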

backup postgresql from azure container instance

I created an Azure Container Instance and ran PostgreSQL in it, with an Azure storage account mounted. How can I set up backups, possibly on a schedule?
When I run the command
az container exec --resource-group Vitalii-demo --name vitalii-demo --exec-command "pg_dumpall -c -U postgrace > dump.sql"
I get an error: error: code = 2 desc = oci runtime error: exec failed: container_linux.go:247: starting container process caused "exec: \"pg_dumpall -c -U postgrace > dump.sql\": executable file not found in $PATH"
I read that
Azure Container Instances currently supports launching a single process with az container exec, and you cannot pass command arguments. For example, you cannot chain commands like in sh -c "echo FOO && echo BAR", or execute echo FOO.
Perhaps there is a way to run this as a task? Thanks.
Unfortunately - and as you already mentioned - it's not possible to run commands with arguments like echo FOO, or to chain multiple commands together with &&.
https://learn.microsoft.com/en-us/azure/container-instances/container-instances-exec#run-a-command-with-azure-cli
You should be able to run an interactive shell by using --exec-command /bin/bash.
But this will not help if you want to schedule the backups programmatically.
pg_dumpall can also be configured by environment variables:
https://www.postgresql.org/docs/9.3/libpq-envars.html
You could launch your backup-container with the correct environment variables in order to connect your database service:
PGHOST
PGPORT
PGUSER
PGPASSWORD
With these variables set, a simple pg_dumpall should do exactly what you want.
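For example, a one-off backup container could be launched like this (a sketch; the container name, image and host are assumptions, and the password should come from a secure source rather than the command line):
az container create --resource-group Vitalii-demo --name pg-backup \
  --image myregistry/pg-backup:latest \
  --environment-variables PGHOST=mydbhost PGPORT=5432 PGUSER=postgres \
  --secure-environment-variables PGPASSWORD=mypassword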
Hope that helps.
UPDATE:
Yikes, even when configuring the connection via environment variables, you won't be able to specify the desired output file... Sorry.
You could create your own Docker image with a pre-configured script for dumping your PostgreSQL database.
Doing it that way, you can configure the output-file in your script and then simply execute the script with --exec-command dump_my_db.sh.
Keep in mind that your script has to be located somewhere in the default $PATH - e.g. /usr/local/bin.
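A minimal sketch of such an image (the base image, mount path and script contents are assumptions):
# Dockerfile
FROM postgres:11
COPY dump_my_db.sh /usr/local/bin/dump_my_db.sh
RUN chmod +x /usr/local/bin/dump_my_db.sh
And dump_my_db.sh itself, picking up the connection details from the PG* environment variables described above:
#!/bin/sh
# Write the dump to the mounted storage account (adjust the path to your mount point).
pg_dumpall -c > /mnt/backup/dump.sql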

Running an interactive container with PowerShell

I'm trying to set up an automated build container on Windows (host and guest). Right now I'm having problems executing a simple PowerShell session inside the container. I've done the following:
Created this Dockerfile:
# escape=`
FROM microsoft/windowsservercore
SHELL ["cmd", "/S", "/C"]
CMD ["powershell.exe", "-NoLogo", "-ExecutionPolicy", "Bypass"]
Executed this build command:
docker build -t test:latest .
Started the docker with this command:
docker run test
The PowerShell prints this and the container exits:
PS C:\>
D:\repo\docker\Teste
Tried again with this command:
docker start d05ee -ai
The PowerShell prints the same output:
PS C:\>
D:\repo\docker\Teste
I want to use the container interactively at first, to validate the tools I will install on it, but I'm not able to do that. I don't know what error is blocking me, and that is my question.
Obs1: Running powershell.exe with the same parameters in a Windows cmd session works fine.
Obs2: I've based my Dockerfile on the one in this tutorial.
Obs3: Running this works fine:
docker run -it microsoft/windowsservercore powershell -NoLogo -ExecutionPolicy Bypass
Therefore I presume the problem is on the image generation.
You need to run your container with the -it switch. This will make your container interactive, so you can poke around:
docker run -it test
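Why this matters: -i keeps STDIN open and -t allocates a pseudo-TTY. Without -i, PowerShell sees end-of-file on its input and exits right after printing its prompt, which matches the behavior described above. The long-form equivalent:
docker run --interactive --tty test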

AWS CLI - Query output using an environment variable in PowerShell

I am trying to query the output of an AWS CLI command using an environment variable as the query string. This works fine for me with the AWS CLI on Linux, but in PowerShell I am having trouble getting the CLI to use the variable.
For example - this works for me in Linux:
SECGRP="RDP from Home"
aws ec2 describe-security-groups --query \
'SecurityGroups[?GroupName==`'"$SECGRP"'`].GroupId' --output text
If I run this in PowerShell:
$SECGRP="RDP from Home"
aws ec2 describe-security-groups --query \
'SecurityGroups[?GroupName==`'"$SECGRP"'`].GroupId' --output text
Error Details:
Bad value for --query SecurityGroups[?GroupName==`: Bad jmespath
expression: Unclosed ` delimiter:
SecurityGroups[?GroupName==`
^
I have tried a few combinations of quotes inside the query expression but either get errors or no output.
I have also run the following to demonstrate that I can get the correct output using PowerShell (but not using a variable):
aws ec2 describe-security-groups --query \
'SecurityGroups[?GroupName==`RDP from Home`].GroupId' --output text
Try this:
$SECGRP="RDP from Home"
aws ec2 describe-security-groups --query "SecurityGroups[?GroupName=='$SECGRP'].GroupId" --output text
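This works because PowerShell expands $SECGRP inside the double-quoted argument, and JMESPath accepts single-quoted raw string literals in place of the backtick-quoted ones. If you'd rather keep the backticks, escaping them should also work (the backtick is PowerShell's escape character, so each one must be doubled):
$SECGRP="RDP from Home"
aws ec2 describe-security-groups --query "SecurityGroups[?GroupName==``$SECGRP``].GroupId" --output text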