postCreateCommand in VSCode Remote Container Development doesn't show Azure CLI output - visual-studio-code

I'm running a bash script as a postCreateCommand when building a VSCode remote development container. The script runs the command az login. The problem is that there is no output from this command, so I cannot log in to the Azure CLI. When run manually in the terminal, it works.
devcontainer.json has this command:
"postCreateCommand": "bash .devcontainer/install.sh",
install.sh has only one line:
az login
The expected output is a prompt from the Azure CLI to log in at microsoft.com/devicelogin using a code.
The VSCode output doesn't show any Azure CLI output; it only reports that the install.sh script is being executed, then waits until the az login command hits a timeout error and finishes.

The CLI command az login is an interactive command. If you want to use it in a script without input, you need to change it to a non-interactive command. I recommend using a service principal to achieve this:
az login --service-principal -u username -p password --tenant tenantId
This CLI command executes directly without input and gives output, as long as the service principal is valid.
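For example, a minimal install.sh could read the credentials from environment variables so they never land in the repository (a sketch; the variable names AZURE_CLIENT_ID, AZURE_CLIENT_SECRET and AZURE_TENANT_ID are assumptions, set them however your setup provides secrets):
#!/usr/bin/env bash
# Hypothetical variable names - export them before the container build,
# e.g. via "containerEnv" in devcontainer.json or a local .env file.
az login --service-principal \
  -u "$AZURE_CLIENT_ID" \
  -p "$AZURE_CLIENT_SECRET" \
  --tenant "$AZURE_TENANT_ID"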

Related

gcloud compute ssh port forwarding in PowerShell

I would like to use gcloud compute ssh with port-forwarding options on Windows. When I executed the following command in the "Google Cloud SDK Shell" shortcut on the desktop, it worked.
Welcome to the Google Cloud CLI! Run "gcloud -h" to get the list of available commands.
---
C:\Program Files (x86)\Google\Cloud SDK>gcloud compute ssh instance-name --tunnel-through-iap -- -L xxxx:localhost:vvvv
C:\Program Files (x86)\Google\Cloud SDK>
But when I executed the same command in PowerShell, it failed.
PS C:\Users\xxxx> gcloud compute ssh instance-name --tunnel-through-iap -- -L xxxx:localhost:vvvv
ERROR: (gcloud.compute.ssh) unrecognized arguments:
-L
xxxx:localhost:vvvv
To search the help text of gcloud commands, run:
gcloud help -- SEARCH_TERMS
How can I pass SSH args to the gcloud command in PowerShell? I cannot use past command history in the "Google Cloud SDK Shell" shortcut, so I would like to use PowerShell (in which I can use past command history). Thanks.
I should quote the two dashes as '--' when I execute commands in PowerShell. The following command works in PowerShell, according to the link.
gcloud compute ssh instance-name --tunnel-through-iap '--' -L xxxx:localhost:vvvv
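If you prefer not to quote the dashes, PowerShell's stop-parsing token --% may also work, since it passes everything after it to the native command verbatim (a sketch, not verified against every gcloud release):
gcloud compute ssh instance-name --tunnel-through-iap --% -- -L xxxx:localhost:vvvv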

backup PostgreSQL from an Azure Container Instance

I created an Azure Container Instance and ran PostgreSQL in it, with an Azure storage account mounted. How can I run backups, possibly on a schedule?
When I run the command
az container exec --resource-group Vitalii-demo --name vitalii-demo --exec-command "pg_dumpall -c -U postgrace > dump.sql"
I get an error: code = 2 desc = oci runtime error: exec failed: container_linux.go:247: starting container process caused "exec: \"pg_dumpall -c -U postgrace > dump.sql\": executable file not found in $PATH"
I read that
Azure Container Instances currently supports launching a single process with az container exec, and you cannot pass command arguments. For example, you cannot chain commands like in sh -c "echo FOO && echo BAR", or execute echo FOO.
Perhaps there is a way to run this as a task? Thanks.
Unfortunately - and as you already mentioned - it's not possible to run commands with arguments like echo FOO, or to chain multiple commands together with &&.
https://learn.microsoft.com/en-us/azure/container-instances/container-instances-exec#run-a-command-with-azure-cli
You should be able to run an interactive shell by using --exec-command /bin/bash.
But this will not help if you want to schedule the backups programmatically.
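For reference, opening that shell would look like this (resource names reused from the question):
az container exec --resource-group Vitalii-demo --name vitalii-demo --exec-command "/bin/bash"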
pg_dumpall can also be configured by environment variables:
https://www.postgresql.org/docs/9.3/libpq-envars.html
You could launch your backup-container with the correct environment variables in order to connect your database service:
PGHOST
PGPORT
PGUSER
PGPASSWORD
With these variables set, a simple pg_dumpall should do exactly what you want.
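A sketch of launching such a backup container with the Azure CLI (the resource group is reused from the question; the image tag, host and credential values are placeholders):
az container create --resource-group Vitalii-demo --name pg-backup \
  --image postgres:11 \
  --restart-policy Never \
  --environment-variables PGHOST=<db-host> PGPORT=5432 PGUSER=postgres \
  --secure-environment-variables PGPASSWORD=<password> \
  --command-line "pg_dumpall -c"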
Hope that helps.
UPDATE:
Yikes, even when configuring the connection via environment variables, you won't be able to specify the desired output file... Sorry.
You could create your own Docker image with a pre-configured script for dumping your PostgreSQL database.
That way, you can configure the output file in your script and then simply execute the script with --exec-command dump_my_db.sh.
Keep in mind that your script has to be located somewhere in the default $PATH - e.g. /usr/local/bin.
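A minimal sketch of such a script (the mount path /mnt/backup is an assumption; the connection details come from the environment variables listed above):
#!/usr/bin/env bash
# dump_my_db.sh - hypothetical backup script baked into a custom image.
# PGHOST, PGPORT, PGUSER and PGPASSWORD must be set on the container.
pg_dumpall -c > /mnt/backup/dump.sql
In the Dockerfile, copy it into the default $PATH and make it executable:
COPY dump_my_db.sh /usr/local/bin/dump_my_db.sh
RUN chmod +x /usr/local/bin/dump_my_db.sh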

Docker Login to gcr.io in PowerShell

I'm trying to log in to Google's Container Registry on Windows 10 by using a JSON key file. I have this working without issues on my Mac, so the key file is definitely valid.
First off, I had issues getting the docker login command to accept the contents of the JSON key file. I've tried running the "set /p PASS..." command in CMD and I've tried something along the lines of this in PowerShell:
docker login -u _json_key -p "$(Get-Content keyfile.json)" https://gcr.io
These all result in either an error or this:
"docker login" requires at most 1 argument.
Since I couldn't get this to work, I ran:
docker login -u _json_key https://gcr.io
And then I just removed all line breaks from the JSON file manually, copied it to the clipboard and pasted it when prompted for my password.
That results in:
Login Succeeded
Problem solved! Right?
Well, apparently not. I was still unable to pull my images, and when I ran docker info the only registry listed was "https://index.docker.io/v1/".
I've tried starting PowerShell as admin, and restarting and resetting Docker, but nothing seems to help.
Anyone got any clue what is going on? How do I debug this?
I'm running Docker version 17.12.0 (stable).
I found a solution that works for both Windows (in PowerShell) and bash. The secret is to provide the password via stdin:
cat gcr_registry-ro.json | docker login -u _json_key --password-stdin https://gcr.io
Login Succeeded
Help text and versions:
PS C:\Users\andy> docker login --help
Usage: docker login [OPTIONS] [SERVER]
Log in to a Docker registry
Options:
  -p, --password string   Password
      --password-stdin    Take the password from stdin
  -u, --username string   Username
PS C:\Users\andy> docker -v
Docker version 18.06.1-ce, build e68fc7a
PS C:\Users\andy>
In PowerShell:
(Get-Content keyfile.json) | docker login -u _json_key --password-stdin https://gcr.io
I got it working now by using the same trick as before: removing all line breaks from the key, copying it, and pasting it when prompted for my password, instead of trying to pass it via the -p parameter.
Turns out I had to change the URL to eu.gcr.io instead of just gcr.io.
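Combining that with the stdin approach above, something like this should work in PowerShell (the registry host comes from this answer; the key file name is assumed):
(Get-Content keyfile.json) | docker login -u _json_key --password-stdin https://eu.gcr.io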
I wrote some feedback on the GCR documentation page - hopefully they'll fix it and maybe even add the correct way to do it in PowerShell.
Only thing I'm still wondering is why I can't see that I'm logged into GCR in docker info.

AWS Run Command acts differently than running on the server locally

I am having problems running commands on an EC2 Instance from my Bamboo server.
I have a command generated from Run Command in the AWS Console. I place that command in a script on my Bamboo server and run it:
aws ssm send-command --document-name "AWS-RunPowerShellScript" --targets '{\"Key\":\"tag:Name\",\"Values\":[\"Auto-Scaling-Group\"]}' --parameters '{\"commands\":[\"$fileEXE = \\\"C:\\\\Program Files (x86)\\\\NUnit\\\\NUnit.ConsoleRunner.3.7.0\\\\tools\\\\nunit3-console.exe\\\\\\\"\",\"$testDll = \\\"C:\\\\TestFramework\\\\TestFramework\\\\Tests\\\\bin\\\\Debug\\\\TESTS.dll\\\"\",\"[System.Diagnostics.Process]::Start($fileEXE,$testDll)\"]}' --comment "Run Test UI Testing" --timeout-seconds 600 --region us-east-1
It does run the tests, but it runs the chrome.exe browser AND chromedriver.exe as background processes. This throws a NoSuchWindowException because no browser window shows up...
I can run the same command in PowerShell on the instance locally (note that this is the same command I pasted into the Run Command console to generate the code mentioned above):
$fileEXE = "C:\Program Files (x86)\NUnit\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe\"
$testDll = "C:\TestFramework\TestFramework\Tests\bin\Debug\TESTS.dll"
[System.Diagnostics.Process]::Start($fileEXE,$testDll)
It works just fine: chromedriver.exe is a background process and chrome.exe (the browser) is a regular app that works like normal.
I believe my problem is how Run Command is running my test program.
What is the difference between Run Command (send-command) and running the PowerShell commands locally? Shouldn't it do the same thing?
I think there is a mix-up with the quotes and the way they're escaped.
See: How to escape a double quote inside double quotes?
This version should look much simpler:
CMD='$fileEXE = "C:\Program Files (x86)\NUnit\NUnit.ConsoleRunner.3.7.0\tools\nunit3-console.exe";'
CMD+='$testDll = "C:\TestFramework\TestFramework\Tests\bin\Debug\TESTS.dll";'
CMD+='[System.Diagnostics.Process]::Start($fileEXE,$testDll);'
aws ssm send-command --document-name "AWS-RunPowerShellScript" \
--targets "Key=tag:Name,Values=Auto-Scaling-Group" \
--comment "Run Test UI Testing" --timeout-seconds 600 --region us-east-1 \
--parameters commands="'$CMD'"
Note: Run it in the Bash shell.
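If the command still behaves differently, it helps to inspect exactly what reached the instance; a sketch using the command ID that send-command prints (the ID itself is a placeholder):
aws ssm list-command-invocations --command-id "<command-id>" --details --region us-east-1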

running a batch file on a remote machine as an administrator

I have a virtual machine which I am using as a server, and my local machine as a client.
I have a Windows batch file on the virtual machine, a.k.a. the server, which contains a series of commands.
I try to run the batch file from the client through psexec. I can access the file and execute it, but not all of the commands are executed; they need administrative privileges.
The command that I use is
psexec \\virtualmachinename -s -u domainname\username -p PASSWORD c:\foldername\batchfile.bat
NOTE 1: I cannot select the "Run as Administrator" option in the properties of the batch file. The check box is grayed out, which means I cannot select/deselect anything.
NOTE 2: I have given the user of my virtual machine full administrative privileges.
Any insight or possible solutions will be of great help.
If the account you are logging in with is an Administrator, then your command should work.
However, the first thing I would try would be to add runas /user:administrator, i.e.:
psexec \\virtualmachinename -u domainname\username -p PASSWORD cmd && runas /user:administrator && c:\foldername\batchfile.bat
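Note that PsExec also has an -h switch that runs the process with the account's elevated token (if available), which may be a simpler sketch of the same idea:
psexec \\virtualmachinename -h -u domainname\username -p PASSWORD c:\foldername\batchfile.bat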