I utilise a number of 'throwaway' servers in AWS and we're looking at trying to keep the cost of these down.
Initially, we're looking for a fairly basic 'awsec2 stop all' command to be run on a scheduled basis from a server we do know will be running 24/7.
Upon checking what AWS have documented, it appears that we need to pull in all the currently running instances, grab their IDs, and then pass those into the stop command, rather than simply stating that we want all instances to turn off.
Is there a better method of collecting these IDs, or a way to simply issue a 'stop all'?
Appreciate the help.
The AWS CLI provides built-in JSON parsing with the --query option. Additionally, you can use the --filters option to run the stop command against running instances only.
aws ec2 describe-instances \
--filters Name=instance-state-name,Values=running \
--query 'Reservations[].Instances[].InstanceId' \
--output text | xargs aws ec2 stop-instances --instance-ids
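To see exactly what xargs will execute before touching any real instances, you can prefix the final command with echo as a harmless dry run (the IDs below are stand-ins):

```shell
# Simulate the pipeline: the whitespace-separated IDs from --output text
# are handed to xargs, which appends them all to one stop-instances call.
ids="i-0aaa i-0bbb"   # stand-ins for real instance IDs
echo "$ids" | xargs echo aws ec2 stop-instances --instance-ids
# prints: aws ec2 stop-instances --instance-ids i-0aaa i-0bbb
```

Once the printed command looks right, drop the inner echo to actually stop the instances.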
This is untested, but should do the trick with AWS Tools for PowerShell:
(Get-EC2Instance) | % {$_.RunningInstance} | % {Stop-EC2Instance $_.InstanceId}
In plain English, the line above gets a collection of EC2 instance objects (Amazon.EC2.Model.Reservation), grabs the RunningInstance property of each (a collection of properties relating to the instance), and uses that to grab the InstanceId of each and stop the instance.
These functions are mapped as follows:
Get-EC2Instance -> ec2-describe-instances
Stop-EC2Instance -> ec2-stop-instances
Be sure to check out the help for Stop-EC2Instance... it has some useful parameters, like -Terminate and -Force, that you may be interested in.
This one-liner will stop all the instances (note the -r flag, so jq emits raw, unquoted IDs):
for i in $(aws ec2 describe-instances | jq -r '.Reservations[].Instances[].InstanceId'); do aws ec2 stop-instances --instance-ids $i; done
Provided:
You have the AWS CLI installed (http://aws.amazon.com/cli/)
You have the jq JSON parser installed (http://stedolan.github.io/jq/)
...and yes, the syntax above is specific to the Linux Bash shell. You can mimic the same in PowerShell on Windows and figure out a PowerShell way of parsing JSON.
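One subtlety worth spelling out: without jq's -r (raw output) flag, the instance IDs come back wrapped in double quotes, which aws ec2 stop-instances will reject. A small self-contained illustration (the JSON is a minimal stand-in for a real describe-instances response):

```shell
# A stripped-down stand-in for the describe-instances output:
json='{"Reservations":[{"Instances":[{"InstanceId":"i-0abc123"}]}]}'

echo "$json" | jq '.Reservations[].Instances[].InstanceId'
# prints: "i-0abc123"   (quoted JSON string; aws rejects this as an ID)

echo "$json" | jq -r '.Reservations[].Instances[].InstanceId'
# prints: i-0abc123     (raw string; safe to pass to --instance-ids)
```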
If anyone ever wants to do what Peter Moon described via AWS Data Pipeline:
aws ec2 describe-instances --region eu-west-1 --filters Name=instance-state-name,Values=running --query 'Reservations[].Instances[].InstanceId' --output text | xargs aws ec2 stop-instances --region eu-west-1 --instance-ids
It's basically the same command, but you have to add --region after describe-instances and after stop-instances to make it work. Watch out for the a/b/c suffix that's usually included in availability zone names: --region expects the region (e.g. eu-west-1), not an availability zone (e.g. eu-west-1a), and including the suffix here seems to cause errors.
We have installed the pg_repack extension in Cloud SQL by following the guide:
https://cloud.google.com/sql/docs/postgres/extensions#pg_repack
The installation of the extension works fine and it shows up in the list of extensions when running \dx.
We then want to invoke the extension, but it is unclear from where this should be done. The docs just say to "run the command":
pg_repack -h <hostname> -d testdb -U csuper1 -k -t t1
We can't find anywhere in our project where this command can be invoked, though. Do we have to set up a Compute Engine instance for this, or is there some other way?
We only use Cloud Run for running our code at the moment and would like to keep things as small/simple as possible.
Our solution: we built a Docker image that wrapped pg_repack with an HTTP interface, and then deployed it as a Cloud Run service. This enabled us to invoke pg_repack periodically using Cloud Scheduler.
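For illustration, wiring Cloud Scheduler to such a service could look roughly like this (a sketch only: the job name, schedule, service URL, request path, and service account are all placeholders for your own setup):

```shell
# Trigger the hypothetical pg_repack wrapper service every Sunday at 03:00,
# authenticating to Cloud Run with an OIDC token:
gcloud scheduler jobs create http repack-t1 \
  --schedule="0 3 * * 0" \
  --uri="https://pg-repack-service-xyz.a.run.app/repack?table=t1" \
  --oidc-service-account-email="scheduler@my-project.iam.gserviceaccount.com"
```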
I don't know whether this is a relevant question or not. I have one CSV file and one shapefile on my own drive, and I used to run a script in cmd that combined these two files and stored the result in PostgreSQL using pgfuttur. I want to do the same thing in AWS. If I keep these two files in a bucket, is it possible to do the same with the command below that I used in cmd?
shp2pgsql -I "<Shape file directory>" <Tablename> | psql -U <username> -d <DatabaseName>
Example: shp2pgsql -I "C:\Test\filep" public.geo | psql -U postgres -d gisDB
If yes, please help me get this working. If no, please let me know the reason. [Please note that I am new to AWS]
You can do it in two ways:
If you plan to do it only once or a few times: download the files locally using the AWS CLI (aws s3 cp or aws s3 sync) and then pass those files as input.
If you will be accessing multiple files: use another AWS service to expose your S3 objects as files. Check AWS Storage Gateway and choose AWS Storage Gateway for Files. Once configured, you can refer to the S3 objects as files.
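For the first option, a sketch of the flow on a Linux host (the bucket name and paths are hypothetical; note that a shapefile is not a single file, so the .shx and .dbf sidecars must be downloaded alongside the .shp):

```shell
# Fetch the shapefile plus its required sidecar files from S3.
# (.prj is optional but carries the projection, so grab it if present.)
for ext in shp shx dbf prj; do
  aws s3 cp "s3://my-bucket/shapes/filep.$ext" .
done

# Then the same pipeline as on Windows, just with local Linux paths:
shp2pgsql -I filep.shp public.geo | psql -U postgres -d gisDB
```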
When downloading kubernetes and attempting to deploy a cluster using -
export KUBERNETES_PROVIDER=YOUR_PROVIDER; wget -q -O - https://get.k8s.io | bash
I get an error message that states - govc: network 'VM Network' not found.
VM Network is a valid network in my environment and I currently have VMs on it, so I know that it works.
I have tried using other networks and haven't had any luck. Any help would be appreciated.
Thanks!
With all your other GOVC_* environment variables set (things like username, password, etc.), try:
govc ls -l
This should give you a set of resources, one of which should look like:
/<your-datacenter>/network
After that, running:
govc ls -l /<your-datacenter>/network
should give you a set of one or more network resources. Choose one and run:
export GOVC_NETWORK='/<your-datacenter>/network/<your-network>'
then try kube-up again.
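For reference, a typical set of GOVC_* variables looks something like the following (every value here is a placeholder for your own environment; GOVC_INSECURE is only needed for self-signed vCenter certificates):

```shell
export GOVC_URL='https://vcenter.example.com/sdk'
export GOVC_USERNAME='administrator@vsphere.local'
export GOVC_PASSWORD='<your-password>'
export GOVC_INSECURE=1                 # skip TLS verification (self-signed cert)
export GOVC_DATACENTER='<your-datacenter>'
export GOVC_NETWORK='/<your-datacenter>/network/VM Network'
```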
How can I update the gcloud components programmatically within a shell script?
Calling gcloud components update requires user input, e.g.:
$ gcloud components update
The following components will be installed:
--------------------------------------------
| kubectl (Linux, x86_64) | 1.0.1 | 4.5 MB |
--------------------------------------------
For the latest release notes, please visit:
https://dl.google.com/dl/cloudsdk/release/RELEASE_NOTES
Do you want to continue (Y/n)?
I can't find an argument for gcloud to enforce the update.
You're looking for the --quiet flag.
From gcloud --help:
--quiet, -q
Disable all interactive prompts when running gcloud commands. If input
is required, defaults will be used, or an error will be raised.
This is generally a flag you'll want for non-interactive contexts.
You may also set the CLOUDSDK_CORE_DISABLE_PROMPTS environment variable to a non-empty value:
export CLOUDSDK_CORE_DISABLE_PROMPTS=1
gcloud components update # This works for all gcloud commands
If you encounter this problem while running a gcloud command on a Continuous Integration (CI) server, one thing you can try is running with a Docker image that already contains the components you need, so you avoid having to update gcloud components at all.
For example, if you're trying to run gcloud beta firebase test android run, you could use the image: google/cloud-sdk:latest because at https://github.com/GoogleCloudPlatform/cloud-sdk-docker it shows :latest contains gcloud Beta Commands.
I tested this on Gitlab hosted CI (.gitlab-ci.yml) and it worked.
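As a quick local sanity check before wiring the image into CI (assuming Docker is available on your machine), you can list what the image actually ships with:

```shell
# Show the gcloud components installed in the image:
docker run --rm google/cloud-sdk:latest gcloud components list
```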
I have been trying to apply my startup scripts to new Windows instances on Google Compute Engine as described here, however when I check the instances there is no trace of them ever being executed. Here is the gcloud command I am running:
gcloud compute instances create "my-instance" ^
--project "my-project" ^
--zone "us-central1-a" ^
--machine-type "g1-small" ^
--network "default" ^
--metadata "gce-initial-windows-user=my-user" "gce-initial-windows-password=my-pass" ^
--maintenance-policy "MIGRATE" ^
--scopes "storage-ro" ^
--tags "http-server" "https-server" ^
--image "https://www.googleapis.com/compute/v1/projects/windows-cloud/global/images/windows-server-2008-r2-dc-v20150110" ^
--boot-disk-type "pd-standard" ^
--boot-disk-device-name "my-instance" ^
--metadata-from-file sysprep-oobe-script-ps1=D:\Path\To\startup.ps1
I tried using all 3 startup script types (sysprep-specialize-script-ps1, sysprep-oobe-script-ps1, windows-startup-script-ps1) but none worked. I can't see any indication in the Task Scheduler or Event Viewer either. The file exists on my system and does work when I run it manually. How can I get this working?
A good way to debug Powershell scripts is to have them write to the serial console (COM1). You'll be able to see the output of the script from GCE's serial port output.
gcloud compute instances get-serial-port-output my-instance --zone us-central1-a
If there's no script you'll see something like:
Calling oobe-script from metadata.
attributes/sysprep-oobe-script-bat value is not set or metadata server is not reachable.
attributes/sysprep-oobe-script-cmd value is not set or metadata server is not reachable.
attributes/sysprep-oobe-script-ps1 value is not set or metadata server is not reachable.
Running schtasks with arguments /run /tn GCEStartup
--> SUCCESS: Attempted to run the scheduled task "GCEStartup".
-------------------------------------------------------------
Instance setup finished. windows is ready to use.
-------------------------------------------------------------
Booting on date 01/25/2015 06:26:26
attributes/windows-startup-script-bat value is not set or metadata server is not reachable.
attributes/windows-startup-script-cmd value is not set or metadata server is not reachable.
attributes/windows-startup-script-ps1 value is not set or metadata server is not reachable.
Make sure that the contents of the ps1 file are actually attached to the instance:
gcloud compute instances describe my-instance --zone us-central1-a --format json
The JSON dump should contain the powershell script within it.
Lastly, a great way to debug Powershell startup scripts is to write the output to the serial console.
You can print log messages and see them in the Google Developer Console > Compute > Compute Engine > VM Instances > (Instance Name). Then scroll to the bottom and click the expand option for "Serial console".
Function Write-SerialPort ([string] $message) {
    $port = New-Object System.IO.Ports.SerialPort COM1,9600,None,8,One
    $port.Open()
    $port.WriteLine($message)
    $port.Close()
}
Write-SerialPort ("Testing GCE Startup Script")
This command worked for me; I had to make sure that the script was written in ASCII. The PowerShell ISE saves files with a different encoding that breaks gcloud compute.
gcloud compute instances create testwin2 --zone us-central1-a --metadata-from-file sysprep-oobe-script-ps1=testconsole.ps1 --image windows-2008-r2
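If you want to check or fix the encoding from a Linux machine before passing the script to gcloud, iconv works well. A self-contained demo (file names are arbitrary; the first line just simulates a script saved in UTF-16, as the ISE tends to do):

```shell
# Simulate a PowerShell-ISE-saved script (UTF-16LE encoding):
printf 'Write-Host "hello"\n' | iconv -f ASCII -t UTF-16LE > startup-utf16.ps1

# Convert it to plain ASCII so gcloud attaches the text correctly:
iconv -f UTF-16LE -t ASCII startup-utf16.ps1 > startup-ascii.ps1

cat startup-ascii.ps1
# prints: Write-Host "hello"
```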