Delete Amazon SES template from CLI - email

How can I delete an Amazon SES template using the CLI?
I'm trying to use this command from the Amazon docs:
aws delete-template --template-name xxx
but it does not work from the CLI. When I list all available options with this command:
aws help
delete-template is not listed. I have the latest version:
aws-cli/1.15.79
https://docs.aws.amazon.com/cli/latest/reference/ses/delete-template.html

You are missing ses after aws in the command:
aws ses delete-template --template-name XXXX --region XXXXX

Try this:
aws ses delete-template --template-name templateName
Do not add the .json extension or the template_ prefix.
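To verify the result (assuming your templates live in the region you pass to the CLI; us-east-1 below is only an example), you can list the remaining templates around the delete:
aws ses list-templates --region us-east-1
aws ses delete-template --template-name templateName --region us-east-1
aws ses list-templates --region us-east-1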

Accessing Azure CLI from PowerShell in DevOps pipeline

I'm currently working on a pipeline job that requires kubernetes access through powershell.
The only issue is that I need to sign in to the Azure CLI. For testing I'm using my personal credentials, which is clearly not a good long-term option. Are there any other options for Azure CLI login that could be used instead?
I'm guessing you are working with hosted agents; therefore, you need to configure kube.config on the hosted agent.
To do that, run az aks get-credentials --name $(CLUSTER_NAME) --resource-group $(RESOURCE_GROUP_NAME). The easiest way is to use the Azure CLI task. Be aware that this task requires authorization from Azure DevOps to Azure.
More info can be found here.
In case you are the subscription owner, select your subscription and click Authorize.
Once kube.config is configured on the hosted agent, you can run any kubectl command you wish (using PowerShell/Bash/CMD).
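If the Azure CLI task with a service connection is not an option, a non-interactive alternative is to sign in with a service principal instead of personal credentials. A minimal sketch, assuming you have already created a service principal (the app ID, secret and tenant below are placeholders):
# Sign in non-interactively with a service principal (placeholder values)
az login --service-principal --username <APP_ID> --password <CLIENT_SECRET> --tenant <TENANT_ID>
# Write the cluster credentials into kube.config on the agent
az aks get-credentials --name $(CLUSTER_NAME) --resource-group $(RESOURCE_GROUP_NAME)
# Any kubectl command now works, regardless of shell
kubectl get pods --all-namespaces
Store the client secret as a secret pipeline variable rather than in the pipeline definition itself.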

Configure Terraform to connect to IBM Cloud

I am trying to connect Terraform to IBM Cloud and got confused by the
SoftLayer and IBM Cloud credentials.
I followed the instructions on the IBM sites to connect my Terraform to IBM Cloud and I am confused, because I may have to use both SL and IBM Cloud connection information like API keys etc.
I cannot run terraform init and/or plan, because some information is
missing. Now I am asked for the organization (var.org).
Sometimes I get asked for the SL credentials. Our account started
in January 2019 and I am sure I have not worked with SL at all and have only
heard of the API key from IBM Cloud.
Does someone have an example of what terraform.tfvars should look like to work
properly with IBM Cloud Kubernetes Service, VPC and classic
infrastructure?
Thank you very much.
Jan
I recommend taking a look at these two tutorials, one dealing with a LAMP stack on classic virtual servers and one with Kubernetes and other services. Both provide step-by-step instructions and guide you through the process of setting up Terraform-based deployments.
They provide the necessary code in GitHub repos. For the Kubernetes sample credentials.tfvars you only need the API key:
ibmcloud_api_key = "your api key"
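If you do not have an API key yet, one way to create it is with the IBM Cloud CLI (assuming the ibmcloud CLI is installed; the key name terraform-key is just an example):
ibmcloud login
ibmcloud iam api-key-create terraform-key
The key value is displayed only once, so copy it straight into credentials.tfvars.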
For public_key, a string containing the public key should be provided instead of a file that contains the key.
$ cat ~/.ssh/id_rsa.pub
ssh-rsa CCCde...
Then in Terraform:
resource "ibm_compute_ssh_key" "test_ssh_key" {
  # label is the display name of the key; "test_ssh_key" is just an assumed example value
  label      = "test_ssh_key"
  public_key = "ssh-rsa CCCde..."
}
Alternatively you can use a key that you created earlier:
data "ibm_compute_ssh_key" "ssh_key" {
label = "yourexistingkey"
}
resource "ibm_compute_vm_instance" "onprem_vsi" {
ssh_key_ids = ["${data.ibm_compute_ssh_key.ssh_key.id}"]
}
Here is what you will need to run an init or plan for IBM Cloud Kubernetes Service clusters with Terraform.
In your .tf file:
terraform {
  required_providers {
    ibm = {
      source = "IBM-Cloud/ibm"
    }
  }
}
provider "ibm" {
  ibmcloud_api_key      = var.ibmcloud_api_key
  iaas_classic_username = var.classic_username
  iaas_classic_api_key  = var.classic_api_key
}
In your shell, set the following environment variables:
export IBMCLOUD_API_KEY=<value of your IBM Cloud API key>
export CLASSIC_API_KEY=<value of your IBM Cloud classic (i.e. SL) API key>
export CLASSIC_USERNAME=<value of your IBM Cloud classic username>
Run your init as follows:
terraform init
Run your plan as follows:
terraform plan \
  -var ibmcloud_api_key="${IBMCLOUD_API_KEY}" \
  -var classic_api_key="${CLASSIC_API_KEY}" \
  -var classic_username="${CLASSIC_USERNAME}"
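If you prefer not to repeat the -var flags on every run, Terraform also reads variables from environment variables named TF_VAR_<name>; a sketch reusing the same values:
export TF_VAR_ibmcloud_api_key="${IBMCLOUD_API_KEY}"
export TF_VAR_classic_api_key="${CLASSIC_API_KEY}"
export TF_VAR_classic_username="${CLASSIC_USERNAME}"
terraform plan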

Using CodePipeline - deploy a CloudFormation stack to another account

I am configuring a CodePipeline in account 00000000000.
I would like to deploy a CloudFormation stack
by executing a CloudFormation template via the CodePipeline,
but in account 123456789123, not in 00000000000.
Question
How do I configure the CodePipeline action of type "Deploy" to do so?
Especially, how do I point it to the account 123456789123?
What I did so far
I assume it works via roles in account 123456789123.
I created an IAM role in account 123456789123,
with trust to the account 00000000000,
with trust to the service cloudformation.
I named it arn:aws:iam::123456789123:role/CFDep
Below is the configuration of my CodePipeline action.
I am getting the error "The role name is invalid. Check that the specified role exists and can be assumed by AWS CloudFormation." Why?
From the docs:
You cannot use the AWS CodePipeline console to create or edit a
pipeline that uses resources associated with another AWS account.
However, you can use the console to create the general structure of
the pipeline, and then use the AWS CLI to edit the pipeline and add
those resources. Alternatively, you can use the structure of an
existing pipeline and manually add the resources to it.
You can do one of the following two things:
Use the aws codepipeline CLI to edit the pipeline (a sketch follows at the end of this answer):
aws codepipeline update-pipeline --cli-input-json file://pipeline.json
OR
Create the pipeline itself using CloudFormation.
You can use this pipeline definition from the AWS reference architecture for cross-account pipelines as a starting point for your template.
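For the first option, a minimal sketch of the CLI round trip (the pipeline name is a placeholder; note that get-pipeline wraps the definition in a pipeline key and adds a metadata block that update-pipeline does not accept):
# Dump the existing pipeline definition
aws codepipeline get-pipeline --name my-pipeline > pipeline.json
# Edit pipeline.json: drop the top-level "metadata" block and reference the
# cross-account role (e.g. arn:aws:iam::123456789123:role/CFDep) in the deploy action
aws codepipeline update-pipeline --cli-input-json file://pipeline.json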

Generate multiple API keys through PowerShell in AWS

I want to create API keys for my REST API in AWS programmatically through a script.
response = client.create_api_key(
    name='string',
    description='string',
    enabled=True|False,
    generateDistinctId=True|False,
    value='string',
    stageKeys=[
        {
            'restApiId': 'string',
            'stageName': 'string'
        },
    ],
    customerId='string'
)
I found this method and am hoping the script should look something like this. Does anyone know how I can successfully create the API keys?
To do this using the CLI, I used this and it works fine:
aws apigateway create-api-key --name 'WH1-In4m-ApiKey' --description 'development.' --enabled --stage-keys restApiId='yz50sp19a7',stageName='wh1'
How do I replicate this in PowerShell for AWS?
If you have your AWS credentials correctly configured, you can use:
the AWS CLI in your script (provided you have the AWS CLI installed) to generate an API key: create-api-key
the Python Boto3 library: create-api-key
You may have to install AWS Tools for Windows PowerShell for option #1.
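Since the question asks specifically about PowerShell, one further option (assuming the AWS CLI is installed and on the PATH) is to call the same CLI command from PowerShell and parse its JSON output, reusing the values from the command above:
# Call the AWS CLI from PowerShell and capture the JSON response
$response = aws apigateway create-api-key --name 'WH1-In4m-ApiKey' --description 'development.' --enabled --stage-keys "restApiId=yz50sp19a7,stageName=wh1" --output json | ConvertFrom-Json
# The generated key id and value are fields of the response object
$response.id
$response.value
Put this in a loop over a list of names to generate multiple keys.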

How to get an AWS Service Catalog provisioned product's Event Output section details through the AWS CLI

Service Catalog uses a CloudFormation template for provisioning the products/environments.
I tried provisioning the product with the help of the AWS documentation example. In that example, AWS provides a CloudFormation template for creating an EC2 instance with public access.
I provisioned the product (I mean, created the EC2 instance), but now I need the IP address of the EC2 instance that was created through CloudFormation.
Could anyone help me with the AWS CLI command or AWS PowerShell command to get the output section of the product?
I found the answer myself.
Finally I got the IP address of the provisioned product.
# Provision the product (this launches the CloudFormation stack behind the Service Catalog product)
$newProduct = New-SCProvisionedProduct -ProvisionedProductName $productName -ProductId $productId -ProvisioningArtifactId $artifactId -ProvisionToken testToken -ProvisioningParameter @( @{key="KeyName";value="test"} )
# Look up the provisioning record, which carries the CloudFormation outputs
$envInfo = Get-SCRecord -Id $newProduct.RecordId
# The instance IP is one of the record outputs
$envIP = $envInfo.RecordOutputs[1].OutputValue
Write-Host $envIP
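For the AWS CLI side of the question, a rough equivalent (the product, artifact and record IDs below are placeholders) is to provision the product and then read the outputs from the provisioning record:
# Provision the product; the response contains a RecordDetail with a RecordId
aws servicecatalog provision-product --product-id prod-xxxx --provisioning-artifact-id pa-xxxx --provisioned-product-name MyEnv --provisioning-parameters Key=KeyName,Value=test
# Read the stack outputs (including the instance IP) from the record
aws servicecatalog describe-record --id rec-xxxx --query "RecordOutputs"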