Pre-populated list of availability zone options as a parameter for a CloudFormation template - aws-cloudformation

I would like to present a list of availability zones as the options for a parameter in a CloudFormation template, scoped to the current region in the console, preferably using the Troposphere Python module.
I see that I would be calling Fn::GetAZs to build the list of AllowedValues to use as options for the parameter, but I am wondering whether there is an existing example of this, or whether it is possible at all.
So something like this:
template.add_parameter('AZs', AllowedValues= call Fn::GetAZs here somehow ... )

I ended up picking the AZs with this instead of using a parameter for the user to choose from.
AvailabilityZone=Select(0, GetAZs(Ref("AWS::Region"))), ...
It is in the docs - https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/intrinsic-function-reference-getavailabilityzones.html
The equivalent template object looks like this:
"mySubnet" : {
"Type" : "AWS::EC2::Subnet",
"Properties" : {
"VpcId" : {
"Ref" : "VPC"
},
"CidrBlock" : "10.0.0.0/24",
"AvailabilityZone" : {
"Fn::Select" : [
"0",
{
"Fn::GetAZs" : ""
}
]
}
}
}
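For reference, a rough Troposphere equivalent of that subnet looks something like this (assuming, as in the JSON above, a VPC resource or parameter named "VPC" exists in the template):

from troposphere import GetAZs, Ref, Select, Template
from troposphere.ec2 import Subnet

template = Template()

# Pick the first availability zone returned for the stack's region at deploy time
template.add_resource(Subnet(
    "mySubnet",
    VpcId=Ref("VPC"),
    CidrBlock="10.0.0.0/24",
    AvailabilityZone=Select(0, GetAZs(Ref("AWS::Region"))),
))

print(template.to_json())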

Related

Resolver to filter a non-scalar type in AppSync / Amplify

We are using AWS Amplify. This is my type:
type Package @model {
  id: ID!
  desc: String!
  company: Company! @connection
  servicetype: ServiceType! @connection
  price: Float!
  active: Boolean!
  createdAt: AWSDateTime
  updatedAt: AWSDateTime
}
Amplify does not generate a filter option for listPackage that allows filtering on servicetype. My understanding is that you need to add a custom query and resolver for this. I have added a query listPackageByServiceType but am confused about the resolver and cannot get the filter to work correctly.
Is there a similar example of code I can follow?
If you have a secondary index on your DynamoDB table based on that field, serviceType, you can do this with a Query on that index.
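For example, if such an index exists, the Query request mapping template might look roughly like this (the index name and attribute name here are assumptions, not taken from your schema):
{
    "version" : "2017-02-28",
    "operation" : "Query",
    "index" : "byServiceType",
    "query" : {
        "expression" : "#servicetype = :servicetype",
        "expressionNames" : {
            "#servicetype" : "packageServicetypeId"
        },
        "expressionValues" : {
            ":servicetype" : $util.dynamodb.toDynamoDBJson($ctx.args.servicetype)
        }
    }
}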
Otherwise, the way to go is probably a scan with a filter on that field. See here for more: https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference-dynamodb.html#aws-appsync-resolver-mapping-template-reference-dynamodb-scan
From that link, the scan mapping template would look like this (note that depending on what you have, pagination might be necessary):
{
    "version" : "2017-02-28",
    "operation" : "Scan",
    "index" : "fooIndex",
    "filter" : {
        "expression" : "filter expression",
        "expressionNames" : {
            "#name" : "name"
        },
        "expressionValues" : {
            ":value" : ... typed value
        }
    }
}

Is there a way to update/add a new widget to a CloudWatch dashboard using a CloudFormation template?

I am trying to add Lambda function stats to my dashboard using CloudFormation, but the problem is that the Lambda function is created in a different stack from my dashboard, and that stack is created after the one that contains the dashboard. So is there a way to update the dashboard after it has been created using CloudFormation?
Could you try updating the other CloudFormation stack right afterwards, just in case, and see if it helps? By the way, if you want to try creating CloudFormation templates, there is an online tool available called cloudkast, an online AWS CloudFormation template generator.
In CloudFormation, when referencing a resource in another stack, you use the intrinsic function Fn::ImportValue in your dashboard template. See the documentation.
A good example of this can be found in the AWS Knowledge Center - https://aws.amazon.com/premiumsupport/knowledge-center/cloudformation-reference-resource/
{
    "Parameters" : {
        "NetworkStackNameParameter" : {
            "Type" : "String"
        }
    },
    "Resources" : {
        "WebServerInstance" : {
            "Type" : "AWS::EC2::Instance",
            "Properties" : {
                "InstanceType" : "t2.micro",
                "ImageId" : "ami-a1b23456",
                "NetworkInterfaces" : [{
                    "GroupSet" : [{"Fn::ImportValue" : {"Fn::Sub" : "${NetworkStackNameParameter}-SecurityGroupID"}}],
                    "AssociatePublicIpAddress" : "true",
                    "DeviceIndex" : "0",
                    "DeleteOnTermination" : "true",
                    "SubnetId" : {"Fn::ImportValue" : {"Fn::Sub" : "${NetworkStackNameParameter}-SubnetID"}}
                }]
            }
        }
    }
}
The SubnetId above is pulled in from another stack using Fn::ImportValue.
The same can be done for the Lambda function when building a dashboard with CloudFormation.
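For example, a rough sketch of a dashboard resource that imports the function name, assuming the Lambda stack exports it under a hypothetical name like LambdaStackName-FunctionName:
"MyDashboard" : {
    "Type" : "AWS::CloudWatch::Dashboard",
    "Properties" : {
        "DashboardName" : "lambda-dashboard",
        "DashboardBody" : {
            "Fn::Sub" : [
                "{\"widgets\":[{\"type\":\"metric\",\"x\":0,\"y\":0,\"width\":12,\"height\":6,\"properties\":{\"metrics\":[[\"AWS/Lambda\",\"Invocations\",\"FunctionName\",\"${FunctionName}\"]],\"stat\":\"Sum\",\"period\":300,\"region\":\"${AWS::Region}\",\"title\":\"Lambda invocations\"}}]}",
                { "FunctionName" : { "Fn::ImportValue" : "LambdaStackName-FunctionName" } }
            ]
        }
    }
}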

How MongoClient::save(...) might change the _id field of document parameter

I have a class User that embeds a JsonObject to represent the user's fields. The class looks like this:
class User {
    private JsonObject data;

    public User(...) {
        data = new JsonObject();
        data.put("...", ...).put(..., ...);
    }

    public String getID() { return data.getString("_id"); }

    // more getters, setters

    // DB access methods
    public static void userSave(MongoClient mc, User user) {
        // some house keeping
        mc.save("users", user.jsonObject(), ar -> {
            if (ar.succeeded()) { ... } else { ... }
        });
    }
}
I've just spent more than half a day trying to figure out why a call to user.getID() sometimes produced the following error: ClassCastException: class io.vertx.core.json.JsonObject cannot be cast to class java.lang.CharSequence. I narrowed it down to the userSave() method, and more specifically to MongoClient::save(), which has a side effect that transforms data._id from something like
"_id" : "5ceb8ebb9790855fad9be2fc"
into something like
"_id" : {
"$oid" : "5ceb8ebb9790855fad9be2fc"
}
This is confirmed by the Vert.x documentation, which states that "This operation might change _id field of document parameter". This is actually also true for other write methods, such as insert.
I came up with two solutions and a few questions about doing the save() properly while keeping the _id field up to date.
S1 One way to achieve this is to save a copy of the JsonObject rather than the object itself, in other words: mc.save("users", user.jsonObject().copy(), ar -> {...});. This might be expensive in the long run.
S2 Another way is to "remember" the _id and then reinsert it into the data object in the if(ar.succeeded()) {data.put("_id", oidValue); ...} section (a sketch of this is shown after the questions below). But as we are asynchronous, I don't think the interval between save() and data.put(...) is atomic, is it?
Q1: Solution S1 makes the assumption that the ID doesn't change, i.e., the string 5ceb8ebb9790855fad9be2fc will not change. Do we have a guarantee of this?
Q2: What is the right way to implement userSave() properly?
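A rough sketch of what I mean by S2 (with S1 noted as the one-line alternative in a comment), not necessarily correct, which is exactly what Q2 is about:

// Sketch of S2: remember the original _id and put it back once save() completes.
// (S1 would instead pass user.jsonObject().copy() to save() and leave data untouched.)
public static void userSave(MongoClient mc, User user) {
    JsonObject data = user.jsonObject();
    String oidValue = data.getString("_id");   // the plain String id, if any
    mc.save("users", data, ar -> {
        if (ar.succeeded()) {
            if (oidValue != null) {
                data.put("_id", oidValue);     // undo the {"$oid": ...} rewrite
            }
        } else {
            // handle failure
        }
    });
}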
EDIT: The configuration JSON object used for the creation of the MongoClient is as follows (in case there is something wrong):
"main_pool" : {
"pool_name" : "mongodb",
"host" : "localhost",
"port" : 27017,
"db_name" : "appdb",
"username" : "xxxxxxxxx",
"password" : "xxxxxxxxx",
"authSource" : "admin",
"maxPoolSize" : 5,
"minPoolSize" : 1,
"useObjectId" : true,
"connectTimeoutMS" : 5000,
"socketTimeoutMS" : 5000,
"serverSelectionTimeoutMS" : 5000
}

I want to add a custom tag defined in DocuSign to my document through a REST API call in Apex

I want to add the AccountName custom tag defined in DocuSign to my document through a REST API call in Apex. Here is my REST API request body:
{
    "status" : "sent",
    "customFields" : {
        "textCustomFields" : [ {
            "name" : "AccountName",
            "show" : "true",
            "required" : "False",
            "value" : "Test Account",
            "customFieldType" : "text"
        } ]
    }
}
The URL is https://demo.docusign.net/restapi/v2/accounts/'accountId'/envelopes
I use the anchor string /txtAccountName1/, which I have added to my document, but it does not map to any value of the custom field AccountName related to the Salesforce object. For the tabs it works fine: it successfully maps the signer tab to s1 and the date to d1, but it does not map the AccountName custom tag to the anchor string /txtAccountName1/. I have created the custom tag AccountName related to the Salesforce object and used the anchor string /txtAccountName{r}/. I am writing the code in a sandbox and using a DocuSign demo account for the integration. I am not sure whether the name and value I used in textCustomFields are the reason I am not getting the required result, though the REST API request returns success.
How can I map the AccountName value to the anchorString defined in my document?
You are mixing two things: custom tabs and custom fields. CustomFields are metadata on an envelope; there is no tab for them. You can send text-type or list-type custom fields, but these fields are not visible to a signer/recipient and are sent only as metadata on the envelope. Details about CustomFields are available at https://docs.docusign.com/esign/restapi/Envelopes/EnvelopeCustomFields/create/
Now for custom tabs: if you have already defined an account-level custom tab named "AccountName", then you can add it using the REST API with a call like this:
{
    "textTabs" : [ {
        "tabLabel" : "AccountName",
        "documentId" : "83644555",
        "recipientId" : "84066562",
        "pageNumber" : 1,
        "value" : "AccountName",
        "anchorString" : "/txtAccountName1/"
    } ]
}
So before using the account-level custom tab, you need to create it in your DocuSign account from the web app or using the API - https://docs.docusign.com/esign/restapi/CustomTabs/CustomTabs/create/. Only once it has been created can you use it in an envelope.
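For context, the textTabs array above normally sits under a signer inside the envelope request; a rough sketch of the surrounding body (the email, name, and ids here are placeholders, not taken from your setup) would be:
{
    "status" : "sent",
    "documents" : [ ... your document ... ],
    "recipients" : {
        "signers" : [ {
            "email" : "signer@example.com",
            "name" : "Signer Name",
            "recipientId" : "1",
            "tabs" : {
                "textTabs" : [ {
                    "tabLabel" : "AccountName",
                    "value" : "Test Account",
                    "anchorString" : "/txtAccountName1/"
                } ]
            }
        } ]
    }
}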
As far as I can tell, you can't pull in the custom tag definition. You need to define the entire tag every time you use it, which means you'll need to use something like this:
"textCustomFields" : [ {
"name" : "AccountName",
"show" : "true",
"required" : "False",
"value" : "Test Account",
"anchorString": "/txtAccountName1/"
"customFieldType" : "text"
} ]

React Component iterate/loop properties of Mongo object

I have a Mongo collection that stores albums with predefined "slots" for their images, and I am a bit stuck on whether there is a way to loop over the properties of a document in order to display the images in separate divs.
I used this code for mapping over the album covers and it worked great:
albums() {
  return Albums.find().fetch();
}

{this.albums().map((album) => {
  return <div key={album._id}><img src={album.cover} /></div>
})}
But now I would like your help: is it possible to loop over photoOne, photoTwo, etc., and skip/not display the data if it is empty, as in photoThree for example?
{
    "_id" : "CHMHbNWWwZGaLGvB6",
    "title" : "Text",
    "cover" : "link",
    "createdAt" : date,
    "photoOne" : {
        "titleOne" : "Text",
        "coverOne" : "link"
    },
    "photoTwo" : {
        "titleTwo" : "Text",
        "coverTwo" : "link"
    },
    "photoThree" : {
        "titleThree" : "",
        "coverThree" : ""
    }
}
I'm not a Mongo user, but in the map function you can check for the existing values, and handle it there. Something like (there is surely a cleaner way, though):
this.albums().map((album) => {
  for (const key in album) {
    if (key.startsWith('photo')) {
      // first key of the photo sub-object, e.g. "titleOne"
      const title = Object.keys(album[key])[0];
      if (album[key][title].length != 0) {
        console.log("Can use: " + title);
      }
    }
  }
})
Results in:
Can use: titleOne
Can use: titleTwo
Hope that helps, but it seems like with the photoOne, photoTwo naming you are limiting the number of photos you can use and forcing the use of Object.keys to get the values out (rather than simply using album.photoOne, album.photoTwo, etc.).
If the album photos were stored in an embedded array of photo documents, you could just include the photos and titles that exist and avoid having to check for empty ones. You would just loop through the photos that are present, if that makes sense.
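A rough sketch of that idea, assuming a hypothetical photos array field (not the schema you currently have):

// Hypothetical document shape:
// { _id, title, cover, photos: [ { title: "Text", cover: "link" }, ... ] }
{this.albums().map((album) => (
  <div key={album._id}>
    <img src={album.cover} />
    {(album.photos || [])
      .filter((photo) => photo.cover)   // skip entries without an image
      .map((photo, index) => (
        <div key={index}><img src={photo.cover} alt={photo.title} /></div>
      ))}
  </div>
))}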