AWS velocity template - How to discern between string or other - aws-api-gateway

In Amazon API Gateway I'm using a body mapping template to transform the request. I found that keeping track of the commas was cumbersome (especially with multiple optional parameters), so I came up with the following:
{
  "context": { /* context params */ },
  "request": {
  #foreach($queryParam in $input.params().querystring.keySet())
    "$queryParam" : "$input.params().querystring.get($queryParam)"
    #if($foreach.hasNext),#end
  #end
  }
}
The issue I find with this is that when $input.params().querystring.get($queryParam) is an integer (and shouldn't be enclosed in quotes), this doesn't work. That seems fair enough, but how do I improve this to check whether $input.params().querystring.get($queryParam) is a string, so that I can wrap only strings in quotation marks?
Request
http://www.somewebsite.com/apiendpoint?id=4&name=Terry&aliases=[Tel,Terry]
Transformation
{
  "id": "4",
  "name": "Terry",
  "aliases": "[Tel,Terry]"
}
Expected Transformation
{
  "id": 4,
  "name": "Terry",
  "aliases": ["Tel","Terry"]
}

You can test each value against a numeric regex and only quote it when it isn't a number. Then you would do something like:
{
  "context": { /* context params */ },
  "request": {
  #foreach($queryParam in $input.params().querystring.keySet())
    #set($value = $input.params().querystring.get($queryParam))
    #set($isNum = $value.matches('[-+]?\d+(\.\d+)?'))
    "$queryParam" : #if(!$isNum)"#end$value#if(!$isNum)"#end
    #if($foreach.hasNext),#end
  #end
  }
}
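Note that this still leaves aliases as the literal string "[Tel,Terry]". As a rough extension (only a sketch, relying on the Java String methods that VTL exposes, such as startsWith, endsWith, substring, split and trim), you could branch on bracketed values as well:
{
  "context": { /* context params */ },
  "request": {
  #foreach($queryParam in $input.params().querystring.keySet())
    #set($value = $input.params().querystring.get($queryParam))
    #if($value.startsWith('[') && $value.endsWith(']'))
      ## treat [a,b,c] as a JSON array of strings
      #set($end = $value.length() - 1)
      #set($items = $value.substring(1, $end).split(','))
      "$queryParam" : [#foreach($item in $items)"$item.trim()"#if($foreach.hasNext),#end#end]
    #elseif($value.matches('[-+]?\d+(\.\d+)?'))
      "$queryParam" : $value
    #else
      "$queryParam" : "$value"
    #end
    #if($foreach.hasNext),#end
  #end
  }
}
The extra whitespace doesn't matter as long as the final output is valid JSON.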

Related

Azure Data Factory Copy Data activity - Use variables/expressions in mapping to dynamically select correct incoming column

I have the below mappings for a Copy activity in ADF:
"translator": {
"type": "TabularTranslator",
"mappings": [
{
"source": {
"path": "$['id']"
},
"sink": {
"name": "TicketID"
}
},
{
"source": {
"path": "$['summary']"
},
"sink": {
"name": "TicketSummary"
}
},
{
"source": {
"path": "$['status']['name']"
},
"sink": {
"name": "TicketStatus"
}
},
{
"source": {
"path": "$['company']['identifier']"
},
"sink": {
"name": "CustomerAccountNumber"
}
},
{
"source": {
"path": "$['company']['name']"
},
"sink": {
"name": "CustomerName"
}
},
{
"source": {
"path": "$['customFields'][74]['value']"
},
"sink": {
"name": "Landlord"
}
},
{
"source": {
"path": "$['customFields'][75]['value']"
},
"sink": {
"name": "Building"
}
}
],
"collectionReference": "",
"mapComplexValuesToString": false
}
The challenge I need to overcome is that the array indexes of the custom fields in the last two sources might change, so I've created an Azure Function which calculates the correct array index. However, I can't work out how to use the Azure Function's output value in the source path string. I have tried to refer to it using an expression like @activity('Get Building Field Index').output, but since the mapping expects a JSON path, this doesn't work and produces an error:
JSON path $['customFields'][@activity('Get Building Field Index').output]['value'] is invalid.
Is there a different way to achieve what I am trying to do?
Thanks in advance
I have a slightly similar scenario that you might be able to work with.
First, I have a JSON file that is emitted, which I then access in Synapse/ADF with a Lookup activity.
Next, I have a ForEach activity that runs a Copy Data activity.
The ForEach activity receives my Lookup output and makes my JSON usable, by setting the following in the ForEach's Settings:
@activity('Lookup').output.firstRow.childItems
My JSON roughly looks as follows:
{"childItems": [
{"subpath": "path/to/folder",
"filename": "filename.parquet",
"subfolder": "subfolder",
"outfolder": "subfolder",
"origin": "A"}]}
This means that in my Copy Data activity inside the ForEach activity, I can access the properties of my JSON like so:
@item()['subpath']
@item()['filename']
@item()['folder']
.. etc
Edit:
Adding some screen caps of the parameterization:
https://i.stack.imgur.com/aHpWk.png
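Coming back to the original question (injecting the Azure Function result into the JSON path), one option worth trying is to supply the whole translator as dynamic content on the Copy activity's mapping and concatenate the index into the path. This is only a sketch: the output property name fieldIndex is an assumption that depends on what your Azure Function actually returns, and the remaining static mappings would need to be concatenated into the same string:
@json(concat('{"type":"TabularTranslator","mappings":[',
  '{"source":{"path":"$[''customFields''][', string(activity('Get Building Field Index').output.fieldIndex), '][''value'']"},"sink":{"name":"Building"}}',
  ']}'))
If that gets unwieldy, another approach is to have the function (or a preceding Set variable activity) emit the complete translator JSON and simply pass it to the mapping with @json(...).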

API Management API Schema-Definition create - Multiple definitions under 1 schema at a time

I am trying to use the REST API PUT call to
https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ApiManagement/service/{serviceName}/apis/{apiName}/schemas/{schemaId}?api-version=2021-01-01-preview
as the equivalent PowerShell cmdlet doesn't function as expected for adding schema definitions. The problem is that even with the REST API call, only one definition can be added at a time. If my schema has more than one definition, the second and subsequent PUT calls overwrite the previously written definition, so in the end only one definition remains. I tried adding an If-Match header to the second and subsequent calls too, but in vain.
I also tried adding multiple definitions under "schemas" as a JSON array, but even though that creates multiple definitions in one go, the definition names come out as 0, 1, 2, 3, etc. instead of the actual names given in the input JSON body.
Multiple Definition Sample below -
"properties": {
"contentType": "application/vnd.oai.openapi.components+json",
"document": {
"components": {
"schemas": [
{
"Definition1": {
"type": "object",
"properties": {
"String1": {
"type": "string"
}
}
},
"Definition2": {
"type": "object",
"properties": {
"String2": {
"type": "integer"
}
}
}
}
]**
}
}
}
}
Does the PUT call allow adding multiple definitions at once, and if so, how?
Found the issue in the JSON being PUT in the REST API request.
The multiple-definition JSON has to be like this:
{
  "properties": {
    "contentType": "application/vnd.oai.openapi.components+json",
    "document": {
      "components": {
        "schemas": {
          "Definition1": {
            "type": "object",
            "properties": {
              "String1": {
                "type": "string"
              }
            }
          },
          "Definition2": {
            "type": "object",
            "properties": {
              "String2": {
                "type": "integer"
              }
            }
          }
        }
      }
    }
  }
}
The definitions given under "schemas" should not be wrapped in []. Just follow the JSON structure above and you should be good.
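For reference, a minimal PowerShell sketch of the PUT call itself (the token acquisition step and the file name are assumptions):
# Acquire an ARM token, e.g. via Get-AzAccessToken (assumed to be available in the session)
$token = (Get-AzAccessToken -ResourceUrl "https://management.azure.com/").Token
$uri = "https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ApiManagement/service/{serviceName}/apis/{apiName}/schemas/{schemaId}?api-version=2021-01-01-preview"

# Body is the multiple-definition JSON shown above, saved to a file
$body = Get-Content -Raw -Path .\schema-definitions.json

Invoke-RestMethod -Method Put -Uri $uri -ContentType "application/json" `
  -Headers @{ Authorization = "Bearer $token" } -Body $body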

Unable to parse template language expression 'encodeURIComponent([parameters('table_storage_name')])'

Hey, I am doing a CI/CD deployment for a Logic App. I have a table storage where I store some data, with separate tables for the test and prod environments. I created a parameter called "table_storage_name" in the ARM template:
"parameters": {
// ....
"connections_azuretables_1_externalid": {
"defaultValue": "/subscriptions/e5..../resourceGroups/myrg.../providers/Microsoft.Web/connections/azuretables-1",
"type": "String"
},
"table_storage_name": {
"defaultValue": "testdevops",
"type": "String"
}
}
The error occurs where I reference the parameter in the template.json file:
// ...
"Insert_Entity": {
  "runAfter": {
    "Initialize_variable": [
      "Succeeded"
    ]
  },
  "type": "ApiConnection",
  "inputs": {
    "body": {
      "PartitionKey": "@body('Parse_JSON')?['name']",
      "RowKey": "@body('Parse_JSON')?['last']"
    },
    "host": {
      "connection": {
        "name": "@parameters('$connections')['azuretables_1']['connectionId']"
      }
    },
    "method": "post",
    // the problem occurs after this line
    "path": "/Tables/@{encodeURIComponent('[parameters('table_storage_name')]')}/entities"
  }
}
but I get this error:
InvalidTemplate: The template validation failed: 'The template action 'Insert_Entity' at line '1' and column '582' is not valid: "Unable to parse template language expression 'encodeURIComponent([parameters('table_storage_name')])': expected token 'Identifier' and actual 'LeftSquareBracket'.".'.
I tried escaping the quotes with a backslash, like encodeURIComponent(\'[parameters('table_storage_name')]\'), or doubling them, like encodeURIComponent('[parameters(''table_storage_name'')]'), but all of them raise an error. How can I reference a parameter inside encodeURIComponent in an ARM template?
As discussed in the comments (credit: @marone):
"path": "/Tables/@{encodeURIComponent(parameters('table_storage_name'))}/entities"
Found the solution from this link: https://platform.deloitte.com.au/articles/preparing-azure-logic-apps-for-cicd
Here are the steps to reference a parameter in a Logic App:
Create an ARM parameter table_storage_name_armparam in template.json, in order to use its value for the Logic App parameter (yes, it's confusing, but follow along and you'll understand):
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "table_storage_name_armparam": {
      "type": "String"
    }
  },
  "variables": {},
  "resources": [
    {
      ......
    }
Now, in the Logic App parameter values (at the bottom of the JSON file), create the Logic App parameter table_storage_name; the value of this parameter will be the ARM parameter created in step 1:
.......
"parameters": {
  "$connections": {
    "value": {
      "azuretables": {
        "connectionId": "[parameters('connections_azuretables_externalid')]",
        "connectionName": "azuretables",
        "id": "/subscriptions/xxxxx-xxxx-xxxx-xxxxxxxx/providers/Microsoft.Web/locations/francecentral/managedApis/azuretables"
      }
    }
  },
  "table_storage_name": {
    "value": "[parameters('table_storage_name_armparam')]"
  }
}
}
}
]
}
Finally, reference the Logic App parameter as follows:
"path": "/Tables/@{encodeURIComponent(parameters('table_storage_name'))}/entities"

MongoDB Stitch GraphQL Custom Mutation Resolver returning null

GraphQL is a newer feature for MongoDB Stitch, and I know it is in beta, so thank you for your help in advance. I am excited about using GraphQL directly in Stitch so I am hoping that maybe I just overlooked something.
The documentation for the return payload shows the use of "bsonType", but when actually entering the JSON schema for the payload type, it asks you to use "type" instead of "bsonType". Oddly, it still works for me with "bsonType", as long as at least one of the properties uses "type".
Below is the function:
exports = function(input) {
  const mongodb = context.services.get("mongodb-atlas");
  const collection = mongodb.db("<database>").collection("<collection>");
  const query = { _id: BSON.ObjectId(input.id) }
  const update = {
    "$push": {
      "notes": {
        "createdBy": context.user.id,
        "createdAt": new Date,
        "text": input.text
      }
    }
  };
  const options = { returnNewDocument: true }
  collection.findOneAndUpdate(query, update, options).then(updatedDocument => {
    if(updatedDocument) {
      console.log(`Successfully updated document: ${updatedDocument}.`)
    } else {
      console.log("No document matches the provided query.")
    }
    return {
      _id: updatedDocument._id,
      notes: updatedDocument.notes
    }
  })
  .catch(err => console.error(`Failed to find and update document: ${err}`))
}
Here is the input type in the custom resolver:
"type": "object",
"title": "AddNoteToLeadInput",
"required": [
"id",
"text"
],
"properties": {
"id": {
"type": "string"
},
"text": {
"type": "string"
}
}
}
Below is the Payload Type:
{
  "type": "object",
  "title": "AddNoteToLeadPayload",
  "properties": {
    "_id": {
      "type": "objectId"
    },
    "notes": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "createdAt": {
            "type": "string"
          },
          "createdBy": {
            "type": "string"
          },
          "text": {
            "type": "string"
          }
        }
      }
    }
  }
}
When entering the wrong "type" the error states:
Expected valid values are:[array boolean integer number null object string]
When entering the wrong "bsonType" the error states:
Expected valid values are:[string object array objectId boolean bool null regex date timestamp int long decimal double number binData]
I've tried every combination I can think of, including changing all "bsonType" to "type". I also tried changing the _id to a string when using "type", or to objectId when using "bsonType". No matter what combination I try, when I use the mutation it does what it is supposed to and adds the note to the lead, but the return payload always comes back null. I need it to return the _id and note so that it will update the InMemoryCache in Apollo on the front end.
I noticed that you might be missing a return before your call to collection.findOneAndUpdate().
I tried this function (similar to yours) and got GraphiQL to return values (with String for all the input and payload types):
exports = function(input){
  const mongodb = context.services.get("mongodb-atlas");
  const collection = mongodb.db("todo").collection("dreams");
  const query = { _id: input.id }
  const update = {
    "$push": {
      "notes": {
        "createdBy": context.user.id,
        "createdAt": "6/10/10/10",
        "text": input.text
      }
    }
  };
  const options = { returnNewDocument: true }
  return collection.findOneAndUpdate(query, update, options).then(updatedDocument => {
    if(updatedDocument) {
      console.log(`Successfully updated document: ${updatedDocument}.`)
    } else {
      console.log("No document matches the provided query.")
    }
    return {
      _id: updatedDocument._id,
      notes: updatedDocument.notes
    }
  })
  .catch(err => console.error(`Failed to find and update document: ${err}`))
}
Hi Bernard, there is an unfortunate bug in the custom resolver form UI at the moment which doesn't allow you to use only bsonType in the input/payload types; we are working on addressing this. In actuality, you should be able to use either type or bsonType, or a mix of the two, as long as they agree with your data. I think the payload type definition you want is likely:
{
  "type": "object",
  "title": "AddNoteToLeadPayload",
  "properties": {
    "_id": {
      "bsonType": "objectId"
    },
    "notes": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "createdAt": {
            "bsonType": "date"
          },
          "createdBy": {
            "type": "string"
          },
          "text": {
            "type": "string"
          }
        }
      }
    }
  }
}
If that doesn't work, it might be helpful to give us a sample of the data that you would like returned.
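For testing, a GraphiQL call along these lines should then return the pushed note rather than null. This is only an illustration: the mutation field name addNoteToLead and the id value are assumptions based on the type titles above.
mutation {
  addNoteToLead(input: { id: "5f1f3a2b9d1e8a0007c0ffee", text: "Follow up next week" }) {
    _id
    notes {
      createdAt
      createdBy
      text
    }
  }
}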

CloudFormation - Access Output of Parent Stack in Child Nested stack

I have a master CloudFormation template which invokes two child templates. The first template runs and its outputs are captured in its Outputs section. I have tried many ways to use the ChildStack01 output values in the second, nested template, and I am not sure why I get "Template format error: Unresolved resource dependencies [XYZ] in the Resources block of the template". Here is my master template:
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Resources": {
    "LambdaStack": {
      "Type": "AWS::CloudFormation::Stack",
      "Properties": {
        "TemplateURL": "https://s3.amazonaws.com/bucket1/cloudformation/Test1.json",
        "TimeoutInMinutes": "60"
      }
    },
    "PermissionsStack": {
      "Type": "AWS::CloudFormation::Stack",
      "Properties": {
        "TemplateURL": "https://s3.amazonaws.com/bucket1/cloudformation/Test2.json",
        "Parameters": {
          "LambdaTest": {
            "Fn::GetAtt": ["LambdaStack", "Outputs.LambdaTest"]
          }
        },
        "TimeoutInMinutes": "60"
      }
    }
  }
}
Here is my Test1.json Template
{
  "Resources": {
    "LambdaTestRes": {
      "Type": "AWS::Lambda::Function",
      "Properties": {
        "Description": "Testing AWS cloud formation",
        "FunctionName": "LambdaTest",
        "Handler": "lambda_handler.lambda_handler",
        "MemorySize": 128,
        "Role": "arn:aws:iam::3423435234235:role/lambda_role",
        "Runtime": "python2.7",
        "Timeout": 300,
        "Code": {
          "S3Bucket": "bucket1",
          "S3Key": "cloudformation/XYZ.zip"
        }
      }
    }
  },
  "Outputs": {
    "LambdaTest": {
      "Value": {
        "Fn::GetAtt": ["LambdaTestRes", "Arn"]
      }
    }
  }
}
Here is my Test2.json, which has to use the output of Test1.json:
{
  "Resources": {
    "LambdaPermissionLambdaTest": {
      "Type": "AWS::Lambda::Permission",
      "Properties": {
        "Action": "lambda:invokeFunction",
        "FunctionName": {
          "Ref": "LambdaTest"
        },
        "Principal": "apigateway.amazonaws.com",
        "SourceArn": {
          "Fn::Join": ["", ["arn:aws:execute-api:", {
            "Ref": "AWS::Region"
          }, ":", {
            "Ref": "AWS::AccountId"
          }, ":", {
            "Ref": "TestAPI"
          }, "/*"]]
        }
      }
    }
  },
  "Parameters": {
    "LambdaTest": {
      "Type": "String"
    }
  }
}
It is not enough to just have an output; you need to export that output.
Look here: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/using-cfn-stack-exports.html
So you need something like:
"Outputs": {
"LambdaTest": {
"Value": {
"Fn::GetAtt": ["LambdaTestRes", "Arn"]
}
"Export": {
"Name": "LambdaTest"
}
}
}
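If you go this route, the exported value is then consumed in the other template with Fn::ImportValue rather than a Ref, for example:
"FunctionName": {
  "Fn::ImportValue": "LambdaTest"
}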
You have two unresolved Ref resource dependencies in Test2.json, one to LambdaTest and one to TestAPI.
For LambdaTest, it looks like you're trying to pass this as a parameter from the parent stack, but you haven't specified it as an input Parameter in the child Test2.json template. Add an entry in Test2.json's Parameters section, like this:
"Parameters": {
"LambdaTest": {
"Type": "String"
}
},
Regarding TestAPI, this reference doesn't seem to appear anywhere else in your templates, so you should either specify this as a fixed string directly, or add another input Parameter in your Test2.json stack (see above) and then provide it from the parent stack.
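In other words, the PermissionsStack block in the master template would pass both values. A sketch of what that could look like; here TestAPIId is a hypothetical parent-stack parameter (or another child stack's output) holding the API Gateway REST API ID:
"PermissionsStack": {
  "Type": "AWS::CloudFormation::Stack",
  "Properties": {
    "TemplateURL": "https://s3.amazonaws.com/bucket1/cloudformation/Test2.json",
    "Parameters": {
      "LambdaTest": { "Fn::GetAtt": ["LambdaStack", "Outputs.LambdaTest"] },
      "TestAPI": { "Ref": "TestAPIId" }
    },
    "TimeoutInMinutes": "60"
  }
}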
The error is coming from Test1.json (LambdaStack).
Logical ID
An identifier for the current output. The logical ID must be alphanumeric (a-z, A-Z, 0-9) and unique within the template.
It seems you have two logical IDs with the same name, "LambdaTest": one in the resources section and the other in the outputs section.