How to decrypt MongoDB ObjectId in a Node.js Cosmos DB trigger - mongodb

I am retrieving my Azure Cosmos DB (MongoDB API) document from a custom trigger to Azure Functions, but my ObjectId seems to be encrypted. How do I get the correct ObjectId?
For example, ObjectId("5df88e60d588f00c32a3c9ce") is coming through as ]øŽ`Õˆð2£ÉÎ
and ObjectId("5df88f92d588f00c32a3c9d1") is coming through as ]ø’Õˆð2£ÉÑ
Is there a way to retrieve the ObjectId in Node.js, Python, or any other script if I give ]ø’Õˆð2£ÉÑ as input?
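For what it's worth, this is the kind of conversion I am after. A rough sketch (my assumption: the garbled text is simply the raw 12 ObjectId bytes rendered as Latin-1 characters):
    // Sketch only: treat the garbled value as the raw 12-byte ObjectId and
    // re-encode it as hex. The \xNN escapes below are the bytes that display
    // as ]øŽ`Õˆð2£ÉÎ (one of them, 0x0c, is an invisible control character).
    const raw = '\x5d\xf8\x8e\x60\xd5\x88\xf0\x0c\x32\xa3\xc9\xce';
    const hex = Buffer.from(raw, 'latin1').toString('hex');
    console.log(hex); // 5df88e60d588f00c32a3c9ce
(The id field in the trigger output further down, NWRmODhkZGRkNTg4ZjAwYzMyYTNjOWNj, also appears to be just that hex string Base64-encoded.)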
This is the function.json used in my Azure Function:
{
    "scriptFile": "__init__.py",
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "documents",
            "direction": "in",
            "leaseCollectionName": "leases1",
            "connectionStringSetting": "devcosmosdb_DOCUMENTDB",
            "databaseName": "devcosmosdb",
            "collectionName": "newCollection",
            "createLeaseCollectionIfNotExists": "true"
        }
    ]
}
This is my Node.js code:
module.exports = async function (context, documents) {
    if (!!documents && documents.length > 0) {
        context.log('Document Id: ', documents[0].id);
        context.log(documents[0]);
    }
}
This is my output, and this is where I am not getting the ObjectId properly:
2020-06-16T17:16:38Z [Information] Executing 'Functions.changeTrigger' (Reason='New changes on collection newCollection at 2020-06-16T17:16:38.2618864Z', Id=adc9556a-133f-4e85-b533-5574283a5a7d)
2020-06-16T17:16:38Z [Information] Document Id: NWRmODhkZGRkNTg4ZjAwYzMyYTNjOWNj
2020-06-16T17:16:38Z [Information] {
    id: 'NWRmODhkZGRkNTg4ZjAwYzMyYTNjOWNj',
    _rid: 'KEcnAO163B4EAAAAAAAAAA==',
    _self: 'dbs/KEcnAA==/colls/KEcnAO163B4=/docs/KEcnAO163B4EAAAAAAAAAA==/',
    _ts: 1592327797,
    _etag: '"0000c1d2-0000-0300-0000-5ee8fe750000"',
    '$t': 3,
    '$v': {
        _id: { '$t': 7, '$v': ']øÝÕð\f2£ÉÌ' },
        name: { '$t': 2, '$v': 'myname' },
        email: { '$t': 2, '$v': 'my email' },
    },
    _lsn: 537
}

Please go to the Azure portal to check the content of your document. I have done a test on my side, and it works fine.
Here is the document I used for the test.
{
    "id": "testid1",
    "test1": "testvalue1",
    "test2": {
        "test21": "test21value",
        "objectId": "5df88f92d588f00c32a3c9d1"
    }
}
After clicking the Save button, the function will be triggered.
Here is my test code.
import logging
import azure.functions as func

def main(documents: func.DocumentList) -> str:
    if documents:
        logging.info('Document id: %s', documents[0]['id'])
        logging.info('%s', documents[0].to_json())
The output is as below.
Update:
Currently, only SQL (Core) API accounts are supported by the Azure Functions Cosmos DB trigger. You can also find the feature under the Settings section of the account.
The URI should look something like https://testbowman.documents.azure.com:443/
If you create a MongoDB API Cosmos DB account, you won't find the 'Add to Function' feature, and the URI will look something like https://tonycosmosdb.mongo.cosmos.azure.com:443/

Related

Strapi email designer plugin reference template to record

I'm currently developing a multi-tenant API with Strapi. For one part of it I use the Strapi email designer plugin, because I want to send emails that are custom-designed for each tenant. The problem is that the plugin's table is not accessible in Strapi's content manager, so I can only hard-code a template to a specific endpoint. Is there a way to have the plugin's table in the content manager, or to have it referenced from a content manager table, something like:
(table)tenant->(field)templateId => (ref-table)plugin-email-designer->(ref-field)templateId
so that I can switch and set templates dynamically from the Strapi panel instead of using hard-coded endpoints?
I've checked your issue briefly, and there is an option you are going to like, but it involves using patch-package...
So, let's assume that you have a Strapi project created, you have added strapi-plugin-email-designer, and you are using yarn v1.x:
yarn add patch-package postinstall-postinstall
Go to node_modules/strapi-plugin-email-designer/server/content-types/email-template/schema.json and change the following fields:
{
    ...
    "pluginOptions": {
        "content-manager": {
            "visible": true
        },
        "content-type-builder": {
            "visible": true
        }
    },
    ...
}
now run
yarn patch-package strapi-plugin-email-designer
now open your project's package.json and add the following to scripts:
{
    "scripts": {
        ...
        "postinstall": "patch-package"
    }
}
run
yarn build
yarn develop
head to the admin UI; you should see the new collection:
so now you can do this:
Sending Email
Let's assume you added a has one relation called email_template to your model (a sketch of that relation is shown below).
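For reference, a minimal sketch of how that relation could look in the tenant's schema (the file path and attribute shape here are assumptions based on the plugin's content type referenced above, not taken from a real project):
/src/api/tenant/content-types/tenant/schema.json (excerpt)
    "attributes": {
        "email_template": {
            "type": "relation",
            "relation": "oneToOne",
            "target": "plugin::email-designer.email-template"
        }
    }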
Next we need to add a custom route, so in /src/api/tenant/routes/ create a file called routes.js:
/src/api/tenant/routes/routes.js
module.exports = {
    routes: [
        {
            method: 'POST',
            path: `/tenants/:id/send`,
            handler: `tenant.send`
        }
    ]
}
now we need to add a handler to the controller:
/src/api/tenant/controllers/tenant.js
"use strict";
/**
* tenant controller
*/
const { createCoreController } = require("#strapi/strapi").factories;
module.exports = createCoreController("api::tenant.tenant", ({ strapi }) => ({
async send(ctx) {
const { id } = ctx.params;
const { data } = ctx.request.body;
// notice, if you need extra validation you add it here
// if (!data) return ctx.badRequest("no data was provided");
const { to, subject } = data;
const { email_template, ...tenant } = await strapi.db
.query("api::tenant.tenant")
// if you have extra relations it's better to populate them directly here
.findOne({ where: { id }, populate: ["email_template"] });
console.log(email_template);
try {
await strapi
.plugin("email-designer")
.service("email")
.sendTemplatedEmail(
{
to,
//from, < should be set in /config/plugins.js email.settings.defaultFrom
//replayTo < should be set in /config/plugins.js email.settings.defaultReplyTo
},
{
templateReferenceId: email_template.templateReferenceId,
subject,
},
{
...tenant,
// this equals to apply all the data you have in tenant
// this may need to be aligned between your tenant and template
}
);
return { success: `Message sent to ${to}` };
} catch (e) {
strapi.log.debug("📺: ", e);
return ctx.badRequest(null, e);
}
},
}));
don't forget to enable access to /api/tenants/:id/send in the admin panel, under Settings - Roles
POST http://localhost:1337/api/tenants/1/send
{
    "data": {
        "to": "email@example.com",
        "subject": "Hello World"
    }
}
response:
{
    "success": "Message sent to email@example.com"
}
please note, there is no template validation; e.g. if you give it a wrong template reference it will not be happy (a minimal guard is sketched below).
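For example, a minimal guard (just a sketch, placed before the sendTemplatedEmail call in the controller above; the field names match that controller) could be:
    // bail out early if the tenant has no template assigned
    if (!email_template || !email_template.templateReferenceId) {
        return ctx.badRequest("tenant has no email template assigned");
    }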

is there a way to put web-activity result into sql-table (sink)

Unable to get the output of a web activity into a SQL table using azure-data-factory.
This is what I have done and where I'm getting stuck (step 3).
Steps:
1. Get a token from an API call.
2. Get the results from the API call using the token from step 1, and take those results from a successful query that provides me with 'JSON'.
3. Take the result ('JSON') from the previous activity and put it in an Azure SQL Database table.
(screenshot: azure-data-factory web activities)
Can you not get this done using a copy activity? You should configure the source as the REST endpoint and the sink as SQL. I was playing with http://dummy.restapiexample.com/api/v1/employees, and we needed to introduce the structure. This is what I did, and it worked:
"source": {
"type": "RestSource",
"httpRequestTimeout": "00:01:40",
"requestInterval": "00.00:00:00.010",
"structure": [
{
"id": "id"
},
{
"employee_salary": "employee_salary"
}
]
},
"sink": {
"type": "SqlServerSink",
"writeBatchSize": 10000
},
"enableStaging": false
},
You can read more here: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-schema-and-type-mapping

Loopback 3.8 and Azure Cosmos DB

We have a Loopback v3.8 application using the MongoDB connector v3.1.
It works fine in environments running native MongoDB, but now we would like to deploy to Azure and use Cosmos DB, which in theory should support all the native MongoDB commands.
The problem we're having is that PATCH operations (which I believe are mapped to Model.updateAttributes by Loopback) are not working.
This is the error we get:
Could not update Client. { Error: No Client found for id
592cc132a31109354c45d1d8 }
Loopback debug strings:
loopback:connector:mongodb updateAttributes +7ms Client 592cc132a31109354c45d1d8 { '$set': { loginDate:2017-06-02T12:30:18.201Z } }
loopback:connector:mongodb MongoDB: model=Client command=findAndModify +2ms [ { _id: 592cc132a31109354c45d1d8 },
[ [ '_id', 'asc' ] ],
{ '$set': { loginDate: 2017-06-02T12:30:18.201Z } },
{}, [Function] ]
loopback:connector:mongodb Result: +399ms { _t: 'FindAndModifyResponse', ok: 1, value: null, lastErrorObject: { n: 1, updatedExisting: false, value: null } }
loopback:connector:mongodb updateAttributes.callback +4ms Client 592cc132a31109354c45d1d8 null { _t: 'FindAndModifyResponse', ok: 1,
value: null,
lastErrorObject: { n: 1, updatedExisting: false, value: null } }
If we do a GET for that Client, using its Id, we get the correct response, so the Client document is there.
Can the Loopback MongoDB connector be used with Cosmos DB?
Are we missing something that requires Loopback to work correctly with Cosmos DB?
Thanks.
This is because the underlying engine for Cosmos DB is not MongoDB; Cosmos DB simply allows you to access it using a familiar API. The MongoDB connector is not intended for use with Cosmos DB. I personally have been looking for a solution for my own use and came across the npm package loopback-connector-cosmosdb, which worked for some simple applications but is completely unsupported by its developer and by Loopback.

How to ask permission in Actions on Google without the SDK?

I would like to know the name of the user; however, I cannot use the Node.js SDK since I use another language.
How can I ask for permission?
I would prefer a way using the normal JSON responses.
I hacked together this minimal script to get the JSON response which the Node.js SDK would return:
gaction.js:
const DialogflowApp = require('actions-on-google').DialogflowApp;

const app = new DialogflowApp({
    request: {
        body: {
            result: {
                action: 'Test',
                contexts: []
            }
        },
        get: (h) => h
    },
    response: {
        append: (h, v) => console.log(`${h}: ${v}`),
        status: (code) => {
            return {send: (resp) => console.log(JSON.stringify(resp, null, 2))};
        }
    }
});

function testCode(app) {
    app.askForPermission('To locate you', app.SupportedPermissions.DEVICE_PRECISE_LOCATION);
}

app.handleRequest(new Map().set('Test', testCode));
I'm still no Node.js expert, so this might not be an optimal solution. Once you have installed Node and run the command npm install actions-on-google, the necessary dependencies will be installed.
When that is done, you just need to run node gaction, which will produce this output:
Google-Assistant-API-Version: Google-Assistant-API-Version
Content-Type: application/json
{
    "speech": "PLACEHOLDER_FOR_PERMISSION",
    "contextOut": [
        {
            "name": "_actions_on_google_",
            "lifespan": 100,
            "parameters": {}
        }
    ],
    "data": {
        "google": {
            "expect_user_response": true,
            "no_input_prompts": [],
            "is_ssml": false,
            "system_intent": {
                "intent": "assistant.intent.action.PERMISSION",
                "spec": {
                    "permission_value_spec": {
                        "opt_context": "To locate you",
                        "permissions": [
                            "DEVICE_PRECISE_LOCATION"
                        ]
                    }
                }
            }
        }
    }
}
If you now send the JSON above, you will be asked by Google Home. Have fun!
The request/response JSON formats for the API.AI webhooks with Actions are documented at https://developers.google.com/actions/apiai/webhook
As you've discovered, the data.google.permissions_request attribute contains two fields regarding the request:
opt_context contains a string which is read to give some context about why you're asking for the information.
permissions is an array of strings specifying what information you're requesting (a minimal example follows this list). The strings can have the values
NAME
DEVICE_COARSE_LOCATION
DEVICE_PRECISE_LOCATION
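For illustration, a minimal webhook response using those two fields might look like this (a sketch only: the attribute names mirror the description above, and the surrounding speech/data shape follows the SDK output earlier in this thread):
    {
        "speech": "PLACEHOLDER_FOR_PERMISSION",
        "data": {
            "google": {
                "expect_user_response": true,
                "permissions_request": {
                    "opt_context": "To locate you",
                    "permissions": ["NAME", "DEVICE_PRECISE_LOCATION"]
                }
            }
        }
    }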
If you are using Java or Kotlin, there is an unofficial SDK. It matches the official SDK API nearly exactly.
https://github.com/TicketmasterMobileStudio/actions-on-google-kotlin

Unable to send data to Sitecatalyst with function CQ_Analytics.record

I am working on a POC involving AEM and SiteCatalyst integration.
I am using AEM's out-of-the-box Geometrixx Outdoors website, which already implements SiteCatalyst features.
Data is being populated into the report suite via:
• data-tracking (on page load)
data-tracking="{'event': ['eventName'], 'values': {'key': 'value', 'nextKey': 'nextValue'}, componentPath: 'myapp/component/mycomponent'}"
• CQ_Analytics.record (after page load, activated on a page)
CQ_Analytics.record({event: 'eventName', values: { valueName: 'VALUE' }, collect: false, options: { obj: this, defaultLinkType: 'X' }, componentPath: '<%=resource.getResourceType()%>'})
Use case: when I add a product to the cart, the function below executes CQ_Analytics.record, but it is unable to send the cart-addition data to SiteCatalyst.
I have verified this using the Adobe digital debugger.
Code snippet from /libs/commerce/components/product/product.jsp:
function trackCartAdd(form) {
    if (CQ_Analytics.Sitecatalyst) {
        var productQuantity = Number($("input[name='product-quantity']", form).val() || '1');
        var productPrice = Number($("input[name='product-size']:checked", form).data('price').replace(/[^0-9\\.]/g, ''));
        var productChildSku = $("input[name='product-size']:checked", form).data('sku');
        CQ_Analytics.record({
            "event": ["cartAdd"<%= (session.getCartEntryCount() == 0) ? ", 'cartOpen'" : "" %>],
            "values": {
                "product": [{
                    "category": "",
                    "sku": "<%= xssAPI.encodeForJSString(baseProduct.getSKU()) %>",
                    "price": productPrice * productQuantity,
                    "quantity": productQuantity,
                    "evars": {
                        "childSku": CQ.shared.Util.htmlEncode(productChildSku)
                    }
                }]
            },
            "componentPath": "<%= xssAPI.encodeForJSString(resource.getResourceType()) %>"
        });
    }
    return true;
}
Note: I have done the product variable mapping for the report suite in AEM.
Please guide me.