Amplify Datastore subscription cost - aws-appsync

I am trying to understand the cost of DataStore. It seems that it subscribes to all mutations, so if there are 50 users, each message will be sent 50 times, even when it isn't required.
Since each real-time mutation costs money, we would be paying for this real-time message mutation 49 times unnecessarily.
Also, it seems to me that a SyncExpression doesn't have any effect on this subscription.
I am really stuck here. It would be great if someone could clarify.

Amplify generates the DataStore boilerplate code for you, but you still need to call it. You won't pay for every user and every mutation.
You only subscribe to a mutation (explicitly call the code that listens for changes) on a per-user basis, for the things that user is interested in. For example, if a user is viewing a TODO item, you'd subscribe them to that item and they'll immediately see if someone else modifies it on another device.
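As a rough sketch of what that per-item subscription can look like with DataStore (assuming a generated Todo model and a hypothetical todoId variable; this is illustrative, not the exact code Amplify generates):
import { DataStore } from 'aws-amplify';
import { Todo } from './models';

// Listen only for changes to the single Todo the user is viewing;
// nothing is pushed to this client for other records or other users.
const subscription = DataStore.observe(Todo, todoId).subscribe(({ element, opType }) => {
  console.log(`Todo ${element.id} was ${opType}`); // INSERT | UPDATE | DELETE
});

// Tear it down when the user navigates away.
subscription.unsubscribe();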
UPDATE
Long story... I was triggering back-end computation via GraphQL through a Lambda resolver. The computation took too long and the GraphQL call would time out. I updated the code so the GraphQL call re-triggered the Lambda asynchronously and returned immediately. Then, when the long-running task completed in the spun-up Lambda, I updated a record in the database.
I update the record through AppSync instead of a direct GraphQL call so that it triggers mutations, and in the React client I listen to a mutation for the specific record that will be updated. This way there is just one user listening (the one who triggered the long-running action), and that user is only notified about changes to the single DB record they're interested in, rather than receiving other users' updates.
I don't know whether all of this is applicable to your situation. The code snippets below may help you, but they're somewhat out of context.
# In amplify/backend/api/projectname/schema.graphql
type Subscription {
  onCouponWithIdUpdated(id: ID!): Coupon @aws_subscribe(mutations: ["updateCoupon"])
}
// In my useSendCoupon hook...
// (assumes: import { API, graphqlOperation } from 'aws-amplify', plus the usual React hook imports)
// Subscribe to coupon updates
useEffect(() => {
  if (0 === couponId) {
    return
  }
  console.log(`subscribe to coupon updates for couponId:`, couponId)

  const onCouponWithIdUpdated = /* GraphQL */ `
    subscription OnCouponWithIdUpdated($id: ID!) {
      onCouponWithIdUpdated(id: $id) {
        id
        proofLink
        owner
      }
    }
  `
  const subscription = API
    .graphql(graphqlOperation(onCouponWithIdUpdated, { id: couponId }))
    .subscribe({
      next: ({ provider, value }) => {
        const coupon = value.data.onCouponWithIdUpdated
        //console.log(`Proof Link:`, coupon.proofLink)
        setProofLinks([coupon.proofLink])
        setSendCouponState(COUPON_STATE_PREVIEW_SUCCESS)
      },
      error: error => console.warn(error)
    })
  console.log('subscribed: ', subscription)

  return () => {
    console.log(`unsubscribe to coupon updates`)
    subscription.unsubscribe()
  }
}, [couponId])
// inside a lambda...
// (assumes: const { AWSAppSyncClient, AUTH_TYPE } = require('aws-appsync') and const gql = require('graphql-tag'))
const updateCouponWithProof = async (authorization, couponId, proofLink) => {
  const initializeClient = () => new AWSAppSyncClient({
    url: process.env.API_XXXX_GRAPHQLAPIENDPOINTOUTPUT,
    region: process.env.REGION,
    auth: {
      type: AUTH_TYPE.AMAZON_COGNITO_USER_POOLS,
      jwtToken: authorization
    },
    disableOffline: true,
  })

  const executeMutation = async (mutation, operationName, variables) => {
    const client = initializeClient()
    try {
      const response = await client.mutate({
        mutation: gql(mutation),
        variables,
        fetchPolicy: "no-cache",
      })
      return response.data[operationName]
    } catch (err) {
      console.log("Error while trying to mutate data", err)
      throw JSON.stringify(err)
    }
  }

  const updateCoupon = /* GraphQL */ `
    mutation UpdateCoupon(
      $input: UpdateCouponInput!
      $condition: ModelCouponConditionInput
    ) {
      updateCoupon(input: $input, condition: $condition) {
        id
        proofLink
        owner
      }
    }
  `
  const variables = { input: { id: couponId, proofLink } }

  try {
    return await executeMutation(updateCoupon, 'updateCoupon', variables)
  } catch (error) {
    console.log(`executeMutation error`, error)
  }
}
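The asynchronous re-trigger mentioned above isn't shown in these snippets. A rough sketch of that part, assuming the AWS SDK v2 and a hypothetical isAsyncRetrigger flag in the event payload, could look like this:
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

exports.handler = async (event) => {
  if (!event.isAsyncRetrigger) {
    // Re-invoke this same function asynchronously and return immediately,
    // so the original GraphQL call doesn't time out.
    await lambda.invoke({
      FunctionName: process.env.AWS_LAMBDA_FUNCTION_NAME, // this function's own name
      InvocationType: 'Event',                            // fire-and-forget
      Payload: JSON.stringify({ ...event, isAsyncRetrigger: true }),
    }).promise();
    return { status: 'PROCESSING' };
  }

  // The long-running work happens in the re-invoked copy; when it finishes,
  // the record is updated via AppSync (updateCouponWithProof above), which
  // fires the onCouponWithIdUpdated subscription for the waiting client.
  // await doLongRunningWork(event);
  // await updateCouponWithProof(event.authorization, event.couponId, proofLink);
};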

Related

Send message and leave server on ready event if not whitelisted (Discord.JS + MongoDB)

I'm coding a whitelist system for my Discord bot that, on the ready event (and after a 3-second delay), checks whether every server it is in has its ID added to the whitelist database on MongoDB. If not, the bot sends an embed and leaves the server. I managed to get it working on the guildCreate event, but on the ready event it performs the message and leave actions on every single server without filtering, even though the servers are added to the list. I cannot figure out why. Also, I'm still new to JavaScript, so it could just be a minor mistake.
//VARIABLES
const { Client, MessageEmbed } = require("discord.js")
const config = require('../../Files/Configuration/config.json');
const DB = require("../../Schemas/WhitelistDB");

//READY EVENT
module.exports = {
  name: "ready",
  once: false,
  async execute(client) {
    //[ ... ] <--- OTHER UNNECESSARY CODE IN BETWEEN
    setTimeout(function() { // <--- 3 SECONDS DELAY
      client.guilds.cache.forEach(async (guild) => { // <--- CHECK EVERY SERVER
        await DB.find({}).then(whitelistServers => { // <--- CHECK MONGODB ID LIST
          if(!whitelistServers.includes(guild.id)) {
            const channel = guild.channels.cache.filter(c => c.type === 'GUILD_TEXT').random(1)[0]; // <--- SEND MESSAGE TO RANDOM TEXT CHANNEL (It is sending to every server, when it should be sending only to the not whitelisted ones)
            if(channel) {
              const WhitelistEmbed = new MessageEmbed()
              WhitelistEmbed.setColor(config.colors.RED)
              WhitelistEmbed.setDescription(`${config.symbols.ERROR} ${config.messages.SERVER_NOT_WHITELISTED}`)
              channel.send({embeds: [WhitelistEmbed]});
            }
            client.guilds.cache.get(guild.id).leave(); // <--- LEAVE SERVER (It is leaving every server, when it should be leaving only the not whitelisted ones)
          } else { return }
        });
      });
    }, 1000 * 3);
  }
}
I found the solution myself!
Instead of fetching the whole array of whitelisted IDs for each guild, query one document at a time, and instead of checking the contents of an array, check whether a matching document exists. This is the updated code:
//WHITELIST
setTimeout(function() {
  client.guilds.cache.forEach(async (guild) => {
    await DB.findOne({ GuildID: guild.id }).then(whitelistServers => {
      if(!whitelistServers) {
        const channel = guild.channels.cache.filter(c => c.type === 'GUILD_TEXT').random(1)[0];
        if(channel) {
          const WhitelistEmbed = new MessageEmbed()
          WhitelistEmbed.setColor(config.colors.RED)
          WhitelistEmbed.setDescription(`${config.symbols.ERROR} ${config.messages.SERVER_NOT_WHITELISTED}`)
          channel.send({embeds: [WhitelistEmbed]});
        }
        client.guilds.cache.get(guild.id).leave();
      } else { return }
    });
  });
}, 1000 * 3);

Google Action Webhook Inline Editor Returns Before the API call

This is my first Google Action project. I have a simple slot after the invocation. The user enters a value at the prompt, and the slot invokes the webhook, which makes a call to an API using the user input. All works fine, except that the webhook returns to the user even before the API call finishes processing and returns its value (the conv.add lines marked "line 1"). I do see in the logs that everything from the API is logged fine after the webhook has already returned to the user. Below is the code I am using, in the inline editor. What am I missing? Thanks for your help in advance.
const { conversation } = require('@assistant/conversation');
const functions = require('firebase-functions');
var https = require('https');
const fetch = require('node-fetch');

const app = conversation({ debug: true });

app.handle('SearchData', conv => {
  const body = JSON.stringify({
    val: "this is my body"
  });
  // prepare the header
  var postheaders = {
    'Content-Type': 'application/json',
    'Auth': 'MyAuthCreds'
  };
  fetch('https://host.domain.com/data', {
    method: 'post',
    body: body,
    headers: postheaders,
  })
    .then(res => res.json())
    .then(d => {
      console.log(d);
      var profile = d; //JSON.parse(d);
      console.log(d.entries);
      console.log("Length: " + d.entries.length);
      if (d.entries.length > 0) {
        console.log("Data found");
        conv.add("Data found"); //line 1
      } else {
        console.log("no data found");
        conv.add("no data found"); //line 1
      }
    })
    .catch(function (err) {
      // POST failed...
      console.log(err);
    });
});

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);
Your issue is that your handler is making API calls that are asynchronous, but the Assistant Conversation library doesn't know that you're doing so. So as soon as the handler finishes, it tries to send back a response, but your asynchronous work (the stuff in the then() blocks) hasn't executed yet.
To address this, you need to return a Promise so the library knows to wait until the Promise is fulfilled before it sends the response.
Fortunately, in your case, this should be pretty straightforward. fetch and all the .then() blocks return a Promise, so all you need to do is add a return statement in front of the call to fetch. Something like this:
return fetch('https://host.domain.com/data', {
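For reference, here is a minimal async/await sketch of the same handler (same assumed endpoint, headers, and response shape as in the question); an async handler returns a Promise implicitly, so the library still waits for it to settle before responding:
app.handle('SearchData', async conv => {
  try {
    const res = await fetch('https://host.domain.com/data', {
      method: 'post',
      body: JSON.stringify({ val: "this is my body" }),
      headers: { 'Content-Type': 'application/json', 'Auth': 'MyAuthCreds' },
    });
    const d = await res.json();
    conv.add(d.entries.length > 0 ? "Data found" : "no data found");
  } catch (err) {
    console.log(err);
  }
});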

Cloud function http function fails on first run

I am testing with a payment processing system, and every time a transaction is completed, the payment processor hits my endpoint with a POST request containing the payment details so I can save them to my database (Firestore).
The only problem is that the function fails on the first try. What I mean is: say a customer pays, the payment processor hits my cloud function, and it fails to save to my database. When a second customer makes a transaction a minute, 5 minutes, or even 18 minutes later (according to my observation), everything works as expected.
Am I facing a cold start problem, or what is happening? And how do I solve this?
Here is my function
exports.stkCallback = functions.https.onRequest(async (request, response) => {
  if (request.method === 'POST') {
    if (request.body.Body.stkCallback.ResultCode === 0) {
      const jsonData = request.body.Body.CallbackMetadata;
      console.log("USER HAS COMPLETED THE TRANSACTION");
      var transactionID;
      /// The line below logs successfully every time, meaning my payment processor
      /// has sent the POST request
      console.log("checkoutid:", request.body.Body.CheckoutRequestID)
      /// I have saved the CheckoutRequestID previously to Firestore, so I first query the
      /// document with that ID (CheckoutRequestID) and get its data so I can mark the
      /// transaction as complete
      var docRef = db.collection("Transactions").doc(request.body.Body.CheckoutRequestID);
      await docRef.get().then((doc) => {
        // eslint-disable-next-line promise/always-return
        if (doc.exists) {
          //console.log("Document data:", doc.data());
          transactionID = doc.id;
          transactionData.push(doc.data());
        } else {
          // doc.data() will be undefined in this case
          console.log("No such document!");
        }
      }).catch((error) => {
        console.log("Error getting document:", error);
      });
      /// Once I get the data I can then go ahead and do other operations.
      /// Only the above query fails the first time, which I don't understand --
      /// it fails saying "No such document!", even though the document does exist.
      ***carrying out other operations using the fetched transactionID and transactionData***
      response.sendStatus(200);
    } else {
      console.log("USER HAS CANCELLED THE TRANSACTION");
      response.sendStatus(200);
    }
  }
});
I have refactored my code and reproduced it below:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

exports.stkCallback = functions.https.onRequest(async (request, response) => {
  const accountSid = "#";
  const authToken = "#";
  const client = require("twilio")(accountSid, authToken);

  if (request.method === 'POST') {
    if (request.body.Body.ResultCode === 0) {
      const jsonData = request.body.Body.CallbackMetadata;
      console.log("USER HAS COMPLETED THE TRANSACTION");
      var transactionID;
      /// The line below logs successfully every time, meaning my payment processor
      /// has sent the POST request
      console.log("checkoutid:", request.body.Body.CheckoutRequestID)
      /// The query below is critical to everything that follows, as it supplies the
      /// necessary data all the way down. It is also the call that fails on the first run.
      var docRef = await db.collection("Transactions").doc(request.body.Body.CheckoutRequestID).get()
        .catch((error) => {
          console.log("Error getting document:", error);
        });
      /// A log of some data from the call above; when it fails, the log below is undefined
      console.log("tyyy", docRef.data().Home)
      transactionID = docRef.id;
      /// The data returned from the call above is used to perform other operations.
      /// Below is just one of them; consequently it will also fail, because some values
      /// like doc.data().Uid will be undefined.
      await db.collection("Users").doc(doc.data().Uid).collection("Transactions").doc(transactionID).update({
        TransactionComplete: true,
        transactionCompletedTimeDb: admin.firestore.FieldValue.serverTimestamp(),
        Amount: jsonData.Item[0].Value,
        ReceiptNO: jsonData.Item[1].Value,
        TransactionDate: jsonData.Item[3].Value,
        PhoneNumber: jsonData.Item[4].Value,
        UserId: doc.data().Uid
      })
        // eslint-disable-next-line promise/always-return
        .catch((error) => {
          // The document probably doesn't exist.
          console.error("Error updating document: ", error);
        });
      response.sendStatus(200);
    } else {
      console.log("USER HAS CANCELLED THE TRANSACTION");
      response.sendStatus(200);
    }
  }
});
Attaching an image of a failed function; do note the time.
And an image of the logs of the same function triggered right after (3 minutes later); as you can see, this time the function completes successfully.
This seems like a cold start issue.
The mitigation of this issue will depend on a lot of information that you are not sharing with us, such as the complete function, the dependencies you are using, and the instance size.
Splitting a heavy function into multiple small functions will help with the cold start time; using smaller, up-to-date, cloud-oriented libraries will also help.
The size of the payload could also be an important factor here: how big is the payload sent to the function, and how much information are you writing into the logs? All these small pieces have a significant influence on cold start performance.
As a quick fix, I can safely say that creating a scheduled task that triggers your function every 30 minutes, for example, would be enough to mitigate your issue in the short term (see the sketch below).
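A minimal sketch of such a scheduled keep-warm job, assuming the Firebase scheduled functions API and a hypothetical KEEP_WARM_URL environment variable pointing at the HTTP function (the target should answer this ping quickly, e.g. return a 200 for GET requests):
const functions = require('firebase-functions');
const fetch = require('node-fetch');

// Pings the HTTP function every 30 minutes so a warm instance is kept around.
exports.keepWarm = functions.pubsub
  .schedule('every 30 minutes')
  .onRun(async () => {
    try {
      await fetch(process.env.KEEP_WARM_URL, { method: 'GET' });
      console.log('Keep-warm ping sent');
    } catch (err) {
      console.error('Keep-warm ping failed', err);
    }
    return null;
  });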

Axios in Vuex Store returning promise and not data

I'm not sure why I can't get the data I want from this axios.get in my Vuex store.
I've set up this action to commit a change via my mutation like this:
mutations: {
  updateShop: (state, payload) => {
    state.shop = payload;
    return state.shop;
  }
},
actions: {
  getShop: ({ commit }) => {
    return axios.get("/admin/wine_app/shops").then(response => {
      debugger; // I DO have data here??
      commit('updateShop', response.data);
    });
  }
}
But when I stop at that debugger I DO have the data, yet when I use the getShop action in a component I see a promise being returned.
Any idea why?
EDIT:
It MIGHT just not be ready!! I'm seeing this in the console:
Promise {<pending>}
  __proto__: Promise
  [[PromiseStatus]]: "pending"
  [[PromiseValue]]: undefined
Make the action getShop async:
getShop: async ({ commit }) => {
  const response = await axios.get("/admin/wine_app/shops");
  debugger; // I DO have data here??
  commit('updateShop', response.data);
}
Await the action where you call it:
await this.$store.dispatch('getShop')
Use the shop state prop in your code:
this.$store.state.shop
Or use mapState if you would like to use several state props (see the sketch below).
Also make sure not to return any data from mutations; they should only change state, not return parts of it.
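Pulling those pieces together, a minimal component sketch (the component name and lifecycle hook are just placeholders for wherever you dispatch the action):
// ShopView.vue (hypothetical component)
import { mapState } from 'vuex';

export default {
  name: 'ShopView',
  computed: {
    // Exposes this.shop and keeps it reactive to store changes
    ...mapState(['shop'])
  },
  async created() {
    // Wait for the async action to commit the data before relying on this.shop
    await this.$store.dispatch('getShop');
    console.log(this.shop);
  }
};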
The reason you are seeing a promise is that your action is returning the axios call. Remove the return in both your mutation and action methods.
Your action method uses axios to retrieve the data and then commits your mutation with the response data. Your mutation method updates the state with that data; at this point state.shop is updated.
Within your Vue components you can access that data by exposing the state as a computed property.
computed: {
  shop() {
    return this.$store.state.shop
  }
  // Or use mapState instead:
  // ...mapState(['shop'])
}
Whenever your state changes this should update in your components due to reactivity.

In an isomorphic flux application, should the REST api calls be implemented in the action?

Should it be implemented in the action creator, or in a service class or component? Does the recommendation change if it's an isomorphic web app?
I've seen two different examples:
Action creator dispatches a login_success/login_failure action after making the REST call
Component calls an API service first, and that service creates a login_success or login_failure action directly
example 1
https://github.com/schempy/react-flux-api-calls
/actions/LoginActions.js
The action itself triggers a call to the API, then dispatches success or failure:
var LoginActions = {
  authenticate: function () {
    RESTApi
      .get('/api/login')
      .then(function (user) {
        AppDispatcher.dispatch({
          actionType: "login_success",
          user: user
        });
      })
      .catch(function (err) {
        AppDispatcher.dispatch({ actionType: "login_failure" });
      });
  }
};
example 2
https://github.com/auth0/react-flux-jwt-authentication-sample
The component's onClick handler calls an AuthService function, which then creates an action after it gets back the authentication result:
/services/AuthService.js
class AuthService {
  login(username, password) {
    return this.handleAuth(when(request({
      url: LOGIN_URL,
      method: 'POST',
      crossOrigin: true,
      type: 'json',
      data: {
        username, password
      }
    })));
  }

  logout() {
    LoginActions.logoutUser();
  }

  signup(username, password, extra) {
    return this.handleAuth(when(request({
      url: SIGNUP_URL,
      method: 'POST',
      crossOrigin: true,
      type: 'json',
      data: {
        username, password, extra
      }
    })));
  }

  handleAuth(loginPromise) {
    return loginPromise
      .then(function(response) {
        var jwt = response.id_token;
        LoginActions.loginUser(jwt);
        return true;
      });
  }
}
What's the better/standard place for this call to live in a Flux architecture?
I use an api.store together with an api utility, from https://github.com/calitek/ReactPatterns (React.14/ReFluxSuperAgent).
import Reflux from 'reflux';
import Actions from './Actions';
import ApiFct from './../utils/api.js';

let ApiStoreObject = {
  newData: {
    "React version": "0.14",
    "Project": "ReFluxSuperAgent",
    "currentDateTime": new Date().toLocaleString()
  },
  listenables: Actions,
  apiInit() { ApiFct.setData(this.newData); },
  apiInitDone() { ApiFct.getData(); },
  apiSetData(data) { ApiFct.setData(data); }
}
const ApiStore = Reflux.createStore(ApiStoreObject);
export default ApiStore;

import request from 'superagent';
import Actions from '../flux/Actions';

let uri = 'http://localhost:3500';

module.exports = {
  getData() { request.get(uri + '/routes/getData').end((err, res) => { this.gotData(res.body); }); },
  gotData(data) { Actions.gotData1(data); Actions.gotData2(data); Actions.gotData3(data); },
  setData(data) { request.post('/routes/setData').send(data).end((err, res) => { Actions.apiInitDone(); }) },
};
In my experience it is better to use option 1:
Putting API calls in an action creator instead of a component lets you separate concerns better: your component (tree) only calls a "log me in" action and can remain ignorant of where the response comes from. In theory it could even come from the store, if the login details are already known.
Calls to the API are more centralized in the action, and therefore more easily debugged.
Option 2 looks like it still fits the Flux design principles.
There are also advocates of a third alternative: call the web API from the store. This makes tight coupling of data structures on the server and client side easier and more self-contained, and it may work better if syncing independent data structures between client and server is a key concern. My experience with this third option has not been positive: having stores (indirectly) create actions breaks the unidirectional Flux pattern, and the benefits for me never outweighed the extra trouble in debugging. But your results may vary.