I am working on an application that needs to notify around 100 people at once when a specific condition is met. Currently, the user whose action triggers that condition has to wait until all 100 emails are sent, which takes quite a long time using Gmail SMTP. The application is built on top of CakePHP.
My question is whether there is a way the application can send the 100 emails without blocking the user whose action triggers the condition.
To make my question clear, think of Groupon. It sends a notification to all buyers when the minimum number of buyers is met. So when the nth person makes the purchase, Groupon sends the notification. One way is to notify all buyers immediately after the purchase is complete (which is what we are doing in the context of our application), and another way is to wait and send the notification using an external script/app at a pre-defined time.
In the case of the former, the application blocks until all the emails have been sent. Since PHP doesn't support multi-threading, I was wondering if there is an easy way to make this operation asynchronous so it doesn't affect the main application flow.
You could put the notification in a queue, and use a cronjob that checks and sends notifications every 5 minutes. That way your user isn't locked up while the operation happens.
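A minimal sketch of that queue-plus-cron idea, assuming a hypothetical `email_queue` table and plain PDO/`mail()` (in CakePHP you would more likely use the framework's Email class with your Gmail SMTP transport):

```php
<?php
// cron_send_notifications.php - run every 5 minutes, e.g.:
// */5 * * * * php /path/to/cron_send_notifications.php
//
// Assumes a hypothetical email_queue table with columns:
// id, recipient, subject, body, sent (0/1).

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Grab notifications that have not been sent yet.
$pending = $pdo->query('SELECT id, recipient, subject, body FROM email_queue WHERE sent = 0');

foreach ($pending as $row) {
    // mail() is used here for brevity; swap in your SMTP-backed mailer.
    if (mail($row['recipient'], $row['subject'], $row['body'])) {
        $pdo->prepare('UPDATE email_queue SET sent = 1 WHERE id = ?')
            ->execute([$row['id']]);
    }
}
```

The web request that triggers the condition then only has to INSERT 100 rows into `email_queue` and return immediately; the slow SMTP work happens in the background run.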
I'm not 100% sure, but you might be able to use an ajax call too, which would keep the user free to carry on after the request is sent.
Related
In our design we have something of a paradox. We have a database of projects. Each project has a status. We have a REST API to change a project from “Ready” status to “Cleanup” status. Two things must happen:
1. Update the status in the database.
2. Send out an email to the approvers.
Currently the RESTful API does (1), and if that succeeds, it does (2).
But sometimes the email fails to send, and since (1) is already committed, it is not possible to roll back.
I don't want to send the email prior to commit, because I want to make sure the commit is successful before sending the email.
I thought about undoing step 1, but that is very hard. The status change involves adding new records to the history table, so I would need to delete them. And if another person makes other changes concurrently, the undo might get messed up.
So what can I do? If (2) fails, should I return “200 OK” to the client?
Seems like the best option is to return “500 Server Error” with an error message that says “The project status was changed. However, sending the email to the approvers failed. Please take appropriate action.”
Perhaps I should not try to do 1 + 2 in a single operation? But that just puts the burden on the client, which is worse!
Just some random thoughts:
You can have a notification-sent status flag along with a datetime of submission. When an email succeeds, the flag flips; if not, it stays. When changes are submitted, your code iterates through ALL unsent notifications and tries to send them. No idea what backend db you are using, but I believe many have the functionality to send emails as well. You could have a scheduled job (SQL Server Agent for MSSQL) that runs hourly and tries to send if the datetime of the submission has lapsed by a certain amount, or starts setting off alarms if it keeps failing.
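A rough PHP sketch of that flag idea (the `notifications` table, its columns, and the one-hour alarm threshold are all made up for illustration; the same logic could equally live in a database-side scheduled job):

```php
<?php
// Hypothetical sweep run by a scheduler (cron, SQL Server Agent, etc.).
// Assumes a notifications table: id, recipient, body, sent (0/1), submitted_at.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$unsent = $pdo->query('SELECT id, recipient, body, submitted_at FROM notifications WHERE sent = 0');

foreach ($unsent as $n) {
    if (mail($n['recipient'], 'Project status changed', $n['body'])) {
        // Flip the flag only when the send actually succeeds.
        $pdo->prepare('UPDATE notifications SET sent = 1 WHERE id = ?')->execute([$n['id']]);
    } elseif (strtotime($n['submitted_at']) < time() - 3600) {
        // Still unsent an hour after submission: start setting off alarms
        // (log it, page an admin, hand off to a backup sender, etc.).
        error_log('Notification ' . $n['id'] . ' has been unsent for over an hour');
    }
}
```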
If it is that insanely important then maybe you could integrate a third-party service such as SendGrid to run as a backup sending mechanism. That of course would be more $$ though...
Traditionally I've always separated functions like this into a backend worker process that handles this kind of administrative task across many different applications. Some notifications get sent out every morning. Some get sent out every 15 minutes. Some are weekly summaries. If I run into a crash and burn, I light up the event log, and we are (lucky/unlucky) enough to have server monitoring tools that alert us on specified application events.
I have a Facebook bot that people subscribe to for sports updates.
I have 1,000 - 10,000 users I want to send out an update to.
Currently, at small scales like 20 messages, I would use a Facebook Batch request.
But I'm not sure what would be the best way to send my messages at a large scale.
My two options are:
Batch - limited to 50 requests per batch request.
I don't really know if I should expect a delay in the execution of the request.
Regular calls - I will iterate through my receivers and send each of them a message separately.
I'm afraid Facebook might block me, thinking I'm spamming, or that I will exceed the rate limits.
I have to say I was expecting a more generic method from Facebook, since they allow users to subscribe to updates through my bot; hence, I was expecting them to provide a guide on the best practices for sending the updates users subscribed to.
You should definitely use the Facebook Messenger Broadcast API for this. It will broadcast your message to all users subscribed to the bot.
Caveats:
You have to apply for this permission (pages_messaging and pages_messaging_subscriptions). Approval takes about 1-2 days, but you can test on Admin/Test users of the app in the meantime.
Each broadcast has to be a separate broadcast (e.g. you can't send an image and text together; each has to be its own individual broadcast).
Have some kind of unsubscribe option as well. FB users might think you are spamming even if you clearly say in the messages that your bot will send updates.
Use custom labels to create targeted sends. That way you can either subdivide who will receive updates about specific topics, or simply label whether people have unsubscribed from your broadcasts or not.
Basic workflow:
Get permission to broadcast.
Create message_creative_id via POST to endpoint
Use message_creative_id to POST a broadcast_messages
On a successful send you will get back a broadcast_id (a rough sketch of these calls follows below)
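A minimal PHP/cURL sketch of steps 2-4, assuming `$token` already holds a page access token; the Graph API version and exact payload fields here are assumptions, so check the Broadcast API reference for your app:

```php
<?php
// Step 2: create the message creative and keep its id.
$creative = httpPostJson(
    "https://graph.facebook.com/v2.11/me/message_creatives?access_token=$token",
    ['messages' => [['text' => 'Tonight: Lakers vs Celtics, 8pm ET']]]
);
$creativeId = $creative['message_creative_id'];

// Step 3: send the broadcast using that creative.
$broadcast = httpPostJson(
    "https://graph.facebook.com/v2.11/me/broadcast_messages?access_token=$token",
    ['message_creative_id' => $creativeId]
);

// Step 4: a successful send returns the broadcast_id.
echo 'Broadcast sent, id: ' . $broadcast['broadcast_id'] . PHP_EOL;

// Small JSON-over-cURL helper used above.
function httpPostJson(string $url, array $payload): array
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($payload),
        CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $result = curl_exec($ch);
    curl_close($ch);
    return json_decode((string) $result, true) ?? [];
}
```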
I am creating an application which has a follow mechanism where the followed user has to accept the request of a follower (similar to private accounts on Instagram).
I then want the following user to find out when the other user has accepted. I could check a million times (every time the following user opens the screen, if I did the query in viewDidLoad). However, the problem with this is that there would be a lot of requests, which will be expensive for me as I will have to pay for the requests to Parse, so I want to minimise these queries.
Currently, the best thing I can think of is to check once a day at midnight for example but this doesn't seem very seamless.
Is there a better way of doing this?
For starters consider how stale you are willing to allow an app's view of the world to be and cache the response that long. If a user views that screen every 30 seconds you might only want to actually check with the server 5 minutes after the last successful response (or the last response which had 0 follow requests).
You might consider switching from this sort of "pull" polling, where the client decides when to ask the server if anything has changed, to a "push" model, where the server informs the client when a change occurs. For example, you can send a silent background push notification to a user's devices when they have a follow request; the app can then respond by performing your existing query (see the sketch below).
You might still want polling or user triggered requests (like a "pull to refresh" gesture) as a fallback for missed notifications or devices with notifications disabled but you should be able to drastically reduce request volume.
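For instance, the server-side code that records a new follow request could fire the silent push through Parse's REST push endpoint. A rough PHP sketch, assuming a self-hosted parse-server behind a placeholder URL and that each Installation stores a `user` pointer to its owner (a common convention, not a Parse default):

```php
<?php
// Send a silent (content-available) push to all devices belonging to $userId
// so the app can re-run its existing follow-request query in the background.
function notifyFollowRequest(string $userId): void
{
    $payload = [
        // Target installations owned by this user.
        'where' => [
            'user' => ['__type' => 'Pointer', 'className' => '_User', 'objectId' => $userId],
        ],
        // No alert text: a silent push that just wakes the app.
        'data' => ['content-available' => 1, 'event' => 'follow_request'],
    ];

    $ch = curl_init('https://example.com/parse/push'); // placeholder server URL
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($payload),
        CURLOPT_HTTPHEADER     => [
            'X-Parse-Application-Id: YOUR_APP_ID', // placeholder credentials
            'X-Parse-Master-Key: YOUR_MASTER_KEY',
            'Content-Type: application/json',
        ],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    curl_close($ch);
}
```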
I've built a simple task management webapp: User A fills out a form, hits the submit button, which sends the data to a server, and if the data validates, User B gets assigned to this task.
I'd like to notify User B by email about this new assignment. However, User A can alter the task data or even delete the task, in which case an email that has already been sent would be incorrect.
One approach is to delay the notification email for a couple of minutes and then, upon sending, update the email message if needed.
What are the best practices for sending notifications?
I think you have a few choices:
Send out emails whenever task status changes. Don't include details; send a link to user B to let them see what the changes are.
This is a good example of Why Starbucks Does Not Use Two Phase Commit. User B will tolerate "dirty reads" because they aren't life altering.
Send out all notification emails asynchronously on a fixed schedule. Have a timed task query the database, generate all the emails, and send them at once. The task then has the chance to send only the latest one: if user A assigns a task, makes updates, then deletes it, user B only gets the last meaningful notification. In this case, an assign followed by a delete might result in no email being sent; only an assign or update as the final state will result in an email being sent (a rough sketch follows below).
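A rough PHP sketch of that second option, assuming a hypothetical `task_events` table that queues each assign/update/delete (all names here are made up for illustration):

```php
<?php
// Scheduled task: collapse queued events to the latest state per task and
// email only that, so an assign followed by a delete sends nothing at all.
// Assumes a task_events table:
// id, task_id, assignee_email, event ('assigned'|'updated'|'deleted'), created_at, notified (0/1).

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Pick the most recent un-notified event for each task.
$latest = $pdo->query(
    'SELECT e.task_id, e.assignee_email, e.event
       FROM task_events e
       JOIN (SELECT task_id, MAX(created_at) AS latest_at
               FROM task_events WHERE notified = 0 GROUP BY task_id) last
         ON last.task_id = e.task_id AND last.latest_at = e.created_at'
);

foreach ($latest as $row) {
    // Only the final state matters: deleted tasks produce no email.
    if ($row['event'] !== 'deleted') {
        mail($row['assignee_email'], 'Task update', "Task {$row['task_id']} was {$row['event']}.");
    }
    // Mark every queued event for this task as handled.
    $pdo->prepare('UPDATE task_events SET notified = 1 WHERE task_id = ?')
        ->execute([$row['task_id']]);
}
```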
In Salesforce you can set up various workflow processes or build API apps that send email. For most standard Salesforce orgs, there is a limit of 1000 emails per day. (e.g. see here)
I can't find any info on what happens after you reach the limit.
e.g. what sort of errors occur, and are administrators automatically notified?
It'll throw an exception (I can't remember the exact message). I've gotten these from time to time and I think they can't be caught. A quick way to check would be to create an anonymous block with an isFuture method that sends 10 emails inside a loop. Call this isFuture method inside another loop (also 10x) and you'll send 100 emails without hitting governor limits.
Of course you'll have to run your code 11x to get the email exception. This is a pretty shite way to do it, but it's better than clicking a button 1000x.