Is there a built-in mechanism in EventBridge to send an alert whenever a new database user is created in a Redshift cluster?
I could create a Lambda function that regularly pulls the list of existing database users, stores that list in an S3 bucket, compares it with the previous list, and sends an SNS alert if there's a change. I wanted to know if there's a better approach to this problem.
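Roughly what I have in mind, as a sketch (assuming a Node.js Lambda with the AWS SDK v3 Redshift Data API, an S3 object holding the previous snapshot, and an SNS topic; the environment variables, the snapshot key, and the pg_user query are placeholders I picked for illustration):

// Sketch: poll Redshift for database users, diff against the last snapshot in S3,
// and publish an SNS alert when new users appear.
const {
  RedshiftDataClient, ExecuteStatementCommand,
  DescribeStatementCommand, GetStatementResultCommand,
} = require("@aws-sdk/client-redshift-data");
const { S3Client, GetObjectCommand, PutObjectCommand } = require("@aws-sdk/client-s3");
const { SNSClient, PublishCommand } = require("@aws-sdk/client-sns");

const redshift = new RedshiftDataClient({});
const s3 = new S3Client({});
const sns = new SNSClient({});

// Placeholder configuration, adjust to your environment.
const CLUSTER = process.env.CLUSTER_ID;
const DATABASE = process.env.DATABASE;
const DB_USER = process.env.DB_USER; // or authenticate with SecretArn instead
const BUCKET = process.env.BUCKET;
const KEY = "redshift-user-snapshot.json";
const TOPIC_ARN = process.env.TOPIC_ARN;

async function listDatabaseUsers() {
  const { Id } = await redshift.send(new ExecuteStatementCommand({
    ClusterIdentifier: CLUSTER,
    Database: DATABASE,
    DbUser: DB_USER,
    Sql: "SELECT usename FROM pg_user",
  }));
  // Wait for the statement to finish (simplified; add real error handling/backoff).
  let status;
  do {
    await new Promise((resolve) => setTimeout(resolve, 1000));
    status = await redshift.send(new DescribeStatementCommand({ Id }));
  } while (["SUBMITTED", "PICKED", "STARTED"].includes(status.Status));
  const result = await redshift.send(new GetStatementResultCommand({ Id }));
  return result.Records.map((row) => row[0].stringValue);
}

exports.handler = async () => {
  const current = await listDatabaseUsers();

  // Load the previous snapshot from S3 (treat a missing object as the first run).
  let previous = [];
  try {
    const obj = await s3.send(new GetObjectCommand({ Bucket: BUCKET, Key: KEY }));
    previous = JSON.parse(await obj.Body.transformToString());
  } catch (err) {
    previous = [];
  }

  const added = current.filter((u) => !previous.includes(u));
  if (added.length > 0) {
    await sns.send(new PublishCommand({
      TopicArn: TOPIC_ARN,
      Subject: "New Redshift database user(s) detected",
      Message: `New users: ${added.join(", ")}`,
    }));
  }

  // Store the latest snapshot for the next run.
  await s3.send(new PutObjectCommand({ Bucket: BUCKET, Key: KEY, Body: JSON.stringify(current) }));
};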
I am building a mobile app (in Flutter with Firebase, but the answer does not have to be Firebase-specific), and I would like to implement a feature that notifies users whenever anyone from their contact list joins the app. This seems like a very EXPENSIVE feature.
Off the top of my head, I see a lambda/cloud function that is triggered every time a user joins and then searches a database of users and their respective contacts for the new user's phone number. To me, this solution does not scale well for two reasons: the total number of users could be in the millions, and many users could be joining simultaneously.
My better solution is to get the user's contacts upon joining and then search a database of current users' contacts for any of the newly joined user's phone numbers.
Is there a solution better than the second one? If so, what is it? If the second solution is standard, what kind of backend storage mechanism provides the best search and retrieval time for a database of users and their respective contacts?
In the case of a large user base I would not use the first solution, because it may slow down the sign-up process. Instead, I would create a cron job that runs at a specific time or periodically: it gets the list of users who signed up that day (or that hour, whatever you prefer), checks whether each new user is related to any user in the database, and sends a notification right away. Or, a better solution: create a temporary table in a database on another server, insert the notification information into that server, and create another cron job on the second server that runs at a specific time to send the notifications.
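A rough sketch of the cron-style approach I mean, written as a scheduled Cloud Function against Firestore (the users collection, the createdAt, phone, contactPhones, fcmToken and displayName fields, and the hourly schedule are all assumptions I made for illustration):

// Sketch: every hour, find users who signed up in the last hour and notify
// existing users who have the newcomer's phone number in their contacts.
const { onSchedule } = require("firebase-functions/v2/scheduler");
const { initializeApp } = require("firebase-admin/app");
const { getFirestore, Timestamp } = require("firebase-admin/firestore");
const { getMessaging } = require("firebase-admin/messaging");

initializeApp();

exports.notifyContactsOfNewUsers = onSchedule("every 60 minutes", async () => {
  const db = getFirestore();
  const cutoff = Timestamp.fromMillis(Date.now() - 60 * 60 * 1000);

  // Users who signed up since the last run.
  const newUsers = await db.collection("users").where("createdAt", ">=", cutoff).get();

  for (const doc of newUsers.docs) {
    const newUser = doc.data();

    // Existing users whose contact list contains the newcomer's phone number.
    // Assumes each user document stores a "contactPhones" array of phone numbers.
    const interested = await db.collection("users")
      .where("contactPhones", "array-contains", newUser.phone)
      .get();

    for (const friend of interested.docs) {
      const token = friend.data().fcmToken;
      if (!token) continue;
      await getMessaging().send({
        token,
        notification: {
          title: "A contact joined",
          body: `${newUser.displayName || newUser.phone} is now on the app`,
        },
      });
    }
  }
});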
I have a Flutter app using Firebase and Google Cloud. The organization providing the app to its users uploads a list of users who are able to register and create an account. When a user goes to register, I want two things to happen:
They are given an error message if their email address and ID number do not match an existing document's email and ID field values.
Existing fields, like their department and deck number, that are in the collection uploaded by the organization are copied to their new user profile.
I would write a Cloud Functions v2 function. The documentation has some great examples of how to block registration. What you would want to do is, in the beforeUserCreated method, look up the field in Firestore to validate their email. You can get their email through the AuthBlockingEvent's additionalUserInfo field, which should provide the username (email in this case) to compare against the Firestore database.
Deploying an auth blocking function is the same as deploying any other function.
Once deployed, you will need to remember to register your blocking function for it to take effect.
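To make the lookup concrete, here is a minimal sketch of such a blocking function; I am assuming the organization's uploaded list lives in a Firestore collection called allowedUsers with an email field, which are names I chose for illustration:

// Sketch: reject sign-ups whose email is not present in the organization's
// uploaded collection. Collection and field names are assumptions.
const { beforeUserCreated } = require("firebase-functions/v2/identity");
const { HttpsError } = require("firebase-functions/v2/https");
const { initializeApp } = require("firebase-admin/app");
const { getFirestore } = require("firebase-admin/firestore");

initializeApp();

exports.validateNewUser = beforeUserCreated(async (event) => {
  const email = event.data?.email;
  if (!email) {
    throw new HttpsError("invalid-argument", "An email address is required.");
  }

  // Look up the email in the collection uploaded by the organization.
  const match = await getFirestore()
    .collection("allowedUsers")
    .where("email", "==", email)
    .limit(1)
    .get();

  if (match.empty) {
    throw new HttpsError("permission-denied", "This email is not authorized to register.");
  }
});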
As for updating their user profile information, you could just use another Cloud Function that listens for a database change once the user is registered and then copies that data over.
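A minimal sketch of that copy step as well, assuming the new profile is created at users/{uid} and the organization's document carries department and deckNumber fields (again, names chosen for illustration):

// Sketch: when the new user's profile document is created, merge in the
// organization-provided fields. Paths and field names are assumptions.
const { onDocumentCreated } = require("firebase-functions/v2/firestore");
const { initializeApp } = require("firebase-admin/app");
const { getFirestore } = require("firebase-admin/firestore");

initializeApp();

exports.copyOrgFields = onDocumentCreated("users/{uid}", async (event) => {
  if (!event.data) return;
  const profile = event.data.data();

  // Find the organization record that matches this user's email.
  const orgDocs = await getFirestore()
    .collection("allowedUsers")
    .where("email", "==", profile.email)
    .limit(1)
    .get();
  if (orgDocs.empty) return;

  const org = orgDocs.docs[0].data();
  await event.data.ref.set(
    { department: org.department, deckNumber: org.deckNumber },
    { merge: true }
  );
});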
Could you please let me know if there is a way to get an email alert when a schedule is created or an existing schedule is changed in Azure Data Factory pipelines? The requirement is to monitor any schedule changes or new schedule creations happening in the production environment and restrict them if they are made without proper approval in the project.
I tried using an Azure Log Analytics workspace to achieve this, but the existing ADF log tables (ADFActivityRun, ADFPipelineRun, ADFTriggerRun) do not seem to hold any such information to query against (maybe I am missing something). Please let me know your views.
Many thanks in advance.
For roles and permissions in ADF, you need the Contributor role to create, edit, and delete data factories and child resources, including datasets, linked services, pipelines, triggers, and integration runtimes.
You can create a custom role for a group whose users can only monitor a data factory (or factories) but can't modify it.
You can refer to this third-party tutorial on how to Create Azure Custom Reader Role for Data Factory and assign the role to a user to whom you only want to give read permission.
For alerts, you can use the Azure Monitor service to monitor and create alerts for your Azure Data Factory actions.
To get started, simply navigate to the Monitor tab in your data factory, select Alerts & Metrics, and then select New Alert Rule.
Give your rule a name and set the alert severity. Add criteria to create the alert; there are plenty of scenarios available in the built-in criteria list. Then click Configure notification.
Configure notification lets you choose options like Email, SMS, Azure push notification, and Voice call. You can choose multiple options as per your requirements.
A developer created a dashboard and successfully published it to Tableau Server, with the extract set to auto-refresh every day. The developer had data and DB access at that time.
Now the same developer's DB access has been removed. Is it possible to get fresh data into the reports every day? Will the extract have new data every day?
Not unless you update the database credentials published with the data source to be something that the database accepts.
You typically want some sort of service account for this purpose, so that the published dashboards still work when the original publisher leaves.
I am using Cloud Functions to handle events; I have some topics that trigger an event, such as a write to my document store.
What I would like to do is on a change to my doc store, notify any users interested in that change that there is fresh data.
For example, a news feed. If User A triggers an activity that is written to the store, User B should receive an update that such an activity has taken place, either an instruction to poll new data or just the new event object.
I do not want to use the Firebase Realtime DB, as it is a requirement to use MongoDB. However, since Firebase can hook into events on Google Cloud Functions, I believe I should still be able to trigger this using events?
Is this correct? So far in Firebase I can only find triggers around the Realtime DB.
I can sort of achieve this with FCM; however, it feels like this is more aimed at giving the user notifications, as it requires the user to accept them in the browser, whereas I want to notify the app itself, not the user.
I do not want to use the Firebase Realtime DB, as it is a requirement to use MongoDB. However, since Firebase can hook into events on Google Cloud Functions, I believe I should still be able to trigger this using events?
Yes, but
Using only Firebase Storage triggers is a bit tricky, as Firebase gives you a single bucket to work with, so if there is a change in Firebase Storage the event will be triggered regardless of which folder the file is stored in.
I can sort of achieve this with FCM; however, it feels like this is more aimed at giving the user notifications, as it requires the user to accept them in the browser, whereas I want to notify the app itself, not the user.
Yes, you can use FCM for it, but you will need to store the FCM token of every device in some kind of database (if you don't want to store them in the Firebase database) in order to send messages to users.
I want to notify the app itself, not the user.
You can pass the FCM payload through data instead of notification; it will be received in onMessageReceived, and then you can decide what you want to do with it.
For example:
var payload = {
  data: {
    score: "850",
    time: "2:45"
  }
};
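To tie it together, a minimal sketch of sending that data-only payload from the server with the Firebase Admin SDK; the pushUpdate function name and the way you look up the registration token are assumptions, since that depends on where you store the tokens:

// Sketch: send a data-only message (no "notification" key), so it is delivered
// to the app's onMessageReceived handler instead of the system notification tray.
const admin = require("firebase-admin");
admin.initializeApp();

async function pushUpdate(registrationToken) {
  const message = {
    token: registrationToken,
    data: {
      score: "850",
      time: "2:45",
    },
  };
  // send() resolves with the message ID on success.
  const id = await admin.messaging().send(message);
  console.log("Sent data message:", id);
}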