Run transactions - AppSync Aurora Serverless Resolver RDS - aws-appsync

I'm using AppSync with Aurora Serverless resolvers. I would like to know if it is possible to run transactions within the RDS resolvers.
I know I could run transactions through the Data API, but only from a Lambda function resolver.

You could write a stored procedure that contains the transaction logic and call it from your AppSync resolver.
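For illustration, here is a minimal boto3 sketch of calling such a procedure through the RDS Data API; the ARNs and the transfer_funds procedure are placeholders, and the procedure itself would wrap its statements in START TRANSACTION ... COMMIT:

```python
# Hypothetical sketch: the stored procedure (not shown) owns the transaction;
# the caller just invokes it through the Data API.
import boto3

client = boto3.client("rds-data")

client.execute_statement(
    resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:my-cluster",        # placeholder
    secretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:db-creds",  # placeholder
    database="mydb",
    sql="CALL transfer_funds(:from_id, :to_id, :amount)",  # placeholder procedure
    parameters=[
        {"name": "from_id", "value": {"longValue": 1}},
        {"name": "to_id", "value": {"longValue": 2}},
        {"name": "amount", "value": {"doubleValue": 25.0}},
    ],
)
```

If you do go the Lambda resolver route instead, the Data API also exposes begin_transaction, execute_statement (with a transactionId), and commit_transaction for explicit transactions.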

Related

Alert about DB creation on RDS/Aurora PostgreSQL

I have some Aurora PostgreSQL clusters in our AWS account. Because of some access issues (which we are already working on), several people in other teams create random DBs on these clusters, and we then have to clean them up.
I wanted to check if there is a way to get alerted (via SNS notifications, etc.) whenever a new DB is created on these Aurora PostgreSQL clusters, using AWS tooling itself.
Thanks
You could do it with AWS Aurora Database Activity Streams: they capture all database activity and send it to an Amazon Kinesis data stream. You could then create an AWS Lambda function that reads the Kinesis stream, identifies the events you need (e.g. CREATE DATABASE), and finally publishes a notification to Amazon SNS.
Another option is to enable pgAudit on your Aurora PostgreSQL cluster, send the logs to Amazon CloudWatch, and create a Lambda function that reads the events from CloudWatch and publishes an SNS notification.
You can find a step-by-step walkthrough in this AWS blog post:
Part 2: Audit Aurora PostgreSQL databases using Database Activity Streams and pgAudit
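A minimal sketch of the pgAudit/CloudWatch route, assuming a CloudWatch Logs subscription filter on the cluster's log group invokes this Lambda; the topic ARN and the CREATE DATABASE string match are assumptions to adapt:

```python
# Sketch: decode a CloudWatch Logs subscription payload and alert on
# CREATE DATABASE statements via SNS.
import base64
import gzip
import json
import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:db-creation-alerts"  # placeholder

def handler(event, context):
    # CloudWatch Logs delivers subscription data base64-encoded and gzipped.
    payload = json.loads(gzip.decompress(base64.b64decode(event["awslogs"]["data"])))
    for log_event in payload["logEvents"]:
        if "CREATE DATABASE" in log_event["message"].upper():
            sns.publish(
                TopicArn=TOPIC_ARN,
                Subject="New database created on Aurora PostgreSQL",
                Message=log_event["message"],
            )
```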

Are there any database administration tools that run in AWS Lambda?

Are there any tools for database administration that can be deployed in AWS Lambda? My use case: I have Aurora Serverless running inside a VPC, and I want an AWS Lambda function to be able to view, clear, and delete data so developers do not need to go through a bastion host every time they need to clear a row.
There is the Data API for Aurora Serverless, which lets you use the regular AWS SDK (e.g. boto3) to query your Aurora Serverless databases.
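As a minimal sketch (the ARNs, database name, and query are placeholders), a Lambda or script can query the cluster through the Data API without any VPC connectivity of its own:

```python
# Sketch: run an ad-hoc query against Aurora Serverless via the Data API.
import boto3

client = boto3.client("rds-data")

result = client.execute_statement(
    resourceArn="arn:aws:rds:us-east-1:123456789012:cluster:my-cluster",        # placeholder
    secretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:db-creds",  # placeholder
    database="mydb",
    sql="SELECT id, name FROM users LIMIT 10",
)
print(result["records"])
```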

Can I configure AWS RDS to only stream INSERT operations to AWS DMS?

My requirement is to stream only INSERTs on a specific table in my db to a Kinesis data stream.
I have configured this pipeline in my AWS environment:
RDS Postgres 13 -> DMS (Database Migration Service) -> KDS (Kinesis Data Stream)
This setup works correctly but it processes all changes, even UPDATEs and DELETEs, on my source table.
What I've tried:
Looking for config options in the Postgres logical decoding plugin. DMS uses the test_decoding PG plugin which does not accept options to include/exclude data changes by operation type.
Looking at the DMS selection and filtering rules. Still didn't see anything that might help.
Of course I could simply ignore records originated from non-INSERT operations in my Kinesis consumer, but this doesn't look like a cost-efficient implementation.
Is there any way to meet my requirements using these AWS services (RDS -> DMS -> Kinesis)?
DMS does not have this capability.
If you want only INSERTs sent to Kinesis, you can instead configure a trigger on the table that invokes a Lambda function on every INSERT and have that function write to Kinesis directly.
Cost-wise this is also cheaper: with DMS you pay for the replication instance even when it is not in use.
For a detailed reference, see the AWS blog post Stream changes from Amazon RDS for PostgreSQL using Amazon Kinesis Data Streams and AWS Lambda; a sketch of the Lambda side follows.
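A minimal sketch of that Lambda, assuming the database trigger (for example via the aws_lambda extension on RDS for PostgreSQL) passes the inserted row as the event payload; the stream name and payload shape are assumptions:

```python
# Sketch: forward a row inserted in RDS to a Kinesis data stream.
import json
import boto3

kinesis = boto3.client("kinesis")

def handler(event, context):
    # `event` is assumed to be the row payload passed by the trigger.
    kinesis.put_record(
        StreamName="my-insert-stream",  # placeholder
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event.get("id", "default")),
    )
```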

How to set up tables of AWS Aurora MySQL using AWS CloudFormation or AWS CDK?

How can I set up the tables of an AWS Aurora MySQL database using AWS CloudFormation or AWS CDK?
In my setup I have a serverless app using Lambda for various microservices. The database is a serverless Aurora MySQL database. To provision the AWS infrastructure I use AWS CDK, and I would like to set up the database schema with a migration tool like Liquibase or Sequelize.
At the moment I am using a separate Lambda function that executes Liquibase to apply the DB changes, but I have to invoke it manually after the CDK deployment has succeeded.
An execution triggered automatically after the CloudFormation stack (CDK stack) completes would be optimal. I would like to avoid a CI/CD stack via CodePipeline.
Does anyone have a best practice for setting up the database at provisioning time?
CloudWatch rules
CloudWatch rules based on CloudFormation events can be used to route stack events to a processing Lambda, and the rules can be part of the CDK deployment description (see the sketch below).
The triggered function can then execute Liquibase, Flyway, Sequelize, or something else to spin up or change the DB.
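A rough CDK (v2, Python) sketch of this option; the CloudFormation stack status event pattern and the Lambda asset path are assumptions to verify against your environment:

```python
# Sketch: route CloudFormation stack-completion events to a migration Lambda.
from aws_cdk import Stack, aws_events as events, aws_events_targets as targets, aws_lambda as lambda_
from constructs import Construct

class MigrationTriggerStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        migrate_fn = lambda_.Function(
            self, "MigrateFn",
            runtime=lambda_.Runtime.PYTHON_3_9,
            handler="index.handler",
            code=lambda_.Code.from_asset("lambda"),  # placeholder asset directory
        )

        events.Rule(
            self, "StackCompleteRule",
            event_pattern=events.EventPattern(
                source=["aws.cloudformation"],
                detail_type=["CloudFormation Stack Status Change"],  # assumed pattern
                detail={"status-details": {"status": ["CREATE_COMPLETE", "UPDATE_COMPLETE"]}},
            ),
            targets=[targets.LambdaFunction(migrate_fn)],
        )
```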
---- or ----
CloudFormation custom resource
An AWS CloudFormation custom resource can execute a Lambda function during the CloudFormation lifecycle.
Again, the triggered function can then execute Liquibase, Flyway, Sequelize, or something else to spin up or change the DB (see the handler sketch below).
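Here is a minimal handler sketch for that option, assuming the function code is provided inline in the template (which is what makes AWS's cfnresponse helper available) and using Liquibase as a placeholder for your migration tool:

```python
# Sketch: a CloudFormation custom resource handler that runs DB migrations
# at deploy time and reports the outcome back to CloudFormation.
import subprocess
import cfnresponse  # bundled by CloudFormation for inline (ZipFile) Lambda code

def handler(event, context):
    try:
        if event["RequestType"] in ("Create", "Update"):
            # Placeholder: invoke your migration tool of choice.
            subprocess.run(["liquibase", "update"], check=True)
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
    except Exception as exc:
        cfnresponse.send(event, context, cfnresponse.FAILED, {"Error": str(exc)})
```

Remember to respond on Delete events too (as above), or stack deletions will hang waiting for the custom resource.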
I use CloudFormation custom resources for running database migrations and initial database setup scripts at deployment time.
This is the recommended way to run DB migrations for serverless applications if you don't want to rely on a CI/CD pipeline to do it for you.
Here's a well-written blog post by Alex DeBrie about CloudFormation custom resources: https://www.alexdebrie.com/posts/cloudformation-custom-resources/

AWS DMS - Scheduled DB Migration

I have a PostgreSQL DB in RDS. I need to fetch data from a bunch of tables in the PostgreSQL DB and push it to an S3 bucket every hour, and I only want the delta changes (any new inserts/updates) to be sent in each hourly run. Is it possible to do this using DMS, or is EMR a better tool for this activity?
You can automate migrating data from RDS to S3 using AWS DMS (Database Migration Service) tasks:
Create a source endpoint (reading from your RDS database: PostgreSQL, MySQL, Oracle, etc.);
Create a target endpoint using S3 as the endpoint engine (see Using Amazon S3 as a Target for AWS Database Migration Service);
Create a replication instance, responsible for bridging the source data and the target endpoint (you only pay while it is processing);
Create a database migration task with the 'Replicate data changes only' migration type;
Create a cron Lambda that starts the DMS task (sketch below), following the articles Lambda with scheduled events and Start DMS tasks with boto3 in Python.
Connecting the resources above, you should have what you want.
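A minimal sketch of that cron Lambda, assuming an hourly EventBridge/CloudWatch Events schedule invokes it; the task ARN is a placeholder:

```python
# Sketch: resume an existing DMS CDC task on a schedule.
import boto3

dms = boto3.client("dms")

def handler(event, context):
    dms.start_replication_task(
        ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:EXAMPLE",  # placeholder
        StartReplicationTaskType="resume-processing",  # continue CDC from the last checkpoint
    )
```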
Regards,
Renan S.