Restoring MongoDB from an archived dump on an S3 server - mongodb

I have MongoDB deployed on OpenShift, and I want to restore its data from an S3 bucket. Is there a way to do this directly, or do I need to download the data from S3 first and then run the mongorestore command?
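If the backup was created with mongodump --archive, one option is to stream it straight from S3 into mongorestore without saving it to disk first, since the AWS CLI can write an object to stdout and mongorestore can read an archive from stdin. A minimal sketch, assuming the AWS CLI is configured and that the bucket path and connection URI below are placeholders:

    #!/usr/bin/env bash
    # Stream an archived dump from S3 directly into mongorestore.
    # Assumes the dump was made with `mongodump --archive --gzip`.
    # The bucket path and connection URI are hypothetical placeholders.
    set -euo pipefail

    S3_ARCHIVE="s3://my-backup-bucket/mongodb/dump.archive.gz"
    MONGO_URI="mongodb://user:password@mongodb.myproject.svc:27017"

    # `aws s3 cp <object> -` writes the object to stdout; with --archive and
    # no filename, mongorestore reads the archive from stdin.
    aws s3 cp "$S3_ARCHIVE" - | mongorestore --uri="$MONGO_URI" --archive --gzip --drop

On OpenShift you could run this from a pod or sidecar that has network access to the database, which avoids staging the dump on local storage at all.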

Related

MongoDB Atlas Sync to Local disk and archive

I have a MongoDB Atlas cloud database,
and I want to "sync" it to a local server instance running mongod.
I have written an automated backup script that backs up a website; it also does a mongodump to create an archive file from the (local) MongoDB, and everything then gets uploaded to an AWS bucket.
It's been working great, but I just realized that it's getting the local disk's mongo data, and not the "live" data on the Mongo Atlas cloud.
Is there a way mongodump can dump the MongoDB Atlas stuff to the local disk?
I hope there is an easier way than running a "find" on every individual Atlas collection in my database and "updating" each one to the local disk.
I was successfully able to get a dump from Atlas to my local server using mongodump.
mongodump --forceTableScan --uri="mongodb+srv://<username>:<password>@yourmongoserver.something.mongodb.net/<database name>"
Note this failed until I included --forceTableScan, after which it was seemingly successful.
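To turn this into the "sync" the question asks for, the same dump can also be restored into the local mongod and archived to S3 in one script. A sketch, with placeholder credentials, bucket, and database names:

    #!/usr/bin/env bash
    # Dump the live Atlas database, restore it into the local mongod,
    # and keep a gzipped archive copy in S3. All names are placeholders.
    set -euo pipefail

    ATLAS_URI="mongodb+srv://<username>:<password>@yourmongoserver.something.mongodb.net/<database name>"
    ARCHIVE="atlas-$(date +%F).archive.gz"

    # Dump straight from Atlas into a single gzipped archive file.
    mongodump --forceTableScan --uri="$ATLAS_URI" --archive="$ARCHIVE" --gzip

    # Restore into the local mongod, dropping existing collections first.
    mongorestore --uri="mongodb://localhost:27017" --archive="$ARCHIVE" --gzip --drop

    # Ship the archive to the existing backup bucket.
    aws s3 cp "$ARCHIVE" "s3://my-backup-bucket/mongodb/$ARCHIVE"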

How to restore exported RDS snapshot from S3 to RDS cluster

I have an AWS RDS Aurora PostgreSQL cluster (compatible with PostgreSQL 13.4).
I successfully followed this tutorial to export my PostgreSQL RDS Aurora cluster snapshot to S3, and it seems that all the data is backed up to S3.
Now I'm trying to restore the exported snapshot from S3 to a PostgreSQL RDS cluster, and I couldn't find an explanation of how to do it.
Any idea how to do it? Maybe I need to first restore the exported data from S3 into a snapshot and then connect it to RDS, or is there some other way?
The RDS Snapshot to S3 export feature is not intended for additional backups of your data. It is intended to convert your data to Parquet for use in analytics tools like Redshift or Athena. Some data type conversion happens during this export process.
There is currently no method available to import these Parquet files back into RDS. You would have to write some code yourself to read the Parquet files and insert the data back into a running RDS instance if you needed that.
If you just want a secondary backup of your RDS instance in addition to the RDS snapshots, you could look into cross-region or cross-account copies of your RDS snapshots, or into using the AWS Backup service.
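For the cross-region copy suggestion, the AWS CLI can do this directly; a minimal sketch with placeholder identifiers, account ID, and regions:

    #!/usr/bin/env bash
    # Copy an Aurora cluster snapshot to another region as a secondary backup.
    # The ARN, identifiers, and regions below are placeholders.
    set -euo pipefail

    aws rds copy-db-cluster-snapshot \
      --source-db-cluster-snapshot-identifier \
          "arn:aws:rds:us-east-1:123456789012:cluster-snapshot:my-cluster-snapshot" \
      --target-db-cluster-snapshot-identifier "my-cluster-snapshot-copy" \
      --source-region us-east-1 \
      --region eu-west-1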

Binary backup of AWS RDS PostgreSQL

I am looking for a way to do a regular binary backup of my AWS RDS PostgreSQL database so I can use the copy locally and mount it in my Docker postgis container. As far as I understand, I cannot connect to the RDS host and do pg_basebackup.
So what is the best way to do this task?
The best workflow would be: an automatic daily binary backup stored in AWS S3, then locally a shell script that downloads the files to a folder mounted into the postgis container.
Any ideas if that is possible?
One additional requirement: I would like to exclude specific tables from the backup.
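Since RDS does not allow pg_basebackup, a logical dump is the usual fallback; it is not a binary copy, but pg_dump's --exclude-table flag does cover the table-exclusion requirement. A sketch of the daily job, with placeholder host, bucket, and table names:

    #!/usr/bin/env bash
    # Daily job (cron on a small EC2 instance or similar): logical dump of
    # the RDS database, excluding specific tables, then upload to S3.
    # Host, credentials, bucket, and table names are placeholders.
    set -euo pipefail

    DUMP="backup-$(date +%F).dump"

    pg_dump "host=mydb.xxxx.eu-central-1.rds.amazonaws.com user=postgres dbname=gisdb" \
      --format=custom \
      --exclude-table=big_log_table \
      --exclude-table='staging_*' \
      --file="$DUMP"

    aws s3 cp "$DUMP" "s3://my-backup-bucket/postgres/$DUMP"

Locally, the download half of the workflow would fetch the latest dump with aws s3 cp and load it into the postgis container with pg_restore --clean, rather than mounting the files directly, since a logical dump cannot simply be dropped into the data directory.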

Is there any approach to migrate a PostgreSQL database from Azure to AWS RDS PostgreSQL

I was able to migrate an on-prem database to AWS using the DMS service, but I couldn't migrate a database from Azure to AWS.
Is there any better approach?
A pretty low-level but effective way would be to: export your Azure data as CSV (note that pg_dump does not emit CSV; psql's \copy command does),
copy it into an AWS S3 bucket (pretty easy with the Python awscli package), and load it from S3 into RDS Postgres.
It would be great if the files could be moved directly from blob storage to S3, but I don't think Azure Postgres supports dumping directly to blob storage anyway.
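A sketch of that three-step flow, assuming the aws_s3 extension is installed on the RDS instance (CREATE EXTENSION aws_s3 CASCADE;) with an IAM role attached that can read the bucket; all hosts, buckets, and table names are placeholders:

    #!/usr/bin/env bash
    # 1) Export a table from Azure Database for PostgreSQL as CSV,
    # 2) copy it to S3, 3) load it into RDS Postgres via the aws_s3 extension.
    # Hosts, credentials, bucket, and table names are placeholders.
    set -euo pipefail

    # 1) CSV export; psql's \copy streams the data through the client.
    psql "host=myserver.postgres.database.azure.com user=myadmin dbname=mydb" \
      -c "\copy mytable TO 'mytable.csv' WITH (FORMAT csv, HEADER)"

    # 2) Upload to S3.
    aws s3 cp mytable.csv s3://my-migration-bucket/mytable.csv

    # 3) Import on RDS via the aws_s3 extension.
    psql "host=mydb.xxxx.us-east-1.rds.amazonaws.com user=postgres dbname=mydb" \
      -c "SELECT aws_s3.table_import_from_s3('mytable', '', '(FORMAT csv, HEADER true)',
            aws_commons.create_s3_uri('my-migration-bucket', 'mytable.csv', 'us-east-1'));"

The target table has to exist on RDS first, so the schema would be created beforehand, e.g. with pg_dump --schema-only piped into psql.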

Get a big (250 GB) RDS PostgreSQL db dump onto my local machine

My problem is getting a big (250 GB) Postgres dump onto my local machine.
It's on AWS RDS. I tried to dump it to my local machine, but it takes too long, around 3+ days.
I'm trying to find a way to dump it into S3 and download it from there safely. Maybe you can suggest a more effective way to do that. I will appreciate any kind of help.
Thanks!
To my knowledge, AWS does not provide a way to back up a database directly into S3.
You can take a look at this question and its answers:
Export huge database from amazon RDS to local mysql
Here is one answer:
If the data is that big, I would suggest copying the RDS snapshot to S3, as explained here.
Link to documentation on copying a snapshot to S3
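For that route, the snapshot export can be started from the CLI; a minimal sketch with placeholder ARNs and names (keep in mind the output is Parquet, as noted in the RDS snapshot answer above, so it suits analytics more than a directly restorable dump):

    #!/usr/bin/env bash
    # Export an RDS snapshot to S3. All identifiers and ARNs are placeholders.
    set -euo pipefail

    aws rds start-export-task \
      --export-task-identifier my-snapshot-export \
      --source-arn "arn:aws:rds:us-east-1:123456789012:snapshot:my-db-snapshot" \
      --s3-bucket-name my-export-bucket \
      --iam-role-arn "arn:aws:iam::123456789012:role/rds-s3-export-role" \
      --kms-key-id "arn:aws:kms:us-east-1:123456789012:key/1234abcd-12ab-34cd-56ef-1234567890ab"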
This topic is covered in this Stack Overflow thread: Exporting a AWS Postgres RDS Table to AWS S3
Another solution would be to spin up an EC2 instance and dump the database to a local EBS volume that is large enough for the following steps. Then choose one of the following:
Compress the DB dump into multiple files and copy them to S3 for download (see the sketch below). Given the size of the database dump, I would use a smart S3 download manager.
Export the S3 data using AWS Snowball's S3 export feature. If your Internet connection is not fast or reliable enough, Snowball will get you the data.
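A sketch of the first option, with placeholder host, paths, and bucket; pg_dump's directory format compresses each table into its own file, which naturally yields the "multiple files" to upload:

    #!/usr/bin/env bash
    # On the EC2 instance: dump the RDS database to the EBS volume in
    # directory format (4 parallel jobs, per-file compression), then upload.
    # Host, credentials, paths, and bucket names are placeholders.
    set -euo pipefail

    pg_dump "host=mydb.xxxx.us-east-1.rds.amazonaws.com user=postgres dbname=mydb" \
      --format=directory --jobs=4 --compress=9 --file=/mnt/ebs/dump

    # Recursive upload; the AWS CLI does multipart transfers per file.
    aws s3 cp /mnt/ebs/dump "s3://my-dump-bucket/dump/" --recursive

Locally, the same aws s3 cp --recursive in the other direction brings the files down, and pg_restore --jobs=4 restores the directory-format dump in parallel.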