No Data appearing in Redshift from MongoDB using AWS DMS - mongodb

I am trying to test the AWS Database Migration Service (DMS) to fetch some data from MongoDB Atlas into a Redshift cluster. I have created the replication instance and both endpoints, successfully tested the connections, and created the replication task. The task loads and completes normally, but no data appears in Redshift, just empty tables.
Can anybody please help with what the reason might be?

OK, so first try to debug it: modify the task, enable CloudWatch logs, and run the task again, then look at the logs. Also enable the data assessment task; you will need to add an IAM role for that. Some mistakes I made: the MongoDB endpoint needs an atlasAdmin login, no other role will work, tell me if that did not help. Also make sure every task has the correct database name, capital letters matter. The same thing happened to me: my endpoint had the database name in lowercase and my task had it capitalized, so the task was "successful" but moved no data. Lastly, and most important, make sure your MongoDB peering connection with AWS is configured correctly. I'm fairly sure that when you look at the logs there will be an unauthorized error.
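The database-name casing issue shows up in the task's table mappings, where the schema-name has to match the source database exactly. As a minimal sketch (the rule name and the database name here are placeholders for your own values), a selection rule would look like:
{
  "rules": [
    {
      "rule-type": "selection",
      "rule-id": "1",
      "rule-name": "include-all-collections",
      "object-locator": {
        "schema-name": "myDatabase",
        "table-name": "%"
      },
      "rule-action": "include"
    }
  ]
}
If the source database is actually "mydatabase" but the mapping says "myDatabase", DMS finds nothing to migrate yet still reports the task as complete, which matches the symptom above.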

Related

Error creating Glue connection with DocumentDB

I have created a connection in Glue with a DocumentDB cluster. The cluster is running and I can connect from my laptop and also from AWS athena to run Athena queries over it. The connection URL in Glue follows this format:
mongodb://host:27017/database
In the connection creation I have tried enabling and disabling the SSL connection option:
I have also disabled TLS on the cluster and rebooted the database. Every time I test the connection from Glue I get this error:
Check that your connection definition references your Mongo database with correct URL syntax, username, and password.
Exiting with error code 30
I have also tried setting the user and password in the URL, but I get the same error.
How can I solve this?
Thanks!!!
First of all, does the "database" actually exist in the DocumentDB cluster? Make sure you select the right VPC for Glue; it has to be the same one DocumentDB is in. When using the Test Connection option, one of the security groups has to have an allow-all rule, or the source of your inbound rule can be restricted to that same security group.
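For example, a self-referencing rule of that kind can be added from the AWS CLI like this (the security group ID is a placeholder; Glue's Test Connection generally wants all TCP ports open within the group):
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 0-65535 --source-group sg-0123456789abcdef0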
This blog post has some good info on how to setup a Glue connection to MongoDB/DocumentDB.
I have solved the problem. Disabling TLS both on DocumentDB and in the Glue connection works. I still have to find a way to make it work with TLS enabled.

CloudRun Suddenly got `Improper path /cloudsql/{SQL_CONNECTION_NAME} to connect to Postgres Cloud SQL instance "{SQL_CONNECTION_NAME}"`

We have been running a service using NestJS and TypeORM on fully managed Cloud Run without issues for several months. Yesterday afternoon we started getting Improper path /cloudsql/{SQL_CONNECTION_NAME} to connect to Postgres Cloud SQL instance "{SQL_CONNECTION_NAME}" errors in our logs.
We didn't make any server/SQL changes around this timestamp. Currently there is no impact to the service so we are not sure if this is a serious issue.
This error is not from our code, and our third-party modules shouldn't know whether we use Cloud SQL, so I have no idea where these errors come from.
My assumption is Cloud SQL Proxy or any SQL client used in Cloud Run is making this error. We use --add-cloudsql-instances flag when deploying with "gcloud run deploy" CLI command.
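For reference, the deploy command looks roughly like this (service name, image, region, and instance connection name are placeholders):
gcloud run deploy my-service --image gcr.io/my-project/my-image --region us-central1 --add-cloudsql-instances my-project:us-central1:my-instance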
Link to the issue here
This log was recently added in the Cloud Run data path to provide more context for debugging CloudSQL connectivity issues. However, the original logic was overly aggressive, emitting this message even for properly working CloudSQL connections. Your application is working correctly and should not receive this warning.
Thank you for reporting this issue. The fix is ready and should roll out soon. You should not see this message anymore after the fix is out.

Unable to connect from BigQuery job to Cloud SQL Postgres

I am not able to use the federated query capability from Google BigQuery to Google Cloud SQL Postgres. Google recently announced this federated query capability for BigQuery in beta.
I use the EXTERNAL_QUERY statement as described in the documentation, but I am not able to connect to my Cloud SQL instance. For example, with the query
SELECT * FROM EXTERNAL_QUERY('my-project.europe-north1.my-connection', 'SELECT * FROM mytable;');
or
SELECT * FROM EXTERNAL_QUERY("my-project.europe-north1.pg1", "SELECT * FROM INFORMATION_SCHEMA.TABLES;");
I receive this error:
Invalid table-valued function EXTERNAL_QUERY Connection to PostgreSQL server failed: server closed the connection unexpectedly This probably means the server terminated abnormally before or while processing the request.
Sometimes the error is this:
Error encountered during execution. Retrying may solve the problem.
I have followed the instructions on the page https://cloud.google.com/bigquery/docs/cloud-sql-federated-queries and enabled the BigQuery Connection API. Some documents use different quotation marks for EXTERNAL_QUERY (“ or ‘ or ‘’’) but all the variants end with the same result.
I cannot see any errors in the Stackdriver Postgres logs. How can I fix this connectivity error? Any suggestions on how to debug it further?
Just adding another possibility for people using private-IP-only Cloud SQL instances. I've just encountered this and was wondering why it was still not working after making sure everything else looked right. According to the docs (as of 2021-11-13): "BigQuery Cloud SQL federation only supports Cloud SQL instances with public IP connectivity. Please configure public IP connectivity for your Cloud SQL instance."
I just tried and it works, as long as the BigQuery query runs in the EU (as of today, 6 October, it works).
My example:
SELECT * FROM EXTERNAL_QUERY("projects/xxxxx-xxxxxx/locations/europe-west1/connections/xxxxxx", "SELECT * FROM data.datos_ingresos_netos")
Just substitute the first xxxxs with your project ID and the last ones with the name you gave the connection in the BigQuery interface (not the Cloud SQL info; that goes into the inner query).
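If you prefer to create the connection from the CLI instead of the BigQuery UI, the command is roughly the following (project, location, instance connection name, database, and credentials are all placeholders):
bq mk --connection --location=europe-west1 --project_id=my-project --connection_type=CLOUD_SQL --properties='{"instanceId":"my-project:europe-west1:my-instance","database":"mydb","type":"POSTGRES"}' --connection_credential='{"username":"myuser","password":"mypassword"}' my-connection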
Unfortunately, BigQuery federated queries to Cloud SQL currently work only in US regions (September 2019). The documentation (https://cloud.google.com/bigquery/docs/cloud-sql-federated-queries) says it should also work in other regions, but this is not the case.
I tested the setup from the original question multiple times in EU and europe-north1 but was not able to get it working. When I changed the setup to US or us-central1, it worked!
Federated queries to Cloud SQL are in preview so the feature is evolving. Let's hope Google gets this working in other regions soon.
The BigQuery dataset and the Cloud SQL instance must be in the same region, or in the same location if the dataset is in a multi-region location such as US or EU.
Double-check this against the Known issues listed in the documentation.
server closed the connection unexpectedly This probably means the server terminated abnormally before or while processing the request.
This message usually means that you will find more information in the logs of the remote database server. No useful information could be sent to the "client" server because the connection disappeared, so there was no way to send it. So you have to look in the remote server's logs.

Connecting to Database (Postgres AWS RDS)

I am following the tutorial on how to set up and connect a PostgreSQL server on AWS found HERE
When I try to sign in on workbench, I get this message:
At first I thought it was because my DB instance was not available yet, so I waited until it finished backing up. That did not seem to help, as I am still getting this message. I would appreciate assistance with this.
Have you created a security group and allowed the database connection port?
From the docs:
VPC Security Group(s): Select Create New Security Group. This will create a security group that allows connections from the IP address of the device you are currently using to the database created.
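If you end up editing the security group yourself, the equivalent inbound rule can be added from the AWS CLI (the group ID and IP address are placeholders; 5432 is the default PostgreSQL port):
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 5432 --cidr 203.0.113.25/32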

Not able to migrate the data from Parse to local machine

As some of you might be aware, the Parse service is shutting down in about a year, so I am following the migration process as per their tutorials. However, I am not able to migrate the data from Parse to a local database (i.e. MongoDB).
I've started the MongoDB instance locally on port 27017, and also created an admin user as part of the migration based on these tutorials: Reference-1 & Reference-2.
But when I try to migrate the data from the Parse developer console, I get No Reachable Servers or Network Error and I don't understand why. I suspect the connection string I am using, but I am not sure; please see the following image.
I am new to MongoDB so I don't have much idea about this; any help would be greatly appreciated.
Since the migration tool runs at parse.com, the tool needs to be able to access your MongoDB instance over the Internet.
Since you're using a local IP (192.168.1.101), parse.com cannot connect to your IP and the transfer will time out.
Either you need to make your MongoDB reachable from the Internet, or you can - as they do in their guide - use an existing MongoDB service.
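Once the instance is reachable from the Internet, the connection string entered in the Parse migration dialog would look something like this (hostname, credentials, and database name are placeholders; the user must have access to the target database):
mongodb://migrationuser:yourpassword@mongo.example.com:27017/parsedb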