I'm trying to migrate data from MongoDB Cloud (Atlas) to Amazon Redshift, and I was planning to use AWS DMS for the migration. However, I am having problems setting up MongoDB Cloud as a source endpoint.
I get the following error:
Test Endpoint failed: Application-Status: 1020912, Application-Message: Failed to create new client connection Failed to connect to database., Application-Detailed-Message: Error verifying connection: 'No suitable servers found (serverSelectionTryOnce set): [Failed to resolve 'development-izqpz.mongodb.net'] [connection refused calling ismaster on '27017:27017']' Failed to connect to database.
I was following this AWS tutorial, https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Source.MongoDB.html, but I think I am not setting up the server correctly.
Create a VPC peering connection from your MongoDB Atlas network and add it to your AWS VPC.
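Separately from peering, note the hostname in the error: a DMS source endpoint takes a plain server name and port, not a `mongodb+srv://` Atlas URI, so an SRV hostname like `development-izqpz.mongodb.net` will not resolve for DMS. A minimal sketch (hypothetical helper, Python stdlib only) of splitting a standard, non-SRV connection string into the fields a DMS MongoDB source endpoint expects:

```python
from urllib.parse import urlparse

def dms_endpoint_fields(uri):
    """Split a standard (non-SRV) MongoDB URI into the ServerName,
    Port, and DatabaseName fields a DMS source endpoint expects.
    SRV URIs carry no port; DMS needs a concrete member hostname
    and port (27017 by default) instead."""
    parsed = urlparse(uri)
    return {
        "ServerName": parsed.hostname,
        "Port": parsed.port or 27017,
        "DatabaseName": parsed.path.lstrip("/") or None,
    }

# Hypothetical shard member hostname, not taken from the original post
print(dms_endpoint_fields("mongodb://shard-00-00.example.mongodb.net:27017/admin"))
```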
I am trying to connect to a MySQL database via Google Cloud Data Fusion Wrangler. It's a very standard connection, and I can connect and perform every action with another tool (DBeaver). However, when trying to connect with Wrangler, I am getting this message:
A server error occurred when testing the connection. Error: Exception occurred while handling request: string
Any suggestions? Thanks!
Please follow the instructions on https://cloud.google.com/data-fusion/docs/how-to/using-jdbc-drivers
Once done, you will be able to select the uploaded driver from the dropdown.
For your requirement, you have to use the JDBC driver to connect to the MySQL database. You can use the public IP to set up the connection in Wrangler.
Following are the steps to connect to a MySQL database using Wrangler:

1. Go to Hub in the Cloud Data Fusion instance and select the required JDBC driver.
2. Download the driver and deploy it.
3. Go to the Wrangler page and click on Add Connection if you are connecting to MySQL for the first time.
4. Select the MySQL database from the list of databases provided in Wrangler.
5. Add a name and the JDBC driver, along with the username, password, and other connection arguments.
6. Click on Test Connection to check the connectivity.
For more details you can check this link.
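The name, credentials, and connection arguments in those steps ultimately form a JDBC URL. A hedged sketch of how one is assembled for MySQL (the host, database, and parameters below are placeholders, not values from the original post):

```python
def mysql_jdbc_url(host, port=3306, database="", params=None):
    """Build a MySQL JDBC URL of the general form
    jdbc:mysql://host:port/db?key=value, as used when
    registering a MySQL connection with a JDBC driver."""
    url = f"jdbc:mysql://{host}:{port}/{database}"
    if params:
        url += "?" + "&".join(f"{k}={v}" for k, v in params.items())
    return url

# Placeholder public IP and database name
print(mysql_jdbc_url("203.0.113.10", 3306, "sales", {"useSSL": "true"}))
```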
I am trying to create a linked server between the warehouse and an Amazon cloud service.
The service provider is using a PostgreSQL database.
I have installed the ODBC driver (12.10) on my server, but I keep getting this error.
I am not sure how to work around this as I have never used Postgres before.
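For reference, a DSN-less ODBC connection string for PostgreSQL looks roughly like the sketch below. The driver name and all field values here are assumptions; check the exact name your installed driver registers under in the ODBC Data Source Administrator:

```python
def pg_odbc_conn_str(server, database, uid, pwd, port=5432):
    """Assemble a DSN-less ODBC connection string for PostgreSQL.
    The driver name 'PostgreSQL Unicode' is an assumption; it must
    match what the installed ODBC driver registers itself as."""
    return (
        "Driver={PostgreSQL Unicode};"
        f"Server={server};Port={port};Database={database};"
        f"Uid={uid};Pwd={pwd};"
    )

# Placeholder host and credentials
print(pg_odbc_conn_str("db.example.com", "warehouse", "report_user", "secret"))
```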
I'm trying to create an ODBC connection with a remote mongoDB in order to connect MS Power BI with this mongo via ODBC.
By reading the MongoDB BI Connector documentation (https://docs.mongodb.com/bi-connector/current/), specifically the Hosted Database and On-Premises BI Connector section, I am trying to connect to the remote MongoDB with the --mongo-uri option when starting the BI Connector's mongosqld process.
Here is the error I am receiving on this connection attempt:
unable to load mongodb information: failed to create admin session for loading server cluster information... socket was unexpectedly closed: EOF
Does anybody know what I am missing in this connection?
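For context, a typical mongosqld invocation against a hosted cluster can be sketched as the argument vector below. The URI is a placeholder, and the --mongo-ssl flag is an assumption for Atlas-style deployments that only accept TLS; an abrupt "socket was unexpectedly closed: EOF" is consistent with connecting without TLS to a server that requires it:

```python
def mongosqld_command(mongo_uri, listen_addr="127.0.0.1:3307", use_tls=True):
    """Build the argument vector for launching mongosqld against a
    remote cluster (e.g. via subprocess.run). --mongo-ssl is assumed
    to be needed for hosted deployments that require TLS."""
    args = ["mongosqld", "--mongo-uri", mongo_uri, "--addr", listen_addr]
    if use_tls:
        args.append("--mongo-ssl")
    return args

# Placeholder member hostname, not from the original post
print(mongosqld_command("mongodb://cluster0-shard-00-00.example.mongodb.net:27017"))
```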
I am trying to copy data from a MySQL database to Azure SQL Server but I am getting a timeout error:
Operation on target Copy MyTable failed:
ErrorCode=UserErrorFailedToConnectOdbcSource,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [08001] [Microsoft][MySQL] (1004) The connection has timed out while connecting to server: xxxxx.xxxxxx.us-west-2.rds.amazonaws.com at port: 1234.,Source=Microsoft.DataTransfer.Runtime.GenericOdbcConnectors,''Type=System.Data.Odbc.OdbcException,Message=ERROR [08001] [Microsoft][MySQL] (1004) The connection has timed out while connecting to server: xxxxx.xxxxxx.us-west-2.rds.amazonaws.com at port: 1234.,Source=,'
I can preview data while looking at the source of my Copy Data task. There is no timeout. I see all of the rows and columns. I even changed the query to limit the results to 2 rows:
SELECT mytable.id, mytable.name FROM myschema.mytable LIMIT 2;
However, when I publish the pipeline and trigger it to run I get the timeout error. How can I resolve the timeout using Azure Data Factory (ADF) when connecting to MySQL?
The error message was not the most helpful, but I discovered what the problem was: the IP addresses used by ADF had to be added to the "Outbound IP" allow list for the AWS MySQL instance. Everything started working once I updated the outbound IP address list.
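The fix above amounts to an IP allow list on the AWS side: traffic from ADF's integration runtime is only admitted if its source IP falls inside an allowed CIDR range. A small sketch of that check using the stdlib ipaddress module (the ADF IP and the CIDR ranges below are made up for illustration):

```python
import ipaddress

def is_allowed(caller_ip, cidr_allow_list):
    """Return True if caller_ip falls inside any CIDR block in the
    allow list, mirroring what an inbound allow-list rule does
    before a connection can reach the database."""
    addr = ipaddress.ip_address(caller_ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in cidr_allow_list)

# Hypothetical ADF runtime IP against a hypothetical rule set
print(is_allowed("13.66.0.10", ["13.66.0.0/17", "40.78.0.0/18"]))
```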
I have created a Redshift endpoint in the AWS DMS service.
When I run the test connection I get the following error message:
Error Details: [errType=CALL_SERVER_ERROR, status=0, errMessage=Failed executing command on Replication Server, errDetails=]
Both the DMS replication instance and Redshift cluster are at the same region and in the same VPC.