Can I use the ElephantSQL service as a target in Bluemix Data Connect? - ibm-cloud

I created an ElephantSQL service in Bluemix (plan: Tiny Turtle).
I created a connection to the ElephantSQL service in Data Connect, and it appears in my list of connections there. However, it does not appear as an option when I try to use "copy to target" / "set target". Is this a restriction in Data Connect?
Thanks

When you say that you created an ElephantSQL connection, I assume you selected the PostgreSQL connection type. Data Connect supports PostgreSQL only as a source, not a target: https://console.ng.bluemix.net/docs/services/dataworks1/dataworks_overview.html

Related

Azure: An attempt was made to access a socket in a way forbidden by its access permissions 10.0.0.4:3306

I created a MySQL database instance in North Europe, then a web app in the same region, published it to Azure from Visual Studio, and at startup I get an exception as in the screenshot. Everything works locally in Visual Studio using this connection string, but not in Azure itself. If I disable the DB instance, the exception is the same, so the application is not connecting to the DB at all. I have been struggling with this problem for 4 hours. The connection port for Azure Database for MySQL does not change. The security settings look fine to me. The local MySQL server that could theoretically occupy port 3306 is disabled.
connectionString="Server=testing-db-srv2.mysql.database.azure.com; Port=3306; Database=testing-sys-azure; Uid= ####testing-db-srv2; Pwd= ###; SslMode=Preferred;"
Please check your appsettings.json file for any API endpoints that might need to be updated now that your app is running on Azure. This is the most common cause of this type of error.
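
Independent of the app settings, a bare JDBC probe can help separate a network block from a configuration problem. Below is a minimal sketch, assuming MySQL Connector/J 8.x on the classpath; the host and database names are taken from the question's connection string, and the masked credentials must be replaced before running.

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class AzureMySqlProbe {
    public static void main(String[] args) {
        // Host and database from the question's connection string.
        String url = "jdbc:mysql://testing-db-srv2.mysql.database.azure.com:3306/"
                + "testing-sys-azure?sslMode=PREFERRED&connectTimeout=10000";
        Properties props = new Properties();
        props.setProperty("user", "####testing-db-srv2"); // masked in the question
        props.setProperty("password", "###");             // masked in the question
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        } catch (Exception e) {
            // A socket-level error here (rather than an authentication error)
            // points at networking/firewall rules, not application configuration.
            System.err.println("Connection failed: " + e.getMessage());
        }
    }
}

Run it both locally and from the Azure side; if it only fails in Azure, the problem is on the network side rather than in the connection string.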

Connection failure Postgresql on AWS-RDS Instance in a private network from PowerBI Desktop & Service

I have an AWS RDS (PostgreSQL) instance that is inside a private network, accessible only via a VPN and a bastion host.
I am able to establish a connection from Power BI Desktop to the PostgreSQL RDS instance by creating an SSH tunnel from my laptop (localhost) to the bastion host and connecting through the ODBC driver. With this approach, all the data is imported into Power BI Desktop (Import mode).
But our requirement is to connect through DirectQuery so that data is refreshed in real time and the reports are generated dynamically, which I am not able to do.
I entered the database credentials into Power BI Desktop, but it is not working; I get a timeout error.
I must use DirectQuery; I can't use Import.
Any help is appreciated.
The exact error that you are getting would help get to the root cause of the issue. However, a few basic troubleshooting steps that I'd suggest are:
Ensure that you have a compatible version of the Npgsql provider installed on your machine, such as Npgsql-4.0.9. At times the latest version causes issues.
Ensure that you remove the semicolon at the end of the query.
Once you get the query running successfully in the desktop version and publish it to the web version, the visuals will not be able to connect to the database unless an on-premises data gateway is set up. More details on setting up a data gateway to automatically refresh the dataset for the Power BI web version are here:
Refresh AWS RDS database from Power BI Web (once you are successfully able to query directly)
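
Since the instance is reachable only through the bastion host, it can also help to sanity-check the tunnel itself outside Power BI. The following is a sketch under assumptions: an SSH tunnel is already forwarding local port 5433 to the RDS endpoint's port 5432, the PostgreSQL JDBC driver is on the classpath, and the database name and credentials are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TunnelProbe {
    public static void main(String[] args) throws Exception {
        // Assumes an SSH tunnel forwards localhost:5433 to the RDS endpoint's port 5432.
        String url = "jdbc:postgresql://localhost:5433/mydb"; // hypothetical database name
        try (Connection conn = DriverManager.getConnection(url, "myuser", "mypassword");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT version()")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If this probe also times out, the timeout in Power BI Desktop is coming from the tunnel or network path, not from Power BI itself.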

Power BI connection problem with Postgresql

I'm using Power BI version 2.84 to connect to a PostgreSQL server. In PBI Desktop everything works fine: I can connect to the server, and import and refresh data smoothly.
However, when I publish it to the Power BI service, I can't refresh it anymore due to an 'encrypted connection' error. I have checked all of my connection settings and made sure they are not encrypted at all, but the problem is still there.
Please let me know if you have any solution for this.
Cheers
I assume you are using DirectQuery?
If you want to use DirectQuery, you will need to set up an On-premises data gateway:
On-premises data gateways
Then you should add a gateway cluster in the Power BI web version:
Data gateway
I think everything is quite straightforward here.
But do you need DirectQuery? If you are OK with refreshing your data a few times a day, you could set up an ODBC connection instead (when importing data, choose the ODBC option, not PostgreSQL).
You would need to set up an ODBC data source (Control Panel -> Administrative Tools -> Data Sources) and create a new one (download the PostgreSQL ODBC driver first if you have none).
Then you also need to create an On-premises data gateway and set up refresh intervals.
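
As a side note on the 'encrypted connection' error: whether the server actually accepts encrypted connections can be checked outside Power BI. Here is a minimal sketch assuming the PostgreSQL JDBC driver on the classpath and a hypothetical host and credentials; it tries sslmode=require and sslmode=disable in turn, and whichever succeeds is what the dataset's encryption setting has to match.

import java.sql.Connection;
import java.sql.DriverManager;

public class SslModeCheck {
    // Hypothetical host, database, and credentials for illustration only.
    private static final String BASE = "jdbc:postgresql://dbhost.example.com:5432/mydb";

    public static void main(String[] args) {
        for (String mode : new String[] {"require", "disable"}) {
            try (Connection conn = DriverManager.getConnection(
                    BASE + "?sslmode=" + mode, "myuser", "mypassword")) {
                System.out.println("sslmode=" + mode + " -> connected");
            } catch (Exception e) {
                System.out.println("sslmode=" + mode + " -> " + e.getMessage());
            }
        }
    }
}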

Connecting to Cloud SQL MySQL

We would like to test connecting Cloud SQL (MySQL) to BigQuery using Cloud Data Fusion. What is the proper way to connect to Cloud SQL, as that does not appear to be built in at this point in time? What driver is recommended, and are there any instructions available?
Here are instructions for using Cloud SQL for MySQL in Data Fusion. Note that in the Wrangler section, Cloud SQL instances with a private IP currently cannot be used. However, they can still be used when running Data Fusion pipelines.
Using Cloud SQL (MySQL) in Wrangler (Public IP only)
Obtain the JDBC Driver JAR file by building it using the instructions at https://github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory
Go to Wrangler
If this is the first time you are configuring CloudSQL for MySQL, click on the Add Connection button from the Wrangler screen and choose Database.
Click “Google Cloud SQL for MySQL.”
Upload the previously built JAR as illustrated, and click the Next button.
Click the Finish button to complete the upload.
Once the driver has been uploaded you will see a green check mark indicating that your driver has been installed.
Click Google Cloud SQL for MySQL to create a new connection. Once the connection modal opens, click the Advanced link if present.
Enter the connection string as shown below (see the standalone validation sketch after these steps):
jdbc:mysql://google/<database>?cloudSqlInstance=<instance-name>&socketFactory=com.google.cloud.sql.mysql.SocketFactory&useSSL=false
where <database> is the database you created in the prerequisites section, and <instance-name> is your instance connection name as displayed in the Overview tab of the instance details page. For example:
jdbc:mysql://google/mysql?cloudSqlInstance=cloud-data-fusion-demos:us-west1:mysql&socketFactory=com.google.cloud.sql.mysql.SocketFactory&useSSL=false
Enter the username and the password you configured for this CloudSQL instance
Click Test Connection to verify that the connection can successfully be established with the database.
Click Add Connection to complete the task.
Once you’ve completed all the steps you will be able to click on the newly defined database connection and see the list of tables for that database.
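The connection string above can also be validated outside Data Fusion. The sketch below assumes the mysql-socket-factory artifact built in step 1 and MySQL Connector/J are on the classpath, and that the environment has Google Cloud credentials with the Cloud SQL Client role; the database name, instance connection name, and credentials are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;

public class CloudSqlProbe {
    public static void main(String[] args) throws Exception {
        // Same URL shape as the Data Fusion connection string above;
        // the database and instance connection name are placeholders.
        String url = "jdbc:mysql://google/mydb"
                + "?cloudSqlInstance=my-project:us-west1:my-instance"
                + "&socketFactory=com.google.cloud.sql.mysql.SocketFactory"
                + "&useSSL=false";
        try (Connection conn = DriverManager.getConnection(url, "myuser", "mypassword")) {
            System.out.println("Connected to: " + conn.getCatalog());
        }
    }
}

If this connects, any remaining failure in Wrangler is in the Data Fusion configuration rather than in the string itself.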
Using Cloud SQL (MySQL) in Pipelines (Public and Private IP)
Perform steps 1-6 in the Wrangler section above
Open the pipeline Studio
From the plugin palette on the left, drop the Cloud SQL source plugin to the canvas, and open it by clicking Properties.
Specify the plugin name as cloudsql-mysql (this presumes that you have performed steps 1-6 in the Wrangler section, so the driver is already uploaded).
Specify the connection string as below:
jdbc:mysql://google/<database>?cloudSqlInstance=<instance-name>&socketFactory=com.google.cloud.sql.mysql.SocketFactory&useSSL=false
where <database> is the database you created in the prerequisites section, and <instance-name> is your instance connection name as displayed in the Overview tab of the instance details page, e.g.:
jdbc:mysql://google/mysql?cloudSqlInstance=cloud-data-fusion-demos:us-west1:mysql&socketFactory=com.google.cloud.sql.mysql.SocketFactory&useSSL=false
Enter the query you would like to use to import data as the Import Query.
Enter the username and password to use for the database. You can also use a secure macro for the password.
Click Get Schema to populate the schema of the plugin.
Configure the rest of the pipeline, and deploy.

Connecting to Database (Postgres AWS RDS)

I am following the tutorial on how to set up and connect to a PostgreSQL server on AWS, found HERE.
When I try to sign in from Workbench, I get this message:
At first I thought it was because my DB instance was not available yet, so I waited until it finished backing up. This did not seem to work, as I am still getting this message. I would appreciate assistance on this.
Have you created a security group and allowed the database connection port?
From the docs:
VPC Security Group(s): Select Create New Security Group. This will create a security group that will allow connection from the IP address of the device you are currently using to the database created.
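
A quick way to tell whether the security group is the blocker, independent of any SQL client, is a plain TCP probe against the RDS endpoint: a timeout usually means the port is filtered, while an immediate error or a successful connect means the network path is open. A sketch with a hypothetical endpoint:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class PortProbe {
    public static void main(String[] args) {
        // Hypothetical RDS endpoint; PostgreSQL's default port is 5432.
        String host = "mydb.abc123xyz.eu-west-1.rds.amazonaws.com";
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, 5432), 5000);
            System.out.println("TCP reachable - the security group allows this port.");
        } catch (SocketTimeoutException e) {
            System.out.println("Timed out - likely blocked by the security group,"
                    + " or the instance is not publicly accessible.");
        } catch (IOException e) {
            System.out.println("Failed: " + e.getMessage());
        }
    }
}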