Connection failure to PostgreSQL on AWS RDS instance in a private network from Power BI Desktop & Service

I have an AWS RDS (PostgreSQL) instance inside a private network, accessible only via a VPN and a bastion host.
I am able to establish a connection from Power BI Desktop to the PostgreSQL RDS instance by creating an SSH tunnel from my laptop (localhost) to the bastion host and connecting through the ODBC driver. With this approach, all the data is imported into Power BI Desktop (Import mode).
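For reference, the tunnel is the equivalent of this rough sketch (shown here with Python's sshtunnel package; all host names, paths, and credentials are placeholders):

```python
# Rough equivalent of the SSH tunnel created before connecting from Power BI.
# Requires: pip install sshtunnel
from sshtunnel import SSHTunnelForwarder

tunnel = SSHTunnelForwarder(
    ("bastion.example.com", 22),             # bastion host (placeholder)
    ssh_username="ec2-user",                 # placeholder user
    ssh_pkey="~/.ssh/id_rsa",                # placeholder key path
    remote_bind_address=("mydb.xxxx.us-east-1.rds.amazonaws.com", 5432),
    local_bind_address=("127.0.0.1", 5432),  # Power BI then connects to localhost:5432
)
tunnel.start()
```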
But our requirement is to connect through DirectQuery so that the data refreshes in real time and reports are generated dynamically, which I am not able to do.
I entered the database credentials into Power BI Desktop, but the connection does not work; I get a timeout error.
I must use DirectQuery; I can't use Import mode.
Any help is appreciated.

The exact error you are getting would help in finding the root cause of the issue. However, a few basic troubleshooting steps that I'd suggest are:
Ensure that you have a compatible version of the Npgsql provider installed on your machine, such as Npgsql 4.0.9. At times the latest version causes issues.
Ensure that you remove the semicolon at the end of the query.
Once you get the query running successfully in the desktop version and publish it to the web version, the visuals will not be able to connect to the database unless an on-premises data gateway is set up. More details on setting up a data gateway to automatically refresh the dataset for the Power BI web version are here:
Refresh AWS RDS database from Power BI Web (once you are successfully able to query directly)

Related

Power BI connection problem with Postgresql

I'm using Power BI version 2.84 to connect to a PostgreSQL server. In Power BI Desktop everything works fine: I can connect to the server, and import and refresh data smoothly.
However, when I publish it to the Power BI server, I can't refresh it anymore due to an 'encrypted connection' error. I have checked all of my connection settings and made sure they are not encrypted at all, but the problem is still there.
Please let me know if you have any solution for this.
Cheers
I assume you are using DirectQuery?
If you want to use DirectQuery, you will need to set up an on-premises data gateway:
on premise gateways
Then you should add the gateway cluster in the Power BI web version:
Data gateway
I think everything is quite straightforward here.
But do you need DirectQuery? If you are OK with refreshing your data a few times a day, you could set up an ODBC connection (when importing data, choose the ODBC option, not PostgreSQL).
You would need to set up an ODBC data source (Control Panel -> Administrative Tools -> Data Sources) and create a new one (you should download the PostgreSQL ODBC driver if you have none); a quick connection test is sketched below.
Then you also need to create an on-premises data gateway and set up refresh intervals.
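As a sanity check that the ODBC data source works before pointing Power BI at it, here is a minimal sketch with pyodbc (the DSN name and credentials are placeholders for whatever you created above):

```python
# Verify the ODBC DSN created in Control Panel before using it from Power BI.
# Requires: pip install pyodbc
import pyodbc

conn = pyodbc.connect("DSN=MyPostgresDSN;UID=myuser;PWD=mypassword")
cursor = conn.cursor()
cursor.execute("SELECT version();")  # simple sanity query
print(cursor.fetchone()[0])
conn.close()
```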

Transfer MongoDB dump on external hard drive to google cloud platform

As part of my thesis project, I have been given a MongoDB dump of size 240GB, which is on my external hard drive. I'll have to use this data to run my Python scripts for a short duration. However, since my dataset is huge and I cannot mongoimport it into my local MongoDB server (I don't have enough internal storage), my professor gave me a $100 Google Cloud Platform coupon so I can use Google's cloud computing resources.
So far I have researched that I can do it this way:
Create a Compute Engine instance in GCP and install MongoDB on it. Transfer the MongoDB dump to the remote instance and run the scripts there to get the output.
This method works well, but I'm looking for a way to create a remote database server in GCP so that I can run my scripts locally, something like one of the following:
Creating a remote MongoDB server on GCP so that I can establish a remote Mongo connection and run my scripts locally.
Transferring the MongoDB dump to Google's Datastore so that I can use the Datastore API to connect remotely and run my scripts locally.
I have thought about using MongoDB Atlas, but because of the size of the data I would be billed heavily, and I cannot use my GCP coupon there.
Any help or suggestions on how either of the two methods can be implemented is appreciated.
There are two parts to your question.
First, you can create a Compute Engine VM with MongoDB installed and load your backup onto it. Then open the right firewall rules to allow the connection from your local environment to the Compute Engine VM. The connection will be performed with a simple login/password.
You can use a static IP on your VM; that way, in case of a reboot, the VM will keep the same IP (and it will be easier for your local connection).
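Once the firewall rule is in place, connecting from your local scripts is straightforward; a minimal sketch with pymongo (the IP, credentials, and database name are placeholders):

```python
# Connect from the local machine to MongoDB running on the Compute Engine VM.
# Requires: pip install pymongo
from pymongo import MongoClient

client = MongoClient(
    host="203.0.113.10",   # the VM's static external IP (placeholder)
    port=27017,
    username="myuser",     # the login/password configured on the VM
    password="mypassword",
    authSource="admin",
)
db = client["thesis_db"]   # placeholder database name
print(db.list_collection_names())
```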
Second, BE CAREFUL with Datastore. It's a good product, a serverless, document-oriented NoSQL database, but it's absolutely not a MongoDB equivalent. You can't perform aggregations, you are limited in search capabilities, etc. It's designed for specific use cases (I don't know yours, but don't assume it is a MongoDB equivalent!).
Anyway, if you use Datastore, you will have to use a service account or install the Google Cloud SDK in your local environment to be authenticated and able to call the Datastore API. There is no login/password in this case.
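For completeness, a minimal sketch of calling the Datastore API from a local script authenticated with a service account key (the key file name and the kind are placeholders):

```python
# Query Datastore from the local environment using a service account key
# instead of a login/password. Requires: pip install google-cloud-datastore
from google.cloud import datastore

client = datastore.Client.from_service_account_json("service-account-key.json")
query = client.query(kind="Document")  # placeholder kind
for entity in query.fetch(limit=10):
    print(entity)
```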

Enable local development access to PostgreSQL DB on Amazon RDS

I'm in the early stages of a web project which requires a database. Until now, I've managed to get away with using an SQLite database locally for development and a PostgreSQL database running on AWS RDS in "production" (mainly just for alpha testers). I haven't really had any state in the database that I couldn't just blow away and re-seed whenever necessary.
However, I'm now at the point in my project where I'm going to have state in the production database that I can't easily reproduce via seeding in my local SQLite database. So I've decided to create another development database via a script which just takes the latest snapshot of my production database and restores it as a development database. I've managed to get this script running with some degree of success...
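Roughly, the restore step of that script looks like this (the instance identifiers are placeholders):

```python
# Restore the latest snapshot of the production instance as a new dev instance.
# Requires: pip install boto3
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Find the most recent snapshot of the production instance (placeholder name).
snapshots = rds.describe_db_snapshots(DBInstanceIdentifier="prod-db")["DBSnapshots"]
latest = max(snapshots, key=lambda s: s["SnapshotCreateTime"])

rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="dev-db",
    DBSnapshotIdentifier=latest["DBSnapshotIdentifier"],
)
```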
But I'm having difficulty connecting to this development database in my local development environment. Each time I try to connect, I get a timeout. Most of the resources on Amazon seem to indicate that this is likely a security group issue. The security group corresponding to my database currently has these inbound settings (security group ID redacted, but it is the group listed as my RDS security group):
Is there something obviously wrong here? How do I set up my security groups such that I can connect to this development database on my local machine?
The source shouldn't be set to the same security group, but rather whatever source you'll be connecting from. You can use 0.0.0.0/0 to enable traffic from any source.
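If you prefer to script it, a rough sketch with boto3 (the security group ID is a placeholder; consider restricting CidrIp to your own IP rather than 0.0.0.0/0):

```python
# Add an inbound rule allowing PostgreSQL (port 5432) traffic to the RDS
# security group. Requires: pip install boto3
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # placeholder security group ID
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 5432,
        "ToPort": 5432,
        "IpRanges": [{"CidrIp": "0.0.0.0/0", "Description": "dev access"}],
    }],
)
```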

How can we access Bluemix hosted "Compose for MongoDB" service from "outside"?

Situation:
I created a new Compose for MongoDB service instance in Bluemix today.
Need:
I have to access this MongoDB DIRECTLY with tools (e.g. Mongo Management Studio Pro, mongo.exe, etc.) for bulk loading, testing, ad-hoc data fixes, etc.
Problem:
I have not found any docs, samples, nor a CLEAR statement that
a) gives me some confirmation that THIS is possible, and
b) gives me COMPLETE information (not just some technical fragments that might have worked a year ago) on how to do it.
Maybe I am looking in the wrong places or do not know the right people. However, I am stuck on this, and before quitting Bluemix MongoDB, maybe somebody has a copy/paste solution or a hands-on step-by-step manual.
Any help welcome. Thanks!
Connecting to the MongoDB service in Bluemix from an application is possible. For this answer I have used the application Robo3T; here are the steps:
Access your MongoDB service in your Bluemix account, usually under "Cloud Foundry Services".
Open the "Manage" section; from "Connection Settings", copy the "HTTPS" connection address and port. In this example: "sl-eu-lon-2-portal.5.dblayer.com" and "20651".
In Robo3T, create a new connection with the connection address from the previous step.
In the Authentication tab, configure the database name, username, and password. The credentials are found as in step 1.
From "Connection Settings", copy the SSL certificate into a text file and save it locally.
In Robo3T, add the certificate to the connection in the "SSL" tab.
Test the connection and save the settings.
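If you want to do the same from a script rather than a GUI, here is a rough pymongo equivalent of the steps above (the address, port, and certificate come from "Connection Settings"; the credentials and database name are placeholders):

```python
# Connect to the Compose for MongoDB instance over TLS, using the SSL
# certificate saved from "Connection Settings". Requires: pip install pymongo
from pymongo import MongoClient

client = MongoClient(
    host="sl-eu-lon-2-portal.5.dblayer.com",
    port=20651,
    username="admin",              # placeholder credentials
    password="mypassword",
    authSource="admin",
    tls=True,
    tlsCAFile="compose-cert.pem",  # the certificate saved locally
)
print(client.server_info()["version"])
```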
Answer
YES, Bluemix-hosted Compose for MongoDB instances can be connected to from the mongo shell and some updated DB management tools.
However, if you are running the newest DB versions, you have to make sure that your tools (shell and DB management GUIs) support the newest DB features, such as encryption.
Origin of the Problem
My problem was due to older, and therefore incompatible, versions of the mongo shell and DB management tools running against the newest MongoDB versions, with their particularities around encryption and multiple servers in the URI.
At least two DB management tools are not compatible with the newest DB version and will take their time to get fixed. The problem is that neither will tell you about this. They just do not connect. No logs on either side. Period.
So my advice here: look for tool providers that state explicit compatibility with the specific version of your DB.
Advice to the Bluemix Team
It might not take much time to provide sample connection strings for the most common tools (the mongo shell, MongoBooster, etc.) to take the hassle and guesswork out of interpreting the environment variables and figuring out what is and is not needed for specific connection strings.
For instance, MongoDB Atlas provides ready-made connection strings for every cluster for many tools; you can just copy/paste them and be done!
Connecting to Atlas took me 5 minutes. For Bluemix I lost hours! Not because it is complex, but because the documentation and the generated info are somewhat incomplete and messy - at least for those who do not work with connection strings for a living!

Error code 40 when running SSRS reports from Internet Explorer (run as administrator)

We deployed a VB.Net application on a customer's computer that contains SSRS reports.
The application connects to the SQL Server database without any problems. We installed SQL Server Data Tools so we could deploy the report (.rdl) and data source (.rds) files up to the report server. These deploy without any problems.
In SQL Server Data Tools we can "Preview" the reports without any problems as well.
We do run into a problem when attempting to view the report from Internet Explorer (run as an administrator).
We get the following error:
Cannot create a connection to data source 'DataSourceReports'
(this is the name we used for the TargetDataSourceFolder)
error:40 - Could not open a connection to SQL Server
We also get the same error when the app we deployed runs the reports.
Please let us know what is not set up correctly on the SQL Server side.
A likely possibility is that you are experiencing a double hop authentication problem. It's not clear from your explanation, but is the SQL Server database on a separate server from the report server? If so, then your credentials allow you to connect to the report server but Windows integrated security does not pass those credentials on to the SQL Server database if you are using NTLM on the report server. The report server tries to use Kerberos on your network to authenticate by way of ticketing to the SQL Server database, but you must have this configured correctly on your network. See this article if you want to use Kerberos: http://technet.microsoft.com/en-us/library/ff679930(v=sql.100).aspx.
Another (easier) solution is to open the data source on the report server and change the authentication to use stored credentials. Make sure the credentials you use have read permission on the SQL Server database. The downside of this approach is that you cannot use row-level security in your report by user unless you design your report to capture user information and set up the query or a filter on the dataset to restrict data by user. If that's not a concern, the stored credentials are easy to set up and maintain - and you're going to have to do this anyway if you want to use caching, snapshots, or subscriptions. For more information on stored credentials, see http://msdn.microsoft.com/en-us/library/ms159736.aspx.