I have a Google Cloud SQL PostgreSQL database to which I can connect using SSL after entering my IP address in the allowed connection settings. However, I do not want to list all the IP addresses that are going to connect to this database (because I do not know all of them). I have around 15 people whom I want to log in to my database using QGIS, and they should be able to change the data, as this is for research. Security is not a big issue, as this database will be online for a very short period of time. What connection method can you suggest? The users are not very proficient, so I need to set everything up for them.
I hope you're doing fine.
I would suggest setting up the connections with the Cloud SQL Proxy, as it provides the security you need without using SSL or having to authorize any networks. Basically, the setup is to:
Enable the API
Install the proxy client on your local machine
Determine how you will authenticate the proxy
If required by your authentication method, create a service account
You can also find these steps in "Connecting to Cloud SQL from external applications".
I hope this works for you. I have never used it with QGIS, but since the proxy makes the database look like a local server, connecting QGIS through it should be straightforward; a rough sketch of what that could look like is below.
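For illustration only (this is not from the official guide; the download URL is for 64-bit Linux, and the instance connection name and key path are placeholders), starting the v1 cloud_sql_proxy client with a service-account key file could look roughly like this:

# Hypothetical example: download the v1 proxy for 64-bit Linux and make it executable
wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
chmod +x cloud_sql_proxy
# Listen on localhost:5432 and forward traffic to the instance, authenticating
# with a service-account key file (replace the instance name and key path)
./cloud_sql_proxy -instances=MY_PROJECT:MY_REGION:MY_INSTANCE=tcp:5432 \
                  -credential_file=/path/to/service-account-key.json

Each user would then add the connection in QGIS with host 127.0.0.1 and port 5432, plus their database user name and password, exactly as for a local PostgreSQL server.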
I am using a mobile application that connects directly to the database instance (Postgres), so I have to keep the port open to traffic coming from the internet (4G, mobile app).
This mobile app (QField, the mobile version of QGIS) connects directly to the database, which is why the database is reachable from the internet on a public IP, but this is a critical issue for the security of the data and for the requests that can be sent to the database.
I would like to proxy the requests so that the database is only reachable from local machines and not open to direct connections.
The mobile app would send the request to an HTTP URL, which would forward the request to the local IP and port; this way I would avoid having the database exposed on the internet.
Ideally, I would like to go from this app (which uses a Postgres connection string to connect to the server) to an HTTP server that routes the request locally, like so:
APP connects to https://myproxy/postgres
Request is proxied to a local server
Can I do this with Apache2? Any ideas?
At the moment I cannot write middleware that proxies requests from the app to the local Postgres.
If your application is expecting to connect directly to a PostgreSQL database and you don't want to change that, then you need to connect to something that "speaks" PostgreSQL's client protocol.
You can place a proxy such as PgBouncer or Pgpool in front of it, but they aren't a guarantee of greater security just by themselves. This is the same problem as with any proxy: it just forwards requests and responses to your actual server, so any vulnerability is still exposed.
What you can do is:
restrict the number of connections at the proxy point
restrict which users can connect non-locally to your PostgreSQL cluster
restrict where they can connect from to just your proxy
restrict those users' permissions within the database(s)
That last point is particularly important: assume any user account your application uses can be abused maliciously. Restrict the account to prevent mass updating or deleting of data, for example with grants like the sketch below. Also take special care to restrict access to other users' data.
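As a rough sketch of that kind of lockdown (the role, schema, and table names here are purely hypothetical):

-- Hypothetical role and table names; adjust to your own schema
REVOKE ALL ON ALL TABLES IN SCHEMA public FROM app_user;
GRANT SELECT, INSERT ON public.survey_data TO app_user;
-- Allow updates only to specific columns rather than whole rows
GRANT UPDATE (status, notes) ON public.survey_data TO app_user;
-- Cap how many connections this role can hold open
ALTER ROLE app_user CONNECTION LIMIT 5;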
If I was forced to allow access like this, I would want one PostgreSQL user account per actual user at the very least. In practice I wouldn't get to this point with a production application.
I am trying to connect to a Google Cloud SQL database from my Google App Maker app. Unfortunately, my IT staff hasn't set up Google App Maker to use Google Cloud SQL as the default, so I'm trying to connect to the database the same way I have connected to external MySQL databases in the past, but it's not working with the public IP address.
I have created a Google Cloud SQL database and I'm able to connect to it from MySQL Workbench using the public IP address. I had to add the IP address of my home computer in order to connect to the database. I did not need to use SSL.
I created a Google service account for my Google App Maker app. I then included this service account in the Google project that contains the Cloud SQL database and assigned it the Cloud SQL Admin and Cloud SQL Client roles.
I am using this code in App Maker to try to connect. It's the same code I have used with other external MySQL databases. The IP address 34.xx.xx.xx is the public IP address listed on the overview page of the Google Cloud SQL instance.
// App Settings
// Important Note: This is for demo purposes only, storing passwords in scripts
// is not recommended in production applications.
// DB Credentials (you need to provide these)
var address = '34.xx.xx.xx';
var db = 'Kenco_IoT_Template';
var dbUrl = 'jdbc:mysql://' + address + '/' + db;
var user = 'real_username';
var userPwd = 'real_password';
I receive this error message:
"Executing query for datasource ActivityTable: (Error) : Failed to establish a database connection. Check connection string, username and password. Please refer to the ReadMe and edit your database settings!"
My guess is that the issue is with setting up the Cloud SQL database to accept the connection from Google App Maker. The best solution would be to enable Cloud SQL as the default for Google App Maker, but I'm hoping there is some alternative I can use for now.
Any help appreciated.
To make this an answer so others can more easily find it:
The answer is that you needed to whitelist the IP ranges for Apps Script. Public IPs on Cloud SQL instances either require whitelisting of the connecting IP addresses, or connections need to go through the Cloud SQL Proxy.
The OP also mentioned that they had to switch the Jdbc method they used from getConnection() to getCloudSqlConnection(); a hedged sketch of that call is below.
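For reference, a minimal sketch of that call (the instance connection name and credentials are placeholders, not the OP's actual values):

// Hypothetical values; the URL format is project:region:instance followed by the database name
var instanceUrl = 'jdbc:google:mysql://my-project:us-central1:my-instance/Kenco_IoT_Template';
// Cloud SQL-specific connection method, used instead of Jdbc.getConnection()
var conn = Jdbc.getCloudSqlConnection(instanceUrl, 'real_username', 'real_password');
var stmt = conn.createStatement();
var rs = stmt.executeQuery('SELECT 1');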
The option to authorize all apps belonging to the same project is missing in Google Cloud SQL - PostgreSQL. The documentation provides examples of authorization using the network setting 0.0.0.0/0, which simply allows all IPv4 connections.
As we do not know when the App Engine authorization feature will become available for PostgreSQL, what is the next best setting to allow the IP range of App Engine instances? I am lost, as they are dynamically allocated and ephemeral.
Specs
App Engine Flex (1 aspnetcore + 1 custom service on dotnet core)
Cloud SQL - PostgreSQL
Both belong to the same GCP project
The way to go in this case is to follow the documentation steps:
Add 0.0.0.0/0 as the authorized network and configure SSL access from App Engine Flexible to the Cloud SQL PostgreSQL instance. The crucial part here is to adjust the PostgreSQL instance details, namely the SSL connection configuration. You need to allow only SSL connections to reach your instance; this way the GAE Flex instances (and only they, since they hold the SSL certificate) will be able to reach the database instance, even with dynamically allocated IPs.
To allow SSL connections only in your PostgreSQL instance:
Go to Cloud Console, choose the SQL section
Click on your PostgreSQL instance to view its details
Click the Allow only SSL connections button in the SSL tab
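As a hedged way to verify the SSL-only setup from a machine that holds the client certificate (the file names below are the defaults offered when you download the certificates from the instance's SSL tab; the IP is a placeholder), a test connection could look like this:

# Hypothetical check: connect with client certificates and verify the server CA
psql "host=34.xx.xx.xx port=5432 user=postgres dbname=postgres sslmode=verify-ca sslrootcert=server-ca.pem sslcert=client-cert.pem sslkey=client-key.pem"

The App Engine Flex services would pass the equivalent SSL options in their own connection strings.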
I've created an instance on Google Cloud with PostgreSQL and I've connected Data Studio to this database by adding all the addresses in the whitelist specified at the link below:
https://support.google.com/datastudio/answer/7288010?hl=en
With that solution I have to open access to my database to a lot of addresses. This, combined with the fact that SSL is not supported, is a big security gap.
Is there any different way to use Google Data Studio for reports?
Maybe using the Cloud SQL Proxy and treating Google Data Studio as an application external to the GCP environment?
Thanks for your cooperation,
Michele
I am assuming you are concerned about data being exposed due to the lack of support for SSL. Though that is a valid concern in a lot of cases, for your specific use case, it should not matter:
All the IP addresses that you have to whitelist here are Google server/infrastructure addresses.
Data Studio as an application runs on Google's servers, so the communication between Google Cloud SQL and Google Data Studio stays entirely within Google's network. Even if it is not SSL, that traffic should not be exposed to the outside world.
The connection between any client computer (where report is being viewed) and Data Studio will always be HTTPS.
However, if you still want to have an SSL connection, you can create a Community Connector in Apps Script that uses the JDBC service to connect to databases using SSL; a rough sketch of the SSL options is below.
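This is only an illustrative sketch (the URL, credentials, and certificate contents are placeholders, and it assumes your database is reachable through one of the URL types the Apps Script JDBC service supports):

// Hypothetical sketch of an SSL JDBC connection from Apps Script
var serverCaPem   = '...';  // contents of the server CA certificate (placeholder)
var clientCertPem = '...';  // contents of the client certificate (placeholder)
var clientKeyPem  = '...';  // contents of the client key (placeholder)
var conn = Jdbc.getConnection('jdbc:mysql://34.xx.xx.xx:3306/reportdb', {
  user: 'report_user',
  password: 'report_password',
  _serverSslCertificate: serverCaPem,
  _clientSslCertificate: clientCertPem,
  _clientSslKey: clientKeyPem
});
var results = conn.createStatement().executeQuery('SELECT 1');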
Try using client.key in both client fields.
The solution posted in the thread below helped here:
https://support.google.com/datastudio/thread/8739014?hl=en
Should I be able to set up Secure Gateway so that I can connect to my on-prem SQL Server DB using SQL Server Management Studio on my laptop from home (not on-prem)?
You don't "have to" use the secure gateway in order for your application on the cloud to see your local db. You could simply give your application the public ip (and port) of the local machine and they should work fine.
It is however a good practise to use the Secure Gateway service as it can ensure the security of the local-to-cloud communication. Make sure to have a look at the documentation to learn how the service works - https://console.ng.bluemix.net/docs/services/SecureGateway/secure_gateway.html