Accessing Google Cloud SQL from Google Compute Engine using private network - google-cloud-sql

Is it possible to access Google Cloud SQL from Google Compute Engine using the private network?
It appears that Google Cloud SQL sees the public network IP of the Google Compute Engine instance.
Also, the web console doesn't allow entering the instance's private address.

No, it is not possible to access Google Cloud SQL instances via a private IP address.
This page confirms it; it says "Note: You must use the external (public) IP address of the GCE instance ..." when describing how to configure Authorized IP Addresses for your Cloud SQL instance from your GCE instance.

This is now available via private services access and VPC Network Peering.
The announcement:
https://cloud.google.com/blog/products/databases/introducing-private-networking-connection-for-cloud-sql
Details:
https://cloud.google.com/sql/docs/postgres/private-ip
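For reference, a minimal gcloud sketch of what that setup looks like, assuming the default VPC; the reserved range name and instance name are placeholders, and the exact flags are described in the docs above:
# Reserve an address range for Google-managed services in the VPC
gcloud compute addresses create google-managed-services-default \
    --global --purpose=VPC_PEERING --prefix-length=16 --network=default
# Create the private services access connection (VPC Network Peering with Google's service network)
gcloud services vpc-peerings connect \
    --service=servicenetworking.googleapis.com \
    --ranges=google-managed-services-default --network=default
# Attach an existing Cloud SQL instance to the VPC so it gets a private IP
gcloud beta sql instances patch my-instance --network=default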

You can't access Cloud SQL from a private IP address, but you can whitelist the NAT instance's public IP in order to access Cloud SQL from a private server.
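A minimal gcloud sketch of that whitelisting, assuming the NAT egresses from a single static address; the instance name and IP below are placeholders:
# Add the NAT's public IP to the Cloud SQL instance's authorized networks
# (note: this flag replaces the existing authorized-networks list)
gcloud sql instances patch my-instance --authorized-networks=203.0.113.10/32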

Related

Access on-prem SQL Server without VMs through Azure Data Factory

Currently I have a SHIR in a VM, which has a virtual network set up to connect to the on-prem SQL Server for my data factory. I want to get rid of the VM and still access the on-prem SQL Server using the virtual network. Is it possible to use a private link service to pull the data from on-prem using an ADF managed private endpoint? I found one resource where they do this using a VM on top of the private link service. However, that still requires a VM.
Yes, it is possible to use a private link service and a managed private endpoint to access an on-premises SQL Server instance from ADF.
Using the private IP address, you can use the managed private endpoint to connect to the on-prem SQL Server. This eliminates the use of the VM.
For this, you need to create a virtual network and establish a connection to your on-premises network using a VPN or ExpressRoute.
Then you need to create a private endpoint for the SQL Server instance and use its private IP address to connect to SQL. You need to use the SHIR (which is installed on an on-premises machine and configured to connect to your virtual network) to connect to the on-prem SQL Server over the private network using the private IP of the endpoint.

Cloud SQL access from an AI Platform job

Google has nice ways to connect to Cloud SQL from other Google services, but I cannot see how to connect from AI Platform jobs. As part of our training job, we need to update our Cloud SQL db with metrics, but the only way I could get it to work is by whitelisting all IPs (don't want that!) in Cloud SQL and connecting via the public IP. I don't see an option to add the cloud-sql-proxy to the trainer instance. Since the IP of the trainer instance is dynamic, we cannot reliably add a specific IP address to the whitelist. Any other ways to handle this?
It looks like AI Platform supports VPC peering, so you should be able to connect to Cloud SQL using private IP.
Since Cloud SQL also uses VPC peering, you'll likely need to do the following to get the resources to connect:
Create a VPC to share (or use the "default" VPC)
Follow the steps here to set up VPC peering for AI Platform in your VPC.
Follow the steps here to set up a private IP for your Cloud SQL instance in your VPC.
Since the resources are technically in different networks, you may need to export custom routes (Step #2) to allow AI Platform access to your Cloud SQL instance.
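A rough gcloud sketch of that last step, assuming the default VPC; the peering created by private services access is usually named servicenetworking-googleapis-com, but confirm it with the list command first:
# List peerings to find the one created by private services access
gcloud compute networks peerings list --network=default
# Export custom routes over that peering (the "Step #2" mentioned above)
gcloud compute networks peerings update servicenetworking-googleapis-com \
    --network=default --export-custom-routes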
Alternatively to using private IP, you could keep using public IP with an IP allowlist coupled with authorizing with SSL/TLS certificates. This still isn't as secure as using the proxy or private IP (since users are technically able to connect to your instance), but they'll be unable to interact with the database engine without the correct certificates.
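For illustration only, this is roughly what a certificate-authenticated connection looks like with the mysql client, using the server CA and a client certificate downloaded from the Cloud SQL instance's Connections page; the host and file names are placeholders:
# Connect over the public IP, presenting the client certificate issued by Cloud SQL
mysql --host=<instance-public-ip> --user=dbuser --password \
    --ssl-ca=server-ca.pem \
    --ssl-cert=client-cert.pem \
    --ssl-key=client-key.pem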
Can you publish a PubSub message from within your training job and have it trigger a cloud function that connects to the database? AI Platform training seems to have IAM restrictions that I too am curious how to control.

How can I connect to Google Cloud SQL database as an external mysql database from Google App Maker?

I am trying to connect to a Google Cloud SQL database from my Google App Maker app. Unfortunately, my IT staff hasn't set up Google App Maker to use Google Cloud SQL as the default, so I'm trying to connect to the database the same way I have connected to external MySQL databases in the past, but it's not working with the public IP address.
I have created a Google Cloud SQL database and I'm able to connect to it from MySQL Workbench using the public IP address. I had to add the IP address of my home computer in order to connect to the database. I did not need to use SSL.
I created a Google service account for my Google App Maker app. I then included this service account in the Google project that contains the Cloud SQL database. I assigned it permissions for Cloud SQL Admin and Cloud SQL Client.
I am using this code in App Maker to try to connect. It's the same code I have used with other external MySQL databases. The IP address 34.xx.xx.xx is the public IP address listed on the overview page of the Google Cloud SQL instance.
// App Settings
// Important Note: This is for demo purposes only, storing passwords in scripts
// is not recommended in production applications.
// DB Credentials (you need to provide these)
var address = '34.xx.xx.xx';
var db = 'Kenco_IoT_Template';
var dbUrl = 'jdbc:mysql://' + address + '/' + db;
var user = 'real_username';
var userPwd = 'real_password';
I receive this error message:
"Executing query for datasource ActivityTable: (Error) : Failed to establish a database connection. Check connection string, username and password. Please refer to the ReadMe and edit your database settings!"
My guess is the issue is with setting up the Cloud SQL database to accept the connection from Google App Maker. The best solution would be to enable Cloud SQL as the default for Google App Maker, but I'm hoping there is some alternative I can use for now.
Any help appreciated.
To make this an answer so others can more easily find it:
The answer is that you need to whitelist the IP ranges used by Apps Script. Public IPs on Cloud SQL instances either require whitelisting of the client IP addresses for access, or the client needs to use the Cloud SQL Proxy.
The OP also mentioned that they had to switch the Jdbc method they used from getConnection() (which takes the jdbc:mysql:// URL above) to getCloudSqlConnection(), which takes a jdbc:google:mysql://instance-connection-name/database URL instead.

GKE private cluster and Cloud SQL proxy connection

I have 2 GKE clusters, one private and one public, and I'm using the Cloud SQL proxy as a sidecar container for the GKE app to access the Cloud SQL instance.
public cluster setup for development/testing
Cloud SQL is enabled with both private and public IP.
The GKE app is using the Cloud SQL proxy with the default IP types option (public,private), as below.
Cloud SQL doesn't have any authorized network.
In this case, my app is able to connect to Cloud SQL and works smoothly. As far as I understand, the connection to Cloud SQL here should be happening over private IP because there is no authorized network configured.
private cluster setup for production
Cloud SQL is enabled with both private and public IP.
The GKE app is using the Cloud SQL proxy with the default IP types option (public,private).
cloudsql-proxy setting in deployment file
- name: cloudsql-proxy
  image: gcr.io/cloudsql-docker/gce-proxy:1.11
  command: ["/cloud_sql_proxy"]
  args: ["-instances=$(REAL_DB_HOST)=tcp:$(REAL_DB_PORT)", "-credential_file=/secrets/cloudsql/credentials.json"]
case 1
Cloud SQL doesn't have any authorized network.
Result: Application is not able to connect with Cloud SQL
case 2
Cloud SQL have private GKE NAT gateway as authorized network
Result: Application is not able to connect with Cloud SQL
Maybe removing the Cloud SQL proxy from the application will work (I am yet to test this), but that discourages using the proxy in the dev environment, as the deployment file would then need changes for the production deployment.
I am not able to understand what is causing the connection failure with the Cloud SQL proxy in the private GKE cluster. Should we not use the proxy in a private cluster?
Update
The reason the Cloud SQL proxy was not able to connect to Cloud SQL was that the Cloud SQL Admin API was disabled. I have updated my answer in the answer section.
It looks like the question here is "Should we use the Cloud SQL proxy in a private cluster?" and that answer is "it depends". It's not required to connect, but it allows for more security because you can restrict unnecessary access to your Cloud SQL server.
The Cloud SQL proxy doesn't provide connectivity for your application - it only provides authentication. It has to be able to connect via an existing network path, but then uses the service account's IAM roles to authenticate the connection. This also means the connection doesn't have to come from a whitelisted network, because it has been authenticated by different means.
If you want to use the proxy to connect via private IP (instead of defaulting to public), use the -ip_address_types=PRIVATE flag - this tells the proxy to connect to the instance's private IP instead. (Please note that if the proxy lacks a network path to that IP, e.g. it isn't on the VPC, it will still be unable to connect.)
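Applied to a sidecar like the one above, a minimal sketch of the invocation; the project, region and instance name are placeholders:
# Dial the instance's private IP; the pod still needs a network path to that IP
/cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:3306 \
    -ip_address_types=PRIVATE \
    -credential_file=/secrets/cloudsql/credentials.json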
@kurtisvg has provided an informative answer to this.
However, the real issue was the Cloud SQL Admin API, and enabling it fixed the problem. After looking into the logs I found the entry below.
Error 403: Access Not Configured. Cloud SQL Admin API has not been used in project XXXXXX before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/sqladmin.googleapis.com/overview?
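Enabling the API can be done from the console link in the error message, or with gcloud; the project ID below is the placeholder from the log:
# Enable the Cloud SQL Admin API, which the proxy calls to authenticate connections
gcloud services enable sqladmin.googleapis.com --project=XXXXXX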
The issue for me was having Private cluster enabled on the GKE cluster :(
Because the GKE cluster was private, it didn't have access to external IP addresses, and the fix was to create a NAT gateway with a Cloud Router as per https://cloud.google.com/nat/docs/gke-example (a rough sketch follows below).
A hint that this is your issue: you won't be able to ping google.com etc. from the container after logging into it.
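A minimal sketch of that NAT setup, assuming the default network and the us-central1 region; the router and NAT names are placeholders, and the linked guide has the full example:
# Cloud Router in the cluster's region
gcloud compute routers create nat-router --network=default --region=us-central1
# Cloud NAT so the private nodes can reach external IPs (including Cloud SQL's public endpoint)
gcloud compute routers nats create nat-config --router=nat-router --region=us-central1 \
    --auto-allocate-nat-external-ips --nat-all-subnet-ip-ranges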

fortrabbit and Google Cloud SQL

I'd like to use fortrabbit with Google Cloud SQL. Google Cloud SQL requires you to whitelist any IPs that want to access the db, and it seems that fortrabbit doesn't guarantee the outbound IP. How can I access my Cloud SQL data from fortrabbit?
[edit] Cloud SQL does not support whitelisting all IPs (such as 0.0.0.0/0). Having said that, it's enough if you can provide a subnet that covers all the IPs from which your connections can possibly originate. If you provide a broad IP range as the authorized network, please make sure your database is protected with strong usernames and passwords to protect against unauthorized access.