Connect Cloud Run to Cloud SQL Server Instance in C# - google-cloud-sql

If I understand correctly, the "Cloud SQL Connections" tab in Cloud Run should instantiate the Cloud SQL Proxy.
What is the SQL Server connection string that I should use to make this work?
Setup (all in the same GCP project):
1. Create a Cloud SQL instance of SQL Server.
2. Upload your Docker image to Google Container Registry. It is written using .NET Core, with code to connect to the SQL Server instance created in step 1.
3. Create a service instance in Google Cloud Run.
4. Specify Cloud SQL Connections, select your SQL Server instance from the list, and deploy.

I've not tried this using Cloud Run and SQL Server but ...
The proxy should make a connection available to your .NET client on 127.0.0.1:1433 (link).
Assuming you're using a database client similar to the Google example, your connection string will be:
"ConnectionString": "User Id=[[USER]];Password=[[PASS]];Server=127.0.0.1;Database=[[DB]];"
If I understand correctly, the default SQL Server port is 1433, so it need not appear in the connection string.
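As a concrete illustration, here is a minimal sketch of opening that connection from .NET (not from the original post; it assumes the Microsoft.Data.SqlClient package, and System.Data.SqlClient behaves the same on older stacks):
using System;
using Microsoft.Data.SqlClient;

class ProxySmokeTest
{
    static void Main()
    {
        // The proxy listens locally, so the server is 127.0.0.1 on the standard SQL Server port.
        // Newer SqlClient releases default to Encrypt=true; Encrypt=False or
        // TrustServerCertificate=True may be needed for a local proxy connection.
        var connectionString = "User Id=[[USER]];Password=[[PASS]];Server=127.0.0.1,1433;Database=[[DB]];";
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var command = new SqlCommand("SELECT 1", connection))
            {
                Console.WriteLine(command.ExecuteScalar()); // prints 1 when the proxy path works
            }
        }
    }
}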
NB Per other commenters, your question would be improved with more details. When you write that you're completing steps, please include the links. When you reference your configuration, please include snippets. Folks answering your question benefit from having to assume as little information as possible.

I do not think it is supported yet. There is no documentation for Cloud SQL for SQL Server.
According to the official documentation:
Once correctly configured, you can connect your service to your Cloud SQL instance's unix domain socket using the format: /cloudsql/INSTANCE_CONNECTION_NAME.
Note: Cloud Run (fully managed) does not support connecting to the Cloud SQL instance using TCP. Your code should not try to access the instance using an IP address such as 127.0.0.1 or 172.17.0.1.
Also:
Note: The Cloud SQL Proxy does not support Unix sockets on Windows.
I tried to do it using the Cloud SQL Proxy over TCP and got:
System.Net.Internals.SocketExceptionFactory+ExtendedSocketException (111): Connection refused 127.0.0.1:1433
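Not a fix for SQL Server, but for contrast: the Unix-socket pattern in the quoted documentation does work from C# for Cloud SQL for PostgreSQL, because Npgsql accepts a socket directory as its Host; SqlClient has no Unix-socket equivalent, which is why the TCP attempt above is refused. A minimal sketch, assuming the Npgsql package:
using Npgsql;

class UnixSocketExample
{
    static void Main()
    {
        // Npgsql treats an absolute path in Host as a Unix-domain-socket directory,
        // which matches the /cloudsql/INSTANCE_CONNECTION_NAME mount in Cloud Run.
        var connectionString =
            "Host=/cloudsql/INSTANCE_CONNECTION_NAME;Username=[[USER]];Password=[[PASS]];Database=[[DB]]";
        using (var connection = new NpgsqlConnection(connectionString))
        {
            connection.Open();
        }
    }
}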

Related

How to read an Oracle DB over VPN from Azure Databricks?

When I try to read an Oracle table via Azure Databricks (I connect to a VPN to access this DB), it shows the error below:
java.sql.SQLRecoverableException: IO Error: The network adapter could not establish the connection
Do I need to specify the VPN details in Databricks?
Even if you are connected to the VPN, the Databricks cluster running in the cloud can't reach your on-premises Oracle installation. To achieve that, you need to work with your networking team on setting up a VPN connection between on-premises and cloud. It's described in the documentation in great detail, so it doesn't make sense to repeat it here.
Hostnames used by the Databricks clusters should be enabled to access on-prem resources through Endpoint Services. You can ref: link

Connectivity between Cloud Run and Cloud SQL (Internal IP)

I have created my organisation infrastructure in GCP following the Cloud Foundation Toolkit using the Terraform modules provided by Google.
The following table lists the IP ranges for all environments:
Now I am in the process of deploying my application that consists of basically Cloud Run services and a Cloud SQL (Postgres) instance.
The Cloud SQL instance was created with a private IP from the "unallocated" IP range that is reserved for peered services (such as Cloud SQL).
In order to establish connectivity between Cloud Run and Cloud SQL, I have also created the Serverless VPC Connector (ip range 10.1.0.16/28) and configured the Cloud SQL proxy.
When I try to connect to the database from the Cloud Run service I get this error after ~10s:
CloudSQL connection failed. Please see https://cloud.google.com/sql/docs/mysql/connect-run for additional details: Post "https://www.googleapis.com/sql/v1beta4/projects/[my-project]/instances/platform-db/createEphemeral?alt=json&prettyPrint=false": context deadline exceeded
I have granted roles/vpcaccess.user for both the default Cloud Run SA and the one used by the application in the host project.
I have granted roles/compute.networkUser for both SAs in the service project. I also granted roles/cloudsql.client for both SAs.
I have enabled servicenetworking.googleapis.com and vpcaccess.googleapis.com in the service project.
I have run out of ideas and I can't figure out what the issue is.
It seems like a timeout error when Cloud Run tries to make a POST request to the Cloud SQL API. So it seems like the VPC connector (10.1.0.16/28) cannot reach the Cloud SQL instance (10.0.80.0/20).
Has anyone experienced this issue before?
When you use the built-in Cloud SQL connection in Cloud Run (but also App Engine and Cloud Functions), a connection similar to the Cloud SQL Proxy is created. This connection can be established only over a Cloud SQL public IP, even if you have a serverless VPC connector and your database is reachable through the VPC.
If you have only a private IP on Cloud SQL, you need to use the private IP to reach the database, not the built-in Cloud SQL connector. More detail in the documentation.
I also wrote an article on this.
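To make the private-IP route concrete: with the serverless VPC connector in place, the service opens an ordinary TCP connection to the instance's private address. A minimal sketch assuming the question's Postgres instance and the Npgsql package, with [[PRIVATE_IP]] standing in for the address allocated from the 10.0.80.0/20 range:
using Npgsql;

class PrivateIpExample
{
    static void Main()
    {
        // No built-in Cloud SQL connector involved: traffic goes through the
        // serverless VPC connector straight to the instance's private IP.
        var connectionString =
            "Host=[[PRIVATE_IP]];Port=5432;Username=[[USER]];Password=[[PASS]];Database=[[DB]]";
        using (var connection = new NpgsqlConnection(connectionString))
        {
            connection.Open();
        }
    }
}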
If you are using a private IP, you need to check the docker bridge network's IP range. Here is what the documentation says:
If a client cannot connect to the Cloud SQL instance using private IP, check to see if the client is using any IP in the range 172.17.0.0/16. Connections fail from any IP within the 172.17.0.0/16 range to Cloud SQL instances using private IP. Similarly, Cloud SQL instances created with an IP in that range are unreachable. This range is reserved for the docker bridge network.
To resolve some of the issues you are experiencing, follow the documentation here and post any error messages you receive. For example, you could try:
Try the gcloud sql connect command to connect to your instance. This command authorizes your IP address for a short time. You can run this command in an environment with Cloud SDK and mysql client installed. You can also run this command in Cloud Shell, which is available in the Google Cloud Console and has Cloud SDK and the mysql client pre-installed.
Temporarily allow all IP addresses to connect to an instance: for IPv4, authorize 0.0.0.0/0 (for IPv6, authorize ::/0). After you have tested this, please make sure you remove it again, as it opens your instance up to the world!
Are you using connection pools?
If not, I would create a cache of connections so that when your application needs to talk to the database, it can take a temporary connection from the pool. Once the application has finished its operation, the connection returns to the pool for later use. For this to work correctly, connections need to be opened and closed efficiently so that no resources are wasted. The sketch below shows what this looks like.
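A minimal sketch of that in C#, assuming the Npgsql package (which pools by default); the pool bounds here are illustrative, not recommendations:
using Npgsql;

class PoolingExample
{
    static void Main()
    {
        // Pool settings ride along in the connection string; Npgsql manages the cache itself.
        var connectionString =
            "Host=[[PRIVATE_IP]];Username=[[USER]];Password=[[PASS]];Database=[[DB]];" +
            "Pooling=true;Minimum Pool Size=0;Maximum Pool Size=20;Connection Idle Lifetime=300";
        using (var connection = new NpgsqlConnection(connectionString)) // borrows from the pool
        {
            connection.Open();
        } // Dispose() returns the connection to the pool instead of closing it
    }
}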

Connecting to Google Cloud SQL from my machine

I'm trying to connect to Google Cloud SQL from my machine (Ubuntu) using this command:
mysql --host='Public IP' --user='' --password
However, I'm getting this error:
ERROR 2003 (HY000): Can't connect to MySQL server on 'Public IP' (110)
I'd appreciate any help resolving this issue.
First you need to let the Cloud SQL instance know which IP addresses it can accept. You can do that without SSL by following the instructions here. However, to be more secure, I would recommend using SSL. More info on that here.
Probably the easiest way to securely connect from your local machine to the public IP of a Cloud SQL instance is to download and use the proxy, following the instructions here:
https://cloud.google.com/sql/docs/mysql/connect-admin-proxy
What you have to do is add a network to the Public IP section, under the Connections tab, after selecting your Cloud SQL instance.
See the Cloud SQL Connections tab here.
For the Name input, put something like firstname-lastname to denote whose IP it is. Then enter your IP address as 1.2.3.4/32 in the Network input.
After saving, you will be able to connect.
Yes, you can add SSL and use certificates. That is all best practice and what should be done for a production stack. But if this is just getting off the ground and in rapid development, that's all you need to do in the beginning.
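If you later connect from code instead of the mysql client, the same rule applies: once your address is authorized, it is a plain TCP connection to the public IP. A hypothetical sketch assuming the MySqlConnector package:
using MySqlConnector;

class PublicIpExample
{
    static void Main()
    {
        // Succeeds only after your client IP (e.g. 1.2.3.4/32) is listed in
        // the instance's authorized networks.
        var connectionString = "Server=[[PUBLIC_IP]];User ID=[[USER]];Password=[[PASS]];Database=[[DB]]";
        using (var connection = new MySqlConnection(connectionString))
        {
            connection.Open();
        }
    }
}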

Cannot connect to on-prem SQL Server with Google Cloud Data Fusion

I am trying to test a connection using Cloud Data Fusion to connect to an on-prem SQL Server. Our GCP Project does not use the default network but rather a custom VPC.
Note that security is critical, as this database contains healthcare data.
We currently have App Engine Flex code that uses pymssql to query the database on this SQL Server through the VPC, and want to test using Data Fusion.
I have copied and added the generated data fusion service account to IAM with role Cloud Data Fusion API Service Agent.
I have configured in Data Fusion
system.profile.properties.network = <VPC name>
I have verified that the username and password for authentication to SQL Server are valid for the database.
The VPC network allows ports 22 and 1433.
At this point I am just trying to get a successful connection to query a table in the database.
Here is the error message I get:
Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.
Thoughts?
It looks like you can access SQL Server from GCP (via the App Engine Flex Python code) but are having trouble connecting through Data Fusion. It also looks like authentication isn't the problem (as the error message is about the connection).
One area I can think of is to try Cloud Data Fusion private IP, which allows you to securely connect to your VPC from Cloud Data Fusion. This is a new feature, and you will need Data Fusion product team support to leverage it.

Google Cloud Data Fusion: 1. Does not connect to Oracle 2. 'default' network port error when the pipeline is running

I installed the Oracle JDBC thin driver to connect to an on-prem Oracle DB, but when I test the connection I get a network adapter error.
I tried changing the host, but got the same result.
When running the GCS-to-BigQuery pipeline I get a network port error. Can we change the VPC the pipeline runs on?
Regarding the Oracle DB connection error: is the DB available on the public network for connection? Currently the wrangler service in Cloud Data Fusion cannot talk to an on-prem DB over a private connection, and we are actively working towards it.
However, if the DB is available on the public network, then it seems like an issue with the Oracle DB configuration. Can you please take a look at this answer and see if it helps - Oracle SQL Developer: Failure - Test failed: The Network Adapter could not establish the connection?
Also, are you able to connect to the Oracle DB through some other query tool, such as SQL Workbench?
Breaking down your question:
1. Connecting to on-prem databases
It is possible nowadays to connect to on-premises databases. Make sure you created an interconnect between the on-prem network and the network used by the Data Fusion instance, and make sure you applied the right firewall rules (it seems you are hitting firewall issues, judging by the logs). I suggest connecting directly to the database first to confirm that the network setup works.
2. Change network configurations on the Data Fusion job.
You can specify parameters for your job. There are options to change the network and subnetwork the job will be executed under, via Configure > Compute config > Customize. If you use a shared VPC, you can also specify the host project.