We are using a Google Cloud SQL instance in our project, with SQuirreL and Toad as admin tools for the instance.
In recent days the instance's performance has been very slow. Even connecting to the Cloud SQL instance from the admin tool is very slow.
Sometimes it does not establish a connection with the admin tool for around 2 minutes; only after that does the connection get established.
We also tried a different internet connection, and the performance was just as poor.
Can you please check the performance of Google Cloud SQL?
I'm using Postgres provisioned by Google Cloud SQL.
Recently the number of connections has increased significantly.
We had to raise the limit from 200 to 500, and then to 1000. The Google Cloud console now reports around 800 current connections for Postgres.
However, I have no idea where these connections come from. We have one App Engine service with little traffic at the moment, another application hosted on Kubernetes, and a dozen or so batch jobs that connect to it. Clearly there must be a connection leak somewhere.
Is there any way I can see where these connections originate?
All applications connecting to it are Java-based at the moment.
They use the HikariCP connection pool. I'm considering changing the "test query" run on connection to insert a record into a log table, so I could perhaps find out where the connections originate.
But are there better ways available?
Thanks,
Consider monitoring connection activity with pg_stat_activity, e.g. SELECT * FROM pg_stat_activity;
As per the documentation:
Connections that show an IP address, such as 1.2.3.4, are connecting using IP. Connections with cloudsqlproxy~1.2.3.4 are using the Cloud SQL Proxy, or else they originated from App Engine. Connections from localhost are usually to a First Generation instance from App Engine, although that path is also used by some internal Cloud SQL processes.
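If it helps, here is a rough sketch of that kind of query run over plain JDBC; the connection URL, user, and password are placeholders, and the grouping columns are just a starting point:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ConnectionOrigins {
    public static void main(String[] args) throws Exception {
        // Placeholders -- point this at your Cloud SQL Postgres instance.
        // Requires the PostgreSQL JDBC driver (org.postgresql:postgresql) on the classpath.
        String url = "jdbc:postgresql://<INSTANCE_IP>:5432/postgres";

        try (Connection conn = DriverManager.getConnection(url, "<USER>", "<PASSWORD>");
             Statement st = conn.createStatement();
             // Group current backends by client address and application name,
             // so you can see which host/app is holding how many connections.
             ResultSet rs = st.executeQuery(
                     "SELECT client_addr, application_name, state, count(*) AS connections "
                     + "FROM pg_stat_activity "
                     + "GROUP BY client_addr, application_name, state "
                     + "ORDER BY connections DESC")) {
            while (rs.next()) {
                System.out.printf("%s | %s | %s | %d%n",
                        rs.getString("client_addr"),
                        rs.getString("application_name"),
                        rs.getString("state"),
                        rs.getLong("connections"));
            }
        }
    }
}
```

If each of your services sets a distinct application_name (the PostgreSQL JDBC driver accepts an ApplicationName connection property), the grouping above becomes much more informative than IP addresses alone.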
Also, take a look at the best practices for managing database connections, which contain information on opening and closing connections, connection count, and how to set a connection duration in the Java programming language.
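On the application side, a minimal HikariCP sketch along those lines, with leak detection turned on (the pool size and thresholds are illustrative values, not recommendations; the JDBC URL and credentials are placeholders):

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public class PoolSetup {
    static HikariDataSource buildPool() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://<INSTANCE_IP>:5432/<DB>"); // placeholder
        config.setUsername("<USER>");                                   // placeholder
        config.setPassword("<PASSWORD>");                               // placeholder

        // Cap the pool so a single service cannot hold hundreds of server connections.
        config.setMaximumPoolSize(10);

        // Log a stack trace whenever a connection is held for more than 60 s without
        // being returned to the pool -- this points directly at leaking code paths.
        config.setLeakDetectionThreshold(60_000);

        // Retire connections periodically so stale ones do not pile up server-side.
        config.setMaxLifetime(30 * 60_000);

        return new HikariDataSource(config);
    }
}
```

With a hard cap per service, the total number of server connections is bounded by pool size × number of service instances, which is easier to reason about than repeatedly raising the server-side limit.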
I am trying to test a connection using Cloud Data Fusion to connect to an on-prem SQL Server. Our GCP Project does not use the default network but rather a custom VPC.
It's important to note that security is critical here, as this database contains healthcare data.
We currently have App Engine Flex code that uses pymssql to query the database on this SQL Server through the VPC, and want to test using Data Fusion.
I have copied the generated Data Fusion service account and added it to IAM with the Cloud Data Fusion API Service Agent role.
In Data Fusion I have configured
system.profile.properties.network = <VPC name>
I have verified that the username and password for authentication to SQL Server are valid for the database.
The VPC network allows ports 22 and 1433.
At this point I am just trying to get a successful connection to query a table in the database.
Here is the error message I get:
Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.
Thoughts?
It looks like you can access SQL Server from GCP (via the App Engine Flex Python code) but are having trouble connecting through Data Fusion. It also looks like authentication is not the problem (as the error message is about the connection).
One option I can think of is to try Cloud Data Fusion private IP, which allows you to connect securely to your VPC from Cloud Data Fusion. This is a new feature, and you will need Data Fusion product team support to use it.
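Independently of that, one way to tell a firewall or routing problem apart from a Data Fusion configuration problem is to test raw TCP reachability to the SQL Server from a VM in the same custom VPC. A minimal sketch, with the host as a placeholder:

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class ReachabilityCheck {
    public static void main(String[] args) {
        String host = "<SQL_SERVER_HOST>"; // placeholder: on-prem SQL Server address
        int port = 1433;                   // default SQL Server port

        try (Socket socket = new Socket()) {
            // Fail fast rather than waiting for the OS-level default timeout.
            socket.connect(new InetSocketAddress(host, port), 5_000);
            System.out.println("TCP connection to " + host + ":" + port + " succeeded.");
        } catch (Exception e) {
            System.out.println("TCP connection failed: " + e.getMessage());
        }
    }
}
```

If this times out from a VM on the same VPC and subnet that Data Fusion uses, the problem is in the network path (firewall rules, VPN/interconnect routing) rather than in Data Fusion itself.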
I installed the Oracle JDBC thin driver to connect to an on-prem Oracle DB, but when I test the connection I get a network adapter error.
I tried changing the host, but the result is the same.
When running the pipeline from GCS to BQ I get a network port error. Can we change the VPC the pipeline is running on?
Regarding the Oracle DB connection error: is the DB available on the public network for connections? Currently the Wrangler service in Cloud Data Fusion cannot talk to an on-prem DB over a private connection, and we are actively working on it.
However, if the DB is available on the public network, then it seems like an issue with the Oracle DB configuration. Can you please take a look at this answer and see if it helps: Oracle SQL Developer: Failure - Test failed: The Network Adapter could not establish the connection?
Also, are you able to connect to the Oracle DB through some other query tool, such as SqlWorkbench?
Breaking down your question:
1. Connecting to on-prem databases
It is possible nowadays to connect to on-premises databases. Make sure you have created an interconnect between the on-prem network and the network used by the Data Fusion instance, and that you have applied the right firewall rules (the logs suggest you are hitting a firewall issue). I suggest connecting to the database directly first, to confirm that the network setup works.
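As a quick sanity check of that network path, a direct JDBC test with the Oracle thin driver from a machine on that network can be useful; a sketch, assuming ojdbc is on the classpath, with the host, port, service name, and credentials as placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class OracleConnectTest {
    public static void main(String[] args) {
        // Placeholders -- use the on-prem listener host, port, and service name.
        String url = "jdbc:oracle:thin:@//<DB_HOST>:1521/<SERVICE_NAME>";

        try (Connection conn = DriverManager.getConnection(url, "<USER>", "<PASSWORD>")) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        } catch (Exception e) {
            // A "Network Adapter could not establish the connection" failure here points
            // at host/port/firewall reachability rather than authentication.
            System.out.println("Connection failed: " + e.getMessage());
        }
    }
}
```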
2. Change network configurations on the Data Fusion job.
You can specify parameters for your job. There are options to change the network and subnetwork that the job runs on under Configure > Compute config > Customize. If you use a Shared VPC you can also specify the host project.
I deployed my Java web application in a Bluemix Dedicated environment and use it with a Cloudant Dedicated NoSQL DB. When I tried to return 60k documents from this DB, the server returned
500 Error: Failed to establish a backside connection
to me. So I'm wondering about the connection timeout in Bluemix; there are posts where people claim that Bluemix resets a network connection after 120 seconds if no response is received. Is it possible to change this setting, or does someone know how to solve this problem?
P.S. When I deploy it on my computer it works fine, though of course it takes some time. This particular case could be solved using Cloudant pagination, but I'm developing a service for scheduling REST calls, and if Bluemix resets all connections after 2 minutes I'll have big problems with it.
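For reference, this is roughly the pagination I have in mind, sketched against the standard Cloudant/CouchDB _all_docs endpoint (account, database, credentials, and the page size are placeholders):

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PagedAllDocs {
    public static void main(String[] args) throws Exception {
        // Placeholders -- real account, database, and credentials go here.
        String base = "https://<ACCOUNT>.cloudant.com/<DATABASE>/_all_docs";
        String auth = Base64.getEncoder()
                .encodeToString("<USER>:<PASSWORD>".getBytes(StandardCharsets.UTF_8));

        // First page: just a limit, so no single response comes close to the
        // ~120 s cutoff mentioned above.
        String firstPage = fetch(base + "?include_docs=true&limit=1000", auth);

        // Next page: resume after the last doc id of the previous page (pulled out of
        // the JSON with whatever parser the application already uses).
        String lastDocId = "<LAST_DOC_ID_FROM_PREVIOUS_PAGE>"; // placeholder
        String nextPage = fetch(base + "?include_docs=true&limit=1000"
                + "&startkey_docid=" + lastDocId + "&skip=1", auth);
        System.out.println(firstPage.length() + " / " + nextPage.length());
    }

    static String fetch(String url, String basicAuth) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestProperty("Authorization", "Basic " + basicAuth);
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}
```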
Not sure which Bluemix Dedicated you are using, but the timeout is typically global. Paging would work, and I think a websocket-based approach would work as well.
-r
I have successfully deployed Google Cloud SQL with phpMyAdmin running on App Engine. When connecting to the database directly from App Engine the performance is fine, but when connecting via a SQL client or an external server it takes too long to load.
Can anyone tell me what might be the reason for this behavior?