google cloud not releasing connections - google-cloud-sql

We migrated our MySQL database to Google Cloud. Before the migration, the number of open connections to the database was constant at around 30. In the cloud, however, it fluctuates up and down, averaging around 200.
Is there any known issue with Google Cloud causing connections not to be released?

connect_timeout and max_connections of Cloud SQL instances have different default values compared to a standard MySQL installation. You need to look at your code to see how connect_timeout impacts the number of connections your application opens:
variable          Cloud SQL   MySQL
-----------------------------------
connect_timeout   60          10
max_connections   250         151
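To verify what your instance is actually using, you can check the live values from any MySQL client; these are standard MySQL commands, nothing Cloud SQL specific:
-- Effective values on the running instance
SHOW VARIABLES LIKE 'connect_timeout';
SHOW VARIABLES LIKE 'max_connections';
-- Number of currently open connections, to compare against the limit
SHOW STATUS LIKE 'Threads_connected';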

Related

Google Cloud SQL FATAL: hot standby is not possible because max_connections = 100 is a lower setting than on the master server (its value was 500)

I was running a PostgreSQL 13.4 replica instance on Google Cloud SQL. The replica keeps failing to start after maintenance with the error FATAL: hot standby is not possible because max_connections = 100 is a lower setting than on the master server (its value was 500), even though the max_connections = 500 flag, matching the primary instance, is explicitly set.
I found the post PostgreSQL 9.5 - change to max_connections not being visible to slaves, but the hot_standby flag is not modifiable in Cloud SQL.
Restarting and stopping replication have not worked so far.
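For anyone hitting the same error, it may help to compare the value actually in effect on the primary and on the replica; this is a standard catalog query, not specific to Cloud SQL:
-- Run on both the primary and the replica and compare the results;
-- 'source' shows where the effective value came from
SELECT name, setting, source
FROM pg_settings
WHERE name = 'max_connections';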

"error: too many connections for database 'postgres'" when trying to connect to any Postgres 13 instance

My team and I are currently experiencing an issue where we can't connect to Cloud SQL's Postgres instance(s) from anything other than the psql CLI tool. We get a too many connections for database "postgres" error (in pgAdmin, DBeaver, and our Node TypeORM/pg backend). It initially happened on our (only) Postgres database instance. After restarting, stopping and starting again, and increasing machine CPU/memory all proved to do nothing, I deleted the database instance entirely and created a new one from scratch.
However, after a few hours the problem came back. I know that we don't actually have too many connections, as I am able to query pg_stat_activity from the psql command line and see the following:
Only one of those (postgres username) connections is ours.
My coworker also can't connect at all - not even from psql cli.
If it matters, we are using PostgreSQL 13, europe-west2 (London), single zone availability, db-g1-small instance with 1.7GB memory, 10GB HDD, and we have public IP enabled and the correct IP addresses whitelisted.
I'd really appreciate it if anyone has any insights into what's causing this.
EDIT: I further increased the instance size (so that it is no longer a shared core), and I managed to successfully connect my backend to it. However, my psql CLI no longer works; it appears that only the first client to connect after a restart is allowed in (even if it disconnects, other clients can't connect...).
From the error message, it is clear that the database "postgres" has a custom connection limit (set, for example, by ALTER DATABASE postgres CONNECTION LIMIT 1), and apparently it is quite small. Why is everyone trying to connect to that database anyway? Usually the 'postgres' database is reserved for maintenance operations, and you should create other databases for daily use.
You can see the setting with:
select datconnlimit from pg_database where datname='postgres';
I don't know if the low setting is something you did, or maybe Google does it on its own for their cloud offering.
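If the limit does turn out to be a manually applied setting and you have sufficient privileges, it can in principle be removed (-1 means no per-database limit):
-- Remove the per-database connection limit on 'postgres'
ALTER DATABASE postgres CONNECTION LIMIT -1;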
@jjanes had the right idea.
I created another database within the Cloud SQL instance that wasn't named postgres, and then it was fine.
It didn't have anything to do with maximum connection settings (this was within Google Cloud SQL) or with not closing connections (TypeORM/pg already does that).
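For completeness, the fix amounts to something like the following, where app_db is a placeholder name:
-- Create a dedicated application database instead of connecting to 'postgres'
CREATE DATABASE app_db;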

How to change max_connections for Postgres through SQL command

We have a hosted PostgreSQL, with no access to the system or *.conf files.
I do have admin access and can connect to it using Oracle SQL Developer.
Can I run any command to increase max_connections? All other parameters seem to be OK; shared memory and buffers can hold more connections, so there is no problem there.
Changing the max_connections parameter requires a PostgreSQL restart.
Commands
Check max_connections to keep the current value in mind:
SHOW max_connections;
Change the max_connections value:
ALTER SYSTEM SET max_connections TO '500';
Restart the PostgreSQL server.
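As a sanity check (pg_settings has exposed a pending_restart column since PostgreSQL 9.5), you can confirm the new value is staged and waiting on the restart:
-- Reload the configuration so ALTER SYSTEM's change is picked up
SELECT pg_reload_conf();
-- pending_restart = true means the new value takes effect after restart
SELECT name, setting, pending_restart
FROM pg_settings
WHERE name = 'max_connections';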
Apparently, the hosted Postgres we are using (compose.io) does not provide this option.
So the workaround is to use PgBouncer to manage your connections better.
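A minimal PgBouncer sketch, assuming it runs next to the application; the host, database name, and pool sizes here are placeholder assumptions to adapt:
[databases]
; placeholder connection string for the target database
mydb = host=127.0.0.1 port=5432 dbname=mydb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
; transaction pooling lends a server connection only for the duration of a transaction
pool_mode = transaction
; small pool of real server connections shared by many clients
default_pool_size = 20
max_client_conn = 500
The application then connects to port 6432 instead of 5432, and PgBouncer multiplexes those clients onto the small pool of real server connections.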

PostgreSQL dies with 235+ concurrent connections

I have installed PostgreSQL on an Azure VM and am running tests to see if it can support the expected load. I have increased the max_connections value to 1000, but when I run ab -c 300, PostgreSQL stops responding. Are there any other settings I should be changing?
Thanks, Kate.
PostgreSQL will perform best with far fewer than 1000 connections on most hardware, usually fewer than 100. If your application cannot queue work using a connection pool, you should put an external connection pool such as PgBouncer between your application and PostgreSQL.
See: https://wiki.postgresql.org/wiki/Number_Of_Database_Connections
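It can also help to see how many of the open connections are actually doing work at a given instant; a standard query for that:
-- Most connections in an oversized pool are usually idle;
-- the count of 'active' rows is what the hardware really has to serve
SELECT state, count(*)
FROM pg_stat_activity
GROUP BY state
ORDER BY count(*) DESC;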

Play framework configuration for connection limit 60 of PostgreSQL on Heroku

I have set up three Play 2.1.1 applications (API, Admin Panel, and Website) on Heroku with PostgreSQL as the database. Of these, the API and the Admin Panel access the database. The PostgreSQL plan I have set up has the following configuration:
Connection Limit : 60
Row Limit : Unlimited
RAM : 410 MB
I have the following database configuration in the Play application for both the API and the Admin Panel:
db.default.url=DATABASE_URL
db.default.partitionCount=1
db.default.maxConnectionsPerPartition=10
db.default.minConnectionsPerPartition=5
db.default.driver="org.postgresql.Driver"
db.default.idleMaxAge=10 minutes
db.default.idleConnectionTestPeriod=30 seconds
db.default.connectionTimeout=20 second
db.default.connectionTestStatement="SELECT 1"
db.default.maxConnectionAge=30 minutes
I am getting a BoneCP timeout exception when connecting to the database. I just want someone to verify that the above configuration is correct so that I can debug in the right direction.
Please help me with the same.
Thank you.
Heroku closes all connections after 30 seconds, so your maxConnectionAge has to be lower than 30 seconds.
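If that claim holds, tightening the BoneCP settings in the same format as the configuration above might look like this (the exact values are illustrative assumptions, not verified Heroku guidance):
db.default.maxConnectionAge=25 seconds
db.default.idleMaxAge=25 seconds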