Issues connecting to a Google Cloud SQL instance from Google Cloud Run - postgresql

I have a PostgreSQL Google Cloud SQL instance and I'm trying to connect a FastAPI application running in Google Cloud Run to it; however, I'm getting ConnectionRefusedError: [Errno 111] Connection refused errors.
My application uses the databases package for async database connections:
database = databases.Database(sqlalchemy_database_uri)
Which then tries to connect on app startup through:
@app.on_event("startup")
async def startup() -> None:
    if not database.is_connected:
        await database.connect()  # <--- this is where the error is raised
From reading through the documentation here it suggests forming the connection string like so:
"postgresql+psycopg2://user:pass@/dbname?unix_sock=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME/.s.PGSQL.5432"
I've tried several variations of the URL: using host instead of unix_sock, as the SQLAlchemy docs seem to suggest, and removing the .s.PGSQL.5432 suffix, as some other SO posts suggest, all to no avail.
I've added the Cloud SQL connection to the instance in the Cloud Run dashboard and added a Cloud SQL Client role to the service account.
I'm able to connect to the databases locally with the Cloud SQL Auth Proxy.
I'm at a bit of a loss on how to fix this, or even how to debug it, as there doesn't seem to be any easy way to SSH into the container and try things out. Any help would be greatly appreciated, thanks!
UPDATE
I'm able to connect directly with sqlalchemy with:
from sqlalchemy import create_engine
engine = create_engine(url)
engine.connect()
Where url is any of these formats:
"postgresql://user:pass@/db_name?host=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME"
"postgresql+psycopg2://user:pass@/db_name?host=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME"
"postgresql+pg8000://user:pass@/db_name?unix_sock=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME/.s.PGSQL.5432"
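As a sanity check, the shape of these URLs can be inspected with the standard library (a sketch using the same placeholder credentials; the key point is that the URL carries no TCP host, and the socket directory rides in the query string):

```python
from urllib.parse import urlsplit, parse_qs

# Placeholder credentials and instance name, matching the examples above.
url = "postgresql+psycopg2://user:pass@/db_name?host=/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME"
parts = urlsplit(url)
query = parse_qs(parts.query)

assert not parts.hostname                         # no TCP host in the URL itself
assert query["host"][0].startswith("/cloudsql/")  # driver connects via the socket dir
```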
Is there something about the databases package's async nature that's causing issues?

Turns out this was a bug with the databases package. It should now be resolved by https://github.com/encode/databases/pull/423

Related

Connecting to Cloud SQL using the Cloud SQL Auth Proxy is non-responsive in MySQL Workbench

I'm trying to create a connection to my SQL instance in GCP following their guide:
https://cloud.google.com/sql/docs/mysql/connect-admin-proxy
I set up the proxy and it's running, but I can't connect to my server.
I use MySQL Workbench and the connection just times out.
I went through the troubleshooting guide and could not find the issue.
There are no errors in the cloud logs.
I'm trying to connect with the project's owner Google account (I have all the permissions).
The Cloud SQL Admin API is enabled.
I entered the password in the menu.
I saw another Google guide that says to whitelist your IP.
I did this and got the same error.
It seems like a firewall or something is blocking GCP from connecting to the server, but I'm not sure what.
The solution for me was:
Use a Cloud SQL authorized network, as JM Gelilio suggested, and use pgAdmin 4 for Postgres connections.

Vapor app on Cloud Run throwing when connecting to Cloud SQL

I've built a Vapor app and am trying to deploy it to Google Cloud Run. At the same time I'm trying to connect my app to a Cloud SQL instance using Unix sockets, as documented here. There was also an issue opened on Vapor's Postgres library here that mentions success using Unix sockets to connect from Cloud Run.
Setup
My code looks like this:
let postgresConfig = PostgresConfiguration(
    unixDomainSocketPath: Environment.get("DB_SOCKET_PATH") ?? "/cloudsql",
    username: Environment.get("DATABASE_USERNAME") ?? "vapor_username",
    password: Environment.get("DATABASE_PASSWORD") ?? "vapor_password",
    database: Environment.get("DATABASE_NAME") ?? "vapor_database")
app.databases.use(.postgres(configuration: postgresConfig), as: .psql)
I've also tested to see if the environment variables are there using this snippet, which didn't throw the fatalError.
if Environment.get("DB_SOCKET_PATH") == nil {
    fatalError("No environment variables found...")
}
My DB_SOCKET_PATH looks like cloudsql/project-id:us-central1:sql-instance-connection-name/.s.PGSQL.5432
I've also set up the correct user and database in the Cloud SQL instance, as well as enabled public ip connections.
Whats happening?
When I deploy this image to Google Cloud Run, it throws the error: Fatal error: Error raised at top level: connect(descriptor:addr:size:): No such file or directory (errno: 2): file Swift/ErrorType.swift, line 200
What have you done?
I tried testing Unix-socket connections on my local machine, and I found that when I used an incorrect socket path, it would throw this same error.
I also tried commenting out all migration and connection code, which fixed the error, and narrowed it down to the PostgresConfiguration code.
What are you trying to do?
I'm trying to figure out how to connect my cloud run app to a cloud sql instance. Am I missing a configuration somewhere on my instances? Or am I making a mistake with my unix path/vapor implementation?
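One detail worth double-checking (an observation on the paths quoted above, not a confirmed fix): the documented socket directory is absolute (/cloudsql/...), while the DB_SOCKET_PATH quoted above has no leading slash, and connect() reports errno 2 exactly when the socket file doesn't exist at the resolved path. A quick sketch of that check, in Python for brevity:

```python
import os.path

# Paths from the question, for illustration; the instance name is a placeholder.
quoted = "cloudsql/project-id:us-central1:sql-instance-connection-name/.s.PGSQL.5432"
documented = "/cloudsql/project-id:us-central1:sql-instance-connection-name/.s.PGSQL.5432"

assert not os.path.isabs(quoted)  # relative: resolved against the app's working dir
assert os.path.isabs(documented)  # absolute: the directory Cloud Run actually mounts
```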

Is it possible to have an SSL connection to a postgresql db using Asyncpg?

I've been trying to deploy a Discord bot on Heroku for days; I use a PostgreSQL database hosted on AWS with Heroku Postgres. I've had many problems trying to connect to the database over SSL using asyncpg. I could find solutions to some of the problems on the internet, but I am now stuck with this error:
2020-12-28T10:33:11.856268+00:00 app[worker.1]: raise exceptions.InterfaceError(
2020-12-28T10:33:11.856403+00:00 app[worker.1]: asyncpg.exceptions._base.InterfaceError: `ssl` parameter can only be enabled for TCP addresses, got a UNIX socket path: '/run/postgresql/.s.PGSQL.5432'
This is how I connect to the database :
ctx = ssl.create_default_context(cafile='rds-ca-2019-eu-west-1.pem')
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE
bot.con = await asyncpg.create_pool(dsn=DATABASE_URI, ssl=ctx)
...
bot.loop.run_until_complete(create_db_pool())
I have no clue why I get this error when deploying my code on Heroku; I don't have this problem running it from my computer...
I tried to read the asyncpg documentation, but not much came out of it. I don't know why it's using Unix sockets and didn't really understand how that works, but I've read that asyncpg is supposed to support SSL connections now...
Do you think there is a way to make this work, or should I just switch to psycopg2?
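For context on the error message: asyncpg decides between TCP and a Unix socket from the host portion of the DSN, and per the traceback above it only accepts ssl for TCP addresses. A minimal sketch of that distinction, using a hypothetical Heroku-style hostname:

```python
from urllib.parse import urlsplit

# Hypothetical TCP hostname, for illustration only.
tcp_dsn = "postgres://user:pass@ec2-1-2-3-4.eu-west-1.compute.amazonaws.com:5432/dbname"
socket_host = "/run/postgresql"  # the path from the traceback above

def looks_like_unix_socket(host: str) -> bool:
    # A host beginning with "/" is a socket directory, not a TCP address.
    return host.startswith("/")

assert not looks_like_unix_socket(urlsplit(tcp_dsn).hostname)  # ssl= accepted
assert looks_like_unix_socket(socket_host)                     # ssl= rejected
```

So the question is really why the DSN resolves to a socket path on Heroku but not locally; an explicit hostname in DATABASE_URI would force the TCP path.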

How to connect cloud function to cloudsql

How can I connect a Cloud Function to Cloud SQL?
import psycopg2

def hello_gcs(event, context):
    print("Imported")
    conn = psycopg2.connect("dbname='db_bio' user='postgres' host='XXXX' password='aox199'")
    print("Connected")
    file = event
    print(f"Processing file: {file['name']}.")
I could not connect to Cloud SQL's Postgres version; please help.
Google Cloud Function provides a unix socket to automatically authenticate connections to your Cloud SQL instance if it is in the same project. This socket is located at /cloudsql/[instance_connection_name].
conn = psycopg2.connect(host='/cloudsql/[instance_connection_name]', dbname='my-db', user='my-user', password='my-password')
You can find the full documentation page (including instructions for authentication from a different project) here.
You could use the Python example mentioned on this public issue tracker, or use the Node.JS code shown in this document to connect to Cloud SQL using Cloud Functions.
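Putting the socket approach together, the connection parameters can be assembled from environment variables along these lines (a sketch; the variable names and defaults are hypothetical, and the final connect call is left commented out since it only succeeds inside a deployed function):

```python
import os

# Hypothetical env var names and placeholder defaults, for illustration.
params = {
    "host": "/cloudsql/" + os.environ.get("INSTANCE_CONNECTION_NAME",
                                          "my-project:us-central1:my-instance"),
    "dbname": os.environ.get("DB_NAME", "my-db"),
    "user": os.environ.get("DB_USER", "my-user"),
    "password": os.environ.get("DB_PASS", "my-password"),
}
# import psycopg2
# conn = psycopg2.connect(**params)  # works inside the deployed function

assert params["host"].startswith("/cloudsql/")
```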

Google Cloud SQL: SQLSTATE[HY000] [2013] Lost connection to MySQL server at 'reading initial communication packet', system error: 0

I'm desperate since my Google Cloud SQL instance went down. I could connect to it yesterday without problems, but since this morning I'm unable to connect to it in any way; it produces the following error: The database server returned this error: SQLSTATE[HY000] [2013] Lost connection to MySQL server at 'reading initial communication packet', system error: 0
This is what I did to try to fix it:
restarted the instance
added authorized IP addresses in CIDR notation
reset the root password
restored a backup
pinged the IP address and got a response
All these actions completed, but I'm still unable to connect through:
PHP
MySQL Workbench
the Ubuntu MySQL command line
All without luck. What can I do to repair my Cloud SQL instance? Is anyone else having this problem?
I'm from the Cloud SQL team. We are looking into this issue, it should be resolved soon. See https://groups.google.com/forum/#!topic/google-cloud-sql-announce/SwomB2zuRDo. Updates will be posted on that thread (and if there's anything particularly important I'll edit this post).
The problem seems to only affect connections from outside Google Cloud. Clients connecting from App Engine and Compute Engine should work fine.
Our company has the same problem.
We are unable to connect through both MySQL Workbench and the MySQL command line.
Our Google App Engine application has no problems connecting, since it's not using an external IP.
I encountered the same problem. You need to find out your public IP address; to do that, type "my public ip" into Google. Now click on the Cloud SQL instance you created, click on the ACCESS CONTROL tab, and then click on the Authorization tab under it. Under Authorized networks, give the network any name you want and paste your public IP address into it. Now save the changes and try to run the command from the console. It should work fine.