When using Google App Engine Flexible with Python, how can psycopg2 be used directly (without SQLAlchemy) to access a Cloud SQL PostgreSQL database?
Hello myselfhimself,
Here is a solution:
In your app.yaml, add an environment variable that imitates the SQLAlchemy URI from the Google App Engine Flexible Python Cloud SQL documentation, but without the +psycopg2 driver suffix:
env_variables:
PSYCOPG2_POSTGRESQL_URI: postgresql://user:password@/databasename?host=/cloudsql/project-name:region:database-instance-name
In any Python file to be deployed and run, pass that environment variable directly to psycopg2's connect call. This leverages psycopg2.connect's ability to hand the URI straight to the underlying libpq client library (this might not work with older PostgreSQL client versions).
import os
import psycopg2
conn = psycopg2.connect(os.environ['PSYCOPG2_POSTGRESQL_URI'])
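To verify the connection works, you can run a quick sanity query (a minimal sketch; no application tables are assumed):

import os
import psycopg2

# connect using the URI from app.yaml's env_variables (or the shell export below)
conn = psycopg2.connect(os.environ['PSYCOPG2_POSTGRESQL_URI'])
cur = conn.cursor()
cur.execute("SELECT version();")  # simple check that the Cloud SQL socket answers
print(cur.fetchone())
cur.close()
conn.close()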
When working locally with the Google Cloud SQL Proxy tool, make sure you set the URI environment variable first if your local server is not aware of app.yaml:
export PSYCOPG2_POSTGRESQL_URI="postgresql://user:password@/databasename?host=/cloudsql/project-name:region:database-instance-name"
./cloud_sql_proxy -instances=project-name:region:database-instance-name=tcp:5432
# somewhat later:
python myserver.py
I hope it will work for you too :)
I have a Google App Engine app written in Node.js which connects to a Cloud SQL Postgres instance under the same GCP project. Previously I was using the standard environment and it worked fine, but when I switch to a flex environment, the SQL connection no longer works.
My app.yaml:
runtime: nodejs
env: flex
manual_scaling:
instances: 1
beta_settings:
cloud_sql_instances: mopho-217900:us-central1:mopho
I'm connecting to the DB through Knex with hostname /cloudsql/mopho-217900:us-central1:mopho. This gives me the following error:
Error: connect ENOENT /cloudsql/mopho-217900:us-central1:mopho/.s.PGSQL.5432
The username/password/database I'm providing are all valid, and continue to work if I switch back to the standard environment. It only fails when I switch to the flexible environment.
It turns out that the Cloud SQL Admin API needed to be enabled (thanks to this Google Groups post for the pointer).
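For reference, assuming the gcloud CLI is installed and pointed at the right project, the same API can be enabled from the command line:

# enable the Cloud SQL Admin API for the current project
gcloud services enable sqladmin.googleapis.com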
How can I connect a Cloud Function to Cloud SQL?
import psycopg2

def hello_gcs(event, context):
    print("Imported")
    conn = psycopg2.connect("dbname='db_bio' user='postgres' host='XXXX' password='aox199'")
    print("Connected")
    file = event
    print(f"Processing file: {file['name']}.")
I could not connect to Cloud SQL's Postgres instance; please help.
Google Cloud Functions provides a Unix socket that automatically authenticates connections to your Cloud SQL instance if it is in the same project. This socket is located at /cloudsql/[instance_connection_name].
conn = psycopg2.connect(host='/cloudsql/[instance_connection_name]', dbname='my-db', user='my-user', password='my-password')
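Putting that together with the question's handler, a minimal sketch might look like the following (the instance connection name and credentials are placeholder assumptions; reusing a module-level connection across warm invocations is one common pattern, not a requirement):

import psycopg2

# module-level connection so warm function instances can reuse it
# (hypothetical instance connection name and credentials)
conn = psycopg2.connect(
    host='/cloudsql/my-project:us-central1:my-instance',
    dbname='db_bio',
    user='postgres',
    password='my-password')

def hello_gcs(event, context):
    cur = conn.cursor()
    cur.execute("SELECT 1")  # sanity check over the Unix socket
    cur.fetchone()
    cur.close()
    print(f"Processing file: {event['name']}.")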
You can find the full documentation page (including instructions for authentication from a different project) here.
You could use the Python example mentioned on this public issue tracker, or use the Node.JS code shown in this document to connect to Cloud SQL using Cloud Functions.
I have a Compose for PostgreSQL service on IBM Bluemix which isn't letting me run PostGIS functions in my Cloud Foundry Rails app. I have run "CREATE EXTENSION PostGIS;" and I have also added the adapter to database.yml. Compose for PostgreSQL says PostGIS comes installed by default.
I am using Ruby on Rails with the rgeo gem and the error is
ERR NoMethodError: undefined method `st_point' for #
Can you please let me know if there is anything I need to do to get PostGIS working?
Please raise a support request asking for the PostGIS plugin to be enabled on your Compose instance.
Answered my own question. The problem was with the rgeo gem and the adapter. I needed the postgis:// adapter for working with the gem.
Bluemix does not allow you to change the adapter in their connections. It will always be postgresql. To get around this I set a CUSTOM_DATABASE_URL environment variable with the connection string postgis://<username>:<password>@host:port/<db_name>. Using the cf client, this would look like
cf set-env <app-name> CUSTOM_DATABASE_URL postgis://<username>:<password>@host:port/<db_name>
Then, in the command for my container in manifest.yml, I prepended an assignment of CUSTOM_DATABASE_URL to DATABASE_URL, specifically
DATABASE_URL=$CUSTOM_DATABASE_URL &&.....
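For illustration, the relevant line in manifest.yml might look like this (the Rails start command after the && is an assumed example; substitute your app's own):

# manifest.yml (excerpt); the start command is hypothetical
command: DATABASE_URL=$CUSTOM_DATABASE_URL && bundle exec rails server -p $PORT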
It's a workaround for now, until Bluemix allows us to change the adapter in the connections.
I installed Orange and I have data in a local PGSQL server.
PGSQL listens on the default port which is 5432.
I have the psycopg2 lib installed, and I also wrote a short Python script which pulls some data from the database to check that the module is installed correctly.
Firewall is down.
Python Env Path is set to use 3.4.4 which is what Orange3 uses.
When I add a SQL Table widget, I get an error suggesting "please install a backend to use this widget".
Documentation on the Orange site mentions that all that needs to be done for the DB integration is installing the Python module, but this doesn't work for me.
Help would be appreciated.
Links:
https://docs.orange.biolab.si/3/visual-programming/widgets/data/sqltable.html
I am building a Django site on Google Compute Engine, and I want to put my database in Cloud SQL. Is it possible?
What is the most common way to do this: installing MySQL on a virtual machine, or using a Cloud SQL instance?
Thank you.
You can use either Google Cloud SQL or manage your own SQL database, depending on your needs.
To use Cloud SQL, you'd want to follow the instructions here: https://developers.google.com/cloud-sql/docs/external
If you want to manage your own SQL database, you can install MySQL or some other database on an instance. Depending on your needs, you can start with a g1-small with a fairly large disk attached and then later use a larger instance type to run your database.
If you're running your own database, you'll need to make sure to take regular backups and copy them off the database machine, to someplace like Google Cloud Storage. If you're using Cloud SQL, you can use the console or the API to schedule database backups.
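For example, if you go the Cloud SQL route, scheduled backups can also be enabled from the CLI (the instance name is a placeholder):

# enable daily automated backups starting at 23:00 UTC for a Cloud SQL instance
gcloud sql instances patch my-instance --backup-start-time 23:00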
This answer is following up from "Well, the problem is that to use Cloud SQL, I must connect using JDBC. I'm using Python. How I can do?"
I am not from the Python world, but I recently connected my Java app on a GCE instance to a Cloud SQL DB (via the cloud-sql-proxy approach, as described here: https://cloud.google.com/sql/docs/compute-engine-access) and didn't see any reason why it shouldn't work for Python too.
Here is what I just tried; it easily connected my test Python app to a Cloud SQL DB via the cloud-sql-proxy:
Step 1: Download and run the proxy on a local port, like below (this establishes a channel between the local port 3306 and the Cloud SQL database instance identified by the connection name "PROJ_NAME:REGION:SQL_NAME"):
sudo wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64
sudo mv cloud_sql_proxy.linux.amd64 cloud_sql_proxy
sudo chmod +x cloud_sql_proxy
sudo ./cloud_sql_proxy -instances=PROJ_NAME:REGION:SQL_NAME=tcp:3306 &
Step 2: Make sure that python-mysqldb is installed:
sudo apt-get install python-mysqldb
Step 3: Run the following test program to connect to the Cloud SQL DB via the local port 3306 set up by the proxy:
import MySQLdb

conn = MySQLdb.connect(host="127.0.0.1", port=3306, user="root",
                       passwd="my_root_password", db="my_db")
x = conn.cursor()
try:
    x.execute("""INSERT INTO Test(test_id) VALUES ('111')""")
    conn.commit()
except:
    conn.rollback()
conn.close()
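One small note on the snippet above: MySQLdb supports parameterized queries, which are preferable to inlining values (same insert, sketched with a placeholder):

# pass the value as a parameter instead of embedding it in the SQL string
x.execute("INSERT INTO Test(test_id) VALUES (%s)", ('111',))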
Hope it helps.