Connect to PostgreSQL database using Google Apps Script from Google Spreadsheet

I've seen this question asked a few times before, most notably this StackOverflow question: Automate Google Spreadsheet data load from external database
and here on Google issue tracker:
https://issuetracker.google.com/issues/36752790
but I'm looking for a more up-to-date answer.
I have a Cloud SQL instance of a PostgreSQL database running on GCP. I would like to automatically connect to it and populate a spreadsheet on a daily basis, which is possible for MySQL databases (sketched below for comparison).
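For comparison, the MySQL path referred to above works roughly like the following minimal Apps Script sketch; the instance connection name, database, table, sheet name and credentials are placeholders, and the target sheet is assumed to exist:

// Working Cloud SQL MySQL import for comparison (Apps Script).
// Every <...> value and the 'Data' sheet name are placeholders.
function importFromCloudSqlMySql() {
  var conn = Jdbc.getCloudSqlConnection(
      'jdbc:google:mysql://<INSTANCE_CONNECTION_NAME>/<DATABASE_NAME>',
      '<USER>', '<PASSWORD>');
  var results = conn.createStatement().executeQuery('SELECT * FROM <TABLE>');
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Data');
  var numCols = results.getMetaData().getColumnCount();
  var row = 1;
  while (results.next()) {
    for (var col = 1; col <= numCols; col++) {
      sheet.getRange(row, col).setValue(results.getString(col));
    }
    row++;
  }
  results.close();
  conn.close();
}

// A time-driven trigger makes the import run daily.
function installDailyTrigger() {
  ScriptApp.newTrigger('importFromCloudSqlMySql').timeBased().everyDays(1).create();
}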
The docs on Google say to use a jdbc socket factory to connect to a PostgreSQL db using the format:
"jdbc:postgresql://google/<DBNAME>?"
"socketFactory=com.google.cloud.sql.postgres.SocketFactory"
"&socketFactoryArg=<InstanceName>"
I've also tried the format (slightly different) from the Google GitHub page (linked under the example in the docs):
"jdbc:postgresql://google/<DATABASE_NAME>?cloudSqlInstance=<INSTANCE_CONNECTION_NAME>&socketFactory=com.google.cloud.sql.postgres.SocketFactory&user=<POSTGRESQL_USER_NAME>&password=<POSTGRESQL_USER_PASSWORD>"
After creating a URL, I run:
var conn = Jdbc.getConnection(dbUrl)
But I get this error:
Connection URL uses an unsupported JDBC protocol.
I'm wondering if this is possible yet or if it is user error?
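For reference, the whole attempt described above condenses to an Apps Script sketch like this (the <...> placeholders stand for the database name, instance connection name and credentials):

function connectToCloudSqlPostgres() {
  // URL built from the formats quoted above; fill in every <...> placeholder.
  var dbUrl = 'jdbc:postgresql://google/<DATABASE_NAME>'
      + '?cloudSqlInstance=<INSTANCE_CONNECTION_NAME>'
      + '&socketFactory=com.google.cloud.sql.postgres.SocketFactory'
      + '&user=<POSTGRESQL_USER_NAME>'
      + '&password=<POSTGRESQL_USER_PASSWORD>';
  // This is the call that fails with
  // "Connection URL uses an unsupported JDBC protocol."
  var conn = Jdbc.getConnection(dbUrl);
  return conn;
}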

I found JDBC in Google Sheets, especially for Postgres, to be buggy, so I built SeekWell which lets you automatically send data from Postgres to Sheets. You can also sync changes from Sheets back to a database and schedule refreshes daily, hourly or every five minutes.
Disclaimer: I built this.

Related

GCP Cloud SQL Postgres connection count missing

I have a problem after migrating a Postgres instance from 9 to 12; we followed this method: https://cloud.google.com/sql/docs/postgres/upgrade-db.
Before the migration, I had a graph with active connections on the instance page, and after the migration (on the new instance, Postgres 12) I don't have values on this graph.
I am receiving a message like "No data is available for the selected time frame."
Also, if I run a query like "SELECT * FROM pg_stat_activity;"
I see the number of connections.
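For illustration, getting just the count from that view can be sketched in Node.js, assuming the pg npm package and a DATABASE_URL environment variable pointing at the instance (neither is part of the original question):

// Counts current connections using the same system view as the manual check above.
const { Client } = require('pg');

async function countConnections() {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  const { rows } = await client.query(
      'SELECT count(*) AS connections FROM pg_stat_activity');
  console.log('Current connections:', rows[0].connections);
  await client.end();
}

countConnections().catch(console.error);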
At this moment the connection count graph for Postgres 12 is not supported. The Cloud SQL engineering team is working on implementing this feature, as it is already implemented for other Postgres versions. This is not just happening to you; it is a general behaviour.
The best way to report an issue is to use the Google Issue Tracker, where you can share your project information.
I found the same error reported by another client, but I cannot relate it to your scenario.

Heroku Connect, detect Salesforce update in Postgres

So I have an app taking advantage of Heroku Connect to sync data between platforms.
I need to find a way to detect when an update has been made by Salesforce (or at least, when the sync has been executed). I'm using Sequelize in Node.js, but of course the hooks don't work since Heroku Connect works directly on the DB and doesn't use the ORM.
So I'm wondering what are my options here.
The solutions that come to my mind (likely there are more):
Check out the Heroku Connect system tables like _trigger_log. This table will give you an exact log of the actions HC took (updates/inserts/deletes) with information about the record. Yes, you would need to poll it :)
Postgres brings its own queue system with LISTEN and NOTIFY. You can write your own database trigger that reacts to changes in the Salesforce tables, and have a listening worker process on the LISTEN channel in PostgreSQL (see the sketch below).
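A minimal sketch of the LISTEN/NOTIFY option, assuming Node.js with the pg package; the channel name and the salesforce.contact table are placeholders for whatever Heroku Connect maps in your setup:

// Installs a trigger that NOTIFYs on every insert/update Heroku Connect makes,
// then listens for those notifications from a worker process.
const { Client } = require('pg');

const ddl = `
  CREATE OR REPLACE FUNCTION notify_salesforce_change() RETURNS trigger AS $$
  BEGIN
    PERFORM pg_notify('salesforce_change', row_to_json(NEW)::text);
    RETURN NEW;
  END;
  $$ LANGUAGE plpgsql;

  DROP TRIGGER IF EXISTS salesforce_change_trg ON salesforce.contact;
  CREATE TRIGGER salesforce_change_trg
  AFTER INSERT OR UPDATE ON salesforce.contact
  FOR EACH ROW EXECUTE PROCEDURE notify_salesforce_change();
`;

async function main() {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  await client.query(ddl);                        // install the trigger once
  await client.query('LISTEN salesforce_change'); // subscribe to the channel
  client.on('notification', (msg) => {
    // msg.payload holds the JSON of the row Heroku Connect just wrote
    console.log('Salesforce sync touched a row:', msg.payload);
  });
}

main().catch(console.error);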

High Availability AEM Author

I’ve been working with AEM for over a year now and lately I’ve been trying to move into a high availability setup for author.
My problem is that whenever I spin up a server, add sites, and spin up another server, the data doesn't persist to the new instance. I know why this doesn't work in the traditional setup (the repository is stored locally on the file system). However, I've attempted using the S3 backend, and it results in the same problem where the data doesn't persist onto the new instance.
I've read about using MongoMK (https://helpx.adobe.com/experience-manager/6-3/sites/deploying/using/recommended-deploys.html), i.e. MongoDB as a store, but they also recommended using S3 as the backend.
My question is, does anyone have any experience with multiple AEM author instances sharing the same data and node stores, if so do you have any suggestions as to how to get this working or resources where I can read about this?
After further research it seems the only option for backend clustering is to use MongoDB. My attempts to use MongoDB with AEM as a backend have failed. When I attempt to use the crx3 and crx3mongo run modes, it looks like AEM hangs after opening a connection to Mongo. I have verified that nothing is getting placed into the DB: show dbs returns 0.000GB for the corresponding database.

Connect AppMaker to Google SQL

I tried to connect AppMaker to an existing Google SQL database without success. In Google Cloud, I created a second generation instance (europe-west1) and allowed every IP (for the test). Next, I created a user, connected Workbench to the database and created a schema, without problems.
In AppMaker I use the following address to connect to the database:
[domainName:]myprojectID:regionName:myinstanceID/mydatabase
(with and without domain)
Feedback:
Unable to connect to Google Cloud SQL instance.
The Google Cloud SQL address may be incorrect or this App Maker editor may not have permission to access the database. You can find more information about using Google Cloud SQL in our documentation.
How can I connect to Google SQL with AppMaker?
Thanks.
The process I followed to connect to the database is the following:
Create a second generation instance by following the steps here (Please make sure to follow the steps only for "Create a Cloud SQL instance" and then create a database inside that instance)
Now, go to the IAM & ADMIN section of your project. Click on the blue "Add" member option at the top. Type "appmaker-maestro@appspot.gserviceaccount.com" in the "Members" field and from the "Roles" dropdown select Project->Editor.
Return to your second generation SQL instance and copy the value of the Instance connection name property of your instance.
Follow the steps for second generation found here to connect to the database from AppMaker and when asked for the instance address, type the value you copied on the previous step and add /yourdatabase at the end of it. For example: my-foto-app:us-central1:myinstance/userphotodatabase
You should now be asked to enter the user and password. Do so and you are finished.
Please note that I haven't tested out this with a europe location but only on a us central location. I hope this helps and works!
Second generation instances are not yet supported by App Maker; switch to first generation and it will work.
FYI, 2nd gen Cloud SQL is supported by App Maker and has been for a while.
As of today, PostgreSQL is not yet supported.

How do you get data from a remote MySQL database into your app?

I have just finished a couple of tutorials on populating a SQLite database with data and then using this data within the app.
However none of these tutorials show how to connect to a remote server in order to obtain data.
QUESTION:
How do you get data from a remote MySQL database into your app?
What options do you have?
Remote access is not a good idea; you would have to allow everyone to access it since it's an app. The best way to go about this is to build a layer between your app and the database. From the app you would access a server-side script which does the database work and responds to your app.
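As an illustration of that layer, a minimal sketch using Node.js with the express and mysql2 packages (the table, columns, and endpoint names are hypothetical):

// A thin HTTP layer in front of MySQL: the mobile app calls /items over HTTP
// instead of talking to the database directly.
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
const pool = mysql.createPool({
  host: 'localhost',               // the MySQL server stays private to this backend
  user: 'app_user',
  password: process.env.DB_PASSWORD,
  database: 'appdata',
});

app.get('/items', async (req, res) => {
  const [rows] = await pool.query('SELECT id, name FROM items LIMIT 100');
  res.json(rows);                  // the app parses this JSON response
});

app.listen(3000, () => console.log('API listening on :3000'));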
Well, there are methods to allow remote access to your MySQL database on your server so that you can query the database remotely. I think this is the cleanest solution. Check out this link: http://www.cyberciti.biz/tips/how-do-i-enable-remote-access-to-mysql-database-server.html