How can I connect to Heroku Postgres from a Google Spreadsheet?

I'd like to use a Google spreadsheet to display my database analytics.
I'd like to be able to run summary queries against my Heroku Postgres database using Google Apps Script and then display and chart the results in a Google spreadsheet.
Heroku offers a number of ways to connect to Heroku Postgres:
https://devcenter.heroku.com/articles/heroku-postgresql
Likewise, Google Apps Script offers access to a number of different external services:
https://developers.google.com/apps-script/defaultservices
I've never attempted this before and so am interested in what is simplest.
JDBC seems possible, but are there any other options?
As far as I can see, the only overlap between the two is JDBC, which I have no experience with and which feels like a bit of a heavyweight third protocol to use to get between the two systems.
Is JDBC the best way to get the data across, or is there something simpler I'm missing?

Set up a dataclip from dataclips.heroku.com with your desired data described as a SQL query.
Append .csv to the resulting URL.
Use that URL in the Google spreadsheet's importData function, like so:
=importData("https://dataclips.heroku.com/[your-dataclip].csv")
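If you would rather drive the refresh from Apps Script itself (for example on a time-driven trigger) instead of a cell formula, a minimal sketch along these lines should work; the dataclip URL and the sheet contents are placeholders, and UrlFetchApp/Utilities/SpreadsheetApp are standard Apps Script services:
// Minimal sketch: pull the dataclip's CSV export into the active sheet.
function refreshFromDataclip() {
  var url = 'https://dataclips.heroku.com/your-dataclip.csv'; // placeholder dataclip URL
  var csv = UrlFetchApp.fetch(url).getContentText();          // download the CSV export
  var rows = Utilities.parseCsv(csv);                          // 2D array: header row + data rows
  var sheet = SpreadsheetApp.getActiveSheet();
  sheet.clearContents();
  sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
}
You can then build your summary formulas and charts on top of the imported range.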

I prefer to use Skyvia for connecting Google Sheets and Heroku Postgres without coding. Here is how I do it: https://skyvia.com/data-integration/integrate-google-sheets-heroku-postgres. All I need to do is specify the connections to Google Sheets and Heroku Postgres and select the data to replicate. Skyvia copies the specified Google Sheets data to Heroku Postgres and keeps the copy up to date automatically with incremental updates.

QueryClips is exactly what you need. This is its primary use case.

Related

What's the easiest way to copy PostgreSQL tables from a schema to Snowflake?

I want to copy tables (including data) from a PostgreSQL schema to Snowflake. What's the easiest way to do this?
My PostgreSQL database lives in AWS RDS.
I went through this a couple of months ago and will share what I learned.
Snowflake does not speak the PostgreSQL protocol, so a direct psql export/import will not work.
The recommendation I received from support was to export the tables as CSV files, recreate the DDL in Snowflake, and then stage the CSV files in S3 and load them from there, roughly as sketched below.
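Something along these lines (the table, columns, bucket, and credential values are all placeholders, not anything from the original setup):
-- 1) Export each table from PostgreSQL, e.g. from psql:
--    \copy my_schema.my_table TO 'my_table.csv' WITH (FORMAT csv, HEADER)
-- 2) In Snowflake, recreate the DDL, point a stage at the S3 bucket holding the CSVs,
--    and bulk load with COPY INTO:
CREATE TABLE my_table (id INTEGER, created_at TIMESTAMP, payload VARCHAR);
CREATE STAGE pg_export_stage
  URL = 's3://my-bucket/pg-export/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');
COPY INTO my_table
  FROM @pg_export_stage/my_table.csv
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);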
There are third-party tools that help connect and sync Postgres databases to Snowflake; Stitch is the one my group looked at.
So you can write your own integration or use Stitch to connect the two. There may be a more effective way, but if you are moving a large database over, I'm afraid there is no easy way to do it. That said, it is not terribly difficult; it just takes time to move everything over. Hope that helps!
Please check out the following community article which explains the steps to migrate.
https://community.snowflake.com/s/article/PostgreSQL-to-Snowflake-ETL-Steps-to-Migrate-Data
This is from one of the Snowflake partners who provides the tools to migrate the data.

Which kind of Google Cloud Platform mobile backend client is appropriate?

THE PROBLEM
I'm writing a mobile app which will allow a user to log in, save some preferences that must be stored in a database, and display congressional bills to the user.
I've only written simple RESTful services with PHP and MySQL in the past. I'd like to take advantage of newer technologies, and am a little lost on general direction.
The bill data (formatted as JSON) can be gathered by running the scrapers found here. Using Docker, I managed to set a working directory and download the files on my local machine.
I've designed a MySQL database for holding the relevant bill and user data.
I started to mess around in Google Cloud Platform and read the doc that describes the different models. I'm thinking of a few different ideas, but I'm not familiar with GCP or with what I can actually accomplish.
QUESTIONS
1) What are App Engine, Compute Engine, and Container Engine each for? I get the gist that Container Engine holds different instances of stuff you load up with Docker, and that Compute Engine sets up a VM, but I don't really understand the relationships. How should I think of them?
2) When I run those scrapers from the shell, where are the files being stored, and how can I check on them? On my computer, I set a working directory, but how do directories work in GCP? Is it just a directory in the currently selected VM, or is this what Buckets are for?
IDEAS
1) Since my bill data already comes as JSON, should I skip the entire process of building a database for the bills and insert them into Firebase somehow? Is this even possible? If so, am I stuck using Firebase's NoSQL, or can I still set up a relational database?
2) I could schedule the scrapers to run periodically, detect new files, and run a script to parse the JSON and insert new bill data into a database (PostgreSQL? MySQL?). Then I would write an API.
3) Download the JSON files to a bucket, and write an API that reads from them. Not sure how the performance would compare to using a DB.
I'm open to other suggestions as well.
For your use case (a stateless web application), App Engine is probably your best choice. The Google documentation has several comparisons of your computing options.
You can use App Engine with PHP and cloud-hosted MySQL if you want, which could be a good way to get your toes wet without going in over your head.

How do I populate a Google Bigtable instance with data using an external URL?

I have a Google Bigtable instance that needs to be populated with data from a Postgres database. My product team gave me a URL that allows me to replicate the database. In simple words, I need to duplicate the Postgres database into the Google instance, and the way my product team gave me to do it is this URL. How can I do this? Is there any tutorial that can help me?
If you are already running PostgreSQL and would like to have a mirror of it on Google Cloud Platform, the best and simplest approach may be to run your own PostgreSQL instance on a Google Compute Engine virtual machine which can be done via several approaches, e.g.,
tutorial for launching PostgreSQL, or
click-to-deploy solution for PostgreSQL by Bitnami
Then, you would want to continuously mirror data from your local instance to the PostgreSQL instance running in Google Cloud to be able to query it. Another SO answer suggests that there are two major approaches to this:
Master/Master replication (Bucardo)
Master/Slave replication (Slony)
Based on your use case, where you want to keep your local PostgreSQL instance as the canonical one and just replicate to Google Cloud for the purpose of querying it, you want master/slave replication with the Google Cloud PostgreSQL instance as the read-only replica, so you probably want to use the Slony approach.
For a more in-depth look at PostgreSQL solutions for high availability, load balancing, and replication, see the comparison in the manual.
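Alternatively, if both ends run PostgreSQL 10 or later, built-in logical replication can cover the same one-way read-replica pattern without third-party tools; a minimal sketch, where all names and connection details are placeholders:
-- On the local (canonical) instance, with wal_level = logical set in postgresql.conf:
CREATE PUBLICATION gcp_mirror FOR ALL TABLES;
-- On the PostgreSQL instance running in Google Cloud (table definitions must already
-- exist there; only data changes are streamed):
CREATE SUBSCRIPTION gcp_mirror_sub
  CONNECTION 'host=local-db.example.com dbname=mydb user=replicator password=secret'
  PUBLICATION gcp_mirror;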

Load a PostgreSQL database using cloudconnect

On the side of my Gooddata project, I maintain a small PostgreSQL database that contains a few tables.
I would like to be able to integrate both of my ETL processes using the same tool, and it seems to me that CloudConnect would be the easiest way, since I already have my whole GoodData ETL in it.
Here are the ways I tried to do it without success:
I had a look in the documentation, and it seems to me that the CloverETL functionalities that would enable this (DBOutput, PostGreSQLDataWriter) are not available in CloudConnect.
I managed to connect to the Agile Datawarehouse Service (the database attached to GoodData), but it seems that only the ADS database understands this request:
COPY MyDataBaseTable (field1,field2) FROM LOCAL '${DATA_TMP_DIR}/CIforADS.csv'
even when I adapt the syntax to PostgreSQL, because the dynamic addressing I use here does not seem to work.
Is there any way to proceed that I'm missing? Can anyone think of a workaround?
In general this could be achieved by using the "DBExecute" component, but
I'm not sure if I understand it well - do you want to load data into your own Postgres instance using CloudConnect?
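If the goal is indeed loading the CSV into your own Postgres instance (for example via DBExecute), note that COPY ... FROM LOCAL is not PostgreSQL syntax; a rough PostgreSQL equivalent, with a placeholder file path, would be:
-- Server-side COPY: the file path must be visible to the PostgreSQL server itself.
COPY MyDataBaseTable (field1, field2) FROM '/tmp/CIforADS.csv' WITH (FORMAT csv);
-- If the file only exists on the client machine, psql's \copy streams it from the client instead:
-- \copy MyDataBaseTable (field1, field2) FROM 'CIforADS.csv' WITH (FORMAT csv)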

What are some options for charting time series stored in a postgres database

I have a Postgres database with a large number of time series metrics.
Various operators are interested in different information, and I want to provide an interface where they can chart the data, make comparisons, and optionally export data as a CSV.
The two solutions I have come across so far are Graphite and Grafana, but both tie you to a particular storage engine and neither supports Postgres.
What I am looking for is an interface similar to grafana, but which allows me to hook up any backend I want. Are there any tools out there, similar to Grafana, that allow you to hook up any backend you want (or even just Postgres)?
Note that the data I am collecting is highly sensitive and is required by other areas of the application, so it is not suitable for storing in Graphite.
The other alternative I see would be to set up a trigger on the Postgres DB to feed data into Graphite as it arrives, but again, that is not ideal.
You may want to replace Graphite's storage backend with PostgreSQL. Here is a good primer.
2018 update: Grafana now supports PostgreSQL (link).
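With the native PostgreSQL data source, a panel query is plain SQL plus Grafana's time macros; a rough sketch, where the metrics table and its columns are placeholders:
-- Grafana expands $__timeGroup and $__timeFilter to match the dashboard's time range.
SELECT
  $__timeGroup(recorded_at, '5m') AS time,
  metric_name AS metric,
  avg(value) AS value
FROM metrics
WHERE $__timeFilter(recorded_at)
GROUP BY 1, 2
ORDER BY 1;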
What I am looking for is an interface similar to grafana, but which allows me to hook up any backend I want
That's possible with Grafana. Check this guide, which shows how to create a data source plugin for a data source that isn't currently supported.