Trouble moving from Drive to Cloud SQL in Google AppMaker

I have an AppMaker app that uses Drive tables. However I need to move to Cloud SQL, so I followed the steps here: Connect AppMaker to Google SQL
I created the Cloud SQL (PostgreSQL) instance and database fine, then created the same tables in Cloud SQL as I was using in Drive (well, almost: I could not create a table named User).
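As an aside, user is a reserved word in PostgreSQL, which is likely why that table name was rejected; quoting the identifier works, at the cost of having to quote it in every later query:
-- "user" is reserved in PostgreSQL; a quoted identifier sidesteps the clash.
CREATE TABLE "User" (
    id   serial PRIMARY KEY,
    name text NOT NULL
);
-- Every later reference must repeat the quotes:
SELECT name FROM "User";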
When I went to create the Model in AppMaker, it spent a lot of time spinning, then said "Failed to load models for Google Cloud SQL." And then, strangely, it said Refresh Required.
I thought this was a fluke, but when I tried again I got exactly the same error messages. Any idea what is going on? Is there any chance it is failing because I still have the Drive tables with the same names as the Cloud SQL tables?
Thanks for any tips or pointers.

App Maker does not currently support PostgreSQL.

Related

How do I use SlashDB with a Redshift database?

Is it possible to use SlashDB with Redshift? If so, how can I create an instance on AWS and choose the Redshift database?
The pricing page mentions Redshift, but nowhere else does, and the setup menu in the videos does not appear to show Redshift as an option.
Currently the pre-built AMIs and virtual machine images for SlashDB do not ship with Redshift support. The SlashDB team can do a custom build for you, but you need to get in touch with us: https://www.slashdb.com/contact/

Which kind of Google Cloud Platform mobile backend client is appropriate?

THE PROBLEM
I'm writing a mobile app which will allow a user to log in, save some preferences that must be stored in a database, and display congressional bills to the user.
I've only written simple RESTful services with PHP and MySQL in the past. I'd like to take advantage of newer technologies, and am a little lost on general direction.
The bill data (formatted as JSON) can be gathered by running the scrapers found here. Using Docker, I managed to set a working directory and download the files on my local machine.
I've designed a MySQL database for holding the relevant bill and user data.
I started to mess around in Google Cloud Platform, and read the doc that describes the different computing models. I'm thinking of a few different ideas, but I'm not familiar with GCP or what I can actually accomplish.
QUESTIONS
1) What are App Engine, Compute Engine, and Container Engine each for? I get the gist that Container Engine holds different instances of stuff you load up with Docker, and that Compute Engine sets up a VM, but I don't really understand the relationship between them. How should I think of them?
2) When I run those scrapers from the shell, where are the files being stored, and how can I check on them? On my computer, I set a working directory, but how do directories work in GCP? Is it just a directory in the currently selected VM, or is this what Buckets are for?
IDEAS
1) Since my bill data already comes as JSON, should I skip the entire process of building a database for the bills and insert them into Firebase somehow? Is this even possible? If so, am I stuck using Firebase's NoSQL, or can I still set up a relational database?
2) I could schedule the scrapers to run periodically, detect new files, and run a script to parse the JSON and insert new bill data into a database (PostgreSQL? MySQL?); see the sketch after this list. Then I would write an API.
3) Download the JSON files to a bucket, and write an API that reads from them. Not sure how the performance would compare to using a DB.
I'm open to other suggestions as well.
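For idea 2, a minimal sketch of what the PostgreSQL side could look like, assuming the scrapers emit one JSON document per bill (table and field names here are hypothetical):
-- Hypothetical table for scraped bills; jsonb keeps the raw JSON queryable.
CREATE TABLE bills (
    bill_id    text PRIMARY KEY,
    scraped_at timestamptz NOT NULL DEFAULT now(),
    data       jsonb NOT NULL
);
-- An upsert lets repeated scraper runs refresh existing bills in place.
INSERT INTO bills (bill_id, data)
VALUES ('hr1234-115', '{"title": "An example bill", "status": "introduced"}')
ON CONFLICT (bill_id) DO UPDATE
    SET data = EXCLUDED.data, scraped_at = now();
-- An API layer can then filter on JSON fields directly:
SELECT bill_id, data->>'title' AS title
FROM bills
WHERE data->>'status' = 'introduced';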
For your use case (a stateless web application), App Engine is probably your best choice. The Google documentation has several comparisons of your computing options.
You can use App Engine with PHP and cloud-hosted MySQL if you want, which could be a good way to get your toes wet without going in over your head.

How do I populate a Google Bigtable instance with data using an external URL?

I have a Google Bigtable instance that needs to be populated with data that lives in a Postgres database. My product team gave me a URL that allows me to replicate the database. In simple terms, I need to duplicate the Postgres database into the Google instance, and the means my product team gave me is this URL. How can I do this? Is there any tutorial that can help me?
If you are already running PostgreSQL and would like to have a mirror of it on Google Cloud Platform, the best and simplest approach may be to run your own PostgreSQL instance on a Google Compute Engine virtual machine, which can be done in several ways, e.g.:
tutorial for launching PostgreSQL, or
click-to-deploy solution for PostgreSQL by Bitnami
Then, you would want to continuously mirror data from your local instance to the PostgreSQL instance running in Google Cloud to be able to query it. Another SO answer suggests that there are two major approaches to this:
Master/Master replication (Bucardo)
Master/Slave replication (Slony)
Based on your use case, where you want to keep your local PostgreSQL instance as the canonical one and just replicate to Google Cloud for the purpose of querying it, you want master/slave replication with the Google Cloud instance as the read-only replica, so you probably want to use the Slony approach.
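Alternatively, if both ends run PostgreSQL 10 or later, built-in logical replication provides a similar master/read-replica setup in plain SQL, without a third-party tool; a minimal sketch, with placeholder connection details:
-- On the local (canonical) instance: publish the tables to mirror.
CREATE PUBLICATION mirror_pub FOR ALL TABLES;
-- On the Google Cloud replica: subscribe to that publication.
-- Host, database, and credentials here are placeholders.
CREATE SUBSCRIPTION mirror_sub
    CONNECTION 'host=local.example.com dbname=mydb user=replicator password=secret'
    PUBLICATION mirror_pub;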
For a more in-depth look at PostgreSQL solutions for high availability, load balancing, and replication, see the comparison in the manual.

Load a PostgreSQL database using cloudconnect

On the side of my GoodData project, I maintain a small PostgreSQL database that contains a few tables.
I would like to integrate both of my ETL processes in the same tool, and CloudConnect seems like the easiest way, since my whole GoodData ETL is already in it.
Here are the ways I tried to do it without success:
I had a look at the documentation, and it seems that the CloverETL components that would enable this (DBOutput, PostgreSQLDataWriter) are not available in CloudConnect.
I managed to connect to the Agile Data Warehousing Service (the database attached to GoodData), but it seems that only the ADS database understands this request:
COPY MyDataBaseTable (field1,field2) FROM LOCAL '${DATA_TMP_DIR}/CIforADS.csv'
Even when I adapt the syntax to PostgreSQL, the dynamic addressing I use here does not seem to work.
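For comparison, plain PostgreSQL has no COPY ... FROM LOCAL; a server-side load of the same file would look like the sketch below (the path must be readable by the database server, and the FORMAT option is an assumption about the file):
-- PostgreSQL server-side load: the path is resolved on the database server.
COPY MyDataBaseTable (field1, field2)
FROM '/tmp/CIforADS.csv'
WITH (FORMAT csv);
-- For a file on the client machine, use COPY ... FROM STDIN or psql's \copy instead.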
Is there any way to proceed that I'm missing? Can anyone think of a workaround?
In general this could be achieved by using the "DBExecute" component, but
I'm not sure I understand correctly: do you want to load data into your own Postgres instance using CloudConnect?

How can I connect to Heroku Postgres from a Google Spreadsheet

I'd like to use a Google spreadsheet to display my database analytics
I'd like to be able to do summary queries on my Heroku Postgres database using Google Apps Script and then display and chart them in a Google spreadsheet.
Heroku offers a number of ways to connect to Heroku Postgres:
https://devcenter.heroku.com/articles/heroku-postgresql
Likewise, Google Apps Script offers access to a number of different external services:
https://developers.google.com/apps-script/defaultservices
I've never attempted this before and so am interested in what is simplest.
JDBC seems possible, but are there any other options?
As far as I can see, the only overlap between the two is JDBC, which I have no experience with and which feels like a heavyweight third protocol to use to get between the systems.
Is JDBC the best way to get the data across, or is there something simpler I'm missing?
1) Set up a dataclip from dataclips.heroku.com with your desired data described as a SQL query.
2) Append .csv to the resulting URL.
3) Use that URL in the Google spreadsheet's importData function, like so:
=importData("https://dataclips.heroku.com/[your-dataclip].csv")
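The dataclip in step 1 is just a saved SQL query; a summary query suitable for charting might look like the following sketch (table and column names are hypothetical):
-- Hypothetical summary query saved as a Heroku dataclip:
-- one row per day, ready for a spreadsheet chart.
SELECT created_at::date AS signup_date,
       count(*) AS signups
FROM users
GROUP BY signup_date
ORDER BY signup_date;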
I prefer to use Skyvia for connecting Google Sheets and Heroku Postgres without coding. Here is how I do it: https://skyvia.com/data-integration/integrate-google-sheets-heroku-postgres. All I need to do is specify the connections to Google Sheets and Heroku Postgres and select the data to replicate. Skyvia will copy the specified Google Sheets data to Heroku Postgres and keep this copy up to date automatically with incremental updates.
QueryClips is exactly what you need. This is its primary use case.