Connecting a Google Cloud SQL Postgres Database to Google Data Studio - postgresql

I am going through the exact same process as the answered question found here (Connect a Google CloudSQL Postgres database to Data Studio), but I am not as advanced a user as most. I am encountering a similar problem to what was mentioned in the above question, and then some:
I have created a client certificate and downloaded the client-cert.pem, client-key.pem and server-ca.pem files to my local machine.
I received the same error when attempting to link Data Studio to our Google Cloud SQL Postgres database as a data source: "Can't reach the host. Please double check your connection parameters. Learn more about database connectors here."
I tried running the command openssl pkcs8 -topk8 -inform PEM -outform DER -in client-key.pem -out client-key.pkcs8 -nocrypt in our Cloud Shell project to change the format of the client key, but I received this error: "pkcs8: Cannot open input file client-key.pem, No such file or directory".
I would assume that I should not be running this command from the Cloud Shell Project terminal then. Would anybody know where I should be running this command instead? I can provide any extra material if needed. Thank you in advance - much appreciated.

It sounds like the file doesn't exist in the location where you are running the command. Did you upload client-key.pem into Cloud Shell?
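If it helps, here is a minimal sketch of what that looks like once the key has been uploaded through the Cloud Shell upload option (file names assume the defaults from the Cloud SQL client certificate download):
ls -l client-key.pem
openssl pkcs8 -topk8 -inform PEM -outform DER -in client-key.pem -out client-key.pkcs8 -nocrypt
The ls simply confirms the key is in the current working directory before you attempt the conversion.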

Related

Keyset as registered is invalid exception when Importing a RSA Key Container using aspnet_regiis

I have been trying to import an RSA key container using aspnet_regiis. The steps are as follows:
Run the command prompt as administrator
cd C:\windows\Microsoft.NET\Framework\v2.0.50727
aspnet_regiis -pi myrsakey E:\keyfile.xml
When I followed the above steps I got the following error:
Importing RSA Keys from file..
Keyset as registered is invalid. (Exception from HRESULT: 0x8009001A)
Failed!
For this "Keyset as registered is invalid" error almost every web result says to try renaming RSA file in the path C:\Users\myuser\AppData\Roaming\Microsoft\Crypto to RSA.old and reboot. If that does not work try renaming Crypto folder as Crypto.old. Eventhough I tried these steps it did not resolve the above issue. I am even running the cmd as administrator. So I was not sure what I am missing in here. Would you be help me to find a solution or a workaround for this issue.
Thanks in advance
Okay, I found the answer.
Since I was installing this RSA container as a machine-level key, I should have renamed the RSA folder to RSA.old in the path C:\ProgramData\Microsoft\Crypto.
After the rename I rebooted the system and then ran the above-mentioned steps in cmd again. This time it succeeded.
Previously I was renaming the RSA folder in the wrong place, namely my personal area (C:\Users\myuser\AppData\Roaming\Microsoft\Crypto).
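For reference, the full sequence looks roughly like this, using the container name and key file from the question (run both parts from an elevated command prompt, with a reboot in between):
rem rename the machine-level key store, then reboot
ren C:\ProgramData\Microsoft\Crypto\RSA RSA.old
rem after the reboot, re-run the import
cd C:\windows\Microsoft.NET\Framework\v2.0.50727
aspnet_regiis -pi myrsakey E:\keyfile.xml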

How to encrypt files in Heroku?

I would like to find a way to store encrypted files in my GitHub repository that Heroku can decrypt on the fly (they're not env vars but plain old .csv files).
I used git-crypt successfully on my machine but it seems that I cannot add a gpg key to heroku.
When I connect to the Heroku CLI and try to create a gpg key using heroku run gpg --gen-key, I get the following error:
gpg: signing failed: Inappropriate ioctl for device
Anyhow, I'm not even sure git-crypt is the right way to go, so feel free to suggest any alternative solution.

Heroku Postgresql with Google Datastudio

I'm having trouble connecting an existing Heroku database to Google Data Studio. When I try to add the connection I get the following:
Access denied, please check your username and password.
Now, I'm 100% sure that those credentials are correct and that the problem comes from somewhere else.
I've tried different setups, with both a free and a paid PSQL instance; nothing works.
I've also set up a dummy account on ElephantSQL and the connection worked the first time without any issue.
Do you have any idea of the cause of that problem?
Edit:
Just found https://www.en.advertisercommunity.com/t5/Data-Studio/Heroku-Postgres-lt-gt-Google-Data-Studio/m-p/1031729 which is not helpful at the time of writing this post.
Since the February 6, 2018 update, Google DataStudio allows SSL connections with PostgreSQL, which is necessary to connect to a database created via Heroku.
To enable SSL you need to provide client key+cert and server cert, which can be accomplished by taking the following steps:
Generate a self-signed client certificate and key with openssl:
openssl req \
-newkey rsa:2048 -nodes -keyout client.key \
-x509 -days 365 -out client.crt
Use the postgres_get_server_cert.py script to get the self-signed server cert from the Heroku Postgres instance:
https://raw.githubusercontent.com/thusoy/postgres-mitm/master/postgres_get_server_cert.py
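A hedged sketch of that step, assuming the script accepts the database host on the command line and prints the PEM certificate to stdout (check the script itself for its exact arguments; the host below is a placeholder):
wget https://raw.githubusercontent.com/thusoy/postgres-mitm/master/postgres_get_server_cert.py
python postgres_get_server_cert.py your-heroku-host.compute-1.amazonaws.com > server.crt
Data Studio's SSL options then take the client certificate, the client key and this server certificate.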
The problem is that Heroku Postgres requires an SSL connection which doesn't seem possible with Data Studio at the moment. Hopefully Google will add that option soon.
Make sure to run the openssl command on one line so that client.key and client.crt are generated in one go. It took me a couple of tries of re-downloading the certificates (I kept getting the "unable to reach host" error), but this finally got me connected to Heroku Postgres with GDS.
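For reference, the flattened single-line form of the command above would be:
openssl req -newkey rsa:2048 -nodes -keyout client.key -x509 -days 365 -out client.crt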
I thought I would mention that I have used this for quite a while, but every time my database undergoes maintenance it breaks and I have to manually reconnect the certificates. I developed a better approach: connect the data to Google BigQuery and do your blends there, then use the BigQuery Community Connector. The charts are more performant this way, and you can now use query parameters on blended data.
Of course, DataStudio won't connect directly to Heroku Postgres for the same reason, so I use a service called Fivetran to grab the raw data and send it to Google BigQuery. There is a cost to this, of course, but for some projects it may be worth it. At some point I will move my database off of Heroku to either AWS or Google itself to allow a direct connection, but that is a larger project.

How do I connect to an AWS PostgreSQL RDS instance using SSL and the sslrootcert parameter from a Windows environment?

We have a Windows EC2 instance on which we are running a custom command line application (C# console app using NpgSQL) to connect to a PostgreSQL RDS instance. Based on the instructions here:
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/CHAP_PostgreSQL.html#PostgreSQL.Concepts.General.SSL
we created a new DB parameter group with rds.force_ssl set to 1 and rebooted our RDS instance. We also downloaded and imported to Windows the pem file referenced on the page.
I was able to connect to the RDS instance from my Windows EC2 instance via pgAdmin by specifying SSL mode as Verify-Full. Our command-line application reads connection strings from a file and they look like this now that I've added the sslmode parameter:
Server=OurInstanceAddress;Port=5432;SearchPath='$user,public,topology';Database=OurDatabase;User Id=username;Password=mypassword;sslmode=verify-full;
Using this connection string failed with the error referenced at the bottom of the page:
FATAL: no pg_hba.conf entry for host "host.ip", user "someuser", database "postgres", SSL off
I tried adding the sslrootcert parameter, but I'm not sure if I'm handling it properly. I tried the example value from the documentation (sslrootcert=rds-ssl-ca-cert.pem) and the name of the pem file that I downloaded. I feel like there is something about the path I'm giving to the sslrootcert parameter that isn't right, especially in a Windows environment. I've tried the following paths:
- sslrootcert=C:\keys\rds-combined-ca-bundle.pem (single backslashes)
- sslrootcert=C:\\keys\\rds-combined-ca-bundle.pem (double backslashes)
- sslrootcert=C:/keys/rds-combined-ca-bundle.pem (forward slashes)
All of these produced the same error mentioned above.
Any insight would be appreciated.
I solved it by using environment variables to specify the cert paths instead of putting them in the connection URL:
-DPGSSLROOTCERT=/certs/root.crt
-DPGSSLKEY=/certs/amazon-postgresql.key
-PGSSLCERT=/certs/amazon-postgresql.crt
I'm on cygwin, though. There are some hints for Windows in the documentation here: https://www.postgresql.org/docs/9.0/static/libpq-ssl.html
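Outside cygwin, the equivalent on plain Windows would be to set the same variables in the environment, for example (the paths are illustrative, and whether they are picked up depends on your client library honoring the standard libpq SSL variables):
setx PGSSLROOTCERT C:\certs\root.crt
setx PGSSLKEY C:\certs\amazon-postgresql.key
setx PGSSLCERT C:\certs\amazon-postgresql.crt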

postgres jdbc client cert user vs db user

Backstory:
I've gotten jdbc to connect to postgres using a client cert. In java I set the user in the properties, and the driver looks it up in the keystore and sends it along. All was good.
But I just found out that I won't be getting certs with a CN of pg-user. The certs I'll be getting will have a CN of pg-user.XYZ.foo.com & pg-user.ABC.foo.com. This looks like a job for username maps. Hey they even have regexp, it'll be perfect.
I got unix user root logging in to postgres as pg-user using a username map and local ident authentication using psql -d db -U pg-user. But in that case postgres knows BOTH that the user is root, AND is trying to log in as pg-user.
Problem:
What I can't figure out is how to tell the postgres jdbc driver to grab the cert from the keystore with a CN of pg-user.XYZ.foo.com, but present to postgres as user pg-user. It appears to be the single argument of user that controls both. Does anyone know how to do this?
This page includes a list of the connection options, but it doesn't seem to offer a way to split the user names. The closest I'm seeing is the option to write my own sslfactory, and I'm really hoping to avoid that...
Thanks to @harmic's comment I was able to solve this.
Starting with the following three files:
pg-user.pem which has a CN of pg-user.XYZ.foo.com
pg-user-chain.pem which contains the chain certs
pg-user.private_key which contains the private key for the cert
Then I created the pkcs12 file like this:
cat pg-user.pem pg-user-chain.pem > cert-and-chain.pem
openssl pkcs12 -export -out ssl_cert.p12 -in cert-and-chain.pem -name pg-user -inkey pg-user.private_key -passout pass:{password here}
After that I declared the user property for the connection to be pg-user, and it worked. To test, I altered the regexp in pg_ident so that it would not match; I could no longer log in. I changed it back and I could again.
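For context, the server-side pieces this relies on would look something like the following; the map name, database name and address range are illustrative, and the only real point is the regexp mapping the certificate CN to pg-user.
In pg_ident.conf:
certmap   /^pg-user\..*\.foo\.com$   pg-user
In pg_hba.conf:
hostssl   db   pg-user   0.0.0.0/0   cert   map=certmap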