How to enforce SSL in keycloak with Azure PostgreSQL - postgresql

I am trying to configure Keycloak to run with PostgreSQL (using Azure Database for PostgreSQL) in a Docker container. I was able to do this as instructed in the Keycloak documentation here.
The problem that I am facing is that Azure Database for PostgreSQL has the option "Enforce SSL connection" set to "Enable" by default, and the Keycloak server does not work with that. It throws the following error at server startup:
ERROR [org.jboss.msc.service.fail] (ServerService Thread Pool -- 49) MSC000001: Failed to start service jboss.undertow.deployment.default-server.default-host./auth:
Caused by: org.postgresql.util.PSQLException: FATAL: SSL connection is required. Please specify SSL options and retry.
If the option "Enforce SSL connection" is disabled it worked fine.
I would like to know how to specify this option to work with keycloak.
I am using a custom Dockerfile to download and boot the Keycloak server, and I pass the data-source parameters as environment variables with the docker run command. I have tried this approach, which worked fine when I pointed it at my PostgreSQL data source without any modifications, but when I change it to be compatible with my own Dockerfile it gives the same error.
Thanks in advance.

So here's what I did.
I looked over the latest Dockerfile shared by jboss (which is available here) and adopted the lines I needed for the version I require. Earlier I was trying to add the PostgreSQL configuration on my own, as the Keycloak documentation was suggesting. Since it is now supported out of the box, I changed my Dockerfile to be compatible with the jboss Dockerfile for Keycloak.
I also introduced a new environment variable to enforce the SSL connection by setting ssl=true, as guided here.
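For reference, this is roughly what starting the container can look like. This is only a sketch: it assumes an image built like the jboss/keycloak Dockerfile, where the entrypoint picks up DB_VENDOR, DB_ADDR, DB_DATABASE, DB_USER and DB_PASSWORD, and appends JDBC_PARAMS to the JDBC URL; the host, database, user and image names are placeholders.

# hypothetical names; JDBC_PARAMS is what carries ssl=true onto the JDBC URL
docker run -d --name keycloak \
  -e DB_VENDOR=postgres \
  -e DB_ADDR=myserver.postgres.database.azure.com \
  -e DB_PORT=5432 \
  -e DB_DATABASE=keycloak \
  -e DB_USER=keycloak@myserver \
  -e DB_PASSWORD=<password> \
  -e JDBC_PARAMS="ssl=true" \
  my-keycloak-image

With ssl=true on the connection string the PostgreSQL JDBC driver negotiates TLS, which is what Azure's "Enforce SSL connection" option requires.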

FATAL: SSL connection is required. Please specify SSL options and retry.
I've seen this happen when the client IP address isn't included in the firewall rules on the PostgreSQL server. Try confirming that the firewall is open for your IP on the Connection Security page in the portal, or with the Azure CLI using: az postgres server firewall-rule list --resource-group --server-name
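If the firewall does turn out to be the problem, a rule for your client IP can also be added from the CLI. A sketch with placeholder resource group, server name and IP address:

az postgres server firewall-rule create \
  --resource-group myResourceGroup \
  --server-name mydemoserver \
  --name AllowMyClientIP \
  --start-ip-address 203.0.113.10 \
  --end-ip-address 203.0.113.10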

Related

gcloud beta sql connect "server closed the connection unexpectedly"

When trying to get a psql shell (not using an IAM user) I am receiving:
> gcloud alpha sql connect pg-instance --database mydb --user myuser --project my-project
Starting Cloud SQL Proxy: [/Users/me/google-cloud-sdk/bin/cloud_sql_proxy -instances my-project:us-central1:pg-instance=tcp:9470 -credential_file /Users/me/.config/gcloud/legacy_credentials/me@me.com/adc.json]
2022/03/15 14:47:59 Rlimits for file descriptors set to {Current = 8500, Max = 9223372036854775807}
2022/03/15 14:47:59 using credential file for authentication; path="/Users/me/.config/gcloud/legacy_credentials/me@me.com/adc.json"
2022/03/15 14:48:00 Listening on 127.0.0.1:9470 for my-project:us-central1:pg-instance
2022/03/15 14:48:00 Ready for new connections
Connecting to database with SQL user [myuser].Password:
psql: error: connection to server at "127.0.0.1", port 9470 failed: server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
I had the same error message when connecting to Postgres (Cloud SQL) using a service account.
In my setup I ran cloud_sql_proxy inside a Docker container.
In order to make it work I had to add the extra configuration defined in step #9 of https://cloud.google.com/sql/docs/sqlserver/connect-docker#connect-client:
docker run -d \
-v <PATH_TO_KEY_FILE>:/config \
-p 127.0.0.1:5432:5432 \
gcr.io/cloudsql-docker/gce-proxy:1.33.1 /cloud_sql_proxy \
-instances=<INSTANCE_CONNECTION_NAME>=tcp:0.0.0.0:5432 -credential_file=/config
The missing bits were the host IP in the port mapping and the 0.0.0.0: prefix in the cloud_sql_proxy command.
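With the container started that way, a client on the host should be able to reach the instance through the forwarded port; for example (placeholder database and user names):

# the proxy already encrypts traffic to the Cloud SQL instance, so the local hop does not need SSL
psql "host=127.0.0.1 port=5432 dbname=mydb user=myuser sslmode=disable"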
There are a few things I would like to point out. The best starting point for me would be the About connection options page; both the Overview and the Before you begin sections are very helpful for getting the full idea of the process and how to properly configure the user. But the most important part is the Connection Options section. For the message connection to server at "127.0.0.1", I'm guessing it is a private IP, but please make sure this section is covered before you start to debug.
In your case, the logs are saying there was an error in the connection to the server…
I used the Troubleshoot guide that includes the Diagnose issues link to get to the Debug connection issues page that has a lot of useful information on how to debug any connectivity issue.
Generally, connection issues fall into one of the following three areas:
Connecting - are you able to reach your instance over the network?
Authorizing - are you authorized to connect to the instance?
Authenticating - does the database accept your database credentials?
Each of those can be further broken down into different paths for investigation.
Once you determine the connection method, there are different questions that will help guide you through the possible troubleshooting paths.
If using these guides doesn't get you to a solution, please update your question with the results, steps, and information you followed so we can provide further help. This would be a good example, as it has the same log error, and this other question shows that there are a few different troubleshooting paths for this specific log message; both have useful information for you.

Why am I getting "unsupported network unix" with Cloud SQL Proxy, when I'm specifying TCP?

I'm having issues when trying to connect to my Cloud SQL instance. I created a SQL Server instance, downloaded the cloud sql proxy, and everything seems to start to connect, but I keep getting the following error:
errors parsing config:
invalid "instance-connection-name": unsupported network: unix
I'm specifying the tcp port to use, but it still complains about UNIX. Here is the command I'm using when trying to connect (I replaced the actual instance connection name for privacy/security):
./cloud_sql_proxy.exe -instances=[instance-connection-name]=tcp:3306
Any help would be appreciated.
Thanks!
I tried this and it works:
Rename cloud_sql_proxy_xxx to cloud_sql_proxy.
Open cmd in your cloud_sql_proxy's location.
Run the following command, without the [ ]: cloud_sql_proxy -instances=[project:region:instance-name]=tcp:1433
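For example, with a hypothetical instance connection name myproject:us-central1:mysqlserver, the actual command becomes:

cloud_sql_proxy -instances=myproject:us-central1:mysqlserver=tcp:1433

The square brackets in the step above only mark the placeholder and are not part of the command.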
From Connecting to a Cloud SQL for SQL Server using a Cloud SQL Proxy:
Depending on your language and environment, you can start the proxy using either TCP sockets or Unix sockets.
TCP sockets:
Copy your instance connection name from the Instance details page
For example: myproject:us-central1:myinstance.
If you are using a service account to authenticate the proxy, note the location on your client machine of the private key file that was created when you created the service account.
Start the proxy.
Some possible proxy invocation strings:
a) Using Cloud SDK authentication:
./cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:1433
The specified port must not already be in use, for example, by a local database server.
b) Using a service account and explicit instance specification (recommended for production environments):
./cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:1433 \
-credential_file=<PATH_TO_KEY_FILE> &
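Once the proxy reports that it is listening, a client can connect to 127.0.0.1 on port 1433 as if the instance were local. For example, with sqlcmd (the user and password here are placeholders; any SQL Server client works the same way):

sqlcmd -S 127.0.0.1,1433 -U sqlserver -P <password>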

How to connect to the database in ddev?

I successfully installed ddev for TYPO3 and now want to connect to the MariaDB database. But what are the credentials? If I ssh into the container and want to connect, I get a password prompt.
Access via external tools is described in Using Developer Tools with ddev.
Specifically you need to execute the following command to get the necessary credentials:
ddev describe
When I upgraded ddev and deleted all the containers, everything stayed the same except the new port number, which incremented by one.
mariadb
Host: localhost:portNumberIncrementedByOne
User/Pass: 'db/db'
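In other words, with the host port shown by ddev describe you can connect from the host machine with any MySQL/MariaDB client. A sketch, assuming the published port happens to be 32768 (yours will differ; the user, password and database name are all db):

mysql -h 127.0.0.1 -P 32768 -u db -pdb db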

KRB5KDC_ERR_S_PRINCIPAL_UNKNOWN. while connection to mongodb with GSSAPI

I have set up Active Directory with Kerberos authentication on Windows Server 2012 R2 and set up the MongoDB server on a second machine. I started MongoDB with GSSAPI authentication. Now if I try to connect to MongoDB using the following command
mongo.exe --host Mongo32Test.ihubtest.com.com --authenticationMechanism=GSSAPI --authenticationDatabase=$external -u mongoService@ihubtest.com --verbose
I am getting the following message.
Error: SASL(-1): generic failure: SSPI: InitializeSecurityContext: The specified target is unknown or unreachable
I have installed wireshark and the packet contains this message
"KRB5 167 KRB Error: KRB5KDC_ERR_S_PRINCIPAL_UNKNOWN"
Searching around, I figured that it is related to the service principal name.
mongoService@ihubtest.com is a domain user and is part of the $external database in MongoDB.
I verified the service principal name, and it looks fine.
C:>setspn -l mongoService
Registered ServicePrincipalNames for CN=mongo Service,CN=Users,DC=ihubtest,DC=com:
mongodb/Mongo32test.ihubtest.com@IHUBTEST.COM
I tried the troubleshooting steps mentioned on this page, https://docs.mongodb.com/manual/tutorial/troubleshoot-kerberos/. Am I missing something in the Active Directory configuration?
If you have not yet looked into it, the MongoDB team has a closed ticket with some troubleshooting steps:
https://jira.mongodb.org/browse/SERVER-13885
I believe you misquoted your hostname as "Mongo32Test.ihubtest.com.com" instead of "Mongo32Test.ihubtest.com".
Please verify whether the provided hostname is correct.
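If that is the case, the corrected command from the question (everything else unchanged) would be:

mongo.exe --host Mongo32Test.ihubtest.com --authenticationMechanism=GSSAPI --authenticationDatabase=$external -u mongoService@ihubtest.com --verbose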

Install Chef Server 11 with AWS RDS

Now that AWS has a PostgreSQL service in RDS, I tried to install Chef Server 11 with a PostgreSQL RDS instance by editing attributes in /opt/chef-server/embedded/cookbooks/chef-server/attributes/default.rb
default['chef_server']['postgresql']['vip'] = "rds instance endpoint"
and importing the database with the following command:
/opt/chef-server/embedded/bin/psql -h "rds instance endpoint" -p 5432 -U "user_name" "database_name" < /opt/chef-server/embedded/service/erchef/lib/chef_db-f086a97/priv/pgsql_schema.sql
But I am not able to achieve that. chef-server-ctl reconfigure gives an error:
curl -sf http://127.0.0.1:8000/_status returned 7
Please help me to configure chef server with RDS instance.
I think I was able to solve my problem. It was because of the encrypted password in the erchef config file. I edited
"/opt/chef-server/embedded/cookbooks/chef-server/templates/default/echef.config.rb"
accordingly, and it seems to be working perfectly fine now.
Thanks
The chef-server-rds cookbook on GitHub can be used to install Chef Server 11 with AWS RDS.
Given an IAM key and secret, it will provision the RDS instance if it doesn't exist in the account, initialize the Chef schema, install the appropriate platform-specific chef-server Omnibus package, and perform the initial configuration of Chef Server on an AWS EC2 Ubuntu instance.
Using Postgres on Amazon RDS offloads DB resource use away from the chef-server host. It also enables various DB functions like scaling, backup, and restore to be done independently of the chef-server installation. Similar configurations can be written for other DB service providers.