Slow user upload procedure in Keycloak (REST)

Asking for your help.
We run a standalone Keycloak Docker image in OpenShift.
We add new users via REST, but everything works very slowly: uploading 240,000 users takes 24 hours.
Has anyone come across this?
How do you add and update users?
Our server: 4 CPUs, 4 GB RAM.
Our Keycloak version: 7.3.1.GA.
The connection keeps breaking, so we are forced to send the users in batches.

On the Keycloak database, the following actions were performed:
CREATE EXTENSION pg_trgm;
CREATE INDEX in_user_entity_001
     ON user_entity USING GIST (lower(username) gist_trgm_ops);
Explanations:
The pg_trgm extension ships with PostgreSQL and is maintained by the PostgreSQL developers (https://www.postgresql.org/docs/11/pgtrgm.html).
The index is tailored to a specific slow query.
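For reference, the kind of username search this index speeds up looks roughly like the following (a sketch only; the column list is illustrative and Keycloak's actual generated SQL may differ):
SELECT id, username, email
FROM user_entity
WHERE lower(username) LIKE '%smith%';   -- trigram index makes this wildcard match use an index scan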

Related

How to set up multi-tenancy using row level security on Postgres with knex

I am architecting a database where I expect to have thousands of tenants, with some data shared between tenants. I am currently planning on using Postgres with row level security for tenant isolation. I am also using knex and Objection.js to model the database in node.js.
Most of the tutorials I have seen look like this where you create a separate knex connection per tenant. However, I've run into a problem on my development machine where after I create ~100 connections, I received this error: "remaining connection slots are reserved for non-replication superuser connections".
I'm investigating a few possible solutions/work-arounds, but I was wondering if anyone has been able to make this setup work the way I'm intending. Thanks!
Perhaps one solution might be to cache a limited number of connections, and destroy the oldest cached connection when the limit is reached. See this code as an example.
That code should probably be improved, however, to use a Map as the knexCache instead of an object, since a Map remembers the insertion order.
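For what it's worth, the error in the question comes from Postgres's own connection limit (max_connections, minus the slots held back by superuser_reserved_connections). A quick way to see where you stand, just as a sketch:
SHOW max_connections;                        -- server-wide connection limit

SELECT datname, usename, count(*)            -- open connections per database and user
FROM pg_stat_activity
GROUP BY datname, usename
ORDER BY count(*) DESC;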

Making MS Access queries that allow data entry to a PostgreSQL database via an ODBC driver

I've been asked to modify an Access database by putting the data themselves into a Postgres database while keeping the old Access file as a frontend. So far everything has worked just fine, with every linked table, query and form working just like before when viewed.
The issue, however, is that all of the forms call MS Access queries that users can insert data into, but after the tables were migrated to PostgreSQL, those queries no longer allow inserts, which means the forms no longer allow inserts. I can edit rows that are already there, but I cannot create new ones through the queries; I can, however, insert new rows directly into the linked tables. This is as a superuser.
I have made Access queries in the past that allowed data entry to a Postgres database, but I don't have access to those files now, and I can't for the life of me figure out what I did differently back then.
Highly appreciate any leads. Couldn't find anything on this. Using MS Access 2010 and PostgreSQL 9.1.
Solved
Andre pointed out that these MS Access queries must include the primary key to give the option of creating new rows. Once I added the id field to the query, the forms worked like they did before.
The answer, supplied by Andre, is that simple MS Access queries allow for inserts into PostgreSQL if the queries include the primary key of the queried table. Cheers!
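In other words, the saved query behind the form has to select the primary key column of the linked table. A minimal sketch (table and column names are made up, assuming "id" is the primary key of the linked table):
SELECT id, customer_name, order_date
FROM public_orders;    -- without the "id" column the recordset is read-only in Access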

PCI Compliance with Native PostgreSQL

We have a PostgreSQL database, no third-party software, a Linux admin, and a SQL DBA with little PostgreSQL experience.
We need to set up audit/access logging of all transactions on the CC tables. We enabled logging, but we are concerned about logging everything; we want to restrict it to specific tables. I have not found a resource I understand that explains how to accomplish this.
A few blogs mention table triggers and log files; I found another that discusses functions. I am just not sure how to proceed. The following is the PCI information I am working from:
1) (Done) Install the pg_stat_statements extension to monitor all queries (SELECT, INSERT, UPDATE, DELETE) - see the sketch after this list
2) Set up monitoring to detect suspicious access to the PAN-holding table
3) Enable connection/disconnection logging
4) Enable web server access logs
5) Monitor Postgres logs for unsuccessful login attempts
6) Automated log analysis & access monitoring using alerts
7) Keep an archive of audit and log history for at least one year, with the last 3 months readily available for analysis
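For item 1), a sketch of what installing and querying pg_stat_statements looks like (the extension must also be added to shared_preload_libraries and the server restarted; available columns vary slightly between PostgreSQL versions):
-- in postgresql.conf: shared_preload_libraries = 'pg_stat_statements', then restart
CREATE EXTENSION pg_stat_statements;

-- the most frequently executed statements recorded by the extension
SELECT query, calls, rows
FROM pg_stat_statements
ORDER BY calls DESC
LIMIT 10;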
Update
We also need to apply a password policy to PostgreSQL DB users:
90-day expirations (there is a place to set a date but not an interval - see the sketch after this list)
Lock out a user after 6 failed attempts; locked out for 30 minutes or until an administrator re-enables the user ID
Force re-authentication when idle for more than 15 minutes
Passwords/phrases must meet the following: require a minimum length of at least seven characters; contain both numeric and alphabetic characters; cannot be the same as the last 4 passwords/passphrases used
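For the 90-day expiration, what PostgreSQL offers natively is indeed a fixed date rather than an interval, so the rotation itself has to be scripted externally. A sketch (role name and date are made up):
ALTER ROLE app_user VALID UNTIL '2019-12-31';   -- the role can no longer log in after this date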
2) There is no direct way to log access to individual tables. The pgaudit extension claims it can do that; I have never used it myself (a sketch follows this list).
3) can easily be done using log_connections and log_disconnections
4) has nothing to do with Postgres
5) can be done by monitoring the logfile once connection logging is enabled
6) no idea what that should mean
7) that is independent of the Postgres setup. You just need to make sure the Postgres logfiles are archived properly.
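For items 2) and 3), a rough sketch of object-level auditing with pgaudit plus connection logging (assuming pgaudit is installed and added to shared_preload_libraries; the table name is made up):
-- pgaudit object auditing: statements are logged only for tables the audit role has privileges on
CREATE EXTENSION pgaudit;
CREATE ROLE cc_auditor NOLOGIN;
ALTER SYSTEM SET pgaudit.role = 'cc_auditor';
GRANT SELECT, INSERT, UPDATE, DELETE ON credit_card_data TO cc_auditor;

-- connection/disconnection logging for item 3)
ALTER SYSTEM SET log_connections = on;
ALTER SYSTEM SET log_disconnections = on;
SELECT pg_reload_conf();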

Talend Open Studio: run only created or modified records among 15k

I have a job in Talend Open Studio which is working fine: it connects a tMSSqlInput to a tMap and then a tMysqlOutput, very straightforward. My problem is that I need this job to run on a daily basis, but only process records that have been created or modified... any help is highly appreciated!
It seems that you are searching for a Change Data Capture tool for Talend.
Unfortunately it is only available in the licensed product.
There are several ways to implement what you need; I want to show the most popular ones.
CDC from Talend
As Corentin said correctly, you could choose to use CDC (Change Data Capture) from Talend if you use the subscription version.
CDC of MSSQL
Alternatively, you can check whether you can activate or use CDC in your MSSQL server; this depends on your license. If it is possible, you can use it to identify new elements and process them.
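If the edition supports it, enabling CDC on the SQL Server side is roughly this (a sketch; schema and table names are made up):
-- run in the source database on SQL Server
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'orders',
    @role_name     = NULL;       -- NULL means no gating role is required to read the change data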
Triggers
You can also create triggers on your database (if you have access to it). For example, a trigger covering INSERT, UPDATE, and DELETE would let you capture the deltas; you could then store those records, or just their IDs, separately.
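As an illustration only (T-SQL on the MSSQL source; table and column names are made up), a trigger that records the IDs of changed rows in a side table could look like this:
CREATE TABLE dbo.orders_changes (
    order_id    INT        NOT NULL,
    change_type CHAR(1)    NOT NULL,              -- 'I', 'U' or 'D'
    changed_at  DATETIME2  NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER dbo.trg_orders_changes
ON dbo.orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- rows in "inserted" are new or updated; rows only in "deleted" were removed
    INSERT INTO dbo.orders_changes (order_id, change_type)
    SELECT i.order_id,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
    FROM inserted AS i;

    INSERT INTO dbo.orders_changes (order_id, change_type)
    SELECT d.order_id, 'D'
    FROM deleted AS d
    WHERE NOT EXISTS (SELECT 1 FROM inserted);
END;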
Software driven / API
If your database is connected to an application and you have developers around, you could ask for a service that identifies records on insert/update/delete and exposes them to you, e.g. via a REST interface.
Delta via ID
If the primary key is an auto-incrementing ID, you could also check your MySQL table for the highest value and only SELECT those rows from the source whose ID is bigger than the one you already have. This depends, of course, on the database layout.
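A sketch of that approach, with made-up table names: first ask the MySQL target for the highest ID already loaded, then fetch only newer rows from the MSSQL source (the value would be passed between the two queries by the job, e.g. via a context variable):
-- on the MySQL target
SELECT COALESCE(MAX(id), 0) AS last_loaded_id FROM target_table;

-- on the MSSQL source
SELECT *
FROM source_table
WHERE id > 12345;   -- replace 12345 with last_loaded_id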

Upgrading a PostgreSQL database on Heroku

Heroku provides instructions for provisioning a new PostgreSQL database, which involve a command of the form
heroku addons:create heroku-postgresql:standard-0
where the text after postgresql: is the key for the database plan level. Presumably standard-0 is the lowest level of standard. It took me about ten minutes, using the web interface, to be reasonably sure that hobby-basic is the key for the highest level of basic. Can someone please provide a table that gives the key for each of the database levels so we do not have to guess?
The different levels, along with their keys, are available on the Heroku Postgres add-on page.
You can install the add-on directly from there, or use one of the keys you can extract from the anchor part of the URI (e.g. for https://elements.heroku.com/addons/heroku-postgresql#standard-6 it is standard-6).