How to integrate Postgres RLS into a backend server

I'm trying to understand backends better, and using row-level security seems like a great idea to simplify backend code and increase security, but I'm struggling to understand how to make RLS work. Let's say I have the following policy:
CREATE POLICY user_mod_account ON accounts USING (username = current_user)
How does it know who current_user is? I've followed a few tutorials on building backend servers, and if I'm not mistaken, the way auth usually works is that some middleware is in charge of authentication (often by making additional requests against the DB to check whether a session exists and is valid), while all operations on the database are performed by the backend, which logs in as a single user.
Let's say I have two tables, user and post, where post has a foreign key 'author' referencing 'username' in the user table. Is it possible to have the following query on the backend
SELECT * FROM post
and have it protected by RLS, so the data is protected by the database itself (and it returns only posts from the user the backend is making the request for)? If I connect to the DB as the 'postgres' user, is current_user 'postgres', or is there a way to make requests from the backend as a selected user? Maybe that's not the purpose of RLS; I'd be glad if you could point me to some sources that would help me understand the database <-> backend relationship better. I feel like it's not really that complicated and my brain just hasn't 'clicked' yet :P
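For context, one common pattern is for the backend to log in as a single low-privilege role and switch identity per request with SET ROLE, so that current_user reflects the end user. A minimal sketch, assuming a role exists per application user and a post table with an author column:

```sql
-- Sketch: per-user database roles, assuming table post(author, ...)
ALTER TABLE post ENABLE ROW LEVEL SECURITY;

CREATE POLICY post_owner ON post
    USING (author = current_user);

-- The backend connects once as a low-privilege role, then per request:
SET ROLE alice;       -- current_user is now 'alice'
SELECT * FROM post;   -- policy allows only rows where author = 'alice'
RESET ROLE;           -- restore the login role before reusing the connection
```

Note that a superuser such as postgres, or any role with the BYPASSRLS attribute, ignores policies entirely, so the backend should connect as an ordinary role that has been GRANTed the per-user roles.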

How to use Supabase RLS with third-party client library?

I'm using Supabase as my database, with RLS turned on. I query the DB from a Node backend using the postgres.js client (it could be any other JS client, really). As such, I'm connecting directly as the default postgres user and issuing raw SQL queries.
How can I query my DB while still masquerading as a user from Supabase's auth.users table? I have triggers and RLS policies that rely on the currently logged-in user via Supabase's auth.uid() function.
All I can find online is that I must set the current_user_id Postgres setting to the current Supabase user, but I don't grasp the implications of that from a functional and security perspective. I wouldn't want to leak the current user into another session.
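On the leaking concern: set_config takes a third argument that makes the setting transaction-local, so the value disappears at COMMIT/ROLLBACK and cannot bleed into another session or a pooled connection. A sketch, assuming your policies read the claim the way older Supabase versions define auth.uid() (reading request.jwt.claim.sub; newer versions parse a request.jwt.claims JSON blob, so check your auth.uid() definition first):

```sql
BEGIN;
-- third argument 'true' = transaction-local: the setting vanishes
-- at COMMIT/ROLLBACK, so it cannot leak across pooled connections
SELECT set_config('request.jwt.claim.sub',
                  '00000000-0000-0000-0000-000000000000',  -- the auth.users id
                  true);
SELECT * FROM posts;  -- RLS policies using auth.uid() now see that user
COMMIT;
```

For this to actually restrict anything, the queries must run as a role without BYPASSRLS; the default postgres superuser skips policies.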

How to set up row-level access in Postgres without creating a user

I have an existing API connected to an AWS PostgreSQL database that uses AWS Cognito for User authentication.
The goal is for users to insert data via the API with some field mapped to their Cognito ID, and to retrieve the same data. The idea would be for each user to only have access to the data 'owned' by them, similar to the way row-level access works.
But I do not want to create a role for each user, which seems to be necessary.
The idea would be that I need to somehow set up a connection to the PostgreSQL DB with the user_id without creating a user, and handle the accessible data via a policy, or somehow pass the data to the policy directly.
What would be an ideal way to do this, or is creating a PG user for each user a necessity for this setup?
Thanks in advance
EDIT: I am currently querying the database through my backend with custom code, but I would rather have a system where, instead of writing the code myself, PostgreSQL handles the security itself using policies (or something similar). I fully understand how PostgreSQL row-level access works with roles and policies; I would prefer a setup where PostgreSQL does the major work without me implementing custom back-end logic, and preferably without creating thousands of PostgreSQL roles for the users.
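One way to get this without per-user roles is a policy that reads a custom configuration setting (a GUC) that the backend sets per transaction. A sketch using hypothetical names (an items table and an app.user_id setting):

```sql
-- Sketch: one shared application role, per-request identity via a custom GUC
ALTER TABLE items ENABLE ROW LEVEL SECURITY;

-- current_setting(..., true) returns NULL instead of raising an error
-- when the setting is absent, so sessions that never set it match no rows
CREATE POLICY items_owner ON items
    USING (owner_id = current_setting('app.user_id', true));

-- Per request, inside a transaction, the backend runs something like:
-- SELECT set_config('app.user_id', '<cognito-sub>', true);  -- transaction-local
```

The application role still needs to be a non-superuser without BYPASSRLS, but only that single role is required regardless of how many Cognito users exist.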
You should not allow users to make a direct connection to the database.
Instead, they should make requests to your back-end, where you have business logic that determines what each user is permitted to access. Your back-end then makes the appropriate calls to the database and returns the response to the user.
This is a much 'safer' approach because it prevents users from having direct access to your database, and it is also a better architecture because it allows you to swap out the database engine for another one without impacting your service.
The database is for your application, not for your users.

Keycloak. Storage SPI with external database

We already have a DB with users.
Do we have to migrate all records to the Keycloak DB, or can we just implement the Storage SPI?
We don't want to migrate the records, because we would still have to support the old DB, and that brings problems because we would need to keep the two DBs synchronized.
Could you describe the potential problems with this approach and share your advice for resolving them?
USER DATA SOURCES
Moving to a system such as Keycloak will require an architectural design on how to manage user fields. Some user fields will need migrating to an identity database managed by Keycloak. Applications can then receive updates to these fields within tokens.
KEYCLOAK DATA
Keycloak will expect to have its own user account storage, and this is where each user's subject claim will originate from. If a new user signs up, the user will be created here before being created in your business data.
Keycloak user data will include fields such as name and email if they are used in workflows such as forgot-password emails. You can keep most other user fields in your business data if you prefer.
So to summarize, a migration will be needed, but you don't have to migrate all user fields.
BUSINESS DATA
This may include other user fields that you want to keep where they are, but also include in access tokens and use for authorization in your APIs. Examples are values like roles, permissions, tenant ID, partner ID, and subscription level.
DESIGN STEPS
My recent blog post walks through some examples and suggests a way to think through your end-to-end flows. There are a couple of different user data scenarios mentioned there.
It is worth spending a day or two sketching out how you want your system to work: in particular, how your APIs will authorize requests, and how you will manage both existing and new users. This avoids discovering expensive problems later.

With SSO (like for example Keycloak), how does one handle/synchronise users in own databases?

Consider the following scenario: you have an SSO service (let's say Keycloak) and X applications that each have their own database, where somewhere in each database you're referencing a user_id. How do you handle this? How do you satisfy the foreign key constraint problem? Should one synchronise Keycloak and the applications? How? What are some best practices? What are some experiences?
I've been using Keycloak for several years, and in my experience there are several scenarios for synchronizing user data between Keycloak and your application's database:
1. Your application is the owner of the user data.
Keycloak is only used for authentication/authorization purposes. In this scenario, your application creates/updates a Keycloak user using the Admin REST API when needed.
2. Keycloak is the owner of the user data, and you don't need more info than the user ID in your database.
In this scenario, everything regarding users can be managed by Keycloak (registration, user account settings, even resource sharing using the authorization services).
Users would be referenced by user ID in the database when needed.
NB: You can easily add custom data to a user in Keycloak using user attributes, but one interesting possibility is to extend the user model directly using JPA entities: https://www.keycloak.org/docs/latest/server_development/index.html#_extensions_jpa
3. Keycloak is the owner of the user data, and you need more than just the user ID (email, first name, etc.).
If performance is not an issue, you could retrieve user info via the Admin REST API when needed.
If performance is an issue, you'll need a copy of Keycloak's user data in your app's database, and you'll want that copy to be updated on every user change.
To do that, you could implement callbacks in Keycloak (using SPIs: https://www.keycloak.org/docs/latest/server_development/index.html#_events) that notify your application when a user is created or updated.
NB: You could also use a Change Data Capture tool (like Debezium: https://debezium.io/) to synchronize Keycloak's database with yours.
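When such an event or CDC record arrives, the local copy can be kept current with an idempotent upsert; a sketch with a hypothetical app_user table and columns:

```sql
-- Sketch: apply a Keycloak user event to the app's own copy.
-- $1..$3 come from the event payload (Keycloak user ID, email, first name).
INSERT INTO app_user (keycloak_id, email, first_name)
VALUES ($1, $2, $3)
ON CONFLICT (keycloak_id)          -- keycloak_id must be unique
DO UPDATE SET email      = EXCLUDED.email,
              first_name = EXCLUDED.first_name;
```

Because the statement is idempotent, redelivered or out-of-band events (common with event listeners and CDC pipelines) do not corrupt the copy.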
There are pros and cons to each scenario; you'll have to choose the one that best suits your needs :)

Keycloak user management

I'm developing a microservice (RESTful) project that uses Keycloak as its IAM. I can create realms, clients, users, ... for authentication, but my concern is: should I manage users only in Keycloak, or create my own user table in my microservice?
should I manage users only on keycloak or creating my own user table in my micro-service?
First, you need to check what one can and cannot do with Keycloak regarding user management, compared with your current (and possible future) requirements. If it does not completely fulfill your requirements, then you can either extend Keycloak, adapt your requirements, or (probably the most straightforward solution) keep your own user table in your microservice.
You might also want to create your own user table for performance reasons. Depending on how slow it is to access Keycloak in your setup, you might consider using that user table as a caching mechanism for quick access to user-related information.
The problem with having that user table is that, depending on the user information stored in Keycloak and in the user table, you might have to keep the two in sync. Moreover, if some information exists in the user table but not in Keycloak, and you need that information in the tokens, you will have to think about how to handle such situations.
Personally, I would try to avoid creating the user table unless it is really necessary. So a complete answer to your question will most likely depend heavily on your own needs.