Keycloak integration with Atlassian Crowd

We have a software stack with authentication based on Keycloak.
A client asks us to handle users/roles in its own system (Atlassian Crowd).
Is there a way to connect the two, so we don't have to redevelop the authentication layer in our apps?
Import Crowd users/roles into Keycloak regularly?
Delegate Keycloak user authentication to Crowd (how?)
...?
I have no deep knowledge of auth mechanisms (our usage of Keycloak is pretty simple), but I would like to get a picture of what's possible.
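For the "import Crowd users/roles regularly" option, a minimal sketch of what such a sync job could look like, assuming the Keycloak Admin REST API; the Crowd side is left as a hypothetical fetch_crowd_users() helper, and all URLs, realm names and credentials are placeholders.

```python
import requests

KEYCLOAK = "https://keycloak.example.com"  # placeholder; older Keycloak versions prefix paths with /auth
REALM = "myrealm"                          # placeholder realm


def admin_token() -> str:
    # Obtain an admin access token via the built-in admin-cli client (password grant).
    resp = requests.post(
        f"{KEYCLOAK}/realms/master/protocol/openid-connect/token",
        data={
            "grant_type": "password",
            "client_id": "admin-cli",
            "username": "admin",
            "password": "<admin-password>",  # placeholder
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_crowd_users() -> list[dict]:
    # Hypothetical helper: query Crowd's REST API and return a list of
    # {"username": ..., "email": ...} dicts. Implementation depends on your Crowd setup.
    raise NotImplementedError


def sync() -> None:
    headers = {"Authorization": f"Bearer {admin_token()}"}
    for user in fetch_crowd_users():
        # Create the user in Keycloak if it does not exist yet (409 = already present).
        resp = requests.post(
            f"{KEYCLOAK}/admin/realms/{REALM}/users",
            json={"username": user["username"], "email": user.get("email"), "enabled": True},
            headers=headers,
        )
        if resp.status_code not in (201, 409):
            resp.raise_for_status()


if __name__ == "__main__":
    sync()
```

The delegation option would instead map to Keycloak's User Storage SPI (a custom user federation provider that asks Crowd to verify credentials at login time), which is more work to build but avoids copying users and keeping them in sync.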

Related

SPA webapp SSO federation

I have an SPA web app using OpenID Connect for authentication and authorization with a local Keycloak.
This app is now moving to a Windows on-prem infrastructure using AD, Kerberos tickets and a central SSO.
Users log in to their Windows session, and then we should be able to log in to our SPA web app transparently (i.e. without entering credentials).
How can I convert a Kerberos ticket/authentication into the OpenID Connect world? Where is the magic?
Should we add some Kerberos handling to our app?
How can we retrieve our access token containing the user roles?
Thanks
Your SPA should continue to talk to Keycloak using OIDC, and no code in the SPA should need to change. Your APIs will also continue to receive the same access tokens.
You should only need to configure Keycloak to use AD as an LDAP data source for authentication. Here is an article on how to do that. It is an infrastructure job rather than just a coding one, so I would recommend collaborating with the AD administrators on the environment setup.
AD is only one possible authentication method, and by doing things this way you keep your options open. You are likely to need to perform account linking, e.g. to identify users consistently before and after the migration. There may be some data setup involved here, e.g. ensuring AD has the same email addresses as the existing system.
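A minimal sketch of the Keycloak side of that LDAP federation, assuming the Admin REST API and purely hypothetical hostnames, DNs and credentials; in practice this is usually configured in the admin console, and the AD specifics come from the AD administrators.

```python
import requests

KEYCLOAK = "https://keycloak.example.com"  # placeholder; older versions prefix paths with /auth
REALM = "myrealm"                          # placeholder realm
ADMIN_TOKEN = "<token from the admin-cli client>"  # placeholder

# LDAP user-federation component pointing at Active Directory.
# All connection values are placeholders; further attribute mappings
# (usernameLDAPAttribute, userObjectClasses, ...) are omitted for brevity.
payload = {
    "name": "corporate-ad",
    "providerId": "ldap",
    "providerType": "org.keycloak.storage.UserStorageProvider",
    "config": {
        "vendor": ["ad"],
        "connectionUrl": ["ldaps://ad.example.local:636"],
        "usersDn": ["CN=Users,DC=example,DC=local"],
        "bindDn": ["CN=keycloak-svc,CN=Users,DC=example,DC=local"],
        "bindCredential": ["<service-account-password>"],
        "authType": ["simple"],
        "editMode": ["READ_ONLY"],
    },
}

resp = requests.post(
    f"{KEYCLOAK}/admin/realms/{REALM}/components",
    json=payload,
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
)
resp.raise_for_status()
```

For the "without entering credentials" part, Keycloak's Kerberos/SPNEGO support can then be enabled on top of the LDAP federation so that the browser's Windows session ticket is used to log the user in silently.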

OpenID Connect: transparent authentication for legacy clients using Resource Owner Password Credentials

We're currently rewriting various services to use OpenID Connect (via Keycloak).
This works great for any modern browser-based clients, but in our case we also need to support legacy IoT devices, which:
cannot receive a firmware update (and thus are stuck in their current modes of authentication/communication)
are not aware of Keycloak and are not configured to participate in OpenID Connect (they are also only aware of the application's URL and not the Keycloak URL)
authenticate directly with the application using either Basic Authentication or SSL Client Authentication with a certificate.
From the documentation we gathered that mapping each device to a Keycloak user and using the Resource Owner Password Credentials grant would be the way to go in such cases.
We were thinking that it'd be nice to add centralized support for such devices by exposing a reverse proxy that sits in front of all services and performs the following steps:
Receive the IoT device requests (and optionally terminate SSL)
Extract the credentials from the request (either basic auth / client certificate)
Perform the Resource Owner Password Credentials flow against Keycloak to exchange the credentials for an access token (where the IoT device acts as the OAuth Resource Owner and the reverse proxy acts as the OAuth Client)
If successful, enrich the original request with the retrieved access token and forward it to the proxied service
In that way the entire OpenID Connect authentication is transparent for any legacy devices.
This design could be further improved/optimized by caching the access tokens for as long as they are valid (using the credentials as the cache key) and refreshing them when they expire.
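A minimal sketch of the token-exchange step such a proxy would perform, assuming Keycloak's standard token endpoint and the password grant; realm, client ID/secret and URLs are placeholders, and the in-memory cache keyed by the device credentials is deliberately simplistic.

```python
import time
import requests

KEYCLOAK = "https://keycloak.example.com"  # placeholder; older versions prefix paths with /auth
REALM = "iot"                              # placeholder realm
CLIENT_ID = "legacy-proxy"                 # the proxy acts as the OAuth client (placeholder)
CLIENT_SECRET = "<secret>"                 # placeholder

# (username, password) -> (access_token, expiry timestamp)
_token_cache: dict[tuple[str, str], tuple[str, float]] = {}


def token_for(username: str, password: str) -> str:
    """Exchange the device's credentials for an access token via ROPC, with caching."""
    cached = _token_cache.get((username, password))
    if cached and cached[1] > time.time() + 10:
        return cached[0]

    resp = requests.post(
        f"{KEYCLOAK}/realms/{REALM}/protocol/openid-connect/token",
        data={
            "grant_type": "password",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "username": username,
            "password": password,
        },
    )
    resp.raise_for_status()
    body = resp.json()
    _token_cache[(username, password)] = (body["access_token"], time.time() + body["expires_in"])
    return body["access_token"]


def forward(method: str, url: str, body: bytes, username: str, password: str) -> requests.Response:
    # Forward the original device request to the proxied service, enriched with the token.
    headers = {"Authorization": f"Bearer {token_for(username, password)}"}
    return requests.request(method, url, headers=headers, data=body)
```

Note that the password grant has to be switched on for the Keycloak client ("Direct Access Grants Enabled"), and, as the answer below points out, the grant is deprecated and dropped in OAuth 2.1.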
Now, this idea seems like such a no-brainer that we were surprised we couldn't find any existing gateways, reverse proxies or plugins that do this.
So I guess we're in need of a sanity check on:
Is this something that can work as described or are there any obvious flaws with the idea?
Why isn't anyone doing this already? (assuming that supporting legacy devices is a major pain point when switching to OpenID Connect)
UPDATE 1: (responding to a question) The described legacy IoT devices are (physically) Arduino microcontrollers with baked-in unique credentials. In the context of Keycloak, each such Arduino microcontroller is mapped to a Keycloak user. We're open to suggestions if this is not the most adequate mapping for this use case.
UPDATE 2: (responding to a question) Agreed that the Client Credentials flow would be semantically more correct for such device-to-device authentication, and any future devices we produce will use it. However, we can't use it for the existing legacy devices for two reasons: 1) the devices only know the server's URL and can't authenticate directly against Keycloak, and 2) we also want to support SSL Client Authentication using an X.509 certificate, and from our understanding Keycloak only supports X.509 certificate authentication for users, not for clients.
Is this something that can work as described or are there any obvious flaws with the idea?
It works fine, so long as your OP (OpenID Provider) supports the Resource Owner Password Credentials flow, which is deprecated and has been removed from modern OAuth 2.
Why isn't anyone doing this already? (assuming that supporting legacy devices is a major pain point when switching to OpenID Connect)
Lots of reverse proxies do this, just not with resource owner credentials. The ROPC flow was never a good idea, exists for legacy reasons, and has been removed from OAuth 2.1.
I suspect that most people move away from storing and transmitting resource owner credentials as they modernize their architecture.

Local identity-based login along with SAML 2.0 SSO

There is an existing mechanism to log into a website. Now, an external/remote SAML IdP is being added to facilitate SSO. The website uses other microservices and components that provide data and functionality to the website.
Is there a way for the existing local-identity username/password mechanism to continue to co-exist as an alternate authentication strategy alongside remote IdP SSO, while keeping the rest of the services handling authorization in a semantic way (using a SAML token)?
P.S. I looked at the option of implementing the existing auth mechanism as a SAML IdP, but building it seems complex even with the likes of the Shibboleth or OpenSAML libraries.
P.P.S. I haven't looked at the possibility of reimplementing the existing auth mechanism with OpenID Connect to co-exist with remote SAML IdPs.
Sure: one can provide a landing page to the user that gives a choice between using a local account or an account at a remote IDP.
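A minimal sketch of that idea, assuming a hypothetical Flask front end where /login/local keeps the existing username/password check and /login/sso redirects into the SAML flow (the SAML plumbing itself would come from a library such as python3-saml; all routes are placeholders).

```python
from flask import Flask, redirect, render_template_string, request

app = Flask(__name__)

CHOICE_PAGE = """
<h1>Sign in</h1>
<p><a href="/login/local">Use a local account</a></p>
<p><a href="/login/sso">Use your company account (SSO)</a></p>
"""


@app.route("/login")
def choose():
    # Landing page offering both authentication strategies side by side.
    return render_template_string(CHOICE_PAGE)


@app.route("/login/local", methods=["GET", "POST"])
def local_login():
    if request.method == "POST":
        ...  # existing local username/password verification goes here (placeholder)
    return "local login form"


@app.route("/login/sso")
def sso_login():
    # Redirect into the SAML AuthnRequest towards the remote IdP;
    # building and signing the request is handled by a SAML library (placeholder URL).
    return redirect("/saml/login")
```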

Keycloak client vs user

I understand that Keycloak has built-in clients and that we add the users later on.
But in general, what is the difference between a client and a user in Keycloak?
According to the Keycloak documentation:
User - Users are entities that are able to log into your system.
Client - Clients are entities that can request Keycloak to authenticate a user. Most often, clients are applications and services that want to use Keycloak to secure themselves and provide a single sign-on solution. Clients can also be entities that just want to request identity information or an access token so that they can securely invoke other services on the network that are secured by Keycloak.
In short, not only for Keycloak but for OAuth and OpenID Connect too, a client represents a resource (an application) that users can access. The built-in clients represent resources of Keycloak itself.
Clients and users are two completely different constructs in Keycloak.
In plain English, a client is an application, e.g. yelp.com or any mobile application. A client can also be a simple REST API. Keycloak's built-in clients are for Keycloak's internal use, but any user-defined application has to be registered as a client in Keycloak.
Users are the ones who authenticate via Keycloak to gain access to these applications/clients. Users are stored in the Keycloak DB or in an externally hosted LDAP that is synced with Keycloak.
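To make the distinction concrete, a minimal sketch of the two ways a token is obtained from Keycloak, with placeholder realm, client and user names: a client authenticates as itself with the client-credentials grant, while a user authenticates through a client, shown here with the password grant for brevity (browser apps would use the authorization-code flow instead).

```python
import requests

# Placeholder realm and host; older Keycloak versions prefix the path with /auth.
TOKEN_URL = "https://keycloak.example.com/realms/myrealm/protocol/openid-connect/token"

# A client (an application or service) authenticating as itself:
client_token = requests.post(TOKEN_URL, data={
    "grant_type": "client_credentials",
    "client_id": "billing-service",   # placeholder confidential client
    "client_secret": "<secret>",
}).json()["access_token"]

# A user authenticating via a client:
user_token = requests.post(TOKEN_URL, data={
    "grant_type": "password",
    "client_id": "my-webapp",         # placeholder client the user logs in through
    "username": "alice",              # placeholder user
    "password": "<alice-password>",
}).json()["access_token"]
```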

SSO and IDP proxy for UI and REST

We are building a SaaS application (enterprise oriented).
We have to be able to log users in against their company's SAML2 IdP with SSO functionality (so a multi-tenant context).
We prefer to manage this in an isolated component rather than directly in the application itself.
We are thinking of using a kind of "proxy".
We have two questions:
- Is WSO2 IS able to act as a proxy, delegating authentication to an external IdP?
- Our SaaS application will be offered via a UI relying on RESTful services, so we also need to manage SSO with the services, for example:
. The user comes to the UI without having logged in before
. The company IdP login page is shown for authentication
. Once logged in, the UI performs calls to REST services, and we need to secure those service calls to be sure the user is allowed to call the service
How do we manage this?
Can the "proxy" also act as a "service proxy" in order to call the external IdP API?
Thanks,
Nicolas.
If I got your question correctly: there is an existing IdP in the "foo" company, and in the "bar" domain you have applications. You are not going to integrate the applications directly with the IdP in "foo"; instead you wish to install another IdP in the "bar" domain that talks to the existing IdP in the "foo" domain. Yes, WSO2 IS can be used to implement such a use case. It has an "Authentication Framework" for SAML2 SSO login. Let me explain it a bit. When the user is directed to the WSO2 IS SAML2 IdP, the user can be authenticated by verifying username/password, which is the default behavior (the default authenticator picked by the Authentication Framework). But there can be other authenticators, such as SAML2 SSO (where WSO2 IS calls out to another SAML2 IdP and authenticates the user), OpenID, and so on. I guess the same scenario has been discussed here. I found a blog on implementing this.
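On the REST side of the question, one common pattern is for the UI to obtain an access token after the SSO login and attach it to every service call, and for each service to validate that token. A minimal sketch of the validation step using PyJWT, assuming the proxy/IdP issues signed JWTs and publishes its keys at a JWKS URL; whether you actually get a JWT, an opaque token to introspect, or a SAML assertion depends on how the proxy is configured, and the URL and audience below are placeholders.

```python
import jwt                      # PyJWT
from jwt import PyJWKClient

JWKS_URL = "https://idp.example.com/oauth2/jwks"  # placeholder
AUDIENCE = "saas-api"                             # placeholder

jwk_client = PyJWKClient(JWKS_URL)


def validate(authorization_header: str) -> dict:
    """Validate the Bearer token sent by the UI and return its claims."""
    token = authorization_header.split(" ", 1)[1]
    signing_key = jwk_client.get_signing_key_from_jwt(token)
    claims = jwt.decode(token, signing_key.key, algorithms=["RS256"], audience=AUDIENCE)
    # The service can now authorize the call from claims such as the subject,
    # roles, or a tenant identifier.
    return claims
```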