Blazor WebAssembly EF Core row-level security - pass claims from client to server

I'm driving myself mad with this and hoping a better method exists.
The scenario is this: I have written a Blazor WebAssembly application with server hosting. The application needs to implement row-level security (using EF Core), but some of the access permissions have to be looked up in another database via an API (an external supplier, so I have no control over it) and used in that security.
Ideally I'd love the permissions to be managed in AAD but that isn't supported in the external application and is too much overhead to manage manually.
On the client side I use a custom AccountClaimsPrincipalFactory to look up the correct permissions and create the user claims.
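Roughly along these lines (a simplified sketch of what I mean; the permissions endpoint and the claim type are placeholder names):

```csharp
// Simplified sketch of the client-side factory described above.
// "api/permissions" and the "permission" claim type are placeholder names.
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Security.Claims;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Components.WebAssembly.Authentication;
using Microsoft.AspNetCore.Components.WebAssembly.Authentication.Internal;

public class PermissionsClaimsPrincipalFactory
    : AccountClaimsPrincipalFactory<RemoteUserAccount>
{
    private readonly HttpClient _http;

    public PermissionsClaimsPrincipalFactory(
        IAccessTokenProviderAccessor accessor, HttpClient http) : base(accessor)
        => _http = http;

    public override async ValueTask<ClaimsPrincipal> CreateUserAsync(
        RemoteUserAccount account, RemoteAuthenticationUserOptions options)
    {
        var user = await base.CreateUserAsync(account, options);

        if (user.Identity is ClaimsIdentity identity && identity.IsAuthenticated)
        {
            // Look up the external permissions (proxied through my own server API).
            var permissions = await _http.GetFromJsonAsync<string[]>("api/permissions");
            foreach (var permission in permissions ?? Array.Empty<string>())
                identity.AddClaim(new Claim("permission", permission));
        }

        return user;
    }
}

// Registered in Program.cs with:
// builder.Services.AddApiAuthorization()
//     .AddAccountClaimsPrincipalFactory<PermissionsClaimsPrincipalFactory>();
```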
What I'm struggling with is how to then manage this on the server side. I've looked into using IClaimsTransformation to look up the claims again, but that adds too much overhead because it is called on every request (a cached variant of that idea is sketched below).
I've also tried (not expecting it to work) adding claims client side, and adding a new identity client side and accessing it on the server. The only other option I can think of is using the AuthorizationMessageHandler to encrypt values and pass them to the server that way as a token; it works as a workaround, but it's probably not the best method.
Is there a way to somehow look up the database values and maintain them within the identity/token so they can just be passed around?
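For reference, this is roughly the cached IClaimsTransformation I mentioned above (a sketch only; IExternalPermissionService is a made-up wrapper around the supplier API, and the cache duration is arbitrary):

```csharp
// Sketch only: caches the external permission lookup per user so the transformation
// doesn't have to call the supplier API on every request.
// IExternalPermissionService is a hypothetical wrapper around that API.
using System;
using System.Security.Claims;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Authentication;
using Microsoft.Extensions.Caching.Memory;

public interface IExternalPermissionService
{
    Task<string[]> GetPermissionsAsync(string userId);
}

public class CachedPermissionClaimsTransformation : IClaimsTransformation
{
    private readonly IMemoryCache _cache;
    private readonly IExternalPermissionService _permissions;

    public CachedPermissionClaimsTransformation(
        IMemoryCache cache, IExternalPermissionService permissions)
    {
        _cache = cache;
        _permissions = permissions;
    }

    public async Task<ClaimsPrincipal> TransformAsync(ClaimsPrincipal principal)
    {
        var userId = principal.FindFirst(ClaimTypes.NameIdentifier)?.Value;

        // TransformAsync can run more than once per request, so don't add duplicates.
        if (userId is null || principal.HasClaim(c => c.Type == "permission"))
            return principal;

        // Only hit the external API when the cached entry has expired.
        var permissions = await _cache.GetOrCreateAsync($"permissions:{userId}", async entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
            return await _permissions.GetPermissionsAsync(userId);
        });

        var identity = new ClaimsIdentity();
        foreach (var permission in permissions ?? Array.Empty<string>())
            identity.AddClaim(new Claim("permission", permission));

        principal.AddIdentity(identity);
        return principal;
    }
}

// Registered with:
// services.AddMemoryCache();
// services.AddScoped<IClaimsTransformation, CachedPermissionClaimsTransformation>();
```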
If this is a stupid question, feel free to give me a slap, I've just exhausted my google search terms for it.
Thanks

Related

Best Practices for authorizing local scripts via oauth to access Web Services

I couldn't find information on how other people solve this, so maybe you can help me out.
What I have
Multiple services with REST APIs that are secured using OpenID Connect. Connections between the services work fine.
Now I have multiple developers who sometimes need to write and execute local scripts (Python, R, Bash, etc.) for quick analysis and testing.
What I want
I want to enable the developers to use the services as easily as possible, while still respecting security concerns.
What I tried
I defined the script itself as a client. So I created a public client in my OIDC product, named something like 'developer-scripts'. Using a library that handles the OAuth dance, I can then execute the script connecting as that client. The first time, a browser window pops up and asks the user to authenticate and thereby authorize the client to use the REST API on behalf of the user. After that, the tokens are cached and I can easily continue working on that script.
That works perfectly fine, and regarding security I'm glad that credentials are no longer saved on the local computers as they were before with e.g. Basic Authentication. Furthermore, I'm able to control access to the different services on a per-user level.
Other ideas that didn't convince me:
every web service also has a public client which can then be used as a client by the scripts (so the scripts aren't defined as clients anymore)
token generation is done somewhere else and the developer just adds the generated access/refresh token to the script
My problem
What concerns me about my current solution is the definition of that client. In the described case it would be either a generic client used by all developers for all scripts, or a new client for every developer who wants to write a local script. The latter seems like a lot of overhead, while the former may be a security problem.
So finally I'm asking the question: Are there any known best practices for my described use case?
EDIT:
I found a small article by [Martin Fowler](https://martinfowler.com/articles/command-line-google.html) in which he basically explains how he obtains a token to use in a local script. But in his case he is covering one specific use case, not a general public client, so unfortunately it doesn't really answer my question.

Using a custom database with roles in IdentityServer4

I am working on an application where I need to set up IdentityServer4. I have an API as a resource and a Web Forms application as a client. I have a few roles, like teachers, students and parents, in my database. How can I use this custom database and perform authentication and authorization without using ASP.NET Identity?
Please suggest.
From your other question here I get a better idea of what you want.
I think one solution for what you want would be to set up IdentityServer4 in a separate project with its own separate database. I noticed the tag identityserver3, but I think it is quite safe to go for IdentityServer4. It shouldn't make a difference for the client/user since they are conceptually compatible.
1) Give your application a client id/secret (which you configure in IdentityServer) in order to identify your application and grant access to the resource API. Here is some information: http://docs.identityserver.io/en/dev/quickstarts/1_client_credentials.html
You'll only need to configure one client to protect your resource from the outside. The only way to access the resource API is through your application, since your application is making the actual calls. This is also the drawback: you cannot expose the token to the outside world.
Since your client isn't the actual user, you'll still need to identify the user. You can use any mechanism you like based on your current model. A simple username/password (with or without ASP.NET Identity) could be enough to determine the roles. But please keep in mind that your application has full access to the resource API.
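To make option 1 concrete, here is a minimal sketch along the lines of the quickstart linked above (the resource name, client id and secret are just example values):

```csharp
// Minimal sketch of the in-memory configuration from the client-credentials quickstart.
// "schoolapi", "webforms-client" and the secret are illustrative values only.
// (On IdentityServer4 4.x you would also register a matching ApiScope.)
using System.Collections.Generic;
using IdentityServer4.Models;

public static class Config
{
    public static IEnumerable<ApiResource> GetApiResources() =>
        new List<ApiResource>
        {
            new ApiResource("schoolapi", "School API")
        };

    public static IEnumerable<Client> GetClients() =>
        new List<Client>
        {
            new Client
            {
                // The Web Forms application, identified by client id + secret.
                ClientId = "webforms-client",
                AllowedGrantTypes = GrantTypes.ClientCredentials,
                ClientSecrets = { new Secret("change-me".Sha256()) },
                AllowedScopes = { "schoolapi" }
            }
        };
}

// Registered in the IdentityServer host with:
// services.AddIdentityServer()
//     .AddInMemoryApiResources(Config.GetApiResources())
//     .AddInMemoryClients(Config.GetClients());
```

The Web Forms application then asks the token endpoint for an access token using its client id and secret (for example with the IdentityModel package's RequestClientCredentialsTokenAsync helper) and sends that token as a bearer token when calling the API.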
2) However, since IdentityServer is available, why not use it? Why don't you want to use the identity model? I think you should consider separating the identity information from your data model. Your data model shouldn't be aware of security, and security has nothing to do with your data model.
When you create a separate database for IdentityServer you have one place to configure the identity users. All you need is a reference (sub) to the user in the data model. http://docs.identityserver.io/en/dev/quickstarts/2_resource_owner_passwords.html
Add claims or roles and everything is in place; you'll see there is no need to keep identity data in your custom database. The structure of your custom database stays intact, including the user table, but without the identity data.
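For example, along the lines of the resource-owner quickstart linked above, a user with a role claim could be defined in the IdentityServer project like this (TestUser is for demos; in production you'd plug in a proper user store, and all values here are placeholders):

```csharp
// Sketch only: an in-memory test user in the IdentityServer project.
// The custom database would keep just the subject id ("818727") as its reference (sub).
using System.Collections.Generic;
using System.Security.Claims;
using IdentityServer4.Test;

public static class Users
{
    public static List<TestUser> GetTestUsers() =>
        new List<TestUser>
        {
            new TestUser
            {
                SubjectId = "818727",
                Username = "alice",
                Password = "Pass123$",
                Claims = { new Claim("role", "teacher") }
            }
        };
}

// Registered with:
// services.AddIdentityServer()
//     .AddTestUsers(Users.GetTestUsers());
```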
I think this is a safer solution and considering the good documentation and sample projects it may even turn out to be a quicker solution.

Using Local storage and REST adapter at the same time?

I'm pretty new to Ember, so to start I have a noob question: is it possible to use the Local Storage and REST adapters at the same time?
For example, if I want to do a login via the API and the login succeeds, the server will return an API key which is used for later communication with the service. Is it possible to store that information locally on the client and retrieve it when necessary, but also, for other models, to use the REST adapter?
If this is not a good way to handle such a case, which one would you propose, and is there any kind of example that would lead me in the right direction?
Thanks to the people from #emberjs, I found out that there is a wonderful authentication framework for Ember.js, ember-auth, which does what I need.

ASP.NET MVC 4, MongoDB, implementing login

I've used MongoDB before, but never with ASP.NET MVC.
Currently, I'm stuck trying to implement authentication for a system which is going to use MongoDB exclusively (so I don't have the option of keeping the users table in a SQL database).
Now, I figured a solution would be implementing my own membership provider. However, that requires quite a lot of code, and since it is related to security, it is not wise to reinvent the wheel if I can avoid it.
Coming from Rails, it would be rather simple to just add something like Devise, set it up to use MongoDB and call it a day. I couldn't find anything similar for ASP.NET MVC - I am not sure if it is an uncommon use case, or if my Google-Fu is inadequate.
I don't need anything fancy - just the ability to create users, check their credentials and protect controllers from being called by unauthenticated users. Are there any packages that could solve my problem?
https://github.com/osuritz/MongoDB.Web
A collection of ASP.NET providers (caching, membership, profiles, roles, session state, web events) for MongoDB.
I would suggest using https://extmongomembership.codeplex.com/ as this is a newer provider that was introduced with ASP.NET MVC 4. It also contains even more features (for instance, a permissions system if needed).
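Whichever provider package you pick, the MVC side stays the standard membership pattern; a rough sketch (the controller and action names are illustrative):

```csharp
// Rough sketch: the chosen provider package handles storage in MongoDB (wired up in web.config);
// the controllers just use the standard Membership/FormsAuthentication APIs.
using System.Web.Mvc;
using System.Web.Security;

public class AccountController : Controller
{
    [HttpPost]
    public ActionResult Login(string username, string password, string returnUrl)
    {
        // Validates against whichever membership provider is registered (e.g. the MongoDB one).
        if (Membership.ValidateUser(username, password))
        {
            FormsAuthentication.SetAuthCookie(username, createPersistentCookie: false);
            return Redirect(returnUrl ?? Url.Action("Index", "Home"));
        }

        ModelState.AddModelError("", "Invalid username or password.");
        return View();
    }
}

// Any controller can then be protected from unauthenticated callers:
[Authorize]
public class ReportsController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}
```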

Strategies for "Always-Connected" Windows Client Data Architecture

Let me start by saying: this is my first post here, it is a bit lengthy, and I haven't done Windows Forms development in years. With that in mind, please excuse me if this isn't directly a programming question, and please bear with me as I really need the help!!
I have been asked to develop a Windows Forms app for our company that talks to a central (local area network) Linux server hosting a PostgreSQL database. The app is to allow users to authenticate to the system and thereafter conduct the usual transactions with the PG database. Ordinarily, I would propose writing a web forms app against Mono, but the clients need to utilise local resources such as USB peripheral devices, so that is out of the question. While it might not all seem clear, my specific questions are set out below:
Dilemma #1:
The application is meant to be always connected. How should I structure my DAL/BLL - should it reside on the server or with the client?
Dilemma #2:
I have been reading up on Client Application Services (CAS), and it seems like a great fit for authentication, as everything is exposed via URIs. I know that a .NET data provider exists for PostgreSQL, but I'm not too sure whether CAS will work at all against a Linux (Debian) server. Believe me, I would get my hands dirty and try it myself, but I need to come up with a logical design first before resources are allocated to me for "trial purposes"!
Dilemma #3:
If the DAL/BLL is to reside on the server, is there any way I can create data services and expose only those services to authenticated clients? There is a (security) requirement whereby a connection string with a username and password for the database cannot be present on any client machine, even if security on the database side is quite rigid. I'm guessing that the only way for this to work would be to create the various CRUD data service methods exposed by an ASP.NET app, and have the Windows Forms app request or persist data through the ASP.NET app (via a URI) and have that return a result set or value. Would I be correct in assuming this? Should I be looking into WCF Data Services? And will WCF work with a non-SQL Server database?
Thank you for taking the time out to read this, but know that I am desperately seeking any advice on this! THANKS A MILLION!!!!
EDIT:
I am also considering using NHibernate as my ORM.
Some parts of your questions are complicated and beyond my expertise. However, in general you can do almost anything you put effort into, CAP theorem and the like aside.
DAL/BLL logic in general can reside in any of the tiers. I put a lot of this in my database and some in the middle tier, but that is to allow re-use in different environments, which may or may not be a goal for you. The thing is, I would think carefully through the separation-of-concerns issues here and what sorts of logic you want to centralize where. The further back it goes, the more re-usable it becomes, but this is not always a free tradeoff.
I am not entirely familiar with CAS, but it looked like AJAX kinds of stuff from what I saw on the MSDN web site. That could be wrong, but if it is right, then you have a problem in that such requests may be stateless, which could be an issue if you need a constant connection.
On the whole, based on what you are saying, it sounds cleanest to do a two-tier rather than a three-tier app and have the DAL/BLL sit on the client, possibly supported by stored procedures on the server. You can then set PostgreSQL up to authenticate against whatever you use on your network (Kerberos/KRB5 if you have AD is what I would recommend). This simplifies your data access, and it allows you to control permissions based on the authentication against the database. Since you can authenticate users based on AD, you can then set permissions accordingly.
One important consideration is going to be the number of connections. PostgreSQL does have some places where every current connection must be checked and iterated through, and connection startup and tear-down overhead can be significant in some cases. So one important decision will involve connection pooling. Whether or not you use connection pooling to boost performance will depend on what you are doing, but I have seen cases where PostgreSQL has handled 600 connections without serious problems.
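As a rough illustration of those two points (Kerberos/AD authentication so no password sits on the client, plus a bounded connection pool), assuming the Npgsql ADO.NET provider:

```csharp
// Sketch only, assuming the Npgsql ADO.NET provider.
// "Integrated Security=true" makes Npgsql authenticate via Kerberos/SSPI using the user's
// AD login, so no password lives in the client's connection string; PostgreSQL then maps
// the Kerberos principal to a database role and enforces permissions from there.
using System;
using Npgsql;

class Program
{
    static void Main()
    {
        var connectionString =
            "Host=db.example.local;Database=appdb;" +
            "Integrated Security=true;" +            // Kerberos/AD, no stored password
            "Pooling=true;Maximum Pool Size=20";     // keep the per-client connection count bounded

        using (var conn = new NpgsqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = new NpgsqlCommand("SELECT current_user", conn))
            {
                Console.WriteLine(cmd.ExecuteScalar());
            }
        }
    }
}
```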