Using the next few lines of code, I managed to get what I wanted in my IdentityServer database: the IS4 configuration and operational data stored in the database, which I believe is what we are supposed to have.
services.AddIdentityServer()
    .AddDeveloperSigningCredential()
    .AddTestUsers(TestUsers.Users)
    // this adds the config data from DB (clients, resources)
    .AddConfigurationStore(options =>
    {
        options.ConfigureDbContext = builder =>
            builder.UseSqlServer(Configuration.GetConnectionString("AppDB"),
                sql => sql.MigrationsAssembly(migrationsAssembly));
    })
    // this adds the operational data from DB (codes, tokens, consents)
    .AddOperationalStore(options =>
    {
        options.ConfigureDbContext = builder =>
            builder.UseSqlServer(Configuration.GetConnectionString("AppDB"),
                sql => sql.MigrationsAssembly(migrationsAssembly));
    });
These calls created a bunch of tables, and columns in them, which we want to trim down to only the ones we need.
After that, I went on to follow this walkthrough: https://damienbod.com/2017/12/30/using-an-ef-core-database-for-the-identityserver4-configuration-data/, all in the hope of finding a way to keep IdentityServer4 operational while creating custom tables and columns in the database.
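From what I can tell, the EF stores do allow some customization without touching the entities: the table names and schema can be changed through the store options. A minimal sketch of the kind of thing I mean (assuming the IdentityServer4.EntityFramework package and its IdentityServer4.EntityFramework.Options namespace; dropping columns the stores read and write would presumably break them):

services.AddIdentityServer()
    .AddDeveloperSigningCredential()
    .AddTestUsers(TestUsers.Users)
    .AddConfigurationStore(options =>
    {
        options.ConfigureDbContext = builder =>
            builder.UseSqlServer(Configuration.GetConnectionString("AppDB"),
                sql => sql.MigrationsAssembly(migrationsAssembly));
        // Hypothetical customizations: move every configuration table into
        // a dedicated schema and rename one of the tables.
        options.DefaultSchema = "ids";
        options.Client = new TableConfiguration("MyClients");
    });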
I am a newbie in IS4, and in programming generally, so please bear with me if I am doing something wrong.
The final questions are: is it possible to have reduced tables and columns? Is it an accepted practice? If so, can anyone help with a link, advice, or a snippet?
Thanks in advance!
In my PostgreSQL server I currently have one database with the user and business-relevant tables. Our company is planning to add more categories in the future, therefore I would like to separate the categories into individual databases, as most of the data is quite static once initialized. Below I show an example with cars; please keep in mind that our company's services differ much more than Pkw and LorryTruck do and need many more extra tables within their scope.
Current State:
PostgreSQL:
  CompanyDB
    UserTable
    PkwTable
    ServiceTable
    BookingTable

Future State:
PostgreSQL:
  UserDB
    UserTable
  PkwDB
    PkwTable
    ServiceTable
    BookingTable
  LorryTruckDB
    LorryTruckTable
    ServiceTable
    BookingTable
My concern is whether and how I could connect user-relevant data to the desired databases. For example, a user can register for Pkw services and might later be interested in LorryTruck services. The main goal is also that a user should only have to register once on our system.
Is this possible or could I design this better?
Please provide your opinion or experiences.
Thank you!
I would use a SCHEMA, not a different database. Querying data across databases is not possible (or at least not easy) in PostgreSQL, while using different schemas is standard and works great.
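To illustrate (a sketch in Python with SQLAlchemy; plain SQL works just as well, and all names here are made up): the shared user table stays in public, and each category gets its own schema whose tables can still reference it, because everything lives in one database.

from sqlalchemy import Column, ForeignKey, Integer, MetaData, Table, create_engine
from sqlalchemy.schema import CreateSchema

engine = create_engine("postgresql://user:secret@localhost/companydb")  # illustrative DSN
metadata = MetaData()

# The shared user table stays in the default (public) schema.
users = Table("users", metadata, Column("id", Integer, primary_key=True))

# Category tables live in their own schema, yet can still reference
# public.users, because it is all one database.
pkw_bookings = Table(
    "bookings", metadata,
    Column("id", Integer, primary_key=True),
    Column("user_id", Integer, ForeignKey("users.id"), nullable=False),
    schema="pkw",
)

if not engine.dialect.has_schema(engine, "pkw"):
    engine.execute(CreateSchema("pkw"))
metadata.create_all(engine)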
Problem
I'm building a web app, where each user needs to have segregated data (due to confidentiality), but with exactly the same data structures/tables.
Looking around, I think this concept is called multi-tenancy, and it seems as though a good solution is one schema per tenant.
I think SQLAlchemy 1.1 implemented some support for this with
session.connection(execution_options={
    "schema_translate_map": {"per_user": "account_one"}})
However this seems to assume the schema and tables are already created.
I'm not sure how many tenants I'm going to have, so I need to create the schema, and the tables within them, on the fly, when the user's account is created.
Solution
What I've come up with feels like a bit of a hack, which is why I'm posting here to see if there's a better solution.
To create schemas on the fly I'm using
if not engine.dialect.has_schema(engine, user.name):
    engine.execute(sqlalchemy.schema.CreateSchema(user.name))
And then directly afterwards I'm creating the tables using
table = TableModel()
table.__table__.schema = user.name
table.__table__.create(db.session.bind)
With TableModel defined as
class TableModel(Base):
    __tablename__ = 'users'
    __table_args__ = {'schema': 'public'}

    id = db.Column(
        db.Integer,
        primary_key=True
    )
    ...
I'm not too sure whether to inherit from Base or db.Model; db.Model seems to automatically create the table in public, which I want to avoid.
Bonus question
Once the schemas are created, if, down the line, I need to add tables to all the schemas, what's the best way to manage that? Does Flask-Migrate handle that natively?
Thanks!
If anyone sees this in the future: this solution broadly works, however I've recently run into a problem.
This line
table.__table__.schema = user.name
seems to create some odd behaviour where the value of user.name persists in other areas of the app, so if you switch users, the table from the previous user is incorrectly queried.
I'm not totally sure why this happens, and still investigating how to fix it.
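My current suspicion is that table.__table__.schema mutates the Table object attached to the model class, which is shared state across the whole app, and that would explain the bleed-over between users. If that's right, the schema_translate_map feature mentioned above should avoid the problem by leaving the model on a placeholder schema and translating it per connection; a rough sketch (Base, db, engine and user as in the snippets above):

class TableModel(Base):
    __tablename__ = 'users'
    __table_args__ = {'schema': 'per_user'}  # placeholder, translated at runtime

    id = db.Column(db.Integer, primary_key=True)

# Create the tenant's tables by translating the placeholder once:
connection = engine.connect().execution_options(
    schema_translate_map={"per_user": user.name})
TableModel.__table__.create(connection)

# And per request, point the session at the right schema the same way:
db.session.connection(execution_options={
    "schema_translate_map": {"per_user": user.name}})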
Normally, MongoDB Collections are defined like this:
DuckbilledPlatypi = new Mongo.Collection("duckbilledplatypi");
I want to, though, dynamically generate Collections based on user input. For example, I might want it to be:
RupertPupkin20151212_20151218 = new Mongo.Collection("rupertPupkin20151212_20151218");
It would be easy enough to build up the Collection name:
var dynCollName = username + begindate +'_'+ enddate;
...and then pass dynCollName to Mongo.Collection:
= new Mongo.Collection(dynCollName);
...but what about the Collection instance name - how can that be dynamically generated? I would need something like:
"RupertPupkin20151212_20151218".ToRawName() = new Mongo.Collection(dynCollName);
-or:
"RupertPupkin20151212_20151218".Unstringify() = new Mongo.Collection(dynCollName);
...but AFAIK, there's no such thing...
On a single client instance, yes, and you could dynamically reference it. However, in the general case (using it to sync data between the server and all connected clients), no.
I address this point in the Dynamically created collections section of common mistakes in a little detail, but the fundamental problem is that it would be highly complex to get all connected clients to agree on a dynamically generated set of collections.
It's much more likely that a finite set of collections, where some have a flexible schema, is actually what you want. As Andrew Mao points out in the answer to this related question, partitioner is another tool available to help address some of the cases which give rise to this question.
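For the single-instance case, "dynamically reference it" can be as simple as keeping the instances in a plain object keyed by the generated name, rather than trying to "unstringify" a variable name (a sketch reusing the names from the question):

var collections = {};

function getCollection(username, begindate, enddate) {
  var dynCollName = username + begindate + '_' + enddate;
  // Reuse an existing instance; creating two Mongo.Collection objects
  // with the same name throws an error.
  if (!collections[dynCollName]) {
    collections[dynCollName] = new Mongo.Collection(dynCollName);
  }
  return collections[dynCollName];
}

var rupert = getCollection('rupertPupkin', '20151212', '20151218');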
I'm using MVC5 along with EF6 to develop an application. I'm using SQL Server Express for database. I have two tables/entities in the database.
Vehicle - contains information about the vehicles.
GsmDeviceLog - contains data received from the GPS tracker fitted in the vehicle.
My GsmDeviceLogs table currently has around 20K records, and it takes around 90 seconds to execute the code below, just to fetch one record (i.e. the last record).
Here is the code:
var dlog = db.Vehicles.Find(2).GsmDeviceLogs.LastOrDefault();
When I try to open the table using Server Explorer, it shows ALL the data within 5-10 seconds. Can anyone help me get the details loaded as quickly on the page as well?
Thanks for reading and paying attention to the question.
Can anyone suggest any means of reducing this time?
Your query should look like this:
var dlog = db.Vehicles
    .Where(v => v.Id == 2)
    .SelectMany(v => v.GsmDeviceLogs)
    .OrderByDescending(gdl => gdl.Id) // or order by some datetime
    .FirstOrDefault();
In your original query you are loading the Vehicle with Find. Then accessing the GsmDeviceLogs collection loads all logs of that vehicle into memory by lazy loading and then you pick the last one from the loaded collection in memory. It's probably the loading of all logs that consumes too much time.
The query above is executed completely in the database and returns only one GsmDeviceLog record. Side note: You must use OrderByDescending(...).FirstOrDefault here because LastOrDefault is not supported with LINQ to Entities.
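As a side note, if you want to confirm what EF actually sends (and watch the lazy loads fired by the original query), EF6 can log the generated SQL, assuming db is your context:

// Write every SQL statement EF issues to the console.
db.Database.Log = Console.Write;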
Your LINQ query is inefficient. Try db.Vehicles.Single(x => x.Id == 2).GsmDeviceLogs.OrderByDescending(x => x.Id).FirstOrDefault(), or whatever your primary keys are.
I have to maintain an application that creates and manages quotations. (A quotation is a single class containing all information needed for pricing).
Most of the time, creating a quotation means adding a couple of lines to a couple of tables; that is pretty fast. Sometimes, however, the user attaches a large history of claims to the quotation and tens of thousands of rows must be created in the database. Using EF, it takes forever.
So I've tried to use SqlBulkCopy to bulk insert the claims while using EF to manage the remainder of the quotation, but the way I figured out how to achieve this is really, really cumbersome: I had to clone the quotation, detach the histories, delete the claims from the database, save the quotation, get the new foreign keys, bulk create the claims, attach the histories back to the quotation, and so on.
Is there another way to achieve this?
Note: I could separate the claim history from the Quotation class and manage the former using ADO.NET and the latter using EF, but a lot of existing processes need the actual class design (not to mention that the user can attach many claim histories, which, of course, are sub-collections of sub-collections of sub-collections buried deep in the object tree...).
Many thanks in advance,
Sylvain.
I found a simple way to accomplish this:
// We will need a quotation Id (see below). Make sure we have one.
SaveQuotation(myQuotation);

// Read the large claim history.
var claimCollection = ImportClaimsFromExcelFile(fileName);

// Save the claim history without using EF. The quotation Id is needed to
// link the history to the quotation.
SaveClaimCollectionUsingSqlBulkCopy(claimCollection, myQuotation.Id);

// Now, ask EF to reload the quotation.
LoadQuotation(myQuotation.Id);
With a history of 60,000 claims, this code runs in 10 seconds; using myObjectContext.SaveChanges(), even 10 minutes were not enough...
Thanks for your suggestions!
Note: here is the code I used to bulk insert the claims:
// "dt" is a DataTable filled with the claim rows to insert.
using (var connection = new SqlConnection(constring))
{
    connection.Open();
    using (var copy = new SqlBulkCopy(connection))
    {
        copy.DestinationTableName = "ImportedLoss";
        copy.ColumnMappings.Add("ImporterId", "ImporterId");
        copy.ColumnMappings.Add("Loss", "Loss");
        copy.ColumnMappings.Add("YearOfLoss", "YearOfLoss");
        copy.BatchSize = 1000;
        copy.WriteToServer(dt);
    }
    connection.Close();
}
Because of the many round-trips made to the database in order to persist an entity, EF is not the best choice for bulk operations. That said, it looks like the EF team is looking into improving this: http://entityframework.codeplex.com/workitem/53
Also, have a look here: http://elegantcode.com/2012/01/26/sqlbulkcopy-for-generic-listt-useful-for-entity-framework-nhibernate/.
Another possible solution could be to add all your inserts within a single ExecuteSqlCommand call, but then you would be losing all the advantages of using an ORM.
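For completeness, a rough sketch of that single-command idea (EF6; db is assumed to be your DbContext, claims the imported rows, and ImporterId is assumed to carry the quotation id, as the bulk-copy mapping above suggests; very large histories would need to be split into chunks, because SQL Server caps the parameters per command at 2100):

// Requires System.Text, System.Collections.Generic and System.Data.SqlClient.
var sql = new StringBuilder();
var parameters = new List<object>();
int i = 0;
foreach (var claim in claims)
{
    sql.AppendFormat(
        "INSERT INTO ImportedLoss (ImporterId, Loss, YearOfLoss) VALUES (@p{0}, @p{1}, @p{2});\n",
        i, i + 1, i + 2);
    parameters.Add(new SqlParameter("@p" + i++, myQuotation.Id));
    parameters.Add(new SqlParameter("@p" + i++, claim.Loss));
    parameters.Add(new SqlParameter("@p" + i++, claim.YearOfLoss));
}
db.Database.ExecuteSqlCommand(sql.ToString(), parameters.ToArray());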