Multi-tenant MVC4 application and automatic DB migrations with Entity Framework - Good practice? - entity-framework

I've built a multi-tenant MVC4 application that uses a specific database depending on the hostname.
The site binds to all of our customers hostnames:
If a visitor surfs to 'domain1.com', the 'domain1.com' database is used.
If a visitor surfs to 'domain2.com', the 'domain2.com' database is used.
The initializer for automatic migrations is normally registered in Application_Start():
Database.SetInitializer(new MigrateDatabaseToLatestVersion<MyProject.Models.MyProjectContext, MyProject.Migrations.Configuration>());
This will run the migrations on application start. However, since my application responds to several hostnames, only one database will be migrated. When I switch to another hostname, that database IS NOT migrated, since the application is already loaded into memory on the server.
I solved this by moving the above line of code to Session_Start().
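For reference, here is roughly what that looks like in Global.asax (a sketch only; the per-hostname set and lock are my own additions so the initializer only runs once per tenant per app domain, and it assumes MyProjectContext picks its connection string from the current hostname):
// Global.asax.cs - sketch only
using System.Collections.Generic;
using System.Data.Entity;
using System.Web;

public class MvcApplication : HttpApplication
{
    // Hostnames whose databases have already been migrated in this app domain.
    private static readonly HashSet<string> MigratedHosts = new HashSet<string>();
    private static readonly object MigrationLock = new object();

    protected void Session_Start()
    {
        var host = Request.Url.Host;

        lock (MigrationLock)
        {
            if (MigratedHosts.Contains(host))
                return; // this tenant's database is already up to date

            Database.SetInitializer(
                new MigrateDatabaseToLatestVersion<MyProject.Models.MyProjectContext,
                                                   MyProject.Migrations.Configuration>());

            using (var context = new MyProject.Models.MyProjectContext())
            {
                // Run the initializer (and therefore the migrations) immediately,
                // instead of waiting for the first query against this context.
                context.Database.Initialize(force: true);
            }

            MigratedHosts.Add(host);
        }
    }
}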
Is this good practice? Is there a better solution?
Thanks in advance,
Andreas

Related

Database and user creation in EF Core when deploying to Kubernetes

We have a .net core app being deployed to a Kubernetes cluster which accesses an AWS RDS MS SQL database.
In this environment we'd like to use EF Code First to handle our model (maybe with migrations later, but initially dropping and creating is fine).
How in this environment do we create a SQL user with appropriate permissions on the RDS instance so that the web application can login as this user and create the code first model?
Our initial approach involved creating a user as part of a .sh script, creating a DB and assigning permissions. This fell down because, when the C# code ran Database.EnsureCreated(), it saw there was already a database and didn't build the model.
I thought perhaps not creating the DB and assigning higher permissions to the user might work, but this feels like a bad approach unless we run some kind of post-deploy step to remove the dbo-esque permissions afterwards.
What is the recommended approach for an ephemeral deployment where we intend to drop/create/seed on each run?
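For context, this is roughly the drop/create/seed flow we have in mind (a sketch only; the generic context parameter and seed delegate are placeholders, and the SQL login used has to be allowed to drop and create databases on the RDS instance):
using System;
using Microsoft.EntityFrameworkCore;

public static class EphemeralDatabaseSetup
{
    // Drops, recreates and seeds the database on every deployment run.
    public static void Rebuild<TContext>(TContext context, Action<TContext> seed)
        where TContext : DbContext
    {
        context.Database.EnsureDeleted();  // drop the database if it already exists
        context.Database.EnsureCreated();  // create it and build the current code-first model
        seed(context);                     // apply initial data
    }
}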
I've been tussling with this question as well. We're doing .NET Core EF code first on Kubernetes with a Microsoft SQL database.
I've been messing around with context.Database.Migrate(). This will create the DB then create the tables and do the migrations (case 1), or if the DB already exists, it will just create the tables and do the migrations (case 2).
For case 1, the account needs to have the dbcreator server role. Once it creates the DB, it will assign itself the dbo database role.
For case 2, you could potentially just give db_ddladmin, db_datareader, and db_datawriter. I've tested this and it seems to work fine, but I'm unsure of the side effects of not having dbo access. Julie?
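For what it's worth, this is roughly how I'm calling it at startup (a sketch only; the generic context parameter stands in for whatever DbContext you register with AddDbContext):
using System;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;

public static class StartupMigrator
{
    // Applies any pending EF Core migrations when the app starts.
    public static void Migrate<TContext>(IServiceProvider services)
        where TContext : DbContext
    {
        using (var scope = services.CreateScope())
        {
            var context = scope.ServiceProvider.GetRequiredService<TContext>();

            // Case 1: database missing -> created, tables built, migrations applied
            //         (the SQL login needs the dbcreator server role).
            // Case 2: database exists -> only pending migrations are applied
            //         (db_ddladmin + db_datareader + db_datawriter seemed to be enough).
            context.Database.Migrate();
        }
    }
}
You would call this from Program.cs after building the host and before Run().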

.NET Core Entity Framework on AWS

I've got a web application written in .NET Core using Entity Framework to connect to a SQL Server database. Everything works fine locally. The staging/production environment for the app is AWS Elastic Beanstalk, and I seem to be stuck on how to get the app to connect to the database there.
I've created an RDS (SQL Server) database underneath the Elastic Beanstalk app.
Connection string:
user id=[USER];password=[PASSWORD];Data Source=[SERVER].rds.amazonaws.com:1433;Initial Catalog=Development
Creating the DbContext:
services.AddDbContext<MyDbContext>(o =>
o.UseSqlServer(Configuration["Db:ConnectionString"]));
The app fails to start; the failure points to the following line in Startup.cs:
dbContext.Database.EnsureCreated();
You have to troubleshoot step by step, following the procedure below:
Is the DB connection string working or not? It's better to test it from another app, or with a simple standalone connection test. It's also possible that a firewall is blocking port 1433.
As per your code, .NET Core will create the database using the code-first approach, so you have to make sure your IIS application pool user account has write access to the SQL database. Most probably that is your problem.
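As a rough illustration of the first point, a minimal standalone connection test (a sketch only; note that SqlClient expects the port as 'server,1433' rather than 'server:1433'):
using System;
using System.Data.SqlClient;

class DbConnectionTest
{
    static void Main()
    {
        // Replace with the value you put in Db:ConnectionString.
        var connectionString =
            "user id=[USER];password=[PASSWORD];" +
            "Data Source=[SERVER].rds.amazonaws.com,1433;Initial Catalog=Development";

        try
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open(); // throws if the server, port or credentials are wrong
                Console.WriteLine("Connected: " + connection.ServerVersion);
            }
        }
        catch (SqlException ex)
        {
            Console.WriteLine("Connection failed: " + ex.Message);
        }
    }
}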

Is This MSDTC configuration Issue?

It seems I am running into the Microsoft Distributed Transaction Coordinator (MSDTC) related issue.
SCENARIO
I am using TransactionScope, and within a single scope it hits two different databases on different servers (for instance, DB_A running Windows Server 2003 and DB_B running Windows Server 2008). One database is accessed using Entity Framework 4.0 and the other using normal ADO.NET APIs.
When I run the application from my development machine (running WinXP), it commits and rolls back both connections accurately. But when I run the application deployed on another server (for instance WAS_A, running Windows Server 2003), it commits correctly, but in case of an exception it doesn't roll back the database activity on both servers.
I thought it might be an MSDTC configuration issue on WAS_A, so I went to MSDTC -> Security Configuration and checked all the available options (as I did previously on the other machines). But I am still facing the same issue.
Looking for your expert advice. :)
I believe that you need to look into Enabling Transaction Flow. Specifically, take a look at how one call may error while the other completes, as described in TransactionScope and WCF Services:
an error in a second WCF service call was NOT rolling back the changes made in a previous WCF service call...
In order to create an ambient transaction in your client and ensure that it is used by your WCF services...
The article then details the following steps:
Configure Your Binding with transactionFlow
Decorate Your Interface with [TransactionFlow(TransactionFlowOption)]
Decorate Your Method with [OperationBehavior(TransactionScopeRequired)]
Optionally update your Connection Strings with Transaction Binding*
*Note: This is optional in my opinion.
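A rough sketch of steps 2 and 3 (IOrderService and SubmitOrder are just hypothetical names; step 1 is the transactionFlow="true" setting on the binding in config):
using System.ServiceModel;

[ServiceContract]
public interface IOrderService
{
    // Step 2: allow the client's ambient transaction to flow into this operation.
    [OperationContract]
    [TransactionFlow(TransactionFlowOption.Allowed)]
    void SubmitOrder(int orderId);
}

public class OrderService : IOrderService
{
    // Step 3: enlist this method in the flowed transaction, so a rollback
    // on the client also rolls back the work done here.
    [OperationBehavior(TransactionScopeRequired = true)]
    public void SubmitOrder(int orderId)
    {
        // database work via EF / ADO.NET goes here
    }
}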

MONO / ASP.NET Authentication Persistence

I'm deploying an ASP.NET MVC 2 application using Apache / mod_mono / MONO (2.8.1) that uses the built in ASP.NET authentication framework.
When I restart Apache, or use the mod_mono control panel to restart the mono server process, users are logged out. I don't want this to happen.
I'm using custom Profile / Membership / Role providers (backed by a Redis database), and these currently have a bare-minimum implementation. I can't see how my problem fits in here, however; am I missing something obvious?
I notice that the .MONOAUTH cookie changes value when a user logs back in, so I guess there is some persistence that needs to happen that is not happening.
Any solutions or pointers to the relevant documentation would be great!
NOTE: I'm not sure if the information below differs when you're using a Membership Provider -- it may be that session state is persisted by the Membership Provider itself.
It's likely that you're using "in-process" session state storage. This means that whenever you restart the web server process, you're clearing out all the session information stored in the web server process's memory space.
To avoid wiping out session information, you can move to using an out-of-process session state server, either running as an in-memory service (see below for the Mono version) or on SQL Server. Otherwise there are also a number of unofficial custom session store providers that use alternative storage mechanisms (e.g. MongoDB etc.)
I found what you may want, which is this Mono ASP.NET Session State Server: http://manpages.ubuntu.com/manpages/gutsy/man1/asp-state2.1.html
As a first step, take a look in your web.config at the system.web -> sessionState property. If it's set to mode="InProc" then there's your problem. It should look more like:
<sessionState
mode="StateServer"
stateConnectionString="tcpip=server:port"
stateNetworkTimeout="number of seconds"/>
Solution: set the validationKey and decryptionKey manually:
<machineKey validationKey="blahblah" decryptionKey="blahblah" />
I think it is probably a bug in Mono that these keys take on different values across server restarts when they are auto-generated (which is the default).

Database migrations: manage with build script or automatic on app startup?

I'm in the process of developing a deployment system for a new web app and I'm wondering where the best point in the process to manage database migrations is (the question of how to do the migrations is another problem entirely).
It seems there are two ways to go:
1. Use a migration script that can either be run manually from the command line or as part of the automatic deployment/build process
2. Run the migrations when the app starts up (I'm using ASP.NET, so this can be done easily enough without causing a long-running user request)
Does anyone have any suggestions/insight/experience with these approaches? Any other suggestions?
I can see why #1 might be more attractive - it gives me complete control over when the DB is updated. However, I quite like #2 as it allows me to quickly iterate between deployments and reduces the manual process. #2 could also be used on my development machine to allow even quicker iterations. Hmm, starting to think having both might be a good thing...
We have a sales-force system with ~100 clients, and we update the database at application startup (true, ours is a desktop application). I like this approach: it's safe and iterative when the starting point is indeterminate (is the client database new, or only updated to version x.y.z?).
But on the server side I prefer your #1 option: we create a SQL query file on our virtual machine (based on a copy of the original database) and run that query against the real server.
So IMHO:
Disconnected clients: startup, iterative scripts
Server: a query created on a VM based on the actual, real database
I'm interested in this problem too, and have found some (half-)frameworks such as RikMigrations. After some googling, a good starting place on DB versioning/migration frameworks is the .NET Database Migration Tool Roundup. Not necessarily the documentation, but the team blogs can be interesting.
I like option #1 better, as it seems much more flexible. Instead of actually performing migrations on each app start, I think I would verify that the database schema (version number?) matches the code and, if not, throw a warning or error about the mismatched database schema.
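A minimal sketch of that kind of startup check (the SchemaVersion table and the expected version constant are hypothetical; with a migration framework you would compare against whatever version table it maintains):
using System;
using System.Data.SqlClient;

public static class SchemaVersionCheck
{
    private const int ExpectedSchemaVersion = 42; // hypothetical value baked into the build

    public static void Verify(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT TOP 1 Version FROM SchemaVersion", connection))
        {
            connection.Open();
            var actual = (int)command.ExecuteScalar();

            if (actual != ExpectedSchemaVersion)
                throw new InvalidOperationException(
                    "Database schema version " + actual + " does not match version " +
                    ExpectedSchemaVersion + " expected by this build.");
        }
    }
}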
I'd prefer option #1 for a number of reasons. First, integration tests usually require your DB schema to be up to date, and launching a website just to upgrade the schema is a huge time waster. Second, you cannot change the database schema while your site is running (say, to add a couple of indexes to speed things up).
As for the production side of things, upgrading your database in a transactional, MSI-style installation is much better than attempting to upgrade at each app startup, since otherwise you can end up with desynchronized database and application versions.
And if you're looking for the migration framework, take a look at Wizardby.
If the application ever has to run on a customer's machine, then migrating at startup can prevent a lot of support calls - assuming you can do a seamless migration without user intervention (I hope you aren't normally running your web app with permission to modify the database).
If the application always runs under your control, automatic migration is less of an issue - but it can still be a good feature, especially if you want to minimize downtime and manual deployment steps.