I am working on an application that utilizes a database that often has tables added to it or modified. Is there a way I can regenerate the .edmx file as a build step or during compile time to add these new tables/modifications without manually running the wizard?
You can try running SQL scripts to insert/modify tables during the build process.
Related extension:
ExecuteSqlScript
Run SQL Server Scripts Task
Or directly use PowerShell to Execute .SQL Files from Directory.
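For example, a minimal PowerShell sketch of that "execute .sql files from a directory" idea, assuming the SqlServer module (Invoke-Sqlcmd) is installed; the folder path and server/database names are placeholders:

    # Run every .sql file in a folder, in name order, against a target database.
    Get-ChildItem -Path ".\SqlScripts" -Filter *.sql | Sort-Object Name | ForEach-Object {
        Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDb" -InputFile $_.FullName
    }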
See the articles below on changing the database during build/release:
Build and Release Process for SQL Server Database Scripts using Online TFS
Continuous Deployment of SQL Server Database Changes using Visual Studio & TFS Release Manager
UPDATE:
Choosing Update Model from Database is the best method for updating your EDMX, but there are certain properties that don't get updated on the conceptual layer.
See How do you update an edmx file with database changes?
It seems there isn't a good way to achieve that; however, if the actions can be executed from the command line, you can add a build step to run the command or script.
Related
I am setting up an Azure Release Pipeline and I need to execute any pending DB Migrations as part of the release.
I have been scouring the internet for over an hour, and everything I can find is about .NET Core, while this database is EF6 on .NET Framework, not .NET Core (I've done this several times before for Core).
The problem, as I see it, is that EF6 works through Visual Studio's built-in Package Manager Console, which just doesn't exist in an Azure Pipeline; it's a Visual Studio peculiarity.
There seem to be several ways I could skin this cat, but I can't figure out how to start with any of them within the context of the pipeline...
OPTION 1: Run the Migrations on the Pipeline - but... how?
OPTION 2: SQL Scripts - Requires running the Package Manager Console to generate them so they can be run (if I could do that on the pipeline I'd just run the migrations anyway, so these would have to be created locally and committed with the code, which is a somewhat backward solution IMO)
OPTION 3: Write a console app - Do I really have to do this??
You can try Entity Framework Migration Extensions.
This task allows a Build / Release to provide database connection parameters and execute an Entity Framework 6 migration against the database.
Build your project to an output folder and include the migrate.exe executable that comes with Entity Framework 6 (a sketch of invoking it follows these steps).
Create an automated build that packages up your files and makes them accessible during a Release.
Create a Release definition for the relevant Build
Add an EF6 Migration task. Once that task is added to an environment within the release, you'll need to enter the appropriate parameters to configure it. All file path parameters should be within the file system for the build; none of them are TFS source control paths.
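To make that concrete, here is a hedged PowerShell sketch of a release step invoking migrate.exe; the assembly name, config file, and connection string are placeholders:

    # migrate.exe ships in the tools folder of the EntityFramework NuGet package.
    # Run it from the folder containing the built assemblies.
    .\migrate.exe MyApp.Data.dll `
        /startupConfigurationFile="MyApp.Data.dll.config" `
        /connectionString="Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True" `
        /connectionProviderName="System.Data.SqlClient" `
        /verbose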
You can also check this article.
The answer here is to use the ef6.exe command line tool and make sure it gets shipped with your build.
This could be useful to anyone here until Microsoft updates the non-existent docs: http://github.com/dotnet/EntityFramework.Docs/issues/1740 - the issue contains a table with a kind of translation matrix between the two tools.
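For reference, a sketch of the equivalent ef6.exe call (available from EF 6.3 onwards); the assembly and connection values are placeholders, and the exact option names should be checked against the matrix linked above:

    # Mirrors Update-Database from the Package Manager Console.
    .\ef6.exe database update `
        --assembly "MyApp.Data.dll" `
        --connection-string "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True" `
        --connection-provider "System.Data.SqlClient" `
        --verbose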
We're using EF 6, in database-first mode.
We've been building a database migration package, using FluentMigrator, and we've been running our migrations against empty LocalDb databases in our solution, for use in integration testing, etc.
Because we need our solutions to build wherever they're checked out, we've been using |DataDirectory| in the connection strings.
We've written some unit [sic] tests, running against LocalDb instances, running the migrations first to ensure that the tests are running against the current schema. (Yes, these are integration tests, but we're using a unit test framework, and I don't want to argue about it.)
What I would like to do next is to change our configuration so that when we run "Update Model from Database" on our .edmx files, it's the LocalDb instance that the tool looks to. And I can't make that work.
My guess is that whatever tool runs when we do this doesn't have its DataDirectory set where I need it to be in its AppDomain.
Does anyone know what tool this might be? And how I can configure it to connect to a localdb instance, that's using |DataDirectory| to create a relative path?
Before anyone marks this as a duplicate, none of the questions similar to this addressed any of my concerns or answered any of my questions.
I am currently developing all the POCOs and data contexts in a library project, and running migrations from within this project. The database I'm updating is the development database.
What do I do if I want to apply the current schema to a fresh, new database? I figure that all I have to do is change the connection string in web.config and run Update-Database, correct?
While the live/production database is up and running, I want to add new columns and new tables to the schema, and test it out in development. So I switch back the connection string to the development database's connection string, and run Update-Database.
Going back and forth between two databases like this seems likely to produce conflicts between the _MigrationHistory tables and the auto-generated migration scripts.
Is it safe to manually delete the _MigrationHistory tables in both databases, and/or delete the migration files in /Migrations (so I'll run Add-Migration again)? How do we manage this?
What do I do if I want to apply the current schema to a fresh, new database?
- Yes, to create a fresh database at the current migration level, you simply modify the connection string to point to a database that does not yet exist and run Update-Database. It will run all the migrations in order.
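If you'd rather not edit the config file each time, Update-Database can also be pointed at another database directly; a sketch from the Package Manager Console, with a placeholder connection string (verify these switches against your EF6 version):

    # Creates "FreshDb" if it doesn't exist and runs all migrations against it.
    Update-Database -ConnectionString "Data Source=.;Initial Catalog=FreshDb;Integrated Security=True" `
        -ConnectionProviderName "System.Data.SqlClient"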
As far as migrating to the production database goes, I run the Update-Database command with the -Script switch to get the raw SQL, and then apply that script to the production database manually. This is also helpful if you need to keep a record of the SQL commands run against the database. Additionally, you can generate the script explicitly from one specific migration to another via some of the other Update-Database switches.
Alternatively, you can create an idempotent script that works from any migration by using the -SourceMigration $InitialDatabase switch, and optionally specify an end migration with -TargetMigration.
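A minimal sketch of those script-generating commands in the Package Manager Console; "MyTargetMigration" is a placeholder migration name:

    # SQL script from the empty database up to a specific migration.
    Update-Database -Script -SourceMigration $InitialDatabase -TargetMigration "MyTargetMigration"
    # Idempotent script that brings a database in any state up to the latest migration.
    Update-Database -Script -SourceMigration $InitialDatabase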
If you delete the _MigrationHistory tables, you will run into issues where the generated script tries to add columns that already exist, and so on.
You may find the following link helpful:
Microsoft Entity Framework Migrations
I would suggest having a separate trunk in your source code repository, one pointing to production and one to development, to avoid the risks of switching between the two in Visual Studio.
I also had the same problem, even when using one and the same database, due to some merges in the repository and the mix of automatic/manual migrations. For some reason EF was not taking the target database into account when calculating which scripts need to be executed based on what is already in the database.
To fix this, I go to the [__MigrationHistory] table in the target database and get the latest migration name. This helps EF determine the state of the DB, so it will execute just the scripts needed.
Then the following command is run:
Update-Database -Script -SourceMigration {latest migration name}
This creates an update script that is specific to the target database (the connection string should be correct, as discussed in the other answers).
You can also use the -Force parameter if needed.
This way you can update any database to the latest version, no matter what version you find it at, as long as it has a __MigrationHistory table.
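A hedged sketch of that lookup step, assuming the SqlServer PowerShell module (Invoke-Sqlcmd) and placeholder server/database names:

    # Find the most recent migration applied to the target database...
    $latest = Invoke-Sqlcmd -ServerInstance "TargetServer" -Database "TargetDb" `
        -Query "SELECT TOP 1 MigrationId FROM [__MigrationHistory] ORDER BY MigrationId DESC"
    # ...then feed its name to the command above in the Package Manager Console:
    # Update-Database -Script -SourceMigration $latest.MigrationId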
Hope this helps
My production and my development databases went out of sync, and it gave me endless problems. I solved it using a tool from Red Gate to match up the databases. After using the tool, the databases were exactly the same, but my migrations were not working and I started to get odd errors, e.g. trying to add tables/columns that already existed. I solved that too: I deleted the Migrations folder on the local side, recreated it, added the initial migration, updated the database, and then matched the data of this migration file (local) to the one on the host (deleted all the data in the migration file on the host and added the same data that is on the local into the host). A more detailed explanation is at:
migration synch developmental and production databases
I'm having lots of trouble trying to figure out how to rebuild my database.
I've deleted a table, then ran Update-Database -Verbose, and it doesn't rebuild my database.
The same thing happens when I have a table, change a column name inside my model (C#), and want to rebuild my database so that the new name shows up in the database: nothing happens when I run the same command.
How can I rebuild my database? I'm sure there's a command or something besides Update-Database -Verbose.
I'm using Visual Studio Express 2012 for Web.
EDIT: I couldn't find a command that rebuilds my database, though I did find a simple way to do what I wanted.
You can delete tables or rename columns in your models, and it will always rebuild your database. In Solution Explorer, click the "Show All Files" icon and open the App_Data folder; there you'll see your database file. Delete it and re-run your application, and the database will be built again with all the changes you've made to your models in your code. So the part about changing catalog and .mdf names in the Web.config file is not needed.
Update-Database is part of the Entity Framework Migrations package, which allows you to script changes to the database sequentially. The command won't do anything on its own without migrations to process. If you are using Migrations, you need to run the Add-Migration MyNewMigration command first and verify that the generated script makes sense.
If you are not using Migrations, you can use the much simpler database initializers, such as DropCreateDatabaseIfModelChanges, with your context class.
Another thing you can try is running Update-Database -Force to "force" the migration to run even if it's already present in _MigrationHistory.
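To tie this together, a minimal sketch of the Migrations workflow in the Package Manager Console; "AddCustomerEmail" is a placeholder migration name:

    Enable-Migrations                # one-time setup; creates the Migrations folder
    Add-Migration AddCustomerEmail   # scaffolds a migration from pending model changes
    Update-Database -Verbose         # applies any pending migrations to the database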
In order to run and re-run my integration tests an indefinite number of times, I would like to use SSDT in VS2012 to publish to a LocalDB file instance and run EF against that file during integration tests.
A few notes:
We are using EF database-first
We already have an SSDT project that we will use to deploy to a full database in our different environments
I know that SSDT internally uses a LocalDB instance to build/deploy/check for errors, so deploying to another custom LocalDB seems like it should make sense/be doable
A few questions:
Can I deploy to a specific LocalDB file with SSDT?
Can I do this from the command line in order to automate it when I run the integration tests?
Does this roughly seem like a good idea for integration tests with EF, or is there a better way? ;-)
Thank you all
You can change the localdb for SSDT in the Debug options for the project. By default the debug options are set to the (localdb) instance and a DB name that corresponds to the project.
You may have more success with Publish Profiles if you're trying to push the project changes to a DB server. You can use those with SQLPackage to push the changes along with a known set of options to a pre-defined server/database.
You can definitely push the changes through a command line. We're doing it with MSBuild to generate a dacpac file, then SQLPackage to publish the changes from the dacpac to the appropriate server/database.
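As a rough sketch of that flow (paths, project, and database names are placeholders, and the VS2012-era LocalDB instance name is assumed to be (localdb)\v11.0):

    # Build the SSDT project into a dacpac, then publish it to a LocalDB instance.
    msbuild .\MyDatabase.sqlproj /p:Configuration=Release
    SqlPackage.exe /Action:Publish `
        /SourceFile:".\bin\Release\MyDatabase.dacpac" `
        /TargetServerName:"(localdb)\v11.0" `
        /TargetDatabaseName:"IntegrationTests"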
Can't say for sure on this one. If it works for you, it's likely a good start. We do DB development outside of EF and try to do that first rather than trust EF to generate a good relational model.
I have a handful of blog posts on SSDT SQL Projects at http://schottsql.blogspot.com/search/label/SSDT that might be helpful.