SSDT: How to deploy to a specific LocalDB file? - entity-framework

In order to run and re-run my integration tests an indefinite number of times, I would like to make use of SSDT in VS2012 to publish to a LocalDB file instance and run EF against that file during integration tests.
A few notes:
We are using EF database-first
We already have an SSDT project that we will use to deploy to a full database in our different environments
I know that SSDT internally uses a LocalDB instance to build/deploy/check for errors, so deploying to another custom LocalDB seems like it should make sense/be doable
A few questions:
Can I deploy to a specific LocalDB file with SSDT?
Can I do this from the command line in order to automate it when I run integration tests?
Does this roughly seem like a good idea for integration tests with EF, or is there a better way? ;-)
Thank you all

You can change the LocalDB instance SSDT uses in the Debug options for the project. By default the debug options are set to the (localdb) instance and a DB name that corresponds to the project.
You may have more success with Publish Profiles if you're trying to push the project changes to a DB server. You can use those with SQLPackage to push the changes along with a known set of options to a pre-defined server/database.
You can definitely push the changes through a command line. We're doing it with MSBuild to generate a dacpac file, then SQLPackage to publish the changes from the dacpac to the appropriate server/database.
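For illustration, that flow might look roughly like this (a sketch, not a drop-in script: the project name, paths, and LocalDB instance are placeholders - VS2012-era LocalDB is typically (localdb)\v11.0 - and pointing SqlPackage at a specific .mdf via AttachDbFilename is something to verify against your own setup):

# Build the SSDT project into a dacpac (project name/paths are hypothetical)
& msbuild .\MyDatabase.sqlproj /t:Build /p:Configuration=Release
# Publish the dacpac to a specific LocalDB file via the target connection string
& SqlPackage.exe /Action:Publish /SourceFile:".\bin\Release\MyDatabase.dacpac" /TargetConnectionString:"Data Source=(localdb)\v11.0;AttachDbFilename=C:\Tests\MyDatabase.mdf;Integrated Security=True"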
Can't say for sure on this one. If it works for you, it's likely a good start. We do DB development outside of EF and try to do that first rather than trust EF to generate a good relational model.
I have a handful of blog posts on SSDT SQL Projects at http://schottsql.blogspot.com/search/label/SSDT that might be helpful.

Related

Find task names of an SSIS file or find the XML code of an SSIS package from SQL Server 2019 using T-SQL

There is a Windows Server at my company which runs a SQL Server 2019 instance. I have deployed an Integration Services solution (project deployment model) with some SSIS packages. I wondered if there is any way to get the data flow task names from the packages, or to get the XML file body in order to extract the names using T-SQL. If none of the above is possible, I would like to know in which directory the actual dtsx files are stored on the Windows Server when you deploy a solution in SQL Server. I have searched a lot for the above but cannot find any answer.
Thanks in advance.
How is the package deployed? If it's File System, you could likely do something like PowerShell and SQL. I can't, personally, remember if packages deployed in msdb are encrypted (I haven't used that deployment method since 2012). However, if they are deployed via SSISDB you won't be able to query the packages stored in the database, as they are all encrypted. You'd need to inspect the source packages (in your source-controlled project).
From a comment by @Larnu
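For the File System case mentioned above, a rough PowerShell sketch (the folder path is a placeholder, and this assumes the 2012+ package format, where the package and its tasks are DTS:Executable elements carrying a DTS:ObjectName attribute):

# List task (and package) names from .dtsx files on disk (folder is hypothetical)
$ns = @{ DTS = 'www.microsoft.com/SqlServer/Dts' }
Get-ChildItem -Path 'C:\SSIS\Packages' -Filter *.dtsx -Recurse | ForEach-Object {
    Select-Xml -Path $_.FullName -XPath '//DTS:Executable' -Namespace $ns |
        ForEach-Object { [pscustomobject]@{ File = $_.Path; Name = $_.Node.GetAttribute('ObjectName', $ns.DTS) } }
}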

Entity Framework 6 .NET Framework Migrations / Package Manager Console - How Do You Run These In An Azure Pipeline?

I am setting up an Azure Release Pipeline and I need to execute any pending DB Migrations as part of the release.
I have been scouring the internet for over an hour, and everything I can find is about .NET Core, while the database is EF6 on .NET Framework, not .NET Core (I've done this several times before for Core).
The problem, as I see it, is that EF6 works using Visual Studio's built-in Package Manager Console - this just doesn't exist in an Azure Pipeline; it's a Visual Studio weirdness.
There seem to be several ways I can skin this cat, in my head, but I can't figure out how to start with any of them within the context of the pipeline...
OPTION 1: Run the Migrations on the Pipeline - but... how?
OPTION 2: SQL Scripts - Requires running the Package Manager to generate them so they can be run (if I could do that on the pipeline then I'd just run it anyway, so these would have to be created locally and committed with the code, which is somewhat backward as a solution IMO)
OPTION 3: Write a console app - Do I really have to do this??
You can try Entity Framework Migration Extensions.
This task allows a Build / Release to provide database connection parameters and execute an Entity Framework 6 migration against the database.
Build your project to an output folder and include the migrate.exe executable that comes with Entity Framework 6.
Create an automated build that packages up your files and makes them accessible during a Release.
Create a Release definition for the relevant Build
Add an EF6 Migration task. Once that task is added to an environment within the release, you'll need to enter the appropriate parameters to configure it. All file path parameters should be within the file system for the build; none of them are TFS source control paths.
You can also check this article.
The answer here is to use the ef6.exe command line tool and make sure it gets shipped with your build.
This could be useful to anyone here until Microsoft update the non-existent docs: http://github.com/dotnet/EntityFramework.Docs/issues/1740 - This contains a table with a kind of translation matrix between the two.
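As a concrete shape for the migrate.exe route from the steps above (a sketch; the assembly name, config path, and connection string are placeholders, and note that /connectionProviderName is required whenever /connectionString is supplied):

# Apply pending EF6 migrations from a pipeline step (names/paths are hypothetical)
& .\migrate.exe MyApp.dll /startUpConfigurationFile=".\MyApp.dll.config" /connectionString="Server=myserver;Database=MyDb;Integrated Security=True" /connectionProviderName="System.Data.SqlClient"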

"Update Model from Database" as build step

I am working on an application that utilizes a database that often has tables added to it or modified. Is there a way I can regenerate the .edmx file as a build step or during compile time to add these new tables/modifications without manually running the wizard?
You can try running SQL scripts to insert/modify tables during the build process.
Related extension:
ExecuteSqlScript
Run SQL Server Scripts Task
Or directly use PowerShell to Execute .SQL Files from Directory.
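For instance, that last approach can be as small as this sketch (it assumes the SqlServer PowerShell module for Invoke-Sqlcmd; the folder, server, and database names are placeholders):

# Run every .sql file in a folder, in name order, against a target database
Get-ChildItem -Path '.\SqlScripts' -Filter *.sql | Sort-Object Name | ForEach-Object {
    Invoke-Sqlcmd -ServerInstance 'myserver' -Database 'MyDb' -InputFile $_.FullName
}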
Reference the articles below on changing the database during build/release:
Build and Release Process for SQL Server Database Scripts using Online TFS
Continuous Deployment of SQL Server Database Changes using Visual Studio & TFS Release Manager
UPDATE:
Choosing Update Model from Database is the best method for updating your EDMX; note, though, that there are certain properties that don't get updated on the conceptual layer.
See How do you update an edmx file with database changes?
It seems there isn't a good way to achieve that; however, if the actions can be executed from the command line, then we can add a step to run the command or script.
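One command-line candidate, if it fits your model, is EdmGen.exe from the .NET Framework SDK; it can regenerate model artifacts from a database, but it emits .csdl/.ssdl/.msl files (plus object-layer code) rather than refreshing a single .edmx, so treat this as a sketch rather than a wizard replacement (names and the connection string are placeholders):

# Regenerate model artifacts from the database (not a true .edmx refresh)
& EdmGen.exe /mode:FullGeneration /project:MyModel /provider:System.Data.SqlClient /c:"Server=myserver;Database=MyDb;Integrated Security=True"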

Update Model from Database, with LocalDb and |DataDirectory|

We're using EF 6, in database-first mode.
We've been building a database migration package, using FluentMigrator, and we've been running our migrations against empty LocalDb databases in our solution, for use in integration testing, etc.
Because we need our solutions to build wherever they're checked out, we've been using |DataDirectory| in the connection strings.
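(For reference, such a connection string looks roughly like this; the file name is a placeholder and the LocalDB instance name varies by version:)

Data Source=(LocalDb)\v11.0;AttachDbFilename=|DataDirectory|\MyDb.mdf;Integrated Security=True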
We've written some unit [sic] tests, running against LocalDb instances, running the migrations first to ensure that the tests are running against the current schema. (Yes, these are integration tests, but we're using a unit test framework, and I don't want to argue about it.)
What I would like to do next is to change our configuration so that when we run "Update Model from Database" on our .edmx files, it's the LocalDb instance that the tool looks to. And I can't make that work.
My guess is that whatever tool is running, when we do this, doesn't have its DataDirectory set where I need it to be, in its appdomain.
Does anyone know what tool this might be? And how I can configure it to connect to a localdb instance, that's using |DataDirectory| to create a relative path?

add-migration does not function with remote SQL Server databases in shared hosting

It looks like CodeFirst stops doing its homework when it doesn't have full control of the database (I suppose).
The scenario is a web site hosted on Arvixe.com (or, I suppose, any other shared hosting server), where I have to create databases only from their control panel (and NOT with SQL Server Management Studio, just to say...).
Once I've created an empty database, I register a connection in the web site and use it to generate the database from POCO objects, like so:
add-migration m1 -targetdatabase myconnection
This correctly generates my FIRST migration, which I can apply without problems with
update-database -targetdatabase myconnection
The first concern, not too important, is that since the database already exists, it will NOT issue the Seed command, so I have to insert my first records by hand, but this is not a great problem.
Then I change my POCO objects and need to update the database, but when I issue ANOTHER
add-migration m2 -targetdatabase myconnection
it gives the error:
System.Data.Entity.Migrations.MigrationsPendingException: Unable to generate an explicit migration because the following explicit migrations are pending: [201111081426466_m1]. Apply the pending explicit migrations before attempting to generate a new explicit migration.
This is really strange, since if I look at the database I can even see the __MigrationHistory table, but then it looks like it cannot recognize it...
Does anyone have the same problem, or a good tip on where to investigate?
Thanks in advance,
Andrea Bioli
I had this problem. I was able to resolve it by providing a connectionString and a connectionProviderName parameter to both the Update-Database and the Add-Migration commands.
If you have many projects in your solution with multiple config files, the Package Manager seems to get confused. In my case, I had one project selected as the default project for the Package Manager Console, but it was pulling the connection string from the Visual Studio solution's default start-up project instead.
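For example, the shape of that fix in the Package Manager Console (a sketch; the migration name and connection string are placeholders for your own):

Add-Migration m2 -ConnectionString "Server=myhost;Database=MyDb;User Id=me;Password=secret" -ConnectionProviderName "System.Data.SqlClient"
Update-Database -ConnectionString "Server=myhost;Database=MyDb;User Id=me;Password=secret" -ConnectionProviderName "System.Data.SqlClient"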