.NET Core dnx Entity Framework not working correctly - entity-framework

I am using .NET Core to build an application and I am having issues with entity framework. After creating a second migration to update changes made to my models using the "dnx ef migrations add" and the "dnx ef database update" commands, I get errors regarding the attempt to drop foreign key constraints that do not exist. It looks like entity framework is not reviewing the target database before it generates the migrations file.
To try and confirm this I created a brand new database in my development environment and updated my appsettings.json file to target the new database. I then generated another migrations file to check if it would notice the database is blank and create a migrations file to build the schema. It instead created a migrations file with the same issues of trying to drop constraints that do not exist.
Shouldn't entity framework always review the database so it can find the difference between the database schema and model classes?
Thanks

The dnx commands don't exist anymore; they were only part of the .NET Core betas. Migrate your project to the latest .NET Core version (1.1) and use the dotnet commands instead.
To add a migration, use: dotnet ef migrations add <MigrationName>
To update the database: dotnet ef database update
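For the dotnet ef commands to be available at all, the project needs the EF Core tooling referenced. With the 1.x-era csproj format that looks roughly like the following (the exact package versions are assumptions; check NuGet for the ones matching your project):

```xml
<ItemGroup>
  <!-- Design-time services used by the ef commands -->
  <PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="1.1.0" PrivateAssets="All" />
</ItemGroup>
<ItemGroup>
  <!-- Makes "dotnet ef" available from the command line in this project -->
  <DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="1.0.0" />
</ItemGroup>
```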

Related

Sql database migration in .net core (EF)

This is an open question just to understand whether this is possible or not, as I am new to .NET Core.
I add a migration in .NET Core and then run the following code at the start of the application:
context.Database.Migrate();
This automatically applies any pending migrations to the database when the application starts.
Now here is the scenario:
Is it possible to skip this automatic migration, and instead have a page where I can see the pending migrations and apply them by executing some method?
Thanks!
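Yes, the same APIs used at startup can be called on demand. A minimal sketch, assuming an injected context of a hypothetical type MyDbContext (GetPendingMigrations and Migrate are the real EF Core extension methods on Database):

```csharp
using System.Linq;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

public class MigrationsController : Controller
{
    private readonly MyDbContext _context; // hypothetical context type

    public MigrationsController(MyDbContext context) => _context = context;

    // List migrations that exist in code but have not been applied to the database.
    public IActionResult Pending()
    {
        var pending = _context.Database.GetPendingMigrations().ToList();
        return View(pending);
    }

    // Apply all pending migrations on demand instead of at startup.
    [HttpPost]
    public IActionResult Apply()
    {
        _context.Database.Migrate();
        return RedirectToAction(nameof(Pending));
    }
}
```

With this in place you would simply remove the context.Database.Migrate() call from startup and trigger it from the page instead.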

How to Implement ASP.NET Core Identity via Database First

I have a SQL Server database with ASP.NET Identity tables that was built using Entity Framework Core code-first. I now need to use that database with another project that requires ASP.NET Identity. What is the best approach that preserves the existing data in the new project? The issue is how to set up the AppContext : IdentityDbContext<[User Class]>.
Thank you.
When you have a database ready and want to use it in your project, you should use reverse engineering.
In the dotnet CLI you can use this command:
dotnet ef dbcontext scaffold "Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=Chinook" Microsoft.EntityFrameworkCore.SqlServer
or in the Visual Studio Package Manager Console:
Scaffold-DbContext 'Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=Chinook' Microsoft.EntityFrameworkCore.SqlServer
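Alternatively, since the Identity tables were created code-first with the default schema, a new project can often point an IdentityDbContext directly at the existing database; as long as the model matches the existing AspNetUsers and related tables, no data is touched. A sketch, where ApplicationUser and the connection string name are assumptions:

```csharp
using Microsoft.AspNetCore.Identity.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore;

// Hypothetical user class; its properties must match the existing AspNetUsers schema.
public class ApplicationUser : IdentityUser { }

public class AppContext : IdentityDbContext<ApplicationUser>
{
    public AppContext(DbContextOptions<AppContext> options) : base(options) { }
}

// In Startup.ConfigureServices, point the context at the existing database:
// services.AddDbContext<AppContext>(o =>
//     o.UseSqlServer(Configuration.GetConnectionString("ExistingIdentityDb")));
// services.AddIdentity<ApplicationUser, IdentityRole>()
//     .AddEntityFrameworkStores<AppContext>();
```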

What is a good action plan for database first connect data from postgresql to a c# project with dot net core and using entity framework core

I need to connect a C# project to PostgreSQL, database-first, using .NET Core and Entity Framework Core.
I am familiar with MVC with SQL Server and the full Entity Framework, database-first (in that case the Entity Data Model Wizard makes things easier). I am not familiar with EF Core, .NET Core, or PostgreSQL.
I have successfully done a code-first connection from a C# console app to PostgreSQL using .NET Core, EF Core, and Npgsql.
Now I need to do a database-first web application. Should I try to edit my existing code-first console app to try database-first, or should I build a new MVC project from scratch?
Also, if I start from scratch, what would be a good sequence to try? For example:
1. work through an Entity Framework Core tutorial first (which uses SQL Server and is code-first only),
2. then see how to do database-first using reverse engineering,
3. then replace SQL Server with PostgreSQL.
Or are there other methods that are better?
Scaffolding works with PostgreSQL.
Create a project that will hold your database entities and context. Install the EF Core tools and Npgsql for this project.
Then, inside the new project, run this command from the command line:
dotnet ef dbcontext scaffold "Host=localhost;Database=databaseName;Username=user;Password=password" Npgsql.EntityFrameworkCore.PostgreSQL -t table1 -t table2 -o outputFolder
You don't have to pass the -t parameters if you want to scaffold the whole database.
If an error occurs, try the --force argument.
You should then be able to use the database via the context created by EF Core.
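Once scaffolding succeeds, the generated context can be used like any other EF Core context. A sketch of what that might look like (the context and entity class names are assumptions; the scaffolder derives them from the database and table names, and bakes the connection string into OnConfiguring):

```csharp
using System;
using System.Linq;

// Names below are illustrative: a database "databaseName" with a table "table1"
// typically yields a databaseNameContext and a Table1 entity class.
using (var context = new databaseNameContext())
{
    // Query the scaffolded table; columns become properties on the entity class.
    var rows = context.Table1.Take(10).ToList();
    foreach (var row in rows)
    {
        Console.WriteLine(row);
    }
}
```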

EntityFramework PowerTools ReverseEngineer no mapping POCO classes

I'm trying to use Entity Framework Power Tools Reverse Engineer on an existing database in SQL Server. It reports success and the context file is generated, however no Mapping folder with the POCO classes is created.
I'm running this on an MVC4 project in VS2012; could the problem be that it's not allowed to add these to the Models folder?
Make sure every table in your database has a primary key set up.

Is it possible to use EF code-first database initializers in Migrator.NET migrations?

I'm using Migrator.NET to manage schema changes in our production environment. Because I've been using EF code-first, all database development has been driven incrementally from the code-first classes, and no migrations have been applied to the project.
However, I'd like to start using migrations once the project is in a production environment. As the baseline 'up' migration, I'd like to use code-first's database initializer to create the database and prime it with default data. However, I'm having problems because the EF context classes and my wrapper classes for the EF initializers target .NET 4, whereas Migrator.NET targets .NET 2.
When running the migrator console app, I'm getting 'This assembly is built by runtime newer than the currently loaded runtime...'
Am I expecting too much for this to work? I could use OSQL and create the SQL script on the server, but it would be nice if this worked just as it does in the development environment.
Hmm, weird. Even if the migratordotnet binary targets .NET 2, you should be able to use it. I worked on a project where we did just this: we used EF code-first to generate the schema automatically if it didn't exist, and otherwise ran the migrations against the existing schema (we started creating the migration steps while still in the dev phase).
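The create-or-migrate approach described above can be sketched roughly like this (EF 4.1+ code-first; MyContext is a hypothetical DbContext, and Database.Exists/Database.Create are the real EF APIs):

```csharp
using System.Data.Entity;

// At deployment time, decide between code-first creation and migrations.
using (var context = new MyContext())
{
    if (!context.Database.Exists())
    {
        // Fresh environment: let code-first build the schema and seed default data.
        context.Database.Create();
    }
    else
    {
        // Existing environment: run the Migrator.NET migrations against the
        // current schema instead (invoked here via your own wrapper/runner).
    }
}
```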