I have a solution with several projects that target .NET Framework 4.7.2. One of the projects contains all of my Entity Framework models and the DbContext (EF v6.4.4). It also uses the Microsoft.AspNet.Identity.EntityFramework (v2.2.3) extensions.
The solution also contains an API layer, a common layer (shared DTO models, helper functions, etc.), a WebJob project, and a unit test project.
I need to add an Azure Function project to process messages from a Service Bus queue (which receives messages from our ERP system) and update our front-facing e-commerce web app.
I need to get with the times and become more familiar with .NET Core / .NET Standard project types. My issue is that the Azure Function project targets .NET 6.0, and I want to reference my Entity Framework project from it.
I have been reading that with later versions of .NET Core / Standard, there is "more" compatibility between those and .NET Framework projects.
I was able to reference everything in my Azure Function project and get it to compile without much trouble. However, when I try to do anything with Entity Framework (read from or write to the DB), I get an exception like this:
System.Private.CoreLib: Exception while executing function: InboundFunctions. EntityFramework: The type initializer for 'MethodCallTranslator' threw an exception. System.Private.CoreLib: Value cannot be null. (Parameter 'key').
After digging some more, I seem to have an issue with the Microsoft.AspNet.Identity.EntityFramework package. I had to include it in my Azure Function project to satisfy the implementation of the IdentityDbContext. According to VS, this is the library that raises the compatibility warning.
At the moment, I am not sure how to resolve this. Would it be easier to make a new DbContext for just the .NET Core stuff or is there a better approach? I don't need any of the Identity stuff in my Azure functions project - it's really only used in my API layer.
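For reference, if a second, Identity-free context turns out to be the cleaner route, a minimal sketch (assuming EF 6.4.4 running on .NET 6; the entity and class names below are placeholders) could look like this:

```csharp
using System.Data.Entity;

// Placeholder entity; in practice you would reuse the POCO classes
// from the existing model project.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Slimmed-down context for the Azure Function project. It maps only the
// entities the function needs and does NOT inherit from IdentityDbContext,
// so Microsoft.AspNet.Identity.EntityFramework is not required here.
public class FunctionsDbContext : DbContext
{
    // EF6 accepts a raw connection string (or a config connection name).
    public FunctionsDbContext(string connectionString)
        : base(connectionString)
    {
    }

    public DbSet<Product> Products { get; set; }
}
```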
Thank you for your help!
Related
I'm trying to create a simple Blazor client-server app using EF, similar to this article.
So I've got a client, server, and common libraries, and this worked fine. But then I added the EF component to the common library, so that I could use real data from my database, instead of toy data from the demo.
I tried making them all Core 3.0, but this doesn't work because Blazor seems to require .NET Standard 2.0. Without that, I get all kinds of errors.
But then the common library can't use EF, because (if I'm reading this right) EF6 isn't supported on Standard 2.0. If I try, I again get tons of errors.
So I'm not sure, but I can't find any scenario that would allow me to share EF objects between client and server--which is a major rationale for Blazor.
Is there some other way to accomplish this?
The shared library should not use or reference EF.
Add EF to the Server project only and make the data available through an API controller.
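A rough sketch of that split (a sketch only, assuming EF Core on the server; every type, property, and route name below is made up):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

// Shared/Common project: a plain DTO with no EF reference.
public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Server project only: EF Core entity and context (placeholder names).
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<Product> Products { get; set; }
}

// Server project: API controller that exposes the data to the Blazor client.
[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    private readonly AppDbContext _db;

    public ProductsController(AppDbContext db) => _db = db;

    [HttpGet]
    public async Task<List<ProductDto>> Get() =>
        await _db.Products
            .Select(p => new ProductDto { Id = p.Id, Name = p.Name })
            .ToListAsync();
}
```

The Blazor client then talks to api/products over HTTP and only ever sees ProductDto, so the common library never needs an EF reference.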
You should make the common project target .NET Standard and use EF Core (not EF 6).
I have created a new application using the .NET Core SPA template. To this solution I want to add another project to handle the database connection (the DAL).
When I add an ASP.NET Core Web Application to this solution and then try to add an ADO.NET Entity Framework template to it, it doesn't appear in the Data section:
So I ended up adding a Class Library (.NET Framework),
and to that I can add an ADO.NET Entity Data Model.
So now the solution has two projects: one is .NET Core 2.1 for the APIs, models, and views (Angular).
The second project is a .NET Framework 4.6.1 class library.
My question is: is it supposed to be like that?
Is it a good idea to mix different frameworks?
Please see this article regarding what each framework is, and what each is specifically designed for.
https://learn.microsoft.com/en-us/dotnet/standard/frameworks
In a nutshell, your requirements drive which framework you choose.
I would recommend sticking with EF Core (just my personal opinion, take it or leave it). The EF Core database-first approach is only recommended if you require a one-time migration from a source database. Microsoft Doc
If you need to CONTINUE working with an entity model past the first migration, it would be in your best interest to use Entity Framework 6 in a .NET Framework library like you have. But that doesn't stop you from using EF Core as your ORM, because you can indeed have .NET Core reference .NET Framework.
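If you do go the EF Core route, a minimal sketch of what the DAL could look like inside the .NET Core project itself (placeholder names; assumes the Microsoft.EntityFrameworkCore.SqlServer package):

```csharp
using Microsoft.EntityFrameworkCore;

// Hypothetical entity mirroring a table you would otherwise have
// generated with the EDMX designer.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// EF Core context in a .NET Core / .NET Standard DAL project,
// so no .NET Framework class library is required.
public class ShopContext : DbContext
{
    public ShopContext(DbContextOptions<ShopContext> options) : base(options) { }

    public DbSet<Customer> Customers { get; set; }
}
```

If you want to start from the existing database, the `dotnet ef dbcontext scaffold` command can generate entity classes and a context like these from a connection string in one pass.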
I have a database that I created using EF6. I have a VS project (library) that includes only my models and DbContext. Whenever I need to use my database I just reference that library DLL.
I have a few questions about that:
What happens if I lose this DLL somehow, but still have my models? Am I able to recreate my library?
What if I want to start using .NET Core? As far as I understand, I would have to use EF Core, right? How can I get the same experience as I had with my DLL (same models)?
As long as you still have the code for your models, you can simply recreate the DLL. It gets recreated every time you rebuild anyway.
You can also use the full Entity Framework together with .NET Core, but that would make your application depend on the classic .NET Framework again.
Entity Framework Core works similarly in many ways, and a lot of the old annotations work too. You should be able to port your model easily from EF6 to EF Core if it is not too complicated. Just be aware of some limitations regarding GROUP BY that will be resolved in 2.1.
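As a small illustration of such a port (a sketch only; assumes the Microsoft.EntityFrameworkCore.SqlServer provider and a hypothetical Order entity):

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using Microsoft.EntityFrameworkCore;

// The POCO you used with EF6 usually carries over unchanged; common data
// annotations like these are honored by EF Core as well.
[Table("Orders")]
public class Order
{
    [Key]
    public int Id { get; set; }

    [Required, MaxLength(50)]
    public string Number { get; set; }
}

// The base class changes from System.Data.Entity.DbContext to
// Microsoft.EntityFrameworkCore.DbContext, and configuration moves to
// OnConfiguring (or DbContextOptions) instead of a connection-string
// name passed to the constructor.
public class OrdersContext : DbContext
{
    public DbSet<Order> Orders { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer("<your connection string>");
}
```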
Because .NET Core is independent of the OS, you won't be shuffling DLLs around for your dependencies. One approach is to use independent projects and release them as packages, so you can consume them in other projects with the package manager.
I am looking for some best-practice advice on building a self-contained service, that is, a DLL with all of the domain logic and data layer. I would like to use an off-the-shelf CMS, such as Orchard, and then talk to the service to carry out CRUD operations. The service should have its own IoC container and ORM; in this case I am using Ninject and Entity Framework. In this design I will have a separate database from the CMS, and can port the service to other CMS systems when required.
The CMS should start the service and pass it a connection string or file name. Orchard uses different ORM and IoC frameworks, which is what leads me to wanting to keep Ninject and Entity Framework inside the service.
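For illustration, a rough sketch of that shape (all names are hypothetical; assumes Ninject and EF6): the CMS constructs the service with a connection string, and the kernel and context stay internal to the service DLL.

```csharp
using System.Data.Entity;
using System.Linq;
using Ninject;

// Hypothetical entity owned by the service's own database.
public class Order
{
    public int Id { get; set; }
}

// EF6 context internal to the service DLL.
public class ServiceDbContext : DbContext
{
    public ServiceDbContext(string connectionString) : base(connectionString) { }

    public DbSet<Order> Orders { get; set; }
}

// Entry point the host CMS would call; the Ninject kernel and the EF
// context never leak outside the service assembly.
public class OrderService
{
    private readonly IKernel _kernel;

    public OrderService(string connectionString)
    {
        _kernel = new StandardKernel();

        // Bind the context so anything resolved from the kernel gets a
        // context built from the connection string supplied by the CMS.
        _kernel.Bind<ServiceDbContext>()
               .ToMethod(ctx => new ServiceDbContext(connectionString));
    }

    public int CountOrders()
    {
        using (var db = _kernel.Get<ServiceDbContext>())
        {
            return db.Orders.Count();
        }
    }
}
```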
I have set up an experiment where the DbContext and domain are in the service DLL, and I call it from a console app. This only works if I have Entity Framework referenced in the console application, even though I don't use it directly there. Here is the error message when EF is not referenced by the console app:
No Entity Framework provider found for 'System.Data.SqlClient' ADO.NET provider.
Why is this, and what is the best way to solve my design problem?
If your library (DLL) depends on Entity Framework, it's perfectly normal that you need to reference both in your application (whether it's console, web or whatever else). You always need to reference all dependencies.
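If you'd rather not add the EntityFramework package to every host application, a commonly used workaround (assuming EF6 with the SQL Server provider) is to force a compile-time reference to the provider from inside your library, so that EntityFramework.SqlServer.dll gets copied to the host's output folder:

```csharp
using System;
using System.Data.Entity.SqlServer;

// The field is never used at runtime; its only purpose is to create a
// compile-time reference to EntityFramework.SqlServer.dll so the build
// copies it to the output of projects that reference this library.
internal static class ForceEfProviderReference
{
    public static readonly Type ProviderType = typeof(SqlProviderServices);
}
```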
Wiring your custom library with Orchard would be fairly simple. The only thing you'd need to do on the Orchard side would be to register the services coming from your library with Autofac, in order to have them available for dependency injection. This post describes a similar scenario to yours.
Please bear in mind that using multiple database connections is a bit troublesome in Orchard <= 1.6 because of its use of TransactionScope: you need to run all your custom database code in a suppressed scope, otherwise you'll get transaction errors and/or MSDTC-related problems. This will be a non-issue as of Orchard 1.7, which is going to arrive in about a week, so I'd strongly recommend waiting for the new version. You can also fetch the pre-release code from the 1.x branch.
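A minimal sketch of that suppressed scope, reusing the hypothetical ServiceDbContext from the earlier sketch:

```csharp
using System.Linq;
using System.Transactions;

public class CustomQueries
{
    // Run the library's database work in a suppressed TransactionScope so it
    // is not enlisted in Orchard's ambient transaction (relevant for Orchard <= 1.6).
    public int CountOrders(string connectionString)
    {
        using (var scope = new TransactionScope(TransactionScopeOption.Suppress))
        using (var db = new ServiceDbContext(connectionString))
        {
            var count = db.Orders.Count();
            scope.Complete();
            return count;
        }
    }
}
```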
I am developing a (free, open-source) Entity Framework tool. It is basically an ADO.NET provider, but it uses some higher abstractions too (e.g. ObjectContext, EntityConnection). I want it to support almost all legacy versions of EF (EF4 and later). Until EF5 came out this was quite easy, because I was able to develop it by targeting only .NET 4.0.
EF5 made things more complicated, because some of the new features require .NET 4.5, while EF5 still supports .NET 4.0 too. And on top of that, EF is now developed independently from the .NET Framework.
For now, it is obvious that targeting both .NET 4.0 and .NET 4.5 is inevitable. But currently I have no idea what the best way is to set up a multi-target environment that can cope with the independently developed EF, and I haven't found any good documentation about this problem.
Should I use multiple solution files? Multiple project files? Multiple solution configurations? Reference all versions of EF somehow? Create a universal build script? If yes, how? How do I run my unit tests against different configurations? How do I indicate that a test can/should fail in a specific configuration? What about the changed namespaces (e.g. ObjectContext)? Should I use #if directives to resolve this conflict? What if a new EF release requires implementing a feature that breaks compatibility with previous versions? I am really uncertain at this point.
Take a look at the EF6 code base at http://entityframework.codeplex.com/. We build EF6 for .NET 4 and .NET 4.5 in essentially the way you are suggesting--using multiple build configurations.
Some other points to consider:
If you don't make use of any .NET 4.5 APIs or behaviors, then you may be able to just target the .NET 4 version. If you are using anything from EntityFramework.dll, then this may require a binding redirect to use the 5.0 version, but in a lot of cases if you ship as a NuGet package then NuGet will handle this for you.
If you plan to support EF6, then keep in mind that the core types have been moved out of the .NET Framework. This means, for example, that the EF5 ObjectContext is a different type from the EF6 ObjectContext. You will likely have to compile your provider code twice to create EF6 and EF5 versions in order to handle this. More information can be found here: http://entityframework.codeplex.com/wikipage?title=Rebuilding%20EF%20providers%20for%20EF6
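As a sketch of what that double compilation can look like in source (the EF6 symbol is an arbitrary compilation constant you would define only in the EF6 build configuration):

```csharp
#if EF6
using System.Data.Entity.Core.Objects;   // EF6: ObjectContext moved into EntityFramework.dll
#else
using System.Data.Objects;                // EF4/EF5: ObjectContext lives in System.Data.Entity.dll
#endif

public static class ContextInspector
{
    // The same source compiles against either ObjectContext type,
    // depending on which build configuration is selected.
    public static string GetContainerName(ObjectContext context)
    {
        return context.DefaultContainerName;
    }
}
```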