Entity Framework and System.Data.Common

I maintain an application that is designed to work with the common database provider model (System.Data.Common), so it stays flexible about which database it connects to.
Since I need to implement some modules for this application: is it possible to use Entity Framework and make it work with the common DB provider of .NET?
I could not find any method of using both together on the net, nor any guidelines on best practices. Could anyone help me with this issue, either by telling me whether it's possible or by pointing me to some resources on how to manage it?
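For context, EF6 can be handed any System.Data.Common.DbConnection, so the concrete ADO.NET provider can stay behind the common provider model; note that EF additionally needs an EF provider registered for the target database, so this only helps where one exists. A minimal sketch, assuming EF6 and example provider/type names:

```csharp
using System.Data.Common;
using System.Data.Entity;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerContext : DbContext
{
    // EF6's DbContext accepts any DbConnection, so the concrete
    // provider stays hidden behind System.Data.Common.
    public CustomerContext(DbConnection connection)
        : base(connection, contextOwnsConnection: true) { }

    public DbSet<Customer> Customers { get; set; }
}

public static class ContextFactory
{
    public static CustomerContext Create(string providerName, string connectionString)
    {
        // DbProviderFactories resolves the registered factory by its
        // invariant name, e.g. "System.Data.SqlClient".
        DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);
        DbConnection connection = factory.CreateConnection();
        connection.ConnectionString = connectionString;
        return new CustomerContext(connection);
    }
}
```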

Can't find the best way to apply best-practice code design techniques in software development

[Pre]
I have to say that I'm a complete newbie who is trying to put together important puzzle pieces such as DDD, TDD, MVVM, and EF Core. I have about 10 years of Windows Forms development experience, acquired in a completely wrong manner, and after I joined Pluralsight I understood that I had simply wasted my last 10 years, which is really sad :).
[Problem description]
I have an app that I want to rewrite from scratch using the latest and greatest techniques I've learned over the last 6 months on Pluralsight, but the problem is that this new knowledge is stopping me, because I'm simply afraid that I'll do it wrong again... (that is stupid, I know, but it is what it is).
So back to my questions. I have a big problem domain and pretty well documented business logic, which I have to turn into code. I understand that my starting point is designing the data layer, and for this purpose I want to use Entity Framework Core (I saw Julie Lerman's course on Pluralsight; I think she is amazing, and she inspired me to use EF Core as the ORM for my app). But at the same time my lack of experience produces more questions than what I've learned on Pluralsight answers, so I will try to write them all down (please don't judge me too hard):
1. It looks like I will need 2 or even more data model projects in my solution, and here is why: I have multiple document set types, and each type contains more than one reference book used to generate unique file names and data sheets. But it looks weird to me to have 3 data model projects such as MyApp.PackType1.DataModel, MyApp.PackType2.DataModel, and so on, each with EF Core installed and each generating its own database from the DbContext it defines. Isn't that very redundant, or is this the correct way?
2. I don't understand how to join these multiple data model projects, including a Shared Kernel, into one nice model.
3. I don't understand the best way to design my data classes. Should they be just POCOs, or can I make them nice-looking classes with private fields and public properties? What are the best practices here?
4. I also don't understand the best practice for using the MVVM pattern on top of all that, and whether MVVM is applicable at all in this case.
5. Should I keep my tests in separate projects like MyApp.PackType1.DataModel.Tests, or keep them in the same project?
Best regards,
Maks!
P.S.
I apologize for the unclear definitions and questions; English isn't my native language.
It's very complicated to answer your question because you have asked about a lot of details, but I'm going to provide a brief answer and I hope it will be helpful.
1. You can have only one model for your entities (DDD) and create sub-models from this model in your end-level projects (Web API or UI).
2. Read point #1.
3. You have to create an entity layer project that represents your database, and then you can create DTOs for specific scenarios.
4. From my point of view, use Angular; you can use another UI framework such as React or Vue.js, but I prefer Angular for building UI interfaces and consuming a .NET Core Web API from the client.
5. Create unit tests and integration tests for your Web API projects; additionally, you can use the in-memory database provider for tests (see the sketch below).
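To illustrate that last point, here is a minimal sketch of a test against the EF Core in-memory provider (assuming the Microsoft.EntityFrameworkCore.InMemory package and xUnit; OrderContext and Order are placeholder names):

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;
using Xunit;

public class Order
{
    public int Id { get; set; }
    public string Number { get; set; }
}

public class OrderContext : DbContext
{
    public OrderContext(DbContextOptions<OrderContext> options) : base(options) { }
    public DbSet<Order> Orders { get; set; }
}

public class OrderContextTests
{
    [Fact]
    public void Added_order_can_be_read_back()
    {
        // The in-memory provider stands in for a real database in tests.
        var options = new DbContextOptionsBuilder<OrderContext>()
            .UseInMemoryDatabase(databaseName: "OrderTestDb")
            .Options;

        // Arrange: seed one order in the in-memory store.
        using (var context = new OrderContext(options))
        {
            context.Orders.Add(new Order { Number = "A-001" });
            context.SaveChanges();
        }

        // Assert: a fresh context over the same named store sees it.
        using (var context = new OrderContext(options))
        {
            Assert.Equal(1, context.Orders.Count());
        }
    }
}
```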
Maybe this guide is useful: https://www.codeproject.com/Articles/1160586/Entity-Framework-Core-for-Enterprise
Regards
Hm, multiple DbContexts (models) usually come about when you have distinct databases you are using. The general rule is one context = one database. Exceptions can occur when there are a lot of tables that can be grouped functionally, but there are downsides to that approach.
A DbContext is effectively a repository pattern over individual tables. Using a Unit of Work pattern, layered with a custom repository provider, would allow you to make it "appear" as a single database, hiding the complexity from the front-end.
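A minimal sketch of that layering, assuming EF Core (the interface shapes here are placeholders, not an existing API):

```csharp
using System;
using Microsoft.EntityFrameworkCore;

public interface IRepository<T> where T : class
{
    void Add(T entity);
    T Find(params object[] key);
}

public interface IUnitOfWork : IDisposable
{
    IRepository<T> Repository<T>() where T : class;
    int Commit();
}

// Wraps a single DbContext; the front-end sees only IUnitOfWork,
// so it never knows which database (or how many) sits underneath.
public class EfUnitOfWork : IUnitOfWork
{
    private readonly DbContext _context;
    public EfUnitOfWork(DbContext context) => _context = context;

    public IRepository<T> Repository<T>() where T : class
        => new EfRepository<T>(_context.Set<T>());

    public int Commit() => _context.SaveChanges();
    public void Dispose() => _context.Dispose();

    private class EfRepository<T> : IRepository<T> where T : class
    {
        private readonly DbSet<T> _set;
        public EfRepository(DbSet<T> set) => _set = set;
        public void Add(T entity) => _set.Add(entity);
        public T Find(params object[] key) => _set.Find(key);
    }
}
```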
Your entity descriptions are usually created as straight POCOs. You can get creative with different DTOs.
In a nutshell, an MVVM-style request goes like this:
1. Request from the UI to a controller.
2. The controller possibly issues multiple calls to the data layer to gather data.
3. The data is assembled into a single ViewModel (everything the page needs).
4. Return to the UI.
The beauty of the approach is a single round trip (request/response) to the UI.
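A minimal sketch of that flow, with no particular UI framework assumed (all service and model names here are placeholders):

```csharp
using System.Collections.Generic;

public interface ICustomerService { CustomerDto GetById(int id); }
public interface IOrderService { IReadOnlyList<string> GetRecentNumbers(int customerId); }
public class CustomerDto { public string Name { get; set; } }

public class CustomerPageViewModel
{
    public string CustomerName { get; set; }
    public IReadOnlyList<string> RecentOrderNumbers { get; set; }
}

public class CustomerPageController
{
    private readonly ICustomerService _customers;   // data-layer call #1
    private readonly IOrderService _orders;         // data-layer call #2

    public CustomerPageController(ICustomerService customers, IOrderService orders)
    {
        _customers = customers;
        _orders = orders;
    }

    // One request in, one ViewModel out: everything the page needs is
    // assembled here, so the UI makes a single round trip.
    public CustomerPageViewModel GetPage(int customerId)
    {
        var customer = _customers.GetById(customerId);
        var orders = _orders.GetRecentNumbers(customerId);
        return new CustomerPageViewModel
        {
            CustomerName = customer.Name,
            RecentOrderNumbers = orders
        };
    }
}
```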
Separate project, in my opinion. There are techniques to spoof the database connection with EF so you are not testing against "live" data.
That CodeProject article will come in handy.

Is it feasible to build company specific framework that wraps NHibernate?

I have heard that companies using Java technologies often build their own custom framework that wraps Hibernate. Is it really feasible for their .NET peers to do the same thing with NHibernate or Entity Framework?
This is almost always a horrible idea - I think Ayende sums it up best in this article. In general, you should consider NHibernate itself to be the "wrapper" around your data access - attempting to build an abstraction layer on top of it is probably going to be a losing proposition.
Actually, you should check out some of the articles on .NET Junkie's weblog. He wrote several great posts on how to deal with repositories, queries, commands and so on. We've been using these in a very large enterprise system where we switch between an in-memory dictionary, an in-memory SQLite database and a production environment using SQL Server or Oracle. Obviously, we use NHibernate for this.
I use the repository pattern and a separate project/DLL to abstract away the data framework (NHibernate / Entity Framework). This is a good starting point: http://codebetter.com/petervanooijen/2008/04/04/wrapping-up-nhibernate-in-repositories/
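A minimal sketch of such a repository project, assuming NHibernate (the interface shape and type names are placeholders):

```csharp
using NHibernate;

public interface ICustomerRepository
{
    Customer GetById(int id);
    void Save(Customer customer);
}

// This project is the only place that references NHibernate; callers
// depend on the interface alone, so the ORM could be swapped out.
public class NhCustomerRepository : ICustomerRepository
{
    private readonly ISession _session;
    public NhCustomerRepository(ISession session) => _session = session;

    public Customer GetById(int id) => _session.Get<Customer>(id);

    public void Save(Customer customer)
    {
        using (var tx = _session.BeginTransaction())
        {
            _session.SaveOrUpdate(customer);
            tx.Commit();
        }
    }
}

public class Customer
{
    public virtual int Id { get; set; }      // virtual: NHibernate lazy-loading proxies need it
    public virtual string Name { get; set; }
}
```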

Should CSLA be used with a dependency injection framework?

My development team is evaluating the various frameworks available for .NET to simplify our programming, one of which is CSLA. I have to admit to being a bit confused as to whether CSLA would benefit from being used in conjunction with a dependency injection framework such as Spring.NET or Windsor. If we combined one of those two DI frameworks with, say, Entity Framework to handle ORM duties, does that negate the need for or benefit of using CSLA altogether?
I have various levels of understanding of all these frameworks, and I'm trying to get a big picture of what will best benefit our enterprise architecture and object design.
Thank you!
CSLA is a framework for creating business entities, so it has separate concerns from an IoC container or an ORM. In an enterprise application you should consider the benefits of all three.
In particular, you should consider CSLA if you want data binding built into your models, dirty checking, N-level undo, validation and business rules, as well as the data portal implementation, which allows easy configuration for n-tier deployments.
Short answer: Yes.
Long answer: It requires a bit of grunt work and some experimentation to set up, but it can be done without fundamentally breaking CSLA. I put together a working prototype using StructureMap and the repository pattern, and used the BuildUp method (setter injection) to inject within CSLA. I used a method similar to the one found here to ensure that my business objects are re-injected when they are deserialized.
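A rough sketch of that setter-injection approach, assuming StructureMap and CSLA (the IoC holder class and repository types are illustrative, and the [SetterProperty] attribute's namespace varies across StructureMap versions):

```csharp
using System;
using Csla;
using StructureMap;

public interface ICustomerRepository { string GetNameById(int id); }
public class CustomerRepository : ICustomerRepository
{
    public string GetNameById(int id) => "..."; // placeholder data access
}

// A Registry keeps container configuration grouped (client/server/global).
public class ClientRegistry : Registry
{
    public ClientRegistry()
    {
        For<ICustomerRepository>().Use<CustomerRepository>();
    }
}

// Hypothetical composition-root holder for the configured container.
public static class IoC
{
    public static IContainer Container { get; } = new Container(new ClientRegistry());
}

[Serializable]
public class CustomerEdit : BusinessBase<CustomerEdit>
{
    // [SetterProperty] marks the property for StructureMap's BuildUp.
    [SetterProperty]
    public ICustomerRepository Repository { get; set; }

    private void DataPortal_Fetch(int id)
    {
        // The data portal constructs (or deserializes) this object itself,
        // so dependencies are re-injected here via BuildUp.
        IoC.Container.BuildUp(this);
        var name = Repository.GetNameById(id);
        // ...load business object fields from the fetched data...
    }
}
```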
I also use StructureMap's Registry base class to separate my configuration into presentation, CSLA client, CSLA server, and CSLA global settings. This way I can use the linked-file feature of Visual Studio to include the CSLA server and CSLA global configuration files within the server-side data portal, and the configuration will always be the same in both places. This ensures I can still change the data portal configuration in CSLA from 2-tier to 3-tier without breaking anything.
Anyway, I am still weighing the potential benefits against the drawbacks of using DI, but so far I am leaning toward using it because testing will be much easier, although I am skeptical of trying to use any of the advanced features of DI such as interception. I recommend reading the book Dependency Injection in .NET by Mark Seemann to understand the right and wrong ways to use DI, because there is a lot of misinformation on the Internet.

Creating a Custom Entity Framework for Unsupported System

I could be totally misunderstanding Entity Framework here. I want to use it in my latest project (how else do you learn?). The problem is that the IBM i driver doesn't have support for it built in. Is it possible to create that framework support from scratch? Is it worth it?
It sounds like you'd be writing your own ADO.NET data provider to connect to IBM DB2 for i. Microsoft provides documentation for creating your own provider and a sample.
The data provider would be responsible for communicating with the database, so I'm not sure how you'd accomplish that: either you'd be implementing your own connection to the database server running on the i (maybe you could port the SQL piece of JTOpen), or you'd be delegating your calls to the IBM-provided data provider (if that's even possible) or some other data access method.
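To give a sense of scale, here is a skeletal sketch of the starting point (names are placeholders; a real provider must also implement DbCommand, DbDataReader, DbParameter, and the rest of the abstract surface):

```csharp
using System;
using System.Data;
using System.Data.Common;

public sealed class Db2iProviderFactory : DbProviderFactory
{
    public static readonly Db2iProviderFactory Instance = new Db2iProviderFactory();

    public override DbConnection CreateConnection() => new Db2iConnection();
    public override DbCommand CreateCommand() => throw new NotImplementedException();
}

public sealed class Db2iConnection : DbConnection
{
    private ConnectionState _state = ConnectionState.Closed;

    public override string ConnectionString { get; set; }
    public override string Database => "";
    public override string DataSource => "";
    public override string ServerVersion => "";
    public override ConnectionState State => _state;

    public override void Open()
    {
        // This is where the real work lives: speak the server's wire
        // protocol yourself, or delegate to an existing data access layer.
        throw new NotImplementedException();
    }

    public override void Close() => _state = ConnectionState.Closed;
    public override void ChangeDatabase(string databaseName) => throw new NotImplementedException();
    protected override DbCommand CreateDbCommand() => throw new NotImplementedException();
    protected override DbTransaction BeginDbTransaction(IsolationLevel isolationLevel)
        => throw new NotImplementedException();
}
```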
I couldn't decide whether I thought this was (1) a huge pain in the butt or (2) an opportunity for an open source project. (I guess it could be both.) It seems like it'd be easier to lobby IBM to make this part of their stock provider. You might complain about it on MIDRANGE-L and see if people will take up the cause.
Disclaimer: I am a newbie in the .NET world, so maybe there's an easier way to accomplish what you're trying to do.

Ado Entity Best Practice

I'm just working on this interesting thing with ADO.NET Entities and need your opinion. Often a solution would be created to provide a service (WCF or web service) that allows access to the DB via the Entity Framework, but I'm working on an application that runs internally and has domain access pretty much all the time. The question is whether it's good practice to create a data service for the application to interface with, or whether I could go from the WPF application directly to the Entity Framework. What's the best practice in this case, and what are some of the pros and cons of the two different approaches?
By using Entity Framework directly, do you mean that the WPF application would connect to the database, or that it would still use services but re-use the entities?
If it's the first approach, I tend to be against this because it means multiple clients connecting to the database, which a) is an additional security concern, b) could make it more expensive from a licensing perspective, and c) means you don't get the benefits of connection pooling. Databases are the most expensive things to scale so I'd try to design the solution to use services and reduce the pressure on the database. But there are times when it's appropriate. One thing I've noticed is that applications which do start out connecting directly tend to get refactored to go via a service later; it seldom happens the other way around. But it might also be a case of YAGNI.
If it's the second approach, I think that's fine. It's common for people looking at WCF to think "service oriented" - that is, there should be a strict contract between services and things shouldn't be shared. But a "multi-tier" application, which is only designed to have one client, is also a perfectly valid architecture and doesn't need to be so decoupled. In that case, reusing the entities on both sides of the service boundary should be fine. However, I'm not sure how easy this is to do with EF specifically, since I haven't used it except in experiments.
It really depends on the level of complexity and the required level of coupling/modularity. I think a good compromise would be to create an EF model in its own library, with a simple level of abstraction on top. In that scenario, if you later chose to change the model to use an exposed service instead of direct access, it shouldn't be a big deal to refactor the existing code, and the new service could utilize the existing library.
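A minimal sketch of that compromise, assuming classic EF (all names are placeholders): the WPF app references only the interface, so swapping the EF-backed implementation for a service-backed one later touches only the composition root.

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public interface IInvoiceData
{
    IReadOnlyList<Invoice> GetOpenInvoices();
}

// Lives in the data library alongside the EF model; the only class
// here that knows about Entity Framework.
public class EfInvoiceData : IInvoiceData
{
    public IReadOnlyList<Invoice> GetOpenInvoices()
    {
        using (var context = new BillingContext())
        {
            return context.Invoices.Where(i => !i.Paid).ToList();
        }
    }
}

public class Invoice
{
    public int Id { get; set; }
    public bool Paid { get; set; }
}

public class BillingContext : DbContext
{
    public DbSet<Invoice> Invoices { get; set; }
}
```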