What does Microsoft recommend for 2nd level Caching in Entity Framework?

I've used the "EF Provider Wrappers" made by Jarek Kowalski. They work fine, but I noticed the "Limitations and Disclaimers" section, where it says:
The providers have not been extensively tested beyond what’s included in the sample code, so you should use them at your own risk.
As with any other sample, Microsoft is not offering any kind of support for it, but if you find bugs or have feature suggestions, please use this blog’s contact form and let me know about them.
I'm a little confused here. Does Microsoft really expect developers to use Entity Framework on production websites without any official support (or recommendation) for 2nd level caching?

There is no official 2nd level cache support. I'm not even sure whether the EF Provider Wrappers are compatible with .NET 4.5. A 2nd level cache is on the backlog for future versions of EF.
You can also implement your own solution, because EF is fully open source.
Btw, I have seen dozens of quite complex websites running in production without any cache ...

There is now a 2nd level cache provider available for EF 6.x:
Entity Framework does not currently support caching of query results. A sample EF Caching provider is available for Entity Framework version 5 and earlier but due to changes to the provider model this sample provider does not work with Entity Framework 6 and newer. This project is filling the gap by enabling caching of query results for Entity Framework 6.1 applications.
https://github.com/moozzyk/EFCache
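Wiring EFCache into an EF6 application is done through a code-based configuration. A minimal sketch, adapted from the EFCache README (the in-memory cache and default policy shown are what the library ships with):

    using System.Data.Entity;
    using System.Data.Entity.Core.Common;
    using EFCache;

    public class CachingConfiguration : DbConfiguration
    {
        public CachingConfiguration()
        {
            // The transaction handler invalidates cached results when the
            // tables they were read from are modified.
            var transactionHandler = new CacheTransactionHandler(new InMemoryCache());
            AddInterceptor(transactionHandler);

            // Wrap the existing provider services with the caching provider.
            Loaded += (sender, args) => args.ReplaceService<DbProviderServices>(
                (s, _) => new CachingProviderServices(s, transactionHandler, new CachingPolicy()));
        }
    }

EF6 discovers a DbConfiguration subclass living in the same assembly as the context, so no extra registration is needed.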
And a Redis provider implemented on top of it:
Extends EFCache by adding Redis support
I wanted to add L2 Cache to EF using Redis - there was nothing available at the time.
I found EFCache written by Pawel Kadluczka (moozzyk) over on CodePlex
https://github.com/silentbobbert/EFCache.Redis
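The wiring is the same as with plain EFCache; only the cache implementation changes. A hedged sketch, assuming the RedisCache constructor shown in the EFCache.Redis README (the endpoint is a placeholder):

    using System.Data.Entity;
    using System.Data.Entity.Core.Common;
    using EFCache;
    using EFCache.Redis;

    public class RedisCachingConfiguration : DbConfiguration
    {
        public RedisCachingConfiguration()
        {
            // Redis-backed cache instead of InMemoryCache; the endpoint
            // "localhost:6379" is a placeholder for your Redis server.
            var cache = new RedisCache("localhost:6379");
            var transactionHandler = new CacheTransactionHandler(cache);
            AddInterceptor(transactionHandler);

            Loaded += (sender, args) => args.ReplaceService<DbProviderServices>(
                (s, _) => new CachingProviderServices(s, transactionHandler, new CachingPolicy()));
        }
    }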

Apache Ignite.NET provides a distributed in-memory 2nd level cache for Entity Framework: https://apacheignite-net.readme.io/docs/entity-framework-second-level-cache
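Per the linked documentation, enabling it for EF6 is a matter of pointing the context at Ignite's configuration type (a sketch; assumes the Apache.Ignite.EntityFramework NuGet package):

    using System.Data.Entity;
    using Apache.Ignite.EntityFramework;

    // IgniteDbConfiguration plugs a distributed Ignite cache in as the
    // 2nd level cache for queries issued through this context.
    [DbConfigurationType(typeof(IgniteDbConfiguration))]
    public class MyDbContext : DbContext
    {
    }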

Related

Can't get Blazor project running with Entity Framework

I'm trying to create a simple Blazor client server app using EF, similar to this article.
So I've got a client, server, and common libraries, and this worked fine. But then I added the EF component to the common library, so that I could use real data from my database, instead of toy data from the demo.
I tried making them all Core 3.0, but this doesn't work because Blazor seems to require .NET Standard 2.0. Without that, I get all kinds of errors.
But then the common library can't use EF, because (if I'm reading this right) EF6 isn't supported on Standard 2.0. If I try, I again get tons of errors.
So as far as I can tell, there is no scenario that would allow me to share EF objects between client and server--which is a major rationale for Blazor.
Is there some other way to accomplish this?
The shared library should not use or reference EF.
Add EF to the Server project only and make the data available through an API controller.
You should make the common project target .NET Standard and use EF Core (not EF 6).
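A hedged sketch of that layout: a plain DTO in the shared .NET Standard project, and an API controller in the Server project that is the only place referencing EF Core (AppDbContext and its Customers set are illustrative names):

    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.EntityFrameworkCore;

    // Shared project (netstandard2.0): a plain DTO with no EF reference,
    // so the Blazor client can use it freely.
    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Server project: EF Core lives here only.
    [ApiController]
    [Route("api/[controller]")]
    public class CustomersController : ControllerBase
    {
        private readonly AppDbContext _db; // hypothetical EF Core context

        public CustomersController(AppDbContext db) => _db = db;

        [HttpGet]
        public async Task<List<CustomerDto>> Get() =>
            await _db.Customers
                .Select(c => new CustomerDto { Id = c.Id, Name = c.Name })
                .ToListAsync();
    }

The Blazor client then fetches CustomerDto instances over HttpClient and never touches the database directly.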

Is it feasible to build company specific framework that wraps NHibernate?

I have heard that companies using Java technologies often build their own custom framework that wraps Hibernate. Is it really feasible for their .NET peers to do the same thing with NHibernate or Entity Framework?
This is almost always a horrible idea - I think Ayende sums it up best in this article. In general, you should consider NHibernate itself to be the "wrapper" around your data access - attempting to build an abstraction layer on top of it is probably going to be a losing proposition.
Actually, you should check out some of the articles on .NET Junkie's weblog. He wrote several great posts on how to deal with repositories, queries, commands and so on. We've been using these in a very large enterprise system where we switch between an in-memory dictionary, an in-memory SQLite database and a production environment using SQL Server or Oracle. Obviously, we use NHibernate for this.
I use the repository pattern and a separate project/DLL to abstract away the data framework (NHibernate / Entity Framework). This is a good starting point: http://codebetter.com/petervanooijen/2008/04/04/wrapping-up-nhibernate-in-repositories/
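For illustration, the thin abstraction that answer describes usually amounts to a small interface plus an NHibernate-backed implementation; a sketch with made-up type names:

    using System.Linq;
    using NHibernate;
    using NHibernate.Linq;

    public interface IRepository<T> where T : class
    {
        T Get(object id);
        IQueryable<T> Query();
        void Save(T entity);
    }

    // NHibernate stays the real unit of work; the repository only hides
    // the ISession from consuming code.
    public class NHibernateRepository<T> : IRepository<T> where T : class
    {
        private readonly ISession _session;

        public NHibernateRepository(ISession session) => _session = session;

        public T Get(object id) => _session.Get<T>(id);
        public IQueryable<T> Query() => _session.Query<T>();
        public void Save(T entity) => _session.SaveOrUpdate(entity);
    }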

Multi-target development of a Entity Framework based tool with legacy support

I am developing a (free, open-source) Entity Framework tool. It is basically an ADO.NET provider, but it uses some higher abstractions too (e.g. ObjectContext, EntityConnection). I want it to support almost all the legacy versions of EF (EF4 and later). Until EF5 came out it was quite easy, because I was able to develop it by targeting only .NET40.
EF5 made things more complicated, because some of the new features require the .NET45 framework. On the other hand, EF5 supports .NET40 too. And on top of that, EF is now developed independently of the .NET Framework.
For now, it is obvious that targeting both .NET40 and .NET45 is inevitable. But currently I have no idea of the best way to set up a multi-target environment that can cope with the independently developed EF. I also haven't found any good documentation about this problem.
Should I use multiple solution files? Multiple project files? Multiple solution configurations? Reference all versions of EF somehow? Create a universal build script? If yes, how? How do I run my unit tests against different configurations? How do I indicate that a test can/should fail in a specific configuration? What about the changed namespaces (e.g. ObjectContext)? Should I use the #if directive to solve this conflict? What if a new EF release requires implementing a feature that breaks compatibility with previous versions? I am really uncertain at this point.
Take a look at the EF6 code base at http://entityframework.codeplex.com/. We build EF6 for .NET 4 and .NET 4.5 in essentially the way you are suggesting--using multiple build configurations.
Some other points to consider:
If you don't make use of any .NET 4.5 APIs or behaviors, then you may be able to just target the .NET 4 version. If you are using anything from EntityFramework.dll, then this may require a binding redirect to use the 5.0 version, but in a lot of cases if you ship as a NuGet package then NuGet will handle this for you.
If you plan to support EF6, then keep in mind that the core types have been moved out of the .NET Framework. This means, for example, that the EF5 ObjectContext is a different type from the EF6 ObjectContext. You will likely have to compile your provider code twice to create EF6 and EF5 versions in order to handle this. More information can be found here: http://entityframework.codeplex.com/wikipage?title=Rebuilding%20EF%20providers%20for%20EF6
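To make the namespace point concrete, the usual approach is a compilation symbol per build configuration and #if around the moved types. A sketch (the EF6 symbol is whatever you define in your own configurations):

    // ObjectContext moved in EF6, so the using directive depends on
    // which EF version this configuration compiles against.
    #if EF6
    using System.Data.Entity.Core.Objects;   // EF6: ships in EntityFramework.dll
    #else
    using System.Data.Objects;               // EF4/EF5: ships in System.Data.Entity.dll
    #endif

    public static class ContextHelper
    {
        public static string GetContainerName(ObjectContext context)
        {
            return context.DefaultContainerName;
        }
    }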

Using DbContext and Database First in EF 4.1

I have started working on a new project and am switching from LinqToSQL to EF 4.1 as my ORM.
I already have a database set up to work with, so I am going with the database-first approach. By default, EF generates a context that extends ObjectContext. I wanted to know whether a good approach would be to replace it with DbContext.
Most of the available examples deal only with Code First and DbContext, but DbContext can be used with Database First too. Are there any advantages to using DbContext? From what I have read, DbContext is a simplified version of ObjectContext and makes it easier to work with. Are there any other advantages or disadvantages?
You will not replace anything manually. You will need the DbContext T4 generator available in the Visual Studio Gallery. Don't touch your autogenerated files - your changes will be lost every time you modify the EDMX file.
I answered a similar question last year. Now my answer is mostly: for new users, the DbContext API is probably better. The DbContext API is simplified - both in terms of usage and features - but you can still get the ObjectContext from a DbContext and use features available only in the ObjectContext API. On the other hand, the DbContext API has some additional performance impact and an additional layer of bugs. In a simple project you will probably not find any disadvantage in the DbContext API - you will not see the performance impact, you will not use corner features available only in ObjectContext, and you will not be affected by occasional bugs.
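For example, dropping down to the ObjectContext API from an EF 4.1 DbContext is a one-liner (dbContext stands for your own context instance):

    using System.Data.Entity.Infrastructure;
    using System.Data.Objects;

    // The cast exposes the underlying ObjectContext for features
    // the simplified DbContext API does not surface.
    ObjectContext objectContext = ((IObjectContextAdapter)dbContext).ObjectContext;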
A lot of information and blog posts have been collected since the DbContext API was released, so you don't have to be afraid that you will not find a description of the API. Also, the ADO.NET team now uses the DbContext API as their flagship.
I'm not a big fan of the DbContext API, but my opinion is not related to its functionality - it is about its existence: there is no need to have two APIs and to split the development capacity of the ADO.NET team to maintain and fix two APIs doing the same thing. It only means that there is less capacity for implementing really new features.
I'm using it now with Oracle in an add-on to an existing application. The simplification that Ladislav refers to works well for me on this project, as I am short on time and resources. I have not found any gotchas as long as you stick to simple CRUD operations and fewer than ~150 tables.
You can still use metadata annotations to provide basic validation and localization, and there is enough documentation out there, but you won't find much on official Microsoft sites.

.NET connector for PostgreSQL

The last time I used Npgsql (version 1.0), it was very slow. Is there any alternative to Npgsql?
Version 1.0 is three years old. Try using the newest one.
Npgsql is an excellent connector. Just upgrade to the new one. Make sure you take a look at the documentation; it is really good. That will solve the speed issue.
You asked about an alternative, so I also have to recommend another good connector: dotConnect for PostgreSQL. It is made by Devart. There is a simple free version as well as a fully robust paid connector. The paid one has LINQ and Entity Framework support.
http://www.devart.com/dotconnect/postgresql/
I have experience with the .NET MySQL connector. What you are describing seems to be a DNS issue. If you are using a URL in your connection string and are able to change it to an IP address, try that and see if your delay goes away.
Npgsql is still the choice for .NET when connecting to PostgreSQL.
Since version 1.0, the connector has improved drastically; check out this presentation from Shay Rojansky. It is not the latest, but the performance boost was already quite impressive in 2018/11.
If you are upgrading from an old version, read the release notes of the latest one carefully; you might otherwise break functionality in your code.
Also, I strongly recommend considering optimizing PostgreSQL itself as well. I work with it daily in a distributed enterprise environment with massive workloads; it can be tuned and tweaked with a dramatic impact on the overall performance.
As #yojimbo87 said, upgrade to a newer connector version. Try that.
Use Entity Framework Core. Npgsql has an Entity Framework (EF) Core provider.
Use Postgres 11.
Check the connection pool settings.
Like most ADO.NET providers, Npgsql uses connection pooling by default. When you Close() the NpgsqlConnection object, an internal object representing the actual underlying connection that Npgsql uses goes into a pool to be re-used, saving the overhead of creating another unnecessarily.
This suits most applications well, as it's common to want to use a connection several times in the space of a second.
If it doesn't suit you, you can include the option Pooling=false in your connection string; this overrides the default, and Close() will then really close the actual connection.
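For example (connection details are placeholders):

    using Npgsql;

    // Pooling=false makes Close()/Dispose() tear down the physical
    // connection instead of returning it to Npgsql's pool.
    using (var conn = new NpgsqlConnection(
        "Host=localhost;Database=mydb;Username=app;Password=secret;Pooling=false"))
    {
        conn.Open();
        // ... run commands ...
    }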
Npgsql has an Entity Framework (EF) Core provider. It behaves like other EF Core providers (e.g. SQL Server), so the general EF Core docs apply here as well. If you're just getting started with EF Core, those docs are the best place to start.
Development happens in the Npgsql.EntityFrameworkCore.PostgreSQL repository, all issues should be reported there.
https://www.npgsql.org/efcore/index.html
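Getting started with the provider is a single call in OnConfiguring; a minimal sketch (connection string and model are placeholders):

    using Microsoft.EntityFrameworkCore;

    public class BlogContext : DbContext
    {
        public DbSet<Post> Posts { get; set; }

        // UseNpgsql comes from the Npgsql.EntityFrameworkCore.PostgreSQL package.
        protected override void OnConfiguring(DbContextOptionsBuilder options)
            => options.UseNpgsql("Host=localhost;Database=blog;Username=app;Password=secret");
    }

    public class Post
    {
        public int Id { get; set; }
        public string Title { get; set; }
    }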