What is faster - ADO.NET or ADO.NET Entity Framework?
Nothing is faster than an ADO.NET DataReader.
Entity Framework uses one under the hood as well.
However, Entity Framework helps you map from the database to objects; with ADO.NET you have to do that yourself.
How fast it is depends on how you program it.
When you use ADO.NET DataTables as "objects", they are a bit slower and more memory-hungry than plain objects.
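To make the trade-off concrete, here is a rough sketch of the hand-mapping you do with a plain DataReader (the connection string, table and column names are made up for illustration); EF generates the equivalent of the inner while loop for you:

using System.Collections.Generic;
using System.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static List<Customer> LoadCustomers(string connectionString)
{
    var customers = new List<Customer>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT Id, Name FROM Customers", connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // This column-by-column copying is the mapping EF does for you.
                customers.Add(new Customer
                {
                    Id = reader.GetInt32(0),
                    Name = reader.GetString(1)
                });
            }
        }
    }
    return customers;
}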
As Julian de Wit says, nothing is faster than ADO.NET DataReaders.
The ADO.NET Entity Framework is a wrapper around classic ADO.NET: a provider-independent ORM and EDL system.
It gives us a lot of benefits that we used to have to hand-craft or "copy & paste" in the past, and being completely provider independent is one of them.
Even if you like the old ADO.NET mechanism, or you are a dinosaur like me (:P), you can use the Entity Framework through the EntityClient provider (analogous to SqlClient or MySqlClient) and harness the power of Entity SQL, which is provider independent.
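As a rough sketch (the "MyModelEntities" container and Customers entity set are hypothetical), querying through EntityClient with Entity SQL looks like this:

using System.Data;
using System.Data.EntityClient; // System.Data.Entity.Core.EntityClient in EF6

using (var connection = new EntityConnection("name=MyModelEntities"))
using (var command = connection.CreateCommand())
{
    // Entity SQL is written against the conceptual model, not a
    // particular database, so the query stays provider independent.
    command.CommandText = "SELECT VALUE c FROM MyModelEntities.Customers AS c";
    connection.Open();
    // EntityCommand readers must be opened with SequentialAccess.
    using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (reader.Read())
        {
            var name = reader["Name"]; // read columns in order, as with any DataReader
        }
    }
}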
I know that with ADO.NET you can write a data access layer so that the DataReaders etc. are "independent", but you still have queries that are provider specific.
On the other hand, in an enterprise application you may never want to change the data provider.
But as the technology grows, new needs always arise, and you may have to alter the database schema.
When that happens with classic ADO.NET, we have to refactor a lot of barely maintainable code, no matter how well we reuse it.
Performance will be affected, but with all the caching technologies out there we can overcome that.
As I always say: "C is fast, assembly even more so... but we use C#/VB.NET/Java."
I would like to ask the Entity Framework Core team what their ambition is for the scope/complexity of query translation compared to EF6.
I've used EF6 extensively and I know that if you can express it in LINQ and don't use any untranslatable functions, EF can probably translate the query correctly.
Will EF Core's translation eventually be as good as that, or is it considered secondary, like the lazy-loading feature?
If so, roughly what is the team ultimately aiming at, compared to EF6?
There's a ticket discussing GroupBy that appears to indicate they deem grouping an advanced type of query, but compared to what EF6 can translate, a plain group-by is pretty ordinary.
(I'm asking here because the EF Core team says on its site that it monitors SO for questions.)
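(For reference, the kind of "normal group-by" meant above is something like the following, which EF6 translates into a single SQL GROUP BY; the model names are hypothetical.)

// A routine aggregate: EF6 sends one GROUP BY to the server, while
// early EF Core versions evaluated the grouping on the client.
var orderCounts = context.Orders
    .GroupBy(o => o.CustomerId)
    .Select(g => new { CustomerId = g.Key, Count = g.Count() })
    .ToList();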
We took a very different approach in EF Core. Every LINQ query should work--even if you use untranslatable functions. We do this by translating the parts of the query we can into SQL and processing the rest on the client after the results are returned by the server. As EF Core evolves, we'll translate more and more of the query into SQL (e.g. GROUP BY) which can make it more efficient.
In theory, our goal is to translate everything that the store supports. In some cases however (especially on NoSQL stores) there simply is no translation for a LINQ operator, and we feel it's better to be functional and inefficient than to throw.
If you want to ensure your whole query is translated, you can disable client evaluation. This will cause it to throw like EF6.
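In EF Core 1.x/2.x that switch looks roughly like this (a sketch; verify the exact API against your version):

using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Diagnostics;

public class MyContext : DbContext
{
    protected override void OnConfiguring(DbContextOptionsBuilder options)
    {
        options
            .UseSqlServer("your connection string")
            // Throw instead of silently evaluating on the client.
            .ConfigureWarnings(warnings =>
                warnings.Throw(RelationalEventId.QueryClientEvaluationWarning));
    }
}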
I'm just starting to learn EF and am now reading about the Code First workflow. From what I gather, you design your objects first and then the database is created based on those objects. I can't seem to see the good in this. Why would you let your database schema be dictated by the hierarchy of your objects? Would you be able to optimize your database using Code First?
Also, as I have not read far enough yet: does Code First fully support DBMS features (indexes, triggers, stored procedures, etc.)? I ask because I've read in some articles that Code First is what most people prefer. I have also seen something about "Code Second" (an existing database, but code-centric development?) which, from what little I've read, seems much better, but maybe I'm missing something or haven't yet read enough and you can clear those things up. Thanks.
The capabilities of Code First are the same, since you have the same ability to express all the features of EF manually in your code. The main difference is that you don't use a designer to generate your EF code. This offers some benefits, since you can decouple your entity classes from the EF context. The main benefit is that you can use plain old C# classes that aren't necessarily tied to EF, should you decide to switch to another ORM down the line.
The downside, of course, is that you have to hand-code the entire model.
Keep in mind that you don't have to generate the database from your code; you can code against an existing database.
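A minimal sketch of what that looks like, mapping plain classes onto an existing table (all names hypothetical):

using System.Data.Entity;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Point the entity at the existing table rather than letting
        // EF generate a schema of its own.
        modelBuilder.Entity<Product>().ToTable("tbl_Products");
    }
}

Combined with Database.SetInitializer<ShopContext>(null), EF won't try to create or change the existing database.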
Is Entity Framework faster than ADO.NET for queries?
My test shows that ADO.NET is faster than Entity Framework at querying. Why?
ADO.NET is used by EF behind the scenes. This means that overall EF is always going to be slower than ADO.NET (assuming they both generate similar SQL statements).
However, I have observed an interesting performance characteristic with EF5 vs. ADO.NET and a low number of rows (either queried or inserted). EF5 appears to be consistently faster than ADO.NET for under 10 items. I imagine this is due to an optimisation at connection setup time, but I haven't yet tracked down what it is.
My results around this, and a little more explanation, are available here.
If anyone has more information on why EF5 appears so fast on small datasets, I would love to hear it.
NOTE: in that post I don't actually show the raw ADO.NET results, but they are very similar to the Dapper results. I wanted to answer this specific question before posting the ADO.NET results :)
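If you want to reproduce this kind of comparison, a crude harness looks something like the sketch below (ShopContext, Products and the SQL are hypothetical stand-ins); remember to warm up first, since the first EF query pays a one-off model-building cost, and to average over many runs:

using System;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Linq;

string connectionString = "..."; // your connection string

var sw = Stopwatch.StartNew();
using (var context = new ShopContext())
{
    var products = context.Products.Where(p => p.Id < 10).ToList();
}
sw.Stop();
Console.WriteLine("EF: {0} ms", sw.ElapsedMilliseconds);

sw.Restart();
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT Id, Name FROM Products WHERE Id < 10", connection))
{
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read()) { /* map rows by hand */ }
    }
}
sw.Stop();
Console.WriteLine("ADO.NET: {0} ms", sw.ElapsedMilliseconds);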
Is it possible to use Entity Framework 4.3 without linking the model to an actual DB in the back-end?
I need to build a conceptual model of a database in the VS designer and then I'd like to manually handle fetches, inserts and updates to various back-end databases (horrible legacy systems). I need to be able to do this without EF moaning about not having tables mapped, etc. I realise that this is a very odd thing to want to do...
The reason for this is that we would like to move from these legacy systems into a well designed data model and .NET environment, but we need to still maintain functionality and backward compatibility with the old systems during development. We will then reach a stage where we can import the old data (coming from about 6 different databases) into a single DB that matches the EF model I'm building. In theory, we should then be able to switch over from the hacked up EF model to a proper EF model matching the new data structure.
Is this viable? Is it possible to use the EF interface, with LINQ without actually pointing it to a database?
I have managed to query the legacy systems by overriding the generated DbContext and exposing IQueryable properties which query the old systems. My big fight now is with actually updating the data.
If I can have EF track changes to entities, but not actually save those changes, I should be able to override the SaveChanges() method on the context to manually insert into the various legacy tables.
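For concreteness, the override I have in mind would look roughly like this (all method and type names hypothetical):

using System.Data;
using System.Data.Entity;

public class BridgeContext : DbContext
{
    public override int SaveChanges()
    {
        var written = 0;
        foreach (var entry in ChangeTracker.Entries())
        {
            switch (entry.State)
            {
                case EntityState.Added:
                    // InsertIntoLegacySystem(entry.Entity); // hand-written insert
                    written++;
                    break;
                case EntityState.Modified:
                    // UpdateLegacySystem(entry.Entity); // hand-written update
                    written++;
                    break;
                // EntityState.Deleted would be handled the same way.
            }
            entry.State = EntityState.Unchanged; // mark as handled so EF doesn't persist it
        }
        return written;
    }
}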
I'm sort of at my wits' end with this issue at the moment.
UPDATE 4 Sept 2012: I've opted to use the EDMX designer to build the data model, and I generate the code using T4. This lets me hand-write mapping code to suit my needs. It also allows me to perform the legacy data migration later with relative ease.
If I were in your situation I'd set up the new DB server and link the legacy servers to it, then create stored procedures to interface with EF for the INSERTs/UPDATEs/DELETEs. This way your EF code remains separate from the legacy-support messiness. As you decommission the legacy DB servers, you can update your stored procedures accordingly. Once you have no more legacy DB servers, you can either continue using your sprocs or refresh your EF data connection to use the table schema directly.
The point of Entity Framework is to link entities to a data store without manual population.
Otherwise you're just using classes with LINQ.
If you mean you don't want a separate data store like SQL Server, Mongo, etc., then just let your application create the database as an .mdf file bundled in your App_Data folder. That way you don't need a database server, so to speak, and the database is part of your app.
If, on the other hand, you want a different way to save to the database, you can create your own data adapters to behave however you like. The Mongo .NET Entity Framework component is an example of this.
Alternatively, using code only, you can just use stored procedures to persist to the database (sketched below), which can be a bit verbose and annoying with EF but could bridge the gap for you and let you build a good architecture, with the model you want getting translated into the crappy one inside your repositories.
Then, when the new database is ready, you can just rework your repos to use SaveChanges() and you're done.
This will, of course, only work with the code-only approach.
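A sketch of that stored-procedure route in EF 4.x code-first (procedure, parameter and type names hypothetical; mapped stored procedures for code first only arrived in EF6):

using System.Data.Entity;
using System.Data.SqlClient;

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public class OrderRepository
{
    private readonly DbContext _context;
    public OrderRepository(DbContext context) { _context = context; }

    public void Save(Order order)
    {
        // Bypass EF's SQL generation and call the legacy procedure directly.
        _context.Database.ExecuteSqlCommand(
            "EXEC dbo.SaveOrder @Id, @Total",
            new SqlParameter("@Id", order.Id),
            new SqlParameter("@Total", order.Total));
    }
}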
I am using Entity Framework 4.1 Code First with no stored procedures, and I would like a general opinion on its performance in huge applications, seeing that it generates the SQL in the background. Doesn't this go against the best practice of using stored procedures? How do you fine-tune the generated SQL?
I know you can hack around it to use stored procedures, but is there definitely going to be support for stored procedures and the other features you get with the database-first option?
Does EF 4.1 have any improvements on the database-first option? How would I know whether I have the latest version of EF?
The generated SQL is reasonably efficient. Although I've not resorted to SPs yet, I have written some views (in 4.0) and written LINQ against those in places to overcome some performance issues.
Does 4.1 go against the best practice of stored procedures? Well, SPs are considered best practice for a number of reasons: performance is one; isolation and abstraction of the underlying table structure from your code is another. The performance part seems to have been abandoned as "probably not that important these days", for reasons that don't smell 100% right to me. As for the abstraction issue: you are using EF Code First for a reason, and that reason is that you want a persistence framework for your application's objects. By the very act of choosing EF Code First, you are declaring that you don't want to know how they are stored, in what structures, and what happens to get them back.
How do you tune it? Mainly by being very careful about lazy loading, by monitoring what's going on at the SQL end (EFProf is one tool; MSSQL query profiling works too), and generally by fiddling with things.
To ensure you are running the latest EF (if you have been running the Code First CTP), use the NuGet console:
uninstall-package EFCodeFirst
install-package EntityFramework
4.1 has improvements over 4.0 for database first, namely the lightweight DbContext.
EDIT: Adding code as requested...
Simple case
foreach (var order in orders) { var y = order.OrderLines.ToList(); } // each iteration lazy-loads that order's lines: N+1 queries
which you fix with
foreach (var order in orders.Include("OrderLines").ToList()) { var y = order.OrderLines.ToList(); } // lines come back with the orders in one query
but less obvious is
foreach (var order in orders.Include("OrderLines").ToList()) foreach (var line in order.OrderLines) DoThing(line);
where
public void DoThing(OrderLine ol)
{
    if (ol.Order.Property == true) // touching ol.Order lazy-loads the parent Order for every line
        ....
}
to fix this I think you need
foreach (var order in orders.Include("OrderLines.Order").ToList()) foreach (var line in order.OrderLines) DoThing(line);
(Or better still, refactor DoThing(OrderLine ol) to DoThing(OrderLine ol, Order ord).) My point is that with a local database it's incredibly easy to miss these. It's only when you profile the SQL, connect to a SQL database over a slow network (think Azure), or just get serious load, that this begins to hurt!