I am trying to profile EF to understand more of its inner workings. I added two entities using the Add method and the AddRange method, then of course committed with the SaveChanges method. Here is what I got in the profiler in both cases.
Does this mean that EF actually makes two trips to the database, one per insert? If so, inserting 100 entities, for example, would mean 100 trips to the database, which would greatly impact performance. Or am I missing something here?
Yes, that is correct: it will issue one database call per item being added, since it uses a standard SQL INSERT statement for each entity in this case.
The alternative would be a bulk insert approach, such as a stored procedure that takes a table-valued parameter populated from a DataTable.
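As a rough illustration of the bulk route (the table, column, and connection-string names here are made up for the example), a SqlBulkCopy call sends all the rows in one bulk operation instead of one INSERT per entity; a stored procedure taking a table-valued parameter filled from the same DataTable works along similar lines:

    using System.Data;
    using System.Data.SqlClient;

    // Build a DataTable matching the target table's shape (illustrative names).
    var table = new DataTable();
    table.Columns.Add("Id", typeof(int));
    table.Columns.Add("Name", typeof(string));
    for (int i = 0; i < 100; i++)
        table.Rows.Add(i, "Customer " + i);

    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.Customers";
            bulkCopy.WriteToServer(table);   // one bulk operation, not 100 INSERTs
        }
    }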
I'm moving from EF Core 2.2 to 3.1. One breaking change (#15392) is that it no longer composes over stored procedures, so you have to add AsEnumerable. That usually works, but I have a stored procedure call on a TPH (table-per-hierarchy) table where it fails:
My call to the SPROC is:
SqlParameter authorizedUserID_p =
    new SqlParameter("@authorizedUserID", authorizedUser.ID);
IEnumerable<Post> query =
    context.Posts.FromSqlRaw<Post>("Post.USP_ReadPost @ID, @AuthorizedUserID",
        parameters: new[] { parentID_p, authorizedUserID_p }
    ).AsEnumerable<Post>();
Post targetPost = query.ToList<Post>().FirstOrDefault<Post>();
And it produces this error, recommending the use of AsEnumerable (which I'm already using above):
System.InvalidOperationException: FromSqlRaw or FromSqlInterpolated was called with non-composable SQL and with a query composing over it.
Consider calling AsEnumerable after the FromSqlRaw or FromSqlInterpolated method to perform the composition on the client side.
I believe the reason is that my Posts table is table-per-hierarchy (TPH), since other calls to stored procedures in the same application work fine. Would appreciate any help possible!
This is yet another issue introduced by EF Core 3, tracked by #18232: Impossible to use stored procedures related to entities that inherits another one.
The reason is that stored procedure calls are not composable, and EF Core always tries to compose SQL over TPH base entities in order to add the discriminator condition. This is similar to Global Query Filters, except that there you can at least use IgnoreQueryFilters, while here you have no option.
The good news is that this is already fixed in the EF Core repository. The bad news is that the fix won't be released until EF Core 5.0.
Since AsEnumerable() doesn't help, all you can do is wait for EF Core 5.0 or, if possible, convert stored procedures like this one to table-valued functions (TVFs), which are composable. In general, use scalar functions or stored procedures with output parameter(s) for calls that do not return a result set (executed with ExecuteSql*), and table-valued functions for calls that return a single result set (used with FromSql*). Note that EF Core currently does not support stored procedures returning multiple result sets anyway.
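For illustration, a composable replacement could look like the sketch below; the TVF name Post.TVF_ReadPost is hypothetical, but because the SQL is a plain SELECT over a table-valued function, EF Core can append the TPH discriminator filter to it:

    using Microsoft.Data.SqlClient;   // or System.Data.SqlClient, depending on your provider

    // A SELECT over a TVF is composable, unlike an EXEC of a stored procedure.
    var posts = context.Posts
        .FromSqlRaw(
            "SELECT * FROM Post.TVF_ReadPost(@parentID, @authorizedUserID)",
            new SqlParameter("@parentID", parentID),
            new SqlParameter("@authorizedUserID", authorizedUser.ID))
        .ToList();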
Is there a performance difference when executing stored procedures from Entity Framework? I tried some testing, but I get similar results either way, so I want to make sure I pick the better approach.
One way is to add the stored procedure to the EDM as a function import, and the second way is to call:
context.Database.SqlQuery<TResult>("sp_Name", parameters)
http://www.entityframeworktutorial.net/EntityFramework4.3/execute-stored-procedure-using-dbcontext.aspx
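For reference, the two call styles being compared look roughly like this (procedure, method, and result-type names are placeholders); both end up sending essentially the same EXEC to the server, which is consistent with the similar timings:

    using System.Data.SqlClient;

    // 1. Function import generated on the EDM (the method name comes from your model):
    var viaImport = context.GetOrdersByCustomer(customerId).ToList();

    // 2. Raw call through the DbContext API:
    var viaSqlQuery = context.Database
        .SqlQuery<OrderResult>("EXEC dbo.GetOrdersByCustomer @CustomerId",
            new SqlParameter("@CustomerId", customerId))
        .ToList();

One practical difference worth noting is that results from Database.SqlQuery are not tracked by the context, whereas a function import that returns an entity type gives you tracked entities.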
My C# application uses EF and calls min() on an int column to get the 'next' number from a sequence of numbers stored in a database table. The table already holds the next X numbers ready to go; my EF code just needs to get the 'next' one and, after getting that number, delete the entry so the following request gets the one after it, and so on. With one instance of the application all is fine, but with multiple users this leads to concurrency issues. Is there a design pattern for handing out this next min() value serially to all users, without resorting to a stored procedure? I'm using a mix of EF 4.5 and EF 5.
Thanks, Pete
Firstly, you can add a timestamp (rowversion) column to your table and, in the Entity Framework property window, set that column's Concurrency Mode to Fixed.
Doing that enables an optimistic concurrency check on the table: if another data context modifies the row between your read and your update, EF will throw an exception.
Check this link: http://blogs.msdn.com/b/alexj/archive/2009/05/20/tip-19-how-to-use-optimistic-concurrency-in-the-entity-framework.aspx?Redirected=true
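A minimal sketch of what that looks like from the caller's side, assuming a hypothetical MyContext with a SequenceNumbers set whose rowversion column is mapped with Concurrency Mode = Fixed:

    using System.Data.Entity.Infrastructure;
    using System.Linq;

    try
    {
        using (var context = new MyContext())
        {
            // Take the current minimum and remove it so the next caller gets the following one.
            var next = context.SequenceNumbers.OrderBy(n => n.Value).First();
            context.SequenceNumbers.Remove(next);
            context.SaveChanges();   // throws if another user changed/removed that row first
        }
    }
    catch (DbUpdateConcurrencyException)
    {
        // Another user claimed that number; retry to pick up the new minimum.
    }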
Alternatively, you can use a TransactionScope around your select/update logic. Simply wrap your code in a TransactionScope and everything within the scope is covered by the transaction; a rough sketch follows the link below.
Check this link for more information:
TransactionScope vs Transaction in LINQ to SQL
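Here is the TransactionScope sketch for the same read-and-delete logic (the context and set names are again hypothetical); Serializable isolation keeps two users from both claiming the same minimum value, at the cost of blocking:

    using System.Linq;
    using System.Transactions;

    var options = new TransactionOptions { IsolationLevel = IsolationLevel.Serializable };
    using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
    using (var context = new MyContext())
    {
        var next = context.SequenceNumbers.OrderBy(n => n.Value).First();
        context.SequenceNumbers.Remove(next);
        context.SaveChanges();
        scope.Complete();            // commit: the number in next.Value is now reserved
    }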
I have a question about the reasonableness of using Entity Framework only with stored procedures in our scenario.
We plan to have an N-tier architecture, with UI, Business Layer (BLL), Data Access Layer (DAL) and a Business Object Definitions (BOD) layer. The BOD layer is known by all other layers, and the results of queries executed in the DAL are transformed into objects (defined in the BOD) before being passed to the BLL.
We will only use stored procedures for all CRUD methods.
So in the case of a select stored procedure, we would add a function import, create a complex type, and when we execute the function, transform the values of the complex type into a BOD class and pass that to the BLL.
So basically, we have no entities in the model, just complex types that are transformed into business objects.
I'm not sure if that all makes sense, since in my opinion we lose a lot of the benefits EF offers.
Or am I totally wrong?
I would not use EF if all I was using was stored procs.
Personally, I'd look at something like PetaPoco, Massive, or even just straight ADO.NET.
EDIT
Here's an example of PetaPoco consuming SPs and outputting custom types
http://weblogs.asp.net/jalpeshpvadgama/archive/2011/06/20/petapoco-with-stored-procedures.aspx
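For comparison, the straight ADO.NET version of "call a stored procedure and map the result to a custom type" looks something like this (the procedure, column, and DTO names are made up for the example):

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    var results = new List<CustomerDto>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.GetActiveCustomers", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@IsActive", true);

        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // Map each row to the custom type by column name.
                results.Add(new CustomerDto
                {
                    Id = reader.GetInt32(reader.GetOrdinal("Id")),
                    Name = reader.GetString(reader.GetOrdinal("Name"))
                });
            }
        }
    }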
I disagree with both of the existing answers here. PetaPoco is great, but I think EF still offers a number of advantages.
PetaPoco works great (maybe even better than EF) for executing simple stored procedures that read a single entity or a list of entities. However, once you've read the data and need to begin modifying it, this is where EF is the clear winner.
To insert/update data with PetaPoco you'll need to manually call the insert/update stored procedure using:
db.Execute("EXEC spName @param1 = 1, @param2 = 2")
Manually constructing the stored procedure call and declaring all the parameters gets old very fast when the insert/update stored procedures insert rows with more than just a couple of columns. This gets even worse when calling update stored procedures that implement optimistic concurrency (i.e. passing in the original values as parameters).
You also run the risk of making a typo in your in-lined stored procedure call, which very likely will not be caught until runtime.
Now compare this to the entity framework: In the EF I would simply map my stored procedure to my entity in the edmx. There's less risk of a typo, since the entity framework tools will automatically generate the mapping by analyzing my stored procedure.
The entity framework also will handle optimistic concurrency without any problems. Finally, when it comes time to save my changes the only step is to call:
entities.SaveChanges()
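To make the contrast concrete, here is roughly what the calling code looks like once the insert/update/delete stored procedures are mapped to the entity in the edmx (the context and entity names are illustrative); no stored procedure names appear in the C# at all:

    using (var entities = new MyEntities())
    {
        var customer = entities.Customers.Find(customerId);
        customer.Name = "New name";                        // change tracked by EF

        entities.Customers.Add(new Customer { Name = "Another customer" });

        // EF emits the mapped UPDATE and INSERT stored procedure calls here,
        // including the original-value parameters used for optimistic concurrency.
        entities.SaveChanges();
    }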
I agree: if you rely on stored procedures for all CRUD operations, then there is no need to use EF.
I use EF to map stored procedure calls as our DAL. It saves time writing the DAL by mapping the functions. We are not using LINQ to SQL as much, as our DBA does not want direct table access.
New to Entity Framework.
I am using EF4 and, for now, I have implemented a database-first model using stored procedures.
I have noticed that when I launch the application, the first stored procedure invoked, regardless of which one it is, takes 6 seconds. After that, even calling another stored procedure that has never been called before gives a fast response.
Is there a trick for when the entity context is created for the first time?
Has anybody experienced the same?
Thanks a lot
I don't know if you still need it, but you can follow a how-to at this link:
How to use a T4 template for View Generation
It looks like view generation is being performed, and that takes most of the time.
Take a look at this article; these recommendations should improve the situation.
More information about EF performance is available here.
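If pre-generating the views is not practical, one common workaround (offered here only as a sketch, with placeholder context and set names) is to pay that first-use cost at application start instead of on the first user request:

    using System.Linq;
    using System.Threading.Tasks;

    // Run once at startup (e.g. in Application_Start).
    Task.Factory.StartNew(() =>
    {
        using (var context = new MyEntities())
        {
            // Any trivial query forces metadata loading and view generation,
            // so the first real stored procedure call no longer pays the delay.
            var warmed = context.SomeEntities.FirstOrDefault();
        }
    });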