Entity Framework 7 multiple levels of child tables - entity-framework-core

I'm just getting started with EF7 (CORE) and am struggling to find the right implementation of the following. Say I have a Table with multiple child tables, which in turn have grandchild tables (and these in turn have foreign key tables). If I wanted access to everything I'd need something like this
TABLE_A.Include(c => c.TABLE_B).ThenInclude(co => co.TABLE_C)
.ThenInclude(coi => coi.TABLE_D)
.ThenInclude(coia => coia.TABLE_E)
.Include(c => c.TABLE_B).ThenInclude(co => co.TABLE_F)
.ThenInclude(coa => coa.TABLE_G)
.ThenInclude(coaAcc => coaAcc.TABLE_H)
.ThenInclude(coaAccInt => coaAccInt.TABLE_D)
.ThenInclude(coaAccIntAgent => coaAccIntAgent.TABLE_E)
Now I understand the necessity of chaining the Includes to pull in all of my child tables, but when I look at the SQL it fires behind the scenes, it's firing off 11 SQL statements. This seems terribly inefficient.
Is this the best way to be doing this? I have now received a new requirement to add 3 more child tables to TABLE_B, so I'll need more Includes, and hence more selects running behind the scenes.
I understand the logic behind what I'm doing, and I understand lazy loading isn't currently supported in EF7, but this doesn't seem like a very efficient way of doing things when I could write a stored procedure that does it in one go.
Are there best practices for things like this or something I'm not grasping about how to use EF7 to do what I need?
Any help or guidance would be much appreciated!
Thanks

Add this extension method to your project. The Load method exists in EF 6.x, but it is not implemented yet in EF Core:
public static void Load<TSource, TDestination>(this EntityEntry<TSource> entry, Expression<Func<TSource, IEnumerable<TDestination>>> path, Expression<Func<TDestination, TSource>> pathBack = null) where TSource : class where TDestination : class
{
    var entity = entry.Entity;
    var context = entry.Context;
    var entityType = context.Model.FindEntityType(typeof(TSource));
    var keys = entityType.GetKeys();
    var keyValues = context.GetEntityKey(entity);
    var query = context.Set<TDestination>() as IQueryable<TDestination>;
    var parameter = Expression.Parameter(typeof(TDestination), "x");
    PropertyInfo foreignKeyProperty = null;
    if (pathBack == null)
    {
        foreignKeyProperty = typeof(TDestination).GetProperties().Single(p => p.PropertyType == typeof(TSource));
    }
    else
    {
        foreignKeyProperty = (pathBack.Body as MemberExpression).Member as PropertyInfo;
    }
    var i = 0;
    foreach (var property in keys.SelectMany(x => x.Properties))
    {
        var keyValue = keyValues[i];
        var expression = Expression.Lambda(
            Expression.Equal(
                Expression.Property(Expression.Property(parameter, foreignKeyProperty.Name), property.Name),
                Expression.Constant(keyValue)),
            parameter) as Expression<Func<TDestination, bool>>;
        query = query.Where(expression);
        i++;
    }
    var list = query.ToList();
    var prop = (path.Body as MemberExpression).Member as PropertyInfo;
    prop.SetValue(entity, list);
}

public static object[] GetEntityKey<T>(this DbContext context, T entity) where T : class
{
    var state = context.Entry(entity);
    var metadata = state.Metadata;
    var key = metadata.FindPrimaryKey();
    var props = key.Properties.ToArray();
    return props.Select(x => x.GetGetter().GetClrValue(entity)).ToArray();
}
Then, whenever you need one of the navigation properties, call the Load method first (just once per navigation property), as follows:
//for first item
var item = TABLE_A.First();
context.Entry(item).Load(b => b.TABLE_B);
Depending on your use case, you can also Include or ThenInclude some of the navigations in the first query that loads TABLE_A.
See the Load extension method source link for more examples.
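For example, a rough sketch of that mix, reusing the question's table names and the Load extension above (whether TABLE_B and TABLE_C are collection navigations is an assumption here):
// Sketch only: eager-load the first level, then use the Load extension above for deeper levels
var item = TABLE_A.Include(a => a.TABLE_B).First();
foreach (var b in item.TABLE_B)
{
    // one query per call; assumes the TABLE_B entity exposes a TABLE_C collection
    context.Entry(b).Load(x => x.TABLE_C);
}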

Related

What is the difference between creating a new object inside select LINQ clause and inside a method

I have an entity class which is mapped to an SQL table:
public class EntityItem {
public virtual ICollection<EntityItem2> SomeItems { get; set; }
}
And I have the following two snippets:
var items = _repository.Table.Where(x => x.Id == id)
.Select(x => new ItemModel {
Items = x.SomeItems.Select(y => new SomeItem { /* mapping is here... */ }).ToList()
});
And
var items = _repository.Table.Where(x => x.Id == id).Select(x => someModelMapper.BuildModel(x));
//inside a mapper
public ItemModel BuildModel(EntityType entity){
var model = new ItemModel();
model.Items = entity.SomeItems.Select(x => anotherMapper.BuildModel(x));
return model;
}
As a result, I am getting different SQL queries in the two cases. Moreover, the second snippet runs much slower than the first one. As I can see in SQL Profiler, the second snippet is generating many SQL queries.
So my questions:
Why is that happening?
How to create new objects like in the second snippet but to avoid
lots of SQL queries?
The likely reason you are seeing a difference in performance is due to EF Core materializing the query prematurely. When a LINQ statement is compiled, EF attempts to translate it into SQL. If you call a function within the expression, EF6 would have raised an exception to the effect that the method cannot be converted to SQL. EF Core tries to be clever: when it encounters a method it cannot convert, it executes the query up to the point it could get to, and then continues to execute the rest as Linq2Object, where your method can run. IMO this is a pretty stupid feature and represents a huge performance landmine, and while it's fine to offer it as a possible option, it should be disabled by default.
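If you want EF Core to tell you when this happens, you can (in EF Core 2.x at least) escalate the client-evaluation warning to an exception. A minimal sketch, assuming a SQL Server provider and a connectionString field (both assumptions, not from the original code):
// Sketch: make silent client evaluation throw instead of running Linq2Object quietly
// (EF Core 2.x behaviour; in 3.0+ most client evaluation already throws by default)
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder
        .UseSqlServer(connectionString)
        .ConfigureWarnings(warnings =>
            warnings.Throw(RelationalEventId.QueryClientEvaluationWarning));
}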
You're probably seeing extra queries due to lazy loading after the main query runs, to populate the view models in the mapping method.
For instance if I execute:
var results = context.Parents.Select(x => new ParentViewModel
{
ParentId = x.ParentId,
Name = x.Name,
OldestChildName = x.Children.OrderByDescending(c => c.BirthDate).Select(c => c.Name).FirstOrDefault() ?? "No Child"
}).Single(x => x.ParentId == parentId);
That would execute as one statement. Calling a method to populate the view model:
var results = context.Parents
.Select(x => buildParentViewModel(x))
.Single(x => x.ParentId == parentId);
would execute something like:
var results = context.Parents
.ToList()
.Select(x => new ParentViewModel
{
ParentId = x.ParentId,
Name = x.Name,
OldestChildName = x.Children.OrderByDescending(c => c.BirthDate).Select(c =>
c.Name).FirstOrDefault() ?? "No Child"
}).Single(x => x.ParentId == parentId);
at worst or:
var results = context.Parents
.Where(x => x.ParentId == parentId)
.ToList()
.Select(x => new ParentViewModel
{
ParentId = x.ParentId,
Name = x.Name,
OldestChildName = x.Children.OrderByDescending(c => c.BirthDate).Select(c =>
c.Name).FirstOrDefault() ?? "No Child"
}).Single();
... at best. These are due to the extra .ToList() call prior to the Select, which is roughly what the premature execution will do automatically. The issue with these queries compared to the first one comes when loading the child's name. In the first query, the generated SQL pulls the parent and related child's details in one query. In the alternative cases, the query will execute to pull the parent's details, but getting the child details will constitute a lazy-load call to fetch more data, since that part is executed as Linq2Object.
The solution would be to use AutoMapper and its built-in ProjectTo method to populate your view model. This will put the mapping code in automatically, so that it works like the first scenario without you needing to write out all of the mapping code.
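For illustration, a minimal sketch with AutoMapper's ProjectTo (this assumes a Parent -> ParentViewModel map is registered and an IMapper instance named mapper is available; neither is shown in the original code):
// requires: using AutoMapper.QueryableExtensions;
// 'mapper' is an IMapper with a Parent -> ParentViewModel map registered (assumed)
var result = context.Parents
    .Where(x => x.ParentId == parentId)
    .ProjectTo<ParentViewModel>(mapper.ConfigurationProvider) // expands the mapping inside the IQueryable
    .Single();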

Assign Mapped Object to Expression Result in LINQ to Entities

I have the following child object that we use an expression to map our 'entity' to our 'domain' model. We use this when specifically calling our ChildRecordService method GetChild or GetChildren:
public static Expression<Func<global::Database.Models.ChildRecord, ChildRecord>> MapChildRecordToCommon = entity => new ChildRecord
{
DateTime = entity.DateTime,
Type = entity.Type,
};
public static async Task<List<ChildRecord>> ToCommonListAsync(this IQueryable<global::Database.Models.ChildRecord> childRecords)
{
var items = await
childRecords.Select(MapChildRecordToCommon).ToListAsync().EscapeContext();
return items;
}
public async Task<List<ChildRecord>> GetChildRecords()
{
using (var uow = this.UnitOfWorkFactory.CreateReadOnly())
{
var childRecords= await uow.GetRepository<IChildRecordRepository>().GetChildRecords().ToCommonListAsync().EscapeContext();
return childRecords;
}
}
So that all works just fine. However, we have another object that is a parent to that child, and in SOME cases we also wish to load the child during the materialisation and mapping process.
In other words the standard object looks as such:
private static Expression<Func<global::Database.Models.Plot, Plot>> MapPlotToCommonBasic = (entity) => new Plot
{
Id = entity.Id,
Direction = entity.Direction,
Utc = entity.Utc,
Velocity = entity.Velocity,
};
However what I also want to map is the Plot.ChildRecord property, using the expression MapChildRecordToCommon I have already created. I made a second expression just to test this:
private static Expression<Func<global::Database.Models.Plot, Plot>> MapPlotToCommonAdvanced = (entity) => new Plot
{
ChildRecord = MapChildRecordToCommon.Compile()(entity.ChildRecord)
};
This fails:
System.NotSupportedException
The LINQ expression node type 'Invoke' is not supported in LINQ to Entities.
Is there a way to reuse my existing expression for ChildRecord, to materialise the object of ChildRecord (ie. one to one/singular not multiple) on the Plot object? I think my trouble is caused by there being just one object and being unable to use the .Select(Map) method. I am not too great at expressions and have hit a wall with this.
For reference, there are actually up to 5 or 6 other child objects on the "Plot" object that I also want to make expressions for.
I resolved this by using the third-party library LinqKit.
The library allows the use of two methods: .AsExpandable() (which, as I understand it, allows the expressions to be properly compiled and invoked), and .Invoke() as an extension method on an expression, rather than calling Expression.Invoke(yourexpression). I included a null check just in case.
My code now looks as follows:
public static async Task<List<Plot>> ToCommonListAsync(this IQueryable<global::Database.Models.Plot> plots)
{
var items = await
plots.AsExpandable().Select(MapPlotToCommon).ToListAsync().EscapeContext();
return items;
}
private static Expression<Func<global::Database.Models.Plot, Plot>> MapPlotToCommon = (entity) => new Plot
{
Id = entity.Id,
Direction = entity.Direction,
Utc = entity.Utc,
Velocity = entity.Velocity,
ChildRecord = entity.ChildRecord != null ? MapChildRecordToCommon.Invoke(entity.ChildRecord) : default
};
public static Expression<Func<global::Database.Models.ChildRecord, ChildRecord>> MapChildRecordToCommon = entity => new ChildRecord
{
DateTime = entity.DateTime,
Type = entity.Type,
};

Delete data with Breeze.js without loading it to the client

I am using Breeze.js with Entity Framework WebAPI backend, and I need to delete a large set of data that is not loaded to client. I would really like to do it on the server and not load it.
Is there a "breeze way"? By that I mean a method in a BreezeController.
EDIT
I have to delete all rows from one table that belong to the user, whose date field is in the future, and all their child rows.
public override int SaveChanges()
{
    foreach (var entry in this.ChangeTracker.Entries()
        .Where(e => e.State == (EntityState)Breeze.WebApi.EntityState.Deleted))
    {
        if (entry.Entity.GetType() == typeof(User))
        {
            var entity = entry.Entity as User;
            var childEntitiesInFuture = ChildEntities.Where(c => c.DateField > DateTime.Now);
            foreach (var child in childEntitiesInFuture)
            {
                var grandchildrenForDeletion = Grandchildren.Where(c => c.ChildId == child.Id);
                foreach (var g in grandchildrenForDeletion) Grandchildren.Remove(g);
                ChildEntities.Remove(child);
            }
        }
    }
    return base.SaveChanges();
}
Assuming you are deleting a User: one User has many ChildEntity rows saved in ChildEntities, and each ChildEntity has many Grandchild rows saved in Grandchildren. The names are a bit messy, but that's what you get with no real names :)
This method goes into your Context class. Good luck.

How do I delete multiple rows in Entity Framework (without foreach)

I want to delete several items from a table using Entity Framework. There is no foreign key / parent object, so I can't handle this with OnDeleteCascade.
Right now I'm doing this:
var widgets = context.Widgets
.Where(w => w.WidgetId == widgetId);
foreach (Widget widget in widgets)
{
context.Widgets.DeleteObject(widget);
}
context.SaveChanges();
It works, but the foreach bugs me. I'm using EF4, but I don't want to execute SQL. I just want to make sure I'm not missing anything -- this is as good as it gets, right? I can abstract the code with an extension method or helper, but somewhere we're still going to be doing a foreach, right?
EntityFramework 6 has made this a bit easier with .RemoveRange().
Example:
db.People.RemoveRange(db.People.Where(x => x.State == "CA"));
db.SaveChanges();
Warning! Do not use this on large datasets!
EF pulls all the data into memory, THEN deletes it. For smaller data sets this might not be an issue but generally avoid this style of delete unless you can guarantee you are only doing very small changes.
You could easily run your process out of memory while EF happily pulls in all the data you specified just to delete it.
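If you still want to stay in EF for a large delete, one hedge is to work in batches so only a small chunk is tracked at a time. A rough sketch along the lines of the example above (the batch size is arbitrary):
// Sketch: delete in batches so EF never tracks the whole result set at once
const int batchSize = 1000;
while (true)
{
    var batch = db.People.Where(x => x.State == "CA").Take(batchSize).ToList();
    if (batch.Count == 0) break;

    db.People.RemoveRange(batch);
    db.SaveChanges();
}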
using (var context = new DatabaseEntities())
{
context.ExecuteStoreCommand("DELETE FROM YOURTABLE WHERE CustomerID = {0}", customerId);
}
Addition: To support a list of ids you can write
var listOfIds = String.Join(",", customerIds.Select(id => $"'{id}'").ToList());
var sql = $@"DELETE [YOURTABLE] WHERE CustomerID in ({listOfIds})";
Note: if CustomerID is a string, you should double-check for potential SQL injection risks; for an integer CustomerID it's safe.
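For string keys, a safer variant (illustrative only; parameter names are made up) is to build one parameter per id instead of concatenating values into the SQL:
// Sketch: parameterised IN-list instead of string concatenation
// requires: using System.Data.SqlClient;
var parameters = customerIds.Select((id, i) => new SqlParameter($"@p{i}", id)).ToArray();
var placeholders = String.Join(",", parameters.Select(p => p.ParameterName));
context.ExecuteStoreCommand($"DELETE FROM [YOURTABLE] WHERE CustomerID IN ({placeholders})", parameters);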
this is as good as it gets, right? I can abstract it with an extension method or helper, but somewhere we're still going to be doing a foreach, right?
Well, yes, except you can make it into a two-liner:
context.Widgets.Where(w => w.WidgetId == widgetId)
.ToList().ForEach(context.Widgets.DeleteObject);
context.SaveChanges();
I know it's quite late but in case someone needs a simple solution, the cool thing is you can also add the where clause with it:
public static void DeleteWhere<T>(this DbContext db, Expression<Func<T, bool>> filter) where T : class
{
string selectSql = db.Set<T>().Where(filter).ToString();
string fromWhere = selectSql.Substring(selectSql.IndexOf("FROM"));
string deleteSql = "DELETE [Extent1] " + fromWhere;
db.Database.ExecuteSqlCommand(deleteSql);
}
Note: just tested with MSSQL2008.
Update:
The solution above won't work when EF generates a SQL statement with parameters, so here's the update for EF5:
public static void DeleteWhere<T>(this DbContext db, Expression<Func<T, bool>> filter) where T : class
{
var query = db.Set<T>().Where(filter);
string selectSql = query.ToString();
string deleteSql = "DELETE [Extent1] " + selectSql.Substring(selectSql.IndexOf("FROM"));
var internalQuery = query.GetType().GetFields(BindingFlags.NonPublic | BindingFlags.Instance).Where(field => field.Name == "_internalQuery").Select(field => field.GetValue(query)).First();
var objectQuery = internalQuery.GetType().GetFields(BindingFlags.NonPublic | BindingFlags.Instance).Where(field => field.Name == "_objectQuery").Select(field => field.GetValue(internalQuery)).First() as ObjectQuery;
var parameters = objectQuery.Parameters.Select(p => new SqlParameter(p.Name, p.Value)).ToArray();
db.Database.ExecuteSqlCommand(deleteSql, parameters);
}
It requires a little bit of reflection but works well.
If you don't want to execute SQL directly, calling DeleteObject in a loop is the best you can do today.
However you can execute SQL and still make it completely general purpose via an extension method, using the approach I describe here.
Although that answer was for 3.5. For 4.0 I would probably use the new ExecuteStoreCommand API under the hood, instead of dropping down to the StoreConnection.
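As a rough illustration of that idea (the helper name and signature are mine, not from the linked answer), such an extension could wrap ExecuteStoreCommand like this:
// Sketch: a general-purpose delete helper over ExecuteStoreCommand (EF4 ObjectContext)
// requires: using System.Data.Objects; using System.Data.SqlClient;
// Caution: tableName/keyColumn must come from trusted code, never from user input.
public static int DeleteWhereKeyEquals(this ObjectContext context, string tableName, string keyColumn, object keyValue)
{
    var sql = string.Format("DELETE FROM [{0}] WHERE [{1}] = @key", tableName, keyColumn);
    return context.ExecuteStoreCommand(sql, new SqlParameter("@key", keyValue));
}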
For anyone using EF5, the following extension library can be used: https://github.com/loresoft/EntityFramework.Extended
context.Widgets.Delete(w => w.WidgetId == widgetId);
Entity Framework Core (1.0 to 3.1):
using (YourContext context = new YourContext ())
{
var widgets = context.Widgets.Where(w => w.WidgetId == widgetId);
context.Widgets.RemoveRange(widgets);
context.SaveChanges();
}
Summary:
Removes the given collection of entities from the context underlying the set
with each entity being put into the Deleted state such that it will be deleted
from the database when SaveChanges is called.
Remarks:
Note that if System.Data.Entity.Infrastructure.DbContextConfiguration.AutoDetectChangesEnabled
is set to true (which is the default), then DetectChanges will be called once
before delete any entities and will not be called again. This means that in some
situations RemoveRange may perform significantly better than calling Remove multiple
times would do. Note that if any entity exists in the context in the Added state,
then this method will cause it to be detached from the context. This is because
an Added entity is assumed not to exist in the database such that trying to delete
it does not make sense.
Still seems crazy to have to pull anything back from the server just to delete it, but at least getting back just the IDs is a lot leaner than pulling down the full entities:
var ids = from w in context.Widgets where w.WidgetId == widgetId select w.Id;
context.Widgets.RemoveRange(from id in ids.AsEnumerable() select new Widget { Id = id });
Finally bulk delete has been introduced in Entity Framework Core 7 via the ExecuteDelete command:
context.Widgets
.Where(w => w.WidgetId == widgetId)
.ExecuteDelete();
Something to note here is that ExecuteDelete does not need a SaveChanges, as per its documentation:
This operation executes immediately against the database, rather than being deferred until DbContext.SaveChanges() is called. It also does not interact with the EF change tracker in any way: entity instances which happen to be tracked when this operation is invoked aren't taken into account, and aren't updated to reflect the changes.
I know that the question was asked for EF4, but if you upgrade this is a good alternative!
EF 6.1
public void DeleteWhere<TEntity>(Expression<Func<TEntity, bool>> predicate = null)
where TEntity : class
{
var dbSet = context.Set<TEntity>();
if (predicate != null)
dbSet.RemoveRange(dbSet.Where(predicate));
else
dbSet.RemoveRange(dbSet);
context.SaveChanges();
}
Usage:
// Delete where condition is met.
DeleteWhere<MyEntity>(d => d.Name == "Something");
Or:
// delete all from entity
DeleteWhere<MyEntity>();
For EF 4.1,
var objectContext = (myEntities as IObjectContextAdapter).ObjectContext;
objectContext.ExecuteStoreCommand("delete from [myTable];");
The quickest way to delete is using a stored procedure. I prefer stored procedures in a database project over dynamic SQL because renames will be handled correctly and you get compile-time errors. Dynamic SQL could refer to tables that have been deleted or renamed, causing run-time errors.
In this example, I have two tables List and ListItems. I need a fast way to delete all the ListItems of a given list.
CREATE TABLE [act].[Lists]
(
[Id] INT NOT NULL PRIMARY KEY IDENTITY,
[Name] NVARCHAR(50) NOT NULL
)
GO
CREATE UNIQUE INDEX [IU_Name] ON [act].[Lists] ([Name])
GO
CREATE TABLE [act].[ListItems]
(
[Id] INT NOT NULL IDENTITY,
[ListId] INT NOT NULL,
[Item] NVARCHAR(100) NOT NULL,
CONSTRAINT PK_ListItems_Id PRIMARY KEY NONCLUSTERED (Id),
CONSTRAINT [FK_ListItems_Lists] FOREIGN KEY ([ListId]) REFERENCES [act].[Lists]([Id]) ON DELETE CASCADE
)
go
CREATE UNIQUE CLUSTERED INDEX IX_ListItems_Item
ON [act].[ListItems] ([ListId], [Item]);
GO
CREATE PROCEDURE [act].[DeleteAllItemsInList]
    @listId int
AS
    DELETE FROM act.ListItems WHERE ListId = @listId
RETURN 0
Now for the interesting part: deleting the items and updating Entity Framework using an extension method.
public static class ListExtension
{
    public static void DeleteAllListItems(this List list, ActDbContext db)
    {
        if (list.Id > 0)
        {
            var listIdParameter = new SqlParameter("ListId", list.Id);
            db.Database.ExecuteSqlCommand("[act].[DeleteAllItemsInList] @ListId", listIdParameter);
        }
        foreach (var listItem in list.ListItems.ToList())
        {
            db.Entry(listItem).State = EntityState.Detached;
        }
    }
}
The calling code can now use it as follows:
[TestMethod]
public void DeleteAllItemsInListAfterSavingToDatabase()
{
using (var db = new ActDbContext())
{
var listName = "TestList";
// Clean up
var listInDb = db.Lists.Where(r => r.Name == listName).FirstOrDefault();
if (listInDb != null)
{
db.Lists.Remove(listInDb);
db.SaveChanges();
}
// Test
var list = new List() { Name = listName };
list.ListItems.Add(new ListItem() { Item = "Item 1" });
list.ListItems.Add(new ListItem() { Item = "Item 2" });
db.Lists.Add(list);
db.SaveChanges();
listInDb = db.Lists.Find(list.Id);
Assert.AreEqual(2, list.ListItems.Count);
list.DeleteAllListItems(db);
db.SaveChanges();
listInDb = db.Lists.Find(list.Id);
Assert.AreEqual(0, list.ListItems.Count);
}
}
You can use extension libraries for this, like EntityFramework.Extended or Z.EntityFramework.Plus.EF6; they are available for EF 5, 6 and Core. These libraries have great performance when you have to delete or update, and they use LINQ. Example for deleting (source: Plus):
ctx.Users.Where(x => x.LastLoginDate < DateTime.Now.AddYears(-2))
.Delete();
or (source extended)
context.Users.Where(u => u.FirstName == "firstname")
.Delete();
These use native SQL statements, so performance is great.
This answer is for EF Core 7 (I am not aware whether they have merged EF Core and EF now or not; previously the two were kept separate).
EF Core 7 now supports ExecuteUpdate and ExecuteDelete (Bulk updates):
// Delete all Tags (BE CAREFUL!)
await context.Tags.ExecuteDeleteAsync();
// Delete Tags with a condition
await context.Tags.Where(t => t.Text.Contains(".NET")).ExecuteDeleteAsync();
The equivalent SQL queries are:
DELETE FROM [t]
FROM [Tags] AS [t]
DELETE FROM [t]
FROM [Tags] AS [t]
WHERE [t].[Text] LIKE N'%.NET%'
If you want to delete all rows of a table, you can execute a SQL command:
using (var context = new DataDb())
{
context.Database.ExecuteSqlCommand("TRUNCATE TABLE [TableName]");
}
TRUNCATE TABLE (Transact-SQL) Removes all rows from a table without logging the individual row deletions. TRUNCATE TABLE is similar to the DELETE statement with no WHERE clause; however, TRUNCATE TABLE is faster and uses fewer system and transaction log resources.
You can execute SQL queries directly as follows:
private int DeleteData()
{
using (var ctx = new MyEntities(this.ConnectionString))
{
if (ctx != null)
{
//Delete command
return ctx.ExecuteStoreCommand("DELETE FROM ALARM WHERE AlarmID > 100");
}
}
return 0;
}
For select we may use
using (var context = new MyContext())
{
var blogs = context.MyTable.SqlQuery("SELECT * FROM dbo.MyTable").ToList();
}
UUHHIVS's approach is a very elegant and fast way to do a batch delete, but it must be used with care:
auto-generation of transactions: its queries will be wrapped in a transaction
database context independence: its execution has nothing to do with context.SaveChanges()
These issues can be circumvented by taking control of the transaction. The following code illustrates how to batch delete and bulk insert in a transactional manner:
var repo = DataAccess.EntityRepository;
var existingData = repo.All.Where(x => x.ParentId == parentId);
TransactionScope scope = null;
try
{
// this starts the outer transaction
using (scope = new TransactionScope(TransactionScopeOption.Required))
{
// this starts and commits an inner transaction
existingData.Delete();
// var toInsert = ...
// this relies on EntityFramework.BulkInsert library
repo.BulkInsert(toInsert);
// any other context changes can be performed
// this starts and commit an inner transaction
DataAccess.SaveChanges();
// this commit the outer transaction
scope.Complete();
}
}
catch (Exception exc)
{
// this also rollbacks any pending transactions
scope?.Dispose();
}
In EF Core 7 you can use bulk delete:
var ids = widgets.Select(x => x.Id).ToList();
await _mrVodDbContext.Widgets.Where(x => ids.Contains(x.Id)).ExecuteDeleteAsync();
EF Core generates:
DELETE FROM [i]
FROM [Widgets] AS [i]
WHERE [i].[Id] IN (4,3,2,1)
More about deleting or updating in release notes. https://learn.microsoft.com/en-us/ef/core/what-is-new/ef-core-7.0/whatsnew#basic-executedelete-examples
You can also use the DeleteAllOnSubmit() method by passing it your results in a generic list rather than in var. This way your foreach reduces to one line of code:
List<Widgets> widgetList = context.Widgets
.Where(w => w.WidgetId == widgetId).ToList<Widgets>();
context.Widgets.DeleteAllOnSubmit(widgetList);
context.SubmitChanges();
It probably still uses a loop internally though.
Thanh's answer worked best for me. Deleted all my records in a single server trip. I struggled with actually calling the extension method, so thought I would share mine (EF 6):
I added the extension method to a helper class in my MVC project and changed the name to "RemoveWhere". I inject a dbContext into my controllers, but you could also do a using.
// make a list of items to delete or just use conditionals against fields
var idsToFilter = dbContext.Products
.Where(p => p.IsExpired)
.Select(p => p.ProductId)
.ToList();
// build the expression
Expression<Func<Product, bool>> deleteList =
(a) => idsToFilter.Contains(a.ProductId);
// Run the extension method (make sure you have `using namespace` at the top)
dbContext.RemoveWhere(deleteList);
This generated a single delete statement for the group.
I came across a great library, Zack.EFCore.Batch. It will convert your expression into a simple DELETE FROM ... WHERE query (like some answers proposed): https://github.com/yangzhongke/Zack.EFCore.Batch
The usage example:
await ctx.DeleteRangeAsync<Book>(b => b.Price > n);
The Zack.EFCore.Batch library has lots of benefits over Z.EntityFramework.Extended (https://entityframework-extensions.net/), which does not have true async methods (they are just wrappers around the sync methods). You can run into lots of unexpected issues using that library in a high-load environment.
EF 6:
var assignmentAddedContent = dbHazirBot.tbl_AssignmentAddedContent.Where(a =>
a.HazirBot_CategoryAssignmentID == categoryAssignment.HazirBot_CategoryAssignmentID);
dbHazirBot.tbl_AssignmentAddedContent.RemoveRange(assignmentAddedContent);
dbHazirBot.SaveChanges();
Best: in EF6, use .RemoveRange().
Example:
db.Table.RemoveRange(db.Table.Where(x => x.Field == "Something"));
If you are using Generic Repository:
Inside Generic repository, following could be new method.
public void RemoveMultiple(Expression<Func<T, bool>> predicate)
{
IQueryable<T> query = _context.Set<T>().Where(predicate);
_context.Set<T>().RemoveRange(query.AsNoTracking());
}
Usage:
_unitOfWork.YOUR_ENTITY.RemoveMultiple(x => x.AccountId == accountId);
_unitOfWork.Complete();
context.Widgets.RemoveRange(context.Widgets.Where(w => w.WidgetId == widgetId).ToList());
context.SaveChanges();
See the answer 'favorite bit of code' that works
Here is how I used it:
// Delete all rows from the WebLog table via the EF database context object
// using a where clause that returns an IEnumerable typed list WebLog class
public IEnumerable<WebLog> DeleteAllWebLogEntries()
{
IEnumerable<WebLog> myEntities = context.WebLog.Where(e => e.WebLog_ID > 0);
context.WebLog.RemoveRange(myEntities);
context.SaveChanges();
return myEntities;
}
In EF 6.2 this works perfectly, sending the delete directly to the database without first loading the entities:
context.Widgets.Where(predicate).Delete();
With a fixed predicate it's quite straightforward:
context.Widgets.Where(w => w.WidgetId == widgetId).Delete();
And if you need a dynamic predicate have a look at LINQKit (Nuget package available), something like this works fine in my case:
Expression<Func<Widget, bool>> predicate = PredicateBuilder.New<Widget>(x => x.UserID == userID);
if (somePropertyValue != null)
{
predicate = predicate.And(w => w.SomeProperty == somePropertyValue);
}
context.Widgets.Where(predicate).Delete();

How to do recursive load with Entity framework?

I have a tree structure in the DB with a TreeNodes table. The table has nodeId, parentId and parameterId. In EF, the structure is like TreeNode.Children, where each child is a TreeNode...
I also have a Tree table which contains id, name and rootNodeId.
At the end of the day I would like to load the tree into a TreeView, but I can't figure out how to load it all at once.
I tried:
var trees = from t in context.TreeSet.Include("Root").Include("Root.Children").Include("Root.Children.Parameter")
.Include("Root.Children.Children")
where t.ID == id
select t;
This will get me the first 2 generations but no more.
How do I load the entire tree with all generations and the additional data?
I had this problem recently and stumbled across this question after I figured out a simple way to achieve results. I provided an edit to Craig's answer providing a 4th method, but the powers-that-be decided it should be another answer. That's fine with me :)
My original question / answer can be found here.
This works so long as your items in the table all know which tree they belong to (which in your case it looks like they do: t.ID). That said, it's not clear what entities you really have in play, but even if you've got more than one, you must have an FK on the Children entity if that's not a TreeSet.
Basically, just don't use Include():
var query = from t in context.TreeSet
where t.ID == id
select t;
// if TreeSet.Children is a different entity:
var query = from c in context.TreeSetChildren
// guessing the FK property TreeSetID
where c.TreeSetID == id
select c;
This will bring back ALL the items for the tree and put them all in the root of the collection. At this point, your result set will look like this:
-- Item1
-- Item2
-- Item3
-- Item4
-- Item5
-- Item2
-- Item3
-- Item5
Since you probably want your entities coming out of EF only hierarchically, this isn't what you want, right?
Then, exclude descendants present at the root level:
Fortunately, because you have navigation properties in your model, the child entity collections will still be populated as you can see by the illustration of the result set above. By manually iterating over the result set with a foreach() loop, and adding those root items to a new List<TreeSet>(), you will now have a list with root elements and all descendants properly nested.
If your trees get large and performance is a concern, you can sort your return set ASCENDING by ParentID (it's Nullable, right?) so that all the root items are first. Iterate and add as before, but break from the loop once you get to one that is not null.
var subset = query
// execute the query against the DB
.ToList()
// filter out non-root-items
.Where(x => !x.ParentId.HasValue);
And now subset will look like this:
-- Item1
-- Item2
-- Item3
-- Item4
-- Item5
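For the large-tree optimisation mentioned earlier (sort ascending by the nullable ParentID so all roots come first, then break at the first non-root), a minimal sketch, assuming the entity is TreeSet with a nullable ParentId, might look like this:
// Sketch: roots first, stop copying once the first non-root appears
var ordered = query.OrderBy(x => x.ParentId).ToList(); // NULL ParentId sorts first on most providers
var roots = new List<TreeSet>();
foreach (var node in ordered)
{
    if (node.ParentId.HasValue)
        break; // every remaining item is a descendant, already fixed up into its parent's Children
    roots.Add(node);
}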
About Craig's solutions:
You really don't want to use lazy loading for this!! A design built around the necessity for n+1 querying will be a major performance sucker. (Well, to be fair, if you're going to allow a user to selectively drill down the tree, then it could be appropriate. Just don't use lazy loading for getting them all up-front!!) I've never tried the nested set stuff, and I wouldn't suggest hacking EF configuration to make this work either, given there is a far easier solution. Another reasonable suggestion is creating a database view that provides the self-linking, then mapping that view to an intermediary join/link/m2m table. Personally, I found this solution to be more complicated than necessary, but it probably has its uses.
When you use Include(), you are asking the Entity Framework to translate your query into SQL. So think: How would you write an SQL statement which returns a tree of an arbitrary depth?
Answer: Unless you are using specific hierarchy features of your database server (which are not SQL standard, but supported by some servers, such as SQL Server 2008, though not by its Entity Framework provider), you wouldn't. The usual way to handle trees of arbitrary depth in SQL is to use the nested sets model rather than the parent ID model.
Therefore, there are three ways you can use to solve this problem:
1. Use the nested sets model (a minimal sketch follows below). This requires changing your metadata.
2. Use SQL Server's hierarchy features, and hack the Entity Framework into understanding them (tricky, but this technique might work). Again, you'll need to change your metadata.
3. Use explicit loading or EF 4's lazy loading instead of eager loading. This will result in many database queries instead of one.
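To give a flavour of the nested sets option: each row stores Left/Right bounds instead of (or in addition to) a ParentId, and an entire subtree comes back in a single, translatable query. A minimal sketch (the TreeNodes set and the Left/Right columns are assumptions, not part of the question's model):
// Sketch: nested sets - every descendant of 'root' lies between its Left and Right bounds
var root = context.TreeNodes.Single(n => n.Id == rootId);
var wholeTree = context.TreeNodes
    .Where(n => n.Left >= root.Left && n.Right <= root.Right)
    .OrderBy(n => n.Left) // depth-first order
    .ToList();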
I wanted to post up my answer since the others didn't help me.
My database is a little different, basically my table has an ID and a ParentID. The table is recursive. The following code gets all children and nests them into a final list.
public IEnumerable<Models.MCMessageCenterThread> GetAllMessageCenterThreads(int msgCtrId)
{
var z = Db.MCMessageThreads.Where(t => t.ID == msgCtrId)
.Select(t => new MCMessageCenterThread
{
Id = t.ID,
ParentId = t.ParentID ?? 0,
Title = t.Title,
Body = t.Body
}).ToList();
foreach (var t in z)
{
t.Children = GetChildrenByParentId(t.Id);
}
return z;
}
private IEnumerable<MCMessageCenterThread> GetChildrenByParentId(int parentId)
{
var children = new List<MCMessageCenterThread>();
var threads = Db.MCMessageThreads.Where(x => x.ParentID == parentId);
foreach (var t in threads)
{
var thread = new MCMessageCenterThread
{
Id = t.ID,
ParentId = t.ParentID ?? 0,
Title = t.Title,
Body = t.Body,
Children = GetChildrenByParentId(t.ID)
};
children.Add(thread);
}
return children;
}
For completeness, here's my model:
public class MCMessageCenterThread
{
public int Id { get; set; }
public int ParentId { get; set; }
public string Title { get; set; }
public string Body { get; set; }
public IEnumerable<MCMessageCenterThread> Children { get; set; }
}
I wrote something recently that does N+1 selects to load the whole tree, where N is the number of levels of your deepest path in the source object.
This is what I did, given the following self-referencing class
public class SomeEntity
{
    public int Id { get; set; }
    public int? ParentId { get; set; }
    public string Name { get; set; }
}
I wrote the following DbSet helper
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;
using System.Threading.Tasks;
namespace Microsoft.EntityFrameworkCore
{
public static class DbSetExtensions
{
public static async Task<TEntity[]> FindRecursiveAsync<TEntity, TKey>(
this DbSet<TEntity> source,
Expression<Func<TEntity, bool>> rootSelector,
Func<TEntity, TKey> getEntityKey,
Func<TEntity, TKey> getChildKeyToParent)
where TEntity: class
{
// Keeps a track of already processed, so as not to invoke
// an infinte recursion
var alreadyProcessed = new HashSet<TKey>();
TEntity[] result = await source.Where(rootSelector).ToArrayAsync();
TEntity[] currentRoots = result;
while (currentRoots.Length > 0)
{
TKey[] currentParentKeys = currentRoots.Select(getEntityKey).Except(alreadyProcessed).ToArray();
alreadyProcessed.AddRange(currentParentKeys);
Expression<Func<TEntity, bool>> childPredicate = x => currentParentKeys.Contains(getChildKeyToParent(x));
currentRoots = await source.Where(childPredicate).ToArrayAsync();
}
return result;
}
}
}
Whenever you need to load a whole tree you simply call this method, passing in three things
The selection criteria for your root objects
How to get the property for the primary key of the object (SomeEntity.Id)
How to get the child's property that refers to its parent (SomeEntity.ParentId)
For example
SomeEntity[] myEntities = await DataContext.SomeEntity.FindRecursiveAsync(
    rootSelector: x => x.Id == 42,
    getEntityKey: x => x.Id,
    getChildKeyToParent: x => x.ParentId);
Alternatively, if you can add a RootId column to the table then for each non-root entry you can set this column to the ID of the root of the tree. Then you can fetch everything with a single select
DataContext.SomeEntity.Where(x => x.Id == rootId || x.RootId == rootId)
For an example of loading child objects, I'll use a Comment object; each comment has a possible child comment.
private static void LoadComments(<yourObject> q, Context yourContext)
{
    if (null == q || null == yourContext)
    {
        return;
    }
    yourContext.Entry(q).Reference(x => x.Comment).Load();
    Comment curComment = q.Comment;
    while (null != curComment)
    {
        curComment = LoadChildComment(curComment, yourContext);
    }
}

private static Comment LoadChildComment(Comment c, Context yourContext)
{
    if (null == c || null == yourContext)
    {
        return null;
    }
    yourContext.Entry(c).Reference(x => x.ChildComment).Load();
    return c.ChildComment;
}
If your entity held collections of itself, you would need to use Collection() instead of Reference() and do the same sort of drilling down. At least that's the approach I took in this scenario, as we were dealing with Entity Framework and SQLite.
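A rough sketch of that collection-based variant (the Children collection on Comment is an assumption for illustration, not part of the original example):
// Sketch: explicitly load a self-referencing collection, then recurse into each child
// (assumes Comment has an ICollection<Comment> Children navigation and no cycles)
private static void LoadCommentTree(Comment c, Context yourContext)
{
    if (null == c || null == yourContext)
    {
        return;
    }
    yourContext.Entry(c).Collection(x => x.Children).Load();
    foreach (var child in c.Children)
    {
        LoadCommentTree(child, yourContext);
    }
}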
This is an old question, but the other answers either had n+1 database hits or their models were conducive to bottom-up (trunk to leaves) approaches. In this scenario, a tag list is loaded as a tree, and a tag can have multiple parents. The approach I use only has two database hits: the first to get the tags for the selected articles, then another that eager loads a join table. Thus, this uses a top-down (leaves to trunk) approach; if your join table is large or if the result cannot really be cached for reuse, then eager loading the whole thing starts to show the tradeoffs with this approach.
To begin, I initialize two HashSets: one to hold the root nodes (the resultset), and another to keep a reference to each node that has been "hit."
var roots = new HashSet<AncestralTagDto>(); //no parents
var allTags = new HashSet<AncestralTagDto>();
Next, I grab all of the leaves that the client requested, placing them into an object that holds a collection of children (but that collection will remain empty after this step).
var startingTags = await _dataContext.ArticlesTags
.Include(p => p.Tag.Parents)
.Where(t => t.Article.CategoryId == categoryId)
.GroupBy(t => t.Tag)
.ToListAsync()
.ContinueWith(resultTask =>
resultTask.Result.Select(
grouping => new AncestralTagDto(
grouping.Key.Id,
grouping.Key.Name)));
Now, let's grab the tag self-join table, and load it all into memory:
var tagRelations = await _dataContext.TagsTags.Include(p => p.ParentTag).ToListAsync();
Now, for each tag in startingTags, add that tag to the allTags collection, then travel down the tree to get the ancestors recursively:
foreach (var tag in startingTags)
{
allTags.Add(tag);
GetParents(tag);
}
return roots;
Lastly, here's the nested recursive method that builds the tree:
void GetParents(AncestralTagDto tag)
{
    var parents = tagRelations.Where(c => c.ChildTagId == tag.Id).Select(p => p.ParentTag);
    if (parents.Any()) //then it's not a root tag; keep climbing down
    {
        foreach (var parent in parents)
        {
            //have we already seen this parent tag before? If not, instantiate the dto.
            var parentDto = allTags.SingleOrDefault(i => i.Id == parent.Id);
            if (parentDto is null)
            {
                parentDto = new AncestralTagDto(parent.Id, parent.Name);
                allTags.Add(parentDto);
            }
            parentDto.Children.Add(tag);
            GetParents(parentDto);
        }
    }
    else //the tag is a root tag, and should be in the root collection. If it's not in there, add it.
    {
        //this block could be simplified to just roots.Add(tag), but it's left this way for other logic.
        var existingRoot = roots.SingleOrDefault(i => i.Equals(tag));
        if (existingRoot is null)
            roots.Add(tag);
    }
}
Under the covers, I am relying on the properties of a HashSet to prevent duplicates. To that end, it's important that the intermediate object you use (I used AncestralTagDto here, and its Children collection is also a HashSet) overrides the Equals and GetHashCode methods as appropriate for your use-case.
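For example, a minimal sketch of such a DTO keyed on Id (the exact equality rule is up to your use-case; this simply mirrors the constructor and Children collection used above):
// Sketch: identity-by-Id so HashSet<AncestralTagDto> de-duplicates nodes
public class AncestralTagDto
{
    public AncestralTagDto(int id, string name)
    {
        Id = id;
        Name = name;
        Children = new HashSet<AncestralTagDto>();
    }

    public int Id { get; }
    public string Name { get; }
    public HashSet<AncestralTagDto> Children { get; }

    public override bool Equals(object obj) => obj is AncestralTagDto other && other.Id == Id;
    public override int GetHashCode() => Id.GetHashCode();
}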