Data retrieval fails after clearing Local - entity-framework

I'm writing unit tests for Entity Framework, using the Effort in-memory DB.
In one set of tests the EF code creates an object and adds some existing objects to a child list on it (think Master/Detail). Then it calls SaveChanges.
In the unit test I then retrieve the record from EF and check that it actually got there and the children are in place: they are.
However, I want to be sure that I'm not just reading the data from the cache and that it actually got persisted. So I Clear() the Local cache in the DbSet.
When I do this, I retrieve the main record OK, but the child records are NOT retrieved.
This is true if I run the code against SQL Server as well as Effort (so it's not Effort).
If I dispose the context and create a new one, the same method retrieves the data correctly, including the children.
So something about clearing the local cache is interfering with data retrieval. I've tried clearing just the cache for the main record and also the caches for both that and the child records - no difference.
Any suggestions would be appreciated.
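For reference, a minimal sketch of the failing scenario (MyDbContext, Master, and Detail are hypothetical names, not the actual code):

using System.Linq;

using (var context = new MyDbContext())
{
    var detail = context.Details.First();   // an existing child record
    var master = new Master { Name = "M1" };
    master.Details.Add(detail);             // add existing children
    context.Masters.Add(master);
    context.SaveChanges();

    // Clear the local cache so the next query can't be served purely
    // from tracked instances.
    context.Masters.Local.Clear();

    // The main record comes back, but its Details collection is empty.
    var reloaded = context.Masters.Single(m => m.Name == "M1");
}
// Querying from a freshly created context returns the children as expected.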

Related

JPA first-level cache and when it's filled

Working with Spring Data JPA and reading Hibernate first level cache is missed, the answer says: "Hibernate does not cache queries and query results by default. The only thing the first-level cache is used for is that when you call EntityManager.find() you will not see a SQL query executing. And the cache is used to avoid object creation if the entity is already loaded."
So, if I get an entity not by its ID but by other criteria, and I update some property, I should not see an UPDATE SQL statement inside a transactional method, because the entity has not been stored in the first-level cache, right?
According to the above answer, if I get some list of entities, they will not be stored in the first-level cache no matter the criteria I use to find them, right?
When a @Transactional(propagation = Propagation.NEVER) method loads the same entity by its ID two times, isn't it supposed to hit the database two times, because each load will run in its own "transaction" and will have its own persistence context? What is the expected behaviour in this case?
Thanks

Avoiding OptimisticLockException in JEE JSF web apps using JPA where multiple commits are possible within the same view

Working with JEE and JPA, using optimistic locking with @Version within a web application, is there a recommended approach for dealing with updates (entityManager#merge) if the same user can make multiple commits of different sets of entity objects within the same view, including updating the same entity object more than once?
The issue is that entityManager#merge does not update the version number in the in-memory object it received, but rather returns a new object that should be used in any future merge. This means that if there are complex relationships in memory, all of them have to be updated with this new object. Otherwise, the next time a merge is attempted from the same view, an OptimisticLockException will be thrown.
Rather than having to traverse the view's model and update every instance of a merged object (which opens the door to a high risk of runtime errors), one option would be to ensure no partial work is allowed in the view and then rebuild the view's model after every update. This seems rather primitive and non-performant. Wondering if there are any other recommended approaches. Thanks!
Here is an example scenario:
User "Manager" has a list of "Employees" in her web view. "Manager" selects an "Employee" and changes their phone number and saves. Manager realizes she had the phone number wrong, corrects it, and saves again within the same view (JSF) or web session.
The web view maintains a list of Employees, which is needed to make a selection. After the first update, without refetching that list from the DB or updating the list in memory, the new entity version number will not propagate to the view, causing an OptimisticLockException on the second save. Is there an option other than refetching the list after the first merge, updating it in memory after the first merge, or doing a find before every merge (merging the in-memory object into the found object, then committing)?

Performance: Entity Framework 6 startup and update

I'm using lazy loading and pre-generated views.
I create the context.
I get all my objects (about 6,000) with their navigation properties filled in, in about 3 seconds. That's OK.
I update all my objects a first time (4 minutes...).
I update all my objects a second time (6 seconds).
I suspect lazy loading is running in the background and the update must be doing something that makes it loop again, or EF startup is still loading.
I'm on EF 6.1 and the data is hierarchical.
The database size is about 6,000 rows across 30 tables.
The EF model is Database First.
Any workaround?
If your concern is about the time difference between the two updates, I suspect that the second one runs faster because fewer objects have been modified.
Why do you need all the objects loaded into your context? What is the lifecycle of your context?
The general recommendation is that you create single-use contexts - for a single web request or a single Windows Form.
You'll also see a lot of examples with a using statement, where the life of the context is purposely kept very short. Letting the context live too long can increase memory usage and the possibility of concurrency problems. Your database and other layers also do their own caching.
Lastly, lazy loading is fine as long as you don't know whether you're going to need things. Consider explicitly loading data if you know you're going to need related records and want to avoid multiple round trips; see the sketch after the link below.
http://msdn.microsoft.com/en-us/data/jj574232.aspx
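As an illustration of both suggestions, here is a minimal sketch combining a short-lived context with eager loading (MyDbContext, Orders, Lines, and customerId are placeholder names, not from the question):

using System.Data.Entity;   // EF6: provides the lambda Include() extension
using System.Linq;

int customerId = 42;        // example key

// Short-lived context: created for one unit of work, then disposed.
using (var context = new MyDbContext())
{
    // Eager-load the related records in one round trip instead of
    // triggering a separate lazy load for each parent object.
    var orders = context.Orders
        .Include(o => o.Lines)
        .Where(o => o.CustomerId == customerId)
        .ToList();
}   // disposed here; nothing keeps tracking thousands of objects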

How to restrict the growth of the local collection of entities in EF 4.1?

Hi, I have written a parser which parses code files and saves language constructs (properties, methods, events, functions, subroutines) to a DB using EF 4.1.
There is only a single instance of DbContext, which is used throughout the parsing.
Parsing each file creates various entity objects, and the DbContext is saved once that file is parsed.
But now, even after the save, if we check the local collection of any entity it still shows the objects in memory, e.g. DbContext.EntityName.Local.
So, after parsing a couple of files, the in-memory local collection of entities keeps growing and consumes a considerable amount of memory, eventually hanging the entire process.
Is there a way to clear the local collection of entities without calling Dispose()?
Contexts should be short-lived for this very reason.
You could always detach your entities after you have saved them.
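A minimal sketch of that detach-after-save pattern, assuming an EF 4.1-style DbContext with a hypothetical Constructs set:

// Save the entities parsed from one file...
context.SaveChanges();

// ...then detach them so the context does not accumulate every object
// ever parsed. Snapshot Local with ToList() first, since detaching
// entities modifies the collection while iterating.
foreach (var entity in context.Constructs.Local.ToList())
{
    context.Entry(entity).State = EntityState.Detached;
}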

How do I detach objects in Entity Framework Code First?

There is no Detach(object entity) method on DbContext.
Do I have the ability to detach objects in EF Code First?
This is an option:
dbContext.Entry(entity).State = EntityState.Detached;
If you want to detach an existing object, follow @Slauma's advice. If you want to load objects without tracking changes, use:
var data = context.MyEntities.AsNoTracking().Where(...).ToList();
As mentioned in the comments, this will not completely detach the entities. They are still attached and lazy loading works, but the entities are not change-tracked. Use this, for example, if you want to load entities only to read their data and you don't plan to modify them.
Both previous answers provide good instructions; however, both might leave the entities still loaded into EF's context and/or its change tracker.
This is not a problem when you are changing small data sets, but it becomes an issue when changing large ones: EF's memory and resource usage increase, which in turn reduces performance as it tracks more data/entities.
Both other approaches are valid, but in this case Microsoft recommends clearing the change tracker (available since EF Core 5.0) instead of detaching the entities individually.
Clearing the change tracker inside the data-changing loop (one that changes a chunk of data at a time, for instance) can save you from this trouble:
context.ChangeTracker.Clear();
This will unload/detach all entities and their change-tracker references from the context, so use it with care, after your context.SaveChanges().
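For example, a batch-processing loop using this approach might look like the following sketch (EF Core 5.0 or later; Items, newItems, and batchSize are illustrative names):

const int batchSize = 1000;

// Process a large data set in chunks, clearing the change tracker after
// each SaveChanges so tracked entities don't pile up in memory.
foreach (var batch in newItems.Chunk(batchSize))   // Enumerable.Chunk, .NET 6+
{
    context.Items.AddRange(batch);
    context.SaveChanges();

    // Detach everything tracked so far; the context stays lightweight.
    context.ChangeTracker.Clear();
}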