Unexpected removal of relationship using SDN4 - spring-data-neo4j-4

A test in my data model that previously passed is now failing after I converted some neo4j-ogm code to use spring-data. However, I'm fairly sure spring-data-neo4j itself is not the cause of the issue; it's more likely something conceptual I'm missing (about sessions or dirtiness or something?).
The behavior of the test is:
Save Object A
Save Object B with relationship to A
Set property of Object A
Save Object A
Previously, Node A was updated with its property and the relationship between Node A and Node B still existed in the database after the test ran.
After the conversion, however, the result is the same except that the relationship between Node A and Node B no longer exists in the database (it is removed) after Object A is saved the second time.
Could anything I've done have changed this behaviour? What is the expected behaviour?
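For concreteness, the failing test boils down to roughly the following sketch. All entity, relationship and repository names here are invented for illustration; the real code differs.
import org.neo4j.ogm.annotation.GraphId
import org.neo4j.ogm.annotation.NodeEntity
import org.neo4j.ogm.annotation.Relationship
import org.springframework.data.neo4j.repository.GraphRepository

// Hypothetical entities: B holds the relationship to A.
@NodeEntity
class A(@GraphId var id: Long? = null, var name: String? = null)

@NodeEntity
class B(@GraphId var id: Long? = null,
        @Relationship(type = "RELATES_TO") var a: A? = null)

interface ARepository : GraphRepository<A>
interface BRepository : GraphRepository<B>

// The body of the test, with the repositories injected by Spring.
fun reproduce(aRepository: ARepository, bRepository: BRepository) {
    val a = A()
    aRepository.save(a)              // 1. save Object A

    val b = B(a = a)
    bRepository.save(b)              // 2. save Object B with a relationship to A

    a.name = "updated"
    aRepository.save(a)              // 3 + 4. set a property of Object A and save A again
    // Before the conversion the relationship between A and B survived this last save;
    // now it is removed from the database.
}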

Related

Kotlin immutable entities changing unexpectedly when used with JPA

In our project we are using Kotlin with JPA. All of our entities are immutable, so it is not possible to set fields of our entities directly. You have to create a new instance by using the copy method. If you want these changes to be reflected in the database, you must persist this newly created entity with an explicit function call.
At first, this approach looked perfect to us. However, we are now having problems: some of our instances change unexpectedly in memory.
val instance1 = repository.findById(entityId)
repository.save(instance1.copy(deletedAt = Instant.now()))
..
..
assertNull(instance1.deletedAt)
In the code snippet above, instance1 is retrieved from the database, its deletedAt field is set with the copy method, and the new instance created by copy is passed to the repository's save method. We don't set any field of instance1; we create a new instance to make these changes. However, the result on the assert line is unexpectedly not null.
It seems there is a conflict between the JPA persistence context (first-level cache) and Kotlin's immutability and copy-method logic.
Has anyone faced this problem, or does anyone have suggestions or best practices for using JPA with immutable Kotlin entities?
I suspect the problem is that you're ignoring the return value from save().  Its docs say:
Saves a given entity. Use the returned instance for further operations as the save operation might have changed the entity instance completely.
But you're not doing that; you're instead continuing to use the original instance which (as that says) may have changed.
Instead, store the return value from save(), and use that thereafter.  (Either by making instance1 a var, or creating a new val and not referring to instance1 afterward.)
(This isn't a Kotlin-specific problem; it's exactly the same in Java. JPA, Spring, etc. work their magic by futzing with the bytecode, so they can do things your code can't, such as changing immutable values. Most of the time you can ignore it, but this case makes it obvious.)
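For instance, a minimal sketch of that fix, mirroring the names from the question (and assuming, as the snippet above does, that findById returns the entity directly):
import java.time.Instant

// Keep the instance returned by save() and use it from then on.
val instance1 = repository.findById(entityId)
val saved = repository.save(instance1.copy(deletedAt = Instant.now()))

// Work with `saved` afterwards; don't rely on `instance1` any more.
assertNotNull(saved.deletedAt)   // assertNotNull comes from your test framework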
Immutable types are not compatible with how JPA works.
JPA is built around the concept of a Unit of Work, which means objects retrieved from the database live in a persistence context (first-level cache) and are discarded once the EntityManager is closed (in a web application, at the end of the HTTP request).
When you use the copy method on an entity you just retrieved from the database, the copied object is considered detached from the current session, meaning that changes to it cannot be tracked by JPA, and the underlying implementation (Hibernate / EclipseLink) has a hard time figuring out which SQL statement needs to be fired (INSERT, UPDATE or DELETE?).
Things get far more complex when you have a complex object graph with OneToMany associations and cascading options.
So my recommendation, unfortunately, is to avoid immutable types when using JPA.
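To make the detachment issue concrete, here is a rough sketch using a hypothetical Ticket entity and a plain EntityManager (not the code from the question):
import java.time.Instant
import javax.persistence.Entity
import javax.persistence.EntityManager
import javax.persistence.Id

// Hypothetical immutable entity, in the spirit of the question's setup.
@Entity
data class Ticket(
    @Id val id: Long = 0,
    val deletedAt: Instant? = null
)

fun softDelete(entityManager: EntityManager) {
    val managed = entityManager.find(Ticket::class.java, 1L)  // lives in the persistence context
    val copy = managed.copy(deletedAt = Instant.now())        // new instance, unknown to the context

    // persist(copy) is not allowed here: copy is detached and its id already exists.
    // merge() copies the new state onto a managed instance and schedules an UPDATE instead.
    val merged = entityManager.merge(copy)
}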

Can I get an object from the database without knowing what type the object is?

I have the class GraphHandler. Inside this superclass I try to handle restoring the object that was last saved to the database. For this I'm using the primaryKey.
The point is that at the time of restoring I don't yet know which type to expect. So I tried this:
let realm = ClientManager.cacheRealm()
realm.object(ofType: Object.self, forPrimaryKey: "uniqueid")
But I get the error:
Terminating app due to uncaught exception RLMException, reason: 'Object type RealmSwiftObject is not managed by the Realm. If using a custom objectClasses / objectTypes array in your configuration, add RealmSwiftObject to the list of objectClasses / objectTypes.'
I'm trying to do it in a way where the handler doesn't need to know in advance which type of object was saved last. What can solve this? I think that implementing generics won't do any good, as the type can't be changed on the fly.
By the time you call realm.object, the type of the object needs to be known, since with this function Realm only searches for objects of a specific type. Moreover, the type of the primary key can differ as well, so the type of object you are looking for needs to be known before querying Realm.
Querying all types and filtering afterwards is not an option at the moment, since Results can only store a single type of object. However, if you really need to query all types with a single query to get the last database entry regardless of its class, have a look at this comment on a related GitHub issue, where Realm engineers suggest a workaround.
Another workaround you could try is the following: create a TimeStamps class that is managed by Realm, has only a single instance in Realm, and has a one-to-one relation to each of your other Realm classes. The object on the other side of each one-to-one relation would always be the object of that class that was most recently added to Realm. With this approach, if you are looking for the latest object added to Realm, you can use a simple query that retrieves the single TimeStamps object and then filter its one-to-one relations for the object added last. Of course, for this to work, you need to associate the creation date of your objects with the relationships stored in your TimeStamps object and update these relationships in each of your write transactions.
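Purely to illustrate the shape of that workaround (sketched here with Realm's Kotlin SDK rather than Realm Swift, and with invented class names, so treat it only as a sketch of the data model):
import io.realm.kotlin.types.RealmObject

// Hypothetical Realm model classes.
class Message : RealmObject {
    var text: String = ""
    var createdAtEpochMillis: Long = 0
}

class Photo : RealmObject {
    var path: String = ""
    var createdAtEpochMillis: Long = 0
}

// Kept as a single instance in the Realm; each relation always points to the
// most recently written object of that class. Update these links in every
// write transaction that adds an object.
class TimeStamps : RealmObject {
    var lastMessage: Message? = null
    var lastPhoto: Photo? = null
}

// To find the latest object overall, fetch the single TimeStamps instance and
// compare the createdAtEpochMillis values of the linked objects.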

Data retrieval fails after clearing Local

I'm writing unit tests for Entity Framework, using the Effort in-memory DB.
In one set of tests the EF part creates an object and adds some existing objects to a child list in it (think Master / Detail). Then it calls SaveChanges.
In the unit test I then retrieve the record from EF and check that it actually got there and the children are in place: they are.
However, I want to be sure that I'm not just reading the data from the cache and that it actually got persisted. So I Clear() the Local cache in the DbSet.
When I do this, I retrieve the main record OK, but the child records are NOT retrieved.
This is true if I run the code against SQL Server as well as Effort (so it's not Effort).
If I dispose the context and create a new one, the same method retrieves the data correctly, including the children.
So something about clearing the local cache is interfering with the data retrieval. I've tried clearing just the cache for the main record, and also for both that and the child records - no difference.
Any suggestions would be appreciated.

Copy records between two databases using EF

I need to copy data from one database to another with EF. For example, I have the following table relations: Forms -> FormVersions -> FormLayouts... We have different forms in both databases and we want to collect them in one DB. Basically I want to load a Form object recursively from one DB and save it to another DB with all of its references. I also need to change the IDs of the object and its related objects if objects with the same ID already exist in the second database.
So far I have the following code:
Form form = null;
using (var context = new FormEntities())
{
    form = (from f in context.Forms
            join fv in context.FormVersions on f.ID equals fv.FormID
            where f.ID == 56
            select f).First();
}
var context1 = new FormEntities("name=FormEntities1");
context1.AddObject("Forms", form);
context1.SaveChanges();
I'm receiving the error: "The EntityKey property can only be set when the current value of the property is null."
Can you help with the implementation?
The simplest solution would be to create a copy of your Form (a new object) and add that new object. Otherwise you can try:
Call context.Detach(form)
Set form's EntityKey to null
Call context1.AddObject(form)
I would first second E.J.'s answer. Assuming though that you are going to use Entity Framework, one of the main problem areas that you will face is relationship management. Your code should use the Include method to ensure that related objects are included in the results of a select operation. The join that you have will not have this effect.
http://msdn.microsoft.com/en-us/library/bb738708.aspx
Further, detaching an object will not automatically detach the related objects. You can detach them in the same way; however, the problem here is that as each object is detached, the relationships it held to other objects within the context are broken.
Manually restoring the relationships may be an option for you; however, it may be worthwhile looking at EntityGraph. This framework allows you to define object graphs and then perform operations such as detach on them. The entire graph is detached in a single operation with its relationships intact.
My experience with this framework has been in relation to RIA Services and Silverlight; however, I believe these operations are also supported in .NET.
http://riaservicescontrib.codeplex.com/wikipage?title=EntityGraphs
Edit 1: I just checked the EntityGraph docs and see that DetachEntityGraph is in the RIA-specific layer, which unfortunately rules it out as an option for you.
Edit 2: Alex James's answer to the following question is a solution to your problem. Don't load the objects into the context to begin with - use the NoTracking option. That way you don't need to detach them, which is what causes the problem.
Entity Framework - Detach and keep related object graph
If you are only doing a few records, Ladislav's suggestion will probably work, but if you are moving lots of data, you should/could consider doing this move in a stored procedure. The entire operation can be done at the server, with no need to move objects from the db server to your front end and then back again. A single SP call would do it all.
The performance will be a lot better, which may or may not matter in your case.

How does deleteObject: work in Core Data?

I just took the plunge and rewrote my App on top of CoreData (previously I was using my own internal save format).
Things are mostly working, although I'm a little confused by the behaviour of deleteObject:.
I have an object that is part of my graph, and when I delete it nothing seems to happen to the object. The object has relationships where some of them are "Cascade" and some are "Nullify". Every relationship to / from the object has an inverse relationship.
After I delete the object, the only thing that seems to change is that the "isDeleted" flag is set on my object. All of the relationships exist as they did before.
If I try to find the objects using an NSFetchRequest, it does not find the deleted objects. However, if I traverse my graph using the KVC relationships, the NSSet returned contains all of the objects, including the deleted ones.
After I send the save: method to my ManagedObjectContext, then everything is as I expect.
When I do a deletion, do I need to manually nil out relationships I don't want, or do I need to continuously save to keep my data sane? This seems very counterintuitive to me.
Is there anything I can do to "commit" the deletion, or at least make my object graph sane, short of doing a save? It seems a little drastic to be doing a save every time I want to modify my graph.
Thanks,
Ron
p.s. Here is some of the behaviour that seems strange to me:
Before deleting the object, this is the "description" of the parent object which has a categoryObjs "to many" relationship:
categoryObjs = (
"0x613e1a0 <x-coredata://1A1AE9E7-66B1-4F4D-A7AB-07D4504CAE2C/TestCategory/p9>",
"0x613e1b0 <x-coredata://1A1AE9E7-66B1-4F4D-A7AB-07D4504CAE2C/TestCategory/p12>",
"0x613e190 <x-coredata://1A1AE9E7-66B1-4F4D-A7AB-07D4504CAE2C/TestCategory/p7>"
);
After deleting the "p12" object (the middle one above), the state of the relationship does not change when accessed through KVC. If I try to fetch the TestCategory entities, then only two are found.
After a "save:" the p12 object disappears:
categoryObjs = (
"0x613e1a0 <x-coredata://1A1AE9E7-66B1-4F4D-A7AB-07D4504CAE2C/TestCategory/p9>",
"0x613e190 <x-coredata://1A1AE9E7-66B1-4F4D-A7AB-07D4504CAE2C/TestCategory/p7>"
);
Every time you call save:, your Managed Object Context must go back to the store and actually write the changes. This is expensive. Therefore, deleteObject: merely marks the object as deleted, which will actually be applied the next time you save. (Remember that this helps out with undo functionality too; it's just going against the way you want to do things.)
According to the documentation, deleteObject: marks the object for deletion, and the isDeleted property just states whether the object is going to be deleted upon the next commit. Additionally, deleteObject: will remove the receiver from the context if it was never committed.
For example (where Objects A and B are NSManagedObject instances):
Create Object A
Save MOC
Delete Object A
Object A has been marked for deletion but is not actually deleted until you perform step 2 again.
Contrast with this:
Create Object B
Delete Object B
Object B is gone; since it was never saved, there is no "marking for deletion". It's simply gone.
Edit:
I'm just curious, are you using an NSFetchedResultsController for your tableview's datasource? It's worth looking into, if you haven't already.
I think Core Data wants to minimize memory and I/O usage in deleteObject:, and do all the heavier jobs, like writing the SQLite file, in save:. That could be the most time-efficient way.