Persisting an entity that holds a reference to another entity which is not under the current persistence context is not possible. When this happens, one needs to fetch the referenced entity and set it on the original entity in order to respect the relationship and be allowed to persist it.
When I need to accomplish this I usually use the EntityManager's find method, but that hits the database and fetches the entire entity along with any relationships that are not annotated for lazy loading. I was happy to find out about getReference, which supposedly won't hit the database but instead returns a proxy in which only the primary key is available, and that is really all that is required in this type of situation.
Unfortunately, after some debugging, I find that I can view all the information about the getReference'd entity, not just the primary key, when I inspect it via Eclipse debug mode.
Am I missing something? Is the debug mode deceiving me? Could it be fetching the information, like a getter method would, when used on the proxy reference?
Thanks in advance
When you inspect it using the Eclipse debugger, the debugger initializes the proxy. Just turn on SQL logging, execute the em.getReference() call, and verify that no select statement has been executed by your JPA provider.
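As a minimal sketch of the pattern (the entity and field names here are made up), with SQL logging on you should see the insert for the owning entity but no select for the referenced one:

// Hypothetical entities: Order holds a @ManyToOne reference to Customer.
Customer customerProxy = em.getReference(Customer.class, customerId); // no SELECT issued
Order order = new Order();
order.setCustomer(customerProxy); // only the id is needed to write the foreign key
em.persist(order); // a single INSERT for Order, nothing loaded for Customer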
I'm playing around with spring-data-jdbc and discovered a problem which I can't solve using Google.
No matter what I try to do, I just can't push a trivial object into the database (Bean1.java:25):
carRepository.save(new Car(2L, "BMW", "5"));
Both without and with a TransactionManager + @Transactional, the database (apparently) does not commit the record.
The code is based on a Postgres database, but you might also simply use H2 instead and get the same result.
Here is the (minimalistic) source code:
https://github.com/bitmagier/spring-data-jdbc-sandbox/tree/stackoverflow-question
Can somebody tell me, why the car is not inserted into the database?
This is not related to transactions not working.
Instead, it's about Spring Data JDBC considering your instance an existing instance that needs updating (instead of inserting).
You can verify this is the problem by activating logging for org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate. You should see an update but no insert.
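For example, assuming a Spring Boot setup, a single line in application.properties enables that logging:

logging.level.org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate=DEBUG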
By default, Spring Data JDBC considers an entity new when its id is of an object type with a value of null, or of a primitive type (e.g. int or long) with a value of 0.
If your entity has an attribute annotated with @Version, that attribute will be used to determine whether the instance is new.
You have the following options in order to make it work:
Set the id to null and configure your database schema so that it will automatically create a new value on insert. After the save your entity instance will contain the generated value from the database.
Note: Spring Data JDBC will set the id even if it is final in your entity.
Leave the id null and set it to the desired value in a Before-Convert listener.
Let your entity implement Persistable. This allows you to control when an entity is considered new (a rough sketch follows this list). You'll probably need a listener as well so you can let the entity know it is no longer new.
Beginning with version 1.1 of Spring Data JDBC, you'll also be able to use a JdbcAggregateTemplate to do a direct insert without inspecting the id; see https://jira.spring.io/browse/DATAJDBC-282. Of course, you can do that in a custom method of your repository, as is done in this example: https://github.com/spring-projects/spring-data-examples/pull/441
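As a rough sketch of the Persistable option (the fields follow the Car from the question; the listener wiring that calls markNotNew() is left out and may differ in your setup):

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.Transient;
import org.springframework.data.domain.Persistable;

public class Car implements Persistable<Long> {

    @Id
    private Long id;
    private String make;
    private String model;

    @Transient // not a column; only steers insert-vs-update detection
    private boolean isNew = true;

    public Car(Long id, String make, String model) {
        this.id = id;
        this.make = make;
        this.model = model;
    }

    @Override
    public Long getId() { return id; }

    @Override // save() performs an INSERT while this returns true
    public boolean isNew() { return isNew; }

    // Call this from a listener/callback after the first save.
    void markNotNew() { this.isNew = false; }
}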
I just found some really strange behaviour which, as it turns out, is not so strange at all.
My select statement (query from the database) worked correctly only the first time; the second time, the query result came from the cache.
Inside a Hub method I read something from the database every 10 seconds and return the result to all connected clients. But if some API changes this data, the Hub context does not read the actual data.
In this thread I found this:
When you use EF it by default loads each entity only once per context. The first query creates entity instance and stores it internally. Any subsequent query which requires entity with the same key returns this stored instance. If values in the data store changed you still receive the entity with values from the initial query. This is called Identity map pattern. You can force the object context to reload the entity but it will reload a single shared instance.
So my question is: how do I properly use EF Core inside a SignalR Core hub method?
I could use AsNoTracking, but I would like to use some global setting. A developer can easily forget to add AsNoTracking, and this could mean serving outdated data to the user.
I would like to write some code in my BaseHub class which will tell the context not to track data. If I change entity properties, SaveChanges should update the data. Can this be achieved? It is hard to remember every time to add AsNoTracking when querying from a hub method.
I would like to write some code in my BaseHub class which will tell the context not to track data.
The default query tracking behavior is controlled by the ChangeTracker.QueryTrackingBehavior property, with a default value of TrackAll (i.e. tracking).
You can change it to NoTracking and then use AsTracking() for queries that need tracking. It's a matter of which are more commonly needed.
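A minimal sketch of both places you can set it (the context and entity names here are made up):

// Globally for a context type, via the options builder:
protected override void OnConfiguring(DbContextOptionsBuilder options)
    => options.UseQueryTrackingBehavior(QueryTrackingBehavior.NoTracking);

// Or per context instance, e.g. where your BaseHub obtains its context:
db.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;

// Individual queries that do need change tracking opt back in:
var item = db.Items.AsTracking().Single(i => i.Id == id);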
If I change entity properties, SaveChanges should update data.
This is not possible if the entity is not tracked.
If you actually want tracking queries with a "database wins" strategy, I'm afraid it's not currently possible in EF Core. I think EF6 object context services had an option for specifying "client wins" vs. "database wins", but EF Core does not provide such control and always implements the "client wins" strategy.
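The closest per-entity workaround I know of (not a global strategy; the entity variable here is hypothetical) is to reload a tracked entity explicitly, which overwrites its in-memory values with the current database values:

db.Entry(item).Reload(); // issues a fresh SELECT and discards pending changes for that entity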
Let's say I have an entity called Product, and this entity is loaded every time a user hits the product information page. Usually I'd save the object in Zend_Cache (memcache) for an hour to avoid hitting the db on each request, but as far as I understand that's not possible with Doctrine 2 entities because of the proxy objects.
So my question is, how can I avoid loading the same entity from the database for each request?
[EDIT]
I tried using the Doctrine Cache like this:
$categoryService = App_Service_Container::getService('\App\Service\Category');
$cache = $categoryService->getEm()->getConfiguration()->getResultCacheImpl();
$apple = $cache->fetch('apple');
But I get the following error:
Warning: require(App/Entity/Proxy/_CG_/App/Entity/Category.php) [function.require]: failed to open stream: No such file or directory in /opt/vhosts/app/price/library/Doctrine/Common/ClassLoader.php on line 163
The same happens with Zend_Cache, as you can't serialize the entity because of the proxy class.
You've got several options:
Use Doctrine's built-in result caching (a sketch follows this list).
Try just sticking the entity in memcache via Zend_Cache. When you pull it out, you may need to merge() the Product back into the EM so proxies can be dereferenced. If you fetch-join any associations you need to display the product info, and you're only doing reads, this should work fine.
Don't cache the entity at all. Cache whatever output you generate instead.
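For the first option, a minimal sketch using the query result cache (the entity name, lifetime, and cache id are placeholders):

$query = $em->createQuery('SELECT p FROM App\Entity\Product p WHERE p.id = :id');
$query->setParameter('id', $id);
$query->useResultCache(true, 3600, 'product_' . $id); // cache the hydrated result for an hour
$product = $query->getOneOrNullResult();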
EDIT: If you don't care about the hydration overhead, you're using MySQL, and your Products and associated tables don't change very often, you might prefer to just rely on the MySQL query cache. It's a fairly blunt instrument, but useful enough to mention.
You might want to try implementing __sleep or __wakeup methods for your entity class, as Doctrine 2 has special requirements and limitations concerning serialization/deserialization of entities (which is what happens when storing them in Zend_Cache).
There is this guidance.
General information about limitations, including serialization.
I find this extremely strange, since I just messed around with this myself and didn't have any issues with the proxy object being stored. So I'm guessing your configuration is not set up 100%?
If you find the issue with your configuration, then be very aware of what timdev said: you MUST merge the object back into the EntityManager, or else you will get weird bugs down the line.
A fourth solution available to you is to retrieve the data as an array instead of an object, but then of course you lose all the functionality connected to your model, which might not be exactly what you wanted.
It seems to me more like a configuration error: either proxies have not been generated, or there is something wrong with the proxy directory and namespace.
Depending on your configuration, proxies can be generated either automatically or manually. Have your proxies indeed been generated under App/Entity/Proxy? Is this indeed the right directory?
FYI proxies can be manually generated by executing doctrine orm:generate-proxies <dest-dir>
Seconding what timdev says: Doctrine has built-in caching, you want to use it.
I also wonder from your question if you are experiencing any performance issues or if you are a victim of overly eager optimisation.
I wanted to add a new property to one of my models (tables). Basically, it's a property that doesn't exist in the database, but I need to add it to my model so that the custom generation tool (the self-tracking entity generator) will create the property inside the custom generated file.
I added a scalar property; it's a string called testme. But it gives me the following error. Does anybody know how I can fix this?
Error 2538 Error 11009: Property 'testme' is not mapped.
I am confused as to why I need to map it to a table... it's a field that doesn't exist in the table...
Any help really appreciated
Thanks
Generally, you add un-mapped properties to a partial class instead of via the model. That said, use discretion; un-mapped properties can be confusing, since they mostly can't be used in LINQ to Entities queries.
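A rough illustration of the partial-class approach (this assumes the generated Product class is partial, which the entity generators produce by default):

// In a separate file, in the same namespace as the generated entity:
public partial class Product
{
    // Lives only on the CLR type, not in the EDM, so the designer never
    // complains that it is unmapped; it also cannot be used in
    // LINQ to Entities queries.
    public string testme { get; set; }
}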
I encountered this problem and was able to resolve it by deleting the entity (a view) in the designer and re-adding it by refreshing from the database. This occurred after a major redesign of the database and a rewrite of the view.
I know this doesn't address your problem, but Googling for this error returns this question. Hopefully this answer will be useful to others who are new to EF and hit this message, like I did.
I've been generating my DB from my conceptual model. If I modify the model without updating the DB, then I see this error message.
At the moment I don't have any data in my model, so simply regenerating the DB from the changed model makes these errors go away.
I have a Product object with a property that is a collection of type Workflows. In my "GetProducts" method on the domain context object, I have set a breakpoint at the return statement to see whether the Workflows collection is filled.
It is.
On the client side I check Context.Products[0].Workflows in another breakpoint and I see 0 results. Is there a way to persist this nested data for consumption on the client side, or is RIA Services unable to do this?
If you have, or can download, the RiaServicesOverviewPreview.pdf document, section 4.8 details how to do this. The basic summary:
Make sure your L2S query specifies the .LoadWith<>() option. Lazy loading doesn't work with RIA Services, so you have to use eager loading.
You need to apply the IncludeAttribute to the associated member. For example, add the [Include] attribute to your Workflows field in the Product metadata class (a rough sketch follows below).
Ensure that your Workflow (child) type is exposed as a client type so it gets generated on the client side.
You can get the document here: http://www.microsoft.com/downloads/details.aspx?FamilyID=76bb3a07-3846-4564-b0c3-27972bcaabce&displaylang=en
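A rough sketch of the first two steps (attribute namespaces and the exact collection type vary across RIA Services versions, so treat them as assumptions):

// Step 1, server-side query: eager-load the children, since lazy
// loading does not cross the RIA Services boundary.
var loadOptions = new System.Data.Linq.DataLoadOptions();
loadOptions.LoadWith<Product>(p => p.Workflows);
dataContext.LoadOptions = loadOptions;

// Step 2, metadata class: mark the association for inclusion.
[MetadataType(typeof(Product.ProductMetadata))]
public partial class Product
{
    internal sealed class ProductMetadata
    {
        [Include] // tells RIA Services to serialize Workflows to the client
        public EntitySet<Workflow> Workflows { get; set; }
    }
}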
I should kick myself. I realized that I needed to add [Include] to the property in Product within the DataService.metadata.cs file, and now it gets sent to the client.