I am trying to enable auditing using annotations. My domain class has an @Id field that is populated while constructing the object. I have added a java.util.Date field for lastModified and annotated it with @LastModifiedDate.
@Document
public class Book {

    @Id
    private String name;

    private String isbn;

    @LastModifiedDate
    private Date lastModified;

    public Book(String name) {
        this.name = name;
    }
}
I have enabled auditing in the Spring Configuration XML using <mongo:auditing/>.
When I try to save an instance of my object, I get the following error:
Book book1 = new Book("ABCD");
mongoOps.save(book1);
java.lang.IllegalArgumentException: Unsupported entity com.pankaj.Book! Could not determine IsNewStrategy.
I do not want to use the Auditable interface, nor extend my domain classes from AbstractAuditable; I only want to use the annotations.
Since I am not interested in @CreatedBy and @LastModifiedBy, I am not implementing the AuditorAware interface either.
I just want @LastModifiedDate to work for my domain classes. What am I missing?
I am using version 1.7.0 of Spring Data MongoDB.
You don't mention how you are configuring your MongoDB connection, but if you are using AbstractMongoConfiguration, it will use the package of the actual configuration class to look for @Document-annotated classes at startup.
If your entities are in a different package, you will have to specify that package manually by overriding AbstractMongoConfiguration.getMappingBasePackage(). Placing this in your Mongo configuration class should do the trick (again, this assumes you are extending AbstractMongoConfiguration for your Mongo configuration):
@Override
protected String getMappingBasePackage() {
    return "package.with.my.domain.classes";
}
I had the same issue; later I determined that I was missing the ID field and its annotation:
@Id
private String id;
in the class I was trying to persist, which was annotated with
@Document(collection = "collectionName")
I had the same issue when using annotations-only configuration.
When you put @EnableMongoAuditing on a configuration class, Spring will create a MappingContext bean.
Then you have to make sure the same MappingContext is being used by the MongoTemplate.
@Configuration
@EnableMongoAuditing
@EnableMongoRepositories(value = "my.repositories.package", mongoTemplateRef = "myMongoTemplate")
class MongoConfig {

    // Autowiring the MongoMappingContext supplies the same MappingContext
    // as the one used by the auditing infrastructure
    @Autowired
    MongoMappingContext mongoMappingContext;

    // The Mongo client has to come from somewhere; here it is assumed to be
    // another bean in the context
    @Autowired
    MongoClient mongoClient;

    @Bean
    MongoTemplate myMongoTemplate() {
        String databaseName = "mydbname";
        MongoDbFactory factory = new SimpleMongoDbFactory(mongoClient, databaseName);
        MongoConverter converter = new MappingMongoConverter(factory, mongoMappingContext);
        return new MongoTemplate(factory, converter);
    }
}
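As a rough usage sketch (assuming the configuration above is active and a MongoTemplate built from the myMongoTemplate bean is injected), saving an entity should now populate the @LastModifiedDate field:

Book book = new Book("ABCD");
mongoTemplate.save(book);
// the auditing listener should set lastModified before the document is written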
My project running version 1.6.2 worked normally, except that @LastModifiedDate did not update. After I updated to version 1.7.1, I had the same problem as you.
I tried implementing the org.springframework.data.domain.Auditable interface; saving then seemed to work normally, but the createdBy and createdDate fields still could not be saved to the database.
I had the same issue and fixed it by extending the @Document class with AbstractPersistable. In your case it can be:
public class Book extends AbstractAuditable
Related
I need help getting data from another document. I have the following class:
@Data
@Document(collection = "tmVersion")
public class TmVersion {

    @Id
    private String id;

    private String cVrVersionId;

    @DBRef
    private TaApplicationVersion taApplicationVersion;
}
and
@Data
@Document(collection = "taApplicationVersion")
public class TaApplicationVersion {

    @Id
    private String id;

    private String dVrAppName;
    private String dVrAppCode;
}
This is my repository query, in which I map the fields I want shown, but for taApplicationVersion I need to return the whole object as well. How is that done?
@Query(value = "{}", fields = "{'cVrVersionId': 1, 'taApplicationVersion.dVrAppName': 2, 'dVrVersionNumber': 3}")
Page<TmVersion> getAllVersionWithOutFile(Pageable pageable);
Couple of things to mention here.
If you want this kind of join between tables, then you need to rethink your choice of MongoDB as the database. NoSQL databases thrive on the fact that there is very little coupling between tables (collections), so using @DBRef negates that. MongoDB itself does not recommend using DBRefs.
This cannot be achieved with a method like the one you have in the repository. You need to use projections; see the Spring Data documentation on them.
Create a Projection interface like this. Here you can control which fields you need to include from the main class (TmVersion):
@ProjectedPayload
public interface TmVersionProjection {

    @Value("#{@taApplicationVersionRepository.findById(target.taApplicationVersion.id)}")
    public TaApplicationVersion getTaApplicationVersion();

    public String getId();

    public String getcVrVersionId();
}
Change the TmVersionRepository like this
public interface TmVersionRepository extends MongoRepository<TmVersion, String> {

    @Query(value = "{}")
    Page<TmVersionProjection> getAllVersionWithOutFile(Pageable pageable);
}
Create a new repository for TaApplicationVersion. You can add @Query on top of this method to control which fields from the referenced class need to be returned.
public interface TaApplicationVersionRepository extends MongoRepository<TaApplicationVersion, String> {

    TaApplicationVersion findById(String id);
}
I want to update a MongoDB document containing a lazy @DBRef attribute using Spring Data.
First of all, I load the existing document, I change the attributes I want, and after that I call the @Repository's save method. But when I check the document in MongoDB, the lazy @DBRef attribute is null.
I tried loading the attribute beforehand by calling its getter, but that doesn't fix the problem.
Could someone help me?
Thanks
I have the Collection below:
@Data
@Document(collection = "calendriers")
public class CalendrierEntity {

    @Id
    @AutoGenerate(SequanceKey.CALENDRIER)
    private Long id;

    @NotNull
    private String label;

    @NotNull
    private HorairesEntity horairesEntity;

    @DBRef(lazy = true)
    @CascadeSave
    @Getter(AccessLevel.NONE)
    private List<AbsenceEntity> absenceEntities;
}
and the repository below:
@Repository
public interface AbsenceRepository extends MongoRepository<CalendrierEntity, Long> {

    AbsenceEntity findById(Long enfantId, LocalDate localDate);
}
I have a calendrier document with id 1L and I want to update its label.
The calendrier document already has a list of absences.
This is my code to update the label:
@Transactional
public void updateLabelCalendrier(Long id, String label) {
    CalendrierEntity calendrier = calenderRepository.findById(1L);
    calendrier.setLabel(label);
    calenderRepository.save(calendrier);
}
But when I check the data in MongoDB, I have the new label, but my list of absences has become null.
I'm trying to use JPA in Play Framework for Java version 2.3.7.
Before, in Play 1.x, there was a Model superclass that made it really easy to execute queries like "List<Person> persons = Person.findAll();".
Is there a Model superclass for Java JPA to do this?
There is no play.db.jpa.Model class for Play 2, but you can use play.db.jpa.JPA. To find all, do:
JPA.em().createQuery("select p from Person p").getResultList();
where the query string is JPQL and Person is the entity name.
For more details, check the computer-database-jpa sample.
Also check the Play docs.
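For a fuller picture, here is a rough sketch of how this typically sits inside a Play 2.3 controller action (the class and method names are just examples):

import java.util.List;

import play.db.jpa.JPA;
import play.db.jpa.Transactional;
import play.mvc.Controller;
import play.mvc.Result;

public class Persons extends Controller {

    // @Transactional binds an EntityManager to the request so that JPA.em() is usable
    @Transactional(readOnly = true)
    public static Result list() {
        List<Person> persons =
            JPA.em().createQuery("select p from Person p", Person.class).getResultList();
        return ok(persons.size() + " persons found");
    }
}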
I think there's no play.db.jpa.Model in Play 2.
The closest options are Ebean and Spring Data JPA; I use and recommend the latter, because Ebean is due to be removed in favor of JPA, and JPA is mature and well documented.
As a quick example, those should look like:
Ebean
FindAllUsage
List<Person> people = Person.find.all();
Person model
@Entity
public class Person extends Model {

    @Id
    public Long id;

    public String value;

    public static final Model.Finder<Long, Person> find =
        new Model.Finder<Long, Person>(Long.class, Person.class);
}
SpringJPA
FindAllUsage
List<Person> people = personRepository.findAll();
Person repository
@Named
@Singleton
public interface PersonRepository extends CrudRepository<Person, Long> {
}
Person model
@Entity
public class Person {

    @Id
    public Long id;

    public String value;
}
In Play 2 the Model class is backed by the Ebean ORM by default, and it has general methods such as save, update, find.byId, find.all, etc.
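A rough sketch of what that looks like in practice with the Person model above (the id value is only an example):

Person person = new Person();
person.value = "example";
person.save();                         // insert

person.value = "updated";
person.update();                       // update

Person byId = Person.find.byId(1L);    // find by primary key
List<Person> all = Person.find.all();  // find all rows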
This is my first time using MongoDB and Morphia, and I am pretty new to databases in general. I am wondering how I should organize my code with Morphia. I was looking into using a DAO as described in the Morphia documentation, but the way they seem to be doing it, I would have to create a DAO for each model object that I have. I liked Play's approach of basically giving model objects the ability to save themselves, but I only have vague notions of what is going on under the hood here, so I am not sure how to achieve this with Morphia, or if it is even desirable to do so. The code I have so far looks like this for the skeleton of a User model.
@Entity("user")
public class User extends BasicDAO<User, ObjectId> {

    @Id ObjectId id;

    public String firstName;
    public String lastName;
    public String email;
    @Indexed public String username;
    public String password;

    public User(Mongo mongo, Morphia morphia) {
        super(mongo, morphia, "UserDAO");
    }

    public User() {
        this(DBFactory.getMongo(), DBFactory.getMorphia());
    }

    public void save() {
        ds.save(this);
    }

    public static User findByUsername(String uname) {
        return DBFactory.getDatastore().find(User.class, "username =", uname).get();
    }

    public static boolean authenticate(String uname, String pword) {
        User user = DBFactory.getDatastore().createQuery(User.class)
                .filter("username", uname)
                .filter("password", pword)
                .get();
        return user != null;
    }
}
It is currently throwing a StackOverflowError, and I am not sure why; is this a reasonable pattern to try to accomplish?
Also, the DBFactory basically just exists to maintain the singleton MongoDB connection.
Play 2.0 has a module for working with MongoDB; I think you should give it a try:
https://github.com/vznet/play-mongo-jackson-mapper#readme
I started using Morphia with Play Framework 2.x. In my opinion, it is more sophisticated than the Jackson mapper. I followed this example to install the Morphia plugin: https://github.com/czihong/playMongoDemo
While trying to do some tests on lazy loading, to check if I'm understanding it well, I got totally confused.
Here are the entities I'm using in my test:
@Entity
public class Family {

    @Id
    private int id;

    @OneToMany(mappedBy = "family", fetch = FetchType.LAZY)
    private Set<Person> members;

    // getters & setters

    public String toString() {
        String s = "";
        for (Person p : getMembers()) {
            s += p.getFirstName();
        }
        return s;
    }
}
@Entity
public class Person implements Comparable<Person> {

    @Id
    private int id;

    private String firstName;
    private String lastName;

    @ManyToOne
    private Family family;

    // getters & setters
}
here's my main method:
public static void main(String[] args) {
    factory = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
    em = factory.createEntityManager();

    Query q = em.createQuery("select f from Family f");
    List<Family> families = q.getResultList();

    em.clear();
    em.close();
    factory.close();

    for (Family f : families) {
        System.out.println(f);
    }
}
What I understood from lazy loading is that if an attribute is marked to be fetched lazily and doesn't get accessed while the entity is managed, it won't be loaded in memory, and any attempt to access it later won't work. Now what confuses me is that the test described above has no problem accessing the lazy members attribute through the detached Family list, even after closing the EntityManager and the EntityManagerFactory! Is that normal? Am I misunderstanding the lazy loading concept?
Note: I'm using a J2SE environment with an embedded DB. My provider is EclipseLink.
Thanks in Advance
George
Check that your toString method is not triggered before the factory is closed, such as if the entity is being logged. I would not recommend triggering relationships in a toString method, as this is error prone and can be triggered unexpectedly. Turning on EclipseLink logging will help show you where the relationship gets accessed in the factory's lifecycle, assuming it is not part of the problem.
Ensure that you are using the EclipseLink agent, or using static weaving. If you are using neither, then LAZY will not be woven, and you will effectively have EAGER.
EclipseLink also supports access to LAZY relationships after the EntityManager is closed, although not after the factory is closed. However, if the object was in the cache, it may work even after the factory is closed. Also, if you have another factory open on the same persistence unit, then the persistence unit is still open.
It might be because the JPA provider is not required to use lazy initialization; FetchType.LAZY is not a hard requirement for a JPA provider, but a hint.
A provider is required to eagerly fetch data when FetchType.EAGER is specified, but it is not required to lazily fetch data when FetchType.LAZY is specified.
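To make that concrete with the entities above, this is only an illustration of the specification's wording, not of any particular provider's behaviour:

// LAZY is a hint: the provider may defer loading members, but is allowed to
// fetch them eagerly (for example when no weaving/enhancement is in place)
@OneToMany(mappedBy = "family", fetch = FetchType.LAZY)
private Set<Person> members;

// EAGER is a requirement: the provider must load the family together with the Person
@ManyToOne(fetch = FetchType.EAGER)
private Family family;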