I have a fairly simple relationship: a Person entity with a @OneToOne mapping to a PersonAttributes entity.
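Roughly, the mapping looks like this (a simplified sketch only; the annotations and cascade settings shown here are assumptions, not the exact entity code):

@Entity
public class Person {

    @Id
    private BigDecimal id;

    private String formerSsn;

    // The cascade setting here is an assumption; the actual mapping may differ.
    @OneToOne(cascade = CascadeType.ALL)
    private PersonAttributes personData;

    // getters and setters for id, formerSsn and personData omitted
}

@Entity
public class PersonAttributes {

    @Id
    private BigDecimal id;

    private String lastName;

    // getters and setters omitted
}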
I have a PersonRepository as such:
@Transactional(propagation = Propagation.REQUIRED)
public interface PersonRepository extends JpaRepository<Person, BigDecimal>, JpaSpecificationExecutor<Person> {
}
If I write a test against the repository where I update both the Person and the related PersonAttributes entity, everything gets saved fine.
If I write the same test PUTting the updates to "/api/persons/{theId}", only the Person entity gets updated; the PersonAttributes entity does not.
I've done the same thing through the actual client application (with AngularJS) and still only the Person gets updated.
This test works:
@Test
public void testUpdatePersonData() throws Exception {
Person person = personRepository.findOne(new BigDecimal(411));
assertNotNull( person );
assertFalse(person.getPersonData().getLastName().equalsIgnoreCase("ZZZ"));
person.setFormerSsn("dummySSN");
person.getPersonData().setLastName("ZZZ");
personRepository.saveAndFlush(person);
Person updatedPerson = personRepository.findOne(new BigDecimal(411));
assertEquals( "dummySSN", updatedPerson.getFormerSsn() );
assertEquals( "ZZZ", updatedPerson.getPersonData().getLastName() );
}
This test does not work:
@Test
public void testUpdatePersonData() throws Exception {
ObjectMapper mapper = new ObjectMapper();
MvcResult result = mockMvc.perform(get("/api/persons/411")).andReturn();
MockHttpServletResponse response = result.getResponse();
byte [] content = response.getContentAsByteArray();
// Convert from JSON to object
Person person = mapper.readValue(content, Person.class);
assertNotNull( person );
assertFalse(person.getPersonData().getLastName().equalsIgnoreCase("ZZZ"));
person.setFormerSsn("dummySSN");
person.getPersonData().setLastName("ZZZ");
String json = mapper.writeValueAsString(person);
result = mockMvc.perform(put( "/api/persons/411")
.content(json).contentType(MediaType.APPLICATION_JSON)).andReturn();
assertNotNull(result);
assertEquals( 204, result.getResponse().getStatus() );
personRepository.flush();
result = mockMvc.perform(get("/api/persons/411")).andReturn();
response = result.getResponse();
content = response.getContentAsByteArray();
Person updatedPerson = mapper.readValue(content, Person.class);
assertNotNull( updatedPerson );
assertEquals( "812818181", updatedPerson.getFormerSsn() );
// This seems to be a bug
// Refer to: http://stackoverflow.com/questions/24699480/spring-data-rest-onetoone-not-saving-the-relationship-updated-entity
// assertEquals( "ZZZ", updatedPerson.getPersonData().getLastName() );
}
Thoughts?
Thanks,
Cory.
I am trying to do a multiple insert using the Camel MongoDB component.
My POJO representation is:
Person {
String firstName;
String lastName;
}
I have a processor which constructs a valid List of Person POJOs (a valid JSON structure).
When this list of Person objects is sent to the MongoDB producer, the type conversion to BasicDBObject fails on invocation of createDoInsert. The piece of code below looks to be the problem. Should it have more fallbacks/checks in place to attempt the list conversion further down, since it fails on the very first cast itself? Debugging the MongoDbProducer, the exchange body being received is a DBList, which extends DBObject. This causes the singleInsert flag to remain set to true, which makes the insertion below fail because we get a DBList instead of a BasicDBObject:
if(singleInsert) {
BasicDBObject insertObjects = (BasicDBObject)insert;
dbCol.insertOne(insertObjects);
exchange1.getIn().setHeader("CamelMongoOid", insertObjects.get("_id"));
}
The Camel MongoDbProducer code fragment:
private Function<Exchange, Object> createDoInsert() {
return (exchange1) -> {
MongoCollection dbCol = this.calculateCollection(exchange1);
boolean singleInsert = true;
Object insert = exchange1.getIn().getBody(DBObject.class);
if(insert == null) {
insert = exchange1.getIn().getBody(List.class);
if(insert == null) {
throw new CamelMongoDbException("MongoDB operation = insert, Body is not conversible to type DBObject nor List<DBObject>");
}
singleInsert = false;
insert = this.attemptConvertToList((List)insert, exchange1);
}
if(singleInsert) {
BasicDBObject insertObjects = (BasicDBObject)insert;
dbCol.insertOne(insertObjects);
exchange1.getIn().setHeader("CamelMongoOid", insertObjects.get("_id"));
} else {
List insertObjects1 = (List)insert;
dbCol.insertMany(insertObjects1);
ArrayList objectIdentification = new ArrayList(insertObjects1.size());
objectIdentification.addAll((Collection)insertObjects1.stream().map((insertObject) -> {
return insertObject.get("_id");
}).collect(Collectors.toList()));
exchange1.getIn().setHeader("CamelMongoOid", objectIdentification);
}
return insert;
};
}
My route is as below:
<route id="uploadFile">
<from uri="jetty://http://0.0.0.0:9886/test"/>
<process ref="fileProcessor"/>
<unmarshal>
<csv>
<header>fname</header>
<header>lname</header>
</csv>
</unmarshal>
<process ref="mongodbProcessor" />
<to uri="mongodb:mongoBean?database=axs175&collection=insurance&operation=insert" />
And the MongodbProcessor constructing the List of Person POJOs:
@Component
public class MongodbProcessor implements Processor {
@Override
public void process(Exchange exchange) throws Exception {
ArrayList<List<String>> personlist = (ArrayList) exchange.getIn().getBody();
ArrayList<Person> persons = new ArrayList<>();
for(List<String> records : personlist){
Person person = new Person();
person.setFname(records.get(0));
person.setLname(records.get(1));
persons.add(person);
}
exchange.getIn().setBody(persons);
}
}
I have also requested information here: http://camel.465427.n5.nabble.com/Problems-with-MongoDbProducer-multiple-inserts-tc5792644.html
This issue is now fixed via https://issues.apache.org/jira/browse/CAMEL-10728
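For Camel versions that do not include that fix, one possible workaround (a sketch only, assuming the Person class exposes getFname()/getLname() matching the setters above and that a MongoClient can be injected) is to perform the multi-insert directly with the MongoDB driver in a processor, bypassing the camel-mongodb producer for this case:

import java.util.ArrayList;
import java.util.List;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.bson.Document;

import com.mongodb.MongoClient;

public class DirectMongoInsertProcessor implements Processor {

    private final MongoClient mongoClient; // hypothetical injected client

    public DirectMongoInsertProcessor(MongoClient mongoClient) {
        this.mongoClient = mongoClient;
    }

    @Override
    public void process(Exchange exchange) throws Exception {
        @SuppressWarnings("unchecked")
        List<Person> persons = exchange.getIn().getBody(List.class);

        // Map each POJO to a BSON Document; field names mirror the CSV headers/setters above.
        List<Document> documents = new ArrayList<>();
        for (Person person : persons) {
            documents.add(new Document("fname", person.getFname())
                    .append("lname", person.getLname()));
        }

        // insertMany sidesteps the producer's single-insert cast to BasicDBObject.
        mongoClient.getDatabase("axs175")
                .getCollection("insurance")
                .insertMany(documents);

        exchange.getIn().setBody(documents);
    }
}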
Usually a typical EF6 add looks like this:
var newStudent = new Student();
newStudent.StudentName = "Bill";
context.Students.Add(newStudent);
context.SaveChanges();
In my case I'm using this logic:
var student = context.Students.Find(id);
if (student == null)
{
student = new Student();
context.Students.Add(student);
}
student.blabla1 = "...";
student.blabla2 = "...";
//other 20 blabla...
student.StudentName = "Bill"; // StudentName is a required field
context.SaveChanges();
Is it bad practice to edit the data model after calling Add in Entity Framework 6? With an injected context, could an error be thrown if SaveChanges is called from another method while my current thread is still just before the assignment of "StudentName"?
Why can't you do it like this?
var student = context.Students.Find(id);
if (student == null)
{
student = new Student();
ModifyBlabla(student); // call private method
context.Students.Add(student);
}
else
{
ModifyBlabla(student); // call private method
}
context.SaveChanges();
The method for editing:
private void ModifyBlabla(Student student)
{
student.blabla1 = "...";
student.blabla2 = "...";
//other 20 blabla...
student.StudentName = "Bill";
}
I've managed to mock an Entity Framework DbContext and DbSet to allow unit testing of query functions against a repository component.
I've been unable to perform a successful test against a method that uses Entity Framework's AddOrUpdate() method. The error received is:
"Unable to call public, instance method AddOrUpdate on derived IDbSet type 'Castle.Proxies.DbSet`1Proxy'. Method not found."
Is it at all possible to test this?
private IRepository _Sut;
private Mock<DbSet<JobListing>> _DbSet;
private Mock<RecruitmentDb> _DbContext;
[SetUp]
public void Setup()
{
_DbContext = new Mock<RecruitmentDb>();
var JobsData = GenerateJobs().AsQueryable();
_DbSet = new Mock<DbSet<JobListing>>();
_DbSet.As<IQueryable<JobListing>>().Setup(x => x.Provider).Returns(JobsData.Provider);
_DbSet.As<IQueryable<JobListing>>().Setup(x => x.Expression).Returns(JobsData.Expression);
_DbSet.As<IQueryable<JobListing>>().Setup(x => x.ElementType).Returns(JobsData.ElementType);
_DbSet.As<IQueryable<JobListing>>().Setup(x => x.GetEnumerator()).Returns(JobsData.GetEnumerator());
_DbContext.Setup(x => x.JobListings).Returns(_DbSet.Object);
_Sut = new JobListingRepository(_DbContext.Object);
}
[Test]
public void Update_ChangedTitleProperty_UpdatedDetails()
{
var Actual = GenerateJobs().First();
var OriginalJob = Actual;
Actual.Title = "Newly Changed Title";
_Sut.Update(Actual);
Actual.Title.Should().NotBe(OriginalJob.Title);
Actual.Id.Should().Be(OriginalJob.Id);
}
private List<JobListing> GenerateJobs()
{
return new List<JobListing>
{
new JobListing{ Id = 1,
Title = "Software Developer",
ShortDescription = "This is the short description",
FullDescription = "This is the long description",
Applicants = new List<Applicant>(),
ClosingDate = DateTime.Now.AddMonths(5).Date},
new JobListing{
Id = 2,
Title = "Head Chef",
ShortDescription = "This is the short description",
FullDescription = "This is the long description",
Applicants = new List<Applicant>(),
ClosingDate = DateTime.Now.AddMonths(2).Date
},
new JobListing
{
Id = 3,
Title = "Chief Minister",
ShortDescription = "This is the short description",
FullDescription = "This is the long description",
Applicants = new List<Applicant>(),
ClosingDate = DateTime.Now.AddMonths(2).Date
}
};
}
The issue is that AddOrUpdate is an extension method, which cannot be mocked directly. I overcame this by designing a wrapper; you can mock/stub the IAddOrUpdateHelper interface instead.
public class AddOrUpdateHelper : IAddOrUpdateHelper
{
public void AddOrUpdateEntity<TEntity>(DataContext db, params TEntity[] entities) where TEntity : class
{
db.Set<TEntity>().AddOrUpdate(entities);
}
}
public interface IAddOrUpdateHelper
{
void AddOrUpdateEntity<TEntity>(DataContext db, params TEntity[] entities) where TEntity : class;
}
I am using EclipseLink for my project.
I extend XMLMetadataSource (to provide a custom class loader) because the entities I persist are created at runtime, and that works OK.
I am getting "unknown entity type" when I do the following:
1. Create the entity.
2. Create the mapping.
3. Create the entity manager factory, providing the custom class loader.
4. Create the entity manager and persist -- IT WORKS OK.
5. Now drop the entity and drop it from the class loader.
6. Create the same entity again.
7. Create the mapping again (of course it looks the same).
8. Try to refresh the entity manager factory with new properties (new class loader, mapping file).
9. Try to persist -- it complains "unknown entity type".
Any idea if EclipseLink caches XML mappings?
I tried re-creating the factory, but I get the same error.
I tried MySQL and Derby, with 'drop-and-create-tables' and 'create-or-extend-tables'. Same result.
I filed a bug with EclipseLink:
https://bugs.eclipse.org/bugs/show_bug.cgi?id=426310
It's not a bug per se in EclipseLink. The problem is that EL does not rebuild or re-create the 'class --> class descriptor' map (an internal map that holds the Class object of each entity and the entity's descriptor). I found this accidentally. For those interested, here is some sample code that might help.
public class Test1 {
// Fields inferred from their usage in the methods below (declarations were omitted in the original snippet):
private String pu_name;
private Map<String, Object> properties;
private MyClassLoader loader;
private WAMetadataSource mms;
private static EntityManagerFactory emf;
private EntityManager em;
public Test1(String pu, Map<String, Object> props ) {
pu_name = pu;
properties = new HashMap<String, Object> ();
properties.putAll(props);
loader = new MyClassLoader();
}
public void initialization( ) {
mms = new WAMetadataSource();
properties.put(PersistenceUnitProperties.METADATA_SOURCE, mms);
properties.put(PersistenceUnitProperties.CLASSLOADER,loader);
if(emf == null || !emf.isOpen()) {
synchronized(Test1.class) {
if (emf == null || !emf.isOpen()) {
emf = Persistence.createEntityManagerFactory(pu_name, properties);
}
}
} else {
JpaHelper.getEntityManagerFactory(emf).refreshMetadata(properties);
}
System.out.println("======> refreshed. emf.hascode : " + emf.hashCode() + ", loader.h : " + loader.hashCode());
}
public EntityManager getEntityManager(Map<String, Object> props) {
if (em == null) {
em = emf.createEntityManager(props);
}
return em;
}
public void persist(Object obj) {
try {
getEntityManager(properties);
System.out.println("===> em.hascode =" + em.hashCode() +", " + JpaHelper.getEntityManager(em).getProperties().get(PersistenceUnitProperties.CLASSLOADER).hashCode() );
em.clear();
em.getTransaction().begin();
em.persist(obj);
em.getTransaction().commit();
} finally {
}
}
public Object getRuntimeEntityObject(int ii) throws Exception {
Object obj=null;
Class clazz = loader.loadClass("com.xxx.sample.entity.runtime.User");
if(ii == 1){
obj = clazz.getConstructor(String.class).newInstance("Jai Ramjiki-1");
} else {
obj = clazz.getConstructor(String.class).newInstance("Jai Ramjiki-2");
}
obj = clazz.cast(obj);
return obj;
}
public static void main(String[] args) throws Exception {
Map<String, Object> props = new HashMap<String, Object>();
props.put(PersistenceUnitProperties.JDBC_DRIVER, "com.mysql.jdbc.Driver");
props.put(PersistenceUnitProperties.JDBC_URL, "jdbc:mysql://localhost:3306/test" );
props.put(PersistenceUnitProperties.JDBC_USER, "root");
props.put(PersistenceUnitProperties.JDBC_PASSWORD, "root");
props.put(PersistenceUnitProperties.DDL_GENERATION, "create-or-extend-tables");
Test1 t1 = new Test1("mysql", props);
Object obj1 = t1.getRuntimeEntityObject(1);
System.out.println(" ****> obj1 = " + obj1 + ", classloader hashcode : " + obj1.getClass().getClassLoader().hashCode() );
t1.initialization();
t1.persist(obj1);
System.out.println("Class 1 : " + obj1.getClass().hashCode() + ", obj1 : " + obj1);
t1.close();
// now drop the previous class loader and rerun same.
Test1 t2 = new Test1("mysql", props);
Object obj2 = t2.getRuntimeEntityObject(2);
System.out.println(" ****> obj2 = " + obj2 + ", classloader hashcode : " + obj2.getClass().getClassLoader().hashCode() );
t2.initialization();
t2.persist(obj2);
t2.close();
Object obj3 = t1.getRuntimeEntityObject(1);
System.out.println(" ****> obj3 = " + obj3 + ", classloader hashcode : " + obj3.getClass().getClassLoader().hashCode() );
t1.persist(obj3);
}
}
And extend XMLMetadataSource:
@Override
public XMLEntityMappings getEntityMappings(Map<String, Object> properties, ClassLoader classLoader, SessionLog log) {
properties.put(PersistenceUnitProperties.METADATA_SOURCE_XML_FILE, "eclipselink-orm-user.xml");
properties.put(PersistenceUnitProperties.VALIDATOR_FACTORY, null);
return super.getEntityMappings(properties, classLoader, log);
}
And create a runtime class using Javassist in your custom class loader, which extends ClassLoader:
public void createRuntimeClass(String className) throws Exception {
CtClass bclass = pool.makeClass(className);
bclass.addConstructor(CtNewConstructor.defaultConstructor(bclass));
Map<String, String> fields = new HashMap<String, String>();
addFields(fields);
int noOfFields = fields.size();
CtClass[] fclasses = new CtClass[noOfFields];
int ii=0;
for (Entry<String, String> field : fields.entrySet()) {
String fieldName = field.getKey();
String fieldType = field.getValue();
//.. code to add field
bclass.addField(bfield);
//add getter method.
// add getter and setters
}
CtConstructor userConstructor = CtNewConstructor.make(constructorSource, bclass);
bclass.addConstructor(userConstructor);
byte[] bytes = bclass.toBytecode();
Class cls = bclass.toClass(this, null);
loadedClasses.put(className, cls);
loadClassBytes.put(className, bytes);
}
And override the loadClass and getResourceAsStream methods:
public Class<?> loadClass(String name) throws ClassNotFoundException { return loadedClasses.get(name); }
public InputStream getResourceAsStream(String name) { return new ByteArrayInputStream(loadClassBytes.get(name)); }
Hope this helps.
EclipseLink does provide ways to clear the cache of the current project and to set the descriptor maps, but none of them worked for me. I'm not sure whether that is intended behavior or whether that API was exposed by mistake.
Gopi
Yes, persistence unit loading is only done once. If you are using an XMLMetadataSource to change mappings, you must tell the factory to refresh its mappings using refreshMetadata() on the EMF as described here:
http://wiki.eclipse.org/EclipseLink/DesignDocs/340192#Refresh
After that, the next EntityManagers obtained will be using the new mappings, while existing EMs will still use the old mappings.
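For example (a minimal sketch; WAMetadataSource and the property map come from the question above, the remaining names are placeholders):

import java.util.HashMap;
import java.util.Map;

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;

import org.eclipse.persistence.config.PersistenceUnitProperties;
import org.eclipse.persistence.jpa.JpaHelper;

public class MetadataRefreshExample {

    // Sketch: refresh an existing factory's mappings instead of recreating the factory.
    public static EntityManager refreshAndCreateEntityManager(EntityManagerFactory emf,
                                                              Map<String, Object> baseProperties,
                                                              ClassLoader newClassLoader) {
        Map<String, Object> refreshProps = new HashMap<>(baseProperties);
        // WAMetadataSource is the custom XMLMetadataSource subclass from the question.
        refreshProps.put(PersistenceUnitProperties.METADATA_SOURCE, new WAMetadataSource());
        refreshProps.put(PersistenceUnitProperties.CLASSLOADER, newClassLoader);

        // Tell EclipseLink to rebuild its metadata from the new metadata source / class loader.
        JpaHelper.getEntityManagerFactory(emf).refreshMetadata(refreshProps);

        // EntityManagers created after the refresh use the new mappings;
        // EntityManagers created before it keep the old ones.
        return emf.createEntityManager();
    }
}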
I need to populate my POJO class based on the request parameter 'type', so I have code like this:
@ModelAttribute
public void getModelObject(HttpServletRequest request, ModelMap modelMap) {
String typeCombo = request.getParameter("type");
System.out.println("typeCombo: " + typeCombo);
if (typeCombo != null) {
if (condition) {
modelMap.addAttribute("modelObj", new ClassB()); //ClassB extends ClassA
} else if (another condition) {
modelMap.addAttribute("modelObj", new ClassC()); //ClassC extends ClassA
} else {
System.out.println("no type found");
}
} else {
System.out.println("typecombo null");
}
}
I use the above method to create the correct subclass, which is then used for add/update. This works fine for POST (creating a record), but for PUT, request.getParameter("type") always returns null, so for editing I'm not able to get the correct subclass.
Below are my POST and PUT request mappings:
#RequestMapping(value = "", method = RequestMethod.POST, headers = "Accept=*/*")
#ResponseBody
public String addCredentials(#ModelAttribute("modelObj") Credential credential,
ModelMap modelMap) {
//code
}
#RequestMapping(value = "/edit/{id}", method = RequestMethod.PUT, headers = "Accept=*/*")
#ResponseBody
public Credential editCredential(#ModelAttribute ("modelObj") Credential credential, #PathVariable long id, ModelMap model) {
//code
}
Any help is much appreciated.
Register the filter HttpPutFormContentFilter like this:
<beans:bean id="httpPutFormContentFilter"
class="org.springframework.web.filter.HttpPutFormContentFilter" />