XStream attribute and implicit collection ambiguity - deserialization

Can you please help me resolve the following problem? I have an XML element with an attribute named "value" that can hold a single value (E.g. 1 below), but in another case the same element can instead contain child elements named "value" with multiple values (E.g. 2 below).
E.g. 1
<variable id="123" value="Adam"/>
E.g. 2
<variable id="123">
  <value>Adam</value>
  <value>Philip</value>
  ...
</variable>
Can you please tell me how I would map this with the XStream serializer in the same Java class?
If I configure it as below, I get a duplicate error:
@XStreamAlias("variable")
public class Variable {
    @XStreamAsAttribute
    private String id;
    @XStreamAsAttribute
    private String value;
    // a second member also bound to "value" - this is what triggers the duplicate error
    @XStreamImplicit
    private List<String> value = new ArrayList<String>();
}
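One possible way out of the ambiguity (not from the thread, just a sketch) is to keep a single List<String> field in Variable and register a custom XStream Converter that accepts either form; the getId/setId/getValues/setValues accessors used below are assumed to exist on your class:
import java.util.ArrayList;
import java.util.List;

import com.thoughtworks.xstream.converters.Converter;
import com.thoughtworks.xstream.converters.MarshallingContext;
import com.thoughtworks.xstream.converters.UnmarshallingContext;
import com.thoughtworks.xstream.io.HierarchicalStreamReader;
import com.thoughtworks.xstream.io.HierarchicalStreamWriter;

// Sketch of a converter that handles both the value="..." attribute form and
// the repeated <value> child form for the same Variable class.
public class VariableConverter implements Converter {

    public boolean canConvert(Class type) {
        return Variable.class.equals(type);
    }

    public void marshal(Object source, HierarchicalStreamWriter writer,
                        MarshallingContext context) {
        Variable variable = (Variable) source;
        writer.addAttribute("id", variable.getId());
        if (variable.getValues().size() == 1) {
            // single value: write it as the "value" attribute (E.g. 1)
            writer.addAttribute("value", variable.getValues().get(0));
        } else {
            // multiple values: write one <value> child per entry (E.g. 2)
            for (String v : variable.getValues()) {
                writer.startNode("value");
                writer.setValue(v);
                writer.endNode();
            }
        }
    }

    public Object unmarshal(HierarchicalStreamReader reader,
                            UnmarshallingContext context) {
        Variable variable = new Variable();
        variable.setId(reader.getAttribute("id"));
        List<String> values = new ArrayList<String>();
        String attribute = reader.getAttribute("value");
        if (attribute != null) {
            values.add(attribute);              // E.g. 1: value attribute
        }
        while (reader.hasMoreChildren()) {      // E.g. 2: <value> children
            reader.moveDown();
            if ("value".equals(reader.getNodeName())) {
                values.add(reader.getValue());
            }
            reader.moveUp();
        }
        variable.setValues(values);
        return variable;
    }
}
The converter would then be registered with something like xstream.alias("variable", Variable.class); xstream.registerConverter(new VariableConverter());.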

Related

How to set an attribute of type Collection<> to null in a resultMap?

I have a result map that looks like this:
<resultMap id="myMap" type="myEntity">
  <id property="id" column="ID" />
  <result property="name" column="NAME" />
  <collection property="places" ofType="MyPlace">
    <result property="placeName" column="PLACE_NAME" />
  </collection>
</resultMap>

<select id="mySelectStatement" parameterType="MyQuery" resultMap="myMap">
  ....
</select>
In the incoming parameter (MyQuery) of the select statement I have a flag that indicates whether the places should be joined and resolved or left out. Using the <if test="myFlag" /> construct this all works well.
Now, the only problem that I have is the following: When the flag indicates that the places should be resolved but there are no places connected with the entity then the resulting collection is empty (so far so good). However, when the flag indicates that the places should not be resolved, the resulting collection is also empty.
It is no longer possible to tell whether the field "places" is empty because there are simply no places or because they weren't resolved at all. What I would like is some mechanism that sets the field "places" to null instead of returning an empty collection when the flag that decides whether the places should be resolved is set to false.
EDIT:
Some more code to better understand the example
// MyEntity.java
public class MyEntity {
    private int id;
    private String name;
    private List<MyPlace> places;
    // getters & setters
}
// MyQuery.java
public class MyQuery {
    private boolean myFlag;
    // getter & setter
}
// MyComponent.java
public class MyComponent {
    private MyMapper myMapper;

    public void findByQuery(MyQuery myQuery) {
        List<MyEntity> myEntities = myMapper.mySelectStatement(myQuery);
        MyEntity firstEntity = myEntities.get(0);
        List<MyPlace> places = firstEntity.getPlaces();
        if (places.isEmpty()) {
            System.out.println("Hm I wonder why they are empty");
        }
    }
}
// MyMapper.java
public interface MyMapper {
    List<MyEntity> mySelectStatement(MyQuery myQuery);
}
// MyMapper.xml
// result map from above
<select id="mySelectStatement" parameterType="MyQuery" resultMap="myMap">
  SELECT * FROM MY_ENTITY
  <if test="myFlag">
    LEFT OUTER JOIN PLACES ON .....
  </if>
</select>
And some clarification: this all works in principle. The only problem is that I cannot distinguish between a collection "places" that is empty because there were no entries in the table and one that is empty because the places were not supposed to be resolved in the first place.
My current solution is to check in MyComponent after the method call whether the query that was passed in has the flag set to false. If that is the case, the "places" variable is manually set to null.
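That workaround might look roughly like this inside MyComponent (a sketch only; it assumes MyQuery exposes isMyFlag() and MyEntity has a setPlaces() setter):
// Sketch of the workaround described above: when the query did not ask for the
// places to be resolved, overwrite the (meaningless) empty collection with null.
public List<MyEntity> findByQuery(MyQuery myQuery) {
    List<MyEntity> myEntities = myMapper.mySelectStatement(myQuery);
    if (!myQuery.isMyFlag()) {
        for (MyEntity entity : myEntities) {
            entity.setPlaces(null);   // signal "not resolved" instead of "no places"
        }
    }
    return myEntities;
}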

org.apache.solr.common.SolrException: TransactionLog doesn't know how to serialize class org.bson.types.ObjectId; try implementing ObjectResolver?

When performing a data import from mongodb, Solr throws the following error:
org.apache.solr.common.SolrException: TransactionLog doesn't know how to serialize class org.bson.types.ObjectId; try implementing ObjectResolver?
at org.apache.solr.update.TransactionLog$1.resolve(TransactionLog.java:100)
at org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:234)
at org.apache.solr.common.util.JavaBinCodec.writeSolrInputDocument(JavaBinCodec.java:589)
at org.apache.solr.update.TransactionLog.write(TransactionLog.java:395)
at org.apache.solr.update.UpdateLog.add(UpdateLog.java:532)
at org.apache.solr.update.UpdateLog.add(UpdateLog.java:516)
at org.apache.solr.update.DirectUpdateHandler2.doNormalUpdate(DirectUpdateHandler2.java:320)
at org.apache.solr.update.DirectUpdateHandler2.addDoc0(DirectUpdateHandler2.java:239)
at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:194)
at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:67)
at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:979)
at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1192)
at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:748)
at org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
at org.apache.solr.handler.dataimport.SolrWriter.upload(SolrWriter.java:80)
at org.apache.solr.handler.dataimport.DataImportHandler$1.upload(DataImportHandler.java:254)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:526)
at org.apache.solr.handler.dataimport.DocBuilder.buildDocument(DocBuilder.java:414)
at org.apache.solr.handler.dataimport.DocBuilder.doFullDump(DocBuilder.java:329)
at org.apache.solr.handler.dataimport.DocBuilder.execute(DocBuilder.java:232)
at org.apache.solr.handler.dataimport.DataImporter.doFullImport(DataImporter.java:415)
at org.apache.solr.handler.dataimport.DataImporter.runCmd(DataImporter.java:474)
at org.apache.solr.handler.dataimport.DataImporter.lambda$runAsync$0(DataImporter.java:457)
at java.lang.Thread.run(Thread.java:748)
My Solr version is 6.6.0. What could be the reason for the error and how can it be resolved?
I came across this issue while trying to import data from multiple collections in mongoDB.
Assuming you are not using mongo-connector, here is what I used to import the data:
Solr-6.6.0
solr-dataimporthandler-6.6.0
mongo-java-driver-3.5.0
Solr Mongo importer
Since the returned '_id' is of type ObjectId, my workaround was to convert the '_id' to a String before indexing it into Solr, and when querying by '_id', to convert it back to ObjectId before running the query.
Download the solr mongo importer and make the following changes.
MongoMapperTransformer.java
public class MongoMapperTransformer extends Transformer {

    @Override
    public Object transformRow(Map<String, Object> row, Context context) {
        for (Map<String, String> map : context.getAllEntityFields()) {
            String mongoFieldName = map.get(MONGO_FIELD);
            String mongoId = map.get(MONGO_ID);
            if (mongoFieldName == null)
                continue;
            String columnFieldName = map.get(DataImporter.COLUMN);
            // If the field is an ObjectId, convert it into a String
            if (mongoId != null && Boolean.parseBoolean(mongoId)) {
                Object srcId = row.get(columnFieldName);
                row.put(columnFieldName, srcId.toString());
            } else {
                row.put(columnFieldName, row.get(mongoFieldName));
            }
        }
        return row;
    }

    public static final String MONGO_FIELD = "mongoField";
    // To identify the _id field
    public static final String MONGO_ID = "objectIdToString";
}
Next, replace the function
public Iterator <Map<String, Object>> getData(String query){...}
in MongoDataSource.java with the following:
@Override
public Iterator<Map<String, Object>> getData(String query) {
    DBObject queryObject = new BasicDBObject();
    /* If querying by _id, since the id is a String now,
     * it has to be converted back to type ObjectId using the
     * constructor
     */
    if (query.contains("_id")) {
        @SuppressWarnings("unchecked")
        Map<String, String> queryWithId = (Map<String, String>) JSON.parse(query);
        String id = queryWithId.get("_id");
        queryObject = new BasicDBObject("_id", new ObjectId(id));
    } else {
        queryObject = (DBObject) JSON.parse(query);
    }
    LOG.debug("Executing MongoQuery: " + query);
    long start = System.currentTimeMillis();
    mongoCursor = this.mongoCollection.find(queryObject);
    LOG.trace("Time taken for mongo :"
            + (System.currentTimeMillis() - start));
    ResultSetIterator resultSet = new ResultSetIterator(mongoCursor);
    return resultSet.getIterator();
}
After these changes you can build the jar using ant.
Copy the jars (the solr mongo importer and the mongo-java-driver) into the lib directory; I copied them into ${solr-install-dir}/contrib/dataimporthandler/lib.
Add the lib directives for the above jars in solrconfig.xml:
<lib dir="${solr.install.dir:../../../..}/contrib/dataimporthandler/lib" regex=".*\.jar" />
Finally, here's an example of the mongo collections and data-config.xml
User collection
{
"_id" : ObjectId("56e9c892e4b0355017b2fa0f"),
"name" : "User1",
"phone" : "123456789"
}
Address collection
{
"_id" : ObjectId("56e9c892e4b0355017b2fa0f"),
"address" : "#666, Maiden street"
}
data-config.xml
Do not forget to mention objectIdToString="true" for the _id field so that the MongoMapperTransformer can stringify the id.
<dataConfig>
  <dataSource name="MyMongo"
              type="MongoDataSource"
              database="test" />
  <document name="UserDetails">
    <!-- if query="" then it imports everything -->
    <entity name="users"
            processor="MongoEntityProcessor"
            query=""
            collection="user"
            datasource="MyMongo"
            transformer="MongoMapperTransformer">
      <field column="_id" name="id" mongoField="_id" objectIdToString="true" />
      <field column="phone" name="phone" mongoField="phone" />
      <entity name="address"
              processor="MongoEntityProcessor"
              query="{_id:'${users._id}'}"
              collection="address"
              datasource="MyMongo"
              transformer="MongoMapperTransformer">
        <field column="address" name="address" mongoField="address" />
      </entity>
    </entity>
  </document>
</dataConfig>
The managed-schema will have the id field as string.
Also, if you have nested objects in mongodb you will have to use script transformers to index them in solr.
Hope this helps,
Good luck !
According to the error message, you need to implement JavaBinCodec.ObjectResolver for the org.bson.types.ObjectId type, so that Solr knows how to serialize instances of this class.
From the JavaBinCodec.ObjectResolver documentation:
public static interface JavaBinCodec.ObjectResolver
Allows extension of JavaBinCodec to support serialization of arbitrary data types. Implementors of this interface write a method to serialize a given object using an existing JavaBinCodec.
Once you write your JavaBinCodec.ObjectResolver implementation, you should register it with the JavaBinCodec.
From the JavaBinCodec documentation:
public class JavaBinCodec extends Object
Defines a space-efficient serialization/deserialization format for transferring data. JavaBinCodec has built-in support for many commonly used types. This includes primitive types (boolean, byte, short, double, int, long, float), common Java containers/utilities (Date, Map, Collection, Iterator, String, Object[], byte[]), and frequently used Solr types (NamedList, SolrDocument, SolrDocumentList). Each of the above types has a pair of associated methods which read and write that type to a stream.
Classes that aren't supported natively can still be serialized/deserialized by providing a JavaBinCodec.ObjectResolver object that knows how to work with the unsupported class. This allows JavaBinCodec to be used to marshall/unmarshall arbitrary content.
NOTE -- JavaBinCodec instances cannot be reused for more than one marshall or unmarshall operation.
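A minimal sketch of such a resolver (illustrative only, not code from the thread; it assumes that writing the ObjectId as its hex String is acceptable for your schema):
import org.apache.solr.common.util.JavaBinCodec;
import org.bson.types.ObjectId;

// Sketch: resolve org.bson.types.ObjectId to its String form so JavaBinCodec
// can write it with its built-in String support.
public class ObjectIdResolver implements JavaBinCodec.ObjectResolver {
    public Object resolve(Object o, JavaBinCodec codec) {
        if (o instanceof ObjectId) {
            return o.toString();   // hand back a type the codec knows natively
        }
        return o;                  // anything else is left to the codec
    }
}
Such a resolver would be passed to the codec constructor, e.g. new JavaBinCodec(new ObjectIdResolver()). In practice, stringifying the _id in the import transformer as shown in the previous answer avoids having to hook into Solr's update path at all.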

A strange phenomenon when using Dozer in a JPA project: why doesn't the @Mapping annotation work on a lazily loaded object?

I ran into a very strange phenomenon when using Dozer in a JPA project.
I have a UserSupplier object and a Supplier object.
UserSupplier:
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "supplier_id", nullable = false)
private Supplier supplier;
In my code I first query a UserSupplier list, then convert it to a Supplier list.
List<Supplier> supplierList = new ArrayList<>(usList.size());
usList.forEach(us -> supplierList.add(us.getSupplier()));
Then I convert the Supplier list to a SupplierView list and return it to the caller.
BeanMapper.mapList(supplierList, SupplierView.class);
My Dozer configuration for these classes is as below.
Supplier:
@Id
@GeneratedValue
@Mapping("supplierId")
private int id;
SupplierView:
private int supplierId;
Oddly, supplierId in SupplierView is always 0 (the default int value), but the other fields convert successfully; only the id field fails. I don't know why this is: why can't the id field be converted to supplierId when the other fields can?
For the above problem there are the following workarounds.
1. Change the field name (supplierId to id):
Supplier:
// @Mapping("supplierId")
private int id;
SupplierView:
private int id;
but then the caller (front end) has to change its code.
2. Change the fetch type to eager:
UserSupplier:
@ManyToOne
private Supplier supplier;
After reading the Dozer documentation and trying it out, I found another solution: add a dozer.properties file to the classpath with the following content:
org.dozer.util.DozerProxyResolver=org.dozer.util.HibernateProxyResolver
For more detail see
http://dozer.sourceforge.net/documentation/proxyhandling.html
This is probably because JPA uses proxy objects for lazy loading of a single entity reference. The proxy object is effectively a subclass of your entity class. I guess that Dozer can find the @Mapping annotation only on fields declared in the class of the given object, and not on fields defined in parent classes. The Dozer project states that annotation mapping is experimental, so it is possible that it does not cover mapping class hierarchies well.
I suggest trying to configure the mapping of supplierId by other means (XML, the Dozer mapping API) and seeing if it works. If all else fails, you could write a custom MapperAware converter between Supplier and SupplierView: map the source object to the target object using the supplied mapper, and finalize it by copying the value of id to supplierId.
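As an illustration of the "dozer mapping API" option, a sketch might look like the following (class and method names here are just illustrative; whether this alone is enough, or the HibernateProxyResolver property above is still needed, depends on how the proxy class is resolved):
import org.dozer.DozerBeanMapper;
import org.dozer.loader.api.BeanMappingBuilder;

// Sketch: declare the id -> supplierId field mapping programmatically so it does
// not rely on the @Mapping annotation being found on the proxied entity class.
public class SupplierMapperFactory {
    public static DozerBeanMapper createMapper() {
        DozerBeanMapper mapper = new DozerBeanMapper();
        mapper.addMapping(new BeanMappingBuilder() {
            @Override
            protected void configure() {
                mapping(Supplier.class, SupplierView.class)
                        .fields("id", "supplierId");   // Supplier.id -> SupplierView.supplierId
            }
        });
        return mapper;
    }
}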

I need to check for validity of data before deep mapping with dozer, can I?

I'm using Dozer to map between my model entities and my DTOs.
Now I'm facing the problem that I need to map some properties of ClassA.ClassC to different properties of ClassB, but first I need to check for inconsistency, because if I don't, ClassC will throw an exception and the mapping will not work.
So assume that I have:
class ClassA {
    private String name;
    private ClassC c;

    public ClassC getC() throws ValidityException;
}

class ClassB {
    private String code;
    private Integer value;
}

class ClassC {
    private String name;
    private Integer value;
    // Getters & Setters below
}
So now I want to map like this:
<mapping>
  <class-a>ClassA</class-a>
  <class-b>ClassB</class-b>
  <field>
    <a>c.name</a>
    <b>code</b>
  </field>
  <field>
    <a>c.value</a>
    <b>value</b>
  </field>
</mapping>
If accessing the ClassC instance from the ClassA instance throws an exception, I need to map null to both ClassB properties.
From what I have read I assume I should use a CustomConverter to access the ClassC instance, catch the exception, and map null in that case, but I'm not sure how to implement this kind of converter.
Could anyone give me some ideas about how this can be implemented using Dozer?
Are you sure you wrote the correct mapping? Because in
<field>
  <a>c.name</a>
  <b>name</b>
</field>
you wrote name for ClassB. Actually it should be code.
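As for the converter itself, a minimal sketch of the kind of converter asked about could look like this (it assumes ClassB has setters for code and value, and that ValidityException is the checked exception declared on getC()):
import org.dozer.DozerConverter;

// Sketch: converts ClassA to ClassB, mapping null into both ClassB properties
// when reading ClassC from ClassA throws a ValidityException.
public class ClassAToClassBConverter extends DozerConverter<ClassA, ClassB> {

    public ClassAToClassBConverter() {
        super(ClassA.class, ClassB.class);
    }

    @Override
    public ClassB convertTo(ClassA source, ClassB destination) {
        ClassB result = (destination != null) ? destination : new ClassB();
        try {
            ClassC c = source.getC();      // may throw ValidityException
            result.setCode(c.getName());
            result.setValue(c.getValue());
        } catch (ValidityException e) {
            result.setCode(null);          // inconsistent data: map nulls instead
            result.setValue(null);
        }
        return result;
    }

    @Override
    public ClassA convertFrom(ClassB source, ClassA destination) {
        // the reverse direction is not needed for this sketch
        return destination;
    }
}
It would then be registered for the ClassA/ClassB pair in the Dozer configuration (for example in a custom-converters section), replacing the two field mappings shown above.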

Spring MVC <form:options> selected value

I have a "many to one" relation between my CPVCode and OrderType:
public class CPVCode {
    @Id
    @GeneratedValue
    private int id;
    private String cpv_code;
    private String description;

    @ManyToOne
    @JoinColumn(name = "id_parent")
    private OrderType orderType;
    // getters and setters: ...
}
Everything works well, but I NEED to display the selected value in my form:
<form:select path="orderType" items="${orderTypes }" itemLabel="title" itemValue="id" ></form:select>
It seems to work almost right: it displays the list of all OrderTypes (via ${orderTypes}, which returns an array of objects of that type), and it saves the proper values through Hibernate, BUT there is no way to get the current value of orderType selected after refreshing...
You're passing a list to a select box, so it iterates over the list. You need to change which bean the select box references: a single-valued orderType from CPVCode.
And possibly also change the select box to a different HTML form element?