InstantiationException using Dozer - dozer

Using Dozer 5.4.0 to do mappings and I am running into an InstantiationException from the DozerConverter abstract class. Here is my class, for the most part... BTW - I tried both List and Map as well as HashMap and ArrayList - not expecting a difference, and I wasn't disappointed!!
public class DozerJAXBElementConverter extends DozerConverter<ArrayList<JAXBElement<String>>, HashMap<String, String>> {

    public DozerJAXBElementConverter(Class<ArrayList<JAXBElement<String>>> prototypeA, Class<HashMap<String, String>> prototypeB) {
        super(prototypeA, prototypeB);
    }

    @Override
    public ArrayList<JAXBElement<String>> convertFrom(HashMap<String, String> sourceStringMap, ArrayList<JAXBElement<String>> destJaxbList) {
        // Nothing to convert?
        if (sourceStringMap == null || sourceStringMap.isEmpty()) return null;
        // Instantiate the list if not already
        if (destJaxbList == null) destJaxbList = new ArrayList<JAXBElement<String>>();
        // convert
        Iterator<Entry<String, String>> setIterator = sourceStringMap.entrySet().iterator();
        while (setIterator.hasNext()) {
            Entry<String, String> e = setIterator.next();
            if (e != null) {
                destJaxbList.add(new JAXBElement<String>(new QName(e.getKey(), DozerJAXBElementConverter.NAMESPACE), String.class, e.getValue()));
            }
        }
        return destJaxbList;
    }

    @Override
    public HashMap<String, String> convertTo(ArrayList<JAXBElement<String>> sourceJaxbList, HashMap<String, String> destStringMap) {
        // Nothing to convert?
        if (sourceJaxbList == null || sourceJaxbList.isEmpty()) return null;
        // Instantiate the map if not already
        if (destStringMap == null) destStringMap = new HashMap<String, String>();
        // convert
        Iterator<JAXBElement<String>> i = sourceJaxbList.iterator();
        while (i.hasNext()) {
            JAXBElement<String> element = i.next();
            if (element != null) {
                destStringMap.put(element.getName().toString(), element.getValue());
            }
        }
        return destStringMap;
    }
}
I implement the appropriate methods and it all compiles, but I get the following trace at runtime - any help is appreciated:
org.dozer.MappingException: java.lang.InstantiationException: gov.dhs.cbp.ctpat.pip.translate.DozerJAXBElementConverter
at org.dozer.util.MappingUtils.throwMappingException(MappingUtils.java:82) ~[dozer-5.4.0.jar:?]
at org.dozer.util.ReflectionUtils.newInstance(ReflectionUtils.java:360) ~[dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.mapUsingCustomConverter(MappingProcessor.java:971) ~[dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.mapFromFieldMap(MappingProcessor.java:345) ~[dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.mapField(MappingProcessor.java:288) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.map(MappingProcessor.java:248) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.map(MappingProcessor.java:197) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.mapCustomObject(MappingProcessor.java:495) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.mapOrRecurseObject(MappingProcessor.java:446) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.mapFromFieldMap(MappingProcessor.java:342) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.mapField(MappingProcessor.java:288) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.map(MappingProcessor.java:248) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.map(MappingProcessor.java:197) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.map(MappingProcessor.java:187) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.map(MappingProcessor.java:124) [dozer-5.4.0.jar:?]
at org.dozer.MappingProcessor.map(MappingProcessor.java:119) [dozer-5.4.0.jar:?]
at org.dozer.DozerBeanMapper.map(DozerBeanMapper.java:120) [dozer-5.4.0.jar:?]

I simplified it to:
public class DozerJAXBElementConverter extends DozerConverter<List, Map> {

    public DozerJAXBElementConverter(Class<List> prototypeA, Class<Map> prototypeB) {
        super(prototypeA, prototypeB);
    }

    public DozerJAXBElementConverter() {
        super(List.class, Map.class);
    }
to get it working - not sure why the original impl didn't work...
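The likely culprit is visible in the trace: org.dozer.util.ReflectionUtils.newInstance shows Dozer instantiating the custom converter reflectively, which requires a public no-arg constructor, and the original class only declared the two-argument one. Under that assumption, a sketch that keeps the original parameterized signature while adding the default constructor Dozer needs:
// Sketch, assuming the InstantiationException comes from the missing
// no-arg constructor (Dozer builds converters via ReflectionUtils.newInstance).
public class DozerJAXBElementConverter
        extends DozerConverter<ArrayList<JAXBElement<String>>, HashMap<String, String>> {

    @SuppressWarnings("unchecked")
    public DozerJAXBElementConverter() {
        // Raw class literals have to be cast up to the parameterized Class tokens.
        super((Class<ArrayList<JAXBElement<String>>>) (Class<?>) ArrayList.class,
              (Class<HashMap<String, String>>) (Class<?>) HashMap.class);
    }

    // convertTo / convertFrom exactly as in the original class...
}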


Mixing Kafka Streams DSL with Processor API to get offset

I am trying to find a way to log the offset when an exception occurs.
Here is what I am trying to achieve:
void createTopology(StreamsBuilder builder) {
    builder.stream(topic, Consumed.with(Serdes.String(), new JsonSerde()))
            .filter(...)
            .mapValues(value -> {
                Map<String, Object> output;
                try {
                    output = decode(value.get("data"));
                } catch (DecodingException e) {
                    LOGGER.error(e.getMessage());
                    // TODO: LOG OFFSET FOR FAILED DECODE HERE
                    return new ArrayList<>();
                }
                ...
                return output;
            })
            .filter((k, v) -> !(v instanceof List && ((List<?>) v).isEmpty()))
            .to(sink_topic);
}
I found this: https://docs.confluent.io/platform/current/streams/developer-guide/dsl-api.html#streams-developer-guide-dsl-transformations-stateful
and my understanding is that I need to use the Processor API, but I still haven't found a solution for my issue.
A ValueTransformer can also access the offset via the ProcessorContext passed to init, and I believe it's much easier.
Here is the solution, as suggested by IUSR: https://stackoverflow.com/a/73465691/14945779 (thank you):
static class InjectOffsetTransformer implements ValueTransformer<JsonObject, JsonObject> {

    private ProcessorContext context;

    @Override
    public void init(ProcessorContext context) {
        this.context = context;
    }

    @Override
    public JsonObject transform(JsonObject value) {
        value.addProperty("offset", context.offset());
        return value;
    }

    @Override
    public void close() {
    }
}
void createTopology(StreamsBuilder builder) {
    builder.stream(topic, Consumed.with(Serdes.String(), new JsonSerde()))
            .filter(...)
            .transformValues(InjectOffsetTransformer::new)
            .mapValues(value -> {
                Map<String, Object> output;
                try {
                    output = decode(value.get("data"));
                } catch (DecodingException e) {
                    LOGGER.warn(String.format("Error reading from topic %s. Last read offset %s:", topic, lastReadOffset), e);
                    return new ArrayList<>();
                }
                lastReadOffset = value.get("offset").getAsLong();
                return output;
            })
            .filter((k, v) -> !(v instanceof List && ((List<?>) v).isEmpty()))
            .to(sink_topic);
}
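On newer Kafka Streams releases (3.3+), transformValues()/ValueTransformer are deprecated in favor of the api.processor types; here is a sketch of the same offset-injection idea using processValues(), assuming the value is still a Gson JsonObject:
// Same idea with the newer Processor API (Kafka Streams 3.3+).
static class InjectOffsetProcessor implements FixedKeyProcessor<String, JsonObject, JsonObject> {

    private FixedKeyProcessorContext<String, JsonObject> context;

    @Override
    public void init(FixedKeyProcessorContext<String, JsonObject> context) {
        this.context = context;
    }

    @Override
    public void process(FixedKeyRecord<String, JsonObject> record) {
        // recordMetadata() is only present for records that came from a topic.
        context.recordMetadata().ifPresent(meta -> record.value().addProperty("offset", meta.offset()));
        context.forward(record);
    }
}

// Wired in with .processValues(InjectOffsetProcessor::new) in place of
// .transformValues(InjectOffsetTransformer::new).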

@Inject issues in AttributeConverter in JPA 2.2/Java EE 8/Glassfish v5

JPA 2.2 should support @Inject in AttributeConverter, as described in the specification:
Attribute converter classes in Java EE environments support dependency injection through the Contexts
and Dependency Injection API (CDI) [ 7 ] when CDI is enabled[51]. An attribute converter class that
makes use of CDI injection may also define lifecycle callback methods annotated with the PostConstruct
and PreDestroy annotations. These methods will be invoked after injection has taken
place and before the attribute converter instance is destroyed respectively.
But when I convert my JPA 2.1 converter sample to JPA 2.2, it does not work.
The original converter worked on both Glassfish v4 and v5:
@Converter
public class ListToStringConveter implements AttributeConverter<List<String>, String> {

    //@Inject Logger log;

    @Override
    public String convertToDatabaseColumn(List<String> attribute) {
        if (attribute == null || attribute.isEmpty()) {
            return "";
        }
        return StringUtils.join(attribute, ",");
    }

    @Override
    public List<String> convertToEntityAttribute(String dbData) {
        if (dbData == null || dbData.trim().length() == 0) {
            return new ArrayList<String>();
        }
        String[] data = dbData.split(",");
        return Arrays.asList(data);
    }
}
To test the injection support of AttributeConverter in JPA 2.2, I extracted the conversion logic into another CDI bean and tried to run the code on Glassfish v5 (the Java EE 8 reference implementation).
@Converter(autoApply = false)
public class TagsConverter implements AttributeConverter<List<String>, String> {

    // private static final Logger LOG = Logger.getLogger(TagsConverter.class.getName());
    //
    // @Override
    // public String convertToDatabaseColumn(List<String> attribute) {
    //     if (attribute == null || attribute.isEmpty()) {
    //         return "";
    //     }
    //     return String.join(",", attribute);
    // }
    //
    // @Override
    // public List<String> convertToEntityAttribute(String dbData) {
    //     if (dbData == null || dbData.trim().length() == 0) {
    //         return new ArrayList<>();
    //     }
    //
    //     String[] data = dbData.split(",");
    //     return Arrays.asList(data);
    // }

    @Inject
    Logger LOG;

    @Inject
    ConverterUtils utils;

    @PostConstruct
    public void postConstruct() {
        LOG.log(Level.INFO, "calling @PostConstruct");
    }

    @PreDestroy
    public void preDestroy() {
        LOG.log(Level.INFO, "calling @PreDestroy");
    }

    @Override
    public String convertToDatabaseColumn(List<String> attribute) {
        LOG.log(Level.FINEST, "utils injected: {0}", utils != null);
        if (attribute == null || attribute.isEmpty()) {
            return "";
        }
        return utils.listToString(attribute);
    }

    @Override
    public List<String> convertToEntityAttribute(String dbData) {
        if (dbData == null || dbData.trim().length() == 0) {
            return Collections.<String>emptyList();
        }
        return utils.stringToList(dbData);
    }
}
And the ConverterUtils class:
@ApplicationScoped
public class ConverterUtils {

    public String listToString(List<String> tags) {
        return String.join(",", tags);
    }

    public List<String> stringToList(String str) {
        return Arrays.asList(str.split(","));
    }
}
In the TagsConverter, the expected ConverterUtils is never injected; it is always null when called, so an NPE is thrown.
The complete code can be found here.
Update: I found I had created an issue about this on the EclipseLink Bugzilla 4 years ago.
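Until the injection actually happens, one possible workaround (a sketch, assuming the converter runs inside a container where CDI.current() resolves, which is standard CDI 1.1+ API) is to fall back to a programmatic lookup:
// Sketch: programmatic CDI lookup as a fallback for providers that
// instantiate the converter themselves instead of asking CDI for it.
@Converter(autoApply = false)
public class TagsConverter implements AttributeConverter<List<String>, String> {

    @Inject
    ConverterUtils utils;

    private ConverterUtils utils() {
        // Use the injected bean when present, otherwise look it up by hand.
        return (utils != null) ? utils : CDI.current().select(ConverterUtils.class).get();
    }

    @Override
    public String convertToDatabaseColumn(List<String> attribute) {
        return (attribute == null || attribute.isEmpty()) ? "" : utils().listToString(attribute);
    }

    @Override
    public List<String> convertToEntityAttribute(String dbData) {
        if (dbData == null || dbData.trim().length() == 0) {
            return Collections.<String>emptyList();
        }
        return utils().stringToList(dbData);
    }
}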

Refresh suggestion list on change - cn1 autocomplete

I've implemented a custom autocomplete text field in a cn1 app, but I've noticed it only loads the suggestions list once; after that, any change in the text doesn't trigger a change in the list, and getSuggestionModel() is never called again. How can I achieve this (in my mind, basic) functionality?
This is my autocomplete class:
public class ForumNamesAutocomplete extends AutoCompleteTextField {

    List<String> suggestions = new LinkedList<String>();
    List<Map<String, Object>> fData;
    StateMachine mac;
    int currentIndex;
    String prevText;

    public static final String KEY_FORUM_NAME = "name";
    public static final String KEY_FORUM_ID = "id";
    public static final String KEY_FORUM_DESC = "desc";

    public ForumNamesAutocomplete(StateMachine sm) {
        super();
        mac = sm;
        if (sm.forumData != null) {
            fData = mac.forumData;
        }
    }

    @Override
    protected boolean filter(String text) {
        if (text.equals(prevText)) {
            return false;
        }
        setSuggestionList(text);
        fireDataChanged(DataChangedListener.CHANGED, text.length());
        prevText = text;
        return true;
    }

    @Override
    public void fireDataChanged(int type, int index) {
        super.fireDataChanged(type, index);
    }

    public void setSuggestionList(String s) {
        if (suggestions == null) {
            suggestions = new LinkedList<String>();
        } else {
            suggestions.clear();
        }
        LinkedList<String> descList = new LinkedList<String>();
        for (int i = 0; i < fData.size(); i++) {
            boolean used = false;
            Map<String, Object> forumMap = fData.get(i);
            if (((String) forumMap.get(KEY_FORUM_NAME)).indexOf(s) != -1) {
                suggestions.add((String) forumMap.get(KEY_FORUM_NAME));
                used = true;
            }
            if (!used && ((String) forumMap.get(KEY_FORUM_DESC)).indexOf(s) != -1) {
                descList.add((String) forumMap.get(KEY_FORUM_NAME));
            }
        }
        suggestions.addAll(descList);
    }

    @Override
    protected ListModel<String> getSuggestionModel() {
        return new DefaultListModel<String>(suggestions);
    }
}
This used to be simpler and is a bit problematic now, as explained in this issue.
Technically what you need to do is return one model and then mutate said model/fire modified events so everything refreshes. This is non-trivial and might not work correctly for all use cases, so ideally we should have a simpler API for this as we move forward.
After additional debugging, I saw that the getSuggestionModel() method was called only during initialization, and whatever the suggestion list (in the suggestions object) was at that point, it remained so. Instead I needed to manipulate the underlying ListModel object:
public class ForumNamesAutocomplete extends AutoCompleteTextField {

    // ListModel is an interface; DefaultListModel is the mutable implementation
    ListModel<String> myModel = new DefaultListModel<String>();
    ...

    @Override
    protected boolean filter(String text) {
        if (text.length() > 1) {
            return false;
        }
        setSuggestionList(text);
        return true;
    }

    private void setSuggestionList(String s) {
        if (myModel == null) {
            myModel = new DefaultListModel<String>();
        } else {
            while (myModel.getSize() > 0) {
                myModel.removeItem(0);
            }
        }
        for (int i = 0; i < fData.size(); i++) {
            boolean used = false;
            Map<String, Object> forumMap = fData.get(i);
            if (((String) forumMap.get(KEY_FORUM_NAME)).indexOf(s) != -1) {
                myModel.addItem((String) forumMap.get(KEY_FORUM_NAME));
                used = true;
            }
            if (!used && ((String) forumMap.get(KEY_FORUM_DESC)).indexOf(s) != -1) {
                myModel.addItem((String) forumMap.get(KEY_FORUM_NAME));
            }
        }
    }
    ...
}
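Per the quoted advice to return one model and then mutate it, the override hidden behind the ellipsis presumably keeps handing back the same model instance - a sketch under that assumption:
// Sketch: always return the same mutable model so the mutations
// performed in setSuggestionList() are reflected in the popup.
@Override
protected ListModel<String> getSuggestionModel() {
    return myModel;
}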

spring data mongodb converter

I am using spring-data-mongodb 1.4.1.RELEASE.
My entity 'Event' has a getter method that is calculated from other properties:
public int getStatus() {
    return (getMainEventId() == null)
            ? (elapseTimeInMin() < MINIMUM_TIME ? CANDIDATE : VALID)
            : POINTER;
}
I wanted the property 'status' to be persisted only through the getter, so I wrote converters:
@WritingConverter
public class EventWriteConverter implements Converter<Event, BasicDBObject> {

    static final Logger logger = LoggerFactory.getLogger(EventWriteConverter.class.getCanonicalName());

    public BasicDBObject convert(Event event) {
        logger.info("converting " + event);
        if (event.getMainEventId() != null)
            return new BasicDBObject("mainEventId", event.getMainEventId());
        BasicDBObject doc = new BasicDBObject("status", event.getStatus())
                .append("updated_date", new Date())
                .append("start", event.getS0())
                .append("end", event.getS1())
                .append("location", event.getLocation());
        doc.append("access_points", event.getHotPoints());
        return doc;
    }
}
@ReadingConverter
public class EventReadConverter implements Converter<BasicDBObject, Event> {

    @Inject
    HotPointRepositry hotRepositry;

    static final Logger logger = LoggerFactory.getLogger(EventReadConverter.class.getCanonicalName());

    public Event convert(BasicDBObject doc) {
        logger.info(" converting ");
        Event event = new Event();
        event.setId(doc.getObjectId("_id"));
        event.setS0(doc.getDate("start"));
        event.setS1(doc.getDate("end"));
        BasicDBList dblist = (BasicDBList) doc.get("hot_points");
        if (dblist != null) {
            for (Object obj : dblist) {
                ObjectId hotspotId = ((BasicDBObject) obj).getObjectId("_id");
                event.addHot(hotRepositry.findOne(hotspotId));
            }
        }
        dblist = (BasicDBList) doc.get("devices");
        if (dblist != null) {
            for (Object obj : dblist)
                event.addDevice(obj.toString());
        }
        event.setMainEventId(doc.getObjectId("mainEventId"));
        return event;
    }
}
My test Mongo configuration is:
#Profile("test")
#Configuration
#EnableMongoRepositories(basePackages = "com.echo.spring.data.mongo")
#ComponentScan(basePackages = "com.echo.spring.data.mongo" )
public class MongoDbTestConfig extends AbstractMongoConfiguration {
static final Logger logger = LoggerFactory.getLogger(MongoDbTestConfig.class.getCanonicalName());
#Override
protected String getDatabaseName() {
return "echo";
}
#Override
public Mongo mongo() {
return new Fongo("echo-test").getMongo();
}
#Override
protected String getMappingBasePackage() {
return "com.echo.spring.data.mongo";
}
#Bean
#Override
public CustomConversions customConversions() {
logger.info("loading custom converters");
List<Converter<?, ?>> converterList = new ArrayList<Converter<?, ?>>();
converterList.add(new EventReadConverter());
converterList.add(new EventWriteConverter());
CustomConversions cus = new CustomConversions(converterList);
return new CustomConversions(converterList);
}
}
And my test (using Fongo) is:
ActiveProfiles("test")
#RunWith(SpringJUnit4ClassRunner.class)
#ContextConfiguration(classes = MongoDbTestConfig.class )
public class SampleMongoApplicationTests {
#Test
#ShouldMatchDataSet(location = "/MongoJsonData/events.json")
public void shouldSaveEvent() throws IOException {
URL url = Resources.getResource("MongoJsonData/events.json");
List<String> lines = Resources.readLines(url,Charsets.UTF_8);
for (String line : lines) {
Event event = objectMapper.readValue(line.getBytes(),Event.class);
eventRepository.save(event);
}
}
I can see the converters are loaded when the configuration's customConversions() is called.
I added logging and breakpoints in the convert methods, but they do not seem to be called when I run or debug, even though they are loaded.
What am I doing wrong?
I had a similar situation. I followed Spring -Mongodb storing/retrieving enums as int not string, and I needed both the Converter AND the ConverterFactory wired to get it working.
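For reference, a minimal sketch of what wiring both can look like. The IntegerToEnumConverterFactory below is hypothetical (modeled on the enum pattern from the linked question); the point is that the CustomConversions constructor takes a List<?>, so Converter and ConverterFactory instances can be registered side by side:
// Hypothetical factory, shown only to illustrate registering a
// ConverterFactory alongside plain Converters.
public class IntegerToEnumConverterFactory implements ConverterFactory<Integer, Enum> {

    @Override
    public <T extends Enum> Converter<Integer, T> getConverter(final Class<T> targetType) {
        return new Converter<Integer, T>() {
            @Override
            public T convert(Integer source) {
                // Ordinal-based lookup, as in the linked enums-as-int question.
                return targetType.getEnumConstants()[source];
            }
        };
    }
}

// In the @Configuration class:
@Bean
@Override
public CustomConversions customConversions() {
    List<Object> converters = new ArrayList<Object>();
    converters.add(new EventReadConverter());
    converters.add(new EventWriteConverter());
    converters.add(new IntegerToEnumConverterFactory());
    return new CustomConversions(converters);
}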

DataFlavor in JavaFX not recognized correctly

I'm experiencing a problem when dragging and dropping a custom object from Swing to JavaFX, and I'm wondering if I'm doing something wrong or if it's probably a JavaFX bug.
My Transferable is defined as follows:
public class TransferableEmployee implements Transferable {

    public static final DataFlavor EMPLOYEE_FLAVOR = new DataFlavor(Employee[].class, "Employee");
    public static final DataFlavor DEFINITION_FLAVOR = new DataFlavor(PropertyDefinition[].class, "Definition");
    private static final DataFlavor[] FFLAVORS = {EMPLOYEE_FLAVOR, DEFINITION_FLAVOR};

    private Employee[] employees;
    private PropertyDefinition[] propertyDefinitions;

    public TransferableEmployee(Employee[] employees, PropertyDefinition[] propertyDefinitions) {
        this.employees = employees != null ? employees.clone() : null;
        this.propertyDefinitions = propertyDefinitions != null ? propertyDefinitions.clone() : null;
    }

    public DataFlavor[] getTransferDataFlavors() {
        return FFLAVORS.clone();
    }

    public Object getTransferData(DataFlavor aFlavor) throws UnsupportedFlavorException {
        Object returnObject = null;
        if (aFlavor.equals(EMPLOYEE_FLAVOR)) {
            returnObject = employees;
        } else if (aFlavor.equals(DEFINITION_FLAVOR)) {
            returnObject = propertyDefinitions;
        } else {
            throw new UnsupportedFlavorException(aFlavor);
        }
        return returnObject;
    }

    public boolean isDataFlavorSupported(DataFlavor aFlavor) {
        boolean lReturnValue = false;
        for (int i = 0, n = FFLAVORS.length; i < n; i++) {
            if (aFlavor.equals(FFLAVORS[i])) {
                lReturnValue = true;
                break;
            }
        }
        return lReturnValue;
    }
}
I've created an ImageView (FX component) and added setOnDragOver to it as follows:
employeePhotoImageView.setOnDragOver(new EventHandler<DragEvent>() {
    @Override
    public void handle(DragEvent event) {
        System.out.println("dragOver");
        event.getDragboard().getContentTypes();
        event.getDragboard().getContent(DataFormat.lookupMimeType("application/x-java-serialized-object"));
    }
});
The getContentTypes() call returns [[application/x-java-serialized-object]], so I then try to get the content, and it only returns the list of PropertyDefinition but no Employee at all (which in this case is the one I need).
If I remove the PropertyDefinition data from the Transferable, the employee is returned by the getContent(DataFormat) method.
To me, this means that JavaFX either works with only one DataFlavor or is somehow returning only the last flavor found in the Transferable.
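One way to narrow it down is to dump everything the Dragboard reports during the drag (a sketch using only the standard javafx.scene.input API, with employeePhotoImageView as above). Note that both flavors are built on the same base MIME type, application/x-java-serialized-object, which might explain one entry displacing the other on the FX side, since DataFormat is keyed by MIME type:
// Diagnostic sketch: list every DataFormat the Dragboard actually carries,
// to see whether both flavors survive the Swing-to-FX handoff at all.
employeePhotoImageView.setOnDragOver(event -> {
    Dragboard db = event.getDragboard();
    for (DataFormat format : db.getContentTypes()) {
        System.out.println(format + " -> " + db.getContent(format));
    }
    event.acceptTransferModes(TransferMode.COPY);
    event.consume();
});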
Any clues on this?
Thanks in advance...