Spring Boot Generic JPA AttributeConverter - spring-data-jpa

I have the sample code below. I am trying to write a generic JPA converter that can convert a collection of user-defined objects to JSON and vice versa.
The code below is my attempt, but it doesn't look correct. Please take a look.
To be more clear, I need the following:
List to JSON string
JSON string to List
Please suggest a solution.
@Converter(autoApply = true)
public class SetJsonConverter<E extends Collections> implements AttributeConverter<E, Object> {

    @Override
    public Object convertToDatabaseColumn(E e) {
        return null;
    }

    @Override
    public E convertToEntityAttribute(Object o) {
        ObjectMapper objectMapper = new ObjectMapper();
        return null;
    }
}

JPA will not automatically handle generic converters; each combination of collection type and element type requires its own subclass. Define the base converter as follows:
public abstract class AbstractJsonConverter<T, C extends Collection<T>> implements AttributeConverter<C, String> {

    private final ObjectMapper objectMapper;
    private final Class<T> elementType;
    private final TypeReference<C> collectionType;

    public AbstractJsonConverter(ObjectMapper objectMapper, Class<T> elementType, TypeReference<C> collectionType) {
        this.objectMapper = objectMapper;
        this.elementType = elementType;
        this.collectionType = collectionType;
    }

    @Override
    public String convertToDatabaseColumn(C collection) {
        try {
            return objectMapper.writeValueAsString(collection);
        } catch (JsonProcessingException e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public C convertToEntityAttribute(String jsonString) {
        try {
            return objectMapper.readValue(jsonString, collectionType);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
You then define specific converters as:
@Converter(autoApply = true)
public class UserSetConverter extends AbstractJsonConverter<User, Set<User>> {

    public UserSetConverter(ObjectMapper objectMapper) {
        super(objectMapper, User.class, new TypeReference<Set<User>>() {});
    }
}
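For reference, a hypothetical entity field using the converter could look like the sketch below; the Team entity, column name, and columnDefinition are assumptions and not part of the original question:
@Entity
public class Team {

    @Id
    private Long id;

    // Stored as a JSON string in a text column; the converter handles both directions.
    @Convert(converter = UserSetConverter.class)
    @Column(name = "members", columnDefinition = "text")
    private Set<User> members;

    // Getters and setters
}
Depending on the provider, autoApply may not pick up parameterized collection types reliably, so an explicit @Convert on the field is the safer choice here.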

Related

Flink - KafkaSink not writing data to kafka topic

I'm trying to read JSON events from Kafka, aggregate them by eventId and category, and write the results to a different Kafka topic through Flink. The program is able to read messages from Kafka, but the KafkaSink is not writing the data back to the other Kafka topic. I'm not sure what mistake I'm making. Can someone please check and let me know where I'm wrong? Here is the code I'm using.
KafkaSource<EventMessage> source = KafkaSource.<EventMessage>builder()
        .setBootstrapServers(LOCAL_KAFKA_BROKER)
        .setTopics(INPUT_KAFKA_TOPIC)
        .setGroupId(LOCAL_GROUP)
        .setStartingOffsets(OffsetsInitializer.earliest())
        .setValueOnlyDeserializer(new InputDeserializationSchema())
        .build();

WindowAssigner<Object, TimeWindow> windowAssigner = TumblingEventTimeWindows.of(WINDOW_SIZE);

DataStream<EventMessage> eventStream = env.fromSource(source, WatermarkStrategy.noWatermarks(), "Event Source");

DataStream<EventSummary> events =
        eventStream
                .keyBy(eventMessage -> eventMessage.getCategory() + eventMessage.getEventId())
                .window(windowAssigner)
                .aggregate(new EventAggregator())
                .name("EventAggregator test >> ");

KafkaSink<EventSummary> sink = KafkaSink.<EventSummary>builder()
        .setBootstrapServers(LOCAL_KAFKA_BROKER)
        .setRecordSerializer(KafkaRecordSerializationSchema.builder()
                .setTopic(OUTPUT_KAFKA_TOPIC)
                .setValueSerializationSchema(new OutputSummarySerializationSchema())
                .build())
        .setDeliverGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
        .build();

events.sinkTo(sink);
These are the POJO's I've created for input message and output.
// EventMessage POJO
public class EventMessage implements Serializable {

    private Long timestamp;
    private int eventValue;
    private String eventId;
    private String category;

    public EventMessage() { }

    public EventMessage(Long timestamp, int eventValue, String eventId, String category) {
        this.timestamp = timestamp;
        this.eventValue = eventValue;
        this.eventId = eventId;
        this.category = category;
    }
    .....
}
// EventSummary POJO
public class EventSummary {

    public EventMessage eventMessage;
    public int sum;
    public int count;

    public EventSummary() { }
    ....
}
These are the deserialization and serialization schemas I'm using.
public class InputDeserializationSchema implements DeserializationSchema<EventMessage> {

    static ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public EventMessage deserialize(byte[] bytes) throws IOException {
        return objectMapper.readValue(bytes, EventMessage.class);
    }

    @Override
    public boolean isEndOfStream(EventMessage inputMessage) {
        return false;
    }

    @Override
    public TypeInformation<EventMessage> getProducedType() {
        return TypeInformation.of(EventMessage.class);
    }
}
public class OutputSummarySerializationSchema implements SerializationSchema<EventSummary> {

    static ObjectMapper objectMapper = new ObjectMapper();
    Logger logger = LoggerFactory.getLogger(OutputSummarySerializationSchema.class);

    @Override
    public byte[] serialize(EventSummary eventSummary) {
        if (objectMapper == null) {
            // create the mapper before configuring it
            objectMapper = new ObjectMapper();
            objectMapper.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY);
        }
        try {
            String json = objectMapper.writeValueAsString(eventSummary);
            return json.getBytes();
        } catch (com.fasterxml.jackson.core.JsonProcessingException e) {
            logger.error("Failed to parse JSON", e);
        }
        return new byte[0];
    }
}
I'm using this aggregator for aggregating the JSON messages.
public class EventAggregator implements AggregateFunction<EventMessage, EventSummary, EventSummary> {

    private static final Logger log = LoggerFactory.getLogger(EventAggregator.class);

    @Override
    public EventSummary createAccumulator() {
        return new EventSummary();
    }

    @Override
    public EventSummary add(EventMessage eventMessage, EventSummary eventSummary) {
        eventSummary.eventMessage = eventMessage;
        eventSummary.count += 1;
        eventSummary.sum += eventMessage.getEventValue();
        return eventSummary;
    }

    @Override
    public EventSummary getResult(EventSummary eventSummary) {
        return eventSummary;
    }

    @Override
    public EventSummary merge(EventSummary summary1, EventSummary summary2) {
        return new EventSummary(null,
                summary1.sum + summary2.sum,
                summary1.count + summary2.count);
    }
}
Can someone help me on this?
Thanks in advance.
In order for event time windowing to work, you must specify a proper WatermarkStrategy. Otherwise, the windows will never close, and no results will be produced.
The role that watermarks play is to mark a place in a stream, and indicate that the stream is, at that point, complete through some specific timestamp. Until receiving this indicator of stream completeness, windows continue to wait for more events to be assigned to them.
To simplify debugging the watermarks, you might switch to a PrintSink until you get the watermarking working properly. Or, to simplify debugging the KafkaSink, you could switch to processing time windows until the sink is working.
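For illustration, here is a minimal sketch of a bounded-out-of-orderness strategy, assuming EventMessage.getTimestamp() returns the event time in epoch milliseconds (the five-second bound is an arbitrary example):
WatermarkStrategy<EventMessage> watermarkStrategy = WatermarkStrategy
        .<EventMessage>forBoundedOutOfOrderness(Duration.ofSeconds(5))
        // Tell Flink where the event timestamp lives; without this, event time windows cannot advance.
        .withTimestampAssigner((event, recordTimestamp) -> event.getTimestamp());

DataStream<EventMessage> eventStream = env.fromSource(source, watermarkStrategy, "Event Source");
If some Kafka partitions can be empty, adding .withIdleness(...) may also be needed so idle partitions do not hold back the watermark.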

How to work with PGpoint for Geolocation using PostgreSQL?

I found a lot of answers that suggest using Hibernate Spatial for geolocation data, but I want to know whether that is the best option, because I found that PostgreSQL works with PGpoint for geolocation. I implemented it, but it doesn't work because the entity doesn't save:
ERROR: column "location" is of type point but expression is of type character varying
I have the same question as another unanswered post, so let me add a further question below in case nobody knows about this one.
As a suggestion, I'd also like to know what the best way is to work with geo data in a Spring Boot context.
Thanks! Have a good day.
There is no way to save/update/get a PGpoint object directly, so you have to create your own user type to convert it before it is saved. UserType is a Hibernate interface that allows you to define a custom type and control how it is converted before being written to the database.
Here is the code you need to implement.
First: create a class that implements UserType:
public class PGPointType implements UserType {

    @Override
    public int[] sqlTypes() {
        return new int[]{ Types.VARCHAR };
    }

    @SuppressWarnings("rawtypes")
    @Override
    public Class<PGpoint> returnedClass() {
        return PGpoint.class;
    }

    @Override
    public boolean equals(Object obj, Object obj1) {
        return ObjectUtils.equals(obj, obj1);
    }

    @Override
    public int hashCode(Object obj) {
        return obj.hashCode();
    }

    @Override
    public Object nullSafeGet(ResultSet resultSet, String[] names, SharedSessionContractImplementor sharedSessionContractImplementor, Object o) throws SQLException {
        if (names.length == 1) {
            if (resultSet.wasNull() || resultSet.getObject(names[0]) == null) {
                return null;
            } else {
                return new PGpoint(resultSet.getObject(names[0]).toString());
            }
        }
        return null;
    }

    @Override
    public void nullSafeSet(PreparedStatement statement, Object value, int index, SharedSessionContractImplementor sharedSessionContractImplementor) throws SQLException {
        if (value == null) {
            statement.setNull(index, Types.OTHER);
        } else {
            statement.setObject(index, value, Types.OTHER);
        }
    }

    @Override
    public Object deepCopy(Object obj) {
        return obj;
    }

    @Override
    public boolean isMutable() {
        return Boolean.FALSE;
    }

    @Override
    public Serializable disassemble(Object obj) {
        return (Serializable) obj;
    }

    @Override
    public Object assemble(Serializable serializable, Object obj) {
        return serializable;
    }

    @Override
    public Object replace(Object obj, Object obj1, Object obj2) {
        return obj;
    }
}
Second: on the entity, add the @TypeDef annotation with a name and the PGPointType you created, and on the PGpoint field add the @Type annotation referencing that name:
@TypeDef(name = "type", typeClass = PGPointType.class)
@Entity
public class Entity {

    @Type(type = "type")
    private PGpoint pgPoint;

    // Getters and setters
}
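For illustration, a small example of saving a point, assuming a Spring Data repository for the entity exists (the repository name and coordinates are made up):
Entity entity = new Entity();
// PGpoint takes (x, y); for geo data this is typically (longitude, latitude).
entity.setPgPoint(new PGpoint(13.4050, 52.5200));
entityRepository.save(entity);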
Kind regards.

Kafka custom deserializer converting to Java object

I'm using Spring Kafka integration and I have my own generic value serializer/deserializer, as shown below.
Serializer:
public class KafkaSerializer<T> implements Serializer<T> {

    private ObjectMapper mapper;

    @Override
    public void close() {
    }

    @Override
    public void configure(final Map<String, ?> settings, final boolean isKey) {
        mapper = new ObjectMapper();
    }

    @Override
    public byte[] serialize(final String topic, final T object) {
        try {
            return mapper.writeValueAsBytes(object);
        } catch (final JsonProcessingException e) {
            throw new IllegalArgumentException(e);
        }
    }
}
Deserializer:
public class KafkaDeserializer<T> implements Deserializer<T> {

    private ObjectMapper mapper;

    @Override
    public void close() {
    }

    @Override
    public void configure(final Map<String, ?> settings, final boolean isKey) {
        mapper = new ObjectMapper();
    }

    @Override
    public T deserialize(final String topic, final byte[] bytes) {
        try {
            return mapper.readValue(bytes, new TypeReference<T>() {
            });
        } catch (final IOException e) {
            throw new IllegalArgumentException(e);
        }
    }
}
The serializer is working perfectly, but when it comes to deserializing values while consuming messages I get a LinkedHashMap instead of the desired object. Please point out where I'm going wrong. Thanks in advance.
Some things need to be confirmed first:
your Serializer works
the Deserializer also works, but it returns a LinkedHashMap instead of the object you expected, and you can't convert that LinkedHashMap to your object, right?
If so, the question boils down to how to convert/cast a LinkedHashMap to an object, and you are already using ObjectMapper. If all of the above is confirmed, this post may answer your question: Casting LinkedHashMap to Complex Object
mapper.convertValue(desiredObject, new TypeReference<type-of-desiredObject>() { })
The ObjectMapper API is documented [here](https://fasterxml.github.io/jackson-databind/javadoc/2.3.0/com/fasterxml/jackson/databind/ObjectMapper.html#convertValue(java.lang.Object, com.fasterxml.jackson.core.type.TypeReference)).
I hope I haven't missed your intention; please add any missing details so someone (or I) can improve this answer.
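For illustration, a sketch of the underlying issue and one way around it: new TypeReference<T>() {} inside a generic class erases T to Object, so Jackson falls back to LinkedHashMap. Passing the concrete target class into the deserializer avoids this; the constructor-based approach and the MyEvent type below are assumptions, not part of the original question:
public class KafkaDeserializer<T> implements Deserializer<T> {

    private final ObjectMapper mapper = new ObjectMapper();
    private final Class<T> targetType;

    // The concrete class is supplied explicitly, so no generic type information is lost at runtime.
    public KafkaDeserializer(final Class<T> targetType) {
        this.targetType = targetType;
    }

    // configure() and close() omitted; newer kafka-clients provide default implementations.

    @Override
    public T deserialize(final String topic, final byte[] bytes) {
        try {
            return mapper.readValue(bytes, targetType);
        } catch (final IOException e) {
            throw new IllegalArgumentException(e);
        }
    }
}

// Usage: new KafkaDeserializer<>(MyEvent.class)
Spring Kafka also ships a JsonDeserializer that can be given a target class, which may remove the need for a custom deserializer here.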

mongoTemplate is null. Cannot understand why

I'm trying to parse an XML file and save some info in MongoDB. I get the file to parse and give it to my @Controller. Here is the code of the POST method of the @Controller:
@RequestMapping(value = TracksGeopointsRoutes.TRACKS, method = RequestMethod.POST)
public String tracks(@RequestParam MultipartFile file) {
    TracksGeopointsDoc tracksGeopointsDoc = new TracksGeopointsDoc();
    try {
        tracksGeopointsDoc.setFile(tracksGeopointsService.convert(file));
    } catch (IOException e) {
        e.printStackTrace();
    }
    mongoTemplate.save(tracksGeopointsDoc);
    new MySaxParser(tracksGeopointsDoc.getFile().getAbsolutePath()); // here I give my file to my parser
    return "com.ub.geopoints_test.index";
}
And my parser:
@Component
public class MySaxParser extends DefaultHandler {

    @Autowired
    MongoTemplate mongoTemplate;

    private List<DotGeopointsDoc> dotGeopointsDocList;
    String xmlFileName;
    private String tmpValue;
    private DotGeopointsDoc currentDotGeopointsDoc;
    private DotGeopointsDoc dotGeopointsDoc;
    String bookXmlFileName;

    public MySaxParser() {
    }

    public MySaxParser(String bookXmlFileName) {
        this.xmlFileName = bookXmlFileName;
        dotGeopointsDocList = new ArrayList<DotGeopointsDoc>();
        dotGeopointsDoc = new DotGeopointsDoc();
        parseDocument();
    }

    private void parseDocument() {
        // parse
        SAXParserFactory factory = SAXParserFactory.newInstance();
        try {
            SAXParser parser = factory.newSAXParser();
            parser.parse(xmlFileName, this);
        } catch (ParserConfigurationException e) {
            System.out.println("ParserConfig error");
        } catch (SAXException e) {
            System.out.println("SAXException : xml not well formed");
        } catch (IOException e) {
            System.out.println("IO error");
        }
    }

    @Override
    public void startElement(String s, String s1, String elementName, Attributes attributes) throws SAXException {
        if (elementName.equalsIgnoreCase("trkpt")) {
            dotGeopointsDoc.setId(new ObjectId());
            dotGeopointsDoc.setLat(attributes.getValue("lat"));
            dotGeopointsDoc.setLon(attributes.getValue("lon"));
        }
    }

    @Override
    public void endElement(String s, String s1, String element) throws SAXException {
        if (element.equals("trkpt")) {
            dotGeopointsDocList.add(dotGeopointsDoc);
            mongoTemplate.save(dotGeopointsDoc); // here I'm getting NullPointerException. My mongoTemplate is null
            dotGeopointsDoc = new DotGeopointsDoc();
        }
    }

    @Override
    public void characters(char[] ac, int i, int j) throws SAXException {
        tmpValue = new String(ac, i, j);
    }
}
I really don't understand why my mongoTemplate is null, because in my @Controller it is not. Could anyone help me?
It's because Spring doesn't know about your MySaxParser. You should not instantiate it yourself, which is what you did here in your controller:
new MySaxParser(tracksGeopointsDoc.getFile().getAbsolutePath());
This is how your controller should look:
@Autowired
private MySaxParser mySaxParser; // this is how you can inject a Spring-managed object

@RequestMapping(value = TracksGeopointsRoutes.TRACKS, method = RequestMethod.POST)
public String tracks(@RequestParam MultipartFile file) {
    TracksGeopointsDoc tracksGeopointsDoc = new TracksGeopointsDoc();
    try {
        tracksGeopointsDoc.setFile(tracksGeopointsService.convert(file));
    } catch (IOException e) {
        e.printStackTrace();
    }
    mongoTemplate.save(tracksGeopointsDoc);
    mySaxParser.setUpMySaxParser(tracksGeopointsDoc.getFile().getAbsolutePath());
    return "com.ub.geopoints_test.index";
}
and this is how your xml parser should be changed:
@Service // this is more of a service than a component
public class MySaxParser extends DefaultHandler {

    @Autowired
    private MongoTemplate mongoTemplate;

    private List<DotGeopointsDoc> dotGeopointsDocList;
    private String xmlFileName;
    private String tmpValue;
    private DotGeopointsDoc currentDotGeopointsDoc;
    private DotGeopointsDoc dotGeopointsDoc;
    private String bookXmlFileName;

    // you do not need constructors here, just a setup method, e.g.:
    public void setUpMySaxParser(String bookXmlFileName) {
        this.xmlFileName = bookXmlFileName;
        dotGeopointsDocList = new ArrayList<DotGeopointsDoc>();
        dotGeopointsDoc = new DotGeopointsDoc();
        parseDocument();
    }

    private void parseDocument() {
        // parse
        SAXParserFactory factory = SAXParserFactory.newInstance();
        try {
            SAXParser parser = factory.newSAXParser();
            parser.parse(xmlFileName, this);
        } catch (ParserConfigurationException e) {
            System.out.println("ParserConfig error");
        } catch (SAXException e) {
            System.out.println("SAXException : xml not well formed");
        } catch (IOException e) {
            System.out.println("IO error");
        }
    }

    @Override
    public void startElement(String s, String s1, String elementName, Attributes attributes) throws SAXException {
        if (elementName.equalsIgnoreCase("trkpt")) {
            dotGeopointsDoc.setId(new ObjectId());
            dotGeopointsDoc.setLat(attributes.getValue("lat"));
            dotGeopointsDoc.setLon(attributes.getValue("lon"));
        }
    }

    @Override
    public void endElement(String s, String s1, String element) throws SAXException {
        if (element.equals("trkpt")) {
            dotGeopointsDocList.add(dotGeopointsDoc);
            mongoTemplate.save(dotGeopointsDoc); // mongoTemplate is now injected by Spring, so this no longer throws a NullPointerException
            dotGeopointsDoc = new DotGeopointsDoc();
        }
    }

    @Override
    public void characters(char[] ac, int i, int j) throws SAXException {
        tmpValue = new String(ac, i, j);
    }
}
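As a side note, a constructor-injection variant is a common alternative; it makes the MongoTemplate dependency explicit and easier to test. This is a stylistic sketch, not something the fix above requires:
@Service
public class MySaxParser extends DefaultHandler {

    private final MongoTemplate mongoTemplate;

    // Spring injects the template when it creates the bean; no field injection needed.
    public MySaxParser(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // setUpMySaxParser(...) and the SAX callbacks stay the same as above.
}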

JPA Criteria API group_concat usage

I am currently working on a report which needs a group_concat for one of the fields.
CriteriaQuery<GameDetailsDto> criteriaQuery = criteriaBuilder
        .createQuery(GameDetailsDto.class);
Root<BetDetails> betDetails = criteriaQuery.from(BetDetails.class);
Expression<String> betSelection = betDetails.get("winningOutcome");
criteriaQuery.multiselect(
        // other fields to select
        criteriaBuilder.function("group_concat", String.class, betSelection),
        // other fields to select
);
// predicate, where clause and other filters
TypedQuery<GameDetailsDto> typedQuery = entityManager.createQuery(criteriaQuery);
This throws a NullPointerException on the line:
TypedQuery<GameDetailsDto> typedQuery = entityManager.createQuery(criteriaQuery);
Did I use the CriteriaBuilder's function method incorrectly?
The documentation says:
function(String name, Class<T> type, Expression<?>... args);
I figured out how to do this with Hibernate JPA on MySQL:
1.) Created a GroupConcatFunction class implementing org.hibernate.dialect.function.SQLFunction (this handles a single-column group_concat for now):
public class GroupConcatFunction implements SQLFunction {

    @Override
    public boolean hasArguments() {
        return true;
    }

    @Override
    public boolean hasParenthesesIfNoArguments() {
        return true;
    }

    @Override
    public Type getReturnType(Type firstArgumentType, Mapping mapping)
            throws QueryException {
        return StandardBasicTypes.STRING;
    }

    @Override
    public String render(Type firstArgumentType, List arguments,
            SessionFactoryImplementor factory) throws QueryException {
        if (arguments.size() != 1) {
            throw new QueryException(new IllegalArgumentException(
                    "group_concat should have one arg"));
        }
        return "group_concat(" + arguments.get(0) + ")";
    }
}
2.) Created a CustomMySql5Dialect class extending org.hibernate.dialect.MySQL5Dialect and registered the group_concat function created in step 1 (see the sketch after step 4).
3.) In the application context, updated the jpaVendorAdapter to use CustomMySql5Dialect as the databasePlatform.
4.) Finally, to use it:
criteriaBuilder.function("group_concat", String.class,
sampleRoot.get("sampleColumnName"))
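For steps 2 and 3, a minimal sketch could look like the following; the class name, package, and the Spring Boot property shown are assumptions about your setup, not the only way to wire it:
public class CustomMySql5Dialect extends MySQL5Dialect {

    public CustomMySql5Dialect() {
        super();
        // Make group_concat visible to JPQL/Criteria queries.
        registerFunction("group_concat", new GroupConcatFunction());
    }
}
For step 3 in a Spring Boot application, something like spring.jpa.database-platform=com.example.CustomMySql5Dialect points Hibernate at the custom dialect (the package name here is hypothetical).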
Simple solution: instead of creating the whole class, just use SQLFunctionTemplate.
new SQLFunctionTemplate(StandardBasicTypes.STRING, "group_concat(?1)")
and then register this function in your own SQL dialect (eg. in constructor)
public class MyOwnSQLDialect extends MySQL5Dialect {

    public MyOwnSQLDialect() {
        super();
        this.registerFunction("group_concat", new SQLFunctionTemplate(StandardBasicTypes.STRING, "group_concat(?1)"));
    }
}
Suggested property:
spring.jpa.properties.hibernate.metadata_builder_contributor = com.inn.core.generic.utils.SqlFunctionsMetadataBuilderContributor
and class:
import org.hibernate.boot.MetadataBuilder;
import org.hibernate.boot.spi.MetadataBuilderContributor;
import org.hibernate.dialect.function.StandardSQLFunction;
import org.hibernate.type.StandardBasicTypes;
import org.springframework.stereotype.Component;

@Component
public class SqlFunctionsMetadataBuilderContributor implements MetadataBuilderContributor {

    @Override
    public void contribute(MetadataBuilder metadataBuilder) {
        metadataBuilder.applySqlFunction("config_json_extract",
                new StandardSQLFunction("json_extract", StandardBasicTypes.STRING));
        metadataBuilder.applySqlFunction("JSON_UNQUOTE",
                new StandardSQLFunction("JSON_UNQUOTE", StandardBasicTypes.STRING));
        metadataBuilder.applySqlFunction("group_concat",
                new StandardSQLFunction("group_concat", StandardBasicTypes.STRING));
    }
}