Serialize/Deserialize generic types in Spring Cloud Kafka Streams

The main purpose is to read a stream from a topic, apply some transformations, and then send two events to other topics. For that we are using the KStream.branch() function with functional-style programming. The code is:
Input POJO:
@Data
@NoArgsConstructor
@AllArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonIgnoreProperties(ignoreUnknown = true)
public class FooInput {
    @JsonProperty("field1")
    private String field1;
    @JsonProperty("field2")
    private String field2;
}
Output POJO:
@Getter
@Setter
@ToString
@EqualsAndHashCode
public class FooEvent<T> extends EventInfo {
    @JsonProperty(value = "entity")
    private T entity;

    @Builder
    private FooEvent(T entity, String eventId, OffsetDateTime eventTime, Action eventAction, String eventSourceSystem, String eventEntityName) {
        super(eventId, eventTime, eventAction, eventSourceSystem, eventEntityName);
        this.entity = entity;
    }

    public FooEvent() {
        super();
    }
}
@Setter
@Getter
@ToString
@AllArgsConstructor
@NoArgsConstructor
public abstract class EventInfo {
    @JsonProperty(value = "eventId")
    private String eventId;
    @JsonProperty(value = "eventTime")
    private OffsetDateTime eventTime;
    @JsonProperty(value = "eventAction")
    private Action eventAction;
    @JsonProperty(value = "eventSourceSystem")
    private String eventSourceSystem;
    @JsonProperty(value = "eventEntityName")
    private String eventEntityName;
}
@Data
@NoArgsConstructor
@AllArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonIgnoreProperties(ignoreUnknown = true)
public class Bar {
    @JsonProperty("field1")
    private String field1;
    @JsonProperty("field2")
    private String field2;
    @JsonProperty("field3")
    private String field3;
}
Processor function:
@Bean
public Function<KStream<String, FooInput>, KStream<String, FooEvent<Bar>>[]> process() {
    Predicate<String, FooEvent<Bar>> predicate1 =
            (key, value) -> value.getEntity().getField1().equalsIgnoreCase("test1");
    Predicate<String, FooEvent<Bar>> predicate2 =
            (key, value) -> value.getEntity().getField1().equalsIgnoreCase("test2");
    return input -> {
        input
            ...
            .branch(predicate1, predicate2);
    };
}
The bindings are declared in application.properties:
Input:
spring.cloud.stream.bindings.process-in-0.destination=topic0
spring.cloud.stream.bindings.process-in-0.content-type=application/json
Output:
spring.cloud.stream.bindings.process-out-0.destination=topic1
spring.cloud.stream.bindings.process-out-0.content-type=application/json
spring.cloud.stream.bindings.process-out-1.destination=topic2
spring.cloud.stream.bindings.process-out-1.content-type=application/json
The issue arises when the application evaluates the predicates. It appears to convert the payload to FooEvent<Bar>: the eventId, eventTime, eventAction, ... fields are converted just fine, but when it comes to the entity field (in this case Bar) the values are stored in a HashMap instead of a new Bar object with the proper fields set. This leads me to believe that Spring's default Serde (JsonSerde) is mishandling the generic type. Any suggestions on how to solve this generic-type Serde problem in Kafka Streams?
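The HashMap symptom is classic type erasure: at runtime a plain Class<FooEvent> carries no information about Bar, so Jackson falls back to a LinkedHashMap for the entity field. The usual workaround is a type token, i.e. an anonymous subclass whose generic supertype survives erasure; this is the trick behind Jackson's TypeReference. A minimal, framework-free sketch:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

public class TypeTokenDemo {

    // A type token: subclassing preserves the generic supertype,
    // so the full parameterized type is recoverable via reflection.
    static abstract class TypeToken<T> {
        Type type() {
            return ((ParameterizedType) getClass().getGenericSuperclass())
                    .getActualTypeArguments()[0];
        }
    }

    public static void main(String[] args) {
        // The anonymous subclass {} is what captures List<String>.
        Type captured = new TypeToken<List<String>>() { }.type();
        System.out.println(captured.getTypeName()); // java.util.List<java.lang.String>
    }
}
```

This is why constructing the Serde from a Jackson TypeReference (rather than letting the binder infer a raw class) lets the entity field deserialize into a real Bar; check your Spring Kafka version for a JsonSerde constructor or builder variant that accepts a TypeReference.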

Related

spring data mongodb mapping dynamic field same field

I have this model:
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
@Document(collection = "dyn_fields")
public class Field<T extends Object> extends AuditModel implements Serializable {
    private static final long serialVersionUID = -6630923680212053917L;

    @Id
    private String id;
    private ObjectId companyId;
    private T value;
    private String label;
    private LocalDate validationDate;

    public Field(T value) {
        this.value = value;
    }
}
The value field can be of any type: String, Integer, Double, etc.
Insertion into the MongoDB database works correctly, as shown in the image.
How can I do a mapping of my records without knowing the type of the value field?
I'm currently casting all fields, and I don't think that's the correct procedure.
You can use instanceof to evaluate the type of the property.
Example:
if (value instanceof String) {
    // String mapping logic
}
if (value instanceof Integer) {
    // Integer mapping logic
}
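On Java 16+ the same dispatch can be written more compactly with pattern matching for instanceof, which binds the cast in one step. A small self-contained sketch, with the mapping logic reduced to returning a label (the method and labels are illustrative, not part of the original code):

```java
public class FieldMappingDemo {

    // Dispatch on the runtime type of a dynamically typed value.
    // Pattern matching for instanceof (Java 16+) avoids the explicit cast.
    static String mapValue(Object value) {
        if (value instanceof String s) {
            return "string:" + s;
        }
        if (value instanceof Integer i) {
            return "integer:" + i;
        }
        if (value instanceof Double d) {
            return "double:" + d;
        }
        return "unknown:" + value;
    }

    public static void main(String[] args) {
        System.out.println(mapValue("abc")); // string:abc
        System.out.println(mapValue(42));    // integer:42
        System.out.println(mapValue(3.14));  // double:3.14
    }
}
```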

Spring Data JPA #OneToOne mapping is not projected

This question is already phrased as an issue here: https://github.com/spring-projects/spring-data-jpa/issues/2369 but for lack of a reaction there I am copying the contents of that issue here, hoping that somebody might find what's wrong with my code or confirm that this could be a bug:
I've set up an example project here that showcases what seems to be a bug in Spring Data projections: https://github.com/joheb-mohemian/gs-accessing-data-jpa/tree/primary-key-join-column-projection-bug/complete
I have a Customer entity that has a OneToOne mapping to an Address entity:
@Entity
public class Customer {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Long id;
    private String firstName;
    private String lastName;
    @OneToOne(mappedBy = "customer", cascade = CascadeType.ALL)
    @PrimaryKeyJoinColumn
    private Address address;
    //...
}
@Entity
public class Address {
    @Id
    @Column(name = "customer_id")
    private Long id;
    @OneToOne
    @MapsId
    @JoinColumn(name = "customer_id")
    private Customer customer;
    private String street;
    //...
}
Then there are simple projection interfaces:
public interface CustomerProjection {
    String getFirstName();
    String getLastName();
    AddressProjection getAddress();
}
public interface AddressProjection {
    String getStreet();
}
But when I try to fetch a projected entity from a repository method like this one:
public interface CustomerRepository extends CrudRepository<Customer, Long> {
    //...
    <T> T findById(long id, Class<T> type);
}
, getAddress() on the projection will be null, whereas getAddress() when fetching the entity type is populated correctly. Of these two unit tests, only testEntityWithOneToOne() will be successful:
@BeforeEach
void setUpData() {
    customer = new Customer("first", "last");
    Address address = new Address(customer, "street");
    customer.setAddress(address);
    entityManager.persist(address);
    entityManager.persist(customer);
}

@Test
void testEntityWithOneToOne() {
    Customer customerEntity = customers.findById(customer.getId().longValue());
    assertThat(customerEntity.getAddress()).isNotNull();
}

@Test
void testProjectionWithOneToOne() {
    CustomerProjection customerProjection = customers.findById(customer.getId(), CustomerProjection.class);
    assertThat(customerProjection.getAddress()).isNotNull();
}
What's the problem here?

How do I use JPA's #EmbeddedId with #Query and no #Table?

I have the following...
@Query(
    value = "SELECT MAX(LOG_DATE) AS LOG_DATE, REGION_NAME, HOST_NAME, MIN(REGION_MIN_TIME) AS REGION_MIN_TIME, MAX(REGION_MAX_TIME) AS REGION_MAX_TIME,SUM(TOTAL_TIME_TAKEN) AS TOTAL_TIME_TAKEN, SUM(REGION_API_COUNT) AS REGION_API_COUNT,AVG(TOTAL_TIME_TAKEN/REGION_API_COUNT) AS AVG_RES_TIME, MAX(LST_SRC_UPDT_DATE) AS LST_SRC_UPDT_DATE FROM MY_SCHEMA.GEMFIRE_REGION_USAGE GROUP BY REGION_NAME,HOST_NAME",
    nativeQuery = true
)
List<GemfireStatAggregate> findAggregates();
@Getter
@AllArgsConstructor
@NoArgsConstructor
@Entity
public class GemfireStatAggregate {
    @EmbeddedId
    private GemfireStatId id;
    @Column(name = "REGION_MIN_TIME")
    private String regionMinTime;
}
@Embeddable
@Getter
@AllArgsConstructor
@NoArgsConstructor
public class GemfireStatId implements Serializable {
    @Column(name = "LOG_DATE")
    private Date loggedDate;
    @Column(name = "REGION_NAME")
    private String regionName;
    @Column(name = "HOST_NAME")
    private String hostName;
}
But when I run it I get the following...
Failed to convert from type [java.lang.Object[]] to type [com.me.GemfireStatAggregate] for value '{...data redacted but looks good...}'; nested exception is org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.sql.Timestamp] to type [com.me.GemfireStatAggregate]
Why is this happening?
Update
This does work but is ugly and I don't like it...
public List<GemfireStatAggregate> getAggregateData() {
    List<GemfireStatAggregate> result = new ArrayList<>();
    for (Object[] arr : repo.findAggregates()) {
        GemfireStatId id = new GemfireStatId(
                (Timestamp) arr[0],
                (String) arr[1],
                (String) arr[2]
        );
        result.add(new GemfireStatAggregate(
                id,
                (String) arr[3]
        ));
    }
    return result;
}
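Until the entity mapping works, the row-to-object conversion above can at least be expressed as a stream pipeline. The record types here are simplified stand-ins for the real JPA classes, just to keep the sketch self-contained:

```java
import java.sql.Timestamp;
import java.util.List;

public class RowMappingDemo {

    // Simplified stand-ins for the JPA classes (illustrative only).
    record GemfireStatId(Timestamp loggedDate, String regionName, String hostName) { }
    record GemfireStatAggregate(GemfireStatId id, String regionMinTime) { }

    // Map each Object[] row (columns in the native query's order) to a DTO.
    static List<GemfireStatAggregate> map(List<Object[]> rows) {
        return rows.stream()
                .map(arr -> new GemfireStatAggregate(
                        new GemfireStatId((Timestamp) arr[0], (String) arr[1], (String) arr[2]),
                        (String) arr[3]))
                .toList();
    }

    public static void main(String[] args) {
        Object[] row = { new Timestamp(0L), "region-a", "host-1", "12" };
        // The explicit type witness stops varargs from flattening the array.
        System.out.println(map(List.<Object[]>of(row)).get(0).regionMinTime()); // 12
    }
}
```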
Add the @Temporal annotation on your loggedDate field:
@Temporal(TemporalType.DATE)
@Column(name = "LOG_DATE")
private Date loggedDate;

microservice seems to work, but there's no data displayed

I have a simple app that's supposed to connect to postgres and display content of one of the tables.
I installed postgres, created a table and inserted a row, but nothing is shown when I run it.
These are my application.properties:
spring.datasource.url=jdbc:postgresql://localhost:5432/postgres
spring.datasource.username=name
spring.datasource.password=pass
spring.jpa.properties.hibernate.dialect = org.hibernate.dialect.PostgreSQLDialect
spring.jpa.hibernate.ddl-auto = update
spring.jpa.properties.hibernate.jdbc.lob.non_contextual_creation=true
and this is repository interface
@Repository
public interface TaxonRepository extends CrudRepository<Taxon, Long> {
}
and the model
@Entity
@Table(name = "dim_taxon")
public class Taxon {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    @Getter @Setter
    private Long id;
    @Getter @Setter
    private String name;
    @Getter @Setter
    private String value;
    @Getter @Setter
    private String reserve;
    @Getter @Setter
    private String source;
}
My service
@Service
public class TaxonService implements TaxonServiceI {
    @Autowired
    private TaxonRepository repository;

    @Override
    public List<Taxon> findAll() {
        return (List<Taxon>) repository.findAll();
    }
}
and controller
@Controller
public class TaxonController {
    @Autowired
    private TaxonServiceI taxonService;

    @RequestMapping(value = "/showTaxons")
    public String homePage(Model model) {
        List<Taxon> taxons = taxonService.findAll();
        model.addAttribute("taxons", taxons);
        return "index";
    }
}
I tried to add an object manually to check whether there was a problem with the HTML or something:
List<Taxon> taxons = new ArrayList<>();
Taxon taxon1 = new Taxon();
taxon1.setName("a");
taxon1.setReserve("a");
taxon1.setSource("a");
taxon1.setValue("a");
taxons.add(taxon1);
model.addAttribute("taxons", taxons);
but the HTML is fine. It seems like this
List<Taxon> taxons = taxonService.findAll();
doesn't work. What's the problem here? There aren't actually any errors.
My table and the data.
You are not adding your loaded List<Taxon> to the model.
@RequestMapping(value = "/showTaxons")
public String homePage(Model model) {
    List<Taxon> taxons = taxonService.findAll();
    return "index";
}
Just returns the page to render, without modifying the model.
So this should work
@RequestMapping(value = "/showTaxons")
public String homePage(Model model) {
    model.addAttribute("taxons", taxonService.findAll());
    return "index";
}
In the end I added a few more annotations
@Data
@NoArgsConstructor
@AllArgsConstructor
@Validated
@Entity
@Table(name = "table name")
And explicit mapping for the columns:
@Column(name = "column_name")
this helped

Picketlink with custom model and long Id

I have an existing model and want to use it with PicketLink, but I am using Long as the @Id field, whereas PicketLink expects it to be a String field. I have found some hints about using another entity which maps to the corresponding entity of my model, but I don't actually know how to do it.
I have a base class, which all entities derive from:
@MappedSuperclass
public abstract class AbstractEntity implements Serializable, Cloneable {
    @Id
    @Identifier
    @Column(name = "SID")
    private Long sid;
    @Column(name = "INSERT_TIME")
    private Date insertTime;
    @Column(name = "UPDATE_TIME")
    private Date updateTime;
    // getters and setters
}
And a derived realm entity:
@Entity
@IdentityManaged(Realm.class)
public class RealmEntity extends AbstractEntity {
    @AttributeValue
    private String name;
    @PartitionClass
    private String typeName;
    @ConfigurationName
    private String configurationName;
    @AttributeValue
    private boolean enforceSSL;
    @AttributeValue
    private int numberFailedLoginAttempts;
    // getters and setters
}
And the mapping class for Picketlink looks as follows:
@IdentityPartition(supportedTypes = {
        Application.class,
        User.class,
        Role.class
})
public class Realm extends AbstractPartition {
    @AttributeProperty
    private boolean enforceSSL;
    @AttributeProperty
    private int numberFailedLoginAttempts;

    private Realm() {
        this(null);
    }

    public Realm(String name) {
        super(name);
    }
}
The PartitionManager is defined as follows:
builder
    .named("default.config")
    .stores()
    .jpa()
    .supportType(User.class, Role.class, Application.class, Realm.class)
    .supportGlobalRelationship(Grant.class, ApplicationAccess.class)
    .mappedEntity(App.class, AppUserRole.class, AppRole.class, AppUser.class, UserEntity.class, RelationshipIdentityTypeEntity.class, RealmEntity.class)
    .addContextInitializer((context, store) -> {
        if (store instanceof JPAIdentityStore) {
            if (!context.isParameterSet(JPAIdentityStore.INVOCATION_CTX_ENTITY_MANAGER)) {
                context.setParameter(JPAIdentityStore.INVOCATION_CTX_ENTITY_MANAGER, entityManager);
            }
        }
    });
When I try to create a new Realm, Hibernate throws an error while trying to load it, because the @Id is defined as Long but the @Identifier of the PicketLink model is a String.
this.shsRealm = new Realm(REALM_SHS_NAME);
this.shsRealm.setEnforceSSL(true);
this.shsRealm.setNumberFailedLoginAttempts(3);
this.partitionManager.add(this.shsRealm);
java.lang.IllegalArgumentException: Provided id of the wrong type for class de.logsolut.common.picketlink.model.RealmEntity. Expected: class java.lang.Long, got class java.lang.String
How can I map the JPA model correctly to Picketlink?