Pointing consumer to proper input when declaring a custom router? - spring-cloud

This is my first time using the custom router function, which would act as a filter as mentioned in the docs here, and I was wondering: where do I redirect the input to?
In the past I had only the consumer which is declared as follows:
@Bean
public Consumer<Message<MyClass<T>>> myFunction() {
    return ...;
}
Together with the properties set as:
spring:
  cloud:
    stream:
      bindings:
        myFunction-in-0: "input"
        input:
          destination: "my-kinesis"
          binder: kinesis
          group: "my-group"
But now, having a custom router declared, I'm confused about whether myFunction-in-0 is still the proper input.
The custom router function returns the following FunctionRoutingResult:
return new MessageRoutingCallback.FunctionRoutingResult(
        myCondition ? "myFunction" : "devNull"
);
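From what I understand, enabling routing puts the RoutingFunction in front of the routed functions (with its own functionRouter-in-0 binding), which is exactly why I'm unsure about myFunction-in-0. For completeness, the router itself is declared as a MessageRoutingCallback bean, roughly like the sketch below (the condition is reduced to a placeholder header check):
import org.springframework.cloud.function.context.MessageRoutingCallback;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

@Configuration
public class RouterConfig {

    // The RoutingFunction asks this callback which function definition
    // should receive the current message.
    @Bean
    public MessageRoutingCallback customRouter() {
        return new MessageRoutingCallback() {
            @Override
            public MessageRoutingCallback.FunctionRoutingResult routingResult(Message<?> message) {
                // Placeholder condition; the real check inspects the payload/headers.
                boolean myCondition = message.getHeaders().containsKey("eventType");
                return new MessageRoutingCallback.FunctionRoutingResult(
                        myCondition ? "myFunction" : "devNull");
            }
        };
    }
}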
Also, I added the devNull sink for cases where the condition is not met.
@Bean
public Consumer<Message<MyClass<?>>> devNull() {
    return ...;
}
EDIT: the error thrown is as follows:
Failed to bind properties under 'spring.cloud.stream.bindings.myfunction-in-0' to org.springframework.cloud.stream.config.BindingProperties:
Reason: org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.lang.String] to type [org.springframework.cloud.stream.config.BindingProperties]
Action:
Update your application's configuration
spring-cloud-stream version: 3.2.4
spring-boot-starter-parent version: 2.6.6
spring-cloud-dependencies version: 2021.0.3
Could it be the generics in the received message? Can it be automatically parsed into the POJO?

Related

Routing events type (Avro-SpecificRecordBase) to right Consumer from one topic in reactive programming

I use
spring-cloud-stream:3.2.2
spring-cloud-stream-binder-kafka:3.2.5
spring-cloud-stream-binder-kafka-streams:3.2.5
I want to write a reactive Kafka consumer. I work with an Avro schema registry.
In my case I have multiple event types in one topic. My consumer consumes all of them, but I want one consumer per event type.
In your documentation I found some information about routing. In reactive mode I can only use routing-expression in application.yml, but it's not working for me.
Can you help me? I tried several things, but I can't find out why it's not working.
My two consumers consume all event types, not specific ones.
My two consumers:
@Bean
public Consumer<FirstRankPaymentAgreed> testAvroConsumer() {
    return firstRankPaymentAgreed -> {
        log.error("test reception event {} ", firstRankPaymentAgreed.getState().getCustomerOrderId());
    };
}

@Bean
public Consumer<CustomerOrderValidated> devNull() {
    return o -> {
        log.error("devNull ");
    };
}
My application.yml (I tried a lot of simple tests):
spring:
  cloud:
    stream:
      function:
        routing:
          enabled: true
        definition: testAvroConsumer;devNull
        # routing-expression: "'true'.equals('true') ? devNull : testAvroConsumer;" #"payload['type'] == 'CustomerOrderValidated' ? devNull : testAvroConsumer;"
      bindings:
        testAvroConsumer-in-0:
          destination: tempo-composer-event
        devNull-in-0:
          destination: tempo-composer-event
      kafka:
        binder:
          brokers: localhost:9092
          auto-create-topics: false
          consumer-properties:
            value:
              subject:
                name:
                  strategy: io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
            key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
            value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
            schema.registry.url: http://localhost:8081
            specific.avro.reader: true
    function:
      # routing-expression: "'true'.equals('true') ? devNull : testAvroConsumer;"
      # routing-expression: "payload['type'] == 'CustomerOrderValidated' ? devNull : testAvroConsumer;"
      definition: testAvroConsumer;devNull
Routing and reactive don't really mix well.
Unlike imperative functions, which play the role of a message handler (invoked each time there is a message), reactive functions are initialization functions that connect the user's Flux/Mono with the system. They are only invoked once, during application startup. After that the stream is processed by the reactive API, and spring-cloud-stream as a framework plays no additional role (as if it didn't exist in the first place).
So the RoutingFunction mixed with reactive acts as if it were reactive: the expression is evaluated only once, during startup, and from that point on the selected function receives the entire stream.
Consider changing your functions to imperative.
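A minimal sketch of the imperative shape this suggests (assuming the Avro classes from the question, records already deserialized to their specific types, and a MessageRoutingCallback instead of the SpEL routing-expression; names are illustrative):
import java.util.function.Consumer;

import org.springframework.cloud.function.context.MessageRoutingCallback;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

@Configuration
public class ImperativeRoutingConfig {

    // Imperative consumers are invoked once per record, so the framework can
    // make a routing decision for every message.
    @Bean
    public Consumer<FirstRankPaymentAgreed> testAvroConsumer() {
        return event -> System.out.println("payment agreed " + event.getState().getCustomerOrderId());
    }

    @Bean
    public Consumer<CustomerOrderValidated> devNull() {
        return event -> System.out.println("ignored " + event);
    }

    // Route each record to a function based on its concrete Avro type.
    @Bean
    public MessageRoutingCallback routeByEventType() {
        return new MessageRoutingCallback() {
            @Override
            public MessageRoutingCallback.FunctionRoutingResult routingResult(Message<?> message) {
                String target = message.getPayload() instanceof CustomerOrderValidated
                        ? "devNull"
                        : "testAvroConsumer";
                return new MessageRoutingCallback.FunctionRoutingResult(target);
            }
        };
    }
}
With routing enabled, the input binding would then typically be functionRouter-in-0 pointing at tempo-composer-event, rather than one binding per consumer.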
I found an ugly solution but it works. I have an interface EventHandler<?>. All handler classes implement this interface, and each handler fixes the generic type to the right Avro type. I then ask Spring to find the right handler:
var beanNames = context.getBeanNamesForType(ResolvableType.forClassWithGenerics(
        EventHandler.class, message.getPayload().getClass()));
if (beanNames.length > 0) {
    var bean = (EventHandler) context.getBean(beanNames[0]);
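Fleshed out a bit, the dispatch looks roughly like this (a sketch; EventHandler and the handler lookup are as described above, the rest is illustrative):
import java.util.function.Consumer;

import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.ResolvableType;
import org.springframework.messaging.Message;

@Configuration
public class DispatchingConsumerConfig {

    // Each handler declares the concrete Avro type it can process.
    public interface EventHandler<T> {
        void handle(T event);
    }

    // A single consumer receives every record; the matching handler is
    // resolved from the payload's runtime class.
    @Bean
    @SuppressWarnings({"rawtypes", "unchecked"})
    public Consumer<Message<?>> dispatchingConsumer(ApplicationContext context) {
        return message -> {
            var beanNames = context.getBeanNamesForType(ResolvableType.forClassWithGenerics(
                    EventHandler.class, message.getPayload().getClass()));
            if (beanNames.length > 0) {
                var handler = (EventHandler) context.getBean(beanNames[0]);
                handler.handle(message.getPayload());
            }
        };
    }
}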

spring cloud stream custom value deserializer does not work

I have this simple Spring Cloud Stream function:
@Configuration
public class ItemProcessor {

    @Bean
    public Serde<Wish> WishSerde() {
        Serde<Wish> wishSerde = DebeziumSerdes.payloadJson(Wish.class);
        wishSerde.configure(Collections.singletonMap("from.field", "after"), false);
        return wishSerde;
    }

    @Bean
    public Serde<Long> KeySerde() {
        final Serde<Long> keySerde = DebeziumSerdes.payloadJson(Long.class);
        keySerde.configure(Collections.emptyMap(), true);
        return keySerde;
    }

    @Bean
    public Function<KStream<Long, Wish>, KStream<Long, Wish>> processItems() {
        return (models) -> models
                .peek((k, v) -> System.out.println(k + ": " + v));
    }
}
And here is the Wish model
@Data
@NoArgsConstructor
@AllArgsConstructor
@ToString
public class Wish {
    public long wish_id;
    public long user_id_fk;
    public long item_id_fk;
    public long wish_status;
}
And the application.yml
spring.cloud:
  function.definition: processItems
  stream:
    bindings:
      processItems-in-0:
        destination: source.wish
      processItems-out-0:
        destination: processed.wish
    kafka:
      streams:
        binder:
          brokers: 127.0.0.1:9092
I am using a configured bean for the Serde as described in the documentation, and the Debezium JsonSerde as described here, to deserialize objects created by Debezium.
The incoming message is like this:
{"wish_id":759}|{"before":null,"after":{"wish_id":759,"user_id_fk":2,"item_id_fk":823,"wish_status":1},"source":{"version":"1.6.0.Final","connector":"mysql","name":"JDP","ts_ms":1635151905000,"snapshot":"false","db":"jdb","sequence":null,"table":"wish","server_id":1,"gtid":null,"file":"mysql-bin.000008","pos":2694699,"row":0,"thread":null,"query":null},"op":"c","ts_ms":1635151886089,"transaction":null}
where the key and value are separated by |.
I need the content of the 'after' field to use as data for the Wish model, and this Serde with this config is supposed to do that. But I get the following error at runtime:
Exception in thread "processItems-applicationId-062d6a97-b543-47ea-b938-b1b520a6faa8-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Deserialization exception handler is set to fail upon a deserialization error. If you would rather have the streaming pipeline continue after a deserialization error, please set the default.deserialization.exception.handler appropriately.
at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:82)
at org.apache.kafka.streams.processor.internals.RecordQueue.updateHead(RecordQueue.java:176)
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:112)
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:185)
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:895)
at org.apache.kafka.streams.processor.internals.TaskManager.addRecordsToTasks(TaskManager.java:1008)
at org.apache.kafka.streams.processor.internals.StreamThread.pollPhase(StreamThread.java:812)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:625)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:564)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:523)
Caused by: java.lang.RuntimeException: com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "before" (class ir.jdro.kafkaStream.rdbAggregator.model.Wish), not marked as ignorable (4 known properties: "wish_status", "wish_id", "user_id_fk", "item_id_fk"])
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: ir.jdro.kafkaStream.rdbAggregator.model.Wish["before"])
at io.debezium.serde.json.JsonSerde$JsonDeserializer.deserialize(JsonSerde.java:95)
at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:60)
at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:58)
at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:66)
Once I set "unknown.properties.ignored" to true in the Serde configuration, so the application no longer raises the exception, I get the following output:
759|{"wish_id":0,"user_id_fk":0,"item_id_fk":0,"wish_status":0}
This shows that the key deserialization works fine but the value deserialization doesn't.
I can't find where I am wrong!
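For reference, this is how I understand the value Serde is supposed to behave when exercised on its own, outside the topology (a sketch; the JSON literal is a shortened version of the value shown above):
import java.nio.charset.StandardCharsets;
import java.util.Collections;

import io.debezium.serde.DebeziumSerdes;
import org.apache.kafka.common.serialization.Serde;

public class WishSerdeCheck {

    public static void main(String[] args) {
        Serde<Wish> wishSerde = DebeziumSerdes.payloadJson(Wish.class);
        // isKey = false; "from.field" should make the deserializer unwrap the "after" field.
        wishSerde.configure(Collections.singletonMap("from.field", "after"), false);

        String value = "{\"before\":null,\"after\":{\"wish_id\":759,\"user_id_fk\":2,\"item_id_fk\":823,\"wish_status\":1}}";
        Wish wish = wishSerde.deserializer().deserialize("source.wish",
                value.getBytes(StandardCharsets.UTF_8));

        // If the unwrapping works, this should print a Wish with the non-zero values above.
        System.out.println(wish);
    }
}
If a standalone check like this populates the fields, the Serde configuration itself would seem fine and the problem would be in how the binder wires the Serde beans into the topology.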

Spring Boot Reactive MongoDB API with GraphQL - "Java class is not a List or generic type information was lost"

I was browsing through a lot of articles and blogs to find the proper way to get the following running. I have already achieved the following:
Spring Boot Application - works
Imperative Spring Data MongoDB Connection to my Mongo Atlas Cluster - works
Spring GraphQL Starter implementation (Resolvers, etc.) - works
Now I want to implement my last requirement. To have GraphQL subscriptions working I need to integrate the Spring Data MongoDB Reactive dependency and create a new GraphQL Resolver for Subscriptions. Here is the code that I added to the already working app (hopefully the code fragments give enough info to help me out).
Gradle.kt
implementation("org.springframework.boot:spring-boot-starter-data-mongodb-reactive")
MyApp.kt
#SpringBootApplication
#EnableReactiveMongoRepositories(basePackages = ["com.myapp"])
#EnableMongoRepositories(basePackages = ["com.myapp"])
class MyApp
fun main(args: Array<String>) {
runApplication<MyApp>(*args)
}
SubscriptionResolver.kt
#Component
class SubscriptionResolver(
private val characterReactiveRepository: CharacterReactiveRepository
) : GraphQLSubscriptionResolver {
fun allCharacters(): Flux<Character> {
return characterReactiveRepository.findAll()
}
}
CharacterReactiveRepository.kt
interface CharacterReactiveRepository : ReactiveMongoRepository<Character, String>
character.graphqls
type Subscription {
    allCharacters: [Character]!
}
Error
SchemaClassScannerError: Unable to match type definition (NonNullType{type=ListType{type=TypeName{name='Character'}}}) with java type (reactor.core.publisher.Flux<com.backend.domain.Character>): Java class is not a List or generic type information was lost: reactor.core.publisher.Flux<com.backend.domain.Character>
Detailed Exception
https://pastebin.com/sEWmDaTE
Edit 1
@Component
class SubscriptionResolver(
    private val characterReactiveRepository: CharacterReactiveRepository
) : GraphQLSubscriptionResolver {

    fun allCharacters(): Publisher<Character> {
        return characterReactiveRepository.findAll()
    }
}
According to the sample, you should return:
@Component
class SubscriptionResolver(
    private val characterReactiveRepository: CharacterReactiveRepository
) : GraphQLSubscriptionResolver {

    fun allCharacters(): Publisher<List<Character>> {
        // collectList() turns the Flux<Character> into a Mono<List<Character>>,
        // which is a Publisher<List<Character>>.
        return characterReactiveRepository.findAll().collectList()
    }
}
Example application you can find here https://github.com/graphql-java-kickstart/samples/tree/master/spring-boot-webflux
My problem was that I defined the subscription method's return value as [Character]!, meaning an array. But since Flux<Character> or Publisher<Character> is not an array but a single type in that context, the resolving failed all the time.
Changing the schema to the following helped:
type Subscription {
    allCharacters: Character
}

Using Kafka with Micronaut

Are there any example projects showing how to use Kafka with Micronaut? I am having problems with getting it to work.
I have the following producer:
@KafkaClient
interface AppClient {

    @Topic("topic-name")
    void sendMessage(@KafkaKey String id, Event event)
}
and listener:
@KafkaListener(
    groupId = "group-id",
    offsetReset = OffsetReset.EARLIEST
)
class AppListener {

    @Topic("topic-name")
    void onMessage(Event event) {
        // do stuff
    }
}
My application.yml contains:
kafka:
  bootstrap:
    servers: localhost:2181
and application-test.yml (is this right, and should it be in the same directory as application.yml? I am also unsure how the embedded server should be used):
kafka:
#  embedded:
#    enabled: true
#    topics: promo-api-promotions
  bootstrap:
    servers: localhost:9092
My test looks like:
@MicronautTest
class AppSpec extends Specification {

    @Shared
    @AutoCleanup
    EmbeddedServer server = ApplicationContext.run(EmbeddedServer)

    @Shared
    private AppClient appClient = server.applicationContext.getBean(AppClient)

    def 'The upload endpoint is called'() {
        // test here
        appClient.sendMessage(id, event)
        // other test stuff
    }
}
The main problems I am having are:
My consumer is not consuming from my topic. I can see the producer creates the topic in Kafka and the client group is created, but the offset stays at 0.
I am having problems when the test starts up, where it looks as if two instances of the client are created and therefore the MBean registration fails (also, if I try to use the embedded Kafka, I get a different message about port 9092 already being in use, because it tries to start the server twice):
javax.management.InstanceAlreadyExistsException:
kafka.consumer:type=app-info,id=app-kafka-client-app-listener
at com.sun.jmx.mbeanserver.Repository.addMBean(Repository.java:437)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerWithRepository(DefaultMBeanServerInterceptor.java:1898)
Managed to fix the second problem - the object passed into the listener did not have a @JsonCreator. I found this out by trying to use the Jackson object mapper to construct the object from its JSON while playing around.
If anyone else has the same problem - make sure that the object model works with Jackson before going any further!
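For illustration, this is the kind of round-trip check I mean (Event and its fields are made up here):
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class Event {

    private final String id;
    private final String type;

    // Without a usable creator (or a no-arg constructor plus setters),
    // Jackson cannot rebuild the object on the listener side.
    @JsonCreator
    public Event(@JsonProperty("id") String id, @JsonProperty("type") String type) {
        this.id = id;
        this.type = type;
    }

    public String getId() { return id; }

    public String getType() { return type; }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        String json = mapper.writeValueAsString(new Event("42", "promotion"));
        Event roundTripped = mapper.readValue(json, Event.class);
        System.out.println(roundTripped.getId() + " / " + roundTripped.getType());
    }
}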
You should add the embedded configuration kafka.embedded.enabled to a configuration map and pass it to the ApplicationContext.run method:
Map<String, Object> config = Collections.unmodifiableMap(new HashMap<String, Object>() {
    {
        put(AbstractKafkaConfiguration.EMBEDDED, true);
        put(AbstractKafkaConfiguration.EMBEDDED_TOPICS, "test_topic");
    }
});

try (ApplicationContext ctx = ApplicationContext.run(config)) {
    // ... run the test against the embedded broker
}
The consumer consumes from Kafka in another thread and you have to wait for a while until your AppListener catches up.
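One way to handle that in a test is to capture what the listener receives and poll for it instead of asserting immediately after sendMessage. A plain-Java sketch of that idea (names are illustrative):
import java.time.Duration;
import java.time.Instant;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

class ReceivedMessages {

    // Illustrative sink the AppListener could append to from the consumer thread.
    static final Queue<Object> RECEIVED = new ConcurrentLinkedQueue<>();

    // Poll until at least one record has arrived or the timeout elapses.
    static boolean awaitFirstMessage(Duration timeout) throws InterruptedException {
        Instant deadline = Instant.now().plus(timeout);
        while (Instant.now().isBefore(deadline)) {
            if (!RECEIVED.isEmpty()) {
                return true;
            }
            Thread.sleep(100);
        }
        return false;
    }
}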
You can see a short example in KafkaProducerListenerTest
Remember the Kafka dependencies described in the Micronaut doc: Embedding Kafka

Parse HL7 v2.3 REF message with local customizations in HAPI

I am trying to parse an HL7 REF I12 message with local customizations (NZ).
When I try using the GenericParser, I keep getting validation exceptions.
For example, for the segment below I keep getting this output:
ca.uhn.hl7v2.validation.ValidationException: Validation failed: Primitive value '(08)569-7555' requires to be empty or a US phone number
PRD|PP|See T Tan^""^""^^""|""^^^^""^New Zealand||(08)569-7555||14134^NZMC
My questions are:
Is there a way to avoid the validation by using the conformance class generator?
Is it possible to create my own validation classes using CustomModelClasses?
In either case, is there any example code or tutorial documentation for that?
If disabling validation altogether is an option for your application, then you can set the validation context to use NoValidation.
See this thread in the hapi developers mailing list: http://sourceforge.net/p/hl7api/mailman/message/31244500/
Here is an example of how to disable validation:
HapiContext context = new DefaultHapiContext();
context.setValidationContext(new NoValidation());
GenericParser parser = context.getGenericParser();

String message = ...
try {
    parser.parse(message);
} catch (Exception e) {
    e.printStackTrace();
}
If you still require validation but just want to change the validator for specific rules, then you'll have to create your own implementation of ValidationContext. This would be done by subclassing ca.uhn.hl7v2.validation.builder.support.NoValidationBuilder, overriding the configure method, and using this to instantiate an instance of ValidationContextImpl.
For an example of how to implement the configure method in your subclass of NoValidationBuilder, see the source code for ca.uhn.hl7v2.validation.builder.support.DefaultValidationBuilder. This is the default validation context that is generating the error message you're seeing. To make it easier for you, I'm including the class listing here:
public class DefaultValidationBuilder extends DefaultValidationWithoutTNBuilder {

    @Override
    protected void configure() {
        super.configure();

        forAllVersions()
            .primitive("TN")
            .refersToSection("Version 2.4 Section 2.9.45")
            .is(emptyOr(usPhoneNumber()));
    }
}
Notice this is the implementation of the usPhoneNumber method defined in BuilderSupport:
public Predicate usPhoneNumber() {
    return matches("(\\d{1,2} )?(\\(\\d{3}\\))?\\d{3}-\\d{4}(X\\d{1,5})?(B\\d{1,5})?(C.*)?",
            "a US phone number");
}