Adding Custom Headers in Kafka Message - apache-kafka

I am sending a file as a message by converting it to a byte array with a Kafka producer.
I also need to attach some headers to the message, for example the file name, timestamps, etc., so that at the consumer end I can process the message based on the file name and other headers.
What I am currently doing is creating an object that wraps the raw message and the headers, and sending that object as a byte-array message.
I would like to know if there is a way to add custom headers while publishing the message.

Kafka v0.11.0.0 adds support for custom headers.
You can add them when creating a ProducerRecord like this:
new ProducerRecord(topic, partition, key, value, headers), where headers is of type Iterable<Header>
For more details see:
https://issues.apache.org/jira/browse/KAFKA-4208
https://cwiki.apache.org/confluence/display/KAFKA/KIP-82+-+Add+Record+Headers

Record-level headers were introduced in Kafka 0.11.0. We can send a list of Headers with each record.
// requires org.apache.kafka.common.header.Header, org.apache.kafka.common.header.internals.RecordHeader, and java.util.Arrays
List<Header> headers = Arrays.asList(new RecordHeader("header_key", "header_value".getBytes()));
ProducerRecord<String, String> record = new ProducerRecord<>("topic", null, "key", "value", headers);
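Since the question also asks about processing by header at the consumer end, here is a minimal sketch of reading the headers back from each ConsumerRecord (the broker address, topic, and group id are placeholders):
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.header.Header;
import org.apache.kafka.common.serialization.StringDeserializer;

public class HeaderReadingConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "header-demo");             // placeholder group id
        props.setProperty(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("topic")); // same topic as above
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                for (Header header : record.headers()) {
                    // headers carry raw bytes; decode with the charset you wrote them with
                    System.out.println(header.key() + " = "
                            + new String(header.value(), StandardCharsets.UTF_8));
                }
            }
        }
    }
}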

You can create your own small Java application to send messages with headers to Kafka.
Write the following code in IntelliJ or any other IDE:
import java.util.Properties;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.internals.RecordHeader;
// assuming Spring Kafka's JSON serializer here; adjust the import if you use a different JsonSerializer
import org.springframework.kafka.support.serializer.JsonSerializer;

public static void main(String[] args) throws JsonProcessingException {
    Properties props = new Properties();
    props.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    props.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, JsonSerializer.class.getName());
    props.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class.getName());
    KafkaProducer<String, JsonNode> producer = new KafkaProducer<>(props);

    String json = "{ \"f1\" : \"v1\" } ";
    ObjectMapper mapper = new ObjectMapper();
    JsonNode jsonNode = mapper.readTree(json);

    // note: topic names may not contain spaces, so "test-topic" rather than "test topic"
    ProducerRecord<String, JsonNode> record = new ProducerRecord<>("test-topic", jsonNode);
    record.headers().add(new RecordHeader("key", "value1".getBytes()));
    producer.send(record);
    producer.flush(); // block until the async send completes, instead of Thread.sleep
    producer.close();
}
String json = "{ \"f1\" : \"v1\" } "- this is the key and value we want to send to kafka using objectMapper and converting it into jsonNode form.
record.headers().add(new RecordHeader("key","value1".getBytes()))-This is the key and value of headers data that we are sending to kafka.
To verify your data you can check the topic in the kafka control center and verify the headers sent.
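Alternatively, if you have the Kafka command-line tools available, you can dump the headers from a terminal; the print.headers option assumes Kafka 3.0+ console tools (KIP-431):
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic --from-beginning --property print.headers=true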

Kafka is agnostic to the message content and doesn't provide any special means to enrich it, so this is something you need to do yourself. A common way of dealing with this is to use a structured format such as JSON, Avro, or similar, where you are free to define the necessary fields and can easily add metadata to your message before shipping it off to the Kafka brokers.
This answer is outdated as of Kafka 0.11; please see the other answers.

Another Solution
ProducerRecord<String, String> producerRecord = new ProducerRecord<>("bizops", "value");
producerRecord.headers().add("client-id", "2334".getBytes(StandardCharsets.UTF_8));
producerRecord.headers().add("data-file", "incoming-data.txt".getBytes(StandardCharsets.UTF_8));
// Details left out for clarity
producer.send(producerRecord);
https://www.confluent.io/blog/5-things-every-kafka-developer-should-know/#adding-headers

I've been through similar problems on projects I've worked on, so I created this simple library to help tackle that: https://github.com/leandronunes85/messaging. For now it contains an Avro-based implementation, but it can be extended to use any other serialization framework of your choice.
You just have to create a (de)serializer for the objects you want to put on the stream (Avro-based or not) and let AvroMessageSerializer work its magic.
This is still a very young library, but I feel it can save many people a lot of time!

Related

Send message to dynamic kafka topic in helidon

In Quarkus/SmallRye, we can send messages to a dynamic topic. Please check the link below for an example.
https://beyondvelocity.blog/2022/01/05/dynamic-kafka-topics-in-quarkus/
Kindly suggest how we can implement the same in Helidon.
I could not find an equivalent API or classes in Helidon for sending messages to a dynamic topic.
There's nothing preventing you from using the KafkaProducer and ProducerRecord classes directly and calling the send method with any topic String parameter; see the sketch below.
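A minimal sketch of that direct approach (resolveTopic() is a hypothetical helper standing in for however you obtain the topic name at runtime):
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

Properties props = new Properties();
props.setProperty(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaServer); // e.g. "localhost:9092"
props.setProperty(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.setProperty(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
    String dynamicTopic = resolveTopic(); // hypothetical: any String computed at runtime works
    producer.send(new ProducerRecord<>(dynamicTopic, "payload"));
}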
Otherwise, just create the Channel with the topic name when you need a dynamic value:
KafkaConnector kafkaConnector = KafkaConnector.create();
messaging = Messaging.builder()
        .publisher(
                Channel.<String>builder()
                        .subscriberConfig(KafkaConnector.configBuilder()
                                .bootstrapServers(kafkaServer)
                                .topic("some random string")
                                .keySerializer(StringSerializer.class)
                                .valueSerializer(StringSerializer.class)
                                .build())
                        .build(),
                Multi.just("test1", "test2").map(Message::of) // example messages
        )
        .connector(kafkaConnector)
        .build()
        .start();
https://helidon.io/docs/v2/#/se/reactivemessaging/04_kafka

Error handling in Spring Cloud Kafka Streams

I'm using Spring Cloud Stream with Kafka Streams. Let's say I have a processor, which is a Function that converts a KStream of Strings to a KStream of CityProgrammes. It invokes an API to find the City by name, and another transformation finds any events near that city.
Now the problem is that if any error happens during the transformation, the whole application stops. I want to send that one particular message to a DLQ and move along. I've been reading for days, and everyone suggests handling errors within the called services, but that is nonsense in my opinion; plus, I still need to return a KStream: how do I do that within a catch?
I also looked at the UncaughtExceptionHandler, but it is not aware of the message and is only able to restart the processing, which won't skip the invalid message.
This might sound like an X-Y problem, so here is the question rephrased: how do I maintain the flow in a KStream when an exception occurs, and send the invalid item to the DLQ?
When it comes to the application-level errors you have, it is up to the application itself how the error is handled. Kafka Streams and the Spring Cloud Stream binder mainly support deserialization and serialization errors at the framework level. That said, I think your scenario can be handled. If you are using a Kafka client prior to 2.8, here is an SO answer I gave before on something similar: https://stackoverflow.com/a/66749750/2070861
If you are using Kafka/Streams 2.8, here is an idea that you can use. However, the code below should only be used as a starting point; adjust it to your use case. Read up on how branching works in Kafka Streams 2.8, as the branching API was significantly refactored in 2.8 compared to prior versions.
public Function<KStream<?, String>, KStream<?, Foo>> convert() {
    Foo[] foo = new Foo[1]; // single-element holder so the branching lambda can write to it
    return input -> {
        final Map<String, ? extends KStream<?, String>> branches =
                input.split(Named.as("foo-")).branch((key, value) -> {
                    try {
                        foo[0] = new Foo(); // your API call for the CityProgramme conversion here, possibly
                        return true;
                    }
                    catch (Exception e) {
                        Message<?> message = MessageBuilder.withPayload(value).build();
                        streamBridge.send("to-my-dlt", message); // streamBridge is an injected StreamBridge
                        return false;
                    }
                }, Branched.as("bar"))
                .defaultBranch();
        final KStream<?, String> kStream = branches.get("foo-bar");
        return kStream.map((key, value) -> new KeyValue<>("", foo[0]));
    };
}
The default branch is ignored in this code because it only contains the records that threw exceptions; those were handled by the catch block above, in which we send the records to a DLT programmatically. Finally, we take the good records, map them to a new KStream, and send it through the outbound.

Create custom DefaultKafkaHeaderMapper

When I send a record to a Kafka topic, the consumer receives "nativeHeaders" with some unnecessary header (which HeaderMethodArgumentResolver cannot even cast to a Map).
I'm looking for a way to override the HeaderMethodArgumentResolver method getNativeHeaders so as to exclude this garbage header, but I don't know how to provide such a subclass to Spring.
Here is the original method from org.springframework.messaging.handler.annotation.support.HeaderMethodArgumentResolver:
private Map<String, List<String>> getNativeHeaders(Message<?> message) {
return (Map)message.getHeaders().get("nativeHeaders");
}
Where this call:
message.getHeaders().get("nativeHeaders");
returns this:
https://ibb.co/qrvMNMk
(as you can see, there's an extra field "headerValue" apart from the key-value pair, which prevents the cast)
I send the record via KafkaTemplate like this:
kafkaTemplate.send(new ProducerRecord<String, TempContractEntity>(topics.getSubmit(), tempContractEntity));
The consumer gets messages via the @KafkaListener annotation:
@KafkaListener(topics = "#{settingsService.getTopics()}")
public void processMessage(OrchestratorRequestImpl orchestratorRequest,
        @Header(KafkaHeaders.RECEIVED_TOPIC) String topicName) throws Throwable { //...
}
Generally, I want to find a way to pre-process Kafka headers.
The NonTrustedHeaderType indicates that something sent a message with that header and its class is not trusted. This would not happen with the type of send you show: there is no Message<?> involved there, so something is missing from the picture in your question.
One thing you could do is add a ConsumerInterceptor to the consumer configuration and weed out the unwanted header in its onConsume() method, as sketched below.
But you should really figure out who's sending it.
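A minimal sketch of such an interceptor (the header name "nativeHeaders" comes from the question; the String key/value types are assumptions and should match your deserializers):
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerInterceptor;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class HeaderCleaningInterceptor implements ConsumerInterceptor<String, String> {

    @Override
    public ConsumerRecords<String, String> onConsume(ConsumerRecords<String, String> records) {
        // strip the unwanted header from every record before it reaches the listener
        for (ConsumerRecord<String, String> record : records) {
            record.headers().remove("nativeHeaders");
        }
        return records;
    }

    @Override
    public void onCommit(Map<TopicPartition, OffsetAndMetadata> offsets) { }

    @Override
    public void close() { }

    @Override
    public void configure(Map<String, ?> configs) { }
}
Register it with the consumer by adding the class name to the interceptor.classes property (ConsumerConfig.INTERCEPTOR_CLASSES_CONFIG).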

How to inject KafkaTemplate in Quarkus

I'm trying to inject a KafkaTemplate to send a single message. I'm developing a small function that lies outside the reactive approach.
I can only find examples that use @Incoming and @Outgoing from SmallRye, but I don't need a KafkaStream.
I tried with Kafka-CDI, but I'm unable to inject the SimpleKafkaProducer.
Any ideas?
Regarding Clement's answer:
It seems like the right direction, but when executing orders.send("hello"); I receive this error:
(vert.x-eventloop-thread-3) Unhandled exception: java.lang.IllegalStateException: Stream not yet connected
I'm consuming from my topic on the command line; Kafka is up and running, and if I produce manually I can see the consumed messages.
It seems related to this sentence in the docs:
To use an Emitter for the stream hello, you need a @Incoming("hello")
somewhere in your code (or in your configuration).
I have this code in my class:
#Incoming("orders")
public CompletionStage<Void> consume(KafkaMessage<String, String> msg) {
log.info("Received message (topic: {}, partition: {}) with key {}: {}", msg.getTopic(), msg.getPartition(), msg.getKey(), msg.getPayload());
return msg.ack();
}
Maybe I've forgotten some configurations?
So, you just need to use an Emitter:
@Inject
@Stream("orders") // Emit on the channel 'orders'
Emitter<String> orders;
// ...
orders.send("hello");
And in your application.properties, declare:
## Orders topic (WRITE)
mp.messaging.outgoing.orders.type=io.smallrye.reactive.messaging.kafka.Kafka
mp.messaging.outgoing.orders.topic=orders
mp.messaging.outgoing.orders.bootstrap.servers=localhost:9092
mp.messaging.outgoing.orders.key.serializer=org.apache.kafka.common.serialization.StringSerializer
mp.messaging.outgoing.orders.value.serializer=org.apache.kafka.common.serialization.StringSerializer
mp.messaging.outgoing.orders.acks=1
To avoid the Stream not yet connected exception, do as suggested by the doc:
To use an Emitter for the stream hello, you need a @Incoming("hello")
somewhere in your code (or in your configuration).
Assuming you have something like this in your application.properties:
# Orders topic (READ)
smallrye.messaging.source.orders-r-topic.type=io.smallrye.reactive.messaging.kafka.Kafka
smallrye.messaging.source.orders-r-topic.topic=orders
smallrye.messaging.source.orders-r-topic.bootstrap.servers=0.0.0.0:9092
smallrye.messaging.source.orders-r-topic.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
smallrye.messaging.source.orders-r-topic.value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
smallrye.messaging.source.orders-r-topic.group.id=my-group-id
Add something like this:
#Incoming("orders-r-topic")
public CompletionStage<Void> consume(KafkaMessage<String, String> msg) {
log.info("Received message (topic: {}, partition: {}) with key {}: {}", msg.getTopic(), msg.getPartition(), msg.getKey(), msg.getPayload());
return msg.ack();
}
Since Clement's answer, the @Stream annotation has been deprecated. The @Channel annotation must be used instead.
You can use an Emitter provided by the quarkus-smallrye-reactive-messaging-kafka dependency to produce messages to a Kafka topic.
A simple Kafka producer implementation:
public class MyKafkaProducer {

    @Inject
    @Channel("my-topic")
    Emitter<String> myEmitter;

    public void produce(String message) {
        myEmitter.send(message);
    }
}
And the following configuration must be added to the application.properties file:
mp.messaging.outgoing.my-topic.connector=smallrye-kafka
mp.messaging.outgoing.my-topic.bootstrap.servers=localhost:9092
mp.messaging.outgoing.my-topic.value.serializer=org.apache.kafka.common.serialization.StringSerializer
This will produce String-serialized messages to a Kafka topic named my-topic.
Note that by default the name of the channel is also the name of the Kafka topic to which the data is produced. This behavior can be changed through the configuration, as shown below. The supported configuration attributes are described in the Reactive Messaging documentation.
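For example, to decouple the channel name from the topic name, you can set the connector's topic attribute in application.properties (orders-v1 below is just a placeholder topic name):
# route the 'my-topic' channel to a Kafka topic with a different name
mp.messaging.outgoing.my-topic.topic=orders-v1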

how to add protobuf file for topic in gcloud pubsub?

In Google Cloud's Pub/Sub, I can see that on creating a new topic, I have to create a new message. Can I store a protobuf file there instead of having to write the whole structure of the message in key-value pairs? For the protobuf code that shall be written, I mean this. If the protobuf isn't meant to be put on gcloud Pub/Sub, how can I use it with a gRPC client?
If you want to send a protobuf message via the Publish API, you should do so by serializing it to a ByteString and then setting it as the message's data field. For example, if you are using the Java client library and have a Publisher and an object obj of some protobuf type, then you could do the following:
PubsubMessage message = PubsubMessage.newBuilder()
.setData(obj.toByteString())
.build();
ApiFuture<String> response = publisher.publish(message);
...
On the subscribe side, you would decode the message in your MessageReceiver:
public void receiveMessage(PubsubMessage message, AckReplyConsumer consumer) {
    ProtoBufMessage obj;
    try {
        obj = ProtoBufMessage.parseFrom(message.getData());
    } catch (Exception e) {
        // Handle the improperly encoded message (e.g. consumer.nack()) and
        // return, so the uninitialized obj is never read below
        return;
    }
    ...
}
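For completeness, a minimal sketch of wiring such a receiver into a Subscriber (the project and subscription names are placeholders):
import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;

ProjectSubscriptionName subscription =
        ProjectSubscriptionName.of("my-project", "my-subscription"); // placeholders
MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
    // decode with parseFrom as shown above, then acknowledge
    consumer.ack();
};
Subscriber subscriber = Subscriber.newBuilder(subscription, receiver).build();
subscriber.startAsync().awaitRunning();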