Kafka Java API fails to produce a message but doesn't throw an exception - apache-kafka

I'm using the Java KafkaProducer API to produce a Kafka message.
I'm calling the send method, which returns a Future, and then calling get to block and wait for the result.
I was expecting that when the producer failed to send a message it would throw an exception, but that isn't happening.
I just have a log saying:
Error while fetching metadata with correlation id 218444 : {TOPIC=LEADER_NOT_AVAILABLE}
The message wasn't produced, but the API didn't return an error, so I had no opportunity to take action and the message was lost.
How can I handle situations like this one?
Just to clarify, I'm not worried about the specific error, because it was a temporary one. I'm worried because the API documentation says that .send(message).get() will throw an exception when it fails to send a message, but that didn't happen.
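For reference, a minimal sketch of the blocking-send pattern in question (the broker address and topic name are placeholders):

```java
import java.util.Properties;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class BlockingSendExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            try {
                // get() blocks until the broker acks or the send ultimately fails
                RecordMetadata metadata =
                        producer.send(new ProducerRecord<>("my-topic", "key", "value")).get();
                System.out.printf("sent to partition %d at offset %d%n",
                        metadata.partition(), metadata.offset());
            } catch (ExecutionException e) {
                // the underlying send failure is the cause of the ExecutionException
                System.err.println("send failed: " + e.getCause());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}
```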

Related

Flink deserialize Kafka message error doesn't ignore message

I have a Flink (v1.15) pipeline implementing a custom AbstractDeserializationSchema.
An exception is thrown for a bad message in the deserialize(byte[] message) method, where I catch the exception and simply return null.
According to the Flink docs (see below), returning null should cause Flink to ignore this message and move on to the next. My example doesn't; it reprocesses the message continuously.
Here is a sample of my deserializer. I did some debugging and it does indeed step into the if block and return null. The message is then simply reprocessed over and over.
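A minimal sketch of the pattern described, with a String element type and a trivial check standing in for the real XML parsing:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;

// Illustrative only: a String-typed schema standing in for the real XML-to-POJO
// deserializer; it returns null for records that fail to parse.
public class SkippingDeserializationSchema extends AbstractDeserializationSchema<String> {

    @Override
    public String deserialize(byte[] message) throws IOException {
        try {
            String xml = new String(message, StandardCharsets.UTF_8);
            if (!xml.startsWith("<")) {   // stand-in for a real XML parse/validation step
                return null;              // bad record: expecting Flink to skip it
            }
            return xml;
        } catch (RuntimeException e) {
            return null;                  // same idea: swallow the error and return null
        }
    }
}
```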
According to the official Flink docs (v1.13 and below) returning null should cause Flink to ignore this message.
https://nightlies.apache.org/flink/flink-docs-release-1.13/docs/connectors/datastream/kafka/
The excerpt below is from the above link (v1.13); I noticed that from v1.14 / v1.15 onwards this passage has been removed (https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/datastream/kafka/).
This is the repeated flow of the task after the XML parsing error
Thank you

MassTransit Kafka Rider get raw message

I need to get the raw message that was sent to Kafka, for logging.
For example, if validation of context.Message failed.
I tried the answer from Is there a way to get raw message from MassTransit?, but it doesn't work; context.TryGetMessage<JToken>() returns null all the time.
The Confluent.Kafka client does not expose the raw message data, only the deserialized message type. Therefore, MassTransit does not have a message body accessible.

Publish messages that could not be de-serialized to DLT topic

I do not understand how messages that could not be de-serialized can be written to a DLT topic with Spring Kafka.
I configured the consumer according to the spring kafka docs and this works well for exceptions that occur after de-serialization of the message.
But when the message is not de-serializable, an org.apache.kafka.common.errors.SerializationException is thrown while polling for messages.
Subsequently, SeekToCurrentErrorHandler.handle(Exception thrownException, List<ConsumerRecord<?, ?>> records, ...) is called with this exception, but with an empty list of records, and is therefore unable to write anything to the DLT topic.
How can I write those messages to the DLT topic as well?
The problem is that the exception is thrown by the Kafka client itself so Spring doesn't get to see the actual record that failed.
That's why we added the ErrorHandlingDeserializer2 which can be used to wrap the actual deserializer; the failure is passed to the listener container and re-thrown as a DeserializationException.
See the documentation.
When a deserializer fails to deserialize a message, Spring has no way to handle the problem, because it occurs before the poll() returns. To solve this problem, version 2.2 introduced the ErrorHandlingDeserializer2. This deserializer delegates to a real deserializer (key or value). If the delegate fails to deserialize the record content, the ErrorHandlingDeserializer2 returns a null value and a DeserializationException in a header that contains the cause and the raw bytes. When you use a record-level MessageListener, if the ConsumerRecord contains a DeserializationException header for either the key or value, the container’s ErrorHandler is called with the failed ConsumerRecord. The record is not passed to the listener.
The DeadLetterPublishingRecoverer has logic to detect the exception and publish the failed record.
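As a rough sketch of the wiring (assuming a Spring Kafka 2.2/2.3-era setup with JSON values and Spring Boot's auto-configured KafkaTemplate; the broker address and payload class name are placeholders):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.listener.SeekToCurrentErrorHandler;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2;
import org.springframework.kafka.support.serializer.JsonDeserializer;

@Configuration
public class DltConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Object> kafkaListenerContainerFactory(
            KafkaTemplate<Object, Object> template) {

        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // Wrap the real value deserializer so a failure becomes a null value plus
        // a DeserializationException header instead of an exception inside poll()
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
        props.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
        props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.example.MyEvent"); // placeholder type
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "com.example");

        ConcurrentKafkaListenerContainerFactory<String, Object> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(new DefaultKafkaConsumerFactory<>(props));
        // Publish failed records (including deserialization failures) after the
        // configured number of attempts; the recoverer's default target is <topic>.DLT
        factory.setErrorHandler(new SeekToCurrentErrorHandler(
                new DeadLetterPublishingRecoverer(template), 3));
        return factory;
    }
}
```

With this in place, a record that cannot be deserialized reaches the error handler as a ConsumerRecord carrying the DeserializationException in a header, and the recoverer publishes it, by default, to a topic named after the original with a .DLT suffix.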

Spring Kafka Template send with retry based on different cases

I am using Spring Kafka's KafkaTemplate to send messages asynchronously and doing proper error handling using a callback.
Also, I have configured the Kafka producer to have maximum of retries (MAX_INTEGER).
However, there may be some errors related to Avro serialization, and for those retrying wouldn't help. How can I skip retries for those errors, while still retrying for broker-related issues?
The serialization exception will occur before the message is sent, so the retries property is irrelevant in that case; it only applies when the message is actually sent.
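A hypothetical helper (not library code) illustrating that split: the serialization failure is thrown synchronously from send(), while broker-related failures arrive via the callback only after the producer's own retries are exhausted.

```java
import org.apache.kafka.common.errors.SerializationException;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;

// Hypothetical helper: shows where each kind of failure surfaces when sending
// through KafkaTemplate.
public class SafeSender<V> {

    private final KafkaTemplate<String, V> kafkaTemplate;

    public SafeSender(KafkaTemplate<String, V> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(String topic, String key, V payload) {
        try {
            kafkaTemplate.send(topic, key, payload).addCallback(
                    (SendResult<String, V> result) -> {
                        // delivered; offset/partition available via result.getRecordMetadata()
                    },
                    ex -> {
                        // broker-related failure, reported only after the producer's own
                        // retries (and delivery timeout) are exhausted: log, alert, or
                        // persist the payload for later replay
                    });
        } catch (SerializationException ex) {
            // Avro (or other) serialization failure: thrown synchronously, before the
            // record is ever handed to the broker, so the retries setting never applies.
            // Treat it as a permanent error and do not retry.
        }
    }
}
```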

Requeue JMS request with Mule

I use a JMS component in Mule with ActiveMQ queues, and I want a queued request that fails to be returned to the queue so it can be retried.
What should I configure to do that in Anypoint Studio?
Just requeueing your message doesn't sound like a good idea: imagine that you have a message that always fails; this would effectively cause an endless loop of trying to process it.
It sounds more like what you're interested in is the Rollback Exception Strategy. With this you can specify a maximum number of redeliveries, and when that number is exceeded you could put the message on a DLQ (dead letter queue) or similar, and preferably notify somebody about the failed message.
You can define a rollback exception strategy to ensure that a message that throws an exception in a flow is rolled back for reprocessing. Use a rollback exception strategy when you cannot correct an error when it occurs in a flow. Usually, you use a rollback exception strategy to handle errors that occur in a flow that involve a transaction. If the transaction fails, that is, if a message throws an exception while being processed, then the rollback exception strategy rolls back the transaction in the flow. If the inbound connector is transactional, Mule delivers the message to the inbound connector of the parent flow again to reattempt processing (that is, message redelivery).