How to mock KafkaProducer from a JUnit method - apache-kafka

I have a producer method, publisherMethod, which uses properties to create a KafkaProducer object.
I am trying to write a JUnit test for this method and I am not able to mock the producer.
I found the MockProducer class on the web, but didn't get a clear idea of how to use it.
Properties props = new Properties();
Producer<String, String> producer = new KafkaProducer<>(props); // this is the line I need to mock
This is the exception I get when I try to mock:
org.apache.kafka.common.KafkaException: Failed to construct kafka producer
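For what it's worth, here is a minimal sketch of how MockProducer can be used, assuming publisherMethod is refactored to take the Producer as a dependency instead of constructing it inline (the Publisher class, topic, and message below are hypothetical):

import static org.junit.Assert.assertEquals;

import org.apache.kafka.clients.producer.MockProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.Test;

public class PublisherTest {

    // Hypothetical production class: it receives the Producer instead of
    // calling new KafkaProducer<>(props) itself, which is what makes it testable.
    static class Publisher {
        private final Producer<String, String> producer;

        Publisher(Producer<String, String> producer) {
            this.producer = producer;
        }

        void publisherMethod(String topic, String message) {
            producer.send(new ProducerRecord<>(topic, message));
        }
    }

    @Test
    public void publishesRecord() {
        // MockProducer implements the Producer interface entirely in memory,
        // so no broker connection is attempted and no KafkaException is thrown.
        MockProducer<String, String> mockProducer =
                new MockProducer<>(true, new StringSerializer(), new StringSerializer());

        new Publisher(mockProducer).publisherMethod("my-topic", "payload");

        // history() exposes every record passed to send().
        assertEquals(1, mockProducer.history().size());
    }
}

The key point is that the test never constructs a real KafkaProducer, which is what throws the "Failed to construct kafka producer" exception.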

Related

Sleuth tracing is not working for transactional Kafka producers

Currently, we are using transactional Kafka producers. We have noticed that the Kafka tracing instrumentation is missing for them, which means we don't see the producers being instrumented and the b3 headers are never added.
After going through the code, we found that the post processors are not invoked for transactional producers, which means the TracingProducer is never created by the TraceProducerPostProcessor. Is there a reason for that? Also, what is the workaround for enabling tracing for transactional producers? It seems there is no single place where a tracing producer can easily be created (DefaultKafkaProducerFactory#doCreateTxProducer is private).
Screenshot attached (DefaultKafkaProducerFactory class). In the screenshot you can see that the post processors are invoked only for the raw producer, not for the transactional producer.
Your help will be much appreciated.
Thanks
DefaultKafkaProducerFactory#createRawProducer??
createRawProducer() is called for both transactional and non-transactional producers; something else is going on.
EDIT
The problem is that Sleuth replaces the producer with a different one, but the factory discards that and uses the original.
https://github.com/spring-projects/spring-kafka/issues/1778
EDIT2
Actually, it's a good thing that we discard the tracing producer here; Sleuth also wraps the factory in a proxy and wraps the CloseSafeProducer in a TracingProducer; but I see the same result with both transactional and non-transactional producers...
import org.apache.kafka.clients.producer.Producer;
import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.ProducerFactory;

@SpringBootApplication
public class So67194702Application {

    public static void main(String[] args) {
        SpringApplication.run(So67194702Application.class, args);
    }

    @Bean
    public ApplicationRunner runner(ProducerFactory<String, String> pf) {
        return args -> {
            Producer<String, String> prod = pf.createProducer();
            prod.close();
        };
    }
}
Putting a breakpoint on the close()...
Thanks Gary Russell for the very quick response. The createRawProducer is effectively called for both transactional and non-transactional producers.
Sleuth uses the TraceProducerPostProcessor to wrap a Kafka producer into a TracingProducer. As the ProducerPostProcessor interface extends the Function interface, we may suppose the result of the function could/should be used, but the createRawProducer method of DefaultKafkaProducerFactory applies the post processors without using the return value, causing the issue in this specific case.
So, couldn't we modify the implementation of createRawProducer to assign the result of the post processor? If not, wouldn't it be better to have post processors extend java.util.function.Consumer instead of Function?
A successful test was made by overriding the createRawProducer method as follows:
@Override
protected Producer<K, V> createRawProducer(Map<String, Object> rawConfigs) {
    Producer<K, V> kafkaProducer = new KafkaProducer<>(rawConfigs, getKeySerializerSupplier().get(), getValueSerializerSupplier().get());
    for (ProducerPostProcessor<K, V> pp : getPostProcessors()) {
        kafkaProducer = pp.apply(kafkaProducer);
    }
    return kafkaProducer;
}
Thank you for your help.

When to call KafkaProducer close method?

As the Kafka documentation says:
The producer is thread safe and sharing a single producer instance
across threads will generally be faster than having multiple
instances.
So I have the following code and want to have only one KafkaProducer instance shared across send requests. But where in the code is the best place to call its close method? I can't call close inside the send method. How should I write the code to handle this?
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.IntegerSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class Producer {

    private final KafkaProducer<Integer, String> producer;
    private final String topic; // keep the target topic for send()

    public Producer(String topic, Boolean isAsync) {
        this.topic = topic;
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, KafkaProperties.KAFKA_SERVER_URL + ":" + KafkaProperties.KAFKA_SERVER_PORT);
        props.put(ProducerConfig.CLIENT_ID_CONFIG, "DemoProducer");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producer = new KafkaProducer<>(props);
    }

    public void send(String message) {
        producer.send(new ProducerRecord<>(topic, message));
    }
}
You can create the producer first and pass (inject) it into your web resource or resources.
Then you can use a shutdown hook to close the producer via its reference.
Even better, use the lifecycle stop hooks of your framework if it has them.
Example in dropwizard:
https://www.dropwizard.io/en/latest/manual/core.html?highlight=managed#managed-objects
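Outside a framework, a plain JVM shutdown hook works; a minimal sketch:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;

public class ProducerLifecycle {

    public static KafkaProducer<Integer, String> start(Properties props) {
        KafkaProducer<Integer, String> producer = new KafkaProducer<>(props);
        // close() flushes any buffered records before releasing network
        // resources, so registering it as a shutdown hook is safe.
        Runtime.getRuntime().addShutdownHook(new Thread(producer::close));
        return producer;
    }
}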
Kafka producer implements the AutoCloseable interface, so you can declare it within a try-with-resources block, and it will take care of releasing the resources when your code goes out of the block's scope.
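For example, a sketch for short-lived, one-shot usage (a long-lived shared producer should not be closed after every send):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OneShotSend {

    public static void send(Properties props, String topic, Integer key, String value) {
        // close() is invoked automatically when the block exits,
        // flushing pending records even if an exception is thrown.
        try (KafkaProducer<Integer, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>(topic, key, value));
        }
    }
}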
Do you ever need to change producer parameters at runtime, for example when changing the broker URLs or during tests?
If you do and you have a singleton producer, make sure to provide a hook to close and recreate the producer with the new parameters.
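A sketch of such a hook (the class and method names are hypothetical):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ReconfigurableProducer {

    private volatile KafkaProducer<Integer, String> producer;

    public ReconfigurableProducer(Properties props) {
        this.producer = new KafkaProducer<>(props);
    }

    // Swap in a producer built from the new parameters, then close the
    // old one so its buffers are flushed and its connections released.
    public synchronized void reconfigure(Properties newProps) {
        KafkaProducer<Integer, String> old = this.producer;
        this.producer = new KafkaProducer<>(newProps);
        old.close();
    }

    public void send(String topic, Integer key, String value) {
        producer.send(new ProducerRecord<>(topic, key, value));
    }
}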

Adding a ProducerInterceptor - Kafka

I am trying to intercept messages before they get serialized, and I see that there is already an interface called ProducerInterceptor that can be used to modify records. After writing a class that implements this interface and modifies the data, where do I need to register the new class? Do I have to modify some files?
Pass it in the properties while creating the KafkaProducer object, e.g.
Properties producerProps = new Properties();
producerProps.put(ProducerConfig.INTERCEPTOR_CLASSES_CONFIG, "fully qualified name of your interceptor class");
// ... add other properties
KafkaProducer<String, String> kProd = new KafkaProducer<>(producerProps);
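For reference, a minimal sketch of what such an interceptor class can look like (the class name and the upper-casing of the value are just illustrative; onSend runs before the record is serialized):

import java.util.Map;

import org.apache.kafka.clients.producer.ProducerInterceptor;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class UpperCaseInterceptor implements ProducerInterceptor<String, String> {

    @Override
    public ProducerRecord<String, String> onSend(ProducerRecord<String, String> record) {
        // Called before serialization; return the (possibly modified) record.
        return new ProducerRecord<>(record.topic(), record.partition(),
                record.timestamp(), record.key(), record.value().toUpperCase());
    }

    @Override
    public void onAcknowledgement(RecordMetadata metadata, Exception exception) {
        // Called when the broker acknowledges the send, or when it fails.
    }

    @Override
    public void close() {
        // Release any resources held by the interceptor.
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // Receives the producer configuration at startup.
    }
}

The interceptor class only needs to be on the producer's classpath; no other files have to be modified.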

How to use Scaldi to inject an Akka Router?

I am creating the router from Akka configuration.
val router = context.actorOf(FromConfig.props(MyActor.props), "router")
I want to unit test the Actor that creates the router, and being able to inject the router into the Actor would be helpful.
Is it possible to instead inject this router using Scaldi? I know in the Scaldi module I can bind using new.
binding toProvider new OrderProcessor
But I can't seem to find a way to create bindings from config.
The Props can be injected.
In the Module:
binding identifiedBy "props-from-config" to FromConfig.props(MyActor.props)
And in the Actor, inject the Props and create the router:
private val propsFromConfig = inject[Props]("props-from-config")
val router: ActorRef = context.actorOf(propsFromConfig, "router")
Then, in the unit test, bind any Props. The Actor creates the router from the injected Props and does not know that they come from config.

Mocking a static method which is written in Scala in Java Unit test

We are using Kafka Clients in a project.
I am trying to mock a static method from the Kafka client via JMockit:
new NonStrictExpectations() {
    {
        new MockUp<Consumer>() {
            @Mock
            ConsumerConnector createJavaConsumerConnector(ConsumerConfig c) {
                return null;
            }
        };
    }
};
It looks like JMockit is not working for some reason, although I am sure of the JMockit syntax for mocking static methods. This is the error:
java.lang.IllegalArgumentException: Matching real methods not found for the following mocks:
dispatcher.DispatcherTests$1$1#createJavaConsumerConnector(kafka.consumer.ConsumerConfig)
If this does not work because the Kafka client code is written in Scala, how can I make my program work?
There are multiple ConsumerConnector classes; the one I was using was from the wrong package. Scala was not causing any problem here. It worked after using the right ConsumerConnector class.
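For illustration, a sketch of the corrected mock, assuming the old high-level consumer API where Consumer.createJavaConsumerConnector returns the connector from the kafka.javaapi.consumer package rather than kafka.consumer:

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
// The Java-facing connector lives in kafka.javaapi.consumer; matching the
// real return type is what lets JMockit find the method.
import kafka.javaapi.consumer.ConsumerConnector;
import mockit.Mock;
import mockit.MockUp;

public class DispatcherTests {

    void mockConsumerConnector() {
        new MockUp<Consumer>() {
            @Mock
            ConsumerConnector createJavaConsumerConnector(ConsumerConfig config) {
                return null;
            }
        };
    }
}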