Producer error in Scala Embedded Kafka with Kafka Streams - scala

I have a test that intermittently leaves an open producer thread that logs errors continuously:
[2018-06-01 15:52:48,526] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,526] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,526] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,628] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,628] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,628] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,730] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,730] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,730] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,982] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,982] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2018-06-01 15:52:48,982] WARN [Producer clientId=test-deletion-stream-application-9d94ddd6-6f29-4364-890e-0d9676782edd-StreamThread-1-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
The test works, but it sometimes fails as shown above.
test("My test") {
val topology = Application.getTopology(...)
val streams = new KafkaStreams(topology,properties)
withRunningKafka {
createCustomTopic(eventTopic)
val streamId = UUIDs.newUuid().toString
logger.info(s"Creating stream with Application ID: [$streamId]")
val streams = new KafkaStreams(topology, streamConfig(streamId, PropertiesConfig.asScalaMap(props)))
try {
publishToKafka(eventTopic, key = keyMSite1UID1, message = event11a)
// ... several more publishings
Thread.sleep(publishingDelay) // Give time to initialize
streams.start()
Thread.sleep(deletionDelay)
withConsumer[MyKey, MyEvent, Unit] { consumer =>
val consumedMessages: Stream[(MyKey, MyEvent)] =
consumer.consumeLazily[(MyKey, MyEvent)](eventTopic)
val messages = consumedMessages.take(20).toList
messages.foreach(tuple => logger.info("EVENT END: " + tuple))
messages.size should be(6)
// several assertions here
}
} finally {
streams.close()
}
}(config)
}
A particularity is that the streams application produces deletion events to the same topic it consumes from.
There are two similar tests in this suite. I execute the test suite under sbt, like so:
testOnly *MyTest
Four out of five executions leave a dangling thread that posts those errors indefinitely. They appear in groups of three, but I don't know why.
I've tried adding delays after the calls to close(), but that does not seem to help.
How can I avoid dangling producer threads?

In your test you create two KafkaStreams instances, but you only close() one of them. I assume the leaking Producer belongs to the instance you never close. Note that you need to call KafkaStreams#close() even if you never called KafkaStreams#start().
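A minimal sketch of that fix, reusing the names from the question (topology, eventTopic, streamConfig, props, config): build a single KafkaStreams instance inside withRunningKafka and close it in the finally block, so the outer, never-closed instance disappears.

test("My test") {
  val topology = Application.getTopology(...)

  withRunningKafka {
    createCustomTopic(eventTopic)
    val streamId = UUIDs.newUuid().toString
    // One KafkaStreams instance only; it is always closed, even when start() is never reached.
    val streams = new KafkaStreams(topology, streamConfig(streamId, PropertiesConfig.asScalaMap(props)))
    try {
      // publish the test events, start the streams, run the assertions ...
      streams.start()
    } finally {
      streams.close()
    }
  }(config)
}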

Related

Kafka client keeps reconnecting after a disconnect; is there any way to disable it?

The client keeps reconnecting by itself after it gets disconnected. Is there any way I can disable this?
Should I close the Kafka client?
[2022-06-30 16:17:51,332] WARN [Consumer clientId=consumer-xxx, groupId=xxx] Bootstrap broker xxx:xxx (id: -1 rack: null) disconnected (org.apache.kafka.clients.NetworkClient)
[2022-06-30 16:17:52,387] WARN [Consumer clientId=consumer-pg-hudi001_5-1, groupId=pg-hudi001_5] Connection to node -1 (xxx) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
The consumer will constantly poll and periodically refresh its metadata about which brokers are available in the cluster.
The only way to stop it from trying to connect is to close() the consumer, yes.
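A minimal sketch of that, assuming a plain KafkaConsumer with hypothetical broker, group, and topic names: once close() returns, the client's network thread shuts down and the reconnection attempts (and their warnings) stop.

import java.time.Duration
import java.util.{Collections, Properties}
import org.apache.kafka.clients.consumer.KafkaConsumer

val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")  // hypothetical broker address
props.put("group.id", "example-group")            // hypothetical group id
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")

val consumer = new KafkaConsumer[String, String](props)
try {
  consumer.subscribe(Collections.singletonList("example-topic"))  // hypothetical topic
  val records = consumer.poll(Duration.ofSeconds(1))
  // ... process records ...
} finally {
  consumer.close()  // stops polling, metadata refreshes, and reconnection attempts
}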

kafka-server-start: The process cannot access the file because it is being used by another process

I'm trying to start the Kafka server on my Windows machine:
kafka-server-start.bat config\server.properties
But the process throws an exception:
The process cannot access the file because it is being used by another
process.
But when I run:
kafka-topics.bat --list --bootstrap-server localhost:9092
It looks like Kafka is not running:
[2020-02-20 00:41:39,316] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2020-02-20 00:41:41,464] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2020-02-20 00:41:43,614] WARN [AdminClient clientId=adminclient-1] Connection to node -1 (localhost/127.0.0.1:9092) could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
And when I run kafka-server-stop.bat, I get:
No Instance(s) Available.

Kafka send to Azure Event Hub

I've set up Kafka on my machine and I'm trying to set up MirrorMaker to consume from a local topic and mirror it to an Azure event hub, but so far I've been unable to do so, and I get the following error:
ERROR Error when sending message to topic dev-eh-kafka-test with key: null, value: 5 bytes with error: (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
After some time I realized that the problem must be on the producer side, so I tried simply using the kafka-console-producer tool directly against the event hub and got the same error.
Here is my producer settings file:
bootstrap.servers=dev-we-eh-feed.servicebus.windows.net:9093
compression.type=none
max.block.ms=0
# for event hub
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://dev-we-eh-feed.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=*****";
Here is the command to spin the producer:
kafka-console-producer.bat --broker-list dev-we-eh-feed.servicebus.windows.net:9093 --topic dev-eh-kafka-test
My event hub namespace has an event hub named dev-eh-kafka-test.
Has anyone been able to do this? Eventually the idea is to secure this with an SSL certificate, but first I need to be able to establish the connection.
I tried both Apache Kafka 1.1.1 and Confluent Kafka 4.1.3 (the version the client is using).
==== UPDATE 1
Someone showed me how to enable more verbose logging, and this seems to be the detailed version of the error:
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initialize connection to node dev-we-eh-feed.servicebus.windows.net:9093 (id: -1 rack: null) for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initiating connection to node dev-we-eh-feed.servicebus.windows.net:9093 (id: -1 rack: null) (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 102400, SO_TIMEOUT = 0 to node -1 (org.apache.kafka.common.network.Selector)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Completed connection to node -1. Fetching API versions. (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Initiating API versions fetch from node -1. (org.apache.kafka.clients.NetworkClient)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Connection with dev-we-eh-feed.servicebus.windows.net/51.144.238.23 disconnected (org.apache.kafka.common.network.Selector)
java.io.EOFException
at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:124)
at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:93)
at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:235)
at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:196)
at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:559)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:495)
at org.apache.kafka.common.network.Selector.poll(Selector.java:424)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:460)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:239)
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:163)
at java.base/java.lang.Thread.run(Thread.java:830)
[2020-02-28 17:32:08,010] DEBUG [Producer clientId=console-producer] Node -1 disconnected. (org.apache.kafka.clients.NetworkClient)
So here is the configuration that worked (it seems I was missing client.id).
Also, it seems you cannot choose the destination topic: it apparently must have the same name as the source topic.
bootstrap.servers=dev-we-eh-feed.servicebus.windows.net:9093
client.id=mirror_maker_producer
request.timeout.ms=60000
sasl.mechanism=PLAIN
security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="$ConnectionString" password="Endpoint=sb://dev-we-eh-feed.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=******";
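For what it's worth, a minimal Scala producer sketch using the same settings can help verify connectivity to the Event Hubs Kafka endpoint outside of MirrorMaker. The key, value, and the redacted connection string below are placeholders, and the topic must already exist as an event hub in the namespace.

import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

val props = new Properties()
props.put("bootstrap.servers", "dev-we-eh-feed.servicebus.windows.net:9093")
props.put("client.id", "mirror_maker_producer")
props.put("request.timeout.ms", "60000")
props.put("security.protocol", "SASL_SSL")
props.put("sasl.mechanism", "PLAIN")
props.put("sasl.jaas.config",
  "org.apache.kafka.common.security.plain.PlainLoginModule required " +
  "username=\"$ConnectionString\" password=\"<redacted Event Hubs connection string>\";")
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

val producer = new KafkaProducer[String, String](props)
try {
  // The event hub / topic name must match on both ends, as noted above.
  producer.send(new ProducerRecord[String, String]("dev-eh-kafka-test", "test-key", "hello event hub")).get()
} finally {
  producer.close()
}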

Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)

I have installed Kafka and am doing some basic testing. I am able to create topics using the scripts provided under the Kafka broker's bin folder.
But when I try to produce a message, I get the warning below every time I run this, and no message is produced. Please advise.
[root@node2 bin]# ./kafka-console-producer.sh --broker-list localhost:9092 --topic test_master
>testmsg1
[2019-05-15 06:25:19,092] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:19,197] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:19,349] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:19,562] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:20,017] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:20,876] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:21,987] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:22,957] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-05-15 06:25:23,818] WARN [Producer clientId=console-producer] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
^Corg.apache.kafka.common.KafkaException: Producer closed while send in progress
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:826)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:803)
at kafka.tools.ConsoleProducer$.send(ConsoleProducer.scala:75)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:57)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: org.apache.kafka.common.KafkaException: Requested metadata update after close
at org.apache.kafka.clients.Metadata.awaitUpdate(Metadata.java:188)
at org.apache.kafka.clients.producer.KafkaProducer.waitOnMetadata(KafkaProducer.java:938)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:823)
... 4 more
Open server.properties on each broker of your cluster and make the following change:
Change listeners=PLAINTEXT://:9092 to listeners=PLAINTEXT://<your ip address>:9092
Then use that IP address instead of localhost in the producer command as well.
EXAMPLE:
--broker-list <your ip address>:9092, e.g. --broker-list 172.0.0.1:9092

Kafka-Broker not available after some time of message transfer

Connection to node 1 could not be established after _consumer_offset-49.
I am not able to solve this issue: up to _consumer_offset-49 the consumer is able to get messages, but after offset 49 it shows the warning "Connection to node 1 could not be established. Broker may not be available."
C:\kafka_2.11-2.0.0>.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic test1
>hi
>hello
>hey
>whatsupp??
>how are u
>[2019-02-25 03:53:14,876] WARN [Producer clientId=console-producer] Connection to node 1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-02-25 03:53:15,982] WARN [Producer clientId=console-producer] Connection to node 1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
[2019-02-25 03:53:17,240] WARN [Producer clientId=console-producer] Connection to node 1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)
I solved the issue: I changed the Java version to Java 1.8-181 and it worked.