Exception Occurred Subject not found error code - Confluent - apache-kafka

I can see an error in my logs saying that the subject with name A.Abc-key is not present.
I listed all the subjects and verified that A.Abc-key is indeed not present, but A.Abc-value is.
When I consume the same topic with the print.key property, I get the following error:
./kafka-avro-console-consumer --bootstrap-server http://localhost:9092 --from-beginning --property print.key=true --topic A.Abc
null Processed a total of 1 messages
[2018-09-05 16:26:45,470] ERROR Unknown error when running consumer: (kafka.tools.ConsoleConsumer$:76)
org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 80
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
I am not sure how to debug and fix this.

Your error is HTTP-related (Connection refused), so make sure your Schema Registry is actually running on localhost, since you have not specified its URL.
You also verified that the A.Abc-key subject is not present. That means your key is not Avro, but the Avro console consumer will still try to deserialize keys as Avro once you add the print.key property.
You can work around that by setting the key.deserializer property, and if your registry is not on localhost, you must specify it:
--property schema.registry.url="http://..." \
--property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
--property print.key=true
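Putting those together, a complete command would look roughly like the following sketch. The registry address http://localhost:8081 is an assumption (the default port), and the bootstrap server is given as plain host:port rather than the http:// form used in the original command.
# http://localhost:8081 is assumed; replace it with your Schema Registry address
./kafka-avro-console-consumer --bootstrap-server localhost:9092 --from-beginning --topic A.Abc \
--property schema.registry.url="http://localhost:8081" \
--property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
--property print.key=true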

Related

Use RecordNameStrategy using kafka-avro-console-producer and confluent schema registry

Using the kafka-avro-console-producer cli
When trying the following command
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419 \
--property auto.register=false
I have this error
org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema {...}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject 'my-topic-name-value' not found.; error code: 40401
Since I'm not using TopicNameStrategy for my subjects but RecordNameStrategy, I would like to specify that. How can I find the property that sets the subject name strategy used by the CLI?
Note: since I found https://github.com/confluentinc/schema-registry/blob/a0a04628687a72ac6d01869d881a60fbde4177e7/avro-serializer/src/main/java/io/confluent/kafka/serializers/AbstractKafkaAvroSerDeConfig.java#L97 I already tried the following, without much success:
--property value.subject.name.strategy.default=io.confluent.kafka.serializers.subject.RecordNameStrategy
This worked:
--property value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
https://github.com/confluentinc/schema-registry/blob/master/schema-serializer/src/main/java/io/confluent/kafka/serializers/AbstractKafkaSchemaSerDeConfig.java#L136
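For reference, the full producer invocation from the question with that property added would look roughly like this (the broker list, topic, and schema ID placeholders are kept from the question; this is an untested sketch, not the exact command from the answer):
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419 \
--property auto.register=false \
--property value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy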

Use kafka-avro-console-consumer with already registered schema - error 500

Using the kafka-avro-console-producer cli
When trying the following command
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419
I have this error
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema {...}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Internal Server Error; error code: 500
I can't understand why it is trying to register the schema, since the schema already exists and I'm trying to use it through its ID in the registry.
Note: my Schema Registry is in READ_ONLY mode, but as I said, that should not be an issue, right?
Basically I needed to tell the producer not to try to auto-register the schema, using this property:
--property auto.register.schemas=false
Found out here Use kafka-avro-console-producer without autoregister the schema
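For completeness, the command from the question with that extra property would look roughly like this (same placeholders as in the question; an untested sketch):
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419 \
--property auto.register.schemas=false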

Produce message to Remote Kafka

I am trying to send messages from my local machine to a remote Kafka broker.
server.properties file:
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://<PUBLIC_IP>:9092
Tried to produce using the command
./bin/kafka-console-producer.sh --broker-list <PUBLIC_IP>:9092 --topic first_topic
Tried to consume using the command
./bin/kafka-console-consumer.sh --bootstrap-server <PUBLIC_IP>:9092 --topic first_topic --from-beginning
Getting the following error while producing
org.apache.kafka.common.KafkaException: Producer closed while send in progress
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:894)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:870)
at kafka.tools.ConsoleProducer$.send(ConsoleProducer.scala:71)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:53)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Caused by: org.apache.kafka.common.KafkaException: Requested metadata update after close
at org.apache.kafka.clients.producer.internals.ProducerMetadata.awaitUpdate(ProducerMetadata.java:126)
at org.apache.kafka.clients.producer.KafkaProducer.waitOnMetadata(KafkaProducer.java:1032)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:891)
... 4 more
What else should be configured to produce to Kafka hosted on a remote server? I have also opened port 9092, and there is no authorization/authentication enabled.
The problem seems to be with Azure. Solved by changing the server.properties file to the following configuration:
listener.security.protocol.map=INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
advertised.listeners=INTERNAL://:19092,EXTERNAL://<public_ip>:9092
listeners=INTERNAL://:19092,EXTERNAL://<public_ip>:9092
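Note that with renamed listeners like these, the broker typically also needs to be told which listener to use for inter-broker traffic, otherwise it still looks for a listener named PLAINTEXT. A fuller sketch of the relevant server.properties lines, where the inter.broker.listener.name line is my own assumption on top of the answer above:
listeners=INTERNAL://:19092,EXTERNAL://<public_ip>:9092
advertised.listeners=INTERNAL://:19092,EXTERNAL://<public_ip>:9092
listener.security.protocol.map=INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT
# assumption: route inter-broker traffic over the INTERNAL listener
inter.broker.listener.name=INTERNAL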

kafka-console-consumer with kerberos throws No enum constant exception

I have enabled Kerberos for Kafka from Ambari. For me, the console producer works with the command below:
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list HOSTNAME:6667 --topic test --producer-property security.protocol=SASL_PLAINTEXT
While consuming data using the command below, I get this error:
Caused by: java.lang.IllegalArgumentException: No enum constant org.apache.kafka.common.security.auth.SecurityProtocol.
The console consumer command I used is:
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --bootstrap-server HOSTNAME:6667 --topic test --consumer-property security.protocol SASL_PLAINTEXT
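A likely cause is the missing = in the --consumer-property argument: with security.protocol SASL_PLAINTEXT the security protocol value ends up empty, which the SecurityProtocol enum rejects. A sketch of the corrected command (same host and topic as above; this is a suggestion rather than a recorded answer):
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --bootstrap-server HOSTNAME:6667 --topic test --consumer-property security.protocol=SASL_PLAINTEXT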

Error retrieving Avro schema for id 1, Subject not found.; error code: 40401

Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema for id 1
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401
Confluent Version 4.1.0
I am consuming data from a couple of topics (topic_1, topic_2) using KTables, joining the data, and then pushing it onto another topic (topic_out) as a KStream (KTable.toStream()).
The data is in Avro format.
When I list the subjects using
curl -X GET http://localhost:8081/subjects/
I find
topic_1-value
topic_1-key
topic_2-value
topic_2-key
topic_out-value
but there is no subject with topic_out-key. Why is it not created?
output from topic_out:
kafka-avro-console-consumer --bootstrap-server localhost:9092 --from-beginning --property print.key=true --topic topic_out
"code1 " {"code":{"string":"code1 "},"personid":{"string":"=NA="},"agentoffice":{"string":"lic1 "},"status":{"string":"a"},"sourcesystem":{"string":"ILS"},"lastupdate":{"long":1527240990138}}
I can see the key being produced, but there is no subject for the key.
Why is a key subject required?
I am feeding this topic to another connector (hdfs-sink) to push the data to HDFS, but it fails with the error below:
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema for id 5
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401
When I look at the Schema Registry logs, I can see:
[2018-05-24 15:40:06,230] INFO 127.0.0.1 - - [24/May/2018:15:40:06 +0530] "POST /subjects/topic_out-key?deleted=true HTTP/1.1" 404 51 9 (io.confluent.rest-utils.requests:77)
Any idea why the subject topic_out-key is not being created?
The topic_out-key subject is not being created because the key of your Kafka Streams output is a plain String, not an Avro-encoded string.
You can verify this by using the plain kafka-console-consumer instead with --property print.key=true --property print.value=false: the key shows no special characters, whereas the same command with the value printed does show them (those characters are the binary Avro encoding of the value).
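For example, something like this (bootstrap server and topic taken from the question; a sketch rather than the original answer's exact command):
kafka-console-consumer --bootstrap-server localhost:9092 --from-beginning --topic topic_out --property print.key=true --property print.value=false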
In Kafka Connect, you must use Kafka's StringConverter class for the key.converter property rather than the Confluent Avro one.
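In the hdfs-sink connector properties, that would look roughly like the following; the value converter lines are my assumption based on the Avro values described above, and the registry URL is a placeholder:
key.converter=org.apache.kafka.connect.storage.StringConverter
# assumed: values stay Avro, served by the Schema Registry at the placeholder URL below
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081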