Null pointer exception in kafka schema registry - apache-kafka

When I try to start the Kafka producer with an Avro schema, I always get a NullPointerException. Please help.
~$ sudo kafka-avro-console-producer --broker-list localhost:9092 --topic employeedetails --property parse.key=true --property value.schema= '{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
java.lang.NullPointerException
at org.apache.avro.Schema.parse(Schema.java:1225)
at org.apache.avro.Schema$Parser.parse(Schema.java:1032)
at org.apache.avro.Schema$Parser.parse(Schema.java:1020)
at io.confluent.kafka.formatter.AvroMessageReader.init(AvroMessageReader.java:136)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:42)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
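The likely culprit, judging from the stack trace: parse.key=true makes the reader parse a key.schema property that was never supplied, so Schema.parse receives null. The space after value.schema= is a second problem, since the shell then passes the schema JSON as a separate argument instead of as the property value. A corrected command might look like this (the key.schema and key.separator values here are assumptions for illustration):
sudo kafka-avro-console-producer --broker-list localhost:9092 --topic employeedetails \
--property parse.key=true \
--property key.separator=: \
--property key.schema='{"type":"string"}' \
--property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'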

Related

How to decode messages in MM2 offset sync topic?

In reference to the offset sync topic covered in KIP-382, which maintains the cluster-to-cluster offset mapping: while consuming messages from mm2-offset-syncs.target.internal, I found them to be serialized.
Is there a way the output can be deserialized so it is understandable using the Kafka command-line consumer?
./kafka-console-consumer.sh --bootstrap-server localhost:xxxx --topic mm2-offset-syncs.dest.internal --from-beginning
Yes, you can use OffsetSyncFormatter to deserialize the content of your offset syncs topics. For example:
./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
--topic mm2-offset-syncs.target.internal \
--formatter org.apache.kafka.connect.mirror.formatters.OffsetSyncFormatter \
--from-beginning
For more details, see KIP-597: MirrorMaker2 internal topics Formatters

Use RecordNameStrategy using kafka-avro-console-producer and confluent schema registry

Using the kafka-avro-console-producer cli
When trying the following command
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419 \
--property auto.register=false
I get this error:
org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema {...}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject 'my-topic-name-value' not found.; error code: 40401
Since I'm not using TopicNameStrategy for my subjects but RecordNameStrategy, I would like to specify that. What is the property that lets me set the subject name strategy for the CLI?
Note: since I found this https://github.com/confluentinc/schema-registry/blob/a0a04628687a72ac6d01869d881a60fbde4177e7/avro-serializer/src/main/java/io/confluent/kafka/serializers/AbstractKafkaAvroSerDeConfig.java#L97 I had already tried the following, without much success:
--property value.subject.name.strategy.default=io.confluent.kafka.serializers.subject.RecordNameStrategy
This worked:
--property value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
https://github.com/confluentinc/schema-registry/blob/master/schema-serializer/src/main/java/io/confluent/kafka/serializers/AbstractKafkaSchemaSerDeConfig.java#L136
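For reference, the full command with the working strategy property would be as follows (broker list, topic, and schema ID 419 are the placeholders from the question; note the full auto-registration property name auto.register.schemas, which the next question below arrives at):
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419 \
--property auto.register.schemas=false \
--property value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy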

Use kafka-avro-console-producer with already registered schema - error 500

Using the kafka-avro-console-producer cli
When trying the following command
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419
I get this error:
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema {...}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Internal Server Error; error code: 500
I can't understand why it is trying to register the schema, as the schema already exists and I'm trying to use it via its ID in the registry.
Note: my Schema Registry is in READ_ONLY mode, but as I said, that should not be an issue, right?
Basically I needed to tell the producer not to try to auto-register the schema, using this property:
--property auto.register.schemas=false
Found here: Use kafka-avro-console-producer without autoregister the schema
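With auto-registration disabled, the serializer only looks the schema up by its ID instead of attempting a write against the READ_ONLY registry, which is presumably what produced the 500. The command from the question then becomes:
kafka-avro-console-producer \
--broker-list <broker-list> \
--topic <topic> \
--property schema.registry.url=http://localhost:8081 \
--property value.schema.id=419 \
--property auto.register.schemas=false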

kafka-console-consumer with kerberos throws No enum constant exception

I have enabled Kerberos for Kafka from Ambari. For me, the console producer works with the command below:
/usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list HOSTNAME:6667 --topic test --producer-property security.protocol=SASL_PLAINTEXT
While consuming data using the command below, I get this error:
Caused by: java.lang.IllegalArgumentException: No enum constant org.apache.kafka.common.security.auth.SecurityProtocol.
The console consumer command I used is:
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --bootstrap-server HOSTNAME:6667 --topic test --consumer-property security.protocol SASL_PLAINTEXT
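The likely cause is the missing = in the --consumer-property argument: security.protocol SASL_PLAINTEXT is parsed as the property security.protocol with an empty value, which matches the empty enum constant in the error (SecurityProtocol. with nothing after the dot). Written like the working producer command, it would be:
/usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --bootstrap-server HOSTNAME:6667 --topic test --consumer-property security.protocol=SASL_PLAINTEXT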

Has anyone consumed a CNCF CloudEvent from Apache Kafka?

I have published several CNCF CloudEvents onto a Kafka Broker. I am trying to view them directly on the broker using this command:
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic flink-test --from-beginning
I have not been able to view any data. I have never had this issue with any other serialization format. Does anyone have any thoughts?
I think you should include the key.deserializer and value.deserializer that were used while sending the messages to the broker.
Example:
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic flink-test --from-beginning \
--property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
--property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
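One more thing worth checking, assuming the events were published using the Kafka protocol binding's binary content mode (the question does not say): in that mode the CloudEvents attributes travel as ce_-prefixed record headers rather than in the value, so printing headers may reveal them. On Kafka 2.7+ the console consumer can do this:
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic flink-test --from-beginning \
--property print.headers=true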