We are currently working on a POC that uses a Protobuf 'oneof' to publish multiple event types to the same topic. However, we are getting a serialization exception when publishing to the union Kafka topics.
We are creating a union Protobuf schema that wraps the other event schemas in the oneof. These event schemas use imports from Google, such as google/type/date.proto, which can't be added as references while evolving schemas in the registry.
We are currently on Schema Registry version 6.1.1 and are not sure whether this is the cause or simply the way it works. Below is the error we are facing for your reference. We are not sure whether any additional setup or configuration is needed in such a scenario. We would appreciate some advice on this!
org.apache.kafka.common.errors.SerializationException: Error serializing Protobuf message
    at io.confluent.kafka.serializers.protobuf.AbstractKafkaProtobufSerializer.serializeImpl(AbstractKafkaProtobufSerializer.java:106) ~[kafka-protobuf-serializer-6.1.1.jar:na]
Caused by: java.io.IOException: Incompatible schema syntax = "proto3";
ERROR 25580 --- [nio-8090-exec-1] o.a.c.c.C.[.[.[.[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [/kafka_producer_ri] threw exception [Request processing failed; nested exception is org.apache.camel.CamelExecutionException: Exception occurred during execution on the exchange: Exchange[3497788AEF9624A-0000000000000000]] with root cause
java.io.IOException: Incompatible schema syntax = "proto3";
Thanks
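For reference, a minimal sketch of the producer-side serializer configuration for such a union topic is shown below; the bootstrap address, registry URL, and topic handling are assumptions for illustration. The auto.register.schemas and use.latest.version settings control whether the serializer tries to register the schema it derives from the message, or looks up the version (and its references, such as google/type/date.proto) that is already registered in the registry.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import com.google.protobuf.Message;

import io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer;

public class UnionTopicProducerSketch {
    public static void main(String[] args) {
        // Sketch only: bootstrap servers and registry URL are assumed values.
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaProtobufSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081");
        // If the union schema and its referenced schemas are already registered,
        // these settings make the serializer use that registered version instead
        // of deriving and registering a new schema on the fly.
        props.put("auto.register.schemas", false);
        props.put("use.latest.version", true);

        try (KafkaProducer<String, Message> producer = new KafkaProducer<>(props)) {
            // Build one of the event types, wrap it in the union message, and send it here.
        }
    }
}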
I am trying to read Avro records stored in S3 in order to put them back into a Kafka topic using the S3 source connector provided by Confluent.
I already have the topics and the registry set up with the right schemas, but when the Connect S3 source tries to serialize my records to the topics, I get this error:
Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: ...
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:121)
    at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:143)
    at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:84)
    ... 15 more
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject com-row-count-value is in read-only mode; error code: 42205
    at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:292)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:352)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:495)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:486)
    at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:459)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:214)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:276)
    at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:252)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:75)
It seems that the Connect producer does not try to fetch the schema ID if the schema already exists, but instead tries to register it, and my registry is read-only.
Does anyone know if this is an issue, or is there some configuration I am missing?
If you're sure the correct schema for that subject is already registered by some other means, you can try to set auto.register.schemas to false in the serializer configuration.
See here for more details: https://docs.confluent.io/platform/current/schema-registry/serdes-develop/index.html#handling-differences-between-preregistered-and-client-derived-schemas
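A minimal sketch of that serializer setting, assuming the registry URL from the question, looks like this; when the serializer runs inside a Connect converter, the same keys are set in the connector configuration prefixed with value.converter. (for example value.converter.auto.register.schemas=false).

import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class NonRegisteringSerializerSketch {
    public static void main(String[] args) {
        // Sketch only: the registry URL is taken from the question and assumed reachable.
        Map<String, Object> config = new HashMap<>();
        config.put("schema.registry.url", "http://registryurl");
        // Look up the already registered schema instead of trying to register one,
        // which is the call that fails against a read-only subject.
        config.put("auto.register.schemas", false);
        config.put("use.latest.version", true);

        KafkaAvroSerializer serializer = new KafkaAvroSerializer();
        serializer.configure(config, false); // false = configure as a value serializer
    }
}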
I was running the Confluent (v5.5.1) S3 sink connector with the config below:
"value.converter":"io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url":"http://registryurl",
"value.converter.value.subject.name.strategy":"io.confluent.kafka.serializers.subject.RecordNameStrategy",
......
And I got an error like the following in the log:
DEBUG Sending GET with input null to http://registryurl/schemas/ids/309?fetchMaxId=false
DEBUG Sending POST with input {......} to http://registryurl/MyRecordName?deleted=true
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro value schema version for id 309
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401
There are two questions that baffle me here:
Why does the sink connector send an additional POST request to the schema registry, given that it is just a consumer? I have successfully received messages using a standard Kafka consumer, which only sends a GET request to the schema registry.
As per this doc and the official doc, the schema subject format should be something like SubjectNamingStrategy-value or -key. However, judging by the log, it does not suffix the request with "-value". I have tried all three strategies and found that only the default TopicNameStrategy works as expected.
I would appreciate it if anyone could shed some light here.
Thanks a lot
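For reference, the documented subject names differ by strategy, and only the default TopicNameStrategy appends a -key/-value suffix; RecordNameStrategy uses the fully qualified record name on its own, which is consistent with a request against a subject named MyRecordName in the log above. A small illustrative sketch (the topic and record names are made up):

public class SubjectNamingExamples {
    public static void main(String[] args) {
        String topic = "my-topic";                           // assumed topic name
        String recordFullName = "com.example.MyRecordName";  // assumed Avro record full name

        // TopicNameStrategy (default): subject is derived from the topic only.
        String topicNameStrategy = topic + "-value";                   // my-topic-value

        // RecordNameStrategy: subject is the fully qualified record name,
        // with no -key/-value suffix.
        String recordNameStrategy = recordFullName;                    // com.example.MyRecordName

        // TopicRecordNameStrategy: subject combines topic and record name.
        String topicRecordNameStrategy = topic + "-" + recordFullName; // my-topic-com.example.MyRecordName

        System.out.println(topicNameStrategy);
        System.out.println(recordNameStrategy);
        System.out.println(topicRecordNameStrategy);
    }
}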
I have a Kafka producer and consumer in different services. The consumer code was rolled out and worked fine; then today I rolled out the producer-side changes and now get the serialization exception below on the consumer. I use a Confluent Avro Schema Registry server, which had also been working fine until today.
org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 59
Caused by: java.lang.ClassCastException: class org.apache.avro.util.Utf8 cannot be cast to class java.lang.String (org.apache.avro.util.Utf8 is in unnamed module of loader org.apache.catalina.loader.ParallelWebappClassLoader #77bd7fe7; java.lang.String is in module java.base of loader 'bootstrap')
at com.mydev.ret.lib.avro.mark.put(Mark.java:132)
As part of this the schema has changed, but that is not the first time that has happened. What might be significant is that we have just moved to Avro 1.9.1 AND kafka-avro-serializer-6.0.0.
Any ideas? Seeing the String and Utf8 issue makes me think there might be an artifact mismatch between producer and consumer.
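One general Avro point worth checking: the runtime hands string fields back as org.apache.avro.util.Utf8 unless the schema or the generated classes ask for java.lang.String (the avro.java.string property / the compiler's stringType setting), so a mismatch between the generated classes and the runtime artifacts can surface exactly as this cast error. A defensive read, with a made-up field name, looks like this sketch:

import org.apache.avro.generic.GenericRecord;

public class Utf8SafeRead {
    // Sketch only: reads a string-typed field without assuming whether the
    // runtime hands back java.lang.String or org.apache.avro.util.Utf8.
    // "someStringField" style field names here are for illustration only.
    static String readString(GenericRecord record, String fieldName) {
        Object value = record.get(fieldName);
        return value == null ? null : value.toString(); // Utf8.toString() yields a String
    }
}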
I tried to update the Spring Kafka version but got an exception.
Spring Kafka version 2.3.4.RELEASE
Spring Boot version 2.2.2.RELEASE
Kafka-clients version 2.3.1
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.kafka.core.KafkaTemplate]: No default constructor found; nested exception is java.lang.NoSuchMethodException: org.springframework.kafka.core.KafkaTemplate.<init>()
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:83)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateBean(AbstractAutowireCapableBeanFactory.java:1312)
... 101 more
Caused by: java.lang.NoSuchMethodException: org.springframework.kafka.core.KafkaTemplate.<init>()
at java.lang.Class.getConstructor0(Class.java:3082)
at java.lang.Class.getDeclaredConstructor(Class.java:2178)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:78)
... 102 more
You need to show your code and configuration and the full stack trace (you should never edit/truncate stack traces here). The error seems quite clear:
Caused by: java.lang.NoSuchMethodException: org.springframework.kafka.core.KafkaTemplate.<init>()
There is no no-arg constructor; it needs a producer factory. We need to see the code and configuration to figure out who is trying to create a template with no producer factory.
Normally, Spring Boot will automatically configure a KafkaTemplate for you.
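For comparison, a typical manual configuration wires the template through a ProducerFactory like the sketch below (the bootstrap address and the String serializers are assumptions); with Spring Boot auto-configuration you normally would not declare these beans yourself.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed address
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        // KafkaTemplate has no no-arg constructor; it must be given a ProducerFactory.
        return new KafkaTemplate<>(producerFactory());
    }
}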
Thank you! The problem was in the tests: I specified the wrong generic type for the KafkaTemplate. I used KafkaTemplate<String, Bytes> instead of the KafkaTemplate<String, Message> that I use in the application code. So I suppose the test Spring context could not find the proper bean to autowire.
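For illustration, the mismatch described above looks roughly like this in a test class (the Message class here is a stand-in for the application's own payload type):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;

class ProducerServiceTest {

    // Wrong: the application context defines no KafkaTemplate<String, Bytes> bean,
    // so this autowiring cannot be satisfied.
    // @Autowired
    // private KafkaTemplate<String, org.apache.kafka.common.utils.Bytes> kafkaTemplate;

    // Right: matches the KafkaTemplate<String, Message> bean defined in the app code.
    @Autowired
    private KafkaTemplate<String, Message> kafkaTemplate;

    // Stand-in for the application's payload class, for illustration only.
    static class Message { }
}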
I'm using KStreams with an Avro schema, hooked up to the Schema Registry. When I start processing the stream, I get a NullPointerException as follows:
Caused by: org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
Caused by: java.lang.NullPointerException: null of int in field SCORE_THRSHLD_EXCD of gbl_au_avro
at org.apache.avro.generic.GenericDatumWriter.npe(GenericDatumWriter.java:145)
at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:139)
at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75)
at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:62)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:92)
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53)
at io.confluent.kafka.streams.serdes.avro.GenericAvroSerializer.serialize(GenericAvroSerializer.java:63)
at io.confluent.kafka.streams.serdes.avro.GenericAvroSerializer.serialize(GenericAvroSerializer.java:39)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:154)
at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:98)
at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:89)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
I'm not sure what the issue is here. Both source and sink have the Schema Registry linked to them and have the schemas defined correctly.
Can you please suggest what I might be doing wrong?
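As a general note, "null of int in field SCORE_THRSHLD_EXCD" is Avro's way of saying the record handed to the serializer has no value for a field whose schema type is a plain (non-nullable) int. A sketch of guarding that field when building the output record follows; the schema variable and the fallback value of 0 are assumptions, and the alternative is to declare the field as a ["null", "int"] union with a null default in the Avro schema.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.generic.GenericRecordBuilder;

public class ScoreRecordSketch {
    // Sketch only: builds the outgoing record and makes sure the non-nullable
    // int field is populated before it reaches the Avro serializer. The schema
    // argument and the default of 0 are assumptions for illustration.
    static GenericRecord build(Schema schema, Integer scoreThresholdExceeded) {
        return new GenericRecordBuilder(schema)
                .set("SCORE_THRSHLD_EXCD",
                        scoreThresholdExceeded != null ? scoreThresholdExceeded : 0)
                .build();
    }
}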