Get 500 error on updating a Pulsar schema of type "JSON" using the admin API

I am trying to update a Pulsar schema of type "JSON" using the admin API.
I have a Pulsar namespace "lol" and a topic "sdf" with a single schema version.
(Screenshot: list of the topic's schema versions.)
I tried to update this schema by POSTing another JSON schema, but received a 500 error.
(Screenshot: POST request and response.)
The Pulsar log shows nothing informative and no stack trace, only the request line:
09:41:36.930 [BookKeeperClientWorker-OrderedExecutor-0-0] INFO org.eclipse.jetty.server.RequestLog - 127.0.0.1 - - [20/Jul/2021:09:41:36 +0300] "POST /admin/v2/schemas/public/lol/sdf/schema HTTP/1.1" 500 565 "-" "PostmanRuntime/7.26.8" 159
When I update a schema of type "AVRO" in the same way, everything works fine and the schema version goes up.
Can anybody help me find the cause of this weird behavior?
Here is the request body:
{"type": "JSON", "schema": "{ \"$id\": \"https://example.com/person.schema.json\", \"$schema\": \"https://json-schema.org/draft/2020-12/schema\", \"title\": \"Person\", \"type\": \"object\", \"properties\": { \"firstName\": { \"type\": \"string\", \"description\": \"The person's first name.\" }, \"lastName\": { \"type\": \"string\", \"description\": \"The person's last name.\" }, \"age\": { \"description\": \"Age in years which must be equal to or greater than zero.\", \"type\": \"integer\", \"minimum\": 0 } }}", "properties": {} }
Here is the current schema definition (GET /admin/v2/schemas/public/lol/sdf/schema):
{
"version": 0,
"type": "JSON",
"timestamp": 0,
"data": "{ \"$id\": \"https://example.com/person.schema.json\", \"$schema\": \"https://json-schema.org/draft/2020-12/schema\", \"title\": \"Person\", \"type\": \"object\", \"properties\": { \"firstName\": { \"type\": \"string\", \"description\": \"The person's first name.\" }, \"surname\": { \"type\": \"string\", \"description\": \"The person's last name.\" }, \"age\": { \"description\": \"Age in years which must be equal to or greater than zero.\", \"type\": \"integer\", \"minimum\": 0 } }}",
"properties": {}

Related

.Net Confluent.Kafka.SchemaRegistry ValueSerializer error

I manually registered a schema in the Schema Registry using a curl command. The registered schema is:
'{ "schema": "{ \"type\": \"record\", \"name\": \"Person\", \"namespace\": \"com.xxx\", \"fields\": [ { \"name\": \"firstName\", \"type\": \"string\" }, { \"name\": \"lastName\", \"type\": \"string\" }, { \"name\": \"age\", \"type\": \"long\" } ]}" }'
I created .NET code following https://github.com/confluentinc/confluent-kafka-dotnet/blob/master/examples/JsonSerialization/Program.cs but I am getting the error below:
One or more errors occurred. (Local: Value serialization error)
The JSON schema corresponding to the written data:
{"type":"record","name":"Person","namespace":"com.xxx","fields":[{"name":"firstName","type":"string"},{"name":"lastName","type":"string"},{"name":"age","type":"long"}]}

Invalid story format: failed to parse story while posting to the Rasa X HTTP API

I am trying to create a story using POST in the Postman tool, and below is my story format.
I am using this format because a GET request returned the story in the same format.
{
"id": 65,
"name": "interactive_story_65",
"story": "35 interactive_story_65\n* emp_info\n - utter_employee",
"annotation": {
"user": "me",
"time": 1597919151.8836874962
},
"filename": "data\\stories.md"
}
But I am getting the error below:
{
"version": "0.31.0",
"status": "failure",
"message": "Failed to parse story.",
"reason": "StoryParseError",
"details": "Invalid story format. Failed to parse '## {\r\n \"id\": 65,\r\n \"name\": \"interactive_story_65\",\r\n \"story\": \"## interactive_story_65\\n* emp_info\\n - utter_employee\",\r\n \"annotation\": {\r\n \"user\": \"me\",\r\n \"time\": 1597919151.8836874962\r\n },\r\n \"filename\": \"data\\\\stories.md\"\r\n }'",
"help": null,
"code": 400
}
Please help.
This endpoint actually expects plain markdown, with text/x-markdown as the Content-Type header. If you look closely at the docs, you'll see that you're using the response schema as the request schema - I did that too at first. The request body is just a markdown string, e.g.
curl --request PUT \
--url http://localhost:5002/api/stories \
--header 'authorization: Bearer <Token>' \
--header 'content-type: text/x-markdown' \
--data '## greet
* greet
- utter_greet\n'

Backward Compatibility issue and uncertainty in Schema Registry

I have a use case where I have a JSON document, and I want to generate a schema and a record out of it and publish the record.
I have configured the value serializer, and the schema compatibility setting is BACKWARD.
First JSON:
String json = "{\n" +
" \"id\": 1,\n" +
" \"name\": \"Headphones\",\n" +
" \"price\": 1250.0,\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
Version 1 of the schema was registered, and I received the message in the Avro console consumer.
Second JSON:
String json = "{\n" +
" \"id\": 1,\n" +
" \"price\": 1250.0,\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
The schema registered successfully and the message was sent.
Then I tried sending the first JSON again; it was sent successfully.
Third JSON:
String json = "{\n" +
" \"id\": 1,\n" +
" \"name\": \"Headphones\",\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
This case failed with the following error:
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409
How is it that the schema generated from the second JSON was registered while the third one was rejected, even though I didn't provide a default for the deleted field? Does the Schema Registry always accept the first evolution (the 2nd schema over the 1st)?
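One way to see the registry's verdict before publishing is its compatibility-check endpoint (the subject name myschema-value, the localhost:8081 address, and the candidate.json file name are assumptions):
curl -X POST http://localhost:8081/compatibility/subjects/myschema-value/versions/latest \
  -H 'Content-Type: application/vnd.schemaregistry.v1+json' \
  --data @candidate.json   # candidate.json wraps the Avro schema as {"schema": "..."}
The response is {"is_compatible": true} or {"is_compatible": false}.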
Schemas in the Schema Registry:
Version 1:
{
"fields": [
{
"doc": "Type inferred from '1'",
"name": "id",
"type": "int"
},
{
"doc": "Type inferred from '\"Headphones\"'",
"name": "name",
"type": "string"
},
{
"doc": "Type inferred from '1250.0'",
"name": "price",
"type": "double"
},
{
"doc": "Type inferred from '[\"home\",\"green\"]'",
"name": "tags",
"type": {
"items": "string",
"type": "array"
}
}
],
"name": "myschema",
"type": "record" }
Version 2:
{
"fields": [
{
"doc": "Type inferred from '1'",
"name": "id",
"type": "int"
},
{
"doc": "Type inferred from '1250.0'",
"name": "price",
"type": "double"
},
{
"doc": "Type inferred from '[\"home\",\"green\"]'",
"name": "tags",
"type": {
"items": "string",
"type": "array"
}
}
],
"name": "myschema",
"type": "record" }
Let's go over the backward compatibility rules: https://docs.confluent.io/current/schema-registry/avro.html#compatibility-types
First, the default compatibility mode isn't transitive, so version 3 is only checked against version 2.
The BACKWARD rule says you can delete fields or add optional fields (those with a default). I assume your schema generator tool doesn't know how to emit optionals, so you're effectively only allowed to delete, not add.
Between versions 1 and 2, you deleted the name field, which is valid.
Between version 2 and the incoming version 3, the registry sees a new schema that removes price (which is okay) but adds a required name field, which is not allowed.
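For illustration, a version of the third schema that would pass the BACKWARD check gives the reintroduced name field a default, for example via a nullable union (a sketch, not what the generator produced):
{
  "fields": [
    { "name": "id", "type": "int" },
    { "name": "name", "type": ["null", "string"], "default": null },
    { "name": "tags", "type": { "items": "string", "type": "array" } }
  ],
  "name": "myschema",
  "type": "record"
}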

Insert a note in a Google Sheet with google-sheets-api

I'm looking at the following documentation for Google Sheets, and I'm wondering how to add/update a cell with a note attached to it.
https://developers.google.com/sheets/api/reference/rest/v4/spreadsheets.values/update
PUT https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}/values/{range}
I can't seem to find a sample request body for adding a note. I tried to send the following PUT request, but of course it returned a 400.
{
"values": [
[
{
"note": "sample note"
}
]
]
}
This returns the following error:
{
"error": {
"code": 400,
"message": "Invalid values[17][0]: struct_value {\n fields {\n key: \"note\"\n value {\n string_value: \"sample note\"\n }\n }\n}\n",
"status": "INVALID_ARGUMENT"
}
}
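For what it's worth, values.update only accepts plain cell values; notes live on CellData, so they are normally written through spreadsheets.batchUpdate with an updateCells request. A sketch (the sheetId of 0 and the A1 cell coordinates are assumptions):
POST https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}:batchUpdate
{
  "requests": [
    {
      "updateCells": {
        "range": { "sheetId": 0, "startRowIndex": 0, "endRowIndex": 1, "startColumnIndex": 0, "endColumnIndex": 1 },
        "rows": [ { "values": [ { "note": "sample note" } ] } ],
        "fields": "note"
      }
    }
  ]
}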