I have a simple case class:
case class User(id: String, login: String, key: String)
I add a field "name":
case class User(id: String, login: String, name: String, key: String)
and then add this field to the Avro schema (user.avsc):
{
"namespace": "test",
"type": "record",
"name": "User",
"fields": [
{ "name": "id", "type": "string" },
{ "name": "login", "type": "string" },
{ "name": "name", "type": "string" },
{ "name": "key", "type": "string" }
]
}
this class is used in another case class:
case class AuthRequest(user: User, session: String)
schema (auth_request.avsc):
{
"namespace": "test",
"type": "record",
"name": "AuthRequest",
"fields": [
{ "name": "user", "type": "User" },
{ "name": "session", "type": "string" }
]
}
After that change, my consumer starts throwing exceptions:
Consumer.committableSource(consumerSettings, Subscriptions.topics("token_service_auth_request"))
.map { msg =>
Try {
val in: ByteArrayInputStream = new ByteArrayInputStream(msg.record.value())
val input: AvroBinaryInputStream[AuthRequest] = AvroInputStream.binary[AuthRequest](in)
val result: AuthRequest = input.iterator.toSeq.head // !!! the exception is thrown here
msg.committableOffset.commitScaladsl()
(msg.record.value(), result, msg.record.key())
} match {
case Success((a: Array[Byte], value: AuthRequest, key: String)) =>
log.info(s"listener got $msg -> $a -> $value")
context.parent ! value
case Failure(e) => e.printStackTrace()
}
}
.runWith(Sink.ignore)
java.util.NoSuchElementException: head of empty stream
    at scala.collection.immutable.Stream$Empty$.head(Stream.scala:1104)
    at scala.collection.immutable.Stream$Empty$.head(Stream.scala:1102)
    at test.consumers.AuthRequestListener.$anonfun$new$2(AuthRequestListener.scala:39)
    at scala.util.Try$.apply(Try.scala:209)
    at test.consumers.AuthRequestListener.$anonfun$new$1(AuthRequestListener.scala:36)
    at test.consumers.AuthRequestListener.$anonfun$new$1$adapted(AuthRequestListener.scala:35)
    at akka.stream.impl.fusing.Map$$anon$9.onPush(Ops.scala:51)
    at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:519)
    at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:482)
    at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:378)
    at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:588)
    at akka.stream.impl.fusing.GraphInterpreterShell$AsyncInput.execute(ActorGraphInterpreter.scala:472)
    at akka.stream.impl.fusing.GraphInterpreterShell.processEvent(ActorGraphInterpreter.scala:563)
    at akka.stream.impl.fusing.ActorGraphInterpreter.akka$stream$impl$fusing$ActorGraphInterpreter$$processEvent(ActorGraphInterpreter.scala:745)
    at akka.stream.impl.fusing.ActorGraphInterpreter$$anonfun$receive$1.applyOrElse(ActorGraphInterpreter.scala:760)
    at akka.actor.Actor.aroundReceive(Actor.scala:517)
    at akka.actor.Actor.aroundReceive$(Actor.scala:515)
    at akka.stream.impl.fusing.ActorGraphInterpreter.aroundReceive(ActorGraphInterpreter.scala:670)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:588)
    at akka.actor.ActorCell.invoke(ActorCell.scala:557)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
    at akka.dispatch.Mailbox.run(Mailbox.scala:225)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
    at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
I tried cleaning the build and invalidating caches; it seems like the previous version of the schema is being cached somewhere.
Help please!
You need to make your change backward compatible by making the new field nullable and adding a default value to it:
{
"namespace": "test",
"type": "record",
"name": "User",
"fields": [
{ "name": "id", "type": "string" },
{ "name": "login", "type": "string" },
{ "name": "name", "type": ["null", "string"], "default": null },
{ "name": "key", "type": "string" }
]
}
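On the Scala side, a minimal sketch of the matching change (assuming the case class itself is what gets serialized with avro4s, which derives a nullable union for Option fields) is to make the new field an Option with a default of None:
case class User(
  id: String,
  login: String,
  name: Option[String] = None, // corresponds to ["null", "string"] with "default": null in user.avsc
  key: String
)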
I am trying to generate classes using avrohugger (https://github.com/julianpeeters/avrohugger#description).
Here is my schema:
{
"name": "test1",
"namespace": "test.testaero",
"type": "map",
"values": [
{
"type": "map",
"values": [
"boolean",
{
"type": "map",
"values": [
"null",
"string",
"boolean",
{
"type": "map",
"values": [
"null",
"string",
"boolean",
"int",
{
"type": "map",
"values": [
"null",
"string",
"int"
],
"default": null
}
],
"default": null
}
],
"default": null
}
]
}
]
}
And the code:
import java.io.File

import avrohugger.Generator
import avrohugger.format.{SpecificRecord, Standard}
import avrohugger.types.AvroScalaTypes

object AvroParser extends App {
  val inputPath = "app/dto/roman/src/main/resources/tests.avsc"
  val outPutPath = "src/main/scala"
  val schemaFile = new File(inputPath)
  private val scalaTypes: AvroScalaTypes = SpecificRecord.defaultTypes.copy(map = avrohugger.types.ScalaMap)
  val generator = new Generator(Standard, avroScalaCustomTypes = Some(scalaTypes))
  generator.fileToFile(schemaFile, outPutPath)
}
The top-level type in my schema is a map, and I am failing in this function:
def getSchemaOrProtocols(
infile: File,
format: SourceFormat,
classStore: ClassStore,
classLoader: ClassLoader,
parser: Parser = schemaParser): List[Either[Schema, Protocol]] = {
def unUnion(schema: Schema) = {
schema.getType match {
case UNION => schema.getTypes().asScala.toList
case RECORD => List(schema)
case ENUM => List(schema)
case FIXED => List(schema)
case _ => sys.error("""Neither a record, enum nor a union of either.
|Nothing to map to a definition.""".trim.stripMargin)
}
}
where the map type does not match any of the cases above. How can I adapt the schema, or am I not passing the right arguments?
I am working on correctly documenting our REST endpoints. As an example, I created a sample "Healthcheck getStatus()" endpoint which returns an object called "EndpointStatus" with 3 fields (the class is below). I was able to get this object documented correctly using the camel-swagger-java component and the REST configuration / definition below:
restConfiguration()
.apiContextPath(apiContextPath)
.apiProperty("api.title", "Camel Service").apiProperty("api.version", "1.0.0")
// and enable CORS
.apiProperty("cors", "true");
rest()
.path("/healthcheck")
.description("Health Check REST service")
.get("getStatus/{endpointName}")
.param()
.name("endpointName")
.type(RestParamType.path)
.allowableValues(
Stream.of(EndpointName.values())
.map(EndpointName::name)
.collect(Collectors.toList()))
.required(true)
.endParam()
.description("Get Camel Status")
.id("getStatus")
.outType(EndpointStatus.class)
.bindingMode(RestBindingMode.auto)
.responseMessage().code(200).message("Returns an EndpointStatus object representing state of a camel endpoint").endResponseMessage()
.to(CAMEL_STATUS_URI);
Here are the annotations I used on this class:
@ApiModel(description = "Endpoint Status Model")
public class EndpointStatus {
private boolean isAvailable;
private EndpointName name;
private long timestamp;
#ApiModelProperty(value = "Is the endpoint available", required = true)
public boolean isAvailable() {
return isAvailable;
}
public void setAvailable(boolean available) {
isAvailable = available;
}
#ApiModelProperty(value = "The name of the endpoint", required = true)
public EndpointName getName() {
return name;
}
public void setName(EndpointName name) {
this.name = name;
}
#ApiModelProperty(value = "The timestamp the endpoint was checked", required = true)
public long getTimestamp() {
return timestamp;
}
public void setTimestamp(long timestamp) {
this.timestamp = timestamp;
}
}
Along with the generated swagger documentation:
{
"swagger": "2.0",
"info": {
"version": "1.0.0",
"title": "Camel Service"
},
"host": "localhost:9000",
"tags": [
{
"name": "healthcheck",
"description": "Health Check REST service"
}
],
"schemes": [
"http"
],
"paths": {
"/healthcheck/getStatus/{endpointName}": {
"get": {
"tags": [
"healthcheck"
],
"summary": "Get Camel Status",
"operationId": "getStatus",
"parameters": [
{
"name": "endpointName",
"in": "path",
"required": true,
"type": "string",
"enum": [
"ENDPOINTA",
"ENDPOINTB"
]
}
],
"responses": {
"200": {
"description": "Returns an EndpointStatus object representing state of a camel endpoint",
"schema": {
"$ref": "#/definitions/EndpointStatus"
}
}
}
}
}
},
"definitions": {
"EndpointStatus": {
"type": "object",
"required": [
"available",
"name",
"timestamp"
],
"properties": {
"name": {
"type": "string",
"description": "The name of the endpoint",
"enum": [
"ENDPOINTA",
"ENDPOINTB"
]
},
"timestamp": {
"type": "integer",
"format": "int64",
"description": "The timestamp the endpoint was checked"
},
"available": {
"type": "boolean",
"description": "Is the endpoint available"
}
},
"description": "Endpoint Status Model"
}
}
}
However, when moving to camel-openapi-java, which supports OpenAPI Specification v3, with the same setup I get EndpointStatus without any fields / descriptions in my documentation:
{
"openapi": "3.0.2",
"info": {
"title": "SurePath Camel Service",
"version": "1.0.0"
},
"servers": [
{
"url": ""
}
],
"paths": {
"/healthcheck/getStatus/{endpointName}": {
"get": {
"tags": [
"healthcheck"
],
"parameters": [
{
"name": "endpointName",
"schema": {
"enum": [
"ENDPOINTA",
"ENDPOINTB"
],
"type": "string"
},
"in": "path",
"required": true
}
],
"responses": {
"200": {
"description": "Returns an EndpointStatus object representing state of a camel endpoint"
}
},
"operationId": "getStatus",
"summary": "Get Camel Status",
"x-camelContextId": "camel-1",
"x-routeId": "getStatus"
}
},
"/healthcheck/isAvailable": {
"get": {
"tags": [
"healthcheck"
],
"responses": {
"200": {
"description": "Returns status code 200 when Camel is available"
}
},
"operationId": "verb1",
"summary": "Is Camel Available",
"x-camelContextId": "camel-1",
"x-routeId": "route4"
}
}
},
"components": {
"schemas": {
"EndpointStatus": {
"type": "EndpointStatus",
"x-className": {
"format": "com.sample.bean.EndpointStatus",
"type": "string"
}
}
}
},
"tags": [
{
"name": "healthcheck",
"description": "Health Check REST service"
}
]
}
I have tried adding this to my responseMessage and it is still not documented correctly:
responseMessage().code(200).responseModel(EndpointStatus.class).message("Returns an EndpointStatus object representing state of a camel endpoint").endResponseMessage()
Do I need different annotations / RestDefinition config to get this EndpointStatus class appearing correctly in the OpenAPI documentation?
This looks to be an issue with the camel-openapi-java component at the moment; waiting for a resolution of this JIRA: https://issues.apache.org/jira/browse/CAMEL-15158
I am trying to write a function to calculate a diff between two avro schemas and generate another schema.
schema_one = {
"type": "record",
"name": "schema_one",
"namespace": "test",
"fields": [
{
"name": "type",
"type": "string"
},
{
"name": "id",
"type": "string"
}
]
}
schema_two = {
"type": "record",
"name": "schema_two",
"namespace": "test",
"fields": [
{
"name": "type",
"type": "string"
}
]
}
To get the fields in schema_one that are not in schema_two:
import scala.collection.JavaConverters._
import org.apache.avro.Schema._
import org.apache.avro.{Schema, SchemaBuilder}

val diff: Set[Schema.Field] = schema_one.getFields.asScala.toSet.filterNot(schema_two.getFields.asScala.toSet)
So far, so good.
I want to build a new schema from diff and I expect it to be:
schema_three = {
"type": "record",
"name": "schema_three",
"namespace": "test",
"fields": [
{
"name": "id",
"type": "string"
}
]
}
I can't seem to find any method in Avro's SchemaBuilder to achieve this without having to explicitly provide named fields, i.e. build a Schema given a collection of Schema.Field.
For example:
SchemaBuilder.record("schema_three").namespace("test").fromFields(diff)
Is there a way to achieve this? I'd appreciate any comments.
I was able to achieve this using the Kite SDK ("org.kitesdk" % "kite-data-core" % "1.1.0"):
import scala.collection.JavaConverters._
import org.apache.avro.SchemaBuilder
import org.kitesdk.data.spi.SchemaUtil // from kite-data-core

val schema_namespace = schema_one.getNamespace
val schema_name = schema_one.getName

// Wrap each diffed field in its own single-field record schema...
val schemas = diff.map { f =>
  SchemaBuilder
    .record(schema_name)
    .namespace(schema_namespace)
    .fields()
    .name(f.name())
    .`type`(f.schema())
    .noDefault()
    .endRecord()
}

// ...and let Kite merge them into a single record schema.
val schema_three = SchemaUtil.merge(schemas.asJava)
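If pulling in kite-data-core is not an option, here is a minimal sketch using only Avro's SchemaBuilder, folding the diffed fields into a single field assembler (the record name "schema_three" and namespace "test" are taken from the expected output above):
import org.apache.avro.{Schema, SchemaBuilder}

// Fold each diffed Schema.Field into one record builder.
val schema_three: Schema = diff
  .foldLeft(SchemaBuilder.record("schema_three").namespace("test").fields()) {
    (assembler, f) => assembler.name(f.name()).`type`(f.schema()).noDefault()
  }
  .endRecord()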
I am trying to familiarize myself with the Reindex API of Elasticsearch and the use of Painless scripts.
I have the following model:
"mappings": {
"customer": {
"properties": {
"firstName": {
"type": "text",
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
}
},
"lastName": {
"type": "text",
"fields": {
"keyword": {
"ignore_above": 256,
"type": "keyword"
}
}
},
"dateOfBirth": {
"type": "date"
}
}
}
}
I would like to reindex all documents from test-v1 to test-v2 and apply a few transformations to them (for example extract the year part of dateOfBirth, convert a date value to a timestamp, etc.) and save the result as a new field. But I ran into an issue when I tried to access the date field.
When I made the following call, I got an error:
POST /_reindex?pretty=true&human=true&wait_for_completion=true HTTP/1.1
Host: localhost:9200
Content-Type: application/json
{
"source": {
"index": "test-v1"
},
"dest": {
"index": "test-v2"
},
"script": {
"lang": "painless",
"inline": "ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();"
}
}
And the response:
{
"error": {
"root_cause": [
{
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
" ^---- HERE"
],
"script": "ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
"lang": "painless"
}
],
"type": "script_exception",
"reason": "runtime error",
"script_stack": [
"ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
" ^---- HERE"
],
"script": "ctx._source.yearOfBirth = ctx._source.dateOfBirth.getYear();",
"lang": "painless",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "Unable to find dynamic method [getYear] with [0] arguments for class [java.lang.String]."
}
},
"status": 500
}
According to this tutorial, date fields are exposed as ReadableDateTime, so they support methods like getYear and getDayOfWeek, and indeed the reference lists those as supported methods.
Still, the response reports [java.lang.String] as the type of the dateOfBirth property. I could just parse it into e.g. an OffsetDateTime, but I wonder why it is a string.
Does anyone have a suggestion as to what I'm doing wrong?
Considering 2 sets of data as follows:
JSON1 => {
  "data": [
    {
      "id": "1-abc",
      "model": "Agile",
      "status": "open",
      "configuration": {
        "state": "running",
        "rootVolumeSize": "0.00000",
        "count": "2",
        "type": "large",
        "platform": "Linux"
      },
      "stateId": "123-567"
    }
  ]
}

JSON2 => {
  "data": [
    {
      "id": "1-abc",
      "model": "Agile",
      "configuration": {
        "state": "running",
        "diskSize": "0",
        "type": "small",
        "platform": "Windows"
      }
    }
  ]
}
I need to compare JSON1 and JSON2 based on the first field, id, and if they match, merge JSON1 into JSON2, retaining the existing values in JSON2 (only appending fields that are not present).
I have coded this as below:
private def merger(JSON1: Seq[JSON], JSON2: Seq[JSON]): Seq[JSON] = {
  val abcKey = JSON1.groupBy(_.id) map { case (k, v) => (k, v.head) }
  val mergedRecords = for {
    xyzJSON <- JSON2
  } yield (
    abcKey.get(xyzJSON.id) match {
      case Some(json1) => xyzJSON.copy(status = json1.status,
                                       stateId = json1.stateId)
      case None => xyzJSON.copy(origin = "N/A")
    }
  )
  mergedRecords
}
I am not able to arrive at a solution for reconciling the fields within the configuration map.
The expected result should be like:
{
  "data": [
    {
      "id": "1-abc",
      "model": "Agile",
      "status": "open",
      "configuration": {
        "state": "running",
        "diskSize": "0",
        "rootVolumeSize": "0.00000",
        "count": "2",
        "type": "small",
        "platform": "Windows"
      },
      "stateId": "123-567"
    }
  ]
}
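A minimal sketch of the "append only what is missing" reconciliation, assuming the configuration field is modelled as a Map[String, String] on a hypothetical, simplified JSON case class: keys already present in JSON2 keep their values, and keys that only exist in JSON1 are appended.
// Hypothetical, simplified shape of the case class used above.
case class JSON(
  id: String,
  status: Option[String] = None,
  stateId: Option[String] = None,
  configuration: Map[String, String] = Map.empty
)

// Appending the JSON1 map first and the JSON2 map second means JSON2's
// values win on shared keys, while JSON1-only keys are carried over.
def reconcileConfiguration(json1: JSON, json2: JSON): Map[String, String] =
  json1.configuration ++ json2.configuration

// Example with the data above: the result keeps type=small and
// platform=Windows from JSON2 and gains rootVolumeSize and count from JSON1.
val json1 = JSON("1-abc", Some("open"), Some("123-567"),
  Map("state" -> "running", "rootVolumeSize" -> "0.00000", "count" -> "2",
      "type" -> "large", "platform" -> "Linux"))
val json2 = JSON("1-abc",
  configuration = Map("state" -> "running", "diskSize" -> "0",
                      "type" -> "small", "platform" -> "Windows"))
val merged = json2.copy(
  status = json1.status,
  stateId = json1.stateId,
  configuration = reconcileConfiguration(json1, json2)
)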