Confluent Schema Registry unable to save with nested object [duplicate] - apache-kafka

Whenever there is a nested object in the Avro class, the schema does not get saved, and I always get an exception like:
org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
connect_1 | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:223)
connect_1 | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:149)
connect_1 | at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:330)
connect_1 | at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:356)
connect_1 | at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:258)
connect_1 | at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
connect_1 | at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
connect_1 | at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
connect_1 | at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
connect_1 | at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
connect_1 | at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
connect_1 | at java.base/java.lang.Thread.run(Thread.java:829)
connect_1 | Caused by: org.apache.avro.SchemaParseException: Can't redefine: io.confluent.connect.avro.ConnectDefault
connect_1 | at org.apache.avro.Schema$Names.put(Schema.java:1550)
connect_1 | at org.apache.avro.Schema$NamedSchema.writeNameRef(Schema.java:813)
connect_1 | at org.apache.avro.Schema$RecordSchema.toJson(Schema.java:975)
connect_1 | at org.apache.avro.Schema$UnionSchema.toJson(Schema.java:1242)
connect_1 | at org.apache.avro.Schema$RecordSchema.fieldsToJson(Schema.java:1003)
connect_1 | at org.apache.avro.Schema$RecordSchema.toJson(Schema.java:987)
connect_1 | at org.apache.avro.Schema.toString(Schema.java:426)
connect_1 | at org.apache.avro.Schema.toString(Schema.java:398)
connect_1 | at org.apache.avro.Schema.toString(Schema.java:389)
connect_1 | at io.apicurio.registry.serde.avro.AvroKafkaSerializer.getSchemaFromData(AvroKafkaSerializer.java:108)
connect_1 | at io.apicurio.registry.serde.AbstractKafkaSerializer.lambda$serialize$0(AbstractKafkaSerializer.java:90)
connect_1 | at io.apicurio.registry.serde.LazyLoadedParsedSchema.getRawSchema(LazyLoadedParsedSchema.java:55)
connect_1 | at io.apicurio.registry.serde.DefaultSchemaResolver.resolveSchema(DefaultSchemaResolver.java:81)
connect_1 | at io.apicurio.registry.serde.AbstractKafkaSerializer.serialize(AbstractKafkaSerializer.java:92)
connect_1 | at io.apicurio.registry.serde.AbstractKafkaSerializer.serialize(AbstractKafkaSerializer.java:79)
connect_1 | at io.apicurio.registry.utils.converter.SerdeBasedConverter.fromConnectData(SerdeBasedConverter.java:111)
connect_1 | at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:64)
connect_1 | at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$3(WorkerSourceTask.java:330)
connect_1 | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:173)
connect_1 | at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:207)
connect_1 | ... 11 more
Tools used:
Debezium
Apicurio Schema Registry
Avro format
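For background: this error typically means the generated Connect schema contains more than one anonymous nested struct, so each one gets the same placeholder name io.confluent.connect.avro.ConnectDefault, which Avro then rejects as a redefinition when the schema is serialized. A minimal sketch of the shape that avoids this, using a hypothetical Customer record (not the actual schema here), is to give every nested record its own explicit, unique name:

{
  "type": "record",
  "name": "Customer",
  "namespace": "com.example",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "billing", "type": {
        "type": "record",
        "name": "BillingAddress",
        "doc": "explicitly named; two anonymous records here would both become ConnectDefault",
        "fields": [ { "name": "city", "type": "string" } ]
    } },
    { "name": "shipping", "type": {
        "type": "record",
        "name": "ShippingAddress",
        "fields": [ { "name": "city", "type": "string" } ]
    } }
  ]
}

With Debezium, this usually comes down to the emitted Connect schema containing unnamed nested structs; making sure every nested record ends up with its own name sidesteps the redefinition.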

Related

Unable to register schema using Avro with nested class


Kafka Connect | Failed to deserialize data for topic | Error retrieving Avro key/value schema version for id | Subject not found, error code: 40401

First of all, thanks to @OneCricketeer for your support so far. I have tried so many configurations by now that I don't know what else I could try.
I am using Confluent's connect-standalone with worker.properties and sink.properties to access an external stream.
The connection is working and I can see that an offset is loaded:
INFO [my_mysql_sink|task-0] [Consumer clientId=connector-consumer-my_mysql_sink-0, groupId=connect-my_mysql_sink] Setting offset for partition gamerboot.gamer.master.workouts.clubs.spieleranalyse-1 to the committed offset FetchPosition{offset=2225, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[kafka8.pro.someurl.net:9093 (id: 8 rack: null)], epoch=0}} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:844)
But afterwards I receive an error when new messages come in:
ERROR [my_mysql_sink|task-0] WorkerSinkTask{id=my_mysql_sink-0} Error converting message key in topic 'gamerboot.gamer.master.workouts.clubs.spieleranalyse' partition 1 at offset 2225 and timestamp 1641459346507: Failed to deserialize data for topic gamerboot.gamer.master.workouts.clubs.spieleranalyse to Avro:
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro key schema version for id 422
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401
I do not get this.
worker.properties:
key.converter=io.confluent.connect.avro.AvroConverter
value.converter=io.confluent.connect.avro.AvroConverter
sink.properties:
#key.converter.enhanced.avro.schema.support=true
#key.converter=org.apache.kafka.connect.storage.StringConverter
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=https://schema-reg.pro.someurl.net
#value.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=https://schema-reg.pro.someurl.net
#key.converter.key.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
#value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
#key.converter.key.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
#value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
#pk.mode=record_key
#pk.fields=
As there is no primary key set in MySQL, I wanted to record everything from the stream.
Since it says "Error retrieving Avro key schema version for id 422", I looked that id up and can see the following:
[screenshot of the registry response for schema id 422]
Do not be confused that it says JSON; that is just my Chrome plugin, which renders the response as JSON.
The same is found for the value. I also tried every combination that is commented out in sink.properties above.
I am also able to curl the latest schema for key and value, like:
curl -s https://schema-reg.pro.someurl.net/subjects/gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutKey/versions/latest | jq
{
  "type": "record",
  "name": "ClubWorkoutKey",
  "namespace": "com.ad.gamerboot.kafka.models.workouts",
  "fields": [
    { "name": "playerId", "type": "string" },
    { "name": "tagId", "type": ["null", "string"], "default": null }
  ]
}
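As a sanity check, assuming the default TopicNameStrategy, the AvroConverter looks for subjects named <topic>-key and <topic>-value. Listing all subjects in the registry shows whether those exist for this topic:

curl -s https://schema-reg.pro.someurl.net/subjects | jq

If only the RecordName-style subjects shown above come back and there is no gamerboot.gamer.master.workouts.clubs.spieleranalyse-key subject, the default strategy cannot resolve the schema behind id 422, which matches the 40401 "Subject not found" error.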
It got a little further when I set StringConverter for key.converter and value.converter in sink.properties, but that must be wrong in my opinion, because Avro is what is actually being passed here. With StringConverter there were then other problems; I would have had to set a primary key, switch on deletes, and so on.
Thanks for the support.
EDIT:
So, what was given to me was:
topic = gamerboot.gamer.master.workouts.clubs.spieleranalyse
schema.url = https://schema-reg.pro.someurl.net
as well as the schema id URLs:
https://schema-reg.pro.someurl.net/subjects/gamerboot.gamer.master.workouts-com.ad.gamerboot.kafka.models.workouts.WorkoutKickValue/versions/latest/schema
and:
https://schema-reg.pro.someurl.net/subjects/gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutKickValue/versions/latest
For me it is like a puzzle; I started with Kafka 20 days ago. From there I tried the URLs around and found the ones I posted for the subjects:
For Key: https://schema-reg.pro.someurl.net/subjects/gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutKey/versions/latest/
Schema: {"subject":"gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutKey","version":1,"id":422,"schema":"{\"type\":\"record\",\"name\":\"ClubWorkoutKey\",\"namespace\":\"com.ad.gamerboot.kafka.models.workouts\",\"fields\":[{\"name\":\"playerId\",\"type\":\"string\"},{\"name\":\"tagId\",\"type\":[\"null\",\"string\"],\"default\":null}]}"}
For Values: https://schema-reg.pro.someurl.net/subjects/gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutKickValue/versions/latest/
and https://schema-reg.pro.someurl.net/subjects/gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutPlayerMotionValue/versions/latest/
Schemas: {"subject":"gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutKickValue","version":1,"id":423,"schema":"{\"type\":\"record\",\"name\":\"ClubWorkoutKickValue\",\"namespace\":\"com.ad.gamerboot.kafka.models.workouts\",\"fields\":[{\"name\":\"playerId\",\"type\":\"string\"},{\"name\":\"timestamp\",\"type\":{\"type\":\"long\",\"logicalType\":\"timestamp-millis\"}},{\"name\":\"tagId\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"ballSpeed\",\"type\":[\"null\",\"int\"],\"default\":null},{\"name\":\"ballSpeedFloat\",\"type\":[\"null\",\"float\"],\"default\":null},{\"name\":\"ballSpeedZone\",\"type\":{\"type\":\"enum\",\"name\":\"BallSpeedZone\",\"symbols\":[\"COLD\",\"MEDIUM\",\"HOT\",\"FIRE\",\"INVALID\"]}},{\"name\":\"confidence\",\"type\":[\"null\",\"int\"],\"default\":null},{\"name\":\"ingestionTime\",\"type\":[\"null\",{\"type\":\"long\",\"logicalType\":\"timestamp-millis\"}],\"default\":null}]}"}
and: {"subject":"gamerboot.gamer.master.club-com.ad.gamerboot.kafka.models.workouts.WorkoutPlayerMotionValue","version":1,"id":424,"schema":"{\"type\":\"record\",\"name\":\"ClubWorkoutPlayerMotionValue\",\"namespace\":\"com.ad.gamerboot.kafka.models.workouts\",\"fields\":[{\"name\":\"playerId\",\"type\":\"string\"},{\"name\":\"timestamp\",\"type\":{\"type\":\"long\",\"logicalType\":\"timestamp-millis\"}},{\"name\":\"absoluteDistance\",\"type\":\"float\"},{\"name\":\"averageSpeed\",\"type\":\"float\"},{\"name\":\"peakSpeed\",\"type\":\"float\"},{\"name\":\"tagId\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"installationId\",\"type\":[\"null\",\"string\"],\"default\":null},{\"name\":\"averageSpeedZone\",\"type\":[\"null\",{\"type\":\"enum\",\"name\":\"AverageSpeedZone\",\"symbols\":[\"SPRINT\",\"HIGH_SPEED_RUN\",\"RUN\",\"JOG\",\"WALK\",\"STAND\",\"INVALID\"]}],\"default\":null,\"aliases\":[\"speedZone\"]},{\"name\":\"peakSpeedZone\",\"type\":[\"null\",{\"type\":\"enum\",\"name\":\"PeakSpeedZone\",\"symbols\":[\"SPRINT\",\"HIGH_SPEED_RUN\",\"RUN\",\"JOG\",\"WALK\",\"STAND\",\"INVALID\"]}],\"default\":null},{\"name\":\"ingestionTime\",\"type\":[\"null\",{\"type\":\"long\",\"logicalType\":\"timestamp-millis\"}],\"default\":null}]}"}
MySQL table:
+------------------+----------------------------------------------------------------------+------+-----+---------+-------+
| Field | Type | Null | Key | Default | Extra |
+------------------+----------------------------------------------------------------------+------+-----+---------+-------+
| playerid | varchar(100) | YES | | NULL | |
| timestamp | mediumtext | YES | | NULL | |
| absoluteDistance | float | YES | | NULL | |
| avarageSpeed | float | YES | | NULL | |
| peakSpeed | float | YES | | NULL | |
| tagId | varchar(50) | YES | | NULL | |
| installationId | varchar(100) | YES | | NULL | |
| averageSpeedZone | enum('SPRINT','HIGH_SPEED_RUN','RUN','JOG','WALK','STAND','INVALID') | YES | | NULL | |
| peakSpeedZone | enum('SPRINT','HIGH_SPEED_RUN','RUN','JOG','WALK','STAND','INVALID') | YES | | NULL | |
| ballSpeed | int(11) | YES | | NULL | |
| ballSpeedFloat | float | YES | | NULL | |
| ballSpeedZone | enum('COLD','MEDIUM','HOT','FIRE','INVALID') | YES | | NULL | |
| confidence | int(11) | YES | | NULL | |
| ingestionTime | mediumtext | YES | | NULL | |
+------------------+----------------------------------------------------------------------+------+-----+---------+-------+
Data expected in MySQL:
+--------------------------------------+---------------+------------------+--------------+-----------+----------------+----------------+------------------+---------------+-----------+----------------+---------------+------------+---------------+
| playerid | timestamp | absoluteDistance | avarageSpeed | peakSpeed | tagId | installationId | averageSpeedZone | peakSpeedZone | ballSpeed | ballSpeedFloat | ballSpeedZone | confidence | ingestionTime |
+--------------------------------------+---------------+------------------+--------------+-----------+----------------+----------------+------------------+---------------+-----------+----------------+---------------+------------+---------------+
| 59a70d45-5c00-4bb6-966d-b961b78ef5c1 | 1641495873505 | 5.76953 | 1.1543 | 1.22363 | 0104FLHBN009XD | null | WALK | WALK | NULL | NULL | NULL | NULL | 1641496586458 |
| 59a70d45-5c00-4bb6-966d-b961b78ef5c1 | 1641484677624 | NULL | NULL | NULL | 0104FLHBN009XD | NULL | NULL | NULL | 37 | 37.0897 | COLD | 77 | 1641484896747 |
+--------------------------------------+---------------+------------------+--------------+-----------+----------------+----------------+------------------+---------------+-----------+----------------+---------------+------------+---------------+
Data from the Avro console consumer for the DB entries looks like:
{"playerId":"59a70d45-5c00-4bb6-966d-b961b78ef5c1","timestamp":1641484677624,"tagId":{"string":"0104FLHBN009XD"},"ballSpeed":{"int":37},"ballSpeedFloat":{"float":37.08966},"ballSpeedZone":"COLD","confidence":{"int":77},"ingestionTime":{"long":1641484896747}}
{"playerId":"59a70d45-5c00-4bb6-966d-b961b78ef5c1","timestamp":1641495873505,"absoluteDistance":5.7695312,"averageSpeed":1.1542969,"peakSpeed":1.2236328,"tagId":{"string":"0104FLHBN009XD"},"installationId":null,"averageSpeedZone":{"com.ad.gamerboot.kafka.models.workouts.AverageSpeedZone":"WALK"},"peakSpeedZone":{"com.ad.gamerboot.kafka.models.workouts.PeakSpeedZone":"WALK"},"ingestionTime":{"long":1641496586458}}
This is a fresh, current Confluent installation. I updated the Avro converter just a few hours ago to kafka-connect-avro-converter:7.0.1.
The schemas were changed by the company with regard to RecordNameStrategy. Everything is working now.
Thanks
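For reference, a sketch of the converter-side counterpart of that strategy change, assuming the re-registered subjects now match RecordNameStrategy (these are the same keys that are commented out in sink.properties above):

key.converter.key.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy

TopicRecordNameStrategy would be the analogue if the subjects are prefixed with the topic name.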

Aggregating data in an array based on date

I'm trying to aggregate data based on timestamp. Basically, I'd like to create an array for each day.
So let's say I have a query like so:
SELECT date(task_start) AS started, task_start
FROM tt_records
GROUP BY started, task_start
ORDER BY started DESC;
The output is:
+------------+------------------------+
| started | task_start |
|------------+------------------------|
| 2021-08-30 | 2021-08-30 16:45:55+00 |
| 2021-08-29 | 2021-08-29 06:47:55+00 |
| 2021-08-29 | 2021-08-29 15:41:50+00 |
| 2021-08-28 | 2021-08-28 12:59:20+00 |
| 2021-08-28 | 2021-08-28 14:50:55+00 |
| 2021-08-26 | 2021-08-26 20:46:44+00 |
| 2021-08-24 | 2021-08-24 16:28:05+00 |
| 2021-08-23 | 2021-08-23 16:22:41+00 |
| 2021-08-22 | 2021-08-22 14:01:10+00 |
| 2021-08-21 | 2021-08-21 19:45:18+00 |
| 2021-08-11 | 2021-08-11 16:08:58+00 |
| 2021-07-28 | 2021-07-28 17:39:14+00 |
| 2021-07-19 | 2021-07-19 17:26:24+00 |
| 2021-07-18 | 2021-07-18 15:04:47+00 |
| 2021-06-24 | 2021-06-24 19:53:33+00 |
| 2021-06-22 | 2021-06-22 19:04:24+00 |
+------------+------------------------+
As you can see, the started column has repeating dates.
What I'd like to have is:
+------------+--------------------------------------------------+
| started | task_start |
|------------+--------------------------------------------------|
| 2021-08-30 | [2021-08-30 16:45:55+00] |
| 2021-08-29 | [2021-08-29 06:47:55+00, 2021-08-29 15:41:50+00] |
| 2021-08-28 | [2021-08-28 12:59:20+00, 2021-08-28 14:50:55+00] |
| 2021-08-26 | [2021-08-26 20:46:44+00] |
| 2021-08-24 | [2021-08-24 16:28:05+00] |
| 2021-08-23 | [2021-08-23 16:22:41+00] |
| 2021-08-22 | [2021-08-22 14:01:10+00] |
| 2021-08-21 | [2021-08-21 19:45:18+00] |
| 2021-08-11 | [2021-08-11 16:08:58+00] |
| 2021-07-28 | [2021-07-28 17:39:14+00] |
| 2021-07-19 | [2021-07-19 17:26:24+00] |
| 2021-07-18 | [2021-07-18 15:04:47+00] |
| 2021-06-24 | [2021-06-24 19:53:33+00] |
| 2021-06-22 | [2021-06-22 19:04:24+00] |
+------------+--------------------------------------------------+
I need a query to achieve that. Thank you.
You can use array_agg():
SELECT date(task_start) AS started, array_agg(task_start)
FROM tt_records
GROUP BY started
ORDER BY started DESC;
If you want a JSON array rather than a native Postgres array, use jsonb_agg() instead.
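For example, a sketch of the jsonb_agg() variant; adding an ORDER BY inside the aggregate keeps each day's array sorted, and the same clause works with array_agg():

SELECT date(task_start) AS started,
       jsonb_agg(task_start ORDER BY task_start) AS task_start
FROM tt_records
GROUP BY started
ORDER BY started DESC;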

Getting NullPointerException from Play framework

I have been using Play Framework version 2.2.4. While executing a particular action I am getting a NullPointerException. If it were from my code I could fix it, but it comes from the Play library.
Could someone help with this?
# action #
public static Result getOverViewPage(final String errorMsg) {
    String autoBillStatusMsg = "";
    // business logic is here
    Logger.info("Final Auto-pay status message : " + autoBillStatusMsg);
    return ok(rave_customer_index.render());
}
We are able to see the above logger statement, but after that we get the NullPointerException.
Exception StackTrace
INFO | jvm 1 | 2015/09/18 15:15:24 | [info] application - Final Auto-pay status message :
INFO | jvm 1 | 2015/09/18 15:15:24 | [error] play - Cannot invoke the action, eventually got an error: java.lang.NullPointerException
INFO | jvm 1 | 2015/09/18 15:15:24 | [error] application -
INFO | jvm 1 | 2015/09/18 15:15:24 | ! #6nfm0a093 - Internal server error, for (GET) [/] ->
INFO | jvm 1 | 2015/09/18 15:15:24 |
INFO | jvm 1 | 2015/09/18 15:15:24 | play.api.Application$$anon$1: Execution exception[[NullPointerException: null]]
INFO | jvm 1 | 2015/09/18 15:15:24 | at play.api.Application$class.handleError(Application.scala:296) ~[com.typesafe.play.play_2.11-2.3.2.jar:2.3.2]
INFO | jvm 1 | 2015/09/18 15:15:24 | at play.api.DefaultApplication.handleError(Application.scala:402) [com.typesafe.play.play_2.11-2.3.2.jar:2.3.2]
INFO | jvm 1 | 2015/09/18 15:15:24 | at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$3$$anonfun$applyOrElse$4.apply(PlayDefaultUpstreamHandler.scala:320) [com.typesafe.play.play_2.11-2.3.2.jar:2.3.2]
INFO | jvm 1 | 2015/09/18 15:15:24 | at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$3$$anonfun$applyOrElse$4.apply(PlayDefaultUpstreamHandler.scala:320) [com.typesafe.play.play_2.11-2.3.2.jar:2.3.2]
INFO | jvm 1 | 2015/09/18 15:15:24 | at scala.Option.map(Option.scala:145) [org.scala-lang.scala-library-2.11.1.jar:na]
INFO | jvm 1 | 2015/09/18 15:15:24 | Caused by: java.lang.NullPointerException: null
INFO | jvm 1 | 2015/09/18 15:15:24 | at java.net.URLEncoder.encode(URLEncoder.java:205) ~[na:1.7.0_21]
INFO | jvm 1 | 2015/09/18 15:15:24 | at play.api.mvc.CookieBaker$$anonfun$3.apply(Http.scala:427) ~[com.typesafe.play.play_2.11-2.3.2.jar:2.3.2]
INFO | jvm 1 | 2015/09/18 15:15:24 | at play.api.mvc.CookieBaker$$anonfun$3.apply(Http.scala:426) ~[com.typesafe.play.play_2.11-2.3.2.jar:2.3.2]
INFO | jvm 1 | 2015/09/18 15:15:24 | at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245) ~[org.scala-lang.scala-library-2.11.1.jar:na]
INFO | jvm 1 | 2015/09/18 15:15:24 | at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245) ~[org.scala-lang.scala-library-2.11.1.jar:na]
DEBUG | wrapperp | 2015/09/18 15:15:25 | send a packet PING : ping
INFO | jvm 1 | 2015/09/18 15:15:26 | WrapperManager Debug: Received a packet PING : ping
INFO | jvm 1 | 2015/09/18 15:15:26 | WrapperManager Debug: Send a packet PING : ping
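The "Caused by" frames point at java.net.URLEncoder.encode inside play.api.mvc.CookieBaker, which Play uses to serialize the session and flash cookies, and URLEncoder.encode throws a NullPointerException when given null. A minimal sketch of how action code can trigger that, assuming a null value is written into the session somewhere in the business logic (lookupStatusUrl and the "statusUrl" key are hypothetical):

// Sketch inside a controller (extends play.mvc.Controller); not the actual code above.
public static Result getOverViewPage(final String errorMsg) {
    String statusUrl = lookupStatusUrl(); // hypothetical helper that may return null
    // A null session key or value survives until the response is written, where
    // CookieBaker calls URLEncoder.encode on it and throws the NPE in the trace.
    session().put("statusUrl", statusUrl);
    return ok(rave_customer_index.render());
}

Auditing every session() and flash() write in this action (and in any interceptors) for possible nulls would be the first thing to try.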

How can I find the source file that causes a scalac compiler crash?

I am in the process of upgrading a large project from 2.10.4 to 2.11.4. I have gotten a compiler crash; is there any way to display the name of the source file that is causing the crash? I am able to get the full trace with 'last everitt/compile:compile'.
I am going to fall back to erasing large chunks of the project to binary-search out the cause of the crash.
Here is the trace btw:
java.lang.NullPointerException
at scala.tools.nsc.typechecker.NamesDefaults$class.baseFunBlock$1(NamesDefaults.scala:130)
at scala.tools.nsc.typechecker.NamesDefaults$class.transformNamedApplication(NamesDefaults.scala:375)
at scala.tools.nsc.Global$$anon$1.transformNamedApplication(Global.scala:450)
at scala.tools.nsc.typechecker.NamesDefaults$class.transformNamedApplication(NamesDefaults.scala:326)
at scala.tools.nsc.Global$$anon$1.transformNamedApplication(Global.scala:450)
at scala.tools.nsc.typechecker.Typers$Typer.tryNamesDefaults$1(Typers.scala:3298)
at scala.tools.nsc.typechecker.Typers$Typer.doTypedApply(Typers.scala:3367)
at scala.tools.nsc.typechecker.Typers$Typer.tryNamesDefaults$1(Typers.scala:3353)
at scala.tools.nsc.typechecker.Typers$Typer.doTypedApply(Typers.scala:3367)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$tryTypedApply$1$1.apply(Typers.scala:4404)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$tryTypedApply$1$1.apply(Typers.scala:4404)
at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:689)
at scala.tools.nsc.typechecker.Typers$Typer.tryTypedApply$1(Typers.scala:4404)
at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4449)
at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4484)
at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5242)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5259)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5273)
at scala.tools.nsc.typechecker.Typers$Typer.typedArg(Typers.scala:3094)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$class.typedArgWithFormal$1(PatternTypers.scala:112)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$$anonfun$2.apply(PatternTypers.scala:115)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$$anonfun$2.apply(PatternTypers.scala:115)
at scala.runtime.Tuple2Zipped$$anonfun$map$extension$1.apply(Tuple2Zipped.scala:42)
at scala.runtime.Tuple2Zipped$$anonfun$map$extension$1.apply(Tuple2Zipped.scala:40)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.runtime.Tuple2Zipped$.map$extension(Tuple2Zipped.scala:40)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$class.typedArgsForFormals(PatternTypers.scala:115)
at scala.tools.nsc.typechecker.Typers$Typer.typedArgsForFormals(Typers.scala:107)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$handleMonomorphicCall$1(Typers.scala:3384)
at scala.tools.nsc.typechecker.Typers$Typer.doTypedApply(Typers.scala:3409)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$tryTypedApply$1$1.apply(Typers.scala:4404)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$tryTypedApply$1$1.apply(Typers.scala:4404)
at scala.tools.nsc.typechecker.Typers$Typer.silent(Typers.scala:676)
at scala.tools.nsc.typechecker.Typers$Typer.tryTypedApply$1(Typers.scala:4404)
at scala.tools.nsc.typechecker.Typers$Typer.normalTypedApply$1(Typers.scala:4449)
at scala.tools.nsc.typechecker.Typers$Typer.typedApply$1(Typers.scala:4484)
at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5242)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5259)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5273)
at scala.tools.nsc.typechecker.Typers$Typer.typedAssign$1(Typers.scala:4136)
at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:5223)
at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5252)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5259)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5273)
at scala.tools.nsc.typechecker.Typers$Typer.typedBlock(Typers.scala:2341)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$typedOutsidePatternMode$1$1.apply(Typers.scala:5217)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$typedOutsidePatternMode$1$1.apply(Typers.scala:5217)
at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:5216)
at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5252)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5259)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5273)
at scala.tools.nsc.typechecker.Typers$Typer.transformedOrTyped(Typers.scala:5504)
at scala.tools.nsc.typechecker.Typers$Typer.typedDefDef(Typers.scala:2167)
at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5207)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5258)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5273)
at scala.tools.nsc.typechecker.Typers$Typer.typedByValueExpr(Typers.scala:5351)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:2977)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$60.apply(Typers.scala:3081)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$60.apply(Typers.scala:3081)
at scala.collection.immutable.List.loop$1(List.scala:172)
at scala.collection.immutable.List.mapConserve(List.scala:188)
at scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:3081)
at scala.tools.nsc.typechecker.Typers$Typer.typedTemplate(Typers.scala:1880)
at scala.tools.nsc.typechecker.Typers$Typer.typedClassDef(Typers.scala:1721)
at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5208)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5258)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5273)
at scala.tools.nsc.typechecker.Typers$Typer.typedByValueExpr(Typers.scala:5351)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedStat$1(Typers.scala:2977)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$60.apply(Typers.scala:3081)
at scala.tools.nsc.typechecker.Typers$Typer$$anonfun$60.apply(Typers.scala:3081)
at scala.collection.immutable.List.loop$1(List.scala:172)
at scala.collection.immutable.List.mapConserve(List.scala:188)
at scala.tools.nsc.typechecker.Typers$Typer.typedStats(Typers.scala:3081)
at scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:4918)
at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5211)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5258)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5295)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5322)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5269)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5273)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5347)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:102)
at scala.tools.nsc.Global$GlobalPhase$$anonfun$applyPhase$1.apply$mcV$sp(Global.scala:428)
at scala.tools.nsc.Global$GlobalPhase.withCurrentUnit(Global.scala:419)
at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:428)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:94)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:93)
at scala.collection.Iterator$class.foreach(Iterator.scala:743)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:93)
at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1338)
at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1325)
at scala.tools.nsc.Global$Run.compileSources(Global.scala:1320)
at scala.tools.nsc.Global$Run.compile(Global.scala:1418)
at xsbt.CachedCompiler0.run(CompilerInterface.scala:116)
at xsbt.CachedCompiler0.run(CompilerInterface.scala:95)
at xsbt.CompilerInterface.run(CompilerInterface.scala:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:101)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:47)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:97)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:97)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:97)
at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:162)
at sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:96)
at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:139)
at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:86)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:38)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:36)
at sbt.inc.IncrementalCommon.cycle(IncrementalCommon.scala:31)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:39)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:66)
at sbt.inc.Incremental$.compile(Incremental.scala:38)
at sbt.inc.IncrementalCompile$.apply(Compile.scala:26)
at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:153)
at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:70)
at sbt.compiler.AggressiveCompile.apply(AggressiveCompile.scala:45)
at sbt.Compiler$.apply(Compiler.scala:74)
at sbt.Compiler$.apply(Compiler.scala:65)
at sbt.Defaults$.sbt$Defaults$$compileTaskImpl(Defaults.scala:777)
at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:769)
at sbt.Defaults$$anonfun$compileTask$1.apply(Defaults.scala:769)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:235)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Edit: After erasing and commenting out lots of files I simplified it to this:
case class A(x: String)
case class B(a: A, y: String)

class C(var b: B) {
  def a = b.a
  b.copy(a = a.copy())
}
Edit 2: Best I can tell, there isn't a good option for this, so I am going to move on; I filed a bug report for this issue (SI-9035). If anyone knows of an option or a way to do this, please post an answer! In the meantime I will just continue to use the method I have been using since 2.6: erase files and comment out lines until I get it down to a minimal example and submit a bug report. Thanks for the ideas.
In this case -verbose indicates it's done with phase packageobjects, and -Xshow-phases says typer is next. (Parser reports what it just parsed.)
-Ytyper-debug emits reams of generally inscrutable output, but shows what is undergoing typechecking.
That information can at least help your next search or gitter question.
Something like:
| | |-- b.copy(a = a.copy()) EXPRmode (site: method f in C)
| | | |-- b.copy BYVALmode-EXPRmode-FUNmode-POLYmode (silent: method f in C)
| | | | |-- b EXPRmode-POLYmode-QUALmode (silent: method f in C)
| | | | | \-> nope.B
| | | | \-> (a: nope.A, y: String)nope.B
| | | |-- qual$1 EXPRmode-POLYmode-QUALmode (silent: method f in C)
| | | | \-> qual$1.type (with underlying type nope.B)
| | | |-- (a: nope.A, y: String)nope.B EXPRmode-FUNmode-POLYmode-TAPPmode (silent: method f in C)
| | | | \-> (a: nope.A, y: String)nope.B
| | | |-- a = a.copy() : pt=nope.A EXPRmode (silent: method f in C)
| | | | |-- a EXPRmode-LHSmode (silent: method f in C)
| | | | | \-> nope.A
| | | | |-- C.this.a_$eq(a.copy()) : pt=nope.A EXPRmode (silent: method f in C)
| | | | | |-- C.this.a_$eq BYVALmode-EXPRmode-FUNmode-POLYmode (silent: method f in C)
| | | | | | \-> <error>
| | | | | |-- a.copy() : pt=<error> EXPRmode (silent: method f in C)
| | | | | | |-- a.copy BYVALmode-EXPRmode-FUNmode-POLYmode (silent: method f in C)
| | | | | | | |-- a EXPRmode-POLYmode-QUALmode (silent: method f in C)
| | | | | | | | \-> nope.A
| | | | | | | \-> (x: String)nope.A
| | | | | | |-- qual$2 EXPRmode-POLYmode-QUALmode (silent: method f in C)
| | | | | | | \-> qual$2.type (with underlying type nope.A)
| | | | | | |-- (x: String)nope.A EXPRmode-FUNmode-POLYmode-TAPPmode (silent: method f in C)
| | | | | | | \-> (x: String)nope.A
| | | | | | \-> nope.A
| | | | | \-> <error>
| | | | \-> <error>
error: java.lang.NullPointerException
at scala.tools.nsc.typechecker.Typers$Typer.noExpectedType$1(Typers.scala:3464)
might lead you to search for any bugs with "named arguments", such as this open bug; not your crasher, but similar players.
When the compiler "aborts", it offers more useful information than when it full-on crashes.
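For reference, a sketch of enabling those diagnostics in sbt for the affected project (a standard scalacOptions setting; drop the flags again once the offending file is found, since -Ytyper-debug output is enormous):

// build.sbt
scalacOptions ++= Seq(
  "-verbose",      // report which phases and units the compiler is working on
  "-Ytyper-debug"  // trace each tree as it is typechecked (extremely noisy)
)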