StackOverflow during typeprovider macro expansion - scala

I was playing around a bit with macros, and I thought writing a JSON type provider would be a good start to get a deeper understanding of how all this works, but I hit a weird error that I can't seem to figure out myself. The code is available on GitHub if you want to take a look at the whole thing: https://github.com/qwe2/json-typeprovider/.
The problematic part: I tried to make it as typesafe as I could, meaning I wanted to implement JSON arrays in such a way that indexing into them returns the correct type (as a subsequent macro invocation). The relevant methods of the code:
The JSON-to-Tree method:
def jsonToTpe(value: JValue): Option[Tree] = value match {
  case JNothing => None
  case JNull => None
  case JString(s) => Some(q"$s")
  case JDouble(d) => Some(q"$d")
  case JDecimal(d) => Some(q"scala.BigDecimal(${d.toString})")
  case JInt(i) => Some(q"scala.BigInt(${i.toByteArray})")
  case JLong(l) => Some(q"$l")
  case JBool(b) => Some(q"$b")
  case JArray(arr) =>
    val arrTree = arr.flatMap(jsonToTpe)
    val clsName = c.freshName[TypeName](TypeName("harraycls"))
    val hArray =
      q"""
        class $clsName {
          @_root_.com.example.json.provider.body(scala.Array[Any](..$arrTree))
          def apply(i: Int): Any = macro _root_.com.example.json.provider.DelegatedMacros.arrApply_impl
          @_root_.com.example.json.provider.body(scala.Array[Any](..$arrTree))
          def toArray: scala.Array[Any] = macro _root_.com.example.json.provider.DelegatedMacros.selectField_impl
        }
        new $clsName {}
      """
    Some(hArray)
  case JSet(set) => Some(q"scala.Set(..${set.flatMap(jsonToTpe)})")
  case JObject(fields) =>
    val fs = fields.flatMap { case (k, v) =>
      jsonToTpe(v).map(v => q"""
        @_root_.com.example.json.provider.body($v) def ${TermName(k)}: Any =
          macro _root_.com.example.json.provider.DelegatedMacros.selectField_impl""")
    }
    val clsName = c.freshName[TypeName](TypeName("jsoncls"))
    Some(q"""
      class $clsName {
        ..$fs
      }
      new $clsName {}
    """)
}
Reading the annotation:
class body(tree: Any) extends StaticAnnotation
def arrApply_impl(c: whitebox.Context)(i: c.Expr[Int]): c.Tree = {
  import c.universe._

  def bail(msg: String): Nothing = {
    c.abort(c.enclosingPosition, msg)
  }
  def error(msg: String): Unit = {
    c.error(c.enclosingPosition, msg)
  }

  val arrValue = selectField_impl(c)
  val arrElems = arrValue match {
    case q"scala.Array.apply[$tpe](..$elems)($cls)" => elems
    case _ => bail("arr needs to be an array of constants")
  }
  val idx = i.tree match {
    case Literal(Constant(ix: Int)) => ix
    case _ => bail(s"i needs to be a constant Int, got ${showRaw(i.tree)}")
  }
  arrElems(idx)
}

def selectField_impl(c: whitebox.Context): c.Tree = {
  c.macroApplication.symbol.annotations.filter(
    _.tree.tpe <:< c.typeOf[body]
  ).head.tree.children.tail.head
}
As you can see, the approach I tried was basically to shove the actual array into a static annotation, and when indexing into it, dispatch to another macro that can figure out the type. I got the idea from reading about vampire methods.
This is the JSON I'm trying to parse:
[
{"id": 1},
{"id": 2}
]
And this is how I invoke it:
val tpe3 = TypeProvider("arrayofobj.json")
println(tpe3.toArray.mkString(", "))
Reading an array of ints or an object of primitive fields works as expected, but an array of objects throws a StackOverflowError during compilation:
[error] /home/isti/projects/json-typeprovider/core/src/main/scala/com/example/Hello.scala:7:14: Internal error: unable to find the outer accessor symbol of object Hello
[error] object Hello extends App {
[error] ^
[error] ## Exception when compiling 1 sources to /home/isti/projects/json-typeprovider/core/target/scala-2.12/classes
[error] null
[error] java.lang.String.valueOf(String.java:2994)
[error] scala.collection.mutable.StringBuilder.append(StringBuilder.scala:200)
[error] scala.collection.TraversableOnce.$anonfun$addString$1(TraversableOnce.scala:359)
[error] scala.collection.immutable.List.foreach(List.scala:389)
[error] scala.collection.TraversableOnce.addString(TraversableOnce.scala:357)
[error] scala.collection.TraversableOnce.addString$(TraversableOnce.scala:353)
[error] scala.collection.AbstractTraversable.addString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:323)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:322)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:325)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:325)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:327)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:327)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] xsbt.DelegatingReporter$.makePosition$1(DelegatingReporter.scala:89)
[error] xsbt.DelegatingReporter$.convert(DelegatingReporter.scala:94)
[error] xsbt.DelegatingReporter.info0(DelegatingReporter.scala:125)
[error] xsbt.DelegatingReporter.info0(DelegatingReporter.scala:102)
[error] scala.reflect.internal.Reporter.error(Reporting.scala:84)
[error] scala.reflect.internal.Reporting.globalError(Reporting.scala:69)
[error] scala.reflect.internal.Reporting.globalError$(Reporting.scala:69)
[error] scala.reflect.internal.SymbolTable.globalError(SymbolTable.scala:16)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerSelect(ExplicitOuter.scala:235)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
Edit: that's only the top of the stack trace; there are many more repetitions of scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267).

Related

Scala json4s deserialization for an abstract trait causes MappingException: Unexpected type info RefinedType

My code contains the following entities:
case class Source(uuid: String, `type`: String, parameters: Connector)
sealed trait Connector
case class Snowflake(
  val username: String,
  val password: String,
  val host: String,
  val role: Option[String],
  val warehouse: Option[String],
  val port: Option[String],
  val db_name: String,
  val schema: Option[String],
  val activeModal: Option[String],
  val use_ssh: Int,
  val ssh_ip: Option[String],
  val ssh_port: Option[Int],
  val ssh_user: Option[String]
) extends Connector {
  val options = Map(
    ("sfURL" -> s"${host}.snowflakecomputing.com"),
    ("sfUser" -> username),
    ("sfPassword" -> password),
    ("sfRole" -> role),
    ("sfDatabase" -> db_name),
    ("sfSchema" -> schema),
    ("sfWarehouse" -> warehouse))
}

case class MySQL(
  ...
) extends Connector {
  ...
}

case class File(
  ...
) extends Connector {
  ...
}
I'm trying to deserialize JSON representing different connector types.
The JSON can look like this:
val test = """{
  "uuid":"12314sdfds12",
  "type":"snowflake",
  "parameters": "{\"use_ssh\": 0,
    \"schema\": \"YESDATA\",
    \"activeModal\": \"showflake\",
    \"warehouse\": \"COMPUTE_WH\",
    \"role\": \"DIsdfasROLE\",
    \"username\": \"Dixcxf\",
    \"password\": \"dfgRvf65&Vyuhj65&\",
    \"host\": \"wn789454.east-us-2.azure\",
    \"db_name\": \"DSPSHARE2\"}"
}"""
The problem is that the "parameters" field is a string, and the JSON can contain more fields besides uuid, type, and parameters.
I came up with a custom serializer for the Source class:
case object SourceSerializer extends CustomSerializer[Source](format => (
  {
    case JObject(source) => {
      implicit val formats: Formats = DefaultFormats
      def getfield(key: String) = source.filter(field => field match {
        case JField(`key`, JString(_)) => true
        case _ => false
      })
      getfield("uuid") match {
        case List(JField("uuid", JString(uuid))) => {
          val JString(connectorType) = JObject(source) \ "type"
          val JString(connectorParams) = JObject(source) \ "parameters"
          connectorType match {
            case "snowflake" => Source(uuid, connectorType, parse(connectorParams).extract[Snowflake])
          }
        }
        case Nil => null
      }
    }
    case JNull => null
  },
  { case op: Source => JString(op.getClass.getSimpleName.replace("$","")) }
))
Finally, I'm trying to use it like this:
parse(test).extract[Source]
It fails, and I get this error:
[error] org.json4s.MappingException: Unexpected type info RefinedType(ClassSymbol(<refinement>, owner=0, flags=0, info=173 ,None),List(TypeRefType(ThisType(java.lang),java.lang.Object,List()), TypeRefType(ThisType(java.io),java.io.Serializable,List())))
[error] at org.json4s.reflect.package$.fail(package.scala:56)
[error] at org.json4s.reflect.ScalaSigReader$.findPrimitive$3(ScalaSigReader.scala:194)
[error] at org.json4s.reflect.ScalaSigReader$.findArgTypeForField(ScalaSigReader.scala:196)
[error] at org.json4s.reflect.ScalaSigReader$.readField(ScalaSigReader.scala:77)
[error] at org.json4s.reflect.Reflector$ClassDescriptorBuilder.$anonfun$fields$3(Reflector.scala:113)
[error] at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
[error] at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
[error] at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
[error] at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
[error] at scala.collection.TraversableLike.map(TraversableLike.scala:238)
[error] at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
[error] at scala.collection.AbstractTraversable.map(Traversable.scala:108)
[error] at org.json4s.reflect.Reflector$ClassDescriptorBuilder.fields(Reflector.scala:111)
[error] at org.json4s.reflect.Reflector$ClassDescriptorBuilder.properties(Reflector.scala:130)
[error] at org.json4s.reflect.Reflector$ClassDescriptorBuilder.result(Reflector.scala:272)
[error] at org.json4s.reflect.Reflector$.createDescriptorWithFormats(Reflector.scala:87)
[error] at org.json4s.reflect.Reflector$.$anonfun$describeWithFormats$1(Reflector.scala:70)
[error] at org.json4s.reflect.Memo.apply(Memo.scala:12)
[error] at org.json4s.reflect.Reflector$.describeWithFormats(Reflector.scala:70)
[error] at org.json4s.Extraction$.$anonfun$extract$10(Extraction.scala:456)
[error] at org.json4s.Extraction$.$anonfun$customOrElse$1(Extraction.scala:781)
[error] at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
[error] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
[error] at scala.PartialFunction$$anon$1.applyOrElse(PartialFunction.scala:257)
[error] at org.json4s.Extraction$.customOrElse(Extraction.scala:781)
[error] at org.json4s.Extraction$.extract(Extraction.scala:455)
[error] at org.json4s.Extraction$.extract(Extraction.scala:56)
[error] at org.json4s.ExtractableJsonAstNode$.extract$extension(ExtractableJsonAstNode.scala:22)
[error] at org.json4s.jackson.JacksonSerialization.read(Serialization.scala:62)
[error] at org.json4s.Serialization.read(Serialization.scala:31)
[error] at org.json4s.Serialization.read$(Serialization.scala:31)
[error] at org.json4s.jackson.JacksonSerialization.read(Serialization.scala:23)
[error] at org.divian.entities.ConnectorSerializer$$anonfun$$lessinit$greater$2$$anonfun$apply$3.applyOrElse(Entities.scala:236)
[error] at org.divian.entities.ConnectorSerializer$$anonfun$$lessinit$greater$2$$anonfun$apply$3.applyOrElse(Entities.scala:223)
[error] at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
[error] at org.json4s.CustomSerializer$$anonfun$deserialize$1.applyOrElse(CustomSerializer.scala:11)
[error] at org.json4s.CustomSerializer$$anonfun$deserialize$1.applyOrElse(CustomSerializer.scala:10)
[error] at org.json4s.Extraction$.customOrElse(Extraction.scala:781)
[error] at org.json4s.Extraction$.extract(Extraction.scala:455)
[error] at org.json4s.Extraction$.extract(Extraction.scala:56)
[error] at org.json4s.ExtractableJsonAstNode$.extract$extension(ExtractableJsonAstNode.scala:22)
[error] at org.divian.entities.SourceSerializer$$anonfun$$lessinit$greater$3$$anonfun$apply$5.applyOrElse(Entities.scala:273)
[error] at org.divian.entities.SourceSerializer$$anonfun$$lessinit$greater$3$$anonfun$apply$5.applyOrElse(Entities.scala:251)
[error] at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
[error] at org.json4s.CustomSerializer$$anonfun$deserialize$1.applyOrElse(CustomSerializer.scala:11)
[error] at org.json4s.CustomSerializer$$anonfun$deserialize$1.applyOrElse(CustomSerializer.scala:10)
[error] at org.json4s.Extraction$.customOrElse(Extraction.scala:781)
[error] at org.json4s.Extraction$.extract(Extraction.scala:455)
[error] at org.json4s.Extraction$.extract(Extraction.scala:56)
[error] at org.json4s.ExtractableJsonAstNode$.extract$extension(ExtractableJsonAstNode.scala:22)
[error] at Main$.delayedEndpoint$Main$1(main.scala:48)
[error] at Main$delayedInit$body.apply(main.scala:21)
[error] at scala.Function0.apply$mcV$sp(Function0.scala:39)
[error] at scala.Function0.apply$mcV$sp$(Function0.scala:39)
[error] at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
[error] at scala.App.$anonfun$main$1$adapted(App.scala:80)
[error] at scala.collection.immutable.List.foreach(List.scala:392)
[error] at scala.App.main(App.scala:80)
[error] at scala.App.main$(App.scala:78)
[error] at Main$.main(main.scala:21)
[error] at Main.main(main.scala)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.lang.reflect.Method.invoke(Method.java:498)
[error] stack trace is suppressed; run last Compile / run for the full output
[error] (Compile / run) org.json4s.MappingException: Unexpected type info RefinedType(ClassSymbol(<refinement>, owner=0, flags=0, info=173 ,None),List(TypeRefType(ThisType(java.lang),java.lang.Object,List()), TypeRefType(ThisType(java.io),java.io.Serializable,List())))
[error] Total time: 16 s, completed Jun 19, 2022 2:14:04 AM
I checked that:
When Snowflake does not extend Connector, everything is OK:
val ctest = """{"use_ssh": 0, "schema": "YESDATA", "activeModal": "showflake",
"warehouse": "COMPUTE_WH", "role": "DIv453SEFROLE", "username": "Disdfg324",
"password": "Vgdtj65&stghj65&", "host": "wn354673.east-us-2.azure", "db_name":
"DSPSHARE2"}"""
parse(ctest).extract[Snowflake]
/* prints correct instance of Snowflake */
It doesn't matter whether I implement it as an abstract class, a sealed trait, or a plain trait.
I've tried to use type hints instead of the custom serializer for Connector - no success.
I tried different versions of json4s (just in case) - no success.
I would really appreciate any suggestions.
It turns out the problem was in the body of Snowflake.
I solved it by adding .getOrElse("") to the Option[...] values in the options map:
case class Snowflake(
  val username: String, //+
  val password: String, // +
  val host: String, //+
  val role: Option[String], //+
  val warehouse: Option[String], //+
  val port: Option[String], // String?
  val db_name: String, // +
  val schema: Option[String], //+
  val activeModal: Option[String],
  val use_ssh: Int, // +
  val ssh_ip: Option[String],
  val ssh_port: Option[Int],
  val ssh_user: Option[String]
) extends Connector {
  val options = Map(
    ("sfURL" -> s"${host}.snowflakecomputing.com"),
    ("sfUser" -> username),
    ("sfPassword" -> password),
    ("sfDatabase" -> db_name),
    ("sfRole" -> role.getOrElse("")),
    ("sfSchema" -> schema.getOrElse("")),
    ("sfWarehouse" -> warehouse.getOrElse("")))
}
But still, it's interesting behavior. I thought an Option[...] constructor val gets None if there is no value for it in the JSON, so I don't see a problem with creating a Map containing None values.
For example, I'm able to create something like this:
val test = Map(
  ("key1" -> "stringValue"),
  ("key2" -> 123),
  ("key3" -> None)
)
It will be a Map[String, Any]. I don't understand why it causes the error in my case.
If someone can explain, that would be wonderful.
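A minimal sketch (plain Scala, no json4s involved) of what is likely going on: mixing String and Option[String] values forces the Map's inferred value type up to Object with java.io.Serializable - the very RefinedType named in the MappingException - and json4s appears to choke when it reflects over that refined type while inspecting Snowflake's fields. With .getOrElse("") every value is a plain String, so the type stays simple:

```scala
// Mixed values: String and Option[String] only share the refined upper bound
// `Object with java.io.Serializable`, so that is the inferred value type.
val mixed = Map(
  "sfUser" -> "alice",          // String value
  "sfRole" -> Option("admin")   // Option[String] value
)
// This ascription type-checks, showing the refinement is an upper bound:
val ascribed: Map[String, Object with java.io.Serializable] = mixed

// With .getOrElse("") every value is a plain String and the type stays simple:
val uniform = Map(
  "sfUser" -> "alice",
  "sfRole" -> Option("admin").getOrElse("")
)
val strings: Map[String, String] = uniform
```

This also shows why the standalone Map[String, Any] example above is unproblematic: an explicit Any annotation never produces a refinement, whereas the inferred type of the options val does.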

Is Either.right = Right and Either.left = Left?

On the following page:
https://typelevel.org/cats/datatypes/either.html
this example is presented:
object EitherStyle {
  def parse(s: String): Either[Exception, Int] =
    if (s.matches("-?[0-9]+")) Either.right(s.toInt)
    else Either.left(new NumberFormatException(s"${s} is not a valid integer."))

  def reciprocal(i: Int): Either[Exception, Double] =
    if (i == 0) Either.left(new IllegalArgumentException("Cannot take reciprocal of 0."))
    else Either.right(1.0 / i)

  def stringify(d: Double): String = d.toString
}
Yet, I am getting the error:
[error] /application/learningSBT/hello-world/src/main/scala/Main.scala:16:39: value right is not a member of object scala.util.Either
[error] if (s.matches("-?[0-9]+")) Either.right(s.toInt)
[error] ^
[error] /application/learningSBT/hello-world/src/main/scala/Main.scala:17:17: value left is not a member of object scala.util.Either
[error] else Either.left(new NumberFormatException(s"${s} is not a valid integer."))
[error] ^
[error] /application/learningSBT/hello-world/src/main/scala/Main.scala:21:14: value left is not a member of object scala.util.Either
[error] Either.left(new IllegalArgumentException("Cannot take reciprocal of 0."))
[error] ^
[error] /application/learningSBT/hello-world/src/main/scala/Main.scala:22:17: value right is not a member of object scala.util.Either
[error] else Either.right(1.0 / i)
[error] ^
[error] four errors found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 2 s, completed Feb 12, 2020 10:25:02 AM
However, when I replaced Either.right with Right and Either.left with Left, the code compiled:
object EitherStyle {
  def parse(s: String): Either[Exception, Int] =
    if (s.matches("-?[0-9]+")) Right(s.toInt)
    else Left(new NumberFormatException(s"${s} is not a valid integer."))

  def reciprocal(i: Int): Either[Exception, Double] =
    if (i == 0)
      Left(new IllegalArgumentException("Cannot take reciprocal of 0."))
    else Right(1.0 / i)

  def stringify(d: Double): String = d.toString
}
So I wonder what makes this happen.
These methods are extensions that cats adds to the standard Either companion object.
Import cats.syntax.either._ for this to work.
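For reference, here is a sketch of the same parser with no cats dependency at all, using only the standard library's Right/Left constructors plus Either.cond, the closest stdlib factory method:

```scala
object EitherStyleStdlib {
  // Either.cond(test, right, left): both result arguments are by-name, so
  // s.toInt is only evaluated when the regex actually matches.
  def parse(s: String): Either[Exception, Int] =
    Either.cond(
      s.matches("-?[0-9]+"),
      s.toInt,
      new NumberFormatException(s"${s} is not a valid integer.")
    )

  def reciprocal(i: Int): Either[Exception, Double] =
    if (i == 0) Left(new IllegalArgumentException("Cannot take reciprocal of 0."))
    else Right(1.0 / i)

  def stringify(d: Double): String = d.toString
}
```

Since Scala 2.12 the stdlib Either is right-biased, so map/flatMap chains work on it directly; cats mostly adds convenience syntax like Either.right/Either.left on top.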

java.lang.NumberFormatException: For input string: "505c621128f97f31c5870f2a9e2d274fa432bd0

Running sbt test, I got the following error message:
[error] java.lang.NumberFormatException: For input string: "505c621128f97f31c5870f2a9e2d274fa432bd0e"
[error] at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
[error] at java.lang.Long.parseLong(Long.java:589)
[error] at java.lang.Long.parseLong(Long.java:631)
[error] at scala.collection.immutable.StringLike.toLong(StringLike.scala:305)
[error] at scala.collection.immutable.StringLike.toLong$(StringLike.scala:305)
[error] at scala.collection.immutable.StringOps.toLong(StringOps.scala:29)
[error] at sbt.TestStatus$.$anonfun$read$1(TestStatusReporter.scala:42)
[error] at sbt.TestStatus$.$anonfun$read$1$adapted(TestStatusReporter.scala:42)
[error] at scala.collection.Iterator.foreach(Iterator.scala:937)
[error] at scala.collection.Iterator.foreach$(Iterator.scala:937)
[error] at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
[error] at sbt.TestStatus$.read(TestStatusReporter.scala:42)
[error] at sbt.TestStatusReporter.succeeded$lzycompute(TestStatusReporter.scala:20)
[error] at sbt.TestStatusReporter.succeeded(TestStatusReporter.scala:20)
[error] at sbt.TestStatusReporter.doComplete(TestStatusReporter.scala:31)
[error] at sbt.TestFramework$.$anonfun$createTestTasks$7(TestFramework.scala:240)
[error] at sbt.TestFramework$.$anonfun$createTestTasks$7$adapted(TestFramework.scala:240)
[error] at sbt.TestFramework$.$anonfun$safeForeach$1(TestFramework.scala:150)
[error] at sbt.TestFramework$.$anonfun$safeForeach$1$adapted(TestFramework.scala:149)
[error] at scala.collection.Iterator.foreach(Iterator.scala:937)
[error] at scala.collection.Iterator.foreach$(Iterator.scala:937)
[error] at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
[error] at scala.collection.IterableLike.foreach(IterableLike.scala:70)
[error] at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
[error] at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
[error] at sbt.TestFramework$.safeForeach(TestFramework.scala:149)
[error] at sbt.TestFramework$.$anonfun$createTestTasks$1(TestFramework.scala:226)
[error] at sbt.Tests$.$anonfun$testTask$1(Tests.scala:231)
[error] at sbt.Tests$.$anonfun$testTask$1$adapted(Tests.scala:231)
[error] at sbt.std.TaskExtra$$anon$1.$anonfun$fork$2(TaskExtra.scala:110)
[error] at sbt.std.Transform$$anon$3.$anonfun$apply$2(System.scala:46)
[error] at sbt.std.Transform$$anon$4.work(System.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:269)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:278)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:269)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error] at java.lang.Thread.run(Thread.java:748)
[error] java.lang.NumberFormatException: For input string: "505c621128f97f31c5870f2a9e2d274fa432bd0e"
[info] ScalaTest
[info] Run completed in 522 milliseconds.
[info] Total number of tests run: 3
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 3, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 3, Failed 0, Errors 0, Passed 3
As you can see, all tests passed. What is wrong? Hint: I am using IntelliJ.
Update
Here is the code:
import atto._
import Atto._
import cats._
import cats.implicits._

sealed trait PcpPair
case class PcpHead(key: String, value: String) extends PcpPair
case class PcpFieldValue(field: String, value: String) extends PcpPair
case class Pcp(head: List[PcpHead], fv: List[PcpFieldValue], body: String)

object PcpProtocol {
  implicit val pcpProtocol: Protocol[Pcp] = new Protocol[Pcp] {
    override def encode(text: String): ProtocolResult[Pcp] =
      doc
        .parseOnly(text)
        .either
        .flatMap { t =>
          isValidPcp(t._1) match {
            case true => Right(t)
            case false => Left("It is not a valid PCP protocol.")
          }
        }
        .map(t => Pcp(filterPcpHead(t._1), filterFieldValue(t._1), t._2))

    override def decode(msg: Pcp): String = ???
  }

  private val PcpValidity = "pcp-"
  private val key = stringOf(letter | char('-'))
  private val value = stringOf(notChar('\n'))
  private val kv = (key <~ char(':')) ~ value
  private val kvs = sepBy(kv, char('\n'))
  private val doc = (kvs <~ string("\n\n")) ~ takeText

  private val isValidPcp: List[(String, String)] => Boolean = textList =>
    textList
      .map(kv => kv._1.startsWith(PcpValidity))
      .foldLeft(true)(_ || _)

  private val filterPcpHead: List[(String, String)] => List[PcpHead] = textList =>
    textList
      .filter(text => text._1.contains(PcpValidity))
      .map(text => PcpHead(text._1, text._2))

  private val filterFieldValue: List[(String, String)] => List[PcpFieldValue] = textList =>
    textList
      .filter(text => !text._1.contains(PcpValidity))
      .map(text => PcpFieldValue(text._1, text._2))

  private val decodeHead: List[PcpHead] => String = heads =>
    heads.foldLeft("") { (acc, value) =>
      acc |+| value.key |+| ":" |+| value.value |+| "\n"
    }

  private val decodePair: List[PcpPair] => ((String, PcpPair) => String) => String = pcpList => fnPcp =>
    pcpList.foldLeft("")(fnPcp)
}
and here is the test:
class PcpParserSpec extends FunSpec with Matchers {
  val valid =
    """pcp-action:MESSAGE
      |pcp-channel:apc\:///
      |pcp-body-type:text
      |PUBLICKEY:THISPK
      |TOPIC:SEND
      |
      |Hello Foo""".stripMargin

  describe("Receive message from SAP server") {
    it("should contains pcp-channel:apc") {
      Protocol.encode(valid) should be('right)
    }
    it("should be separated by head and body") {
      Protocol.encode(valid) match {
        case Right(value) => assert(value.head.length > 0)
        case Left(text) => assert(text.length > 0)
      }
    }
    describe("The body of the message") {
      it("should contains Hello Foo") {
        Protocol.encode(valid) match {
          case Right(value) => assert(value.body == "Hello Foo")
          case Left(text) => assert(text.length > 0)
        }
      }
    }
  }

  describe("Send message to SAP") {
    it("should encode appropriate PCP protocol") {
    }
  }
}
What you'll probably find is that target/streams/test/test/$global/streams/succeeded_tests (in your project directory) is getting garbled. (Mine was an encoded mess.)
You should be getting something like
#Successful Tests
#Thu Apr 04 07:54:33 NZDT 2019
MyTest=1554317673393
The number it's trying to read is after the =.
The confusing bit (to me) is that it's outputting the summary - mine didn't when it failed parsing.
Before you run the following, could you share (some of) the contents of succeeded_tests (above) in a comment here (I'm curious to see these failures).
sbt clean test should fix you up...
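To see why a garbled file produces exactly this exception, here is a small sketch. The file format - one TestName=timestamp line per suite - is taken from the sample above; the readEntry helper is hypothetical, mimicking sbt's parsing rather than quoting its actual code:

```scala
import scala.util.Try

// Each well-formed line is "<test class name>=<millis timestamp>"; sbt splits
// at '=' and calls .toLong on the second half.
def readEntry(line: String): (String, Long) = {
  val Array(name, stamp) = line.split("=", 2)
  (name, stamp.toLong) // throws NumberFormatException on non-numeric input
}

val good = readEntry("MyTest=1554317673393")

// A garbled line carrying a hex blob (like the hash in the error above) has
// no numeric timestamp, so the same .toLong call throws:
val bad = Try(readEntry("Garbled=505c621128f97f31c5870f2a9e2d274fa432bd0e"))
```

Deleting the garbled file (which sbt clean does, along with the rest of target/) lets sbt regenerate it from scratch.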

How do I use the mapWithState function correctly

I have a Spark Streaming job; the code is below:
val filterActions = userActions.filter(Utils.filterPageType)
val parseAction = filterActions.flatMap(record => ParseOperation.parseMatch(categoryMap, record))
val finalActions = parseAction.filter(record => record.get("invalid") == None)
val userModels = finalActions.map(record => (record("deviceid"), record)).mapWithState(StateSpec.function(stateUpdateFunction))
All of it compiles smoothly except for the mapWithState call. The return type of ParseOperation.parseMatch(categoryMap, record) is ListBuffer[Map[String, Any]]. The error looks like this:
[INFO] Compiling 9 source files to /Users/spare/project/campaign-project/stream-official-mall/target/classes at 1530404002409
[ERROR] /Users/spare/project/campaign-project/stream-official-mall/src/main/scala/com/shopee/mall/data/OfficialMallTracker.scala:77: error: overloaded method value function with alternatives:
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: org.apache.spark.api.java.function.Function3[KeyType,org.apache.spark.api.java.Optional[ValueType],org.apache.spark.streaming.State[StateType],MappedType])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: org.apache.spark.api.java.function.Function4[org.apache.spark.streaming.Time,KeyType,org.apache.spark.api.java.Optional[ValueType],org.apache.spark.streaming.State[StateType],org.apache.spark.api.java.Optional[MappedType]])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: (KeyType, Option[ValueType], org.apache.spark.streaming.State[StateType]) => MappedType)org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: (org.apache.spark.streaming.Time, KeyType, Option[ValueType], org.apache.spark.streaming.State[StateType]) => Option[MappedType])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType]
[ERROR] cannot be applied to ((Any, Map[String,Any], org.apache.spark.streaming.State[Map[String,Any]]) => Some[Map[String,Any]])
[ERROR] val userModels = finalActions.map(record => (record("deviceid"), record)).mapWithState(StateSpec.function(stateUpdateFunction))
[ERROR] ^
[ERROR] one error found
What caused the issue? How do I modify the code?
I fixed it. The cause was that StateSpec.function(stateUpdateFunction) requires its input parameter type to be Map[String, Any], so before calling it I applied a map function; the code is below:
val finalActions = parseAction.filter(record => record.get("invalid") == None).map(Utils.parseFinalRecord)
val parseFinalRecord = (record: Map[String, Any]) => {
  val recordMap = collection.mutable.Map(record.toSeq: _*)
  logger.info(s"recordMap: ${recordMap}")
  recordMap.toMap
}
it works!
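As a side note on where the Any in the error message comes from: Map[String, Any]#apply returns Any, so keying the stream with record("deviceid") gives the pair the static type (Any, Map[String, Any]), which is exactly the key type shown in the overload-resolution failure. A minimal stdlib sketch (the record contents are made-up sample data, no Spark involved):

```scala
// A record as produced upstream; values are made-up sample data.
val record: Map[String, Any] = Map("deviceid" -> "d42", "page" -> "home")

// Map[String, Any]#apply returns Any, so the keyed pair's static type is
// (Any, Map[String, Any]) -- matching the Any in the error message:
val pair: (Any, Map[String, Any]) = (record("deviceid"), record)

// Recovering a concrete key type (here via a cast) gives type inference a
// usable KeyType again:
val keyed: (String, Map[String, Any]) = (record("deviceid").asInstanceOf[String], record)
```

Having the state function also return the mapped value directly (rather than Some(...)) matches the (KeyType, Option[ValueType], State[StateType]) => MappedType overload listed in the error.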

Weird type mismatch error in Scala

Already asked on scala-user; didn't get an answer.
I expect the code below to compile:
trait Elems {
  trait Dummy

  abstract class Elem[A] extends Serializable with Dummy
  class BaseElem[A] extends Elem[A]

  implicit val BooleanElement: Elem[Boolean] = new BaseElem[Boolean]
  implicit val ByteElement: Elem[Byte] = new BaseElem[Byte]
  implicit val ShortElement: Elem[Short] = new BaseElem[Short]
  implicit val IntElement: Elem[Int] = new BaseElem[Int]
  implicit val LongElement: Elem[Long] = new BaseElem[Long]
  implicit val FloatElement: Elem[Float] = new BaseElem[Float]
  implicit val DoubleElement: Elem[Double] = new BaseElem[Double]
  implicit val UnitElement: Elem[Unit] = new BaseElem[Unit]
  implicit val StringElement: Elem[String] = new BaseElem[String]
  implicit val CharElement: Elem[Char] = new BaseElem[Char]
}

trait GoodMatch { self: Elems =>
  private def boxed_class(e: Elem[_]): Class[_] = e match {
    case BooleanElement => classOf[java.lang.Boolean]
    case ByteElement => classOf[java.lang.Byte]
    case ShortElement => classOf[java.lang.Short]
    case IntElement => classOf[java.lang.Integer]
    case LongElement => classOf[java.lang.Long]
    case FloatElement => classOf[java.lang.Float]
    case DoubleElement => classOf[java.lang.Double]
    case CharElement => classOf[java.lang.Character]
    case _ => ???
  }
}

abstract class BadMatch[+A <: Elems](scalan: A) {
  import scalan._

  protected def toLuaValue(x: Any, eX: Elem[_]): String = eX match {
    case UnitElement => ""
    case _ => ???
  }

  // should check type before conversion?
  protected def fromLuaValue[B](lv: Any, eA: Elem[B]): B = (eA match {
    case UnitElement => ()
  }).asInstanceOf[B]
}
And GoodMatch does, but BadMatch fails (in Scala 2.11.8):
[error] /tmp/rendererqu0xjasKpX/src/main/scala/test.scala:48: type mismatch;
[error] found : BadMatch.this.scalan.Elem[Unit]
[error] required: BadMatch.this.scalan.Elem[_$3] where type _$3
[error] case UnitElement => ""
[error] ^
[error] /tmp/rendererqu0xjasKpX/src/main/scala/test.scala:63: type mismatch;
[error] found : BadMatch.this.scalan.Elem[Unit]
[error] required: BadMatch.this.scalan.Elem[B]
[error] case UnitElement => ()
[error] ^
Removing with Dummy makes BadMatch compile as well.
Is this a Scala bug? If so, is it a known one?
Yes, it's a Scala compiler bug: https://issues.scala-lang.org/browse/SI-9779.