import scala.reflect.runtime.universe
import universe._

object Reflects {
  def mirror() = universe.runtimeMirror(getClass.getClassLoader)

  def caseFields(x: AnyRef) = {
    val instanceMirror = mirror().reflect(x)
    instanceMirror.symbol.typeSignature.members.collect {
      case m: MethodSymbol if m.isCaseAccessor =>
        m.name.toString -> instanceMirror.reflectMethod(m).apply()
    }
  }
}
I define an object Reflects as above. When I invoke the caseFields method from another class, it sometimes throws the following exception:
java.lang.UnsupportedOperationException: tail of empty list
at scala.collection.immutable.Nil$.tail(List.scala:339) ~[scala-library.jar:na]
at scala.collection.immutable.Nil$.tail(List.scala:334) ~[scala-library.jar:na]
at scala.reflect.internal.SymbolTable.popPhase(SymbolTable.scala:172) ~[scala-reflect.jar:na]
And other strange exceptions.
What's wrong with this method?
In 2.10.3 (and probably in 2.10.4, because it doesn't look like we're going to have time to backport the fix from 2.11.0-M7), runtime reflection isn't thread-safe: http://docs.scala-lang.org/overviews/reflection/thread-safety.html. Your stack trace is one of the multitude of possible manifestations of the problem.
Bad news is that in 2.10.x there's no workaround for the thread-unsafety issue apart from putting all reflective operations in a synchronized block. Good news is that in 2.11.0 the problem should be gone.
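For illustration, here is a minimal sketch of that workaround applied to the Reflects object above. The ReflectionLock object is a name invented for this example; any shared lock works, as long as every reflective call in the process goes through it.

import scala.reflect.runtime.universe
import universe._

object ReflectionLock

object Reflects {
  def mirror() = universe.runtimeMirror(getClass.getClassLoader)

  // 2.10.x runtime reflection is not thread-safe, so serialize every
  // reflective operation through a single global lock.
  def caseFields(x: AnyRef) = ReflectionLock.synchronized {
    val instanceMirror = mirror().reflect(x)
    instanceMirror.symbol.typeSignature.members.collect {
      case m: MethodSymbol if m.isCaseAccessor =>
        m.name.toString -> instanceMirror.reflectMethod(m).apply()
    }
  }
}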
I can't comprehend why the compiler allows the code below.
I use phantom types to protect access to methods: only in specific "states" should a method be allowed to be called.
In most scenarios this invariant is indeed enforced at compile time. Sometimes, however, the compiler just ignores the constraint imposed by the phantom type.
This feels like a major bug. What am I not understanding?
I tried to simplify the problem as much as possible; my real use case is more complex:
class Door[State <: DoorState] private {
  def doorKey: Int = 1
  def open[Phantom >: State <: Closed.type]: Door[Open.type] = new Door[Open.type]
  def close[Phantom >: State <: Open.type]: Door[Closed.type] = new Door[Closed.type]
}

object Door {
  def applyOpenDoor: Door[Open.type] = new Door[Open.type]
  def applyClosedDoor: Door[Closed.type] = new Door[Closed.type]
}

sealed trait DoorState
case object Closed extends DoorState
case object Open extends DoorState
Then
val aClosedDoor = Door.applyClosedDoor
val res1 = aClosedDoor.close // does not compile. Good!
val res2 = aClosedDoor.open.close.close // does not compile. Good!
println(aClosedDoor.open.close.close) // does not compile. Good!
println(aClosedDoor.open.close.close.doorKey) // does not compile. Good!
aClosedDoor.open.close.close.doorKey == 1 // does not compile. Good!
println(aClosedDoor.open.close.close.doorKey == 1) // compiles! WTF?
As you can see above, the client can close a closed door. In my library, the corresponding behaviour is to throw a runtime exception. (I was confident that exception was well protected and impossible to reach.)
I have only been able to replicate this problem with expressions that return Boolean and are passed as an argument to a function (println in the example).
I am not looking for alternative solutions so much as for an explanation of how this can happen. Don't you agree this is a considerable flaw? Or am I missing something?
Scala version: 2.13.5
Edit
After a discussion on Gitter, I opened bug report # https://github.com/scala/bug
The problem does not seem to occur in Scala 3.
It seems to be a bug related to the use of a nullary method:
def doorKey: Int = 1
If it is instead defined as a nilary method:
def doorKey(): Int = 1
then it works as expected:
println(aClosedDoor.open.close.close.doorKey() == 1) // compiler error
I'm using Spring Boot 2.0.0.M7 and Project Reactor. My issue relates to some strange behavior encountered while writing a unit test. I also ran into it while trying to feed the output of flatMap into the repository.
Mono<Foo> create(Mono<FooResource> resourceMono) {
    resourceMono.flatMap({
        // convert resource into Foo domain Entity
        return new Foo()
    })
}
This closure should emit a Mono<Foo> due to the return value of flatMap. However, when calling block() to subscribe and get the resulting Foo, there is a ClassCastException in FluxFlatMap.trySubscribeScalarMap.
Test code:
def createdFoo = Foo.create(Mono.just(fooResource)).block()
Stack Trace:
java.lang.ClassCastException: com.example.Foo cannot be cast to org.reactivestreams.Publisher
at reactor.core.publisher.FluxFlatMap.trySubscribeScalarMap(FluxFlatMap.java:141)
at reactor.core.publisher.MonoFlatMap.subscribe(MonoFlatMap.java:53)
at reactor.core.publisher.Mono.block(Mono.java:1161)
This appears to occur because MonoJust implements Callable, so trySubscribeScalarMap tries to unwrap it unsuccessfully.
In a non-test scenario, a similar error occurs:
Mono<ServerResponse> createFoo(ServerRequest request) {
    def body = request.body(BodyExtractors.toMono(FooResource))
    ok().body(fromPublisher(Foo.create(body)
        .flatMap({ fooRepository.save(it) }), Foo))
}
Stack trace:
java.lang.ClassCastException: com.example.Foo cannot be cast to reactor.core.publisher.Mono
at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:118) [reactor-core-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at reactor.core.publisher.FluxOnAssembly$OnAssemblySubscriber.onNext(FluxOnAssembly.java:450) ~[reactor-core-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1092) ~[reactor-core-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at reactor.core.publisher.MonoSingle$SingleSubscriber.onComplete(MonoSingle.java:171) ~[reactor-core-3.1.2.RELEASE.jar:3.1.2.RELEASE]
<...>
Assembly trace from producer [reactor.core.publisher.MonoFlatMap] :
reactor.core.publisher.Mono.flatMap(Mono.java:2059)
org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoCachedMethodSiteNoUnwrap.invoke(PojoMetaMethodSite.java:213)
org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:56)
org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
com.example.Foo.create(Foo.groovy:28)
org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
com.example.FooHandlerFunctions.createFoo(FooHandlerFunctions.groovy:48)
Wrapping the output of the flatMap closure in another Mono.just(foo) solves both issues. However, it seems like that shouldn't be needed. Am I doing something wrong or just misunderstanding how flatMap works here?
flatMap takes a Function that should return a Mono (I guess Groovy lets you return the wrong type, since Foo doesn't seem to implement Publisher?).
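For illustration, here is a minimal sketch of the corrected method, written in Scala syntax (Foo and FooResource are stubbed stand-ins for the question's own types): the function passed to flatMap must return a Mono, so the new Foo is wrapped in Mono.just.

import reactor.core.publisher.Mono

// stand-ins for the question's types, stubbed so the sketch compiles
class FooResource
class Foo

def create(resourceMono: Mono[FooResource]): Mono[Foo] =
  resourceMono.flatMap[Foo] { resource =>
    // convert the resource into a Foo domain entity, then wrap it:
    // the mapper must return a Publisher, not a bare Foo
    Mono.just(new Foo())
  }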
I tried the following code:
class C(val g: Int => Int)

object C {
  object A extends {
    var f: Int => Int = x => x
  } with C(x => f(x) + 1)

  def main(args: Array[String]): Unit = {
    println(A.g(3))
  }
}
It compiles (with Scala 2.12.2), but throws an exception at runtime:
Exception in thread "main" java.lang.ExceptionInInitializerError
at pkg1.C$.main(C.scala:14)
at pkg1.C.main(C.scala)
Caused by: java.lang.ClassCastException: scala.runtime.ObjectRef cannot be cast to scala.Function1
at pkg1.C$A$.<init>(C.scala:10)
at pkg1.C$A$.<clinit>(C.scala)
... 2 more
Why does this happen?
This is probably a scalac bug (or an unintended interaction between early initialization of variables and the way lambdas reference captured values) caused by the switch to method handles for functions starting with 2.12:
Scala and Java 8 interop is also improved for functional code, as methods that take functions can easily be called in both directions using lambda syntax. The FunctionN classes in Scala’s standard library are now Single Abstract Method (SAM) types, and all SAM types are treated uniformly – from type checking through code generation. No class file is generated for a lambda; invokedynamic is used instead.
Edit: I could not find a good workaround for this problem besides changing the var to a val, but I guess that is not possible in your use case.
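For illustration, a minimal sketch of that workaround, changing only the var to a val:

class C(val g: Int => Int)

object C {
  object A extends {
    val f: Int => Int = x => x // val instead of var avoids the ObjectRef cast
  } with C(x => f(x) + 1)

  def main(args: Array[String]): Unit = {
    println(A.g(3)) // prints 4
  }
}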
I am writing a Scala macro (Scala 2.11) where I'd like to obtain the tree representing an implicit variable inside the macro using inferImplicitValue, evaluate that syntax tree, and use the value. I have actually done this, but it doesn't seem to work in all circumstances[1]. I constructed a simplified example where it fails.
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

// a class for implicit evidence
class DemoEvidence(val value: Int)

// define 'foo' method for invoking the macro
object demoModule {
  def foo: Int = macro DemoMacros.fooImpl
}

class DemoMacros(val c: whitebox.Context) {
  import c.universe._

  def fooImpl: Tree = {
    val vInt = try {
      // get the tree representing the implicit value
      val impl = c.inferImplicitValue(typeOf[DemoEvidence], silent = false)
      // print it out
      println(s"impl= $impl")
      // try to evaluate the tree (this is failing)
      val eval = c.eval(c.Expr[DemoEvidence](c.untypecheck(impl.duplicate)))
      eval.value
    } catch {
      case e: Throwable =>
        // on failure print out the failure message and stack trace
        println(s"Eval failed with: $e")
        e.printStackTrace()
        0
    }
    q"$vInt" // return tree representing the integer value
  }
}
If I compile the above, and then invoke it:
object demo {
  implicit val demoEvidence: DemoEvidence = new DemoEvidence(42)
  val i: Int = demoModule.foo
}
I see the compilation fail in the following way:
impl= demo.this.demoEvidence
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl$ToolBoxGlobal$$anonfun$compile$1.apply(ToolBoxFactory.scala:275)
...
Full output at:
https://gist.github.com/erikerlandson/df48f64329be6ab9de9caef5f5be4a83
So, you can see it is finding the tree for the declared implicit value demo.this.demoEvidence, but evaluation of that tree is failing. I have seen this basic approach work elsewhere in my project. Not sure what the difference is, and why it fails here.
[1] UPDATE: If the implicit value is defined in a (sub)project, compiled, and then used outside that project, it works as expected. That is the case where this approach is working for me.
So the question is whether that's just a fundamental constraint I have to live with, or if there is some clever workaround, or if this is a "bug" with inferring implicit values inside macros that might be fixed.
UPDATE: I filed a Scala issue for this: https://github.com/scala/scala-dev/issues/353
From the look of the stack trace, the eval is expecting object demo to exist in classfile form for execution, which makes sense given that the value you're trying to compute depends on val demoEvidence which is a member of object demo.
But the eval is happening during the typechecking of object demo so the classfile doesn't exist yet, hence the error. In the version with the implicit value defined in a subproject I imagine the subproject is compiled first, hence the classfiles required for the eval exist and so evaluation proceeds as you were expecting.
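For illustration, a sketch of that working arrangement as an sbt build (the project and directory names are hypothetical): the subproject defining the implicit value is compiled before the project that expands the macro, so the classfiles the eval needs already exist.

// build.sbt
lazy val evidence = (project in file("evidence"))
  // contains: implicit val demoEvidence: DemoEvidence = new DemoEvidence(42)

lazy val core = (project in file("core"))
  .dependsOn(evidence) // compiled after evidence, so c.eval can load its classfiles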
Jerkson started throwing a really strange error that I haven't seen before.
com.fasterxml.jackson.databind.JsonMappingException: No serializer found for class scala.runtime.BoxedUnit and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationConfig.SerializationFeature.FAIL_ON_EMPTY_BEANS) ) (through reference chain: scala.collection.MapWrapper["data"])
I'm parsing some basic data from an API. The class I've defined is:
case class Segmentation(
  @(JsonProperty @field)("legend_size")
  val legend_size: Int,
  @(JsonProperty @field)("data")
  val data: Data
)
and Data looks like:
case class Data(
  @(JsonProperty @field)("series")
  val series: List[String],
  @(JsonProperty @field)("values")
  val values: Map[String, Map[String, Any]]
)
Any clue why this would be triggering errors? It seems like a simple class that Jerkson can handle.
Edit: sample data:
{"legend_size": 1, "data": {"series": ["2013-04-06", "2013-04-07", "2013-04-08", "2013-04-09", "2013-04-10", "2013-04-11", "2013-04-12", "2013-04-13", "2013-04-14", "2013-04-15"], "values": {"datapoint": {"2013-04-12": 0, "2013-04-15": 4, "2013-04-14": 0, "2013-04-08":
0, "2013-04-09": 0, "2013-04-11": 0, "2013-04-10": 0, "2013-04-13": 0, "2013-04-06": 0, "2013-04-07": 0}}}}
This isn't the answer to the above example, but I'm going to offer it because it was the answer to my similar "BoxedUnit" scenario:
No serializer found for class scala.runtime.BoxedUnit and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS)
In my case Jackson was complaining about serializing an instance of a scala.runtime.BoxedUnit object.
Q: So what is scala.runtime.BoxedUnit?
A: It's the Java representation of Scala's Unit. The core part of Jackson (which is Java code) was attempting to serialize the Java representation of the Scala Unit non-entity.
Q: So why was this happening?
A: In my case it was a downstream side effect of a buggy method with an undeclared return type. The method in question wrapped a match clause that (unintentionally) didn't return a value for each case. Because of that, the compiler inferred the type of the var capturing the method's result as Unit. Later on, when that var was serialized to JSON, the Jackson error occurred.
So if you are getting an issue like this, my advice would be to examine any implicitly typed vars and any methods with undeclared return types, and make sure they are doing what you think they are doing.
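For illustration, a hypothetical reconstruction of that kind of bug (the names are invented): no case of the match yields a value, so the inferred result type of the method is Unit, and the variable capturing it holds a BoxedUnit by the time it is serialized.

// every case ends in a statement rather than a value, so the
// compiler infers the return type of record as Unit
def record(n: Int) = n match {
  case 0 => println("zero")
  case _ => println("non-zero")
}

val result = record(5) // result: Unit -- serializing it produces BoxedUnit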
I had the same exception. What caused it in my case was that I defined an apply method in the companion object without '=':
object Somthing {
  def apply(s: SomthingElse) {
    ...
  }
}
instead of
object Somthing {
  def apply(s: SomthingElse) = {
    ...
  }
}
That caused the apply method's return type to be Unit, which caused the exception when I passed the object to Jackson.
Not sure if that is the case in your code or if this question is still relevant, but this might help others with this kind of problem.
It's been a while since I first posted this question. The solution, as of writing this answer, appears to be moving on from Jerkson and using jackson-module-scala or Json4s with the Jackson backend. Many Scala types are included in the default serializers and are handled natively.
In addition, the reason I'm seeing BoxedUnit is that the type Jerkson was seeing was Any (as part of Map[String, Map[String, Any]]). Any is a base type and doesn't give Jerkson/Jackson any information about what it's serializing, so it complains about a missing serializer.
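For illustration, a minimal sketch of the jackson-module-scala route (assuming the jackson-module-scala dependency is on the classpath, and using the Segmentation and Data case classes from above without the per-field annotations):

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

val mapper = new ObjectMapper()
mapper.registerModule(DefaultScalaModule)

// a trimmed version of the sample payload from the question
val jsonString = """{"legend_size": 1, "data": {"series": ["2013-04-06"], "values": {"datapoint": {"2013-04-06": 0}}}}"""

// Scala collections and case classes round-trip without extra annotations
val seg = mapper.readValue(jsonString, classOf[Segmentation])
val json = mapper.writeValueAsString(seg)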