Scala / Play 2.5: Overloaded method apply with alternatives

I am using Scala & Play 2.5. I am stuck with this error:
Game.scala:99: overloaded method value apply with alternatives:
[error] (block: => play.api.mvc.Result)play.api.mvc.Action[play.api.mvc.AnyContent] <and>
[error] (block: play.api.mvc.Request[play.api.mvc.AnyContent] => play.api.mvc.Result)play.api.mvc.Action[play.api.mvc.AnyContent] <and>
[error] [A](bodyParser: play.api.mvc.BodyParser[A])(block: play.api.mvc.Request[A] => play.api.mvc.Result)play.api.mvc.Action[A]
[error] cannot be applied to (Object)
[error] def start(id: String, apiKey: Option[String]) = Action {
This is the function:
def start(id: String, apiKey: Option[String]) = Action {
  apiKey match {
    case Some(API_KEY) => {
      Server.actor ! Server.Start(id)
      Ok("Started")
    }
    case _ => Future.successful(Unauthorized)
  }
}

The problem is that the result of the match expression is inferred to be Object: one case returns a Result while the other returns a Future[Result], so the only common supertype is Object. To fix it, change case _ => Future.successful(Unauthorized) to case _ => Unauthorized.
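Both branches just need to agree on the type. A minimal sketch of the two options, assuming the standard Play 2.5 Action / Action.async API (API_KEY and Server come from the original code):

def start(id: String, apiKey: Option[String]) = Action {
  apiKey match {
    case Some(API_KEY) =>
      Server.actor ! Server.Start(id)
      Ok("Started")
    case _ => Unauthorized // both branches now return a Result
  }
}

// or, if the result really needs to be produced asynchronously:
def start(id: String, apiKey: Option[String]) = Action.async {
  apiKey match {
    case Some(API_KEY) =>
      Server.actor ! Server.Start(id)
      Future.successful(Ok("Started"))
    case _ => Future.successful(Unauthorized) // both branches now return a Future[Result]
  }
}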

Related

How do I use the mapWithState function correctly

I have a Spark Streaming job; the code is below:
val filterActions = userActions.filter(Utils.filterPageType)
val parseAction = filterActions.flatMap(record => ParseOperation.parseMatch(categoryMap, record))
val finalActions = parseAction.filter(record => record.get("invalid") == None)
val userModels = finalActions.map(record => (record("deviceid"), record)).mapWithState(StateSpec.function(stateUpdateFunction))
Everything compiles except the mapWithState call. The return type of ParseOperation.parseMatch(categoryMap, record) is ListBuffer[Map[String, Any]]. The error is below:
[INFO] Compiling 9 source files to /Users/spare/project/campaign-project/stream-official-mall/target/classes at 1530404002409
[ERROR] /Users/spare/project/campaign-project/stream-official-mall/src/main/scala/com/shopee/mall/data/OfficialMallTracker.scala:77: error: overloaded method value function with alternatives:
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: org.apache.spark.api.java.function.Function3[KeyType,org.apache.spark.api.java.Optional[ValueType],org.apache.spark.streaming.State[StateType],MappedType])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: org.apache.spark.api.java.function.Function4[org.apache.spark.streaming.Time,KeyType,org.apache.spark.api.java.Optional[ValueType],org.apache.spark.streaming.State[StateType],org.apache.spark.api.java.Optional[MappedType]])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: (KeyType, Option[ValueType], org.apache.spark.streaming.State[StateType]) => MappedType)org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: (org.apache.spark.streaming.Time, KeyType, Option[ValueType], org.apache.spark.streaming.State[StateType]) => Option[MappedType])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType]
[ERROR] cannot be applied to ((Any, Map[String,Any], org.apache.spark.streaming.State[Map[String,Any]]) => Some[Map[String,Any]])
[ERROR] val userModels = finalActions.map(record => (record("deviceid"), record)).mapWithState(StateSpec.function(stateUpdateFunction))
[ERROR] ^
[ERROR] one error found
What caused the issue, and how should I modify the code?
I have fixed it. The cause was that StateSpec.function(stateUpdateFunction) required the input parameter type to be Map[String, Any], so before calling it I added a map step. The code is below:
val finalActions = parseAction.filter(record => record.get("invalid") == None).map(Utils.parseFinalRecord)
val parseFinalRecord = (record: Map[String, Any]) => {
  val recordMap = collection.mutable.Map(record.toSeq: _*)
  logger.info(s"recordMap: ${recordMap}")
  recordMap.toMap
}
it works!
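For reference, the Scala overload of StateSpec.function used here expects a function of shape (KeyType, Option[ValueType], State[StateType]) => MappedType, and the error shows the supplied function had an Any key type and a Some[...] result type instead. A minimal sketch of an update function with a matching signature (the String key, the key extraction, and the merge logic are assumptions for illustration, not the original stateUpdateFunction):

import org.apache.spark.streaming.{State, StateSpec}

def stateUpdateFunction(
    deviceId: String,
    newRecord: Option[Map[String, Any]],
    state: State[Map[String, Any]]): Map[String, Any] = {
  // merge the incoming record (if any) into whatever is already stored for this key
  val merged = state.getOption.getOrElse(Map.empty[String, Any]) ++ newRecord.getOrElse(Map.empty[String, Any])
  state.update(merged) // remember the merged record for the next batch
  merged               // the mapped value emitted downstream
}

val userModels = finalActions
  .map(record => (record("deviceid").toString, record)) // make the key a concrete String, not Any
  .mapWithState(StateSpec.function(stateUpdateFunction _))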

Type mismatch when the type is already specified in Scala

I am trying to use the code below in Scala, using GraphX:
val vertexRDD: RDD[(VertexId, String)] = graph.vertices.filter({
  case (id, (str)) => {
    val c: Boolean = scala.util.Try(str.toInt) match {
      case Success(_) => false
      case _ => true
    }
  }
})
This matches the official interface def filter(pred: Tuple2[VertexId, VD] => Boolean): VertexRDD[VD].
However, it throws a type mismatch error:
[error] found : Unit
[error] required: Boolean
[error] }
[error] ^
How can this be? I have already specified the return type to be Boolean, and it really is a Boolean, right?
The reason this fails is that the value of a block is the value of the last expression in the block, but unfortunately the last expression in your block is a declaration which has type Unit. To fix this you can just remove the declaration.
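A tiny illustration of that rule (not from the original post): a block that ends in a definition has type Unit, while a block that ends in an expression takes that expression's type.

val a = { val c: Boolean = true }    // a: Unit    -- the block ends in a declaration
val b = { val c: Boolean = true; c } // b: Boolean -- the block ends in an expression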
You can also simplify your code by using Try.isFailure (which preserves the original behaviour of keeping vertices whose label does not parse as an Int) and removing some unnecessary brackets:
val vertexRDD: RDD[(VertexId, String)] = graph.vertices.filter {
  case (_, str) =>
    scala.util.Try(str.toInt).isFailure
}

StackOverflow during typeprovider macro expansion

I was playing around with macros, and I thought writing a JSON type provider would be a good way to get a deeper understanding of how all this works, but I hit a weird error that I can't figure out myself. The code is available on GitHub if you want to look at the whole thing: https://github.com/qwe2/json-typeprovider/.
The problematic part: I tried to make it as typesafe as I could, meaning I wanted to implement JSON arrays in such a way that indexing into them returns the correct type (via a subsequent macro invocation). The relevant methods are below.
The JSON-to-Tree method:
def jsonToTpe(value: JValue): Option[Tree] = value match {
  case JNothing => None
  case JNull => None
  case JString(s) => Some(q"$s")
  case JDouble(d) => Some(q"$d")
  case JDecimal(d) => Some(q"scala.BigDecimal(${d.toString})")
  case JInt(i) => Some(q"scala.BigInt(${i.toByteArray})")
  case JLong(l) => Some(q"$l")
  case JBool(b) => Some(q"$b")
  case JArray(arr) =>
    val arrTree = arr.flatMap(jsonToTpe)
    val clsName = c.freshName[TypeName](TypeName("harraycls"))
    val hArray =
      q"""
      class $clsName {
        @_root_.com.example.json.provider.body(scala.Array[Any](..$arrTree))
        def apply(i: Int): Any = macro _root_.com.example.json.provider.DelegatedMacros.arrApply_impl
        @_root_.com.example.json.provider.body(scala.Array[Any](..$arrTree))
        def toArray: scala.Array[Any] = macro _root_.com.example.json.provider.DelegatedMacros.selectField_impl
      }
      new $clsName {}
      """
    Some(hArray)
  case JSet(set) => Some(q"scala.Set(..${set.flatMap(jsonToTpe)})")
  case JObject(fields) =>
    val fs = fields.flatMap { case (k, v) =>
      jsonToTpe(v).map(v => q"""
        @_root_.com.example.json.provider.body($v) def ${TermName(k)}: Any =
          macro _root_.com.example.json.provider.DelegatedMacros.selectField_impl""")
    }
    val clsName = c.freshName[TypeName](TypeName("jsoncls"))
    Some(q"""
      class $clsName {
        ..$fs
      }
      new $clsName {}
    """)
}
Reading the annotation:
class body(tree: Any) extends StaticAnnotation

def arrApply_impl(c: whitebox.Context)(i: c.Expr[Int]): c.Tree = {
  import c.universe._

  def bail(msg: String): Nothing = {
    c.abort(c.enclosingPosition, msg)
  }

  def error(msg: String): Unit = {
    c.error(c.enclosingPosition, msg)
  }

  val arrValue = selectField_impl(c)
  val arrElems = arrValue match {
    case q"scala.Array.apply[$tpe](..$elems)($cls)" => elems
    case _ => bail("arr needs to be an array of constants")
  }
  val idx = i.tree match {
    case Literal(Constant(ix: Int)) => ix
    case _ => bail(s"i needs to be a constant Int, got ${showRaw(i.tree)}")
  }
  arrElems(idx)
}

def selectField_impl(c: whitebox.Context): c.Tree = {
  c.macroApplication.symbol.annotations.filter(
    _.tree.tpe <:< c.typeOf[body]
  ).head.tree.children.tail.head
}
As you can see, the way I tried to do it was basically to shove the actual array into a static annotation and, when indexing into it, dispatch to another macro that can figure out the type. I got the idea from reading about vampire methods.
This is the json I'm trying to parse:
[
  {"id": 1},
  {"id": 2}
]
And this is how I invoke it:
val tpe3 = TypeProvider("arrayofobj.json")
println(tpe3.toArray.mkString(", "))
Reading an array of ints or an object of primitive fields works as expected, but an array of objects throws a StackOverflowError during compilation:
[error] /home/isti/projects/json-typeprovider/core/src/main/scala/com/example/Hello.scala:7:14: Internal error: unable to find the outer accessor symbol of object Hello
[error] object Hello extends App {
[error] ^
[error] ## Exception when compiling 1 sources to /home/isti/projects/json-typeprovider/core/target/scala-2.12/classes
[error] null
[error] java.lang.String.valueOf(String.java:2994)
[error] scala.collection.mutable.StringBuilder.append(StringBuilder.scala:200)
[error] scala.collection.TraversableOnce.$anonfun$addString$1(TraversableOnce.scala:359)
[error] scala.collection.immutable.List.foreach(List.scala:389)
[error] scala.collection.TraversableOnce.addString(TraversableOnce.scala:357)
[error] scala.collection.TraversableOnce.addString$(TraversableOnce.scala:353)
[error] scala.collection.AbstractTraversable.addString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:323)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:322)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:325)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:325)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:327)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:327)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] xsbt.DelegatingReporter$.makePosition$1(DelegatingReporter.scala:89)
[error] xsbt.DelegatingReporter$.convert(DelegatingReporter.scala:94)
[error] xsbt.DelegatingReporter.info0(DelegatingReporter.scala:125)
[error] xsbt.DelegatingReporter.info0(DelegatingReporter.scala:102)
[error] scala.reflect.internal.Reporter.error(Reporting.scala:84)
[error] scala.reflect.internal.Reporting.globalError(Reporting.scala:69)
[error] scala.reflect.internal.Reporting.globalError$(Reporting.scala:69)
[error] scala.reflect.internal.SymbolTable.globalError(SymbolTable.scala:16)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerSelect(ExplicitOuter.scala:235)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
Edit: that is only the top of the stack trace; it continues with many more repetitions of scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267).

Refactoring Scala to use search functions as arguments results in Option[Any] issue

Originally I had the following; it contains a lot of boilerplate:
private def getCollection(newState: Asset, currentState: Option[Asset]) =
  newState.assetGroup.flatMap(_.collection) match {
    case Some(collection) => Some(collection)
    case None => currentState match {
      case Some(state) => state.assetGroup.flatMap(_.collection)
      case None => None
    }
  }

private def getChildSource(newState: Asset, currentState: Option[Asset]) =
  newState.content.flatMap(_.contract.flatMap(_.childSource)) match {
    case Some(childSource) => Some(childSource)
    case None => currentState match {
      case Some(state) => state.content.flatMap(_.contract.flatMap(_.childSource))
      case None => None
    }
  }

private def getParentSource(newState: Asset, currentState: Option[Asset]) =
  newState.content.flatMap(_.contract.flatMap(_.parentSourceId)) match {
    case Some(childSource) => Some(childSource)
    case None => currentState match {
      case Some(state) => state.content.flatMap(_.contract.flatMap(_.parentSourceId))
      case None => None
    }
  }
So after some work I simplified it to the following:
private def getCurrentField[A](newState: Asset, currentState: Option[Asset], searchFunction: Asset => Option[A]): Option[A] =
  newState.content.flatMap(_.contract.flatMap(_.childSource)) orElse {
    currentState match {
      case Some(state) => searchFunction(state)
      case None => None
    }
  }

val getCollection: Asset => Option[Collection] = (state: Asset) => state.assetGroup.flatMap(_.collection)
val getChildSource: Asset => Option[String] = (state: Asset) => state.content.flatMap(_.contract.flatMap(_.childSource))
...but this gives me a compiler error:
[warn] <filename_removed>.scala:68: a type was inferred to be `Any`; this may indicate a programming error.
[warn] currentState match {
[warn] ^
[error] _:67: type mismatch;
[error] found : Option[Any]
[error] required: Option[A]
[error] newState.content.flatMap(_.contract.flatMap(_.childSource)) orElse {
[error] ^
[warn] one warning found
[error] one error found
If I remove the return type from getCurrentField, it compiles and the tests pass, but I still get that compiler warning: a type was inferred to be Any.
What's the best way to deal with type parameters in this situation?
It looks like _.childSource has its own concrete element type (a String, judging from getChildSource), while searchFunction returns Option[A]. So when _.contract and _.childSource are both defined (the flatMap returns Some), the first operand of orElse is that concrete Option, and in the fallback case the returned type is Option[A].
So the resulting return type of your function has to be Option[Any], because Any is the only common supertype of String and A, as far as the compiler can know.
In my refactored version, I had replaced one of the hard-coded lookups with searchFunction, but not the other. The following removes the warning, compiles, and passes the tests:
private def getCurrentField[A](newState: Asset, currentState: Option[Asset], searchFunction: Asset => Option[A]): Option[A] =
  searchFunction(newState) orElse {
    currentState map searchFunction getOrElse None
  }

Refactoring val to method results in compile-time error

I currently have
def list(node: NodeSeq): NodeSeq = {
  val all = Thing.findAll.flatMap({
    thing => bind("thing", chooseTemplate("thing", "entry", node),
      "desc" -> Text(thing.desc.is),
      "creator" -> thing.creatorName.getOrElse("UNKNOWN"),
      "delete" -> SHtml.link("/test", () => delete(thing), Text("delete"))
    )
  })
  all match {
    case Nil => <span>No things</span>
    case _ => <ol>{bind("thing", node, "entry" -> all)}</ol>
  }
}
and I tried to refactor it to
def listItemHelper(node: NodeSeq): List[NodeSeq] = {
  Thing.findAll.flatMap({
    thing => bind("thing", chooseTemplate("thing", "entry", node),
      "desc" -> Text(thing.desc.is),
      "creator" -> thing.creatorName.getOrElse("UNKNOWN"),
      "delete" -> SHtml.link("/test", () => delete(thing), Text("delete"))
    )
  })
}

def list(node: NodeSeq): NodeSeq = {
  val all = listItemHelper(node)
  all match {
    case Nil => <span>No things</span>
    case all: List[NodeSeq] => <ol>{bind("thing", node, "entry" -> all)}</ol>
    case _ => <span>wtf</span>
  }
}
but I get the following. I've traced all the return types and I don't see how my refactoring is any different than what would be happening internally. I even tried adding more match cases (as you can see in the refactored code) to make sure I was selecting the right type.
/Users/trenton/projects/sc2/supperclub/src/main/scala/com/runbam/snippet/Whyme.scala:37: error: overloaded method value -> with alternatives [T <: net.liftweb.util.Bindable](T with net.liftweb.util.Bindable)net.liftweb.util.Helpers.TheBindableBindParam[T] <and> (Boolean)net.liftweb.util.Helpers.BooleanBindParam <and> (Long)net.liftweb.util.Helpers.LongBindParam <and> (Int)net.liftweb.util.Helpers.IntBindParam <and> (Symbol)net.liftweb.util.Helpers.SymbolBindParam <and> (Option[scala.xml.NodeSeq])net.liftweb.util.Helpers.OptionBindParam <and> (net.liftweb.util.Box[scala.xml.NodeSeq])net.liftweb.util.Helpers.BoxBindParam <and> ((scala.xml.NodeSeq) => scala.xml.NodeSeq)net.liftweb.util.Helpers.FuncBindParam <and> (Seq[scala.xml.Node])net.liftweb.util.Helpers.TheBindParam <and> (scala.xml.Node)net.liftweb.util.Helpers.TheBindParam <and> (scala.xml.Text)net.liftweb.util.Helpers.TheBindParam <and> (scala.xml.NodeSeq)net.liftweb.util.Helpers.TheBindParam <and> (String)net.liftweb.util.Helpers.TheStrBindParam cannot be applied to (List[scala.xml.NodeSeq])
case all: List[NodeSeq] => <ol>{bind("thing", node, "entry" -> all)}</ol>
^
Here's how my brain parsed the error message...
error: overloaded method value ->
This is the name of the method, which is '->'.
with alternatives
What will follow is the list of possible parameters for -> within the bind() function.
[T <: net.liftweb.util.Bindable](T with net.liftweb.util.Bindable)net.liftweb.util.Helpers.TheBindableBindParam[T]
This says that anything which implements or includes the trait Bindable is fair game.
<and> (Boolean)net.liftweb.util.Helpers.BooleanBindParam
<and> (Long)net.liftweb.util.Helpers.LongBindParam
<and> (Int)net.liftweb.util.Helpers.IntBindParam
<and> (Symbol)net.liftweb.util.Helpers.SymbolBindParam
<and> (Option[scala.xml.NodeSeq])net.liftweb.util.Helpers.OptionBindParam
<and> (net.liftweb.util.Box[scala.xml.NodeSeq])net.liftweb.util.Helpers.BoxBindParam
Bunch of type-specific options.
<and> ((scala.xml.NodeSeq) => scala.xml.NodeSeq)net.liftweb.util.Helpers.FuncBindParam
<and> (Seq[scala.xml.Node])net.liftweb.util.Helpers.TheBindParam
<and> (scala.xml.Node)net.liftweb.util.Helpers.TheBindParam
<and> (scala.xml.Text)net.liftweb.util.Helpers.TheBindParam
<and> (scala.xml.NodeSeq)net.liftweb.util.Helpers.TheBindParam
<and> (String)net.liftweb.util.Helpers.TheStrBindParam
Ah! Node-related stuff. Our valid options seem to be NodeSeq, Seq[Node], Text, and Node
cannot be applied to (List[scala.xml.NodeSeq])
Looks like List[NodeSeq] is not a valid option.
With this in mind, you probably want to take an individual NodeSeq out of the List in order to bind it to the form. Are you sure you really want to return a List from the helper method?
I failed to see that NodeSeq extends Seq[Node], so I had the wrong return type on the extracted method. Changing it to
def listItemHelper(node: NodeSeq): NodeSeq = {
  Thing.findAll.flatMap({
    thing => bind("thing", chooseTemplate("thing", "entry", node),
      "desc" -> Text(thing.desc.is),
      "creator" -> thing.creatorName.getOrElse("UNKNOWN"),
      "delete" -> SHtml.link("/test", () => delete(thing), Text("delete"))
    )
  })
}

def list(node: NodeSeq): NodeSeq = {
  val all = listItemHelper(node)
  all.length match {
    case 0 => <span>No things</span>
    case _ => <ol>{bind("thing", node, "entry" -> all)}</ol>
  }
}
works.
One issue is that your match doesn't really make any sense: basically you are matching against either an empty list or a non-empty list. There is no other possibility:
all match {
  case Nil => // if the list is empty
  case nonEmptyList => // if the list is not empty
}
Of course you could also do:
case Nil =>
case x :: Nil => //list with only a head
case x :: xs => //head :: tail
As a side note, there's one thing in your code that doesn't work:
case all: List[NodeSeq]
Because of type erasure, there's no way to test, at runtime, whether all is a List[NodeSeq], a List[String], a List[AnyRef], or what have you. I'm pretty sure you must be getting a warning on that line and ignoring it because you don't understand what it is warning you about (at least, that's what happened to me when I got such a warning :). The correct line would be:
case all: List[_]
Which would accept any kind of List. Look up a question of mine on type erasure and Scala to see a bit more about it, if you are interested.
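A tiny illustration of what erasure does to such a pattern (not from the original answer):

val xs: Any = List(1, 2, 3)
xs match {
  case _: List[String] => println("matches!") // compiles with an "unchecked" warning and matches at runtime,
                                              // because only the List part can be checked after erasure
  case _ => println("no match")
}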