In Twitter's Scala school collections section, they show a Map with a partial function as a value:
// timesTwo() was defined earlier.
def timesTwo(i: Int): Int = i * 2
Map("timesTwo" -> timesTwo(_))
If I try to compile this with Scala 2.9.1 and sbt I get the following:
[error] ... missing parameter type for expanded function ((x$1) => "timesTwo".$minus$greater(timesTwo(x$1)))
[error] Map("timesTwo" -> timesTwo(_))
[error] ^
[error] one error found
If I add the parameter type:
Map("timesTwo" -> timesTwo(_: Int))
I then get the following compiler error:
[error] ... type mismatch;
[error] found : Int => (java.lang.String, Int)
[error] required: (?, ?)
[error] Map("timesTwo" -> timesTwo(_: Int))
[error] ^
[error] one error found
I'm stumped. What am I missing?
The compiler thinks you want to do this:
Map((x: Int) => "timesTwo" -> timesTwo(x))
When you want this:
Map("timesTwo" -> { (x: Int) => timesTwo(x) })
So either of these works:
Map( ("timesTwo", timesTwo(_)) )
Map("timesTwo" -> { timesTwo(_) })
Note this is not an unusual error; see
https://stackoverflow.com/a/7695459/257449.
Scala underscore - ERROR: missing parameter type for expanded function
(and probably more)
You need to tell scalac that you want to lift the method timesTwo into a function. This can be done with a trailing underscore, as follows:
scala> Map("timesTwo" -> timesTwo _)
res0: scala.collection.immutable.Map[java.lang.String,Int => Int] = Map(timesTwo -> <function1>)
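To confirm the lifted method is usable, a quick REPL check (a sketch; the exact type rendering varies by Scala version):
scala> val m = Map("timesTwo" -> (timesTwo _))
m: scala.collection.immutable.Map[java.lang.String,Int => Int] = Map(timesTwo -> <function1>)
scala> m("timesTwo")(21)
res1: Int = 42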
I am new to Apache Spark, and am not able to get this to work.
I have an RDD of the form (Int,(Int,Int)), and would like to sum up the first element of the value while appending the second element.
For example, I have the following RDD:
[(5,(1,0)), (5,(1,2)), (5,(1,5))]
And I want to be able to get something like this:
(5,3,(0,2,5))
I tried this:
sampleRdd.reduceByKey{case(a,(b,c)) => (a + b)}
But I get this error:
type mismatch;
[error] found : Int
[error] required: String
[error] .reduceByKey{case(a,(b,c)) => (a + b)}
[error] ^
How can I achieve this?
reduceByKey won't work here: its function must combine two values of type (Int, Int) into another (Int, Int), whereas you want to build up a result of a different type. aggregateByKey lets the accumulated type differ from the value type. Please try this:
// seqOp folds one value into the per-partition accumulator:
// sum the first elements, collect the second elements as strings.
def seqOp = (accumulator: (Int, List[String]), element: (Int, Int)) =>
  (accumulator._1 + element._1, accumulator._2 :+ element._2.toString)
// combOp merges two per-partition accumulators.
def combOp = (accumulator1: (Int, List[String]), accumulator2: (Int, List[String])) =>
  (accumulator1._1 + accumulator2._1, accumulator1._2 ::: accumulator2._2)
val zeroVal = (0, List.empty[String])
rdd.aggregateByKey(zeroVal)(seqOp, combOp).collect
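On the sample data this yields the summed first elements and the collected second elements per key. A minimal sketch (assuming a local SparkContext named sc and the definitions above):
val sampleRdd = sc.parallelize(Seq((5, (1, 0)), (5, (1, 2)), (5, (1, 5))))
sampleRdd.aggregateByKey(zeroVal)(seqOp, combOp).collect
// Array((5,(3,List(0, 2, 5))))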
I have a Spark Streaming job; the code is below:
val filterActions = userActions.filter(Utils.filterPageType)
val parseAction = filterActions.flatMap(record => ParseOperation.parseMatch(categoryMap, record))
val finalActions = parseAction.filter(record => record.get("invalid") == None)
val userModels = finalActions.map(record => (record("deviceid"), record)).mapWithState(StateSpec.function(stateUpdateFunction))
Everything compiles except the mapWithState call. The return type of ParseOperation.parseMatch(categoryMap, record) is ListBuffer[Map[String, Any]]. The error is below:
[INFO] Compiling 9 source files to /Users/spare/project/campaign-project/stream-official-mall/target/classes at 1530404002409
[ERROR] /Users/spare/project/campaign-project/stream-official-mall/src/main/scala/com/shopee/mall/data/OfficialMallTracker.scala:77: error: overloaded method value function with alternatives:
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: org.apache.spark.api.java.function.Function3[KeyType,org.apache.spark.api.java.Optional[ValueType],org.apache.spark.streaming.State[StateType],MappedType])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: org.apache.spark.api.java.function.Function4[org.apache.spark.streaming.Time,KeyType,org.apache.spark.api.java.Optional[ValueType],org.apache.spark.streaming.State[StateType],org.apache.spark.api.java.Optional[MappedType]])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: (KeyType, Option[ValueType], org.apache.spark.streaming.State[StateType]) => MappedType)org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType] <and>
[ERROR] [KeyType, ValueType, StateType, MappedType](mappingFunction: (org.apache.spark.streaming.Time, KeyType, Option[ValueType], org.apache.spark.streaming.State[StateType]) => Option[MappedType])org.apache.spark.streaming.StateSpec[KeyType,ValueType,StateType,MappedType]
[ERROR] cannot be applied to ((Any, Map[String,Any], org.apache.spark.streaming.State[Map[String,Any]]) => Some[Map[String,Any]])
[ERROR] val userModels = finalActions.map(record => (record("deviceid"), record)).mapWithState(StateSpec.function(stateUpdateFunction))
[ERROR] ^
[ERROR] one error found
What caused the issue? How do I modify the code?
I fixed it. The cause was that StateSpec.function(stateUpdateFunction) requires the input parameter type to be Map[String, Any], so before calling it I added a map step. The code is below:
val finalActions = parseAction.filter(record => record.get("invalid") == None).map(Utils.parseFinalRecord)
val parseFinalRecord = (record: Map[String, Any]) => {
val recordMap = collection.mutable.Map(record.toSeq: _*)
logger.info(s"recordMap: ${recordMap}")
recordMap.toMap
}
it works!
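For anyone hitting the same overload error: the plain Scala overload of StateSpec.function expects a mapping function of the shape (KeyType, Option[ValueType], State[StateType]) => MappedType. A sketch of a compatible stateUpdateFunction, with illustrative key and state types rather than the original ones:
import org.apache.spark.streaming.State

// Hypothetical types: String keys, Map[String, Any] values and state.
def stateUpdateFunction(
    deviceId: String,
    record: Option[Map[String, Any]],
    state: State[Map[String, Any]]): Map[String, Any] = {
  // merge the incoming record into whatever state we have so far
  val updated = state.getOption.getOrElse(Map.empty[String, Any]) ++ record.getOrElse(Map.empty)
  state.update(updated) // persist the merged map for the next batch
  updated
}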
I am trying to use the code below in Scala, using GraphX:
val vertexRDD: RDD[(VertexId, String)] = graph.vertices.filter({
case (id, (str)) => {
val c: Boolean = scala.util.Try(str.toInt) match {
case Success(_) => false
case _ => true
}
}
})
This function is used with the official interface def filter(pred: Tuple2[VertexId, VD] => Boolean): VertexRDD[VD].
However, it throws a type mismatch error:
[error] found : Unit
[error] required: Boolean
[error] }
[error] ^
How can that be? I have already specified the return type to be Boolean, and it really is Boolean, right?
The reason this fails is that the value of a block is the value of the last expression in the block, but unfortunately the last expression in your block is a declaration which has type Unit. To fix this you can just remove the declaration.
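A minimal illustration of the rule (not from the original post):
val a = { val c: Boolean = true } // a: Unit, because the block ends in a declaration
val b = { true }                  // b: Boolean, because the block ends in an expression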
You can also simplify your code by using Try.isFailure (your original logic keeps the vertices whose string does not parse as an Int) and removing some unnecessary brackets:
val vertexRDD: RDD[(VertexId, String)] = graph.vertices.filter {
  case (_, str) => scala.util.Try(str.toInt).isFailure
}
I was playing around a bit with macros and I thought writing a JSON type provider would be a good way to get a deeper understanding of how all this works, but I hit a weird error that I can't seem to figure out myself. The code is available on GitHub if you want to take a look at the whole thing: https://github.com/qwe2/json-typeprovider/.
The problematic part: I tried to make it as typesafe as I could, meaning I wanted to implement JSON arrays in such a way that indexing into them returns the correct type (as a subsequent macro invocation). The relevant methods:
The JSON-to-Tree method:
def jsonToTpe(value: JValue): Option[Tree] = value match {
case JNothing => None
case JNull => None
case JString(s) => Some(q"$s")
case JDouble(d) => Some(q"$d")
case JDecimal(d) => Some(q"scala.BigDecimal(${d.toString})")
case JInt(i) => Some(q"scala.BigInt(${i.toByteArray})")
case JLong(l) => Some(q"$l")
case JBool(b) => Some(q"$b")
case JArray(arr) =>
val arrTree = arr.flatMap(jsonToTpe)
val clsName = c.freshName[TypeName](TypeName("harraycls"))
val hArray =
q"""
class $clsName {
@_root_.com.example.json.provider.body(scala.Array[Any](..$arrTree))
def apply(i: Int): Any = macro _root_.com.example.json.provider.DelegatedMacros.arrApply_impl
@_root_.com.example.json.provider.body(scala.Array[Any](..$arrTree))
def toArray: scala.Array[Any] = macro _root_.com.example.json.provider.DelegatedMacros.selectField_impl
}
new $clsName {}
"""
Some(hArray)
case JSet(set) => Some(q"scala.Set(..${set.flatMap(jsonToTpe)})")
case JObject(fields) =>
val fs = fields.flatMap { case (k, v) =>
jsonToTpe(v).map(v => q"""
@_root_.com.example.json.provider.body($v) def ${TermName(k)}: Any =
macro _root_.com.example.json.provider.DelegatedMacros.selectField_impl""")
}
val clsName = c.freshName[TypeName](TypeName("jsoncls"))
Some(q"""
class $clsName {
..$fs
}
new $clsName {}
""")
}
Reading the annotation:
class body(tree: Any) extends StaticAnnotation
def arrApply_impl(c: whitebox.Context)(i: c.Expr[Int]): c.Tree = {
import c.universe._
def bail(msg: String): Nothing = {
c.abort(c.enclosingPosition, msg)
}
def error(msg: String): Unit = {
c.error(c.enclosingPosition, msg)
}
val arrValue = selectField_impl(c)
val arrElems = arrValue match {
case q"scala.Array.apply[$tpe](..$elems)($cls)" => elems
case _ => bail("arr needs to be an array of constants")
}
val idx = i.tree match {
case Literal(Constant(ix: Int)) => ix
case _ => bail(s"i needs to be a constant Int, got ${showRaw(i.tree)}")
}
arrElems(idx)
}
def selectField_impl(c: whitebox.Context) : c.Tree = {
c.macroApplication.symbol.annotations.filter(
_.tree.tpe <:< c.typeOf[body]
).head.tree.children.tail.head
}
As you can see, the approach is basically to shove the actual array into a static annotation and, when indexing, dispatch to another macro that can figure out the type. I got the idea from reading about vampire methods.
This is the json I'm trying to parse:
[
{"id": 1},
{"id": 2}
]
And this is how I invoke it:
val tpe3 = TypeProvider("arrayofobj.json")
println(tpe3.toArray.mkString(", "))
Reading an array of ints or an object of primitive fields works as expected, but an array of objects causes a stack overflow during compilation:
[error] /home/isti/projects/json-typeprovider/core/src/main/scala/com/example/Hello.scala:7:14: Internal error: unable to find the outer accessor symbol of object Hello
[error] object Hello extends App {
[error] ^
[error] ## Exception when compiling 1 sources to /home/isti/projects/json-typeprovider/core/target/scala-2.12/classes
[error] null
[error] java.lang.String.valueOf(String.java:2994)
[error] scala.collection.mutable.StringBuilder.append(StringBuilder.scala:200)
[error] scala.collection.TraversableOnce.$anonfun$addString$1(TraversableOnce.scala:359)
[error] scala.collection.immutable.List.foreach(List.scala:389)
[error] scala.collection.TraversableOnce.addString(TraversableOnce.scala:357)
[error] scala.collection.TraversableOnce.addString$(TraversableOnce.scala:353)
[error] scala.collection.AbstractTraversable.addString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:323)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:322)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:325)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:325)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] scala.collection.TraversableOnce.mkString(TraversableOnce.scala:327)
[error] scala.collection.TraversableOnce.mkString$(TraversableOnce.scala:327)
[error] scala.collection.AbstractTraversable.mkString(Traversable.scala:104)
[error] xsbt.DelegatingReporter$.makePosition$1(DelegatingReporter.scala:89)
[error] xsbt.DelegatingReporter$.convert(DelegatingReporter.scala:94)
[error] xsbt.DelegatingReporter.info0(DelegatingReporter.scala:125)
[error] xsbt.DelegatingReporter.info0(DelegatingReporter.scala:102)
[error] scala.reflect.internal.Reporter.error(Reporting.scala:84)
[error] scala.reflect.internal.Reporting.globalError(Reporting.scala:69)
[error] scala.reflect.internal.Reporting.globalError$(Reporting.scala:69)
[error] scala.reflect.internal.SymbolTable.globalError(SymbolTable.scala:16)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerSelect(ExplicitOuter.scala:235)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
[error] scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267)
Edit: that's only the top of the stack trace; it continues with many more scala.tools.nsc.transform.ExplicitOuter$OuterPathTransformer.outerPath(ExplicitOuter.scala:267) frames.
I am using Scala & Play 2.5. I am stuck with this error:
Game.scala:99: overloaded method value apply with alternatives:
[error] (block: => play.api.mvc.Result)play.api.mvc.Action[play.api.mvc.AnyContent] <and>
[error] (block: play.api.mvc.Request[play.api.mvc.AnyContent] => play.api.mvc.Result)play.api.mvc.Action[play.api.mvc.AnyContent] <and>
[error] [A](bodyParser: play.api.mvc.BodyParser[A])(block: play.api.mvc.Request[A] => play.api.mvc.Result)play.api.mvc.Action[A]
[error] cannot be applied to (Object)
[error] def start(id: String, apiKey: Option[String]) = Action {
This is the function:
def start(id: String, apiKey: Option[String]) = Action {
apiKey match {
case Some(API_KEY) => {
Server.actor ! Server.Start(id)
Ok("Started")
}
case _ => Future.successful(Unauthorized)
}
}
The problem is that the result of the match expression has been inferred to be Object: one case returns Result and the other returns Future[Result], so the only common supertype is Object, and none of Action.apply's overloads accepts that. To fix it, change case _ => Future.successful(Unauthorized) to case _ => Unauthorized.
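Applied to the function above, the fixed action looks like this (a sketch, keeping API_KEY and Server from the original):
def start(id: String, apiKey: Option[String]) = Action {
  apiKey match {
    case Some(API_KEY) =>
      Server.actor ! Server.Start(id)
      Ok("Started")
    case _ => Unauthorized // Result, matching the other branch
  }
}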