"Unchecked type pattern" warning in Scala? - scala

Suppose I have a map m: Map[Any, Int]. Now I would like to take only entries (String, Int) from m and create a new map m1: Map[String, Int] with those entries.
I am trying to do the following:
val m1: Map[String, Int] = m collect {case e:(String, Int) => e}
It seems to work, but I get a warning: non-variable type argument String in type pattern (String, Int) is unchecked since it is eliminated by erasure.
How can I get rid of the warning?

You probably want:
val m1: Map[String, Int] = m collect {case (k:String, v:Int) => k->v}

(Just for reference. What you want is virtualeyes’s answer.)
val m1: Map[String, Int] = m flatMap { e =>
  e._1 match {
    case e1: String => Some(e1 -> e._2)
    case _ => None
  }
}

Careful testing will show that your solution actually matches everything in the map, not just entries of type (String,Int). The warning from the compiler is telling you that the types of your match will be thrown away at runtime so your code is actually doing something like this:
val m1: Map[String, Int] = m collect {case e:Tuple2[Any,Any] => e.asInstanceOf[Tuple2[String,Int]]}
And the asInstanceOf call will not blow up, since it only casts to a Tuple2 and the (String, Int) bit gets lost again because of erasure. You'll get a nasty failure when you try to iterate over the result, though ...
Try this in the REPL
val m:Map[String,Int] = Map("2" -> 3, 3 -> 4, "6" -> 10) collect {case e:(String, Int) => e}
// Oops, thought we would get '2'
m size
// Nothing wrong with this code except m contains some
// hidden nasties which causes us to blow up
for ((s:String, i:Int) <- m) yield s.length + i
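For contrast, a quick sketch (mine, not from the original answers) of virtualeyes's warning-free version applied to the same mixed-key map; the Int-keyed entry is dropped up front instead of lurking as a hidden cast failure:
val fixed: Map[String, Int] =
  Map[Any, Int]("2" -> 3, 3 -> 4, "6" -> 10) collect { case (k: String, v) => k -> v }
fixed.size                               // 2, as expected this time
for ((s, i) <- fixed) yield s.length + i // iterates fine, no hidden nasties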

Related

Get tuple as arguments from zipWithIndex in Scala

I can pass and use values from the tuple like this in a for comprehension:
for ((v, i) <- in.zipWithIndex) {
  println(s"$i is $v")
}
But in foreach it seems it can only be used like this:
in.zipWithIndex.foreach {
  case (v, i) => println(s"$i is $v")
}
How can I define a function like
val f: (Int,Int) => Unit = (v,i) => {println(s"$i is $v")}
and then pass it to .foreach()? AND (it's important just for me) without using a pattern-matching case.
P.S. .tupled works only for methods (def), not for a function defined as a val.
The argument to your function needs to be a Tuple.
scala> val l = List(1,2,3).zipWithIndex
l: List[(Int, Int)] = List((1,0), (2,1), (3,2))
scala> val f = (t: (Int, Int)) => println(s"${t._1} ${t._2}")
f: ((Int, Int)) => Unit = <function1>
scala> l.foreach(f)
1 0
2 1
3 2
Personally I hate the ._1, ._n syntax and prefer to pattern match on tuples.
BTW your for comprehension example is also using pattern matching...
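One more note on the P.S. (a side observation, not from the original answers): FunctionN values in the Scala 2 standard library expose a tupled method, so a val-defined function can also be adapted without ._1/._2 or a case pattern:
val f: (Int, Int) => Unit = (v, i) => println(s"$i is $v")
List(1, 2, 3).zipWithIndex.foreach(f.tupled)  // prints "0 is 1", "1 is 2", "2 is 3"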

Get FunctionX arguments and output types

I am struggling to obtain the types of the arguments of a defined function in Scala, for example Function1[T1, T2].
Since the JVM erases generic type information at runtime (compiler warning: is unchecked since it is eliminated by erasure), I would like to find a way to match a function together with its argument types.
The goal is to be able to have the same functionality as:
val fnInput = {x: Map[String, Double] => x}
fnInput match {
  case f: Function1[Map[String, Double], Map[String, Double]] => ???
  case f: Function1[T1, T2] => ???
  case f: Function2[T1, T2, T3] => ???
}
But actually checking the argument types.
Updated: so far my solution uses the following tools:
import scala.reflect.runtime.universe._
def getType[T: TypeTag](obj: T) = typeOf[T]
val t = getType({x: Map[String, Any] => x})
// check first argument
typeOf[Map[String, Int]] <:< t.typeArgs(0)
// check return of Function1
typeOf[Map[String, Int]] <:< t.typeArgs(1)
// t.typeArgs.length will return the number of arguments +1
Do you believe that this is a good approach?
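To make the update concrete, here is a rough self-contained sketch of that TypeTag approach wrapped into a reusable check (the helper name isMapToMapFn is mine; assumes Scala 2.11+, where Type#typeArgs is available):
import scala.reflect.runtime.universe._

def getType[T: TypeTag](obj: T): Type = typeOf[T]

// true when the captured type looks like Function1[Map[String, Double], Map[String, Double]]
def isMapToMapFn(t: Type): Boolean =
  t.typeArgs.length == 2 &&
    typeOf[Map[String, Double]] <:< t.typeArgs(0) &&
    t.typeArgs(1) <:< typeOf[Map[String, Double]]

val fnInput = { x: Map[String, Double] => x }
isMapToMapFn(getType(fnInput))          // true
isMapToMapFn(getType({ x: Int => x }))  // false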

How does flatMap work on Maps in Scala? map can be used like mapValues on Maps, but how does the flatMap function work on Map objects?

I am not able to understand how the flatMap function behaves on Map objects.
You use flatMap if you want to flatten the result of your map function.
Keep in mind:
flatMap(something)
is identical to
map(something).flatten
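A quick REPL sketch of that equivalence on a Map (Scala 2.x collections assumed):
val m = Map("a" -> List(1, 2), "b" -> List(3))
m.flatMap(_._2)      // an Iterable[Int] containing 1, 2, 3
m.map(_._2).flatten  // the same elements, produced in two explicit steps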
I think it is a good question, because a Map cannot be flattened the way other collections can. First of all, we should look at the signature of this method:
def flatMap[B](f: (A) ⇒ GenTraversableOnce[B]): Map[B]
So the documentation says that it should return a Map, but that is not the whole truth: it can return any GenTraversableOnce, and more besides. We can see this in the provided examples:
def getWords(lines: Seq[String]): Seq[String] = lines flatMap (line => line split "\\W+")
// lettersOf will return a Seq[Char] of likely repeated letters, instead of a Set
def lettersOf(words: Seq[String]) = words flatMap (word => word.toSet)
// lettersOf will return a Set[Char], not a Seq
def lettersOf(words: Seq[String]) = words.toSet flatMap (word => word.toSeq)
// xs will be an Iterable[Int]
val xs = Map("a" -> List(11,111), "b" -> List(22,222)).flatMap(_._2)
// ys will be a Map[Int, Int]
val ys = Map("a" -> List(1 -> 11,1 -> 111), "b" -> List(2 -> 22,2 -> 222)).flatMap(_._2)
So let look at the full signature:
def flatMap[B, That](f: ((K, V)) ⇒ GenTraversableOnce[B])(implicit bf: CanBuildFrom[Map[K, V], B, That]): That
Now we see that it returns That, something the implicit CanBuildFrom can provide for us.
You can find many explanations of how CanBuildFrom works.
But the main idea is: you pass a function from your key -> value pair to some GenTraversableOnce (it can be a Map, a Seq or even an Option), and the results get mapped and flattened. You can also provide your own CanBuildFrom.
If a value or a key in the Map holds a list, then it can be flatMapped.
Example:
val a = Map(1->List(1,2),2->List(2,3))
a.map(_._2) gives List(List(1,2),List(2,3))
You can flatten this using flatMap: a.flatMap(_._2) or a.map(_._2).flatten gives List(1,2,2,3)
src: http://www.scala-lang.org/old/node/12158.html
Not sure about any other way of using flatMap on a Map though.
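Tying this back to the very first question above, a small sketch of my own (not from the thread): returning an Option from the function passed to flatMap lets you filter and rebuild a Map in one pass, without any unchecked casts:
val mixed: Map[Any, Int] = Map("a" -> 1, 2 -> 2, "b" -> 3)
val onlyStringKeys: Map[String, Int] = mixed.flatMap {
  case (k: String, v) => Some(k -> v)
  case _              => None
}
// onlyStringKeys == Map("a" -> 1, "b" -> 3)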

Inferring generic type parameters in Scala

Hi, I've been trying to unify a collection of nested maps.
So I want to implement a method with signature:
def unifyMaps(seq: Seq[Map[String, Map[WordType, Int]]]): Map[String, Map[WordType, Int]]
(WordType is a Java enum.) The first approach was to do the map merging manually.
def unifyMapsManually(seq: IndexedSeq[Map[String, Map[WordType, Int]]]): Map[String, Map[WordType, Int]] = {
  seq reduce { (acc, newMap) =>
    acc ++ newMap.map { case (k, v) =>
      val nestedMap = acc.getOrElse(k, Map.empty)
      k -> (nestedMap ++ v.map { case (k2, v2) => k2 -> (nestedMap.getOrElse(k2, 0) + v2) })
    }
  }
}
It works, but what I'm doing here is recursively applying the exact same pattern, so I thought I'd make a recursive-generic version.
Second approach:
def unifyTwoMapsRecursively(m1: Map[String, Map[WordType, Int]], m2: Map[String, Map[WordType, Int]]): Map[String, Map[WordType, Int]] = {
  def unifyTwoMaps[K, V](nestedMapOps: (V, (V, V) => V))(m1: Map[K, V], m2: Map[K, V]): Map[K, V] = {
    nestedMapOps match {
      case (zero, add) =>
        m1 ++ m2.map { case (k, v) => k -> add(m1.getOrElse(k, zero), v) }
    }
  }
  val intOps = (0, (a: Int, b: Int) => a + b)
  val mapOps = (Map.empty[WordType, Int], unifyTwoMaps(intOps) _)
  unifyTwoMaps(mapOps)(m1, m2)
}
But it fails with:
Error:(90, 18) type mismatch;
found : (scala.collection.immutable.Map[pjn.wierzba.DictionaryCLP.WordType,Int], (Map[Nothing,Int], Map[Nothing,Int]) => Map[Nothing,Int])
required: (scala.collection.immutable.Map[_ <: pjn.wierzba.DictionaryCLP.WordType, Int], (scala.collection.immutable.Map[_ <: pjn.wierzba.DictionaryCLP.WordType, Int], scala.collection.immutable.Map[_ <: pjn.wierzba.DictionaryCLP.WordType, Int]) => scala.collection.immutable.Map[_ <: pjn.wierzba.DictionaryCLP.WordType, Int])
unifyTwoMaps(mapOps)(m1, m2)
^
So OK, I have no idea where the upper bound on the map key comes from, but the curried function is clearly not inferred correctly. I had a similar error with intOps, so I tried to provide exact types:
val mapOps = (Map.empty[WordType, Int], unifyTwoMaps(intOps)(_: Map[String, Map[WordType, Int]], _: Map[String, Map[WordType, Int]]))
But this time it fails with:
Error:(89, 67) type mismatch;
found : Map[String,Map[pjn.wierzba.DictionaryCLP.WordType,Int]]
required: Map[?,Int]
val mapOps = (Map.empty[WordType, Int], unifyTwoMaps(intOps)(_: Map[String, Map[WordType, Int]], _: Map[String, Map[WordType, Int]]))
^
And this time I have absolutely no idea what to try next to get it working.
EDIT: I've found a solution to my problem, but I'm still wondering why I get a type mismatch error in this code snippet:
val mapOps = (Map.empty[WordType, Int], unifyTwoMaps(intOps) _)
According to this answer, Scala type inference works per parameter list, and that is exactly what I've been relying on here for currying purposes. My unifyTwoMaps function takes two parameter lists and I'm trying to infer just the second one.
Solution to generic-recursive solution
OK, so after spending the morning on it I've finally understood that I had been providing the wrong exact types.
val mapOps = (Map.empty[WordType, Int], unifyTwoMaps(intOps)(_: Map[String, Map[WordType, Int]], _: Map[String, Map[WordType, Int]]))
Should've been
val mapOps = (Map.empty[WordType, Int], unifyTwoMaps(intOps)(_: Map[WordType, Int], _: Map[WordType, Int]))
Because I needed to pass the type of the map's V, i.e. Map[WordType, Int], and not the type of the whole outer map. And now it works!
Solution to the underlying problem of nested map merging
Well, abstracting over the nested map's zero and add should ring a bell: I've been reinventing Monoid. So I thought I'd try the Scalaz |+| Semigroup operator solution from this answer.
import scalaz.Scalaz._
def unifyMapsWithScalaz(seq: Seq[Map[String, Map[WordType, Int]]]): Map[String, Map[WordType, Int]] = {
  seq reduce (_ |+| _)
}
And it works!
What's interesting is that I had already seen that post before trying my own solution, but I wasn't sure it would work for a nested data structure, especially with my map's keys being a Java enum. I thought I'd have to provide some custom implementation extending the Semigroup typeclass.
But as it turned out during my reinventing-the-wheel implementation, the enum is only needed as a type parameter and map key, and it works pretty well. Well done, Scalaz!
Well, that would've made a good blog post, actually...
EDIT: but I still don't understand why I had this type inference problem in the first place, I've updated the question.
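As a closing illustration of that lingering inference question (my own sketch, assuming Scala 2.x behaviour and mirroring the Map[Nothing, Int] in the error above): eta-expanding with _ after supplying only the first argument list leaves the compiler nothing to pin the key type down with, so it falls back to Nothing, whereas ascribing the placeholder's type in the second list fixes it.
def pick[K, V](zero: V)(m: Map[K, V]): V = zero

val bad  = pick(0) _                     // inferred as Map[Nothing, Int] => Int
val good = pick(0)(_: Map[String, Int])  // Map[String, Int] => Int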

Higher order operations with flattened tuples in scala

I've recently come across a problem. I'm trying to flatten "tail-nested" tuples in a compiler-friendly way, and I've come up with the code below:
implicit def FS[T](x: T): List[T] = List(x)
implicit def flatten[T,V](x: (T,V))(implicit ft: T=>List[T], fv: V=>List[T]) =
ft(x._1) ++ fv(x._2)
The above code works well for flattening the tuples I am calling "tail-nested", like the ones below.
flatten((1,2)) -> List(1,2)
flatten((1,(2,3))) -> List(1,2,3)
flatten((1,(2,(3,4)))) -> List(1,2,3,4)
However, I seek to make my solution more robust. Consider a case where I have a list of these higher-kinded "tail-nested" tuples.
val l = List( (1,2), (1,(2,3)), (1,(2,(3,4))) )
The inferred type signature of this would be List[(Int, Any)] and this poses a problem for an operation such as map, which would fail with:
error: No implicit view available from Any => List[Int]
This error makes sense to me because of the nature of my recursive implicit chain in the flatten function. However, I was wondering: is there any way I can make my method of flattening the tuples more robust so that higher order functions such as map mesh well with it?
EDIT:
As Bask.ws pointed out, the Product trait offers potential for a nice solution. The below code illustrates this:
def flatten(p: Product): List[_] = p.productIterator.toList.flatMap { x =>
  x match {
    case pr: Product => flatten(pr)
    case _ => List(x)
  }
}
The result type of this new flatten call is always List[Any]. My problem would be solved if there was a way to have the compiler tighten this bound a bit. In parallel to my original question, does anyone know if it is possible to accomplish this?
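For concreteness, a quick usage sketch of that Product-based version (my own REPL-style example); the loose bound is visible in both results:
flatten((1, (2, (3, 4))))               // List(1, 2, 3, 4), but statically just a list of Any
List((1, 2), (1, (2, 3))).map(flatten)  // List(List(1, 2), List(1, 2, 3)), with no Int anywhere in the static type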
UPD Compile-time fail solution added
I have one solution that may suit you. The types of your first 3 examples are resolved at compile time: Int, Tuple2[Int, Int], Tuple2[Int, Tuple2[Int, Int]]. For your example with the list you have a heterogeneous list with the actual type List[(Int, Any)], so you have to resolve the second type at runtime, or perhaps it could be done with a macro. So you may want to actually write implicit def flatten[T](x: (T,Any)), as your error advises you.
Here is a quick solution. It gives a couple of warnings, but it works nicely:
implicit def FS[T](x: T): List[T] = List(x)

implicit def FP[T](x: Product): List[T] = {
  val res = (0 until x.productArity).map(i => x.productElement(i) match {
    case p: Product => FP[T](p)
    case e: T => FS(e)
    case _ => sys.error("incorrect element")
  })
  res.toList.flatten
}

implicit def flatten[T](x: (T, Any))(implicit ft: T => List[T], fp: Product => List[T]) =
  ft(x._1) ++ (x._2 match {
    case p: Product => fp(p)
    case t: T => ft(t)
  })
val l = List( (1,2), (1,(2,3)), (1,(2,(3,4))) )
scala> l.map(_.flatten)
res0: List[List[Int]] = List(List(1, 2), List(1, 2, 3), List(1, 2, 3, 4))
UPD
I have researched the problem a little bit more, and I have found a simple solution for building a homogeneous list, one that can fail at compile time. It is fully typed, without Any or match, and it looks like the compiler now correctly resolves the nested implicits:
case class InfiniteTuple[T](head: T, tail: Option[InfiniteTuple[T]] = None) {
  def flatten: List[T] = head +: tail.map(_.flatten).getOrElse(Nil)
}

implicit def toInfiniteTuple[T](x: T): InfiniteTuple[T] = InfiniteTuple(x)

implicit def toInfiniteTuple2[T, V](x: (T, V))(implicit ft: V => InfiniteTuple[T]): InfiniteTuple[T] =
  InfiniteTuple(x._1, Some(ft(x._2)))
def l: List[InfiniteTuple[Int]] = List( (1,2), (1,(2,3)), (1,(2,(3,4)))) //OK
def c: List[InfiniteTuple[Int]] = List( (1,2), (1,(2,3)), (1,(2,(3,"44"))))
//Compile-time error
//<console>:11: error: No implicit view available from (Int, (Int, java.lang.String)) => InfiniteTuple[Int]
Then you can implement any flatten you want, for example the one above:
scala> l.map(_.flatten)
res0: List[List[Int]] = List(List(1, 2), List(1, 2, 3), List(1, 2, 3, 4))