How to convert Java LinkedHashMap to Scala LinkedHashMap?

I'm new to Scala. I've been trying to convert a Java LinkedHashMap to an equivalent Scala collection (LinkedHashMap?) in order to preserve the insertion order.
I tried the following things as suggested in other threads, but nothing seems to work:
scalaAsMap() - messes up the order
TreeMap() - sorting on keys, values, etc. is not what I'm looking for
Explicit conversion does not work either:
val f = new java.util.LinkedHashMap[String, java.util.Map[String, String]]
var g: scala.collection.mutable.LinkedHashMap[String, java.util.Map[String, String]] = f

Hmm, how about:
import scala.collection.JavaConverters._
val javaMap = new java.util.LinkedHashMap[String, String]()
val scalaMap = javaMap.asScala
The static type of scalaMap is mutable.Map[String, String], but under the hood it wraps the LinkedHashMap, so iteration preserves the insertion order.
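If an actual scala.collection.mutable.LinkedHashMap is wanted rather than a wrapper, copying the entries preserves the order as well. A minimal sketch (the sample keys and values are made up):
import scala.collection.JavaConverters._
import scala.collection.mutable
val javaMap = new java.util.LinkedHashMap[String, String]()
javaMap.put("first", "1")
javaMap.put("second", "2")
// asScala wraps the Java map without copying; iteration follows insertion order.
val wrapped: mutable.Map[String, String] = javaMap.asScala
// Copying into a Scala LinkedHashMap keeps that order.
val scalaLhm: mutable.LinkedHashMap[String, String] =
  mutable.LinkedHashMap(javaMap.asScala.toSeq: _*)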

Related

Java function with Map.Class parameter in Scala

I'm trying to use a Jackson's ObjectMapper() function: convertValue.
It takes 2 parameters (3 overloads):
(Object, Class)
(Object, TypeReference)
(Object, JavaType)
I have the following code:
val m = new ObjectMapper()
val map: Map[String, Object] = m.convertValue(bean, classOf[Map])
which doesn't work with error Type Mismatch. Expected JavaType actual Class[Map].
I tested with classOf[java.util.Map], Map.getClass, etc. but can't make it work.
How should I send that parameter?
Step 1: look at https://fasterxml.github.io/jackson-databind/javadoc/2.8/com/fasterxml/jackson/databind/JavaType.html. See
Instances can (only) be constructed by com.fasterxml.jackson.databind.type.TypeFactory.
Step 2: look at https://fasterxml.github.io/jackson-databind/javadoc/2.8/com/fasterxml/jackson/databind/type/TypeFactory.html.
Then you can see it can be used as e.g.
m.getTypeFactory.constructMapType(classOf[java.util.Map[_, _]], classOf[YourKey], classOf[YourValue])
You can use the mapper to get the JavaType, for example:
val stringType: JavaType = m.constructType(classOf[String])
You can try the following for your problem:
val m = new ObjectMapper()
val mapType: JavaType = m.constructType(classOf[java.util.Map[_, _]])
val map: java.util.Map[String, Object] = m.convertValue(bean, mapType)
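If a Scala Map is wanted in the end, one option is to build a fully parameterized map type with the TypeFactory and convert the result afterwards. A sketch, not from the original answer; the bean here is just a stand-in for the value being converted in the question:
import scala.collection.JavaConverters._
import com.fasterxml.jackson.databind.{JavaType, ObjectMapper}
val m = new ObjectMapper()
// Stand-in for the bean from the question.
val bean = new java.util.HashMap[String, Object]()
bean.put("name", "Jane")
bean.put("age", Int.box(42))
// Build a parameterized java.util.Map[String, Object] description.
val mapType: JavaType = m.getTypeFactory
  .constructMapType(classOf[java.util.Map[_, _]], classOf[String], classOf[Object])
// convertValue returns a java.util.Map here; turn it into a Scala Map afterwards.
val javaMap: java.util.Map[String, Object] = m.convertValue(bean, mapType)
val scalaMap: Map[String, Object] = javaMap.asScala.toMap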

Evaluate complex type with Scala quasiquotes, unlifting

I need to compile a function and then evaluate it with different parameters of type List[Map[String, AnyRef]].
I have the following code, which does not compile with that type but does compile with a simple type like List[Int].
I found that there are only certain implementations of Liftable in scala.reflect.api.StandardLiftables.StandardLiftableInstances.
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox
val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
val functionWrapper =
  """
    object FunctionWrapper {
      def makeBody(messages: List[Map[String, AnyRef]]) = Map.empty
    }""".stripMargin
val functionSymbol =
  tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
tb.eval(q"$functionSymbol.function($list)")
I get a compilation error for this; how can I make it work?
Error:(22, 38) Can't unquote List[Map[String,AnyRef]], consider using
... or providing an implicit instance of
Liftable[List[Map[String,AnyRef]]]
tb.eval(q"$functionSymbol.function($list)")
^
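As an aside (not part of the original question or answer): the standard Liftable instances do cover concrete types, so the same kind of unquoting compiles when the element type is String rather than AnyRef. A minimal sketch, assuming scala-reflect is on the classpath:
import scala.reflect.runtime.universe._
// StandardLiftables provides Liftable[String], Liftable[List[T]] and
// Liftable[Map[K, V]], so this quasiquote compiles without extra instances.
val concrete: List[Map[String, String]] = List(Map("1" -> "2"))
val tree = q"println($concrete)"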
The problem comes not from the complicated type but from the attempt to use AnyRef. When you unquote some literal, you are asking the infrastructure to build a valid syntax tree that creates an object exactly matching the one you pass. Unfortunately this is obviously not possible for all objects. For example, assume that you've passed a reference to Thread.currentThread() as part of the Map. How could that possibly work? The compiler is simply not able to recreate such a complicated object (not to mention making it the current thread). So you have two obvious alternatives:
Make your argument also a Tree, i.e. something like this:
def testTree() = {
  val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
  val functionWrapper =
    """
      | object FunctionWrapper {
      |
      |   def makeBody(messages: List[Map[String, AnyRef]]) = Map.empty
      |
      | }
    """.stripMargin
  val functionSymbol =
    tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
  //val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
  val list = q"""List(Map("1" -> "2"))"""
  val res = tb.eval(q"$functionSymbol.makeBody($list)")
  println(s"testTree = $res")
}
The obvious drawback of this approach is that you lose compile-time type safety and might need to provide a lot of context for the tree to work.
Another approach is to avoid passing anything containing AnyRef to the compiler infrastructure. That means you create some function-like Wrapper:
package so {
  trait Wrapper {
    def call(args: List[Map[String, AnyRef]]): Map[String, AnyRef]
  }
}
and then make your generated code return a Wrapper instead of directly executing the logic and call the Wrapper from the usual Scala code rather than inside compiled code. Something like this:
def testWrapper() = {
  val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
  val functionWrapper =
    """
      |object FunctionWrapper {
      |  import scala.collection._
      |  import so.Wrapper /* <- here probably different package :) */
      |
      |  def createWrapper(): Wrapper = new Wrapper {
      |    override def call(args: List[Map[String, AnyRef]]): Map[String, AnyRef] = Map.empty
      |  }
      |}
      |""".stripMargin
  val functionSymbol = tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
  val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
  val tree: tb.u.Tree = q"$functionSymbol.createWrapper()"
  val wrapper = tb.eval(tree).asInstanceOf[Wrapper]
  val res = wrapper.call(list)
  println(s"testWrapper = $res")
}
P.S. I'm not sure what you are doing, but beware of performance issues. Scala is a hard language to compile, so it might easily take more time to compile your custom code than to run it. If performance becomes an issue you might need other methods, such as full-blown macro code generation or at least caching of the compiled code.

Scala Spark - converting sign of a Seq[double] in a flatMap

I'm trying to change the sign of a Seq[Double] inside a flatMap, but I am getting a type mismatch error.
import util.Random.nextDouble
var numbers = Seq.fill(1000)(nextDouble)
val nrdd = sc.parallelize(numbers)
val mrdd = nrdd.flatMap(a => (a)* -1.0)
I think you are simply looking for the map method instead of flatMap.
val mrdd = nrdd.map(a => -a)
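For context (an aside, not part of the original answer): flatMap expects each element to produce a collection (TraversableOnce) of results, which is why returning a bare negated Double fails to type-check. A minimal sketch against the same nrdd:
// flatMap needs a collection per element; wrapping each negated value
// in a single-element Seq also type-checks and yields the same values as map.
val frdd = nrdd.flatMap(a => Seq(-a))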

Scala :- Gatling :- Concatenation of two Maps stores last value only and ignores all other values

I have two Maps and I want to concatenate them.
I tried almost all the examples given here Best way to merge two maps and sum the values of same key? but it ignores all values for the key metrics and only stores the last value.
I downloaded scalaz-full_2.9.1-6.0.3.jar and imported it with import scalaz._, but it doesn't work for me.
How can I concatenate these two maps with multiple values for the same keys?
Edit:
Now I tried:
val map = new HashMap[String, Set[String]] with MultiMap[String, String]
map.addBinding("""report_type""" , """performance""")
map.addBinding("""start_date""" ,start_date)
map.addBinding("""end_date""" , end_date)
map.addBinding("metrics" , "plays")
map.addBinding("metrics", "displays")
map.addBinding("metrics" , "video_starts")
map.addBinding("metrics" , "playthrough_25")
map.addBinding("metrics", "playthrough_50")
map.addBinding("metrics", "playthrough_75")
map.addBinding("metrics", "playthrough_100")
val map1 = new HashMap[String, Set[String]] with MultiMap[String, String]
map1.addBinding("""dimensions""" , """asset""")
map1.addBinding("""limit""" , """50""")
And I tried to convert these mutable maps to an immutable type using this link, as:
val asset_query_string = map ++ map1
val asset_query_string_map =(asset_query_string map { x=> (x._1,x._2.toSet) }).toMap[String, Set[String]]
But still I get
i_ui\config\config.scala:51: Cannot prove that (String, scala.collection.immutable.Set[String]) <:< (String, scala.collection.mutable.Set[String]).
11:10:13.080 [ERROR] i.g.a.ZincCompiler$ - val asset_query_string_map =(asset_query_string map { x=> (x._1,x._2.toSet) }).toMap[String, Set[String]]
Your problem is not related to concatenation but to the declaration of the metrics map. It's not possible to have multiple values for a single key in a Map. Perhaps you should look at this collection:
http://www.scala-lang.org/api/2.10.3/index.html#scala.collection.mutable.MultiMap
You can't have duplicate keys in a Map.
For a simple Map it is impossible to have duplicate keys; if you add duplicate keys to the map, it keeps only the last value.
But you can use MultiMap:
import collection.mutable.{ HashMap, MultiMap, Set }
val mm = new HashMap[String, Set[String]] with MultiMap[String, String]
mm.addBinding("metrics","plays")
mm.addBinding("metrics","displays")
mm.addBinding("metrics","players")
println(mm,"multimap")//(Map(metrics -> Set(players, plays, displays)),multimap)
I was able to create two MultiMaps, but then I tried to concatenate them with val final_map = map1 ++ map2 and tried the answer given here Mutable MultiMap to immutable Map.
But my problem was not solved; I got:
config\config.scala:51: Cannot prove that (String, scala.collection.immutable.Set[String]) <:< (String, scala.collection.mutable.Set[String]).
Finally it was solved by:
val final_map = map1 ++ map2
val asset_query_string_map = final_map.map(kv => (kv._1,kv._2.toSet)).toMap
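For reference, if both sides are already plain Map[String, Set[String]] values, the same kind of merge can be done without MultiMap or scalaz. A minimal sketch (the helper name and sample data are illustrative):
// Merge two multi-valued maps, unioning the value sets when a key appears in both.
def mergeMultiValued(
    a: Map[String, Set[String]],
    b: Map[String, Set[String]]): Map[String, Set[String]] =
  (a.keySet ++ b.keySet).map { k =>
    k -> (a.getOrElse(k, Set.empty[String]) ++ b.getOrElse(k, Set.empty[String]))
  }.toMap
// "metrics" keeps both values instead of only the last one.
val merged = mergeMultiValued(
  Map("metrics" -> Set("plays")),
  Map("metrics" -> Set("displays"), "limit" -> Set("50")))
// merged("metrics") == Set("plays", "displays")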

Getting a Scala Map from a Java Properties

I was trying to pull environment variables into a scala script using java Iterators and / or Enumerations and realised that Dr Frankenstein might claim parentage, so I hacked the following from the ugly tree instead:
import java.util.Map.Entry
import System._
val propSet = getProperties().entrySet().toArray()
val props = (0 until propSet.size).foldLeft(Map[String, String]()) { (m, i) =>
  val e = propSet(i).asInstanceOf[Entry[String, String]]
  m + (e.getKey() -> e.getValue())
}
For example to print the said same environment
props.keySet.toList.sortWith(_ < _).foreach { k =>
  println(k + (" " * (30 - k.length)) + " = " + props(k))
}
Please, please don't set about polishing this t$#d, just show me the scala gem that I'm convinced exists for this situation (i.e java Properties --> scala.Map), thanks in advance ;#)
Scala 2.10.3:
import java.io.{File, FileInputStream}
import java.util.Properties
import scala.collection.JavaConverters._
//Create a variable to store the properties in
val props = new Properties
//Open a file stream to read the file
val fileStream = new FileInputStream(new File(fileName))
props.load(fileStream)
fileStream.close()
//Print the contents of the properties file as a map
println(props.asScala.toMap)
Scala 2.7:
val props = Map() ++ scala.collection.jcl.Conversions.convertMap(System.getProperties).elements
Though that needs some typecasting. Let me work on it a bit more.
val props = Map() ++ scala.collection.jcl.Conversions.convertMap(System.getProperties).elements.asInstanceOf[Iterator[(String, String)]]
Ok, that was easy. Let me work on 2.8 now...
import scala.collection.JavaConversions.asMap
val props = System.getProperties() : scala.collection.mutable.Map[AnyRef, AnyRef] // or
val props = System.getProperties().asInstanceOf[java.util.Map[String, String]] : scala.collection.mutable.Map[String, String] // way too many repetitions of types
val props = asMap(System.getProperties().asInstanceOf[java.util.Map[String, String]])
The verbosity, of course, can be decreased with a couple of imports. First of all, note that Map will be a mutable map on 2.8. On the bright side, if you convert back the map, you'll get the original object.
Now, I have no clue why Properties implements Map<Object, Object>, given that the javadocs clearly state that key and value are String, but there you go. Having to typecast this makes the implicit option much less attractive. This being the case, the alternative is the most concise of them.
EDIT
Scala 2.8 just acquired an implicit conversion from Properties to mutable.Map[String,String], which makes most of that code moot.
In Scala 2.9.1 this is solved by the implicit conversions inside collection.JavaConversions._. The other answers use deprecated functions. The details are documented here. This is a relevant snippet from that page:
scala> import collection.JavaConversions._
import collection.JavaConversions._
scala> import collection.mutable._
import collection.mutable._
scala> val jul: java.util.List[Int] = ArrayBuffer(1, 2, 3)
jul: java.util.List[Int] = [1, 2, 3]
scala> val buf: Seq[Int] = jul
buf: scala.collection.mutable.Seq[Int] = ArrayBuffer(1, 2, 3)
scala> val m: java.util.Map[String, Int] = HashMap("abc" -> 1, "hello" -> 2)
m: java.util.Map[String,Int] = {hello=2, abc=1}
Getting from a mutable map to an immutable map is a matter of calling toMap on it.
In Scala 2.8.1 you can do it with asScalaMap(m : java.util.Map[A, B]) in a more concise way:
var props = asScalaMap(System.getProperties())
props.keySet.toList.sortWith(_ < _).foreach { k =>
  println(k + (" " * (30 - k.length)) + " = " + props(k))
}
In Scala 2.13.2:
import scala.jdk.javaapi.CollectionConverters._
val props = asScala(System.getProperties)
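Equivalently on 2.13, the extension-method style converters work too; a short sketch:
import scala.jdk.CollectionConverters._
// Properties has a dedicated converter yielding a mutable.Map[String, String].
val props: scala.collection.mutable.Map[String, String] = System.getProperties.asScala
val immutableProps: Map[String, String] = props.toMap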
Looks like in the most recent version of Scala (2.10.2 as of the time of this answer), the preferred way to do this is using the explicit .asScala from scala.collection.JavaConverters:
import scala.collection.JavaConverters._
val props = System.getProperties().asScala
assert(props.isInstanceOf[scala.collection.Map[String, String]]) // asScala yields a mutable.Map, not an immutable one