How to convert an Iterable to a js.Dynamic - scala.js

I have some optional parameters I want to use to initialise a JQuery widget that are passed as
Iterable[(String, String)]
What is the best way to convert this object to a js.Dynamic?

Probably the following:
val seq = iterable.toSeq
val dict = js.Dictionary(seq: _*)
val dynamic = dict.asInstanceOf[js.Dynamic]
You can of course make it into a single expression:
js.Dictionary(iterable.toSeq: _*).asInstanceOf[js.Dynamic]
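For example, wrapping this in a small helper and passing made-up options to it might look like this (the function name and option keys are hypothetical):
import scala.scalajs.js

def widgetOptions(params: Iterable[(String, String)]): js.Dynamic =
  js.Dictionary(params.toSeq: _*).asInstanceOf[js.Dynamic]

// opts can then be handed to the widget's initializer
val opts = widgetOptions(Seq("width" -> "200", "title" -> "Hello"))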

Related

scala lazy zip of varargs options to option of tuple

I want to combine multiple options into an option of a tuple:
val maybeA: Option[Int] = ...
val maybeB: Option[String] = ...
val combined: Option[(Int,String)] = combine(maybeA, maybeB)
There are many ways to do this; let's go with:
def combine[X, Y](maybeA: Option[X], maybeB: Option[Y]): Option[(X, Y)] = maybeA.zip(maybeB).headOption
And that's great.
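For reference, a quick check of the two-argument version above (values made up):
combine(Some(1), Some("a"))            // Some((1,a))
combine(Some(1), Option.empty[String]) // None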
Now I wonder: can this be done for varargs and a tuple dynamically somehow?
What would the signature look like?
def combine[???](options: Option[?]*): Tuple[?]

Pass case class to Spark UDF

I have a Scala 2.11 function which creates a case class instance from a Map, based on the provided class type.
import scala.reflect.runtime.universe._

def createCaseClass[T: TypeTag, A](someMap: Map[String, A]): T = {
  val rMirror = runtimeMirror(getClass.getClassLoader)
  val myClass = typeOf[T].typeSymbol.asClass
  val cMirror = rMirror.reflectClass(myClass)
  // The primary constructor is the first one
  val ctor = typeOf[T].decl(termNames.CONSTRUCTOR).asTerm.alternatives.head.asMethod
  val argList = ctor.paramLists.flatten.map(param => someMap(param.name.toString))
  cMirror.reflectConstructor(ctor)(argList: _*).asInstanceOf[T]
}
I'm trying to use this as a UDF in the context of a Spark DataFrame. However, I'm not sure of the best way to pass the case class. The approach below doesn't seem to work.
def myUDF[T: TypeTag] = udf { (inMap: Map[String, Long]) =>
  createCaseClass[T](inMap)
}
I'm looking for something like this:
case class MyType(c1: String, c2: Long)
val myUDF = udf{(MyType, inMap) => createCaseClass[MyType](inMap)}
Thoughts and suggestions on how to resolve this are appreciated.
However, I'm not sure what's the best way to pass the case class
It is not possible to use case classes as arguments for user defined functions. SQL StructTypes are mapped to dynamically typed (for lack of a better word) Row objects.
If you want to operate on statically typed objects, please use a statically typed Dataset.
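A minimal sketch of the statically typed alternative, with all names hypothetical: an input case class matching the DataFrame's schema (a single inMap column, as in the question) and an output case class built from it with plain Scala code instead of a UDF.
import org.apache.spark.sql.{Dataset, SparkSession}

case class InRow(inMap: Map[String, Long])
case class MyRecord(c1: Long, c2: Long)

val spark: SparkSession = SparkSession.builder().getOrCreate()
import spark.implicits._

// df is assumed to be the DataFrame from the question
val ds: Dataset[MyRecord] =
  df.as[InRow].map(r => MyRecord(r.inMap("c1"), r.inMap("c2")))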
From trial and error I've learned that whatever data structure is stored in a DataFrame or Dataset is represented using org.apache.spark.sql.types.
You can see this with:
df.schema.toString
Basic types like Int and Double are stored as:
StructField(fieldname,IntegerType,true),StructField(fieldname,DoubleType,true)
Complex types like case classes are transformed into a combination of nested types:
StructType(StructField(..),StructField(..),StructType(..))
Sample code:
case class range(min: Double, max: Double)
org.apache.spark.sql.Encoders.product[range].schema
//Output:
org.apache.spark.sql.types.StructType = StructType(StructField(min,DoubleType,false), StructField(max,DoubleType,false))
The UDF parameter type in this case is Row, or Seq[Row] when you store an array of case classes.
A basic debugging technique is to print the Row's schema as a string:
val myUdf = udf( (r: Row) => r.schema.toString )
then, to see what happens:
df.take(1).foreach(println)
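As an illustration (not part of the original answer), a UDF over the range struct from the sample code could read the fields back out of the Row like this; the DataFrame df and its struct column r are assumptions here:
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.{col, udf}

// The struct column arrives in the UDF as a Row; access its fields by name
val widthUdf = udf { (r: Row) =>
  r.getAs[Double]("max") - r.getAs[Double]("min")
}

val withWidth = df.withColumn("width", widthUdf(col("r")))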

Scala class constructor with variable number of arguments

If we pass in a list to a method that takes a variable number of arguments it works.
val testList = List("a", "b", "c")
def testMethod(str: String*): Seq[String] = str
testMethod(testList) // outputs WrappedArray(List("a", "b", "c"))
But if we pass in a list to a class constructor that takes a variable number of arguments, we get a type error.
val testList = List("a", "b", "c")
class TestClass(str: String*)
val t = new TestClass(testList)
// error: type mismatch
// found: List[String]
// required: [String]
Any idea how we can fix this?
Actually, it's not working correctly in either case (note the unwanted WrappedArray in the first case). To pass a sequence as a variable-argument list, you need to mark it as such. The syntax is the same in both cases. In the first case:
testMethod(testList: _*)
and in the second case:
val t = new TestClass(testList: _*)
You can interpret this notation much like the variable-arguments syntax itself, the only difference being that here the type is not stated explicitly (an underscore is used instead).
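Putting both fixes together in one self-contained snippet (str is made a val here only so it can be inspected afterwards):
val testList = List("a", "b", "c")

def testMethod(str: String*): Seq[String] = str
class TestClass(val str: String*)

testMethod(testList: _*)        // the three elements "a", "b", "c", not a nested List
new TestClass(testList: _*).str // likewise a Seq of the three elements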

Access element of an Array and return a monad?

If I access an index outside the bounds of an Array, I get an ArrayIndexOutOfBoundsException, eg:
val a = new Array[String](3)
a(4)
java.lang.ArrayIndexOutOfBoundsException: 4
Is there a method to return a monad instead (eg: Option)? And why doesn't the default collections apply method for Array support this?
You can use lift:
a.lift(4) // None
a.lift(2) // Some(null)
Array[T] is a PartialFunction[Int, T] and lift creates a Function[Int, Option[T]] from the index to an option of the element type.
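For instance, the other PartialFunction methods are available on the array as well:
val a = new Array[String](3)

a.isDefinedAt(2) // true: index 2 is within bounds
a.isDefinedAt(4) // false: out of bounds, which is why lift(4) returns None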
You could use scala.util.Try:
scala> val a = new Array[String](3)
a: Array[String] = Array(null, null, null)
scala> import scala.util.Try
import scala.util.Try
scala> val fourth = Try(a(3))
fourth: scala.util.Try[String] = Failure(java.lang.ArrayIndexOutOfBoundsException: 3)
scala> val third = Try(a(2))
third: scala.util.Try[String] = Success(null)
Another good idea is not to use Array in the first place, but that's outside the scope of this question.
Where it is relevant to your question, though, is why this behaviour exists: Array is intended to function like, and be compatible with, a Java array. Since this is how Java arrays work, this is how Scala's Array works.

Is there a way to get a Scala HashMap to automatically initialize values?

I thought it could be done as follows
val hash = new HashMap[String, ListBuffer[Int]].withDefaultValue(ListBuffer())
hash("A").append(1)
hash("B").append(2)
println(hash("B").head)
However, the above prints the unintuitive value of 1. I would like
hash("B").append(2)
To do something like the following behind the scenes
if (!hash.contains("B")) hash.put("B", ListBuffer())
Use getOrElseUpdate to provide the default value at the point of access:
scala> import collection.mutable._
import collection.mutable._
scala> def defaultValue = ListBuffer[Int]()
defaultValue: scala.collection.mutable.ListBuffer[Int]
scala> val hash = new HashMap[String, ListBuffer[Int]]
hash: scala.collection.mutable.HashMap[String,scala.collection.mutable.ListBuffer[Int]] = Map()
scala> hash.getOrElseUpdate("A", defaultValue).append(1)
scala> hash.getOrElseUpdate("B", defaultValue).append(2)
scala> println(hash("B").head)
2
withDefaultValue uses exactly the same value each time. In your case, it's the same empty ListBuffer that gets shared by everyone.
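To see the sharing concretely, here is a small demonstration along the lines of the question's code; the behavior is noted in the comments:
import collection.mutable._

val shared = new HashMap[String, ListBuffer[Int]].withDefaultValue(ListBuffer())
shared("A").append(1)
shared("B").append(2)

shared("A")    // ListBuffer(1, 2): both appends hit the single default buffer
shared.isEmpty // true: nothing was ever actually stored in the map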
If you use withDefault instead, you could generate a new ListBuffer every time, but it wouldn't get stored.
So what you'd really like is a method that would know to add the default value. You can create such a method inside a wrapper class and then write an implicit conversion:
class InstantiateDefaults[A, B](h: collection.mutable.Map[A, B]) {
  def retrieve(a: A) = h.getOrElseUpdate(a, h(a))
}
implicit def hash_can_instantiate[A, B](h: collection.mutable.Map[A, B]) =
  new InstantiateDefaults(h)
Now your code works as desired (except for the extra method name, which you could pick to be shorter if you wanted):
val hash = new collection.mutable.HashMap[
  String, collection.mutable.ListBuffer[Int]
].withDefault(_ => collection.mutable.ListBuffer())
scala> hash.retrieve("A").append(1)
scala> hash.retrieve("B").append(2)
scala> hash("B").head
res28: Int = 2
Note that the solution (with the implicit) doesn't need to know the default value itself at all, so you can do this once and then default-with-addition to your heart's content.