Struggling with Play.current.configuration.getStringList("mongodb.replicaSetSeeds") Option handling - scala

I have a conf/application.conf setting like:
mongodb.replicaSetSeeds = ["bobk-mbp.local:27017","bobk-mbp.local:27018"]
I'm pulling it out in my code like this (the actual extraction is a little different, but this is the gist of it):
val replicaSetSeeds = Play.current.configuration.getStringList("mongodb.replicaSetSeeds")
val listOfString: List[String] = replicaSetSeeds.getOrElse(List("localhost"))
but the compiler hates me
type mismatch; found : Object required: List[String]
The signature of getStringList is
def getStringList(path: String): Option[java.util.List[String]]
How do I handle the None case here, or is my problem that Scala's List[String] is not the same as java.util.List[String]?

Give this a shot:
import collection.JavaConversions._
val optList: Option[List[String]] = Play.current.configuration.getStringList("mongodb.replicaSetSeeds").map(_.toList)
val list = optList.getOrElse(List("localhost"))
There are multiple things going on here. First, you need to import the JavaConversions implicits because what's being returned is an Option[java.util.List[String]] and we want a Scala List instead. By doing the map(_.toList), I'm forcing the implicit conversion to kick in and get me an Option[List[String]], and from there things are pretty straightforward.
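The same thing can be done without the implicit JavaConversions magic by using the explicit JavaConverters API instead; a minimal sketch, assuming a Play version where Play.current and getStringList are still available:
import play.api.Play
import scala.collection.JavaConverters._

val listOfString: List[String] =
  Play.current.configuration
    .getStringList("mongodb.replicaSetSeeds")   // Option[java.util.List[String]]
    .map(_.asScala.toList)                      // explicit Java -> Scala conversion
    .getOrElse(List("localhost"))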

In Play 2.5, you need to use dependency injection; the following works well for me:
1) In your class, inject Configuration:
class Application @Inject()(
  configuration: play.api.Configuration
) ...
2) In your method:
import scala.collection.JavaConversions._
val optList = configuration.getStringList("mongodb.replicaSetSeeds").map(_.toList)
val list = optList.getOrElse(List("localhost"))
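If you are on Play 2.6 or later (an assumption beyond what this answer covers), the typed getOptional API avoids the Java list conversion entirely; a minimal sketch:
import javax.inject.Inject
import play.api.Configuration

class Application @Inject()(configuration: Configuration) {
  // getOptional returns None when the path is missing, so we can default cleanly
  val replicaSetSeeds: Seq[String] =
    configuration
      .getOptional[Seq[String]]("mongodb.replicaSetSeeds")
      .getOrElse(Seq("localhost"))
}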

Related

Is there a simple way to convert Option[Task[T]] to Task[Option[T]]?

While using monix.eval.Task or zio.Task, is there a simple way to convert Option of Task to Task of Option?
If you want a pure ZIO solution, you can use ZIO.foreach with identity:
val fx: Option[UIO[Int]] = Option(Task.effectTotal(42))
val res: UIO[Option[Int]] = ZIO.foreach(fx)(identity)
If you're also using cats, the method you're looking for is called .sequence.
import cats.implicits.toTraverseOps
import zio.interop.catz._
import zio.{Task, UIO}
val fx: Option[UIO[Int]] = Option(Task.effectTotal(42))
val res: UIO[Option[Int]] = fx.sequence
The other way around is not possible purely: one would need to materialize (actually run) the Task in order to lift its result into an Option[T].
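To make that last point concrete, here is a small sketch (assuming ZIO 1.x): the Option-to-effect direction is a pure rearrangement, while the reverse only works by running the effect, e.g. with unsafeRun at the edge of the program:
import zio.{Runtime, Task, UIO, ZIO}

object OptionTaskRoundTrip {
  // Option[UIO[Int]] => UIO[Option[Int]] is pure (traverse/sequence)
  val fx: Option[UIO[Int]] = Option(ZIO.effectTotal(42))
  val res: UIO[Option[Int]] = ZIO.foreach(fx)(identity)

  // Task[Option[Int]] => Option[Task[Int]] requires materializing the result first
  val task: Task[Option[Int]] = Task.succeed(Some(1))
  val back: Option[Task[Int]] =
    Runtime.default.unsafeRun(task).map(Task.succeed(_))
}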

How to infer StructType schema for Spark Scala at run time given a Fully Qualified Name of a case class

For a few days I have been wondering whether it is possible to infer a Spark schema in Scala for a given case class that is unknown at compile time.
The only input is a string containing the FQN of the class (which could be used, for example, to create an instance of the case class at runtime via reflection).
I was wondering if it is possible to do something like:
package com.my.namespace
case class MyCaseClass (name: String, num: Int)
//Somewhere else in codebase
// coming from external configuration file, so unknown at compile time
val fqn = "com.my.namespace.MyCaseClass"
val schema = Encoders.product[getXYZ(fqn)].schema
Of course, any other technique that does not use Encoders is fine (e.g. building the StructType by analysing an instance of the case class; is that even possible?).
What is the best approach?
Is it feasible at all?
You can use the reflective toolbox:
package com.my.namespace

import org.apache.spark.sql.types.StructType
import scala.reflect.runtime
import scala.tools.reflect.ToolBox

case class MyCaseClass(name: String, num: Int)

object Main extends App {
  val fqn = "com.my.namespace.MyCaseClass"
  val runtimeMirror = runtime.currentMirror
  val toolbox = runtimeMirror.mkToolBox()
  val res = toolbox.eval(toolbox.parse(s"""
    import org.apache.spark.sql.Encoders
    Encoders.product[$fqn].schema
  """)).asInstanceOf[StructType]
  println(res) // StructType(StructField(name,StringType,true),StructField(num,IntegerType,false))
}
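For illustration, the reflectively derived schema can then be handed to a DataFrame reader instead of letting Spark infer one; a sketch in which the SparkSession setup and the input path are hypothetical:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("schema-from-fqn")
  .master("local[*]")
  .getOrCreate()

val df = spark.read
  .schema(res)                    // res: StructType produced by the toolbox above
  .json("/path/to/input.json")    // hypothetical input path

df.printSchema()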

Gatling: Dynamically assemble HttpCheck for multiple Css selectors

I am working on a Gatling test framework that can be parameterized through external config objects. One use case I have is that there may be zero or more CSS selector checks that need to be saved to variables. In my config object, I've implemented that as a Map[String,(String, String)], where the key is the variable name, and the value is the 2-part css selector.
I am struggling with how to dynamically assemble the check. Here's what I got so far:
val captureMap: Map[String, (String, String)] = config.capture
httpRequestBuilder.check(
  captureMap.map((mapping) => {
    val varName = mapping._1
    val cssSel = mapping._2
    css(cssSel._1, cssSel._2).saveAs(varName)
  }).toArray: _* // compilation error here
)
The error I'm getting is:
Error:(41, 10) type mismatch;
found : Array[io.gatling.core.check.CheckBuilder[io.gatling.core.check.css.CssCheckType,jodd.lagarto.dom.NodeSelector,String]]
required: Array[_ <: io.gatling.http.check.HttpCheck]
}).toArray: _*
Apparently, I need to turn my CheckBuilder into an HttpCheck, so how do I do that?
Update:
I managed to get it to work by introducing a variable of type HttpCheck and returning it in the next line:
httpRequestBuilder.check(
  captureMap.map((mapping) => {
    val varName = mapping._1
    val cssSel = mapping._2
    val check: HttpCheck = css(cssSel._1, cssSel._2).saveAs(varName)
    check
  }).toArray: _*
)
While this works, it's ugly as hell. Can this be improved?
I had the same issue.
I had the following imports:
import io.gatling.core.Predef._
import io.gatling.http.Predef.http
I changed these imports to:
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.http.request.builder.HttpRequestBuilder.toActionBuilder
which made it work.
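With those imports in place, the intermediate variable from the update above is not strictly needed either: ascribing the element type is enough to trigger the implicit conversion inside the map. A possible cleanup, sketched against the same captureMap and httpRequestBuilder from the question:
import io.gatling.core.Predef._
import io.gatling.http.Predef._
import io.gatling.http.check.HttpCheck

httpRequestBuilder.check(
  captureMap.map { case (varName, (selector, attribute)) =>
    // the type ascription lets the implicit CheckBuilder -> HttpCheck conversion kick in
    css(selector, attribute).saveAs(varName): HttpCheck
  }.toSeq: _*
)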

Getting json parsing error org.json4s.package$MappingException

I am trying to parse JSON and extract elements into a case class. Just curious why the code runs one way and not the other.
This code works
import org.json4s._
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods.parse
object JsonCase {
  def main(args: Array[String]): Unit = {
    implicit val formats = DefaultFormats
    val input = """{"InputDB: "XYZ"}"""
    case class config(stagingDB: String)
    val spec = parse(input).extract[config]
    println(spec.stagingDB)
  }
}
Why does the code below not work?
import org.json4s._
import org.json4s.DefaultFormats
import org.json4s.jackson.JsonMethods.parse
implicit val formats = DefaultFormats
val input = """{"stagingDB": "XYZ"}"""
case class config(stagingDB: String)
val spec = parse(input).extract[config]
println(spec.stagingDB)
I find the opposite to be true. The second block of code works but the first fails because the quoting is wrong in input. There is no closing " for InputDB so it is not valid JSON.
More generally, when comparing two blocks of code you should remove as much of the shared code as possible. So config, input and formats should be outside the object and shared between both examples so that you know you are focussing on the differences in the code not the similarities.
Everything needs to be defined at least at object/class level (it makes no sense otherwise).
Below is, I think, what you want:
object JsonCase {
  // This way it applies to the whole object
  implicit val formats = DefaultFormats

  def main(args: Array[String]): Unit = {
    // do stuff
    ....
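For completeness, a sketch of how the corrected example could look end to end, with valid JSON and the case class promoted to object level (the case class is capitalized here by convention, which is a change from the question's code):
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

object JsonCase {
  // shared formats, visible to everything in the object
  implicit val formats: Formats = DefaultFormats

  // defined at object level so json4s can find its constructor
  case class Config(stagingDB: String)

  def main(args: Array[String]): Unit = {
    val input = """{"stagingDB": "XYZ"}"""   // valid JSON: the key is properly quoted
    val spec = parse(input).extract[Config]
    println(spec.stagingDB)                  // prints XYZ
  }
}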

Getting a Scala Map from a Java Properties

I was trying to pull environment variables into a scala script using java Iterators and / or Enumerations and realised that Dr Frankenstein might claim parentage, so I hacked the following from the ugly tree instead:
import java.util.Map.Entry
import System._

val propSet = getProperties().entrySet().toArray()
val props = (0 until propSet.size).foldLeft(Map[String, String]()) { (m, i) =>
  val e = propSet(i).asInstanceOf[Entry[String, String]]
  m + (e.getKey() -> e.getValue())
}
For example, to print that same environment:
props.keySet.toList.sortWith(_ < _).foreach { k =>
  println(k + (" " * (30 - k.length)) + " = " + props(k))
}
Please, please don't set about polishing this t$#d, just show me the scala gem that I'm convinced exists for this situation (i.e java Properties --> scala.Map), thanks in advance ;#)
Scala 2.10.3:
import java.io.{File, FileInputStream}
import java.util.Properties
import scala.collection.JavaConverters._

// Create a variable to store the properties in
val props = new Properties
// Open a file stream to read the file
val fileStream = new FileInputStream(new File(fileName))
props.load(fileStream)
fileStream.close()
// Print the contents of the properties file as a map
println(props.asScala.toMap)
Scala 2.7:
val props = Map() ++ scala.collection.jcl.Conversions.convertMap(System.getProperties).elements
Though that needs some typecasting. Let me work on it a bit more.
val props = Map() ++ scala.collection.jcl.Conversions.convertMap(System.getProperties).elements.asInstanceOf[Iterator[(String, String)]]
Ok, that was easy. Let me work on 2.8 now...
import scala.collection.JavaConversions.asMap
val props = System.getProperties() : scala.collection.mutable.Map[AnyRef, AnyRef] // or
val props = System.getProperties().asInstanceOf[java.util.Map[String, String]] : scala.collection.mutable.Map[String, String] // way too many repetitions of types
val props = asMap(System.getProperties().asInstanceOf[java.util.Map[String, String]])
The verbosity, of course, can be decreased with a couple of imports. First of all, note that Map will be a mutable map on 2.8. On the bright side, if you convert back the map, you'll get the original object.
Now, I have no clue why Properties implements Map<Object, Object>, given that the javadocs clearly state that key and value are String, but there you go. Having to typecast this makes the implicit option much less attractive. This being the case, the alternative is the most concise of them.
EDIT
Scala 2.8 just acquired an implicit conversion from Properties to mutable.Map[String,String], which makes most of that code moot.
In Scala 2.9.1 this is solved by the implicit conversions inside collection.JavaConversions._. The other answers use deprecated functions. The details are documented here. This is a relevant snippet from that page:
scala> import collection.JavaConversions._
import collection.JavaConversions._
scala> import collection.mutable._
import collection.mutable._
scala> val jul: java.util.List[Int] = ArrayBuffer(1, 2, 3)
jul: java.util.List[Int] = [1, 2, 3]
scala> val buf: Seq[Int] = jul
buf: scala.collection.mutable.Seq[Int] = ArrayBuffer(1, 2, 3)
scala> val m: java.util.Map[String, Int] = HashMap("abc" -> 1, "hello" -> 2)
m: java.util.Map[String,Int] = {hello=2, abc=1}
Getting from a mutable map to an immutable map is a matter of calling toMap on it.
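For example, to end up with an immutable Map[String, String] from the system properties (a sketch, assuming a Scala version where JavaConversions is still available):
import scala.collection.JavaConversions._

// the implicit propertiesAsScalaMap conversion turns Properties into a mutable Map
val mutableProps: scala.collection.mutable.Map[String, String] = System.getProperties

// toMap takes an immutable snapshot of it
val props: Map[String, String] = mutableProps.toMap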
In Scala 2.8.1 you can do it with asScalaMap(m : java.util.Map[A, B]) in a more concise way:
var props = asScalaMap(System.getProperties())
props.keySet.toList.sortWith(_ < _).foreach { k =>
  println(k + (" " * (30 - k.length)) + " = " + props(k))
}
In Scala 2.13.2:
import scala.jdk.javaapi.CollectionConverters._
val props = asScala(System.getProperties)
Looks like in the most recent version of Scala (2.10.2 as of the time of this answer), the preferred way to do this is using the explicit .asScala from scala.collection.JavaConverters:
import scala.collection.JavaConverters._
val props = System.getProperties().asScala
assert(props.isInstanceOf[scala.collection.mutable.Map[_, _]]) // asScala yields a mutable Map view, not an immutable one