Parse a JSON object where keys start with a number using Scala

I want to parse the following JSON object using Scala:
val result = """{"24h_volume_usd": "9097260000.0"}"""
normally I use:
import net.liftweb.json._
case class VolumeUSDClass(24h_volume_usd:String) //<- problem 24h_volume_usd does not work
val element = parse(result)
element.extract[VolumeUSDClass]
The problem is that I cannot define a case class with an argument that starts with a number. What is the best way to circumvent this?

You can simply enclose the name of the variable in backticks:
import net.liftweb.json._
implicit val formats = DefaultFormats
case class VolumeUSDClass(`24h_volume_usd`: String)
val result = """{"24h_volume_usd": "9097260000.0"}"""
val element = parse(result)
val vusdcl = element.extract[VolumeUSDClass]
println(vusdcl)
Recall that almost everything can be transformed into a valid Scala identifier if you enclose it in backticks. Even strange stuff like
val `]strange...O_o...stuff[` = 42
println(`]strange...O_o...stuff[`)
works.
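Note that the backticks are also needed at every use site of such a field, e.g. when reading it back from the extracted case class:
println(vusdcl.`24h_volume_usd`) // accessing the field requires backticks too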
The example is tested with "net.liftweb" %% "lift-json" % "3.2.0" and Scala 2.11.

Related

How to pass an array to a slick SQL plain query?
I tried as follows but it fails:
// "com.typesafe.slick" %% "slick" % "3.3.2", // latest version
val ids = Array(1, 2, 3)
db.run(sql"""select name from person where id in ($ids)""".as[String])
Error: could not find implicit value for parameter e: slick.jdbc.SetParameter[Array[Int]]
However this ticket seems to say that it should work:
https://github.com/tminglei/slick-pg/issues/131
Note: I am not interested in the following approach:
db.run(sql"""select name from person where id in #${ids.mkString("(", ",", ")")}""".as[String])
The issue you linked points to a commit which adds this:
def mkArraySetParameter[T: ClassTag](/* ... */): SetParameter[Seq[T]]
def mkArrayOptionSetParameter[T: ClassTag](/* ... */): SetParameter[Option[Seq[T]]]
Note that they are not implicit.
You'll need to do something like
implicit val setIntArray: SetParameter[Array[Int]] = mkArraySetParameter[Int](...)
and make sure that is in scope when you try to construct your sql"..." string.
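If you'd rather not depend on slick-pg's helpers at all, you can hand-roll the same thing on top of plain JDBC. A minimal sketch, assuming PostgreSQL ("int4" is the Postgres name for the integer base type; createArrayOf is standard JDBC). Note that a bound array is used with = ANY(...), not with in (...):
import slick.jdbc.SetParameter

// Bind an Array[Int] as a real SQL array (assumes a PostgreSQL connection)
implicit val setIntArray: SetParameter[Array[Int]] =
  SetParameter[Array[Int]] { (values, pp) =>
    val sqlArray = pp.ps.getConnection.createArrayOf("int4", values.map(Int.box(_): AnyRef))
    pp.setObject(sqlArray, java.sql.Types.ARRAY)
  }

val ids = Array(1, 2, 3)
db.run(sql"""select name from person where id = ANY($ids)""".as[String])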
I met the same problem and searched for a solution.
I resolved it with an implicit val like this:
implicit val strListParameter: slick.jdbc.SetParameter[List[String]] =
  slick.jdbc.SetParameter[List[String]] { (param, pointedParameters) =>
    pointedParameters.setString(f"{${param.mkString(", ")}}")
  }
Put it into your slick-pg profile and import it along with the other vals where needed.
Or, more strictly, like this:
implicit val strListParameter: slick.jdbc.SetParameter[List[String]] =
  slick.jdbc.SetParameter[List[String]] { (param, pointedParameters) =>
    pointedParameters.setObject(param.toArray, java.sql.Types.ARRAY)
  }
implicit val strSeqParameter: slick.jdbc.SetParameter[Seq[String]] =
  slick.jdbc.SetParameter[Seq[String]] { (param, pointedParameters) =>
    pointedParameters.setObject(param.toArray, java.sql.Types.ARRAY)
  }
and use the vals like this:
val entries: Seq[String]
val query =
  sql"""select ... from xxx
        where entry = ANY($entries)
        order by ...
     """.as[(Column, Types, In, Here)]

How to stream Anorm large query results to client in chunked response with Play 2.5

I have a pretty large result set (60k+ records) that I am pulling from a database and parsing with Anorm (though I can use Play's default data access module that returns a ResultSet if needed). I need to transform and stream these results directly to the client (without holding them in a big list in memory), where they will then be downloaded directly to a file on the client's machine.
I have been referring to what is demonstrated in the Chunked Responses section of the Play 2.5.x ScalaStream documentation. I am having trouble implementing the "getDataStream" portion of what it shows there.
I've also been referencing what is demoed in the Streaming Results and Iteratee sections of the Play 2.5.x ScalaAnorm documentation. I have tried piping the results as an enumerator, like what is returned here:
val resultsEnumerator = Iteratees.from(SQL"SELECT * FROM Test", SqlParser.str("colName"))
into
val dataContent = Source.fromPublisher(Streams.enumeratorToPublisher(resultsEnumerator))
Ok.chunked(dataContent).withHeaders(("Content-Type", "application/x-download"), ("Content-Disposition", "attachment; filename=myDataFile.csv"))
But the resulting file/content is empty.
And I cannot find any sample code or references on how to convert a function in the data service that returns something like this:
@annotation.tailrec
def go(c: Option[Cursor], l: List[String]): List[String] = c match {
  case Some(cursor) =>
    if (l.size == 10000000) l // custom limit, partial processing
    else go(cursor.next, l :+ cursor.row[String]("VBU_NUM"))
  case _ => l
}
val sqlString = s"select colName FROM ${tableName} WHERE ${whereClauseStr}"
val results: Either[List[Throwable], List[String]] =
  SQL(sqlString).withResult(go(_, List.empty[String]))
results
into something I can pass to Ok.chunked().
So basically my question is, how should I feed each record fetch from the database into a stream that I can do a transformation on and send to the client as a chunked response that can be downloaded to a file?
I would prefer not to use Slick for this. But I can go with a solution that does not use Anorm and instead uses the Play dbApi objects that return the raw java.sql.ResultSet, and work with that.
After referencing the Anorm Akka Support documentation and much trial and error, I was able to achieve my desired solution. I had to add these dependencies
"com.typesafe.play" % "anorm_2.11" % "2.5.2",
"com.typesafe.play" % "anorm-akka_2.11" % "2.5.2",
"com.typesafe.akka" %% "akka-stream" % "2.4.4"
to my build.sbt file for Play 2.5,
and I implemented something like this:
// ...play imports
import scala.util.{Failure, Success}
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import anorm.SqlParser._
import anorm._
...
private implicit val akkaActorSystem = ActorSystem("MyAkkaActorSystem")
private implicit val materializer = ActorMaterializer()

def streamedAnormResultResponse() = Action {
  implicit val connection = db.getConnection()

  val parser: RowParser[...] = ...
  val sqlQuery: SqlQuery = SQL("SELECT * FROM table")

  val source: Source[Map[String, Any], _] =
    AkkaStream.source(sqlQuery, parser, ColumnAliaser.empty).alsoTo(Sink.onComplete {
      case Success(_) =>
        connection.close()
      case Failure(e) =>
        println("Info from the exception: " + e.getMessage)
        connection.close()
    })

  Ok.chunked(source)
}
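If, as in the question, each record should become a line in a downloadable CSV file, you can transform the source before chunking it. A small sketch on top of the source value above (quoting/escaping and a header row are left out):
// Each parsed row is a Map of column name -> value; join the values per line
val csvSource = source.map(row => row.values.mkString(",") + "\n")
Ok.chunked(csvSource).withHeaders(
  "Content-Type" -> "text/csv",
  "Content-Disposition" -> "attachment; filename=myDataFile.csv"
)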

Converting Java to Scala durations

Is there an elegant way to convert java.time.Duration to scala.concurrent.duration.FiniteDuration?
I am trying to do the following simple use of Config in Scala:
val d = ConfigFactory.load().getDuration("application.someTimeout")
However I don't see any simple way to use the result in Scala.
I certainly hope the good people of Typesafe didn't expect me to do this:
FiniteDuration(d.getNano, TimeUnit.NANOSECONDS)
Edit: Note the line has a bug, which proves the point. See the selected answer below.
I don't know whether an explicit conversion is the only way, but if you want to do it right:
FiniteDuration(d.toNanos, TimeUnit.NANOSECONDS)
toNanos will return the total duration, while getNano will only return the nanoseconds component, which is not what you want.
E.g.
import java.time.Duration
import java.time.temporal.ChronoUnit
Duration.of(1, ChronoUnit.HOURS).getNano // 0
Duration.of(1, ChronoUnit.HOURS).toNanos // 3600000000000L
That being said, you can also roll your own implicit conversion
implicit def asFiniteDuration(d: java.time.Duration): FiniteDuration =
  scala.concurrent.duration.Duration.fromNanos(d.toNanos)
and when you have it in scope:
val d: FiniteDuration = ConfigFactory.load().getDuration("application.someTimeout")
Starting with Scala 2.13, there is a dedicated DurationConverters for converting java's Duration to scala's FiniteDuration (and vice versa):
import scala.jdk.DurationConverters._
// val javaDuration = java.time.Duration.ofNanos(123456)
javaDuration.toScala
// scala.concurrent.duration.FiniteDuration = 123456 nanoseconds
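The same import also covers the opposite direction (the "vice versa" above):
// val scalaDuration = scala.concurrent.duration.Duration.fromNanos(123456)
scalaDuration.toJava
// java.time.Duration = PT0.000123456S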
I don't know any better way, but you can make it a bit shorter:
Duration.fromNanos(d.toNanos)
and also wrap it into an implicit conversion yourself
implicit def toFiniteDuration(d: java.time.Duration): FiniteDuration =
  Duration.fromNanos(d.toNanos)
(changed d.toNano to d.toNanos)
There is a function for this in scala-java8-compat.
In build.sbt:
libraryDependencies += "org.scala-lang.modules" %% "scala-java8-compat" % "0.9.0"
import scala.compat.java8.DurationConverters._
val javaDuration: java.time.Duration = ???
val scalaDuration: FiniteDuration = javaDuration.toScala

Scala function does not return a value

I think I understand the rules of implicit returns, but I can't figure out why splithead is not being set. This code is run via
val m = new TaxiModel(sc, file)
and then I expect
m.splithead
to give me an array of strings. Note that head is an array of strings.
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

class TaxiModel(sc: SparkContext, dat: String) {
  val rawData = sc.textFile(dat)
  val head = rawData.take(10)
  val splithead = head.slice(1, 11).foreach(splitData)

  def splitData(dat: String): Array[String] = {
    val splits = dat.split("\",\"")
    val split0 = splits(0).substring(1, splits(0).length)
    val split8 = splits(8).substring(0, splits(8).length - 1)
    Array(split0).union(splits.slice(1, 8)).union(Array(split8))
  }
}
foreach just evaluates the expression for its side effects and does not collect any results while iterating; it returns Unit, which is why splithead never holds your data. You probably need map or flatMap (see the docs here):
head.slice(1,11).map(splitData) // gives you Array[Array[String]]
head.slice(1,11).flatMap(splitData) // gives you Array[String]
Consider also a for comprehension (which desugars in this case into map),
for (s <- head.slice(1,11)) yield splitData(s)
Note also that Scala strings are equipped with ordered collection methods, thus
splits(0).substring(1, splits(0).length)
proves equivalent to any of the following
splits(0).drop(1)
splits(0).tail

Scala script in 2.11

I have found example code for Scala runtime scripting in an answer to Generating a class from string and instantiating it in Scala 2.10, however the code seems to be obsolete for 2.11 - I cannot find any function corresponding to build.setTypeSignature. Even if it worked, the code seems hard to read and follow to me.
How can Scala scripts be compiled and executed in Scala 2.11?
Let us assume I want the following:
define several variables (names and values)
compile script
(optional improvement) change variable values
execute script
For simplicity consider following example:
I want to define the following variables (programmatically, from the code, not from the script text):
val a = 1
val s = "String"
I want the following script to be compiled and, on execution, a String value "a is 1, s is String" returned from it:
s"a is $a, s is $s"
What should my functions look like?
def setupVariables() = ???
def compile() = ???
def changeVariables() = ???
def execute() : String = ???
Scala 2.11 adds a JSR-223 scripting engine. It should give you the functionality you are looking for. Just as a reminder, as with all of these sorts of dynamic things, including the example linked in the question above, you will lose type safety. You can see below that the return type is always Object.
Scala REPL Example:
scala> import javax.script.ScriptEngineManager
import javax.script.ScriptEngineManager
scala> val e = new ScriptEngineManager().getEngineByName("scala")
e: javax.script.ScriptEngine = scala.tools.nsc.interpreter.IMain#566776ad
scala> e.put("a", 1)
a: Object = 1
scala> e.put("s", "String")
s: Object = String
scala> e.eval("""s"a is $a, s is $s"""")
res6: Object = a is 1, s is String
An additional example, as an application running under Scala 2.11.6:
import javax.script.ScriptEngineManager

object EvalTest {
  def main(args: Array[String]): Unit = {
    val e = new ScriptEngineManager().getEngineByName("scala")
    e.put("a", 1)
    e.put("s", "String")
    println(e.eval("""s"a is $a, s is $s""""))
  }
}
For this application to work, make sure to include the library dependency:
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
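To map this onto the four functions sketched in the question: the Scala engine also implements javax.script.Compilable, so a script can be compiled once and re-run. A rough sketch under that assumption (whether a pre-compiled script picks up re-bound variables is engine-dependent, so verify changeVariables against your Scala version):
import javax.script.{Compilable, CompiledScript, ScriptEngineManager}

object ScriptRunner {
  private val engine = new ScriptEngineManager().getEngineByName("scala")
  // The Scala engine implements Compilable, allowing one-time compilation
  private val compilable = engine.asInstanceOf[Compilable]

  def setupVariables(): Unit = {
    engine.put("a", 1)
    engine.put("s", "String")
  }

  def compile(script: String): CompiledScript =
    compilable.compile(script)

  def changeVariables(): Unit =
    engine.put("a", 2) // re-bind before the next eval

  def execute(compiled: CompiledScript): String =
    compiled.eval().toString
}

// usage:
// ScriptRunner.setupVariables()
// val script = ScriptRunner.compile("""s"a is $a, s is $s"""")
// println(ScriptRunner.execute(script)) // a is 1, s is String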