I would like to get the groups attribute as a Seq[Int] from the given MongoDB Document. How can I do it? The getList method throws a runtime exception, and I would like to understand and fix it.
n: Document((_id,BsonObjectId{value=613645d689898b7d4ac2b1b2}), (groups,BsonArray{values=[BsonInt32{value=2}, BsonInt32{value=3}]}))
I tried the following, which compiles, but I get the runtime error "Caused by: java.lang.ClassCastException: List element cannot be cast to scala.Int$":
val groups = n.getList("groups", Int.getClass)
Some sbt library dependencies:
scalaVersion := "2.12.14"
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "4.3.1"
Setup code:
val collection = db.getCollection("mylist")
Await.result(collection.drop.toFuture, Duration.Inf)
val groupsIn = Seq[Int](2, 3)
val doc = Document("groups" -> groupsIn)
Await.result(collection.insertOne(doc).toFuture, Duration.Inf)
println("see mongosh to verify that a Seq[Int] has been added")
val result = Await.result(collection.find.toFuture, Duration.Inf)
for (n <- result) {
  println("n: " + n)
  val groups = n.getList("groups", Int.getClass)
  println("groups: " + groups)
}
Comments: result is of type Seq[Document], n is of type Document.
The hover description of getList in VS Code:
def getList[T](key: Any, clazz: Class[T]): java.util.List[T]
Gets the list value of the given key, casting the list elements to the given Class. This is useful to avoid having casts in client code, though the effect is the same.
With the help of sarveshseri and Gael J, the solution was reached. Int.getClass is the class of the Int companion object (scala.Int$), while the decoded list elements are java.lang.Integer instances, hence the ClassCastException; passing classOf[Integer] and converting afterwards fixes it:
import collection.JavaConverters._
val groups = n.getList("groups", classOf[Integer]).asScala.toSeq.map(p => p.toInt)
How to pass an array to a slick SQL plain query?
I tried the following, but it fails:
// "com.typesafe.slick" %% "slick" % "3.3.2", // latest version
val ids = Array(1, 2, 3)
db.run(sql"""select name from person where id in ($ids)""".as[String])
Error: could not find implicit value for parameter e: slick.jdbc.SetParameter[Array[Int]]
However this ticket seems to say that it should work:
https://github.com/tminglei/slick-pg/issues/131
Note: I am not interested in the following approach:
db.run(sql"""select name from person where id in #${ids.mkString("(", ",", ")")}""".as[Int])
The issue you linked points to a commit which adds this:
def mkArraySetParameter[T: ClassTag](/* ... */): SetParameter[Seq[T]]
def mkArrayOptionSetParameter[T: ClassTag](/* ... */): SetParameter[Option[Seq[T]]]
Note that they are not implicit.
You'll need to do something like
implicit val setIntArray: SetParameter[Array[Int]] = mkArraySetParameter[Int](...)
and make sure that is in scope when you try to construct your sql"..." string.
I met the same problem and searched around. I resolved it with an implicit val like this:
implicit val strListParameter: slick.jdbc.SetParameter[List[String]] =
  slick.jdbc.SetParameter[List[String]] { (param, pointedParameters) =>
    pointedParameters.setString(f"{${param.mkString(", ")}}")
  }
Put it into your slick-pg profile and import it along with the other vals where needed.
Or, more strictly, like this:
implicit val strListParameter: slick.jdbc.SetParameter[List[String]] =
  slick.jdbc.SetParameter[List[String]] { (param, pointedParameters) =>
    pointedParameters.setObject(param.toArray, java.sql.Types.ARRAY)
  }
implicit val strSeqParameter: slick.jdbc.SetParameter[Seq[String]] =
  slick.jdbc.SetParameter[Seq[String]] { (param, pointedParameters) =>
    pointedParameters.setObject(param.toArray, java.sql.Types.ARRAY)
  }
and use the vals like this:
val entries: Seq[String]
val query = {
  sql"""select ... from xxx
        where entry = ANY($entries)
        order by ...
     """.as[(Column, Types, In, Here)]
}
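Applied to the original Array[Int]/ids example, a minimal sketch along the same lines (assuming PostgreSQL, hence the profile import, and that the JDBC driver accepts the setObject/Types.ARRAY binding shown above):
import slick.jdbc.PostgresProfile.api._
import slick.jdbc.SetParameter

implicit val intSeqParameter: SetParameter[Seq[Int]] =
  SetParameter[Seq[Int]] { (param, pointedParameters) =>
    // bind the whole sequence as a single SQL array parameter
    pointedParameters.setObject(param.toArray, java.sql.Types.ARRAY)
  }

val ids: Seq[Int] = Seq(1, 2, 3)
// with the implicit in scope the interpolation compiles; use = ANY(...) rather than IN (...)
val action = sql"""select name from person where id = ANY($ids)""".as[String]
// db.run(action)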
I'm facing some issues with SQL interpolation in ScalikeJdbc. Upon trying to run the following piece of code
val dbTableSQLSyntax: SQLSyntax = SQLSyntax.createUnsafely(dbTableName)
sql"""
  SELECT
    COUNT(*) AS count,
    MIN($distributionColumn) AS min,
    MAX($distributionColumn) AS max
  FROM
    $dbTableSQLSyntax
""".stripMargin.
  map(mapResult).
  single().
  apply().
  get()
I get this error
scalikejdbc.ResultSetExtractorException: Failed to retrieve value because For input string: "tab_id". If you're using SQLInterpolation, you may mistake u.id for u.resultName.id.
at scalikejdbc.WrappedResultSet.wrapIfError(WrappedResultSet.scala:27)
at scalikejdbc.WrappedResultSet.get(WrappedResultSet.scala:479)
at scalikejdbc.WrappedResultSet.longOpt(WrappedResultSet.scala:233)
...
How can I get rid of this error without having to use the Query DSL?
Is there anything I can improve (in terms of performance or security) in the above code snippet?
Frameworks / Libraries
Scala 2.11.11
"org.scalikejdbc" %% "scalikejdbc" % "3.2.0"
EDIT-1
In response to @Kazuhiro Sera's answer, I'm providing my mapResult method:
def mapResult(rs: WrappedResultSet): (Long, Long, Long) = {
  val count: Long = rs.long("count")
  val minOpt: Option[Long] = rs.longOpt("min")
  val maxOpt: Option[Long] = rs.longOpt("max")
  (count, minOpt.getOrElse(0), maxOpt.getOrElse(Long.MaxValue))
}
It depends on your mapResult function. I am afraid that mapResult tries to fetch tab_id from the ResultSet, while your SQL query returns only count, min, and max.
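For what it's worth, an assumption on my side (not part of the answer above): interpolating a plain String such as $distributionColumn into sql"..." binds it as a parameter, so the database computes MIN/MAX of the literal string "tab_id" rather than of the column, and rs.longOpt("min") then fails to parse that string as a number. A sketch that embeds the column name the same way as the table name:
import scalikejdbc._

// Assumption: distributionColumn is trusted, so embedding it unescaped via
// SQLSyntax.createUnsafely is acceptable (as already done for the table name).
val columnSyntax: SQLSyntax = SQLSyntax.createUnsafely(distributionColumn)

val stats: Option[(Long, Long, Long)] =
  sql"""
    SELECT
      COUNT(*) AS count,
      MIN($columnSyntax) AS min,
      MAX($columnSyntax) AS max
    FROM
      $dbTableSQLSyntax
  """.map(mapResult).single().apply() // needs an implicit DBSession in scope, e.g. DB.readOnly { implicit session => ... }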
I'm trying to write a simple program to traverse all the referenced code starting from a given method using scalameta.
I was able to follow the calls but could not resolve method references.
analyzeme/src/main/scala/codelab/FindMe.scala
package codelab
object FindMe {
  def main(args: Array[String]): Unit = {
    val x = someRecognizeableName(1, 2)
    val y = List(1, 2, 3)
    y.foldLeft(0)(someRecognizeableName)
  }

  def someRecognizeableName(a: Int, b: Int): Int = a + b
}
I generated and loaded the semanticdb for FindMe.scala and checked the usages of the someRecognizeableName method.
I can see the first call in the db.names list:
[87..108): someRecognizeableName => _root_.codelab.FindMe.someRecognizeableName(Int,Int).
The second one, though, where I don't call the method but just pass the reference, shows up as this:
[159..180): someRecognizeableName => local2_src_main_scala_codelab_FindMe_scala
So when I try to follow references starting from main, I don't get a fully qualified name for the someRecognizeableName reference in the second case.
Question: Is there a way to get a fully qualified name from semanticdb for that reference?
Full source to reproduce the above
run instructions:
analyzeme $ sbt compile
analyzer $ sbt "run ../analyzeme"
analyzeme/src/main/scala/codelab/FindMe.scala
package codelab
object FindMe {
  def main(args: Array[String]): Unit = {
    val x = someRecognizeableName(1, 2)
    val y = List(1, 2, 3)
    y.foldLeft(0)(someRecognizeableName)
  }

  def someRecognizeableName(a: Int, b: Int): Int = a + b
}
analyzer/src/main/scala/Main.scala
import org.langmeta.io.{Classpath, Sourcepath}
import scala.meta._
object Main {
  def main(args: Array[String]): Unit = {
    println(s"Loading from [${ args(0) }]")
    println()
    val cp = Classpath(s"${ args(0) }/target/scala-2.12/classes")
    val sp = Sourcepath(s"${ args(0) }/src/main/scala")
    val db = Database.load(cp, sp)
    println("* names:")
    db.names foreach println
    println()
    println("* symbols:")
    db.symbols foreach println
    println()
    println("* synthetics:")
    db.synthetics foreach println
    println()
    println("* messages:")
    db.messages foreach println
    println()
  }
}
analyzeme/build.sbt
name := "analyzee"
version := "0.1"
scalaVersion := "2.12.4"
addCompilerPlugin("org.scalameta" % "semanticdb-scalac" % "3.4.0" cross CrossVersion.full)
scalacOptions += "-Yrangepos"
analyzer/build.sbt
name := "analyzer"
version := "0.1"
scalaVersion := "2.12.4"
libraryDependencies += "org.scalameta" %% "scalameta" % "3.4.0"
libraryDependencies += "org.scalameta" %% "contrib" % "3.4.0"
package codelab
object FindMe {
  def main(args: Array[String]): Unit = {
    val x = someRecognizeableName(1, 2)
    val y = List(1, 2, 3)
    y.foldLeft(0)(someRecognizeableName)
    // same as
    y.foldLeft(0) { (a, b) => someRecognizeableName(a, b) }
  }

  def someRecognizeableName(a: Int, b: Int): Int = a + b
}
I debugged the code and found that in the second case the compiler passes an anonymous symbol that is not accessible from the current semanticdb; it should perhaps appear in the synthetics section, but I can't find it there.
So I guess the compiler-generated anonymous function is missing from the current semanticdb.
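If the goal is just to follow the call graph, one workaround sketch (an assumption on my part, not verified against the semanticdb output): apply the method explicitly inside a lambda, so the call site itself resolves like the first usage.
// Instead of passing the method value (eta-expansion produces a compiler-generated
// local function symbol), call the method explicitly inside the lambda.
val y = List(1, 2, 3)
y.foldLeft(0)((a, b) => someRecognizeableName(a, b))
// The inner call is a regular application, so db.names should resolve it to
// _root_.codelab.FindMe.someRecognizeableName(Int,Int)., as in the first case.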
I think I understand the rules of implicit returns but I can't figure out why splithead is not being set. This code is run via
val m = new TaxiModel(sc, file)
and then I expect
m.splithead
to give me an array of strings. Note that head is an array of strings.
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
class TaxiModel(sc: SparkContext, dat: String) {
  val rawData = sc.textFile(dat)
  val head = rawData.take(10)
  val splithead = head.slice(1,11).foreach(splitData)

  def splitData(dat: String): Array[String] = {
    val splits = dat.split("\",\"")
    val split0 = splits(0).substring(1, splits(0).length)
    val split8 = splits(8).substring(0, splits(8).length - 1)
    Array(split0).union(splits.slice(1, 8)).union(Array(split8))
  }
}
foreach just evaluates the expression for its side effects and does not collect any results while iterating, so splithead ends up being Unit. You probably need map or flatMap (see docs here):
head.slice(1,11).map(splitData) // gives you Array[Array[String]]
head.slice(1,11).flatMap(splitData) // gives you Array[String]
Consider also a for comprehension (which desugars in this case into map),
for (s <- head.slice(1,11)) yield splitData(s)
Note also that Scala strings are equipped with ordered collections methods, thus
splits(0).substring(1, splits(0).length)
proves equivalent to any of the following
splits(0).drop(1)
splits(0).tail
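Putting the pieces together, a minimal sketch of the class with map in place of foreach (and the substring calls replaced by drop/dropRight, which I believe preserves the original behavior):
import org.apache.spark.SparkContext

class TaxiModel(sc: SparkContext, dat: String) {
  val rawData = sc.textFile(dat)
  val head = rawData.take(10)
  // map keeps the results, so splithead is an Array[Array[String]]
  val splithead: Array[Array[String]] = head.slice(1, 11).map(splitData)

  def splitData(dat: String): Array[String] = {
    val splits = dat.split("\",\"")
    val first = splits(0).drop(1)      // strip the leading quote
    val last = splits(8).dropRight(1)  // strip the trailing quote
    Array(first) ++ splits.slice(1, 8) ++ Array(last)
  }
}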
I have found example code for Scala runtime scripting in an answer to Generating a class from string and instantiating it in Scala 2.10; however, the code seems to be obsolete for 2.11 - I cannot find any function corresponding to build.setTypeSignature. Even if it worked, the code seems hard to read and follow.
How can Scala scripts be compiled and executed in Scala 2.11?
Let us assume I want the following:
define several variables (names and values)
compile script
(optional improvement) change variable values
execute script
For simplicity, consider the following example. I want to define these variables (programmatically, from the code, not from the script text):
val a = 1
val s = "String"
I want the following script to be compiled and, on execution, to return the String value "a is 1, s is String":
s"a is $a, s is $s"
What should my functions look like?
def setupVariables() = ???
def compile() = ???
def changeVariables() = ???
def execute() : String = ???
Scala 2.11 adds a JSR-223 scripting engine. It should give you the functionality you are looking for. Just as a reminder, as with all of these sorts of dynamic things, including the example listed in the description above, you will lose type safety. You can see below that the return type is always Object.
Scala REPL Example:
scala> import javax.script.ScriptEngineManager
import javax.script.ScriptEngineManager
scala> val e = new ScriptEngineManager().getEngineByName("scala")
e: javax.script.ScriptEngine = scala.tools.nsc.interpreter.IMain#566776ad
scala> e.put("a", 1)
a: Object = 1
scala> e.put("s", "String")
s: Object = String
scala> e.eval("""s"a is $a, s is $s"""")
res6: Object = a is 1, s is String
An additional example, as an application running under Scala 2.11.6:
import javax.script.ScriptEngineManager
object EvalTest {
  def main(args: Array[String]): Unit = {
    val e = new ScriptEngineManager().getEngineByName("scala")
    e.put("a", 1)
    e.put("s", "String")
    println(e.eval("""s"a is $a, s is $s""""))
  }
}
For this application to work, make sure to include the library dependency:
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value