
How to pass an array to a slick SQL plain query?
I tried as follows but it fails:
// "com.typesafe.slick" %% "slick" % "3.3.2", // latest version
val ids = Array(1, 2, 3)
db.run(sql"""select name from person where id in ($ids)""".as[String])
Error: could not find implicit value for parameter e: slick.jdbc.SetParameter[Array[Int]]
However this ticket seems to say that it should work:
https://github.com/tminglei/slick-pg/issues/131
Note: I am not interested in the following approach:
db.run(sql"""select name from person where id in #${ids.mkString("(", ",", ")")}""".as[String])

The issue you linked points to a commit which adds this:
def mkArraySetParameter[T: ClassTag](/* ... */): SetParameter[Seq[T]]
def mkArrayOptionSetParameter[T: ClassTag](/* ... */): SetParameter[Option[Seq[T]]]
Note that they are not implicit.
You'll need to do something like
implicit val setIntSeq: SetParameter[Seq[Int]] = mkArraySetParameter[Int](...)
and make sure that is in scope when you try to construct your sql"..." string.
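For reference, a hand-rolled alternative that avoids the slick-pg helpers; a minimal sketch, assuming PostgreSQL (where a bound array is matched with = ANY(...) rather than IN (...)) and using the JDBC connection underneath Slick:
import slick.jdbc.SetParameter

// Sketch: build a java.sql.Array on the underlying JDBC connection and
// bind it as a single ARRAY parameter (PostgreSQL-specific assumption).
implicit val setIntSeqJdbc: SetParameter[Seq[Int]] = SetParameter { (seq, pp) =>
  val jdbcArray = pp.ps.getConnection.createArrayOf("integer", seq.map(Int.box).toArray[AnyRef])
  pp.setObject(jdbcArray, java.sql.Types.ARRAY)
}

val ids: Seq[Int] = Seq(1, 2, 3)
db.run(sql"""select name from person where id = ANY($ids)""".as[String])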

I met the same problem and searched for it.
I resolved it with an implicit val like this:
implicit val strListParameter: slick.jdbc.SetParameter[List[String]] =
  slick.jdbc.SetParameter[List[String]] { (param, pointedParameters) =>
    pointedParameters.setString(param.mkString("{", ", ", "}"))
  }
Put it into your slick-pg profile and import it along with the other implicits where needed.
Or, more strictly, like this:
implicit val strListParameter: slick.jdbc.SetParameter[List[String]] =
  slick.jdbc.SetParameter[List[String]] { (param, pointedParameters) =>
    pointedParameters.setObject(param.toArray, java.sql.Types.ARRAY)
  }
implicit val strSeqParameter: slick.jdbc.SetParameter[Seq[String]] =
  slick.jdbc.SetParameter[Seq[String]] { (param, pointedParameters) =>
    pointedParameters.setObject(param.toArray, java.sql.Types.ARRAY)
  }
and use them like this:
val entries: Seq[String] = ???
val query = {
  sql"""select ... from xxx
        where entry = ANY($entries)
        order by ...
     """.as[(Column, Types, In, Here)]
}

Related

scala mongodb document getList

I would like to get the groups attribute as Seq[Int] from the given mongodb Document. How can I do it? The getList method throws a runtime exception, and I would like to understand and fix it.
n: Document((_id,BsonObjectId{value=613645d689898b7d4ac2b1b2}), (groups,BsonArray{values=[BsonInt32{value=2}, BsonInt32{value=3}]}))
I tried the following, which compiles, but at runtime I get the error "Caused by: java.lang.ClassCastException: List element cannot be cast to scala.Int$":
val groups = n.getList("groups", Int.getClass)
Some sbt library dependencies:
scalaVersion := "2.12.14"
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "4.3.1"
Setup code:
val collection = db.getCollection("mylist")
Await.result(collection.drop.toFuture, Duration.Inf)
val groupsIn = Seq[Int](2, 3)
val doc = Document("groups" -> groupsIn)
Await.result(collection.insertOne(doc).toFuture, Duration.Inf)
println("see mongosh to verify that a Seq[Int] has been added")
val result = Await.result(collection.find.toFuture, Duration.Inf)
for (n <- result) {
  println("n: " + n)
  val groups = n.getList("groups", Int.getClass)
  println("groups: " + groups)
}
Comments: result is of type Seq[Document], n is of type Document.
The getList hover description in VS Code:
def getList[T](key: Any, clazz: Class[T]): java.util.List[T]
Gets the list value of the given key, casting the list elements to the given Class. This is useful to avoid having casts in client code, though the effect is the same.
The problem is that Int.getClass returns the class of the Int companion object (scala.Int$), not classOf[Int], and the driver returns the elements as java.lang.Integer in any case, so the cast fails. With the help of sarveshseri and Gael J, the solution is reached:
import collection.JavaConverters._
val groups = n.getList("groups", classOf[Integer]).asScala.toSeq.map(p => p.toInt)
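An alternative sketch that avoids getList's cast altogether by reading the raw BSON values (same Document n as above):
// reuses the JavaConverters import above; each element is a BsonInt32
val groups2: Seq[Int] = n("groups").asArray.getValues.asScala.toSeq.map(_.asInt32.getValue)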

Cannot splat an Array into function's arguments that accepts varargs

I have tried to make a function that can enrich a given DataFrame with a "session" column using a window function. So I need to use partitionBy and orderBy.
val by_uuid_per_date = Window.partitionBy("uuid").orderBy("year","month","day")
// A Session = A day of events for a certain user. uuid x (year+month+day)
val enriched_df = df
  .withColumn("session", dense_rank().over(by_uuid_per_date))
  .orderBy("uuid", "timestamp")
  .select("uuid", "year", "month", "day", "session")
This works perfectly, but when I try to make a function that encapsulates this behavior (PS: I used the _* splat operator):
def enrich_with_session(df: DataFrame,
                        window_partition_cols: Array[String],
                        window_order_by_cols: Array[String],
                        presentation_order_by_cols: Array[String]): DataFrame = {
  val by_uuid_per_date = Window.partitionBy(window_partition_cols: _*).orderBy(window_order_by_cols: _*)
  df.withColumn("session", dense_rank().over(by_uuid_per_date))
    .orderBy(presentation_order_by_cols: _*)
    .select("uuid", "year", "month", "day", "session")
}
I get the following error:
notebook:6: error: no `: _*' annotation allowed here
(such annotations are only allowed in arguments to *-parameters)
val by_uuid_per_date = Window.partitionBy(window_partition_cols: _*).orderBy(window_order_by_cols: _*)
partitionBy and orderBy take either Column* varargs or a first String followed by String* varargs. A splatted Array[String] matches neither shape, but a splatted Seq[Column] or Array[Column] matches the Column* overload, see below:
// assuming a SparkSession named `spark`, as in a notebook
import org.apache.spark.sql.Column
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.rank
import spark.implicits._

val data = Seq(
  (1, 99),
  (1, 99),
  (1, 70),
  (1, 20)
).toDF("id", "value")

data.select('id, 'value, rank().over(Window.partitionBy('id).orderBy('value))).show()

val partitionBy: Seq[Column] = Seq(data("id"))
val orderBy: Seq[Column] = Seq(data("value"))
data.select('id, 'value, rank().over(Window.partitionBy(partitionBy: _*).orderBy(orderBy: _*))).show()
So in this case, your code should look like this:
import org.apache.spark.sql.{Column, DataFrame}
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.dense_rank

def enrich_with_session(df: DataFrame,
                        window_partition_cols: Array[String],
                        window_order_by_cols: Array[String],
                        presentation_order_by_cols: Array[String]): DataFrame = {
  val window_partition_cols_2: Array[Column] = window_partition_cols.map(df(_))
  val window_order_by_cols_2: Array[Column] = window_order_by_cols.map(df(_))
  val presentation_order_by_cols_2: Array[Column] = presentation_order_by_cols.map(df(_))
  val by_uuid_per_date = Window.partitionBy(window_partition_cols_2: _*).orderBy(window_order_by_cols_2: _*)
  df.withColumn("session", dense_rank().over(by_uuid_per_date))
    .orderBy(presentation_order_by_cols_2: _*)
    .select("uuid", "year", "month", "day", "session")
}
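For illustration, a hypothetical call (column names taken from the question):
val enriched = enrich_with_session(
  df,
  window_partition_cols = Array("uuid"),
  window_order_by_cols = Array("year", "month", "day"),
  presentation_order_by_cols = Array("uuid", "timestamp")
)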

Evaluate complex type with quasiquote scala, unlifting

I need to compile a function and then evaluate it with different parameters of type List[Map[String, AnyRef]].
I have the following code, which does not compile with this type but does compile with a simple type like List[Int].
I found that there are just certain implementations of Liftable in scala.reflect.api.StandardLiftables.StandardLiftableInstances
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox
val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
val functionWrapper =
  """
    object FunctionWrapper {
      def makeBody(messages: List[Map[String, AnyRef]]) = Map.empty
    }""".stripMargin
val functionSymbol =
  tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
tb.eval(q"$functionSymbol.makeBody($list)")
I am getting a compilation error for this; how can I make it work?
Error:(22, 38) Can't unquote List[Map[String,AnyRef]], consider using
... or providing an implicit instance of
Liftable[List[Map[String,AnyRef]]]
tb.eval(q"$functionSymbol.makeBody($list)")
^
The problem comes not from the complicated type but from the attempt to use AnyRef. When you unquote a value, you ask the infrastructure to create a syntax tree that reconstructs an object exactly matching the one you passed. Unfortunately this is obviously not possible for all objects. For example, assume that you've passed a reference to Thread.currentThread() as a part of the Map. How could that possibly work? The compiler is simply not able to recreate such a complicated object (not to mention making it the current thread). So you have two obvious alternatives:
Make your argument also a Tree, i.e. something like this:
def testTree() = {
  val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
  val functionWrapper =
    """
      | object FunctionWrapper {
      |
      |   def makeBody(messages: List[Map[String, AnyRef]]) = Map.empty
      |
      | }
    """.stripMargin
  val functionSymbol =
    tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
  //val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
  val list = q"""List(Map("1" -> "2"))"""
  val res = tb.eval(q"$functionSymbol.makeBody($list)")
  println(s"testTree = $res")
}
The obvious drawback of this approach is that you lose compile-time type safety and might need to provide a lot of context for the tree to work.
Another approach is to not pass anything containing AnyRef to the compiler infrastructure at all. It means you create some function-like Wrapper:
package so {
  trait Wrapper {
    def call(args: List[Map[String, AnyRef]]): Map[String, AnyRef]
  }
}
and then make your generated code return a Wrapper instead of directly executing the logic and call the Wrapper from the usual Scala code rather than inside compiled code. Something like this:
def testWrapper() = {
  val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
  val functionWrapper =
    """
      |object FunctionWrapper {
      |  import scala.collection._
      |  import so.Wrapper /* <- here probably different package :) */
      |
      |  def createWrapper(): Wrapper = new Wrapper {
      |    override def call(args: List[Map[String, AnyRef]]): Map[String, AnyRef] = Map.empty
      |  }
      |}
      | """.stripMargin
  val functionSymbol = tb.define(tb.parse(functionWrapper).asInstanceOf[tb.u.ImplDef])
  val list: List[Map[String, AnyRef]] = List(Map("1" -> "2"))
  val tree: tb.u.Tree = q"$functionSymbol.createWrapper()"
  val wrapper = tb.eval(tree).asInstanceOf[Wrapper]
  val res = wrapper.call(list)
  println(s"testWrapper = $res")
}
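As a side note, the error message's suggestion of providing a Liftable is only needed for element types the standard instances can't lift. If the values really are Strings, narrowing the declared type is enough; a sketch, assuming the toolbox setup from the question:
// With String values, the built-in Liftable instances for List, Map and
// String apply, so the unquote compiles.
val list: List[Map[String, String]] = List(Map("1" -> "2"))
val res = tb.eval(q"$functionSymbol.makeBody($list)")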
P.S. I'm not sure what you are doing, but beware of performance issues. Scala is a hard language to compile, and thus it might easily take more time to compile your custom code than to run it. If performance becomes an issue, you might need to use other methods such as full-blown macro code generation, or at least caching of the compiled code.

Mapping column types Slick 3.1.1

I am new to Slick and having a really hard time getting java.sql.Date/Time/Timestamp mapped to Joda-Time types.
import java.sql.{Date, Time, Timestamp}
import java.text.SimpleDateFormat
import org.joda.time.{DateTime, DateTimeZone, LocalDate, LocalDateTime, LocalTime}
import org.joda.time.format.DateTimeFormat
import slick.driver.JdbcProfile

trait ColumnTypeMappings {
  val profile: JdbcProfile
  import profile.api._

  val localTimeFormatter = DateTimeFormat.forPattern("HH:mm:ss")
  val javaTimeFormatter = new SimpleDateFormat("HH:mm:ss")

  implicit val myDateColumnType = MappedColumnType.base[LocalDate, Date](
    ld => new java.sql.Date(ld.toDateTimeAtStartOfDay(DateTimeZone.UTC).getMillis),
    d => new LocalDateTime(d.getTime).toLocalDate
  )

  implicit val myTimeColumnType = MappedColumnType.base[LocalTime, Time](
    lt => new java.sql.Time(javaTimeFormatter.parse(lt.toString(localTimeFormatter)).getTime),
    t => new LocalTime(t.getTime)
  )

  implicit val myTimestampColumnType = MappedColumnType.base[DateTime, Timestamp](
    dt => new java.sql.Timestamp(dt.getMillis),
    ts => new DateTime(ts.getTime, DateTimeZone.UTC)
  )
}
In the auto-generated Tables.scala I include the mapping like this:
trait Tables extends ColumnTypeMappings {
  val profile: slick.driver.JdbcDriver
  import profile.api._
  import scala.language.implicitConversions
  // + rest of the auto generated code by slick codegen
}
And to wrap it all up I use it like this:
object TestTables extends Tables {
  val profile = slick.driver.MySQLDriver
}

import TestTables._
import profile.api._

val db = Database.forURL("url", "user", "password", driver = "com.mysql.jdbc.Driver")
val q = Company.filter(_.companyid === 1).map(_.name)
val action = q.result
val future = db.run(action)
val result = Await.result(future, Duration.Inf)
I get a NullPointerException on implicit val myDateColumnType... when running this. I've verified that this last block of code works if I remove the mapping.
Try changing implicit val to implicit def in your definitions of the MappedColumnTypes. The reason why is related to the answer given by Maksym Chernenko to this question. Generally, the JdbcProfile driver (that defines api.MappedColumnType) has not been injected yet, and that causes the NPE. You can either make your "mapper" vals lazy, or change them from val to def (as shown below):
implicit def myDateColumnType = MappedColumnType.base[LocalDate, Date](
  ld => new java.sql.Date(ld.toDateTimeAtStartOfDay(DateTimeZone.UTC).getMillis),
  d => new LocalDateTime(d.getTime).toLocalDate
)

implicit def myTimeColumnType = MappedColumnType.base[LocalTime, Time](
  lt => new java.sql.Time(javaTimeFormatter.parse(lt.toString(localTimeFormatter)).getTime),
  t => new LocalTime(t.getTime)
)

implicit def myTimestampColumnType = MappedColumnType.base[DateTime, Timestamp](
  dt => new java.sql.Timestamp(dt.getMillis),
  ts => new DateTime(ts.getTime, DateTimeZone.UTC)
)
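The lazy val variant mentioned above works for the same reason; a sketch of the first mapping only, deferring initialization until the profile has been injected:
implicit lazy val myDateColumnType = MappedColumnType.base[LocalDate, Date](
  ld => new java.sql.Date(ld.toDateTimeAtStartOfDay(DateTimeZone.UTC).getMillis),
  d => new LocalDateTime(d.getTime).toLocalDate
)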
So I think the issue may be that you are extending ColumnTypeMappings in your Tables.scala. The documentation doesn't make it clear, but I think the auto-generated code relating to the database should not be touched, as this is used by Slick to map the rows in the DB; instead, extend TestTables with ColumnTypeMappings to do the implicit conversion when you get the result back from the database.
I haven't particularly delved into Slick 3.x yet, so I may be wrong, but I think that makes sense.
Edit: No, I was wrong :(. Apologies.

Add data type from PostgreSQL extension in Slick

I'm using the PostGIS extension for PostgreSQL and I'm trying to retrieve a PGgeometry object from a table.
This version works fine:
import java.sql.DriverManager
import java.sql.Connection
import org.postgis.PGgeometry

object PostgersqlTest extends App {
  val driver = "org.postgresql.Driver"
  val url = "jdbc:postgresql://localhost:5432/gis"
  var connection: Connection = null

  try {
    Class.forName(driver)
    connection = DriverManager.getConnection(url)
    val statement = connection.createStatement()
    val resultSet = statement.executeQuery("SELECT geom FROM table;")
    while (resultSet.next()) {
      val geom = resultSet.getObject("geom").asInstanceOf[PGgeometry]
      println(geom)
    }
  } catch {
    case e: Exception => e.printStackTrace()
  }
  connection.close()
}
I need to be able to do the same thing using a Slick custom query. But this version doesn't work:
Q.queryNA[PGgeometry]("SELECT geom FROM table;")
and gives me this compilation error
Error:(50, 40) could not find implicit value for parameter rconv: scala.slick.jdbc.GetResult[org.postgis.PGgeometry]
val query = Q.queryNA[PGgeometry](
^
Is there a simple way to add the PGgeometry data type in Slick without having to convert the returned object to a String and parse it?
To use it successfully, you need to define a GetResult, and maybe a SetParameter if you want to insert/update it in the db.
Here's some code extracted from the Slick tests (P.S. I assume you're using Slick 2.1.0):
implicit val getUserResult = GetResult(r => new User(r.<<, r.<<))
case class User(id:Int, name:String)
val userForID = Q[Int, User] + "select id, name from USERS where id = ?"
But, if your java/scala type is jts.Geometry instead of PGgeometry, you can try to use slick-pg, which has built-in support for jts.Geometry and PostGIS for slick Lifted and Plain SQL.
To overcome the same issue, I used slick-pg (0.8.2) and JTS's Geometry classes, as tminglei mentioned in the previous answer. There are two steps to use slick-pg to handle PostGIS's geometry types: (i) extend Slick's PostgresDriver with PgPostGISSupport and (ii) define an implicit converter for your plain query, as shown below.
As shown on this page, you should first extend PostgresDriver with PgPostGISSupport:
object MyPostgresDriver extends PostgresDriver with PgPostGISSupport {
  override lazy val Implicit = new Implicits with PostGISImplicits
  override val simple = new Implicits with SimpleQL with PostGISImplicits with PostGISAssistants

  val plainImplicits = new Implicits with PostGISPlainImplicits
}
Using the implicit conversions defined in plainImplicits in the extended driver, you can write your query as:
import com.vividsolutions.jts.geom.LineString // Or any other JTS geometry types.
import MyPostgresDriver.plainImplicits._
import scala.slick.jdbc.GetResult
case class Row(id: Int, geom: LineString)
implicit val geomConverter = GetResult[Row](r => {
  Row(r.nextInt, r.nextGeometry[LineString])
})

val query = Q.queryNA[Row](
  """SELECT id, geom FROM table;"""
)
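A hypothetical way to run it, in Slick 2.x plain-query style (URL and names are placeholders):
import scala.slick.jdbc.JdbcBackend.Database

val db = Database.forURL("jdbc:postgresql://localhost:5432/gis", driver = "org.postgresql.Driver")
db.withSession { implicit session =>
  // list materializes the plain query using the implicit GetResult above
  query.list.foreach(row => println(s"${row.id}: ${row.geom}"))
}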