I'm using Slick 3.1.0 and Slick-pg 0.10.0. I have an enum as:
object UserProviders extends Enumeration {
  type Provider = Value
  val Google, Facebook = Value
}
Following the test case, the column mapper works fine once I simply add the following implicit mapper to my customized driver.
implicit val userProviderMapper = createEnumJdbcType("UserProvider", UserProviders, quoteName = true)
However, when using plain SQL, I encountered the following compilation error:
could not find implicit value for parameter e: slick.jdbc.SetParameter[Option[models.UserProviders.Provider]]
I could not find any documentation about this. How can I write plain SQL with an enum in Slick? Thanks.
You need to have an implicit of type SetParameter[T] in scope, which tells Slick how to set parameters from some custom type T that it doesn't already know about. For example:
import java.sql.Timestamp
import java.time.Instant

implicit val setInstant: SetParameter[Instant] = SetParameter { (instant, pp) =>
  pp.setTimestamp(new Timestamp(instant.toEpochMilli))
}
The type of pp is PositionedParameters.
You might also come across the need to tell Slick how to extract a query result into some custom type T that it doesn't already know about. For this, you need an implicit GetResult[T] in scope. For example:
implicit def getInstant(implicit get: GetResult[Long]): GetResult[Instant] =
  get andThen (Instant.ofEpochMilli(_))
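For the enum in the question, a sketch of both implicits might look like this, assuming the value can be bound and read as its string name (depending on your column type, e.g. a Postgres enum, you may also need an explicit cast in the SQL itself):

import slick.jdbc.{GetResult, SetParameter}

implicit val setProvider: SetParameter[Option[UserProviders.Provider]] =
  SetParameter { (provider, pp) =>
    // bind the enum's name, or NULL for None
    pp.setStringOption(provider.map(_.toString))
  }

implicit val getProvider: GetResult[Option[UserProviders.Provider]] =
  GetResult(r => r.nextStringOption().map(UserProviders.withName))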
In a DataFrame object in Apache Spark (I'm using the Scala interface), if I'm iterating over its Row objects, is there any way to extract values by name? I can see how to do some really awkward stuff:
def foo(r: Row) = {
  val ix = (0 until r.schema.length).map(i => r.schema(i).name -> i).toMap
  val field1 = r.getString(ix("field1"))
  val field2 = r.getLong(ix("field2"))
  ...
}
dataframe.map(foo)
I figure there must be a better way - this is pretty verbose, it requires creating an extra structure, and it also requires knowing the types explicitly, which, if incorrect, will produce a runtime exception rather than a compile-time error.
You can use "getAs" from org.apache.spark.sql.Row
r.getAs("field1")
r.getAs("field2")
See the documentation for getAs(java.lang.String fieldName) to learn more.
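For instance, the foo from the question could be reduced to a sketch like this, using the question's field names and assumed types:

import org.apache.spark.sql.Row

def foo(r: Row) = {
  val field1 = r.getAs[String]("field1")
  val field2 = r.getAs[Long]("field2")
  (field1, field2)
}
dataframe.map(foo)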
This is not supported at this time in the Scala API. The closest you have is this JIRA, titled "Support converting DataFrames to typed RDDs".
To SQL-compile a query, you need to compile a function which takes, for each query parameter arg: type, a lifted parameter of type Rep[type].
I have a case class JobRecord and a TableQuery jobRecords.
So to insert a JobRecord case-class instance, I need to be able to say something like:
val qMapToId = (jobRecords returning jobRecords.map(_.id))
def ucCreate(jobRecord: Rep[JobRecord]) = qMapToId += jobRecord
val cCreate = Compiled(ucCreate _)
But of course this doesn't compile, because += doesn't take a Rep, and I'm not sure Rep[JobRecord] is valid either.
I've tried many things, not worth showing, including mixing in the Monomorphic Case Classes guidance. I probably got within a step of a solution a few times. A pointer to a working example would be great!
You don't have to do anything: val qMapToId = (jobRecords returning jobRecords.map(_.id)) generates the statement once, up front (i.e. on container startup).
Compiled replaces Parameters in the new API and comes into play for selects, updates, and (I believe) deletes, where you are binding placeholders to generate a prepared statement. For insert statements there's nothing to bind; you already have an instance to pass to +=.
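In other words, you can run the insert directly. A minimal sketch, assuming the jobRecords table from the question, an auto-incremented Long id, and a placeholder instance someJobRecord:

val qMapToId = jobRecords returning jobRecords.map(_.id)

// returns a DBIO[Long] that yields the generated id
def create(jobRecord: JobRecord): DBIO[Long] = qMapToId += jobRecord

db.run(create(someJobRecord))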
You can use TableQuery as follows.
// define the Table and TableQuery for JobRecord
case class JobRecordRow(...)
class JobRecord(tag: Tag) extends Table[JobRecordRow](tag, "JOB_TABLE_NAME") {
  // column definitions and the * projection go here
}

// define the compiled query
val insert = Compiled(TableQuery[JobRecord].filter(_ => true: Rep[Boolean]))
val stmt = insert += JobRecordRow(...)
db.run(stmt)
The Compiled query seems a little bit tricky. However, when I tried Compiled(TableQuery[JobRecord]), as suggested in other articles, it did not work. By adding filter(), I could build the insert query.
Updated on 2019-07-21
Instead of filter(), map(identity) can be used.
TableQuery[JobRecord].map(identity)
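Put together, the compiled insert with map(identity) would look something like this (same elided JobRecord definitions as above):

val insert = Compiled(TableQuery[JobRecord].map(identity))
val stmt = insert += JobRecordRow(...)
db.run(stmt)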
I would like to implement an external DSL such as SQL in Scala using Macros. I have already seen papers on how to implement internal DSLs with Scala. Also, I've recently written an article about how this can be done in Java, myself.
Now, internal DSLs always feel a bit clumsy, as they have to be implemented and used in the host language (e.g. Scala) and adhere to the host language's syntax constraints. That's why I'm hoping that Scala Macros will allow me to internalise an external DSL without any such constraints. However, I don't fully understand Scala Macros and how far I can go with them. I've seen that SLICK and also a much less-known library called sqltyped have started using Macros, but SLICK uses a "Scalaesque" syntax for querying, which isn't really SQL, whereas sqltyped uses Macros to parse SQL strings (which can be done without Macros, too). Also, the various examples given on the Scala website are too trivial for what I'm trying to do.
My question is:
Given an example external DSL defined as some BNF grammar like this:
MyGrammar ::= (
  'SOME-KEYWORD' 'OPTION'?
  (
      ( 'CHOICE-1' 'ARG-1'+ )
    | ( 'CHOICE-2' 'ARG-2' )
  )
)
Can I implement the above grammar using Scala Macros to allow for client programs like this? Or are Scala Macros not powerful enough to implement such a DSL?
// This function would take a Scala compile-checked argument and produce an AST
// of some sort, that I can further process
def evaluate(args: MyGrammar): MyGrammarEvaluated = ...
// These expressions produce a valid result, as the argument is valid according
// to my grammar
val result1 = evaluate(SOME-KEYWORD CHOICE-1 ARG-1 ARG-1)
val result2 = evaluate(SOME-KEYWORD CHOICE-2 ARG-2)
val result3 = evaluate(SOME-KEYWORD OPTION CHOICE-1 ARG-1 ARG-1)
val result4 = evaluate(SOME-KEYWORD OPTION CHOICE-2 ARG-2)
// These expressions produce a compilation error, as the argument is invalid
// according to my grammar
val result5 = evaluate(SOME-KEYWORD CHOICE-1)
val result6 = evaluate(SOME-KEYWORD CHOICE-2 ARG-2 ARG-2)
Note: I'm not interested in solutions that parse strings, the way sqltyped does.
It's been some time since this question was answered by paradigmatic, but I've just stumbled upon it and thought it's worth extending.
An internalized DSL must indeed be valid Scala code with all the names defined before macro expansion; however, one can overcome this restriction with a carefully designed syntax and Dynamics.
Let's say we wanted to create a simple, silly DSL allowing us to introduce people in a classy way. It might look like this:
people {
  introduce John please
  introduce Frank and Lilly please
}
We would like to translate (as part of compilation) the above code to an object (of a class derived for example from class People) containing definitions of fields of type Person for every introduced person - something like this:
new People {
  val john: Person = new Person("John")
  val frank: Person = new Person("Frank")
  val lilly: Person = new Person("Lilly")
}
To make it possible we need to define some artificial objects and classes having two purposes: defining grammar (somewhat...) and tricking the compiler into accepting undefined names (like John or Lilly).
import scala.language.dynamics

trait AllowedAfterName

object and extends Dynamic with AllowedAfterName {
  def applyDynamic(personName: String)(arg: AllowedAfterName): AllowedAfterName = this
}

object please extends AllowedAfterName

object introduce extends Dynamic {
  def applyDynamic(personName: String)(arg: AllowedAfterName): and.type = and
}
These dummy definitions make our DSL code legal - the compiler translates it to the below code before proceeding to macro expansion:
people {
  introduce.applyDynamic("John")(please)
  introduce.applyDynamic("Frank")(and).applyDynamic("Lilly")(please)
}
Do we need this ugly and seemingly redundant please? One could probably come up with a nicer syntax, for example using Scala's postfix operator notation (language.postfixOps), but that gets tricky due to semicolon inference (you can try it yourself in the REPL console or IntelliJ's Scala Worksheet). It's easiest to just interlace keywords with undefined names.
As we've made the syntax legal, we can process the block with a macro:
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

def people[A](block: A): People = macro Macros.impl[A]

class Macros(val c: whitebox.Context) {
  import c.universe._

  def impl[A](block: c.Tree) = {
    val introductions = block.children

    // walk an expanded chain such as
    //   introduce.applyDynamic("Frank")(and).applyDynamic("Lilly")(please)
    // and collect the string literals passed to applyDynamic
    def getNames(t: c.Tree): List[String] = t match {
      case q"$prefix.applyDynamic(${name: String})($arg)" =>
        getNames(prefix) :+ name
      case _ =>
        Nil
    }

    val names = introductions flatMap getNames

    val defs = names map { n =>
      val varName = TermName(n.toLowerCase())
      q"val $varName: Person = new Person($n)"
    }

    c.Expr[People](q"new People { ..$defs }")
  }
}
The macro finds all the introduced names by pattern matching against the expanded dynamic calls and generates the desired output code. Notice that the macro must be whitebox in order to be allowed to return an expression of a type derived from the one declared in the signature.
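For completeness, the snippets above assume some plain supporting definitions along these lines (they are not spelled out here):

class Person(val name: String)

// the macro returns an anonymous subclass of People with one val per introduced person
class People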
I don't think so. The expression you pass to a macro must be a valid Scala expression, and its identifiers must already be defined.
I'm writing a data structure that converts the results of a database query. The raw structure is a Java ResultSet, and it would be converted to a map or class which permits accessing different fields of that data structure by either a named method call or by passing a string into apply(). Clearly, different values may have different types. In order to reduce the burden on the clients of this data structure, my preference is that one not need to cast the values, but that the value fetched still has the correct type.
For example, suppose I'm doing a query that fetches two column values, one an Int, the other a String. The names of the resulting columns are "a" and "b", respectively. Some ideal syntax might be the following:
val javaResultSet = dbQuery("select a, b from table limit 1")
// with ResultSet, particular values can be accessed like this:
val a = javaResultSet.getInt("a")
val b = javaResultSet.getString("b")
// but this syntax is undesirable.
// since I want to convert this to a single data structure,
// the preferred syntax might look something like this:
val newStructure = toDataStructure[Int, String](javaResultSet)("a", "b")
// that is, I'm willing to state the types during the instantiation
// of such a data structure.
// then,
val a: Int = newStructure("a") // OR
val a: Int = newStructure.a
// in both cases, "val a" does not require asInstanceOf[Int].
I've been trying to determine what sort of data structure might allow this and I could not figure out a way around the casting.
The other requirement is obviously that I would like to define a single data structure used for all db queries. I realize I could easily define a case class or similar per call and that solves the typing issue, but such a solution does not scale well when many db queries are being written. I suspect some people are going to propose using some sort of ORM, but let us assume for my case that it is preferred to maintain the query in the form of a string.
Anyone have any suggestions? Thanks!
To do this without casting, one needs more information about the query, and one needs that information at compile time.
I suspect some people are going to propose using some sort of ORM, but let us assume for my case that it is preferred to maintain the query in the form of a string.
Your suspicion is right, and you will not get around this. If current ORMs or DSLs like squeryl don't suit your fancy, you can create your own. But I doubt you will be able to use query strings.
The basic problem is that you don't know how many columns there will be in any given query, so you don't know how many type parameters the data structure should have, and it's not possible to abstract over the number of type parameters.
There is, however, a data structure that exists in different variants for different numbers of type parameters: the tuple (e.g. Tuple2, Tuple3, etc.). You could define parameterized mapping functions for different numbers of parameters that return tuples, like this:
import java.sql.ResultSet

def toDataStructure2[T1, T2](rs: ResultSet)(c1: String, c2: String): (T1, T2) =
  (rs.getObject(c1).asInstanceOf[T1],
   rs.getObject(c2).asInstanceOf[T2])

def toDataStructure3[T1, T2, T3](rs: ResultSet)(c1: String, c2: String, c3: String): (T1, T2, T3) =
  (rs.getObject(c1).asInstanceOf[T1],
   rs.getObject(c2).asInstanceOf[T2],
   rs.getObject(c3).asInstanceOf[T3])
You would have to define these for as many columns as you expect to have in your tables (max 22).
This of course depends on the assumption that calling getObject and casting it to the given type is safe.
In your example you could use the resulting tuple as follows:
val (a, b) = toDataStructure2[Int, String](javaResultSet)("a", "b")
If you decide to go the route of heterogeneous collections, there are some very interesting posts on heterogeneously typed lists.
One, for instance, is
http://jnordenberg.blogspot.com/2008/08/hlist-in-scala.html
http://jnordenberg.blogspot.com/2008/09/hlist-in-scala-revisited-or-scala.html
with an implementation at
http://www.assembla.com/wiki/show/metascala
A second great series of posts starts with
http://apocalisp.wordpress.com/2010/07/06/type-level-programming-in-scala-part-6a-heterogeneous-list%C2%A0basics/
The series continues with parts b, c, and d, linked from part a.
Finally, there is a talk by Daniel Spiewak which touches on HOMaps:
http://vimeo.com/13518456
So all this to say that perhaps you can build your solution from these ideas. Sorry that I don't have a specific example, but I admit I haven't tried these out yet myself!
Joshua Bloch has introduced a heterogeneous collection, which can be written in Java. I once adapted it a little; it now works as a value register. It is basically a wrapper around two maps. Here is the code, and this is how you can use it. But this is just FYI, since you are interested in a Scala solution.
In Scala I would start by playing with tuples. Tuples are kind of heterogeneous collections. The elements can be, but do not have to be, accessed through fields like _1, _2, _3, and so on. But you don't want that; you want names. This is how you can assign names to them:
scala> val tuple = (1, "word")
tuple: ([Int], [String]) = (1, word)
scala> val (a, b) = tuple
a: Int = 1
b: String = word
So, as mentioned before, I would try to build a ResultSetWrapper around tuples.
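A minimal sketch of what such a wrapper could look like (the name ResultSetWrapper2 and the two-column shape are illustrative, not an existing library):

import java.sql.ResultSet

// wraps a ResultSet row and exposes the chosen columns as a typed pair
class ResultSetWrapper2[T1, T2](rs: ResultSet, c1: String, c2: String) {
  def toTuple: (T1, T2) =
    (rs.getObject(c1).asInstanceOf[T1], rs.getObject(c2).asInstanceOf[T2])
}

// usage, with the column names and types from the question:
val (a, b) = new ResultSetWrapper2[Int, String](javaResultSet, "a", "b").toTuple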
If you want "extract the column value by name" on a plain bean instance, you can probably:
use reflects and CASTs, which you(and me) don't like.
use a ResultSetToJavaBeanMapper provided by most ORM libraries, which is a little heavy and coupled.
write a scala compiler plugin, which is too complex to control.
so, I guess a lightweight ORM with following features may satisfy you:
support raw SQL
support a lightweight,declarative and adaptive ResultSetToJavaBeanMapper
nothing else.
I made an experimental project around that idea. Note that it's still an ORM, but I think it may be useful to you, or at least give you some hints.
Usage:
Declare the model:
// declare the DB schema
trait UserDef extends TableDef {
  var name = property[String]("name", title = Some("姓名"))
  var age1 = property[Int]("age", primary = true)
}

// declare the model; it mixes in properties such as { var name = "" }
@BeanInfo class User extends Model with UserDef

// declare an object.
// it mixes in properties such as { var name = Property[String]("name") },
// and object User is a Mapper[User], so it can translate a ResultSet to a User instance.
object `package` {
  @BeanInfo implicit object User extends Table[User]("users") with UserDef
}
Then call raw SQL; the implicit Mapper[User] works for you:
val users = SQL("select name, age from users").all[User]
users.foreach{user => println(user.name)}
Or even build a type-safe query:
val users = User.q.where(User.age > 20).where(User.name like "%liu%").all[User]
For more, see the unit test:
https://github.com/liusong1111/soupy-orm/blob/master/src/test/scala/mapper/SoupyMapperSpec.scala
Project home:
https://github.com/liusong1111/soupy-orm
It uses "abstract Type" and "implicit" heavily to make the magic happen, and you can check source code of TableDef, Table, Model for detail.
Several million years ago I wrote an example showing how to use Scala's type system to push and pull values from a ResultSet. Check it out; it matches up with what you want to do fairly closely.
implicit val conn = connect("jdbc:h2:f2", "sa", "");
implicit val s: Statement = conn << setup;

val insertPerson = conn prepareStatement "insert into person(type, name) values(?, ?)";
for (name <- names)
  insertPerson << rnd.nextInt(10) << name <<!;

for (person <- query("select * from person", rs => Person(rs, rs, rs)))
  println(person.toXML);

for (person <- "select * from person" <<! (rs => Person(rs, rs, rs)))
  println(person.toXML);
Primitive types are used to guide the Scala compiler into selecting the right functions on the ResultSet.
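The idea can be sketched roughly like this (hypothetical names, not the original code): a small wrapper keeps a column cursor, and implicit conversions from the wrapper to Int and String read successive columns, so the expected type of each constructor parameter selects the right getter.

import java.sql.ResultSet
import scala.language.implicitConversions

// keeps track of the next column to read from the current row
class RichRs(val rs: ResultSet) {
  private var col = 0
  def nextCol(): Int = { col += 1; col }
}

// the expected parameter type picks one of these conversions
implicit def rsToInt(r: RichRs): Int = r.rs.getInt(r.nextCol())
implicit def rsToString(r: RichRs): String = r.rs.getString(r.nextCol())

case class Person(personType: Int, name: String)

// with r: RichRs positioned on a row, Person(r, r) expands to
// Person(rsToInt(r), rsToString(r))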