How to implement a generic REST api for tables in Play2 with squeryl and spray-json - scala

I'm trying to implement a controller in Play2 which exposes a simple REST-style API for my db tables. I'm using squeryl for database access and spray-json for converting objects to/from JSON.
My idea is to have a single generic controller to do all the work, so I've set up the following routes in conf/routes:
GET /:tableName controllers.Crud.getAll(tableName)
GET /:tableName/:primaryKey controllers.Crud.getSingle(tableName, primaryKey)
.. and the following controller:
object Crud extends Controller {
def getAll(tableName: String) = Action {..}
def getSingle(tableName: String, primaryKey: Long) = Action {..}
}
(Yes, missing create/update/delete, but let's get read to work first)
I've mapped tables to case classes by extending squeryl's Schema:
object MyDB extends Schema {
val accountsTable = table[Account]("accounts")
val customersTable = table[Customer]("customers")
}
And I've told spray-json about my case classes so it knows how to convert them.
object MyJsonProtocol extends DefaultJsonProtocol {
implicit val accountFormat = jsonFormat8(Account)
implicit val customerFormat = jsonFormat4(Customer)
}
So far so good, it actually works pretty well as long as I'm using the table instances directly. The problem surfaces when I'm trying to generify the code so that I end up with exactly one controller for accessing all tables: I'm stuck with a piece of code that doesn't compile, and I'm not sure what the next step is.
It seems to be a type issue with spray-json which occurs when I'm trying to convert the list of objects to json in my getAll function.
Here is my generic attempt:
def getAll(tableName: String) = Action {
val json = inTransaction {
// lookup table based on url
val table = MyDB.tables.find( t => t.name == tableName).get
// execute select all and convert to json
from(table)(t =>
select(t)
).toList.toJson // causes compile error
}
// convert json to string and set correct content type
Ok(json.compactPrint).as(JSON)
}
Compile error:
[error] /Users/code/api/app/controllers/Crud.scala:29:
Cannot find JsonWriter or JsonFormat type class for List[_$2]
[error] ).toList.toJson
[error] ^
[error] one error found
I'm guessing the problem could be that the JSON library needs to know at compile time which model type I'm throwing at it, but I'm not sure (notice the List[_$2] in that compile error). I have tried the following changes to the code, which compile and return results:
Remove the generic table lookup (MyDB.tables.find(.....).get) and instead use a specific table instance, e.g. MyDB.accountsTable. This proves that the JSON serialization works. However, it is not generic and would require a separate controller and route config for every table in the db.
Convert the list of objects from the db query to a string before calling toJson, i.e. toList.toJson --> toList.toString.toJson. This proves that the generic lookup of tables works, but it does not produce a proper JSON response, since it is just a string-serialized list of objects.
Thoughts anyone?

Your guess is correct. MyDB.tables is a Seq[Table[_]]; in other words, it could hold any type of table. There is no way for the compiler to figure out the type of the table you locate using the find method, and that type is needed for the JSON conversion. There are ways to get around that, but you'd need some kind of access to the model class.
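One way to get that access (a sketch only; the TableWithFormat wrapper and tableRegistry below are names I'm inventing, and the code assumes it sits inside the existing Crud controller) is to register each table together with its spray-json format, so that looking a table up by name also hands you the format the compiler needs:
import org.squeryl.Table
import org.squeryl.PrimitiveTypeMode._
import spray.json._
import MyJsonProtocol._

// Couples a squeryl table with the spray-json format for its row type,
// so looking the pair up by name also recovers the type information.
case class TableWithFormat[T](table: Table[T], format: JsonFormat[T]) {
  def allAsJson: JsValue = {
    implicit val f: JsonFormat[T] = format
    inTransaction {
      from(table)(t => select(t)).toList.toJson
    }
  }
}

// Hypothetical registry keyed by the table name used in the URL.
val tableRegistry: Map[String, TableWithFormat[_]] = Map(
  "accounts"  -> TableWithFormat(MyDB.accountsTable, accountFormat),
  "customers" -> TableWithFormat(MyDB.customersTable, customerFormat)
)

def getAll(tableName: String) = Action {
  tableRegistry.get(tableName) match {
    case Some(t) => Ok(t.allAsJson.compactPrint).as(JSON)
    case None    => NotFound
  }
}
The trade-off is one registry entry per table, but the route config and the controller stay completely generic.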

Related

Scala function return type based on generic

Using Scala generics I'm trying to abstract some common functions in my Play application. The functions return Seqs with objects deserialized from a REST JSON service.
def getPeople(cityName: String): Future[Seq[People]] = {
getByEndpoint[People](s"http://localhost/person/$cityName")
}
def getDogs(): Future[Seq[Dog]] = {
getByEndpoint[Dog]("http://localhost/doge")
}
The fetch and deserialization logic is packed into a single function using generics.
private def getByEndpoint[T](endpoint: String): Future[Seq[T]] = {
ws.url(endpoint)
.get()
.map(rsp => rsp.json)
.flatMap { json =>
json.validate[Seq[T]] match {
case s: JsSuccess[Seq[T]] =>
Future.successful(s.get)
case e: JsError =>
Future.failed(new RuntimeException(s"Get by endpoint JSON match failed: $e"))
}
}
}
The problem is I'm getting "No Json deserializer found for type Seq[T]. Try to implement an implicit Reads or Format for this type.". I'm sure I'm not using T properly in Seq[T] (going by my C#/Java memories at least), but I can't find any clue how to do it the proper way in Scala. Everything works as expected without using generics.
Play JSON uses type classes to capture information about which types can be (de-)serialized to and from JSON, and how. If you have an implicit value of type Format[Foo] in scope, that's referred to as an instance of the Format type class for Foo.
The advantage of this approach is that it gives us a way to constrain generic types (and have those constraints checked at compile time) that doesn't depend on subtyping. For example, there's no way the standard library's String will ever extend some kind of Jsonable trait that Play (or any other library) might provide, so we need some way of saying "we know how to encode Strings as JSON" that doesn't involve making String a subtype of some trait we've defined ourselves.
In Play JSON you can do this by defining implicit Format instances, and Play itself provides many of these for you (e.g., if you've got one for T, it'll give you one for Seq[T]). The validate method on JsValue requires one of these instances (actually Reads, a supertype of Format, but that's not terribly relevant here) for its type parameter, Seq[T] in this case, and it won't compile unless the compiler can find that instance.
You can provide this instance by adding the constraint to your own generic method:
private def getByEndpoint[T: Format](endpoint: String): Future[Seq[T]] = {
...
}
Now with the T: Format syntax you've specified that there has to be a Format instance for T (even though you don't constrain T in any other way), so the compiler knows how to provide the Format instance for Seq[T] that the json.validate[Seq[T]] call requires.
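Putting it together, here is a minimal end-to-end sketch; the Person case class and the PeopleClient wrapper are invented for illustration, and it assumes Play WS plus an implicit ExecutionContext:
import play.api.libs.json._
import play.api.libs.ws.WSClient
import scala.concurrent.{ExecutionContext, Future}

case class Person(name: String, age: Int)

object Person {
  // The Format instance the T: Format context bound will pick up.
  implicit val personFormat: Format[Person] = Json.format[Person]
}

class PeopleClient(ws: WSClient)(implicit ec: ExecutionContext) {

  private def getByEndpoint[T: Format](endpoint: String): Future[Seq[T]] =
    ws.url(endpoint).get().map(_.json).flatMap { json =>
      json.validate[Seq[T]] match {
        case JsSuccess(value, _) => Future.successful(value)
        case e: JsError          => Future.failed(new RuntimeException(s"Get by endpoint JSON match failed: $e"))
      }
    }

  def getPeople(cityName: String): Future[Seq[Person]] =
    getByEndpoint[Person](s"http://localhost/person/$cityName")
}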

Enum in Plain SQL when using Slick 3.1

I'm using Slick 3.1.0 and Slick-pg 0.10.0. I have an enum as:
object UserProviders extends Enumeration {
type Provider = Value
val Google, Facebook = Value
}
Following the test case, it works fine with the column mapper; I simply added the following implicit mapper to my customized driver.
implicit val userProviderMapper = createEnumJdbcType("UserProvider", UserProviders, quoteName = true)
However, when using plain SQL, I encountered the following compilation error:
could not find implicit value for parameter e: slick.jdbc.SetParameter[Option[models.UserProviders.Provider]]
I could not find any documentation about this. How can I write plain SQL with enums in Slick? Thanks.
You need to have an implicit of type SetParameter[T] in scope which tells slick how to set parameters from some custom type T that it doesn't already know about. For example:
implicit val setInstant: SetParameter[Instant] = SetParameter { (instant, pp) =>
pp.setTimestamp(new Timestamp(instant.toEpochMilli))
}
The type of pp is PositionedParameters.
You might also come across the need to tell slick how to extract a query result into some custom type T that it doesn't already know about. For this, you need an implicit GetResult[T] in scope. For example:
implicit def getInstant(implicit get: GetResult[Long]): GetResult[Instant] =
get andThen (Instant.ofEpochMilli(_))
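Applied to the enum from the question, a sketch could look like this; it assumes the parameter can be bound as the enum's string name (with a native Postgres enum type created through slick-pg you may additionally need a cast like ::"UserProvider" inside the SQL itself):
import slick.jdbc.{GetResult, SetParameter}

// Bind a Provider (or an optional Provider) as a plain string parameter.
implicit val setProvider: SetParameter[UserProviders.Provider] =
  SetParameter { (provider, pp) => pp.setString(provider.toString) }

implicit val setProviderOption: SetParameter[Option[UserProviders.Provider]] =
  SetParameter { (provider, pp) => pp.setStringOption(provider.map(_.toString)) }

// Read a Provider back out of a plain SQL result row.
implicit val getProvider: GetResult[UserProviders.Provider] =
  GetResult(r => UserProviders.withName(r.nextString()))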

How to dump all tables using Squeryl

I'm trying to write a quick data browser for a database using Squeryl, but I have difficulty iterating over all the tables in a generic way. Based on the Squeryl SchoolDb example I tried the following:
def browseTable(name: String) = {
SchoolDb.tables.find(_.name == name) map { t=>
val fields = t.posoMetaData.fieldsMetaData
val rows = from (t) (s => select(s))
// Print the columns
println(fields.map(_.columnName).mkString("\t"))
rows map { row =>
println(fields.map(f => f.get(row)).mkString("\t"))
}
}
}
The compiler is not very happy with this attempt (missing type for 'row') and I can sort of understand its dilemma. Explicitly declaring the parameter as Any just changes the compilation error to "No implicit view available from Any => org.squeryl.dsl.ast.TypedExpressionNode[_]" on 'f.get(row)'.
How can I either fix this issue or change the models (maybe by adding a trait of some sort) to enable generic access to all data in all tables?
The compiler complains because the f.get method expects an AnyRef parameter. AFAIK, in Scala everything can be safely cast to AnyRef - the compiler will force the necessary boxing if needed. So I think this should work: f.get(row.asInstanceOf[AnyRef])
EDIT: I just tested this and it works.
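For reference, the inner loop from the snippet above with both adjustments applied (the parameter typed as Any as the question tried, plus the cast from the answer):
rows map { (row: Any) =>
  // f.get expects an AnyRef, so box the row explicitly
  println(fields.map(f => f.get(row.asInstanceOf[AnyRef])).mkString("\t"))
}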

Designing serialization library in Scala with type classes

I have a system where I need to serialize different kinds of objects to JSON and XML. Some of them are Lift MetaRecords, some are case classes. I wanted to use type classes and create something like:
trait Serializable[T] {
def serialize[T](obj: T): T
}
And the usual implementations for JSON and XML, open for extension.
The problem I'm facing now is serialization itself. Currently there are different contexts in which objects are serialized. Imagine a news feed system. There are three objects: User, Post (feed element) and Photo. Those objects have some properties and can reference each other. Now in some cases I want to serialize an object alone (user settings, preferences, etc.), and in other cases I need related objects to be serialized as well, e.g. a feed: List[Post] + related photos. In order to do that I need to provide the referenced objects.
My current implementation is bloated with functions taking optional parameters.
def feedAsJson(post: MPost, grp: Option[PrivateGroup], commentsBox: Option[List[MPostComment]] = Empty): JObject
I thought about implementing some kind of context solution: overload feedAsJson with an implicit context parameter that provides the necessary data. I don't know how I'd like to implement it yet, as it touches the database, maybe with the cake pattern. Any suggestions are very appreciated.
Can't you put the implicits in scope that will create the right kind of serializers that you need? Something to that effect:
def doNothingSerializer[T]: Serializable[T] = ???
implicit def mpostToJson(implicit pgs: Serializable[PrivateGroup],
cmts: Serializable[List[MPostComment]]) =
new Serializable[MPost] {
def serialize(mpost: MPost): JObject = {
val privateGroupJSon = pgs.serialize(mpost.privateGroup)
// make the mpost json with privateGroupJSon which would be empty
???
}
}
// later where you need to serialize without the inner content:
implicit val privateGroupToJson = doNothingSerializer[PrivateGroup]
implicit val mpostCommentsToJson = doNothingSerializer[List[MPostComment]]
implicitly[Serializable[MPost]].serialize(mpost)
You would need to define default serializable instances in a trait that is then inherited (so that low priority implicits are in scope).
Note that I'm assuming that the trait for Serializable is:
trait Serializable[T] {
def serialize(t: T): JObject
}
(no [T] method type argument and returns a JObject)
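A rough sketch of that arrangement; PrivateGroup and MPostComment are stand-ins for the real model classes, and the do-nothing default just emits an empty JObject:
import net.liftweb.json.JsonAST.JObject

// Stand-ins for the asker's model classes, just to make the sketch compile.
case class PrivateGroup(name: String)
case class MPostComment(text: String)

trait Serializable[T] {
  def serialize(t: T): JObject
}

// Implicits inherited from a parent trait have lower priority than implicits
// defined directly on the inheriting object, so richer instances added to
// Serializers below will win over these do-nothing defaults.
trait LowPrioritySerializers {
  def doNothingSerializer[T]: Serializable[T] = new Serializable[T] {
    def serialize(t: T): JObject = JObject(Nil)
  }
  implicit val privateGroupDefault: Serializable[PrivateGroup] = doNothingSerializer
  implicit val mPostCommentsDefault: Serializable[List[MPostComment]] = doNothingSerializer
}

object Serializers extends LowPrioritySerializers {
  // e.g. implicit val mpostToJson: Serializable[MPost] = ... (full serializer here)
}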
Maybe "Scala Pickling" might help you:
http://lampwww.epfl.ch/~hmiller/pickling
I just watched the presentation.

Generic Form processing in Play! Framework: is this -crazy- even feasible?

I'm trying to create a generic database insertion method in Scala using the Slick and Play! frameworks. This involves passing in a generic Form instance and the model object that it's associated to. There are two problems I'm running into that are driving me nuts at the moment:
How do I instantiate a generic type?
How do I dynamically generate the parameters of that generic type from generic form binding?
Code so far
/**
* Template method for accessing a database that abstracts out
* Database.forDataSource(DB.getDataSource()) withSession {}
* and actual form vals; will allow for anonymous Form declarations.
*/
def dbAccessThatUsesAForm[I <: AnyRef, T <: Table[I]](
f: Form[Product], // a Form of some generic tuple or mapping
t: T, // a Slick table to insert I objects into
i: Class[I] // the actual class that I belongs to (maybe not needed)
)(
request: SecuredRequest[AnyContent] // this is a SecureSocial auth. thing
): Boolean = {
f.bindFromRequest((request.request.body.asFormUrlEncoded).getOrElse(Map())).fold(
errors => {
logger.error(t.toString + "was not bound to " + t.toString + "'s Form correctly")
false
},
success => {
t.insert(new I(someParamsThatIHaveNoIdeaWhereToStart)) // DOES NOT WORK
true
}
)
}
On the first problem:
type not found: I
At this point I think I'm deeply misunderstanding something about Scala generics and am considering using dependency injection as a solution. Maybe pass in a function that binds a class to this method, and call that within my method? But I already have an Injector defined in Global.scala that follows this post... Dependency injection there is based on a module... but the injection here isn't based on whether I'm in production or in testing...
On the second problem:
Play Forms can take tuples for their field mappings, so I tried looking up how to describe generic tuple types. Therefore I surmised that I'd be passing in a Form[Product] (API for Product and Form) instead of Form[_], and doing something like:
(the thing in the for loop won't work because productArity isn't actually part of mapping)
for (i <- 1 to f.mapping.productArity) { // pretty sure this won't work.
// generate the parameters and store them in some val by calling something like
storedParams += success.productElement(i)
}
As you can see, I'm quite lost as to how to get the number of fields in a Form (since a Form is actually composed of a Seq[Mapping]), and how in the world I'd store dynamically generated parameters.
Do constructors receive parameters as a tuple? Is there a sort of Parameter object I could pass in to a generic class instantiator?