Slick 2.0: How to convert lifted query results to a case class? - scala

In order to implement a RESTful API stack, I need to convert data extracted from a DB to JSON. I think the best way is to extract the data from the DB and then convert the row set to JSON using Json.toJson(), passing a case class as the argument after having defined an implicit serializer (Writes).
Here's my case class and companion object:
package deals.db.interf.slick2
import scala.slick.driver.MySQLDriver.simple._
import play.api.libs.json.Json
case class PartnerInfo(
id: Int,
name: String,
site: String,
largeLogo: String,
smallLogo: String,
publicationSite: String
)
object PartnerInfo {
def toCaseClass( ?? ) = { // what type are the arguments to be passed?
PartnerInfo( fx(??) ) // how to transform the input types (slick) to Scala types?
}
// Notice I'm using slick 2.0.0 RC1
class PartnerInfoTable(tag: Tag) extends Table[(Int, String, String, String, String, String)](tag, "PARTNER"){
def id = column[Int]("id")
def name = column[String]("name")
def site = column[String]("site")
def largeLogo = column[String]("large_logo")
def smallLogo = column[String]("small_logo")
def publicationSite = column[String]("publication_site")
def * = (id, name, site, largeLogo, smallLogo, publicationSite)
}
val partnerInfos = TableQuery[PartnerInfoTable]
def qPartnerInfosForPuglisher(publicationSite: String) = {
for (
pi <- partnerInfos if ( pi.publicationSite == publicationSite )
) yield toCaseClass( _ ) // Pass all the table columns to toCaseClass()
}
implicit val partnerInfoWrites = Json.writes[PartnerInfo]
}
What I cannot figure out is how to implement the toCaseClass() method so that it transforms the Slick 2 column types into Scala types - note that the function fx() in the body of toCaseClass() is only there for emphasis.
I'm wondering whether it is possible to get the Scala type from a Slick column type, since it is clearly given in the table definition, but I cannot find how to do it.
Any idea?

I believe the simplest method here would be to map PartnerInfo in the table schema:
class PartnerInfoTable(tag: Tag) extends Table[PartnerInfo](tag, "PARTNER"){
def id = column[Int]("id")
def name = column[String]("name")
def site = column[String]("site")
def largeLogo = column[String]("large_logo")
def smallLogo = column[String]("small_logo")
def publicationSite = column[String]("publication_site")
def * = (id, name, site, largeLogo, smallLogo, publicationSite) <> (PartnerInfo.tupled, PartnerInfo.unapply)
}
val partnerInfos = TableQuery[PartnerInfoTable]
def qPartnerInfosForPuglisher(publicationSite: String) = {
for (
pi <- partnerInfos if ( pi.publicationSite === publicationSite )
) yield pi
}
Otherwise PartnerInfo.tupled should do the trick:
def toCaseClass(pi:(Int, String, String, String, String, String)) = PartnerInfo.tupled(pi)
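With the mapped projection in place the query yields PartnerInfo values directly, so producing JSON is just a matter of running the query and calling Json.toJson. A minimal sketch, assuming a configured Slick 2 Database named db (not part of the original code):

db.withSession { implicit session =>
  // .list materializes the query as List[PartnerInfo]
  val partners = qPartnerInfosForPuglisher(publicationSite).list
  Json.toJson(partners) // picks up the implicit partnerInfoWrites
}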

Related

How to pick a field's type based off of another field's value when reading JSON using Play (Scala)?

New to Scala and Play and running into a problem where I have the following:
case class Media(
name: String,
id: Id,
type: String,
properties: PodcastProperties
)
object Media {
implicit val format: OFormat[Media] = Json.format[Media]
}
case class PodcastProperties(
x: Int,
y: DateTime,
z: String
)
object PodcastProperties {
implicit val format: OFormat[PodcastProperties] = Json.format[PodcastProperties]
}
Say I want to define Media to accept different media types. Let's say I have a JSON Media object whose type is "newspaper", and its properties should be parsed using NewspaperProperties.
case class NewspaperProperties(
Title: String,
Publisher: String
)
object NewspaperProperties {
implicit val format: OFormat[NewspaperProperties] = Json.format[NewspaperProperties]
}
How can I define Media, so it can parse the "type" field, and then read the "properties" field correctly using the right Json parser?
You need to define the media properties as a sealed family.
import play.api.libs.json._
import java.time.OffsetDateTime
sealed trait MediaProperties
case class NewspaperProperties(
title: String, // Do not use 'Title' .. initial cap is not valid
publisher: String // ... not 'Publisher'
) extends MediaProperties
object NewspaperProperties {
implicit val format: OFormat[NewspaperProperties] = Json.format[NewspaperProperties]
}
case class PodcastProperties(
x: Int,
y: OffsetDateTime,
z: String
) extends MediaProperties
object PodcastProperties {
implicit val format: OFormat[PodcastProperties] =
Json.format[PodcastProperties]
}
Then an OFormat can be materialized for MediaProperties.
implicit val mediaPropertiesFormat: OFormat[MediaProperties] = Json.format
This manages the discriminator in the JSON representation (by default a _type field; the naming can be configured).
val props1: MediaProperties = PodcastProperties(1, OffsetDateTime.now(), "z")
val obj1 = Json.toJson(props1)
// > JsValue = {"_type":"PodcastProperties","x":1,"y":"2020-11-23T22:53:35.301603+01:00","z":"z"}
obj1.validate[MediaProperties]
// > JsResult[MediaProperties] = JsSuccess(PodcastProperties(1,2020-11-23T23:02:24.752063+01:00,z),)
The implicit format for MediaProperties should probably be defined in the companion object MediaProperties.
Then the format for Media can be materialized automatically.
final class Id(val value: String) extends AnyVal
object Id {
implicit val format: Format[Id] = Json.valueFormat
}
case class Media(
name: String,
id: Id,
//type: String, -- Not needed for the JSON representation
properties: MediaProperties
)
object Media {
implicit val format: OFormat[Media] = Json.format[Media] // <--- HERE
}
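To see it end to end, here is a sketch of reading a Media value; the discriminator field (_type by default) picks the right properties parser. The JSON literal and field values are only illustrative:

val json = Json.parse("""{
  "name": "Morning News",
  "id": "m-1",
  "properties": {
    "_type": "NewspaperProperties",
    "title": "The Daily",
    "publisher": "ACME"
  }
}""")

json.validate[Media] match {
  case JsSuccess(media, _) => println(media.properties) // NewspaperProperties(The Daily,ACME)
  case JsError(errors)     => println(errors)
}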

DSL in scala using case classes

My use case has case classes something like
case class Address(name:String,pincode:String){
override def toString =name +"=" +pincode
}
case class Department(name:String){
override def toString =name
}
case class emp(address:Address,department:Department)
I want to create a DSL like the one below. Can anyone share links on how to create a DSL, and any suggestions for achieving the following?
emp.withAddress("abc","12222").withDepartment("HR")
Update:
The actual case class may have close to 20 fields. I want to avoid redundancy in the code.
I created a DSL using reflection so that we don't need to add every field to it.
Disclaimer: This DSL is extremely weakly typed and I did it just for fun. I don't really think this is a good approach in Scala.
scala> create an Employee where "homeAddress" is Address("a", "b") and "department" is Department("c") and that_s it
res0: Employee = Employee(a=b,null,c)
scala> create an Employee where "workAddress" is Address("w", "x") and "homeAddress" is Address("y", "z") and that_s it
res1: Employee = Employee(y=z,w=x,null)
scala> create a Customer where "address" is Address("a", "b") and "age" is 900 and that_s it
res0: Customer = Customer(a=b,900)
The last example is the equivalent of writing:
create.a(Customer).where("address").is(Address("a", "b")).and("age").is(900).and(that_s).it
A way of writing DSLs in Scala that avoids parentheses and dots is to follow this pattern:
object.method(parameter).method(parameter)...
Here is the source:
// DSL
object create {
def an(t: Employee.type) = new ModelDSL(Employee(null, null, null))
def a(t: Customer.type) = new ModelDSL(Customer(null, 0))
}
object that_s
class ModelDSL[T](model: T) {
def where(field: String): ValueDSL[ModelDSL2[T], Any] = new ValueDSL(value => {
val f = model.getClass.getDeclaredField(field)
f.setAccessible(true)
f.set(model, value)
new ModelDSL2[T](model)
})
def and(t: that_s.type) = new { def it = model }
}
class ModelDSL2[T](model: T) {
def and(field: String) = new ModelDSL(model).where(field)
def and(t: that_s.type) = new { def it = model }
}
class ValueDSL[T, V](callback: V => T) {
def is(value: V): T = callback(value)
}
// Models
case class Employee(homeAddress: Address, workAddress: Address, department: Department)
case class Customer(address: Address, age: Int)
case class Address(name: String, pincode: String) {
override def toString = name + "=" + pincode
}
case class Department(name: String) {
override def toString = name
}
I really don't think you need the builder pattern in Scala. Just give your case class reasonable defaults and use the copy method.
i.e.:
employee.copy(address = Address("abc","12222"),
department = Department("HR"))
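For the copy approach to start from a blank value the case class needs defaults; a possible sketch (the empty defaults are an assumption, not part of the question):

case class emp(
  address: Address = Address("", ""),
  department: Department = Department("")
)

val hr = emp().copy(
  address = Address("abc", "12222"),
  department = Department("HR")
)
// hr: emp(abc=12222,HR)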
You could also use an immutable builder:
case class EmployeeBuilder(address:Address = Address("", ""),department:Department = Department("")) {
def build = emp(address, department)
def withAddress(address: Address) = copy(address = address)
def withDepartment(department: Department) = copy(department = department)
}
object EmployeeBuilder {
def withAddress(address: Address) = EmployeeBuilder().copy(address = address)
def withDepartment(department: Department) = EmployeeBuilder().copy(department = department)
}
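Usage of the immutable builder then looks like this:

val e: emp = EmployeeBuilder
  .withAddress(Address("abc", "12222"))
  .withDepartment(Department("HR"))
  .build
// e: emp(abc=12222,HR)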
You could do
object emp {
def builder = new Builder(None, None)
case class Builder(address: Option[Address], department: Option[Department]) {
def withDepartment(name:String) = {
val dept = Department(name)
this.copy(department = Some(dept))
}
def withAddress(name:String, pincode:String) = {
val addr = Address(name, pincode)
this.copy(address = Some(addr))
}
def build = (address, department) match {
case (Some(a), Some(d)) => new emp(a, d)
case (None, _) => throw new IllegalStateException("Address not provided")
case _ => throw new IllegalStateException("Department not provided")
}
}
}
and use it as emp.builder.withAddress("abc","12222").withDepartment("HR").build.
You don't need optional fields, copy, or the builder pattern (exactly), if you are willing to have the build always take the arguments in a particular order:
case class emp(address:Address,department:Department, id: Long)
object emp {
def withAddress(name: String, pincode: String): WithDepartment =
new WithDepartment(Address(name, pincode))
final class WithDepartment(private val address: Address)
extends AnyVal {
def withDepartment(name: String): WithId =
new WithId(address, Department(name))
}
final class WithId(address: Address, department: Department) {
def withId(id: Long): emp = emp(address, department, id)
}
}
emp.withAddress("abc","12222").withDepartment("HR").withId(1)
The idea here is that each emp parameter gets its own class which provides a method to get you to the next class, until the final one gives you an emp object. It's like currying but at the type level. As you can see I've added an extra parameter just as an example of how to extend the pattern past the first two parameters.
The nice thing about this approach is that, even if you're part-way through the build, the type you have so far will guide you to the next step. So if you have a WithDepartment so far, you know that the next argument you need to supply is a department name.
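For illustration, the intermediate types make the next required step explicit (the comments show what the compiler enforces):

val partial = emp.withAddress("abc", "12222")   // partial: emp.WithDepartment
// partial.withId(1)                            // does not compile: WithDepartment has no withId
val complete = partial.withDepartment("HR").withId(1)
// complete: emp(abc=12222,HR,1)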
If you want to avoid modifying the origin classes you can use implicit class, e.g.
implicit class EmpExtensions(emp: emp) {
def withAddress(name: String, pincode: String) {
//code omitted
}
// code omitted
}
then import EmpExtensions wherever you need these methods
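A minimal sketch of such an extension, assuming the methods simply copy the wrapped emp (the bodies are not from the answer above):

object EmpSyntax {
  implicit class EmpExtensions(val e: emp) extends AnyVal {
    def withAddress(name: String, pincode: String): emp =
      e.copy(address = Address(name, pincode))
    def withDepartment(name: String): emp =
      e.copy(department = Department(name))
  }
}

import EmpSyntax.EmpExtensions
emp(Address("", ""), Department("")).withAddress("abc", "12222").withDepartment("HR")
// emp(abc=12222,HR)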

In Slick, how to filter based on a custom column properties?

I have a custom column in my Slick table as follows:
class PgACL(tag: Tag) extends Table[ACL](tag, Some(schemaName), "ACL") {
def id = column[UUID]("ID", O.PrimaryKey)
def resourceSpec = column[ResourceSpec]("RESOURCE_SPEC")
def * = (id, resourceSpec) <>(ACL.tupled, ACL.unapply)
}
and the custom class:
case class ResourceSpec (val resourceType: String, val resourceId: String)
And I map them like this:
implicit val ResourceSpecMapper = MappedColumnType.base[ResourceSpec, String](
resourceSpec => resourceSpec.resourceSpecStr,
str => ResourceSpec.fromString(str)
)
I am trying to write a query which filters based on a property of the custom column, but I don't know how I can access it. For example, I want to have:
TableQuery[PgACL].filter(x => x.resourceSpec.resourceType === "XYZ")
But x.resourceSpec returns Rep[ResourceSpec] and I don't know how to get its "resourceType" property.
Any help?
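As a hedged sketch (not from the thread): since the mapped column is stored as a single string, one option is to compare against a complete ResourceSpec, and another is to map the same physical column a second time as a plain String and filter on its serialized form (the "type:id" layout below is an assumption):

// Option 1: compare against a full ResourceSpec (needs ResourceSpecMapper in scope)
TableQuery[PgACL].filter(_.resourceSpec === ResourceSpec("XYZ", "some-id"))

// Option 2: expose the raw string column alongside the mapped one
class PgACLWithRaw(tag: Tag) extends Table[ACL](tag, Some(schemaName), "ACL") {
  def id = column[UUID]("ID", O.PrimaryKey)
  def resourceSpec = column[ResourceSpec]("RESOURCE_SPEC")
  def resourceSpecRaw = column[String]("RESOURCE_SPEC") // same column, unmapped
  def * = (id, resourceSpec) <> (ACL.tupled, ACL.unapply)
}
TableQuery[PgACLWithRaw].filter(_.resourceSpecRaw startsWith "XYZ:")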

Scala reduce on Case class with keys

Below are the 2 case classes that I have -
case class Pack(name: String, age: Int, dob: String)
object Pack{
def getPack(name:String, age: Int, dob:String): Pack = {
Pack(name,age,dob)
}
}
case class NewPack(name: String, pack: (List[Pack]))
object NewPack{
def getNewPackList(data: List[Pack]): List[NewPack] = {
val newData = for(x <- data )yield(x.name,List(x))
val newPackData = for(x <- newData)yield(NewPack(x._1,x._2))
newPackData
}
}
val someData = List(Pack.getPack("x",12,"day1"), Pack.getPack("y",23,"day2"),Pack.getPack("x",34,"day3") )
val somePackData = NewPack.getNewPackList(someData)
The values of someData and somePackData are like this -
someData: List[Pack] = List(Pack(x,12,day1), Pack(y,23,day2), Pack(x,34,day3))
somePackData: List[NewPack] = List(NewPack(x,List(Pack(x,12,day1))), NewPack(y,List(Pack(y,23,day2))), NewPack(x,List(Pack(x,34,day3))))
Now I want to get the final data in the format below. Could anyone suggest a better way of doing that?
finalData: List[NewPack] = List(NewPack(x,List(Pack(x,12,day1), Pack(x,34,day3))), NewPack(y,List(Pack(y,23,day2))))
Here is a small simplification.
case class Pack(name: String, age: Int, dob: String)
case class NewPack(name: String, pack: List[Pack])
object NewPack{
def getNewPackList(data: List[Pack]): List[NewPack] =
data.map{case pack#Pack(name, _, _) => NewPack(name, List(pack))}
}
Note that in general you don't need an additional factory method for a case class. If you would like to do something trickier later, you can always define a custom apply in the companion object.
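For instance (purely illustrative), a custom apply in the companion could normalize the input:

object Pack {
  // trims the name and defaults the dob
  def apply(name: String, age: Int): Pack = Pack(name.trim, age, dob = "unknown")
}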
Additionally, if you were intending to group the packs by name, you can easily define something like:
def newPacksGrouped(data: List[Pack]): List[NewPack] =
data.groupBy(_.name).map{case (name, packs) => NewPack(name, packs)}.toList
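Applied to the someData from the question (groupBy does not guarantee the order of the resulting list):

val finalData = newPacksGrouped(someData)
// List(NewPack(x,List(Pack(x,12,day1), Pack(x,34,day3))), NewPack(y,List(Pack(y,23,day2))))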

Scala Skinny ORM - Obligatory Relation

I want the belongsTo relationship to be obligatory.
I don't want to define typeWine as an optional value, but if I don't make it optional, I have to populate typeWine in the extract method and I don't know how to do that.
The Skinny ORM documentation doesn't describe how to do this, and I'm getting stuck.
package app.models.wine
import scalikejdbc._
import skinny.orm.SkinnyCRUDMapper
case class Wine (id: Option[Long], typeWine: Option[Type] = None, name: String)
object Wine extends SkinnyCRUDMapper[Wine] {
override def defaultAlias = createAlias("w")
override def extract (rs: WrappedResultSet, n: ResultName[Wine]): Wine = new Wine(
id = rs.get(n.id),
name = rs.get(n.name)
)
belongsTo[Type](Type, (w, t) => w.copy(typeWine = t)).byDefault
}
package app.models.wine
import scalikejdbc._
import skinny.orm.SkinnyCRUDMapper
case class Type (id: Option[Long], typeName: String)
object Type extends SkinnyCRUDMapper[Type] {
override def defaultAlias = createAlias("t")
override def columnNames = Seq("id", "type_name")
override def extract (rs: WrappedResultSet, n: ResultName[Type]): Type = new Type(
id = rs.get(n.id),
typeName = rs.get(n.typeName)
)
}
Defining a belongsTo/byDefault relationship for typeWine and extracting the typeWine value with its resultName should work for you.
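A sketch of what that could look like with typeWine no longer optional (the merge function and the resultName call are assumptions, not spelled out in the answer):

case class Wine(id: Option[Long], typeWine: Type, name: String)

object Wine extends SkinnyCRUDMapper[Wine] {
  override def defaultAlias = createAlias("w")

  // keep the association so the type table is joined by default
  belongsTo[Type](Type, (w, t) => t.map(tp => w.copy(typeWine = tp)).getOrElse(w)).byDefault

  override def extract(rs: WrappedResultSet, n: ResultName[Wine]): Wine = new Wine(
    id = rs.get(n.id),
    name = rs.get(n.name),
    // build the Type directly from the joined columns instead of leaving it optional
    typeWine = Type.extract(rs, Type.defaultAlias.resultName)
  )
}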