I am just trying things out in Scala and I wrote this code:
object Main:
  def main(args: Array[String]): Unit =
    val dInt: Data[Int] = IntData(1)
    val dString: Data[String] = StringData("hello")
    val Data(deconstructedInt) = dInt // unapply
    val Data(deconstructedString) = dString // unapply
    println(deconstructedInt)
    println(deconstructedString)

sealed trait Data[+T]:
  def get: T

case class IntData(get: Int) extends Data[Int]
case class StringData(get: String) extends Data[String]

object Data:
  def apply[T](input: T): Data[T] = input match {
    case i: Int => IntData(i) // compile error here
    case s: String => StringData(s) // compile error here
  }

  def unapply[T](d: Data[T]): Option[String] = d match {
    case IntData(get) => Some(s"int data => get = $get")
    case StringData(get) => Some(s"string data => get = $get")
  }
At the location commented in the code, I get this error:
Found: IntData
Required: Data[T]
case i: Int => IntData(i)
Why am I getting this error? Isn't IntData (or StringData) a subclass of Data?
IntData is a subtype of Data[Int]. So if T is not Int, IntData is not a subtype of Data[T]. Now, you might say, if input matches the first case, then clearly T is Int. But the compiler is not smart enough to realise this!
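If you only need it to compile and are willing to take responsibility for the correspondence between T and each constructor yourself, one (less type-safe) workaround is a cast; this is a sketch, not something the compiler verifies for you:

def apply[T](input: T): Data[T] = (input match {
  case i: Int    => IntData(i)
  case s: String => StringData(s)
}).asInstanceOf[Data[T]]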
You could try using Scala 3's new match types:
type DataOf[T] = T match {
  case Int => IntData
  case String => StringData
}

def apply[T](input: T): DataOf[T] = input match {
  case i: Int => IntData(i)
  case s: String => StringData(s)
}
Another alternative is union types:
def apply(input: Int | String): Data[Int | String] = input match {
  case i: Int => IntData(i)
  case s: String => StringData(s)
}
However, you will lose type information in doing this. Using the match types solution, apply(42) is inferred to have type IntData. Using the union types solution, it is inferred to have type Data[Int | String].
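To make the trade-off concrete, here is a rough sketch of the inferred types (assuming one of the two apply definitions above is in scope at a time):

// With the match-type apply:
val a = Data(42)      // inferred as IntData
val b = Data("hello") // inferred as StringData

// With the union-type apply:
val c = Data(42)      // inferred as Data[Int | String]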
This way the compiler connects the dots between the Ts and it works:
object Main:
  def main(args: Array[String]): Unit =
    val dInt: Data[Int] = IntData(1)
    val dString: Data[String] = StringData("hello")
    val Data(deconstructedInt) = dInt // unapply
    val Data(deconstructedString) = dString // unapply
    println(deconstructedInt)
    println(deconstructedString)

sealed trait Data[+T]:
  def get: T

case class IntData[T <: Int](get: T) extends Data[T]
case class StringData[T <: String](get: T) extends Data[T]

object Data:
  def apply[T](input: T): Data[T] = input match {
    case i: Int => IntData(i)
    case s: String => StringData(s)
  }

  def unapply[T](d: Data[T]): Option[String] = d match {
    case IntData(get) => Some(s"int data => get = $get")
    case StringData(get) => Some(s"string data => get = $get")
  }
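For completeness, a small usage sketch (not from the original code) showing that the bounded version preserves the element type:

val x: Data[Int] = Data(1)          // builds an IntData
val y: Data[String] = Data("hello") // builds a StringData
println(Data.unapply(x))            // Some(int data => get = 1)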
I have a Slick table with the columns:
def name: Rep[Option[String]] = ???
def email: Rep[String] = ???
def fraudScores: Rep[Int] = ???
There is also a typeclass to calculate a rate for different fields:
trait Rater[T] {
  def apply(rep: T): Result
}

object Rater {
  def getRater[T](t: T)(implicit rater: Rater[T]): Result = rater(t)

  implicit val int: Rater[Rep[Int]] = v => calculateRate(v, _)
  implicit val str: Rater[Rep[String]] = v => calculateRate(v, _)
  implicit val strOpt: Rater[Rep[Option[String]]] = v => calculateRate(v, _)
}
and a map:
val map: Map[String, Rep[_ >: Option[String] with String with Int]] = Map(
  "name" -> name,
  "email" -> email,
  "scores" -> fraudScores
)
What I'd like to do is get the correct instance based on dynamic input, like:
val fname = "scores" // value getting from http request
map.get(fname).fold(default)(f => {
  val rater = getRater(f)
  rater(someVal)
})
but I'm getting an error that there is no implicit for Rep[_ >: Option[String] with String with Int]. Is there some workaround for this?
I think your problem is that a Map is the wrong way to represent this.
Use a case class:
case class Foo(
  name: Rep[Option[String]],
  email: Rep[String],
  score: Rep[Int]
)
def applyRater[T : Rater](t: T) = implicitly[Rater[T]](t)
def rate(foo: Foo, what: String) = what match {
  case "name" => applyRater(foo.name)
  case "email" => applyRater(foo.email)
  case "score" => applyRater(foo.score)
  case _ => default
}
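A hypothetical call site, assuming the Rep columns from the question (name, email, fraudScores) and an implicit Rater instance for each column type are in scope:

val foo = Foo(name, email, fraudScores)
val result = rate(foo, "score") // statically resolves Rater[Rep[Int]] in the "score" branch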
I have about 24 case classes that I need to programmatically enhance by changing several common elements prior to serialization in a datastore that doesn't support joins. Since case classes don't have a trait defined for the copy(...) constructor, I have been attempting to use macros. As a base I've looked at this post documenting a macro, and I've come up with the following:
import java.util.UUID

import org.joda.time.DateTime

import scala.language.experimental.macros

trait RecordIdentification {
  val receiverId: Option[String]
  val transmitterId: Option[String]
  val patientId: Option[UUID]
  val streamType: Option[String]
  val sequenceNumber: Option[Long]
  val postId: Option[UUID]
  val postedDateTime: Option[DateTime]
}

object WithRecordIdentification {
  import scala.reflect.macros.Context

  def withId[T, I](entity: T, id: I): T = macro withIdImpl[T, I]

  def withIdImpl[T: c.WeakTypeTag, I: c.WeakTypeTag](c: Context)(
    entity: c.Expr[T], id: c.Expr[I]
  ): c.Expr[T] = {
    import c.universe._
    val tree = reify(entity.splice).tree
    val copy = entity.actualType.member(newTermName("copy"))
    val params = copy match {
      case s: MethodSymbol if (s.paramss.nonEmpty) => s.paramss.head
      case _ => c.abort(c.enclosingPosition, "No eligible copy method!")
    }
    c.Expr[T](Apply(
      Select(tree, copy),
      AssignOrNamedArg(Ident("postId"), reify(id.splice).tree) ::
        AssignOrNamedArg(Ident("patientId"), reify(id.splice).tree) ::
        AssignOrNamedArg(Ident("receiverId"), reify(id.splice).tree) ::
        AssignOrNamedArg(Ident("transmitterId"), reify(id.splice).tree) ::
        AssignOrNamedArg(Ident("sequenceNumber"), reify(id.splice).tree) :: Nil
    ))
  }
}
And I invoke it with something like:
class GenericAnonymizer[A <: RecordIdentification]() extends Schema {
  def anonymize(dataPost: A, header: DaoDataPostHeader): A = WithRecordIdentification.withId(dataPost, header)
}
But I get a compile error:
Error:(44, 71) type mismatch;
found : com.dexcom.rt.model.DaoDataPostHeader
required: Option[String]
val copied = WithRecordIdentification.withId(sampleGlucoseRecord, header)
Error:(44, 71) type mismatch;
found : com.dexcom.rt.model.DaoDataPostHeader
required: Option[java.util.UUID]
val copied = WithRecordIdentification.withId(sampleGlucoseRecord, header)
Error:(44, 71) type mismatch;
found : com.dexcom.rt.model.DaoDataPostHeader
required: Option[Long]
val copied = WithRecordIdentification.withId(sampleGlucoseRecord, header)
I'm not quite sure how to change the macro to support multiple parameters... any sage advice?
Assume you have a set of case classes like the following, which you wish to anonymize on certain attributes prior to serialization:
case class MyRecordA(var receiverId: String, var y: Int)
case class MyRecordB(var transmitterId: Int, var y: Int)
case class MyRecordC(var patientId: UUID, var y: Int)
case class MyRecordD(var streamType: String, var y: Int)
case class MyRecordE(var sequenceNumber: String, var streamType: String, var y: Int)
You can use the Scala reflection library to mutate an instance's attributes at runtime. Implement your custom anonymizing/enhancing logic in the implicit anonymize method; the Mutator uses it to selectively alter a given instance's fields as required by your implementation.
import java.util.UUID

import scala.reflect.runtime.{universe => ru}

implicit def anonymize(field: String /* field name */, value: Any /* use current field value if reqd */): Option[Any] = field match {
  case "receiverId" => Option(value.toString.hashCode)
  case "transmitterId" => Option(22)
  case "patientId" => Option(UUID.randomUUID())
  case _ => None
}

implicit class Mutator[T: ru.TypeTag](i: T)(implicit c: scala.reflect.ClassTag[T], anonymize: (String, Any) => Option[Any]) {
  def mask = {
    val m = ru.runtimeMirror(i.getClass.getClassLoader)
    ru.typeOf[T].members.filter(!_.isMethod).foreach(s => {
      val fVal = m.reflect(i).reflectField(s.asTerm)
      anonymize(s.name.decoded.trim, fVal.get).foreach(fVal.set)
    })
    i
  }
}
Now you can invoke masking on any instance as:
val maskedRecord = MyRecordC(UUID.randomUUID(), 2).mask
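A couple more hypothetical invocations; fields for which anonymize returns None are left untouched:

val maskedB = MyRecordB(7, 2).mask         // transmitterId is replaced with 22
val maskedD = MyRecordD("glucose", 3).mask // no rule for streamType, so the record is unchanged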
I have a table with the columns:
class Data(tag: Tag) extends Table[DataRow](tag, "data") {
  def id = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  def state = column[State]("state")
  def price = column[Int]("price")

  def * = (id.?, name, state, price) <> ((DataRow.apply _).tupled, DataRow.unapply)
}
I'd like to write a function that would select a single row, and update the columns where the supplied values are not null.
def update(id: Int, name: Option[String], state: Option[State], price: Option[Int])
E.g.
update(1, None, None, Some(5)) would update only the price of data row 1, leaving the name and state intact.
update(1, Some("foo"), None, Some(6)) would update the name and price, but leave the state intact.
I guess some smart mapping could be used, but I'm having a hard time expressing it; I'm not sure how it could spit out different-length tuples depending on which inputs are defined, since they are more or less "unrelated" classes.
def update(id: Int, name: Option[String], state: Option[State], price: Option[Int]) = {
  table.filter(_.id === id). ???? .update(name, state, price)
}
I solved it in the following way.
The implementation below only works for Product values (e.g. case class instances).
It executes the update statement, skipping fields whose value is None (for Option types) or null (for reference types).
package slick.extensions

import slick.ast._
import slick.dbio.{ Effect, NoStream }
import slick.driver.JdbcDriver
import slick.jdbc._
import slick.lifted._
import slick.relational.{ CompiledMapping, ProductResultConverter, ResultConverter, TypeMappingResultConverter }
import slick.util.{ ProductWrapper, SQLBuilder }

import scala.language.{ existentials, higherKinds, implicitConversions }

trait PatchActionExtensionMethodsSupport { driver: JdbcDriver =>

  trait PatchActionImplicits {
    implicit def queryPatchActionExtensionMethods[U <: Product, C[_]](
      q: Query[_, U, C]
    ): PatchActionExtensionMethodsImpl[U] =
      createPatchActionExtensionMethods(updateCompiler.run(q.toNode).tree, ())
  }

  ///////////////////////////////////////////////////////////////////////////////////////////////
  //////////////////////////////////////////////////////////// Patch Actions
  ///////////////////////////////////////////////////////////////////////////////////////////////

  type PatchActionExtensionMethods[T <: Product] = PatchActionExtensionMethodsImpl[T]

  def createPatchActionExtensionMethods[T <: Product](tree: Node, param: Any): PatchActionExtensionMethods[T] =
    new PatchActionExtensionMethodsImpl[T](tree, param)

  class PatchActionExtensionMethodsImpl[T <: Product](tree: Node, param: Any) {
    protected[this] val ResultSetMapping(_, CompiledStatement(_, sres: SQLBuilder.Result, _),
      CompiledMapping(_converter, _)) = tree
    protected[this] val converter = _converter.asInstanceOf[ResultConverter[JdbcResultConverterDomain, Product]]
    protected[this] val TypeMappingResultConverter(childConverter, toBase, toMapped) = converter
    protected[this] val ProductResultConverter(elementConverters @ _*) =
      childConverter.asInstanceOf[ResultConverter[JdbcResultConverterDomain, Product]]

    private[this] val updateQuerySplitRegExp = """(.*)(?<=set )((?:(?= where)|.)+)(.*)?""".r
    private[this] val updateQuerySetterRegExp = """[^\s]+\s*=\s*\?""".r

    /** An Action that updates the data selected by this query. */
    def patch(value: T): DriverAction[Int, NoStream, Effect.Write] = {
      val (seq, converters) = value.productIterator.zipWithIndex.toIndexedSeq
        .zip(elementConverters)
        .filter {
          case ((Some(_), _), _) => true
          case ((None, _), _) => false
          case ((null, _), _) => false
          case ((_, _), _) => true
        }
        .unzip

      val (products, indexes) = seq.unzip

      val newConverters = converters.zipWithIndex
        .map(c => (c._1, c._2 + 1))
        .map {
          case (c: BaseResultConverter[_], idx) => new BaseResultConverter(c.ti, c.name, idx)
          case (c: OptionResultConverter[_], idx) => new OptionResultConverter(c.ti, idx)
          case (c: DefaultingResultConverter[_], idx) => new DefaultingResultConverter(c.ti, c.default, idx)
          case (c: IsDefinedResultConverter[_], idx) => new IsDefinedResultConverter(c.ti, idx)
        }

      val productResultConverter =
        ProductResultConverter(newConverters: _*).asInstanceOf[ResultConverter[JdbcResultConverterDomain, Any]]
      val newConverter = TypeMappingResultConverter(productResultConverter, (p: Product) => p, (a: Any) => toMapped(a))

      val newValue: Product = new ProductWrapper(products)

      val newSql = sres.sql match {
        case updateQuerySplitRegExp(prefix, setter, suffix) =>
          val buffer = StringBuilder.newBuilder
          buffer.append(prefix)
          buffer.append(
            updateQuerySetterRegExp
              .findAllIn(setter)
              .zipWithIndex
              .filter(s => indexes.contains(s._2))
              .map(_._1)
              .mkString(", ")
          )
          buffer.append(suffix)
          buffer.toString()
      }

      new SimpleJdbcDriverAction[Int]("patch", Vector(newSql)) {
        def run(ctx: Backend#Context, sql: Vector[String]): Int =
          ctx.session.withPreparedStatement(sql.head) { st =>
            st.clearParameters
            newConverter.set(newValue, st)
            sres.setter(st, newConverter.width + 1, param)
            st.executeUpdate
          }
      }
    }
  }
}
Example
// Model
case class User(
  id: Option[Int] = None,
  name: Option[String] = None,
  username: Option[String] = None,
  password: Option[String] = None
)

// Table
class Users(tag: Tag) extends Table[User](tag, "users") {
  def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def username = column[String]("username")
  def password = column[String]("password")

  override def * = (id.?, name.?, username.?, password.?) <> (User.tupled, User.unapply)
}

// TableQuery
object Users extends TableQuery(new Users(_))

// CustomDriver
trait CustomDriver extends PostgresDriver with PatchActionExtensionMethodsSupport {
  override val api: API = new API {}

  trait API extends super.API with PatchActionImplicits
}

// Insert
Users += User(Some(1), Some("Test"), Some("test"), Some("1234"))

// User patch
Users.filter(_.id === 1).patch(User(name = Some("Change Name"), username = Some("")))
https://gist.github.com/bad79s/1edf9ea83ba08c46add03815059acfca
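One assumption worth spelling out (it is not shown above): for .patch to resolve on the query, a concrete driver object extending CustomDriver and an import of its api need to be in scope, roughly along these lines:

// Hypothetical wiring: the api import brings PatchActionImplicits (and thus .patch) into scope.
object CustomDriver extends CustomDriver
import CustomDriver.api._

val action = Users.filter(_.id === 1).patch(User(name = Some("Change Name")))
// db.run(action)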
Building on JonasAnso's answer, converting that to slick v3.0+, and putting it into a transaction:
def partialUpdate(id: Int, name: Option[String], login: Option[String]): Future[Int] = {
  val selectQ = users.filter(_.id === id)

  val query = selectQ.result.head.flatMap { data =>
    selectQ.update(data.patch(name, login))
  }

  db.run(query.transactionally)
}
As I commented, the question is similar to an existing one, but you don't seem to have any extra requirements.
The simplest approach is just SELECT + UPDATE. For example, you add a patch function to your DataRow class defining how you want to update your model:
def patch(name: Option[String], state: Option[State], price: Option[Int]): DataRow =
  this.copy(name = name.getOrElse(this.name), state = state.getOrElse(this.state),
    price = price.getOrElse(this.price))
And you add a partialUpdate method to your repo class:
class DataRepo {
  private val Datas = TableQuery[Data]

  val db = ???

  def partialUpdate(id: Int, name: Option[String], state: Option[State], price: Option[Int]): Future[Int] = {
    val query = Datas.filter(_.id === id)
    for {
      data <- db.run(query.result.head)
      result <- db.run(query.update(data.patch(name, state, price)))
    } yield result
  }
}
As you can see, the main problem with this solution is that it issues two SQL statements, a SELECT and an UPDATE.
Another solution is to use plain SQL (http://slick.typesafe.com/doc/3.0.0/sql.html), but of course that brings its own problems.
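For illustration only, a rough sketch of the plain-SQL route (assuming a profile import such as slick.jdbc.PostgresProfile.api._ and the "data" table from the question; state is left out because a custom column type would need its own SetParameter). It sidesteps dynamic SQL building by issuing one small UPDATE per defined field, so it is still not a single statement:

import slick.jdbc.PostgresProfile.api._

def partialUpdatePlainSql(id: Int, name: Option[String], price: Option[Int]): DBIO[Unit] = {
  // One UPDATE per defined field; undefined fields are simply skipped.
  val statements = Seq(
    name.map(n => sqlu"update data set name = $n where id = $id"),
    price.map(p => sqlu"update data set price = $p where id = $id")
  ).flatten
  DBIO.seq(statements: _*)
}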
Long story short, I'm trying to figure out how to define a function from a generic input to a single-typed output.
The background: This is a continuation of Mapping over Shapeless record. After Travis's excellent answer, I now have the following:
import shapeless._
import poly._
import syntax.singleton._
import record._

type QueryParams = Map[String, Seq[String]]

trait RequestParam[T] {
  def value: T

  /** Convert value back to a query parameter representation */
  def toQueryParams: Seq[(String, String)]

  /** Mark this parameter for auto-propagation in new URLs */
  def propagate: Boolean

  protected def queryStringPresent(qs: String, allParams: QueryParams): Boolean = allParams.get(qs).nonEmpty
}

type RequestParamBuilder[T] = QueryParams => RequestParam[T]

def booleanRequestParam(paramName: String, willPropagate: Boolean): RequestParamBuilder[Boolean] = { params =>
  new RequestParam[Boolean] {
    def propagate: Boolean = willPropagate
    def value: Boolean = queryStringPresent(paramName, params)
    def toQueryParams: Seq[(String, String)] = Seq(paramName -> "true").filter(_ => value)
  }
}

def stringRequestParam(paramName: String, willPropagate: Boolean): RequestParamBuilder[Option[String]] = { params =>
  new RequestParam[Option[String]] {
    def propagate: Boolean = willPropagate
    def value: Option[String] = params.get(paramName).flatMap(_.headOption)
    def toQueryParams: Seq[(String, String)] = value.map(paramName -> _).toSeq
  }
}
In reality, the following would be a class constructor that takes this Map read from the query string as a parameter, but for simplicity's sake, I'm just defining a val:
val requestParams = Map("no_ads" -> Seq("true"), "edition" -> Seq("us"))
// In reality, there are many more possible parameters, but this is simplified
val options = ('adsDebug ->> booleanRequestParam("ads_debug", true)) ::
  ('hideAds ->> booleanRequestParam("no_ads", true)) ::
  ('edition ->> stringRequestParam("edition", false)) ::
  HNil

object bind extends FieldPoly {
  implicit def rpb[T, K](implicit witness: Witness.Aux[K]): Case.Aux[
    FieldType[K, RequestParamBuilder[T]],
    FieldType[K, RequestParam[T]]
  ] = atField(witness)(_(requestParams))
}
// Create queryable option values record by binding the request parameters
val boundOptions = options.map(bind)
This lets me do:
boundOptions.get('hideAds).value // -> true
The problem: Now I want to be able to reserialize options that have propagate = true. So basically, I need to filter my HList on the propagate field of every member, which should always return a Boolean, and then have each parameter reserialize itself to a Seq[(String, String)]. I've tried the following:
object propagateFilter extends (RequestParam ~> Const[Boolean]) {
  override def apply[T](r: RequestParam[T]): Boolean = r.propagate
}

object unbind extends (RequestParam ~> Const[Seq[(String, String)]]) {
  override def apply[T](r: RequestParam[T]): Seq[(String, String)] = r.toQueryParams
}
// Reserialize a query string for options that should be propagated
val propagatedParams = boundOptions.values.filter(propagateFilter).map(unbind).toList
// (followed by conventional collections methods)
But it doesn't like my functions. I got the following error:
<console>:31: error: type mismatch;
found : Boolean
required: shapeless.Const[T]
(which expands to) AnyRef{type λ[T] = T}
override def apply[T](r: RequestParam[T]) = r.propagate
I believe I'm taking the wrong approach to a function that's supposed to have a polymorphic input but monomorphic output.
Other failed attempts:
object propagateFilter extends Poly1 {
  implicit def default[T](implicit st: Case.Aux[RequestParam[T], Boolean]) = at[RequestParam[T]](_.propagate)
}
and
def propagate[T](x: RequestParam[T]): Boolean = x.propagate
and
object propagateFilter extends Poly1 {
  implicit def default = at[RequestParam[_]](_.propagate)
}
and
object propagateFilter extends FieldPoly {
  implicit def rpb[T, K](implicit witness: Witness.Aux[K]): Case.Aux[
    FieldType[K, RequestParam[T]],
    Boolean
  ] = atField(witness)(_.propagate)
}
None of which work, probably due to my own misunderstanding of what's going on.