I am attempting to use scala-cass to read from Cassandra and convert the result set to a case class using resultSet.as[CaseClass]. This works great when running the following:
import com.weather.scalacass.syntax._
case class TestTable(id: String, data1: Int, data2: Long)
val resultSet = session.execute(s"select * from test.testTable limit 10")
resultSet.one.as[TestTable]
Now I am attempting to make this more generic, but I am unable to find the proper type constraint for the generic class:
import com.weather.scalacass.syntax._
case class TestTable(id: String, data1: Int, data2: Long)
abstract class GenericReader[T] {
val table: String
val keyspace: String
def getRows(session: Session): T = {
val resultSet = session.execute(s"select * from $keyspace.$table limit 10")
resultSet.one.as[T]
}
}
I implement this class with the desired case class and attempt to call getRows on the resulting object:
object TestTable extends GenericReader[TestTable] {
val keyspace = "test"
val table = "TestTable"
}
TestTable.getRows(session)
This throws an exception: could not find implicit value for parameter ccd: com.weather.scalacass.CCCassFormatDecoder[T].
I am trying to add a type constraint to GenericReader to ensure the implicit conversion will work, but I cannot find the proper type. I have been reading through the scala-cass source looking for the right constraint, with no luck so far.
I would also be happy to use any other library that can achieve this.
Looks like as[T] requires an implicit value that you don't have in scope, so you'll need to require that implicit parameter in the getRows method as well.
def getRows(session: Session)(implicit cfd: CCCassFormatDecoder[T]): T
You could express this as a type constraint (what you were looking for in the original question) using context bounds:
abstract class GenericReader[T: CCCassFormatDecoder]
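Fleshed out, the context-bound version might look like the following sketch; it assumes the DataStax driver's Session type and that scala-cass can derive the decoder for your case class:
import com.datastax.driver.core.Session
import com.weather.scalacass.CCCassFormatDecoder
import com.weather.scalacass.syntax._

// The context bound [T: CCCassFormatDecoder] puts an implicit decoder
// in scope for the whole class body, which is what as[T] needs.
abstract class GenericReader[T: CCCassFormatDecoder] {
  val table: String
  val keyspace: String

  def getRows(session: Session): T = {
    val resultSet = session.execute(s"select * from $keyspace.$table limit 10")
    resultSet.one.as[T]
  }
}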
Rather than try to bound your generic T type, it might be easier to just pass through the missing implicit parameter:
abstract class GenericReader[T](implicit ccd: CCCassFormatDecoder[T]) {
val table: String
val keyspace: String
def getRows(session: Session): T = {
val resultSet = session.execute(s"select * from $keyspace.$table limit 10")
resultSet.one.as[T]
}
}
Finding a concrete value for that implicit can then be deferred to the point where you narrow T to a specific class (like object TestTable extends GenericReader[TestTable]).
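A minimal sketch of that narrowing, reusing the question's object:
object TestTable extends GenericReader[TestTable] {
  val keyspace = "test"
  val table = "TestTable"
}

// The CCCassFormatDecoder[TestTable] is resolved at the extends clause
// above, where T becomes concrete, so this now compiles:
val row: TestTable = TestTable.getRows(session)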
I am using a case class which has nested case classes and a Seq of nested case classes. The problem is that when I try to serialize it using KafkaAvroSerializer, it throws:
Caused by: java.lang.IllegalArgumentException: Unsupported Avro type. Supported types are null, Boolean, Integer, Long, Float, Double, String, byte[] and IndexedRecord
at io.confluent.kafka.serializers.AbstractKafkaAvroSerDe.getSchema(AbstractKafkaAvroSerDe.java:115)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:71)
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:54)
at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:60)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:879)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:841)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:728)
If you want to use Avro with Scala constructs like case classes, I recommend Avro4s. It has native support for all Scala features and can even create the schema from your model if that is what you want.
There are some gotchas with automatic type class derivation, though. This is what I learned.
Use at least avro4s version 2.0.4
Some of the macros generate code with compiler warnings and also break WartRemover. We had to add the following annotations to get our code to compile (sometimes the error is "could not find implicit", but it is actually caused by an error in the macro-generated code):
@com.github.ghik.silencer.silent
@SuppressWarnings(Array("org.wartremover.warts.Null", "org.wartremover.warts.AsInstanceOf", "org.wartremover.warts.StringPlusAny"))
Next, automatic type class derivation only works one level at a time. I created an object to hold all my SchemaFor, Decoder and Encoder instances for my schema, then built up the type class instances explicitly, starting from the innermost types. I also used implicitly to verify that each ADT would resolve before moving on to the next one. For example:
import java.time.Instant
import com.sksamuel.avro4s.{Decoder, Encoder, SchemaFor}

sealed trait Notification
object Notification {
final case class Outstanding(attempts: Int) extends Notification
final case class Complete(attempts: Int, completedAt: Instant) extends Notification
}
sealed trait Job
final case class EnqueuedJob(id: String, enqueuedAt: Instant) extends Job
final case class RunningJob(id: String, enqueuedAt: Instant, startedAt: Instant) extends Job
final case class FinishedJob(id: String, enqueuedAt: Instant, startedAt: Instant, completedAt: Instant) extends Job
object Schema {
// Explicitly define schema for ADT instances
implicit val schemaForNotificationComplete: SchemaFor[Notification.Complete] = SchemaFor.applyMacro
implicit val schemaForNotificationOutstanding: SchemaFor[Notification.Outstanding] = SchemaFor.applyMacro
// Verify Notification ADT is defined
implicitly[SchemaFor[Notification]]
implicitly[Decoder[Notification]]
implicitly[Encoder[Notification]]
// Explicitly define schema, decoder and encoder for ADT instances
implicit val schemaForEnqueuedJob: SchemaFor[EnqueuedJob] = SchemaFor.applyMacro
implicit val decodeEnqueuedJob: Decoder[EnqueuedJob] = Decoder.applyMacro
implicit val encodeEnqueuedJob: Encoder[EnqueuedJob] = Encoder.applyMacro
implicit val schemaForRunningJob: SchemaFor[RunningJob] = SchemaFor.applyMacro
implicit val decodeRunningJob: Decoder[RunningJob] = Decoder.applyMacro
implicit val encodeRunningJob: Encoder[RunningJob] = Encoder.applyMacro
implicit val schemaForFinishedJob: SchemaFor[FinishedJob] = SchemaFor.applyMacro
implicit val decodeFinishedJob: Decoder[FinishedJob] = Decoder.applyMacro
implicit val encodeFinishedJob: Encoder[FinishedJob] = Encoder.applyMacro
// Verify Job ADT is defined
implicitly[Encoder[Job]]
implicitly[Decoder[Job]]
implicitly[SchemaFor[Job]]
// And so on until complete nested ADT is defined
}
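Once the instances resolve, avro4s can also derive the Avro schema straight from the model. A minimal sketch, assuming avro4s 2.x's AvroSchema entry point:
import com.sksamuel.avro4s.AvroSchema
import org.apache.avro.Schema

// Derives an org.apache.avro.Schema for the Job ADT; the sealed trait
// becomes an Avro union of EnqueuedJob, RunningJob and FinishedJob.
val jobSchema: Schema = AvroSchema[Job]
println(jobSchema.toString(true)) // pretty-printed JSON schema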
I have a very generic message object that I get back from a queue like:
case class Message(key: String, properties: Map[String, String])
I then have a bunch of very specific classes that represent a message, and I use properties.get("type") to determine which particular message it is:
sealed trait BaseMessage
case class LoginMessage(userId: Int, ....) extends BaseMessage
case class RegisterMessage(email: String, firstName: String, ....) extends BaseMessage
Now in my code I have to convert from a generic Message to a particular message in many places, and I want to do this conversion in a single place. Currently I am doing something like:
val m = Message(....)
val myMessage = m.properties.get("type") match {
case Some("login") => LoginMessage(m.properties("userID"), ...)
case ...
}
What options do I have for making this less cumbersome in Scala?
I don't know your full context here, but I can suggest using implicit conversions if you don't want to bring another library into your project. Implicit conversions let you keep the implementation well separated, or override it on the fly as needed.
We can start by defining a MessageConverter trait that is actually a function:
import scala.util.Try

/**
* Try[T] here is useful to track deserialization errors. If you don't need it you can use Option[T] instead.
*/
trait MessageConverter[T <: BaseMessage] extends (Message => Try[T])
Now define an object that holds the implementations and also enables a nice as[T] method on Message instances:
object MessageConverters {
/**
* Useful to perform conversions such as:
* {{{
* import MessageConverters._
*
* message.as[LoginMessage]
* message.as[RegisterMessage]
* }}}
*/
implicit class MessageConv(val message: Message) extends AnyVal {
def as[T <: BaseMessage : MessageConverter]: Try[T] =
implicitly[MessageConverter[T]].apply(message)
}
// Define below message converters for each particular type
implicit val loginMessageConverter: MessageConverter[LoginMessage] =
  new MessageConverter[LoginMessage] {
    override def apply(message: Message): Try[LoginMessage] = Try {
      // Parse the properties and build the instance here, or fail if you can't.
      LoginMessage(message.properties("userID").toInt /*, ...other fields */)
    }
  }
}
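Hypothetical usage (the "userID" key mirrors the question's property name; everything else is illustrative):
import scala.util.{Failure, Success}
import MessageConverters._

val raw = Message("some-key", Map("type" -> "login", "userID" -> "42"))
raw.as[LoginMessage] match {
  case Success(login) => // proceed with the typed message
    println(s"user ${login.userId} logged in")
  case Failure(error) =>
    println(s"could not decode message: $error")
}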
That's it! It may not be the best solution, as implicits bring complexity and make code harder to follow. However, if you follow a well-defined structure for storing these implicit values and are careful how you pass them around, you shouldn't have any issues.
You can convert the properties map to JSON and read it as a case class. Assuming the map keys have the same names as your case class fields, you can write a formatter using play-json:
object LoginMessage {
implicit val fmtLoginMessage = Json.format[LoginMessage]
}
If the fields don't have the same names, you will have to write the Reads instance manually. Your code to convert to a case class would be something like:
object BaseMessageFactory {
def getMessage(msg: Message): Option[BaseMessage] = {
val propertiesJson = Json.toJson(msg.properties)
msg.properties.get("type").map {
case "login" => propertiesJson.as[LoginMessage]
...
case other => sys.error(s"Unknown message type: $other") // or your preferred error handling
}
}
}
The signature may differ depending on how you want to deal with error handling.
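If the keys don't line up with the field names, a hand-written Reads can do the renaming. A sketch, assuming for brevity that LoginMessage has only a userId field and that the map stores it under "userID" as a string:
import play.api.libs.json._

// Read the "userID" property as a String and convert it to the Int field.
implicit val loginMessageReads: Reads[LoginMessage] =
  (__ \ "userID").read[String].map(s => LoginMessage(s.toInt))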
Let's say I have the following case class:
case class Product(name: String, categoryId: Option[Long]/*, other fields....*/)
Here you can see that categoryId is optional.
Now let's say I have a following method in my DAO layer:
getCategoryProducts(): List[Product] = {
// query products that have categoryId defined
}
This method returns products that are guaranteed to have categoryId defined with some value.
What I would like to do is something like this:
trait HasCategory {
def categoryId_!: Long
}
// and then specify in method signature
getCategoryProducts(): List[Product with HasCategory]
This will work, but then such a product will have two methods, categoryId_! and categoryId, which smells bad.
Another way would be:
sealed trait Product {
def name: String
/*other fields*/
}
case class SimpleProduct(name: String, /*, other fields....*/) extends Product
case class ProductWithCategory(name: String, categoryId: Long/*, other fields....*/) extends Product
def getCategoryProducts: List[ProductWithCategory] = ...
This approach avoids the duplicated categoryId and categoryId_! methods, but it requires you to create two case classes and a trait that duplicate all the fields, which also smells.
My question: how can I use the Scala type system to express this specific case without duplicating fields?
Not sure how much this will scale for your particular case, but one solution that comes to mind is to parameterize over the Option type using a higher-kinded generic type:
object Example {
import scala.language.higherKinds
type Id[A] = A
case class Product[C[_]](name: String, category: C[Long])
def productsWithoutCategories: List[Product[Option]] = ???
def productsWithCategories: List[Product[Id]] = ???
}
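Usage might then look like the following sketch; explicit type arguments guide the higher-kinded parameter:
object UsageExample {
  import Example._

  // A product without a category wraps the id in Option; one with a
  // guaranteed category uses the Id alias, i.e. a bare Long.
  val uncategorized: Product[Option] = Product[Option]("chair", None)
  val categorized: Product[Id] = Product[Id]("desk", 42L)

  // The Id-parameterized product exposes a plain Long; no unwrapping needed.
  val id: Long = categorized.category
}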
Another way to do it is to use type classes:
import scala.language.implicitConversions
object Example {
sealed class CartId[T]
implicit object CartIdSomeWitness extends CartId[Some[Long]]
implicit object CartIdNoneWitness extends CartId[None.type]
implicit object CartIdPresentWitness extends CartId[Long]
case class Product[T: CartId](name: String, categoryId: T /*, other fields....*/)
val id: Long = 7
val withId = Product("dsds", id)
val withSomeId = Product("dsds", Some(id))
val withNoneId = Product("dsds", None)
val presentId: Long = withId.categoryId
val maybeId: Some[Long] = withSomeId.categoryId
val noneId: None.type = withNoneId.categoryId
val p = Product("sasa", true) //Error:(30, 18) could not find implicit value for evidence parameter of type com.novak.Program.CartId[Boolean]
}
This solution involves some boilerplate and depends on implicits, but it does what you're trying to achieve.
Be aware that this solution is not completely sealed and can be 'hacked'. You can cheat and do something like this:
val hack: Product[Boolean] = Product("a", true)(new CartId[Boolean])
val b: Boolean = hack.categoryId
For some more advanced solutions, see:
* Miles Sabin (@milessabin)'s unboxed union types in Scala via the Curry-Howard isomorphism
* the Scalaz \/ operator
http://eed3si9n.com/learning-scalaz/Coproducts.html
I used a case class to map my class to data for Slick 2 before, but now I use another Play plugin whose model is a case class, and my class inherits from that case class. So I can no longer use a case class, since Scala forbids a case class inheriting from another case class.
before:
case class User()
class UserTable(tag: Tag) extends Table[User](tag, "User") {
...
def * = (...)<>(User.tupled,User.unapply)
}
It works. But now I need to change the above to the following:
case class BasicProfile()
class User(...) extends BasicProfile(...){
...
def unapply(i:User):Tuple12[...]= Tuple12(...)
}
class UserTable(tag: Tag) extends Table[User](tag, "User") {
...
def * = (...)<>(User.tupled,User.unapply)
}
I do not know how to write the tupled and unapply methods (I am not sure whether my version is correct) the way a case class auto-generates them. Alternatively, you could show me another way to map the class to a table with Slick 2.
Can anyone give me an example?
First of all, this case class is a bad idea:
case class BasicProfile()
Case classes compare by their member values, and this one doesn't have any. The name is not great either, because Slick has a class with the same name, which may cause confusion.
Regarding your class
class User(...) extends BasicProfile(...){
...
def unapply(i:User):Tuple12[...]= Tuple12(...)
}
It is possible to emulate case classes yourself. Are you doing that because of the 22 field limit? FYI: Scala 2.11 supports larger case classes. We are doing what you are trying at Sport195, but there are several aspects to take care of.
apply and unapply need to be members of object User (the companion object of class User). .tupled is not a real method, but is generated automatically by the Scala compiler. It turns a method like .apply, which takes a list of arguments, into a function that takes a single tuple of those arguments. As tuples are limited to 22 elements, so is .tupled. But you could of course generate one yourself; you may just have to give it another name.
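For instance, a hand-rolled companion might look like the sketch below; the two fields stand in for the asker's twelve, and BasicProfile is simplified to a plain class:
class BasicProfile(val id: String)

class User(id: String, val email: String) extends BasicProfile(id)

object User {
  def apply(id: String, email: String): User = new User(id, email)
  def unapply(u: User): Option[(String, String)] = Some((u.id, u.email))
  // The compiler only generates .tupled for FunctionN values, so expose
  // one yourself by eta-expanding apply:
  val tupled: ((String, String)) => User = (apply _).tupled
}
// UserTable's projection can then stay unchanged:
// def * = (id, email) <> (User.tupled, User.unapply)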
We are using the Slick code generator in combination with the twirl template engine (it uses # to insert expressions; the $ parts are inserted as-is into the generated Scala code and evaluated when the generated code is compiled/run). Here are a few snippets that may help you:
Generate apply method
/** Factory for #{name} objects
#{indentN(2,entityColumns.map(c => "* #param "+c.name+" "+c.doc).mkString("\n"))}
*/
final def apply(
#{indentN(2,
entityColumns.map(c =>
colWithTypeAndDefault(c)
).mkString(",\n")
)}
) = new #{name}(#{columnsCSV})
Generate unapply method:
#{if(entityColumns.size <= 22)
s"""
/** Extractor for ${name} objects */
final def unapply(o: ${name}) = Some((${entityColumns.map(c => "o."+c.name).mkString(", ")}))
""".trim
else
""}
Trait that can be mixed into User to make it a Scala Product:
trait UserBase extends Product {
// Product interface
def canEqual(that: Any): Boolean = that.isInstanceOf[#name]
def productArity: Int = #{entityColumns.size}
def productElement(n: Int): Any = Seq(#{columnsCSV})(n)
override def toString = #{name}+s"(${productIterator.toSeq.mkString(",")})"
...
case-class like .copy method
final def copy(
#{indentN(2,columnsCopy)}
): #{name} = #{name}(#{columnsCSV})
To use those classes with Slick you have several options, all somewhat newer and not (well) documented. Slick's normal <> operator goes via tuples, but that's not an option for more than 22 columns. One option is the new fastpath converters. Another is mapping via a Slick HList. No examples exist for either. A third option is going via a custom Shape, which is what we do. This requires you to define a custom shape for your User class and another class, defined using Column types, to mirror User within queries, like this: http://slick.typesafe.com/doc/2.1.0/api/#scala.slick.lifted.ProductClassShape This is too verbose to write by hand, so we use the following template code:
/** class for holding the columns corresponding to #{name}
* used to identify this entity in a Slick query and map
*/
class #{name}Columns(
#{indent(
entityColumns
.map(c => s"val ${c.name}: Column[${c.exposedType}]")
.mkString(", ")
)}
) extends Product{
def canEqual(that: Any): Boolean = that.isInstanceOf[#name]
def productArity: Int = #{entityColumns.size}
def productElement(n: Int): Any = Seq(#{columnsCSV})(n)
}
/** shape for mapping #{name}Columns to #{name} */
object #{name}Implicits{
implicit object #{name}Shape extends ClassShape(
Seq(#{
entityColumns
.map(_.exposedType)
.map(t => s"implicitly[Shape[ShapeLevel.Flat, Column[$t], $t, Column[$t]]]")
.mkString(", ")
}),
vs => #{name}(#{
entityColumns
.map(_.exposedType)
.zipWithIndex
.map{ case (t,i) => s"vs($i).asInstanceOf[$t]" }
.mkString(", ")
}),
vs => new #{name}Columns(#{
entityColumns
.map(_.exposedType)
.zipWithIndex
.map{ case (t,i) => s"vs($i).asInstanceOf[Column[$t]]" }
.mkString(", ")
})
)
}
import #{name}Implicits.#{name}Shape
A few helpers we put into the Slick code generator:
val columnsCSV = entityColumns.map(_.name).mkString(", ")
val columnsCopy = entityColumns.map(c => colWithType(c)+" = "+c.name).mkString(", ")
val columnNames = entityColumns.map(_.name.toString)
def colWithType(c: Column) = s"${c.name}: ${c.exposedType}"
def colWithTypeAndDefault(c: Column) =
colWithType(c) + colDefault(c).map(" = "+_).getOrElse("")
def indentN(n:Int,code: String): String = code.split("\n").mkString("\n"+List.fill(n)(" ").mkString(""))
I know this may be a bit troublesome to replicate, especially if you are new to Scala. I hope to find the time to get it into the official Slick code generator at some point.
I wrote a trait to mix into a class the ability to serialize itself to query string parameters, leveraging an existing JSON Writes instance. In order to use that Writes instance as a parameter, I need to know the type into which this trait is being mixed. I'm getting that using a type parameter (which should be the class itself) and a self-type annotation. I'm wondering if there's a DRYer way of doing this that doesn't require the type parameter.
Here's my code:
trait ConvertibleToQueryString[T] {
this: T =>
/** Transformation of field names in obj to query string keys */
def objToQueryStringMapping: Map[JsPath, JsPath] = Map.empty
/**
* Convert a model to a Map, for serialization to a query string, to be used
* in a REST API call.
* @param writes writer for `obj`
* @return
*/
def toQueryStringMap(implicit writes: Writes[T]): Map[String, String] = {
// Get a map of key -> JsValue from obj
val mapObj = Json.toJson(this).transform(moveKeys(objToQueryStringMapping)).get.value
// Convert the JsValue values of the map to query strings
mapObj.mapValues(jsValueToQueryStringValue).filter(_._2.nonEmpty).toMap
}
}
, to be used as such:
case class MyClass(param1: String, param2: Int) extends ConvertibleToQueryString[MyClass]
, that final type parameter being the thing that's annoying me. It's fully unconstrained, but it should really just be "the type of the class I get mixed into". Is there a way to express this?
Why not use the pimp/enrich-my-library pattern:
implicit class ConvertibleToQueryString[T: Writes](x: T) {
def objToQueryStringMapping: Map[JsPath, JsPath] = Map.empty
def toQueryStringMap: Map[String, String] = {
// Get a map of key -> JsValue from obj
val mapObj = Json.toJson(x).transform(moveKeys(objToQueryStringMapping)).get.value
// Convert the JsValue values of the map to query strings
mapObj.mapValues(jsValueToQueryStringValue).filter(_._2.nonEmpty).toMap
}
}
Now you don't need the extends ... at all on the classes you want to serialize.
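Hypothetical usage, assuming a play-json Writes in scope and the answer's moveKeys/jsValueToQueryStringValue helpers defined elsewhere:
import play.api.libs.json.{Json, Writes}

case class MyClass(param1: String, param2: Int)
object MyClass {
  implicit val writes: Writes[MyClass] = Json.writes[MyClass]
}

// No trait to mix in; the implicit class adds the method:
val qs: Map[String, String] = MyClass("a", 1).toQueryStringMap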