Scala Type Classes: Understanding Interface Syntax

I was reading about Cats and I encountered the following code snippet, which is about serializing objects to JSON!
It starts with a trait like this:
trait JsonWriter[A] {
  def write(value: A): Json
}
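(The snippets assume a small Json AST roughly like the following; this sketch is only here so the examples are self-contained:)

sealed trait Json
final case class JsObject(get: Map[String, Json]) extends Json
final case class JsString(get: String) extends Json
final case class JsNumber(get: Double) extends Json
case object JsNull extends Json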
After this, there are some instances of our domain object:
final case class Person(name: String, email: String)

object JsonWriterInstances {
  implicit val stringWriter: JsonWriter[String] =
    new JsonWriter[String] {
      def write(value: String): Json =
        JsString(value)
    }

  implicit val personWriter: JsonWriter[Person] =
    new JsonWriter[Person] {
      def write(value: Person): Json =
        JsObject(Map(
          "name"  -> JsString(value.name),
          "email" -> JsString(value.email)
        ))
    }

  // etc...
}
So far so good! I can then use this like this:
import JsonWriterInstances._

Json.toJson(Person("Dave", "dave@example.com"))
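(Presumably this call is backed by an interface object along these lines; shown here only as a sketch so the snippet above compiles:)

object Json {
  def toJson[A](value: A)(implicit w: JsonWriter[A]): Json =
    w.write(value)
}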
Later on I come across something called the interface syntax, which uses extension methods to extend existing types with interface methods like below:
object JsonSyntax {
  implicit class JsonWriterOps[A](value: A) {
    def toJson(implicit w: JsonWriter[A]): Json =
      w.write(value)
  }
}
This then simplifies the call to serializing a Person as:
import JsonWriterInstances._
import JsonSyntax._

Person("Dave", "dave@example.com").toJson
What I don't understand is how the Person is boxed into JsonWriterOps such that I can call toJson directly, as though toJson were defined on the Person case class itself. I like this magic, but I fail to understand this last step about JsonWriterOps. So what is the idea behind this interface syntax, and how does it work? Any help?

This is actually a standard Scala feature: since JsonWriterOps is marked implicit and is in scope, the compiler can apply it at compile time when needed.
Hence scalac will do the following transformations:
Person("Dave", "dave#example.com").toJson
new JsonWriterOps(Person("Dave", "dave#example.com")).toJson
new JsonWriterOps[Person](Person("Dave", "dave#example.com")).toJson
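You can also spell the last step out by hand, supplying the implicit writer explicitly (using the names from the question):

import JsonWriterInstances._
import JsonSyntax._

// what the compiler effectively generates, written out explicitly
new JsonWriterOps[Person](Person("Dave", "dave@example.com"))
  .toJson(personWriter)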
Side note:
It's much more efficient to define implicit classes as value classes, like this:

implicit class JsonWriterOps[A](val value: A) extends AnyVal
This also lets the compiler optimize away the object construction where possible, compiling the whole implicit conversion plus method call down to a simple function call.
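Put together, the value-class version of the syntax object would look like this (a sketch based on the snippet above):

object JsonSyntax {
  implicit class JsonWriterOps[A](val value: A) extends AnyVal {
    def toJson(implicit w: JsonWriter[A]): Json =
      w.write(value)
  }
}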

Related

Testing a unit with implicit class in Scala

Imagine I have a service:
class ServiceA(serviceB: ServiceB) {
  import Extractor._

  def send(typeA: A) = serviceB.send(typeA.extract)
}

object Extractor {
  implicit class Extractor(typeA: A) {
    def extract = ???
  }
}
I want the extract method to be implicitly defined, because it doesn't directly relate to the A type/domain and is a solution-specific ad hoc extension.
Now I would like to write a very simple unit test that confirms that serviceB.send is called.
For that, I mock serviceB and pass a mocked A to send. Then I could just assert that serviceB.send was called with the mocked A.
As seen in the example, the send method also does some transformation on the typeA parameter, so I would need to mock the extract method to return my specified value. However, A doesn't have an extract method - it comes from the implicit class.
So the question is: how do I mock out the implicit class in the example above, given that imports are not first-class citizens?
If you want to specify a bunch of customised extract methods, you can do something like this:
sealed trait Extractor[T] {
  // or whatever signature you want
  def extract(obj: T): String
}

object Extractor {
  implicit case object IntExtractor extends Extractor[Int] {
    def extract(obj: Int): String = s"I am an int and my value is $obj"
  }

  implicit case object StringExtractor extends Extractor[String] {
    def extract(obj: String): String = s"I am a string and my value is $obj"
  }

  def apply[A: Extractor](obj: A): String = {
    implicitly[Extractor[A]].extract(obj)
  }
}
So you basically have a sealed type family that's pre-materialised through case objects, which are arguably only useful in a match. But that would let you decouple everything.
If you don't want to mix this with Extractor, call them something else and follow the same approach; you can then mix it all in with a context bound.
Then you can use this with:
println(Extractor(5))
For testing, simply override the available implicits if you need to. It's a bit of work, but not impossible: you control the scope, so you can spy on whatever method calls you want.
E.g. instead of import Extractor._, have some other object with test-only logic where you can use mocks or an alternative implementation.
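For illustration, a test-only set of instances might look something like this (just a sketch; TestExtractors and the canned value are made-up names, not from the question):

object TestExtractors {
  // test-only instance that returns a canned value instead of the real logic
  implicit case object StubIntExtractor extends Extractor[Int] {
    def extract(obj: Int): String = "stubbed"
  }
}

// In the test, import the stub instead of the production instance;
// an implicit brought in by an import takes precedence over one found in the companion object:
// import TestExtractors._
// assert(Extractor(5) == "stubbed")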

Scala implicit conversion not recognised

I have the following object for converting an object of type ParsedItemDocument to a JSON String. I should note that ParsedItemDocument is a trait. My problem is that the implicit conversion that is called in the second snippet is not recognized by the compiler. Is there anything more that needs to be done for the implicit conversion to work?
import java.io.StringWriter

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

import scala.language.implicitConversions
import wikidataParser.ParsedItemDocument

object Converters {
  def toJson(obj: Any): String = {
    val mapper = new ObjectMapper()
    mapper.registerModule(DefaultScalaModule)
    val out = new StringWriter
    mapper.writeValue(out, obj)
    out.toString
  }

  implicit def parsedItemDocumentToJsonString(item: ParsedItemDocument): String = {
    Converters.toJson(item)
  }
}
Now, I use the following code-snippet in my code
import tools.Converters._
import wikidataParser.ParsedItemDocument

class WikipediaRankingTester2 extends FlatSpec {
  "It" should "do something" in {
    val jsonrdd: RDD[String] = rankedItems.map(t: Long, ParsedItemDocument) =>
      t._2.parsedItemDocumentToJsonString) // compilation error here
  }
}
You are mixing up implicit conversions and implicit classes.
If you want to use parsedItemDocumentToJsonString as a "method" of an object of type ParsedItemDocument, then you would need to define your implicit as
implicit class JSONParsing(item: ParsedItemDocument) {
  def parsedItemDocumentToJsonString: String = Converters.toJson(item)
}
If you declare it as an implicit conversion, as you did, then it means that you can call any method of String on an object of type ParsedItemDocument, as the object will be implicitly converted to a String through the implicit method.
Also, it is not great practice to declare an entire implicit class / conversion, unless you 1) cannot add it to the original class, or 2) will be reusing the conversion very often and it would save great amounts of code/readability. This does not seem to be the case here, as you are only wrapping Converters.toJson, which is not very verbose, and just as readable.
PS: the syntax in your map is wrong; the right syntax would be
val jsonrdd = rankedItems.map(t => t._2.parsedItemDocumentToJsonString)
If your implicit is working this should do it:
rankedItems.map(t => t._2)
You can test this by making the call explicit
rankedItems.map(t => parsedItemDocumentToJsonString(t._2))
If you want to add an extra method to a ParsedItemDocument you could use an implicit class; I don't think you need it, but your code looks a bit that way.
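Such an implicit class might look roughly like this (a sketch; ParsedItemDocumentSyntax and toJsonString are made-up names):

import tools.Converters
import wikidataParser.ParsedItemDocument

object ParsedItemDocumentSyntax {
  implicit class ParsedItemDocumentOps(item: ParsedItemDocument) {
    // adds a toJsonString "method" to any ParsedItemDocument
    def toJsonString: String = Converters.toJson(item)
  }
}

// usage:
// import ParsedItemDocumentSyntax._
// rankedItems.map(t => t._2.toJsonString)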

How to design immutable model classes when using inheritance

I'm having trouble finding an elegant way of designing some simple classes to represent HTTP messages in Scala.
Say I have something like this:
abstract class HttpMessage(headers: List[String]) {
  def addHeader(header: String) = ???
}

class HttpRequest(path: String, headers: List[String])
  extends HttpMessage(headers)

new HttpRequest("/", List("foo")).addHeader("bar")
How can I make the addHeader method return a copy of itself with the new header added? (and keep the current value of path as well)
Thanks,
Rob.
It is annoying, but the solution to implement your required pattern is not trivial.
The first point to notice is that if you want to preserve your subclass type, you need to add an abstract type member. Without this, you are not able to specify the unknown return type in HttpMessage:
abstract class HttpMessage(headers: List[String]) {
  type X <: HttpMessage

  def addHeader(header: String): X
}
Then you can implement the method in your concrete subclasses where you will have to specify the value of X:
class HttpRequest(path: String, headers: List[String])
  extends HttpMessage(headers) {

  type X = HttpRequest

  def addHeader(header: String): HttpRequest =
    new HttpRequest(path, headers :+ header)
}
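With that in place, the call from the question keeps the concrete type (and the current path):

// addHeader now returns an HttpRequest, so the result keeps path = "/"
val req: HttpRequest = new HttpRequest("/", List("foo")).addHeader("bar")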
A better, more scalable solution is to use implicits for this purpose.
trait HeaderAdder[T <: HttpMessage] {
  def addHeader(httpMessage: T, header: String): T
}
and now you can define your method on the HttpMessage class like the following:
abstract class HttpMessage(headers: List[String]) {
  type X <: HttpMessage

  def addHeader(header: String)(implicit headerAdder: HeaderAdder[X]): X =
    headerAdder.addHeader(this.asInstanceOf[X], header) // cast needed because `this` is only known to be an HttpMessage
}
This latest approach is based on the type class concept and scales much better than inheritance. The idea is that you are not forced to have a valid HeaderAdder[T] for every T in your hierarchy, and if you try to call the method on a class for which no implicit is available in scope, you will get a compile-time error.
This is great, because it saves you from having to implement addHeader = sys.error("This is not supported")
for certain classes in the hierarchy when it becomes "dirty", or from refactoring to avoid it becoming "dirty".
The best way to manage implicits is to put them in a trait like the following:
trait HeaderAdders {
  implicit val httpRequestHeaderAdder: HeaderAdder[HttpRequest] = new HeaderAdder[HttpRequest] { ... }
  implicit val httpWhatHeaderAdder: HeaderAdder[HttpWhat] = new HeaderAdder[HttpWhat] { ... }
}
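For illustration, one concrete instance might look like this (a sketch; it assumes HttpRequest exposes path and headers, e.g. as constructor vals, which the original class does not yet do):

implicit val httpRequestHeaderAdder: HeaderAdder[HttpRequest] =
  new HeaderAdder[HttpRequest] {
    def addHeader(httpMessage: HttpRequest, header: String): HttpRequest =
      new HttpRequest(httpMessage.path, httpMessage.headers :+ header) // assumes `path` and `headers` are accessible
  }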
and then you also provide an object, in case the user can't mix the trait in (for example, if you have frameworks that inspect the properties of the object through reflection, you don't want extra properties to be added to your current instance); see the selfless trait pattern (http://www.artima.com/scalazine/articles/selfless_trait_pattern.html)
object HeaderAdders extends HeaderAdders
So for example you can write things such as
// mixing example
class MyTest extends HeaderAdders // who cares about having two extra values in the object

// import example
import HeaderAdders._
class MyDomainClass // implicits are in scope, but not mixed into MyDomainClass, so reflection from Hibernate will still work correctly
By the way, this design problem is the same one the Scala collections face, with the only difference being that your HttpMessage plays the role of TraversableLike. Have a look at this question: Calling map on a parallel collection via a reference to an ancestor type

how to pass generic types to Argonaut

I am trying to wrap Argonaut (http://argonaut.io) in order to serialize/deserialize JSON in a Scala project. We were using Jerkson before, but as it has been discontinued we are looking for an alternative.
This is the basic JSON wrapper
import argonaut._, Argonaut._

object Json {
  def Parse[T](input: String): T = {
    input.decodeOption[T].get
  }
}
When I try and compile this I get the following errors.
could not find implicit value for evidence parameter of type argonaut.DecodeJson[T]
    input.decodeOption[T]
                      ^
not enough arguments for method decodeOption: (implicit evidence$6: argonaut.DecodeJson[T])Option[T].
Unspecified value parameter evidence$6.
    input.decodeOption[T]
                      ^
Any suggestions on how to fix this or pointers on what I am doing wrong would be most appreciated.
Also suggestions on alternative JSON frameworks are very welcome.
I'm kind of new to Scala/Java and how generics work there, but I have been writing .NET/C# for many years.
In order to make your code work, you will need to redefine the Json object like so:
object Json {
  def Parse[T](input: String)(implicit decode: DecodeJson[T]): Option[T] = {
    input.decodeOption[T]
  }
}
The thing you were missing was the implicit DecodeJson instance that the decodeOption function needs in order to figure out how to decode. You also need to define the return type as Option[T] instead of just T. A full example of this all working would look like this:
import argonaut._, Argonaut._

case class User(id: Long, email: String, name: String)

object Json {
  def Parse[T](input: String)(implicit decode: DecodeJson[T]): Option[T] = {
    input.decodeOption[T]
  }
}

object JsonTest {
  implicit val userDecode = casecodec3(User.apply, User.unapply)("id", "email", "name")

  def main(args: Array[String]) {
    val json = """{
      "id": 1,
      "email": "foo@test.com",
      "name": "foo bar"
    }"""

    val userOpt = Json.Parse[User](json)
    println(userOpt)
  }
}
As far as other JSON frameworks go, you could look into:
Play Json
json4s
spray-json
Jackson Scala Module
It seems that Argonaut, like pretty much all Scala serialization libraries, uses the type class pattern. This sounds like a fancy thing, but actually it just means that when serializing/deserializing an object of type T, it needs you to implicitly pass an instance of another object to which part or all of the process is deferred.
Specifically, when you do decodeOption[T], you need to have in scope an instance of argonaut.DecodeJson[T] (which decodeOption will use during the deserialization).
What you should do is simply require this implicit value to be passed to Parse (it will then automatically be passed along to decodeOption):
def Parse[T](input: String)(implicit decoder: argonaut.DecodeJson[T]): Option[T] = {
  input.decodeOption[T]
}
Scala even provides some syntactic sugar to make the declaration shorter (this is called a "context bound"):
def Parse[T: argonaut.DecodeJson](input: String): Option[T] = {
  input.decodeOption[T]
}
Now, when calling Parse, you'll need to bring in scope an implicit value of argonaut.DecodeJson, or the call will fail to compile. Apparently the Argonaut object already defines decoders for many standard types, so for those types you won't have anything special to do.
For other types (such as custom types of yours), you'll have to define decoders and import them.

getting "incompatibe type" in returning an object instace

I'm writing a Play! 2.1 application using ReactiveMongo. Each persistable case class has an object that holds 2 implicit objects, implementing BSONReader[...] and BSONWriter[...], and each case class has methods to return these:
trait Persistable {
  implicit def getReader: BSONReader[Persistable]
  implicit def getWriter: BSONWriter[Persistable]
  val collectionName: String
}

case class MyObj() extends Persistable {
  override val collectionName: String = MyObj.collectionName
  override def getReader: BSONReader[MyObj] = MyObj.MyObjBSONReader
  override def getWriter: BSONWriter[MyObj] = MyObj.MyObjBSONWriter
}

object MyObj {
  val collectionName: String = "MyObj"

  implicit object MyObjBSONReader extends BSONReader[MyObj] {
    def fromBSON(document: BSONDocument): MyObj = {
      val doc = document.toTraversable
      new MyObj(
      )
    }
  }

  implicit object MyObjBSONWriter extends BSONWriter[MyObj] {
    def toBSON(myObj: MyObj) = {
      BSONDocument(
      )
    }
  }
}
For some reason, getReader seems to work fine, but getWriter errors:
overriding method getWriter in trait Persistable of type => reactivemongo.bson.handlers.BSONWriter[models.persistable.Persistable];
 method getWriter has incompatible type
What am I doing wrong? Both seem to have similar signatures.
Another hint is that if I remove the return type from getWriter, I get a compile-time error in Eclipse:
type mismatch; found : models.persistable.MyObj.MyObjBSONWriter.type required:
reactivemongo.bson.handlers.BSONWriter[models.persistable.Persistable]
UPDATE:
I did as @AndrzejDoyle said below, but then the implementation of Persister, which was the heart of this exercise, complains:
def insert(persistable: Persistable) = {
  val collection = db(persistable.collectionName)
  import play.api.libs.concurrent.Execution.Implicits._

  implicit val reader = persistable.getReader
  implicit val writer = persistable.getWriter

  collection.insert(persistable)
}
error:
trait Persistable takes type parameters
It is due to covariance and contravariance.
The MongoDB reader is defined as BSONReader[+DocumentType]. The + in the generic parameter means that this class is covariant in that parameter. Or, more fully:
If B is a subclass of A, then BSONReader[B] is a subclass of BSONReader[A].
Therefore you can use a BSONReader[MyObj] everywhere that a BSONReader[Persistable] is required.
On the other hand, the writer is contravariant: BSONWriter[-DocumentType]. This means that
If B is a subclass of A, then BSONWriter[B] is a superclass of BSONWriter[A].
Therefore your BSONWriter[MyObj] is not a subclass of BSONWriter[Persistable], and so cannot be used in its place.
This might seem confusing initially (i.e. "why does contravariance make sense when it's 'backwards'?"). However if you think about what the classes are doing, it becomes clearer. The reader probably produces some instance of its generic parameter. A caller then might expect it to produce a Persistable - if you have a version that specifically produces MyObjs instead then this is fine.
The writer, on the other hand, is probably given an object of its generic parameter. A caller with a BSONWriter[Persistable] will call the write() method, passing in an instance of Persistable to be written. Your implementation can only write instances of MyObj, and so it doesn't actually match the interface. On the other hand, a BSONWriter[Object] would be a subclass of any BSONWriter, since it can (from a type perspective) accept any type as an argument.
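A minimal, self-contained sketch of the same variance behaviour, using toy Reader/Writer traits rather than the real ReactiveMongo ones:

object VarianceDemo {
  trait Persistable
  class MyObj extends Persistable

  // toy stand-ins with the same variance annotations as the real type classes
  trait Reader[+A] { def read(s: String): A }  // like BSONReader[+DocumentType]
  trait Writer[-A] { def write(a: A): String } // like BSONWriter[-DocumentType]

  val myObjReader = new Reader[MyObj] { def read(s: String) = new MyObj }
  val persistableWriter = new Writer[Persistable] { def write(a: Persistable) = "..." }

  val r: Reader[Persistable] = myObjReader // compiles: Reader[MyObj] <: Reader[Persistable]
  val w: Writer[MyObj] = persistableWriter // compiles: Writer[Persistable] <: Writer[MyObj]
  // val bad: Writer[Persistable] = new Writer[MyObj] { def write(a: MyObj) = "" } // would not compile
}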
The fundamental problem seems to be that your Persistable trait is looser than you intended. You probably want each implementation to return a reader and writer parameterized on itself, rather than on Persistable in general. You can achieve this via self-bounded generics:
trait Persistable[T <: Persistable[T]] {
  implicit def getReader: BSONReader[T]
  implicit def getWriter: BSONWriter[T]
  val collectionName: String
}
and then declare the class as MyObj extends Persistable[MyObj]. Now the reader and writer are expected to be parameterised on MyObj, and your existing implementations will compile.
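Concretely, the declaration from the question would then become (a sketch reusing the names above):

case class MyObj() extends Persistable[MyObj] {
  override val collectionName: String = MyObj.collectionName
  override def getReader: BSONReader[MyObj] = MyObj.MyObjBSONReader
  override def getWriter: BSONWriter[MyObj] = MyObj.MyObjBSONWriter
}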