Scala function return type based on generic

Using Scala generics I'm trying to abstract some common functions in my Play application. The functions return Seqs with objects deserialized from a REST JSON service.
def getPeople(cityName: String): Future[Seq[People]] = {
  getByEndpoint[People](s"http://localhost/person/$cityName")
}

def getDogs(): Future[Seq[Dog]] = {
  getByEndpoint[Dog]("http://localhost/doge")
}
The fetch and deserialization logic is packed into a single function using generics.
private def getByEndpoint[T](endpoint: String): Future[Seq[T]] = {
  ws.url(endpoint)
    .get()
    .map(rsp => rsp.json)
    .flatMap { json =>
      json.validate[Seq[T]] match {
        case s: JsSuccess[Seq[T]] =>
          Future.successful(s.get)
        case e: JsError =>
          Future.failed(new RuntimeException(s"Get by endpoint JSON match failed: $e"))
      }
    }
}
The problem is I'm getting "No Json deserializer found for type Seq[T]. Try to implement an implicit Reads or Format for this type.". I'm sure I'm not using T properly in Seq[T] (going by my C#/Java memories, at least), but I can't find any clue about how to do it the proper way in Scala. Everything works as expected without generics.

Play JSON uses type classes to capture information about which types can be (de-)serialized to and from JSON, and how. If you have an implicit value of type Format[Foo] in scope, that's referred to as an instance of the Format type class for Foo.
The advantage of this approach is that it gives us a way to constrain generic types (and have those constraints checked at compile time) that doesn't depend on subtyping. For example, there's no way the standard library's String will ever extend some kind of Jsonable trait that Play (or any other library) might provide, so we need some way of saying "we know how to encode Strings as JSON" that doesn't involve making String a subtype of some trait we've defined ourselves.
In Play JSON you can do this by defining implicit Format instances, and Play itself provides many of these for you (e.g., if you've got one for T, it'll give you one for Seq[T]). The validate method on JsValue requires one of these instances (actually a subtype of Format, Reads, but that's not terribly relevant here) for its type parameter—Seq[T] in this case—and it won't compile unless the compiler can find that instance.
You can provide this instance by adding the constraint to your own generic method:
private def getByEndpoint[T: Format](endpoint: String): Future[Seq[T]] = {
  ...
}
Now with the T: Format syntax you've specified that there has to be a Format instance for T (even though you don't constrain T in any other way), so the compiler knows how to provide the Format instance for Seq[T] that the json.validate[Seq[T]] call requires.
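As a minimal, self-contained sketch of why the context bound is enough (the People fields here are invented for illustration): with a Format[People] in implicit scope, Play's built-in instances give you Format[Seq[People]] automatically, so validate[Seq[People]] compiles. Note that T: Format is just shorthand for an extra (implicit ev: Format[T]) parameter list.
import play.api.libs.json._

case class People(name: String, city: String)
// Json.format derives a Format for the case class at compile time.
implicit val peopleFormat: Format[People] = Json.format[People]

val json: JsValue = Json.parse("""[{"name": "Ann", "city": "Oslo"}]""")
// Play provides Format[Seq[T]] for any T with a Format, so this compiles:
json.validate[Seq[People]] // JsSuccess(List(People(Ann, Oslo)), ...)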

Related

json4s Serialize and deserialize generic type

Because I am dealing with generic types I can't use specific case classes, so I created a generic util which serializes and deserializes generic objects.
import org.json4s
import org.json4s.Formats._
import org.json4s.native.JsonMethods._

object JsonHelper {
  def json2Object[O](input: String): O = {
    parse(json4s.string2JsonInput(input)).asInstanceOf[O]
  }

  def object2Json[O](input: O): String = {
    write(input).toString
  }
}
The compiler throws the error:
No JSON serializer found for type O. Try to implement an implicit Writer or JsonFormat for this type.
write(input).toString
I would have expected this to be thrown at runtime, so why is it thrown at compile time?
In a comment above, you asked "So how jackson can work with java object? It use reflection right? And why it's different from Scala?", which gets to the heart of this question.
The json4s "native" serializer you have imported uses compile-time reflection to create the Writer.
Jackson uses run-time reflection to do the same.
The compile-time version is more efficient; the run-time version is more flexible.
To use the compile-time version, you need to let the compiler have enough information to choose the correct Writer based on the declared type of the object to be serialized. This rules out very generic writer methods like the one you propose. See @TimP's answer for how to fix your code for that version.
To use the run-time version, you can use Jackson via the org.json4s.jackson.JsonMethods._ package. See https://github.com/json4s/json4s#jackson
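For instance, here is a minimal sketch of the run-time-reflection route using json4s's Jackson-backed Serialization object (which takes an implicit Formats plus a Manifest rather than Writer/Reader type class instances; the Person class is invented for illustration):
import org.json4s.DefaultFormats
import org.json4s.jackson.Serialization

case class Person(name: String, age: Int)

implicit val formats: org.json4s.Formats = DefaultFormats

// run-time reflection: no Writer or Reader instances are required
val json: String = Serialization.write(Person("Ann", 40)) // {"name":"Ann","age":40}
val back: Person = Serialization.read[Person](json)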
The compiler error you posted comes from this location in the json4s code. The write function you're calling takes an implicit JSON Writer, which is how the method can take arbitrary types. It's caught at compile time because implicit arguments are compiled the same way explicit ones are -- it's as if you had:
def f(a: Int, b: Int) = a + b
f(5) // passed the wrong number of arguments
I'm having a bit of trouble seeing exactly which write method you're calling here -- the json4s library is pretty big and things are overloaded. Can you paste the declared write method you're using? It almost certainly has a signature like this:
def write[T](value: T)(implicit writer: Writer[T]): JValue
If it looks like the above, try including the implicit writer parameter in your method as so:
object JsonHelper {
  def json2Object[O](input: String)(implicit reader: Reader[O]): O = {
    // use the Reader instance rather than an unchecked cast
    reader.read(parse(json4s.string2JsonInput(input)))
  }

  def object2Json[O](input: O)(implicit writer: Writer[O]): String = {
    write(input).toString
  }
}
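To satisfy those implicits, a caller would bring instances into scope, e.g. (a sketch against json4s 3.x's Reader/Writer type classes; the Person class and its single field are invented for illustration):
import org.json4s._

case class Person(name: String)

implicit val personWriter: Writer[Person] = new Writer[Person] {
  def write(p: Person): JValue = JObject("name" -> JString(p.name))
}

implicit val personReader: Reader[Person] = new Reader[Person] {
  def read(value: JValue): Person = value \ "name" match {
    case JString(n) => Person(n)
    case _          => throw new MappingException("expected a name field")
  }
}

val json = JsonHelper.object2Json(Person("Ann"))
val back = JsonHelper.json2Object[Person](json)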
In this example you are dealing with generic types. Scala, like other JVM languages, erases generic type parameters at compile time (which is why the compile-time error message may not mention the generic type at all), so try appending this fragment to the signature of both methods:
(implicit tag: ClassTag[T])
It's similar to your generics example, but with Jackson.
HTH
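A sketch of what that ClassTag-based variant could look like using Jackson directly (jackson-module-scala assumed; JacksonHelper and its methods are hypothetical names mirroring the helper above):
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import scala.reflect.ClassTag

object JacksonHelper {
  private val mapper = new ObjectMapper().registerModule(DefaultScalaModule)

  // the ClassTag survives erasure, so Jackson can be handed the runtime class
  def json2Object[O](input: String)(implicit tag: ClassTag[O]): O =
    mapper.readValue(input, tag.runtimeClass).asInstanceOf[O]

  def object2Json[O](input: O): String =
    mapper.writeValueAsString(input)
}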

Can structural typing work with generics?

I have an interface defined using a structural type like this:
trait Foo {
  def collection: {
    def apply(a: Int): String
    def values(): collection.Iterable[String]
  }
}
I wanted to have one of the implementers of this interface do so using a standard mutable HashMap:
class Bar extends Foo {
  val collection: HashMap[Int, String] = HashMap[Int, String]()
}
It compiles, but at runtime I get a NoSuchMethodException when referring to a Bar instance through a Foo-typed variable. Dumping out the object's methods via reflection, I see that the HashMap's apply method takes an Object due to type erasure, and that there's some crazily renamed generated apply method that does take an int. Is there a way to make generics work with structural types? Note that in this particular case I was able to solve my problem using an actual trait instead of a structural type, and that is overall much cleaner.
The short answer is that the apply method parameter is causing you grief because it requires an implicit conversion of the parameter (Int => Integer). Implicits are resolved at compile time, and the NoSuchMethodException is likely a result of these missing conversions.
Try the values method instead and it should work, since no such conversions are involved.
I've attempted to find a way to make this example work but have had no success so far.
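As a minimal sketch of the working case (the Box class is invented for illustration): a reflective structural call succeeds when the member takes no parameters, because the erased signature on the implementing class matches what the reflective lookup searches for:
import scala.language.reflectiveCalls

class Box {
  def values(): Iterable[String] = List("a", "b")
}

// a structural type: any object with a matching values() method will do
def dump(b: { def values(): Iterable[String] }): Unit =
  b.values().foreach(println)

dump(new Box) // works: no parameters, so no erasure/boxing mismatch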

How to infer types when one is a return value?

I'm trying to define a method which is generic both in its parameter and return type. Basically to make a helper function for JSON serialization to/from case classes.
so I want to write something like this pseudocode:
def post[Request, Response](data: Request): Response = ???

case class A(i: String)
case class B(j: Int)

val result = post[A, B](A("input"))
in this case (assuming no errors) result is of type B.
It's understandable that the compiler can't infer the return value, but I'd like it to infer the Request type. In other words I'd like to write something like
val result = post[B](A("input"))
where the Request type A is inferred from the data parameter, so the user need only specify the return type when calling the function.
I don't know many details of Scala specifically, but in Haskell that ability is enabled by a compiler option called "Functional dependencies", whereby you have a typeclass with two type variables, one of which can be derived from the other - see section 7.4.3 of http://www.haskell.org/ghc/docs/6.6/html/users_guide/type-extensions.html. Obviously you can't just use this feature, since it's in a different language, but knowing what it's called should help you find a solution. For example, Functional dependencies in Scala looks like a good guess; although again, I don't know enough Scala to read that article and then tell you exactly how to answer your original JSON question.
Following on from @amalloy's answer, and the link he provides, the Scala equivalent of what you are trying to achieve would be something like the following:
trait ReqResp[Request, Response] {
  def apply(req: Request): Response
}

def post[Request, Response](data: Request)(implicit rr: ReqResp[Request, Response]): Response = rr(data)

case class A(i: String)
case class B(j: Int)

implicit object reqRespAB extends ReqResp[A, B] {
  def apply(a: A) = B(a.i.toInt)
}

val result = post(A("456"))
This gives the output:
result: B = B(456)
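If you want the caller to write out only the Response type, as the question asks, one common follow-up trick (a sketch building on the same ReqResp instances; it is not part of the original answer) is to split the two type parameters across a small helper class, so that one is given explicitly and the other is inferred:
class PostTo[Response] {
  def apply[Request](data: Request)(implicit rr: ReqResp[Request, Response]): Response = rr(data)
}

// replaces the two-parameter post; only Response is written out
def post[Response] = new PostTo[Response]

val result = post[B](A("456")) // Request = A is inferred: result: B = B(456)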

Designing serialization library in Scala with type classes

I have a system where I need to serialize different kinds of objects to JSON and XML. Some of them are Lift MetaRecords, some are case classes. I wanted to use type classes and create something like:
trait Serializable[T] {
  def serialize[T](obj: T): T
}
And usual implementations for json, xml and open for extension.
The problem I'm facing now is serialization itself. Currently there are different contexts in which objects are serialized. Imagine a news feed system. There are three objects: User, Post (feed element) and Photo. These objects have some properties and can reference each other. Now in some cases I want to serialize an object alone (user settings, preferences, etc.); in other cases I need the related objects to be serialized as well, i.e. a feed: List[Post] plus related photos. In order to do that I need to provide the referenced objects.
My current implementation is bloated with functions taking optional parameters:
def feedAsJson(post: MPost, grp: Option[PrivateGroup], commentsBox: Option[List[MPostComment]] = None): JObject
I thought about implementing some kind of context solution: overload feedAsJson with an implicit context parameter that will provide the necessary data. I don't know yet how I'd like to implement it, as it touches the database, maybe with the cake pattern. Any suggestions very appreciated.
Can't you put the implicits in scope that will create the right kind of serializers that you need? Something to that effect:
def doNothingSerializer[T]: Serializable[T] = ???

implicit def mpostToJson(implicit pgs: Serializable[PrivateGroup],
                         cmts: Serializable[List[MPostComment]]) =
  new Serializable[MPost] {
    def serialize(mpost: MPost): JObject = {
      val privateGroupJson = pgs.serialize(mpost.privateGroup)
      // build the mpost json with privateGroupJson, which would be empty
      ???
    }
  }

// later, where you need to serialize without the inner content:
implicit val privateGroupToJson = doNothingSerializer[PrivateGroup]
implicit val mpostCommentsToJson = doNothingSerializer[List[MPostComment]]
implicitly[Serializable[MPost]].serialize(mpost)
You would need to define default serializable instances in a trait that is then inherited (so that low priority implicits are in scope).
Note that I'm assuming that the trait for Serializable is:
trait Serializable[T] {
  def serialize(t: T): JObject
}
(no [T] method type argument and returns a JObject)
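A sketch of that layering (names and the PrivateGroup field are invented; Lift's JSON AST is assumed): implicits defined directly on an object take priority over those inherited from a trait, so specific instances win over the do-nothing fallback when both are in scope:
trait LowPrioritySerializers {
  // fallback: serialize any value to an empty JSON object
  implicit def doNothingSerializer[T]: Serializable[T] = new Serializable[T] {
    def serialize(t: T): JObject = JObject(Nil)
  }
}

object Serializers extends LowPrioritySerializers {
  // a specific instance defined here beats the inherited fallback
  implicit val privateGroupToJson: Serializable[PrivateGroup] = new Serializable[PrivateGroup] {
    def serialize(g: PrivateGroup): JObject = JObject(List(JField("name", JString(g.name))))
  }
}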
Maybe "Scala Pickling" might help you:
http://lampwww.epfl.ch/~hmiller/pickling
I just watched the presentation.

Scala Implicit Conversion Gotchas

EDIT
OK, @Drexin brings up a good point re: loss of type safety/surprising results when using implicit converters.
How about a less common conversion, where conflicts with Predef implicits would not occur? For example, I'm working with JodaTime (great project!) in Scala. In the same controller package object where my implicits are defined, I have a type alias:
type JodaTime = org.joda.time.DateTime
and an implicit that converts JodaTime to Long (for a DAL built on top of ScalaQuery where dates are stored as Long)
implicit def joda2Long(d: JodaTime) = d.getMillis
Here no ambiguity could exist between Predef and my controller package implicits, and the controller implicits will not leak into the DAL, as that is in a different package scope. So when I do
dao.getHeadlines(articleType, Some(jodaDate))
the implicit conversion to Long is done for me, IMO, safely, and given that date-based queries are used heavily, I save some boilerplate.
Similarly, for str2Int conversions, the controller layer receives servlet URI params as String -> String. There are many cases where the URI contains numeric strings, so when I filter a route to determine whether the String is an Int, I do not want to call stringVal.toInt every time; instead, if the regex passes, let the implicit convert the string value to an Int for me. Altogether it would look like:
implicit def str2Int(s: String) = s.toInt

get("""/([0-9]+)""".r) {
  show(captures(0)) // captures(0) is a String
}

def show(id: Int) = { ... }
In the above contexts, are these valid use cases for implicit conversions, or is it more, always be explicit? If the latter, then what are valid implicit conversion use cases?
ORIGINAL
In a package object I have some implicit conversions defined, one of them a simple String to Int:
implicit def str2Int(s: String) = s.toInt
Generally this works fine: methods that take an Int param but receive a String make the conversion to Int, as do methods whose return type is Int but whose actual returned value is a String.
Great, now in some cases the compiler errors with the dreaded ambiguous implicit:
both method augmentString in object Predef of type (x: String)scala.collection.immutable.StringOps
and method str2Int(s: String)Int
are possible conversion functions from java.lang.String to ?{val toInt: ?}
The case where I know this is happening is when attempting to do manual inline String-to-Int conversions. For example, val i = "10".toInt
My workaround/hack has been to create an asInt helper alongside the implicits in the package object, def asInt(i: Int) = i, used as asInt("10").
So, is implicit best practice implicit (i.e. learn by getting burned), or are there some guidelines to follow so as to not get caught in a trap of one's own making? In other words, should one avoid simple, common implicit conversions and only utilize where the type to convert is unique? (i.e. will never hit ambiguity trap)
Thanks for the feedback, implicits are awesome...when they work as intended ;-)
I think you're mixing two different use cases here.
In the first case, you're using implicit conversions to hide the arbitrary distinction (or arbitrary-to-you, anyway) between different classes in cases where the functionality is identical. The JodaTime-to-Long implicit conversion fits in that category; it's probably safe, and very likely a good idea. I would probably use the enrich-my-library pattern instead, and write
class JodaGivesMS(jt: JodaTime) { def ms = jt.getMillis }
implicit def joda_can_give_ms(jt: JodaTime) = new JodaGivesMS(jt)
and use .ms on every call, just to be explicit. The reason is that units matter here (milliseconds are not microseconds are not seconds are not millimeters, but all can be represented as ints), and I'd rather leave some record of what the units are at the interface, in most cases. getMillis is rather a mouthful to type every time, but ms is not too bad. Still, the conversion is reasonable (if well-documented for people who may modify the code in years to come (including you)).
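On Scala 2.10 and later, the same enrichment can be written as an implicit value class, which also avoids allocating the wrapper (a sketch equivalent to the two definitions above, reusing the JodaTime alias):
implicit class JodaGivesMS(val jt: JodaTime) extends AnyVal {
  def ms: Long = jt.getMillis
}

// usage: jodaDate.ms, still explicit about the unit at the call site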
In the second case, however, you're performing an unreliable transformation between one very common type and another. True, you're doing it in only a limited context, but that transformation is still liable to escape and cause problems (either exceptions or types that aren't what you meant). Instead, you should write those handy routines that you need that correctly handle the conversion, and use those everywhere. For example, suppose you have a field that you expect to be "yes", "no", or an integer. You might have something like
val Rint = """(\d+)""".r

s match {
  case "yes"   => println("Joy!")
  case "no"    => println("Woe!")
  case Rint(i) => println("The magic number is " + i.toInt)
  case _       => println("I cannot begin to describe how calamitous this is")
}
But this code is wrong, because "12414321431243".toInt throws an exception, when what you really want is to say that the situation is calamitous. Instead, you should write code that matches properly:
object Rint {
  val Reg = """(-?\d+)""".r
  def unapply(s: String): Option[Int] = s match {
    case Reg(si) =>
      try { Some(si.toInt) }
      catch { case nfe: NumberFormatException => None }
    case _ => None
  }
}
and use this instead. Now instead of performing a risky and implicit conversion from String to Int, when you perform a match it will all be handled properly, both the regex match (to avoid throwing and catching piles of exceptions on bad parses) and the exception handling even if the regex passes.
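For example, the earlier match becomes (same hypothetical input as above):
s match {
  case "yes"   => println("Joy!")
  case "no"    => println("Woe!")
  case Rint(i) => println("The magic number is " + i) // i is already an Int
  case _       => println("I cannot begin to describe how calamitous this is")
}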
If you have something that has both a string and an int representation, create a new class and then have implicit conversions to each if you don't want your use of the object (which you know can safely be either) to keep repeating a method call that doesn't really provide any illumination.
I try not to convert anything implicitly just to turn one type into another; I use implicit conversions only for the pimp-my-library pattern. Passing a String to a function that takes an Int can be confusing, and there is a huge loss of type safety: if you passed a String to a function that takes an Int by mistake, the compiler could not detect it, as it assumes you want the conversion. So always do type conversions explicitly and only use implicit conversions to extend classes.
edit:
To answer your updated question: for the sake of readability, please use the explicit getMillis. In my eyes, valid use cases for implicits are pimp-my-library, view/context bounds, type classes, manifests, builders... but not being too lazy to write an explicit call to a method.