Implicit parameter resolution from surrounding scope - scala

I'm not a fan of bringing implicit parameters into my code, so where I do use them I want to encapsulate them. I am trying to define an object that both wraps calls to spray-json with exception handling and contains default implicit JsonFormats for each of my model classes. However, the implicit parameters are not resolved unless they are imported into the client (calling) code, which is exactly where I don't want them to be. Here's what I have so far (which doesn't resolve the implicit formatters); is there a way to make this work?
package com.rsslldnphy.json

import com.rsslldnphy.models._
import spray.json._

object Json extends DefaultJsonProtocol {
  implicit val personFormat = jsonFormat1(Person)
  implicit val animalFormat = jsonFormat1(Animal)

  def parse[T](s: String)(implicit reader: JsonReader[T]): Option[T] = {
    try { Some(JsonParser(s).convertTo[T]) }
    catch { case e: DeserializationException => None }
  }
}
NB. a JsonFormat is a type of JsonReader
EDIT: Here's what I've written based on @paradigmatic's second suggestion (which I can't get to work; I still get Cannot find JsonReader or JsonFormat type class for T). Am I missing something?
object Protocols extends DefaultJsonProtocol {
  implicit val personFormat = jsonFormat1(Person)
  implicit val animalFormat = jsonFormat1(Animal)
}

object Json {
  def parse[T](s: String): Option[T] = {
    import Protocols._
    try { Some(JsonParser(s).convertTo[T]) }
    catch { case e: DeserializationException => None }
  }
}
For the record, this is a code snippet that does work, but which I'm trying to avoid as it requires too much of the client code (i.e. it needs to have the implicits in scope):
object Json extends DefaultJsonProtocol {
  implicit val personFormat = jsonFormat1(Person)
  implicit val animalFormat = jsonFormat1(Animal)
}

object ClientCode {
  import Json._
  def person(s: String): Person = JsonParser(s).convertTo[Person]
}

You could declare the implicits in the companion objects:
import spray.json._
import DefaultJsonProtocol._

object Person {
  implicit val personFormat: RootJsonFormat[Person] = jsonFormat1(Person.apply)
}

object Animal {
  implicit val animalFormat: RootJsonFormat[Animal] = jsonFormat1(Animal.apply)
}
The implicit resolution rules are very complex. You can find more information in this blog post. But if the compiler is looking for a typeclass T[A], it will look (sooner or later) for it in the companion object of the class/trait A.
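With the formats declared in the companions, the question's Json.parse needs no import at the call site; a minimal sketch of the client side (assuming the Json object from the question and the companions above):
object ClientCode {
  // No `import Json._` or `import Protocols._` needed: the compiler finds
  // Person.personFormat in Person's companion when resolving JsonReader[Person].
  def person(s: String): Option[Person] = Json.parse[Person](s)
}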
EDIT: If the issue is only a problem of scope "pollution", you could just introduce some braces. With your code example, you could call the function parse as:
package com.rsslldnphy.json

import com.rsslldnphy.models._
import spray.json._

object Json extends DefaultJsonProtocol {
  implicit val personFormat = jsonFormat1(Person)
  implicit val animalFormat = jsonFormat1(Animal)

  def parse[T](s: String)(implicit reader: JsonReader[T]): Option[T] = {
    try { Some(JsonParser(s).convertTo[T]) }
    catch { case e: DeserializationException => None }
  }
}

object JsonFacade {
  def optParse[T](s: String): Option[T] = {
    import Json._
    parse[T](s)
  }
}
Here the implicits "pollute" only the optParse method.
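Note that, as the question's own EDIT shows, a fully generic optParse[T] still needs compile-time evidence of a JsonReader[T], so in practice this kind of facade ends up with either a context bound or one method per concrete type. A minimal per-type sketch that keeps the implicits scoped to the method bodies (assuming the Person and Animal models from the question):
object JsonFacade {
  def optParsePerson(s: String): Option[Person] = {
    import Json._          // implicits visible only inside this method
    parse[Person](s)
  }

  def optParseAnimal(s: String): Option[Animal] = {
    import Json._
    parse[Animal](s)
  }
}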

Related

using enumeratum enum as BSONDocument value does not compile

When I try to wrap my query in a BSONDocument and put my enumeratum enum as the value, it doesn't compile.
For example, my enum:
sealed trait ProcessingStatus extends EnumEntry with UpperSnakecase

object ProcessingStatus extends Enum[ProcessingStatus] with ReactiveMongoBsonEnum[ProcessingStatus] {
  val values: IndexedSeq[ProcessingStatus] = findValues

  case object Processing extends ProcessingStatus
  case object Done extends ProcessingStatus
}
and I have a Play JSON serializer that defines how to serialize it:
object JsonSerialization {
  import reactivemongo.api.bson._

  implicit object ProcessingStatusReader extends BSONReader[ProcessingStatus] {
    override def readTry(bson: BSONValue): Try[ProcessingStatus] = bson match {
      case BSONString(s) => bson.asTry[ProcessingStatus]
      case _             => Failure(new RuntimeException("String value expected"))
    }
  }

  implicit object ProcessingStatusWriter extends BSONWriter[ProcessingStatus] {
    override def writeTry(t: ProcessingStatus): Try[BSONString] = Try(BSONString(t.entryName))
  }

  // Report Serializers
  implicit val ProcessingStatusFormat: Format[ProcessingStatus] = EnumFormats.formats(ProcessingStatus)
  implicit val ReportFormat: OFormat[Report] = Json.format[Report]
}
and now in my DAO this does not compile:
import reactivemongo.play.json.compat.json2bson.{toDocumentReader, toDocumentWriter}
import serializers.JsonSerialization._

def findReport(reportId: String) = {
  val test = BSONDocument("123" -> ProcessingStatus.Processing) // doesn't compile
}
(screenshot of the IDE error omitted)
compilation error:
overloaded method apply with alternatives:
(elms: Iterable[(String, reactivemongo.api.bson.BSONValue)])reactivemongo.api.bson.BSONDocument <and>
(elms: reactivemongo.api.bson.ElementProducer*)reactivemongo.api.bson.BSONDocument
cannot be applied to ((String, enums.ProcessingStatus.Done.type))
val test = BSONDocument("status" -> ProcessingStatus.Done)
An IDE error is not a compilation error (I recommend using sbt and its console to test).
Your code (simplified as below) compiles fine, whatever the IDE is telling you (the IDE is wrong).
import reactivemongo.api.bson._
import scala.util.Try

trait ProcessingStatus {
  def entryName = "foo"
}

object JsonSerialization {
  implicit object ProcessingStatusWriter extends BSONWriter[ProcessingStatus] {
    override def writeTry(t: ProcessingStatus): Try[BSONString] = Try(BSONString(t.entryName))
  }
}

import JsonSerialization._

BSON.write(new ProcessingStatus {})
Note 1: writeTry only implements an abstract method rather than overriding a concrete one, so the override modifier is unnecessary (and can be misleading).
Note 2: Try(..) around a pure value such as BSONString(t.entryName) is over-engineered; rather use Success(..).
Note 3: convenient factories are available, such as val w = BSONWriter[T] { t => ... }.
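Putting notes 2 and 3 together, a minimal sketch of the same writer built with the factory (assuming reactivemongo-bson-api's BSONWriter.apply, which takes a plain T => BSONValue function, as note 3 indicates):
import reactivemongo.api.bson._

// Equivalent writer built from a plain function: no Try/override boilerplate.
implicit val processingStatusWriter: BSONWriter[ProcessingStatus] =
  BSONWriter[ProcessingStatus] { status => BSONString(status.entryName) }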
Edit:
The typeclass BSONWriter (like most typeclasses) is invariant, so having a BSONWriter[T] in the implicit scope does not allow resolving a BSONWriter[U] forSome { U <: T }.
trait ProcessingStatus {
  def entryName: String
}

object ProcessingStatus {
  case object Done extends ProcessingStatus { val entryName = "done" }
}

object JsonSerialization {
  implicit object ProcessingStatusWriter extends BSONWriter[ProcessingStatus] {
    override def writeTry(t: ProcessingStatus): Try[BSONString] = Try(BSONString(t.entryName))
  }
}

import JsonSerialization._

BSON.write(ProcessingStatus.Done)
/*
<console>:32: error: could not find implicit value for parameter writer: reactivemongo.api.bson.BSONWriter[ProcessingStatus.Done.type]
       BSON.write(ProcessingStatus.Done)
*/

// --- BUT ---

BSON.write(ProcessingStatus.Done: ProcessingStatus)
// Success(BSONString(done))
Exposing Done (and the other cases) as ProcessingStatus in the API also works.
import reactivemongo.api.bson._
import scala.util.Try

sealed trait ProcessingStatus {
  def entryName: String
}

object ProcessingStatus {
  val Done: ProcessingStatus = new ProcessingStatus { val entryName = "done" }
}

object JsonSerialization {
  implicit object ProcessingStatusWriter extends BSONWriter[ProcessingStatus] {
    override def writeTry(t: ProcessingStatus): Try[BSONString] = Try(BSONString(t.entryName))
  }
}

import JsonSerialization._

BSON.write(ProcessingStatus.Done)
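Applied back to the question, the same widening makes the BSONDocument line compile while keeping the enumeratum case objects; a sketch, with the type ascription being the only change:
import reactivemongo.api.bson._
import serializers.JsonSerialization._

def findReport(reportId: String) = {
  // Ascribing the case object to the sealed trait lets the invariant
  // BSONWriter[ProcessingStatus] be picked up for the ("status" -> value) pair.
  val test = BSONDocument("status" -> (ProcessingStatus.Processing: ProcessingStatus))
}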

Spray JSON Format and Conversion Error

I have a trait that I wanted to write a typeclass for. This trait is actually a contract for doing JSON to case class conversion and vice versa. The definition of the trait is as below:
trait Converter[T] {
  def convertFromJson(msg: String): Either[ConverterError, T]
  def convertToJson(msg: T): String
}
Now, for one of the case classes that I have, I have defined the implementation like this:
object Converter extends DefaultJsonProtocol {
  implicit object DefaultMessageConversions extends Converter[DefaultMessage] {
    implicit val json = jsonFormat(DefaultMessage, "timestamp")

    def convertFromJson(msg: String): Either[ConverterError, DefaultMessage] = {
      try {
        Right(msg.parseJson.convertTo[DefaultMessage])
      }
      catch {
        case _: Exception => Left(ConverterError("Shit happens"))
      }
    }

    def convertToJson(msg: DefaultMessage) = {
      implicit val writes = Json.writes[DefaultMessage]
      Json.toJson(msg).toString
    }
  }

  def apply[T: Converter]: Converter[T] = implicitly
}
But I ran into some compiler errors when I tried to build my project. I'm not sure what I did wrong.
[error] /Users/joesan/ingestion/src/main/scala/com/my/project/common/JsonMessageConversion.scala:28: could not find implicit value for evidence parameter of type com.my.project.common.JsonMessageConversion.Converter.JF[org.joda.time.DateTime]
[error] implicit val json = jsonFormat(DefaultMessage, "timestamp")
Here is what my case class looks like:
case class DefaultMessage(timestamp: DateTime) extends KafkaMessage {
  def this() = this(DateTime.now(DateTimeZone.UTC))
}
Your DefaultMessage uses org.joda.time.DateTime, and spray-json does not know how to serialize/deserialize it out of the box.
Therefore you need to define a RootJsonFormat[DateTime] and bring it into implicit scope.
Here is an example implementation.
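The linked implementation is not reproduced here; as a rough sketch of the idea (assuming Joda-Time's ISODateTimeFormat for the wire format):
import org.joda.time.{DateTime, DateTimeZone}
import org.joda.time.format.ISODateTimeFormat
import spray.json._

object JodaTimeProtocol extends DefaultJsonProtocol {
  // Writes DateTime as an ISO-8601 string and parses it back.
  implicit object DateTimeFormat extends RootJsonFormat[DateTime] {
    private val formatter = ISODateTimeFormat.dateTime()

    def write(dt: DateTime): JsValue = JsString(formatter.print(dt))

    def read(json: JsValue): DateTime = json match {
      case JsString(s) => formatter.parseDateTime(s).withZone(DateTimeZone.UTC)
      case other       => deserializationError(s"Expected ISO-8601 date string, got $other")
    }
  }
}
With such a format in implicit scope (for example by having Converter extend it, or by importing it), the missing evidence parameter for jsonFormat(DefaultMessage, "timestamp") can be found.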

Unable to find JSONReader for parameterised-typed custom class in test, despite import

I have a custom class as follows
object SafeList {
  def apply[A](x: List[A]): SafeList[A] = if (x == null) EmptyList else HasItems[A](x)
}

sealed abstract class SafeList[+A] extends Product with Serializable {
  def get: List[A]
}

final case class HasItems[+A](x: List[A]) extends SafeList[A] {
  def get = x
}

case object EmptyList extends SafeList[Nothing] {
  def get = Nil
}
And a formatter for the SafeList which looks like this
...
import spray.json.DefaultJsonProtocol._

trait SafeCollectionFormats {
  implicit def safeListFormat[A: RootJsonFormat] = new RootJsonFormat[SafeList[A]] {
    def read(json: JsValue): SafeList[A] = {
      val list: List[A] = listFormat[A].read(json)
      SafeList[A](list)
    }

    def write(sl: SafeList[A]): JsValue =
      listFormat[A].write(sl.get)
  }
}

object SafeCollectionFormats extends SafeCollectionFormats
And that compiles.
But when I add a test for my formatter, like so....
...
import spray.json.DefaultJsonProtocol._
import marshalling.SafeCollectionFormats._
...

"Unmarshalling a json array with items" should "produce a SafeList with items" in {
  val json: JsValue = JsArray(JsString("a"), JsString("b"), JsString("c"))
  val list = List("a", "b", "c")
  val result = json.convertTo[SafeList[String]]
  assertResult(list)(result)
}
...
I get the following compilation error
Error:(14, 32) Cannot find JsonReader or JsonFormat type class for myapp.types.SafeList[String]
val result = json.convertTo[SafeList[String]]
^
I think there may be something in this answer to help me, but it's a bit advanced for me. I thought my implicit safeListFormat was the JsonReader for my SafeList, and I'm importing it into my Spec. I don't know if the parameterised types are confusing things.
Any ideas what I'm doing wrong?
Edit: Whilst my test at the moment creates a SafeList of Strings, the ultimate intention is to create a SafeList of my domain objects. I will need to add a second test that builds a JsArray of MyObjects, and so the type of A, in JSON terms, will be different. My SafeList needs to cope with both simple objects like Strings and domain objects. I think I might raise this as a second SO question, but I mention it here for context.
It works for me with just one little change: I made SafeCollectionFormats extend DefaultJsonProtocol.
I also had to change the context bound for your safeListFormat to [A: JsonFormat].
trait SafeCollectionFormats extends DefaultJsonProtocol {
  implicit def safeListFormat[A: JsonFormat] = new RootJsonFormat[SafeList[A]] {
    def read(json: JsValue): SafeList[A] = {
      val list: List[A] = listFormat[A].read(json)
      SafeList[A](list)
    }

    def write(sl: SafeList[A]): JsValue =
      listFormat[A].write(sl.get)
  }
}

object SafeCollectionFormats extends SafeCollectionFormats
Hope this helps you.
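For what it's worth, the context-bound change matters because DefaultJsonProtocol provides only a JsonFormat (not a RootJsonFormat) for String and the other primitives, so safeListFormat[A: RootJsonFormat] cannot be instantiated for A = String; a quick check:
import spray.json._
import spray.json.DefaultJsonProtocol._

// Compiles: a plain JsonFormat[String] is in scope.
implicitly[JsonFormat[String]]

// Does not compile: spray-json defines no RootJsonFormat for primitives,
// since a bare string or number is not a legal root JSON value there.
// implicitly[RootJsonFormat[String]]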

Could not find implicit value for parameter x

Just when I thought I understood the basics of Scala's type system... :/
I'm trying to implement a class that reads the contents of a file and outputs a set of records. A record might be a single line, but it could also be a block of bytes, or anything. So what I'm after is a structure that allows the type of Reader to imply the type of the Record, which in turn will imply the correct Parser to use.
This structure works as long as MainApp.records(f) only returns one type of Reader. As soon as it can return more, I get this error:
could not find implicit value for parameter parser
I think the problem lies with the typed trait definitions at the top, but I cannot figure out how to fix the issue...
// Core traits
trait Record[T]

trait Reader[T] extends Iterable[Record[T]]

trait Parser[T] {
  def parse(r: Record[T]): Option[Int]
}

// Concrete implementations
class LineRecord[T] extends Record[T]

class FileReader[T](f: File) extends Reader[T] {
  val lines = Source.fromFile(f).getLines()

  def iterator: Iterator[LineRecord[T]] =
    new Iterator[LineRecord[T]] {
      def next() = new LineRecord[T]
      def hasNext = lines.hasNext
    }
}

trait TypeA
object TypeA {
  implicit object TypeAParser extends Parser[TypeA] {
    def parse(r: Record[TypeA]): Option[Int] = ???
  }
}

trait TypeB
object TypeB {
  implicit object TypeBParser extends Parser[TypeB] {
    def parse(r: Record[TypeB]): Option[Int] = ???
  }
}

// The "app"
object MainApp {
  def process(f: File) =
    records(f) foreach { r => parse(r) }

  def records(f: File) = {
    if (true)
      new FileReader[TypeA](f)
    else
      new FileReader[TypeB](f)
  }

  def parse[T](r: Record[T])(implicit parser: Parser[T]): Option[Int] =
    parser.parse(r)
}
First off, you must import the implicit objects in order to use them:
import TypeA._
import TypeB._
That's not enough though. It seems like you're trying to apply implicits dynamically. That's not possible; they have to be found at compile time.
If you import the objects as above and change records so that the compiler can infer the correct type parameter, it will run fine:
def records(f: File) = new FileReader[TypeA](f)
But then it may not be what you were looking for ;)
The problem is that the return type of your records method is basically FileReader[_] (since it can return either FileReader[TypeA] or FileReader[TypeB]), and you don't provide an implicit argument of type Parser[Any]. If you remove the if-expression, the return type is inferred as FileReader[TypeA], which works fine. I'm not sure what you're trying to do, but obviously the compiler can't select an implicit argument based on a type that is only known at runtime.
1) Using a type that has an implicit inside as a type parameter does not bind that implicit to the host type. To do that, change the objects to traits and mix them in instead of generalizing (type-parameterizing):
def records(f: File) = {
  if (true)
    new FileReader(f) with TypeA
  else
    new FileReader(f) with TypeB
}
2) The parser should be in scope of the function that calls parse, so you could try something like this:
def process(f: File) = {
  val reader = records(f)
  import reader._
  reader foreach { r => parse(r) }
}
Plan B) A simpler alternative is to define type-parameter-specific implicit methods inside MainApp (or some mixed-in trait), but it will only work if TypeA/TypeB is known at compile time, so that records can return a concrete type:
implicit class TypeAParser(r: Record[TypeA]) {
  def parse: Option[Int] = ???
}

implicit class TypeBParser(r: Record[TypeB]) {
  def parse: Option[Int] = ???
}

def process[T <: TypeAorB](f: File) =
  records[T](f).foreach(_.parse)

def records[T <: TypeAorB](f: File) = new FileReader[T](f)
Here is, I think, the full set of modifications you need to make to get where I think you want to go.
import scala.io.Source
import java.io.File
import reflect.runtime.universe._

// Core traits
trait Record[+T]

trait Reader[+T] extends Iterable[Record[T]]

trait Parser[-T] {
  def parse(r: Record[T]): Option[Int]
}

// Concrete implementations [unmodified]
class LineRecord[T] extends Record[T]

class FileReader[T](f: File) extends Reader[T] {
  val lines = Source.fromFile(f).getLines()

  def iterator: Iterator[LineRecord[T]] =
    new Iterator[LineRecord[T]] {
      def next() = new LineRecord[T]
      def hasNext = lines.hasNext
    }
}

sealed trait Alternatives

case class TypeA() extends Alternatives
object TypeA {
  implicit object TypeAParser extends Parser[TypeA] {
    def parse(r: Record[TypeA]): Option[Int] = ???
  }
}

case class TypeB() extends Alternatives
object TypeB {
  implicit object TypeBParser extends Parser[TypeB] {
    def parse(r: Record[TypeB]): Option[Int] = ???
  }
}

class ParseAlternator(parserA: Parser[TypeA], parserB: Parser[TypeB]) extends Parser[Alternatives] {
  def parse(r: Record[Alternatives]): Option[Int] = r match {
    case x: Record[TypeA @unchecked] if typeOf[Alternatives] =:= typeOf[TypeA] => parserA.parse(x)
    case x: Record[TypeB @unchecked] if typeOf[Alternatives] =:= typeOf[TypeB] => parserB.parse(x)
  }
}

object ParseAlternator {
  implicit def parseAlternator(implicit parserA: Parser[TypeA], parserB: Parser[TypeB]): Parser[Alternatives] =
    new ParseAlternator(parserA, parserB)
}

// The "app"
object MainApp {
  import ParseAlternator._

  def process(f: File) =
    records(f) foreach { r => parse(r) }

  def records(f: File): Reader[Alternatives] = {
    if (true)
      new FileReader[TypeA](f)
    else
      new FileReader[TypeB](f)
  }

  def parse[T](r: Record[T])(implicit parser: Parser[T]): Option[Int] =
    parser.parse(r)
}
The gist of it is: all of this would be completely classical if only your parse instance did not have to pattern-match on a generic type but dealt directly with an Alternatives instead.
It is this limitation (inherited from the JVM) that Scala can't properly pattern-match on an object of a parametric type which forces the reflection and typeOf usage. Without it, you would just have type alternatives for your content (TypeA, TypeB), which you would add to a sealed trait, and which you would dispatch on in an implicit that produces a Parser for their supertype.
Of course this isn't the only solution, it's just what I think is the meeting point of what's closest to what you're trying to do, with what's most idiomatic.

Scala: Multiple implicit conversions with same name

Using Scala 2.10.3, my goal is to make the following work:
object A {
  implicit class Imp(i: Int) {
    def myPrint() {
      println(i)
    }
  }
}

object B {
  implicit class Imp(i: String) {
    def myPrint() {
      println(i)
    }
  }
}

import A._
import B._

object MyApp extends App {
  3.myPrint()
}
This fails with
value myPrint is not a member of Int
If I give A.Imp and B.Imp different names (for example A.Imp1 and B.Imp2), it works.
Diving a bit deeper into it, there seems to be the same problem with implicit conversions.
This works:
object A {
  implicit def Imp(i: Int) = new {
    def myPrint() {
      println(i)
    }
  }

  implicit def Imp(i: String) = new {
    def myPrint() {
      println(i)
    }
  }
}

import A._

object MyApp extends App {
  3.myPrint()
}
Whereas this doesn't:
object A {
  implicit def Imp(i: Int) = new {
    def myPrint() {
      println(i)
    }
  }
}

object B {
  implicit def Imp(i: String) = new {
    def myPrint() {
      println(i)
    }
  }
}

import A._
import B._

object MyApp extends App {
  3.myPrint()
}
Why? Is this a bug in the Scala compiler? I need this scenario, since my objects A and B derive from the same trait (with a type parameter), which defines the implicit conversion for its type parameter. In this trait, I can only give one name to the implicit conversion. I want to be able to import more than one of these objects into my scope. Is there a way to do that?
Edit: I can't give the implicit classes different names, since the examples above are only a distilled version of the problem. My actual code looks more like:
trait P[T] {
  implicit class Imp(i: T) {
    def myPrint() {
      ...
    }
  }
}

object A extends P[Int]
object B extends P[String]

import A._
import B._
The implicits just have to be available as a simple name, so you can rename on import.
Just to verify:
scala> import A._ ; import B.{ Imp => BImp, _ }
import A._
import B.{Imp=>BImp, _}
scala> 3.myPrint
3
Actually, it works if you replace
import A._
import B._
with
import B._
import A._
What happens, I think, is that A.Imp is shadowed by B.Imp because it has the same name. Apparently shadowing applies to the name only and does not take the signature into account.
So if you import A then B, then only B.Imp(i: String) will be available, and if you import B then A, then only A.Imp(i: Int) will be available.
If you need to use both A.Imp and B.Imp, you must rename one of them.
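A small sketch of that rename applied to the original example, keeping both extensions usable in one scope:
import A._
import B.{Imp => BImp}

object MyApp extends App {
  3.myPrint()       // resolves via A.Imp
  "hello".myPrint() // resolves via the renamed B.Imp
}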
In case anyone else runs into this issue, there is a partial workaround, which I found here:
https://github.com/lihaoyi/scalatags/blob/3dea48c42c5581329e363d8c3f587c2c50d92f85/scalatags/shared/src/main/scala/scalatags/generic/Bundle.scala#L120
That code was written by Li Haoyi, so you can be pretty confident that no better solution exists...
Essentially, one can still use traits to define methods in terms of each other, but it will require boilerplate to copy those implicits into unique names. This example might be easier to follow:
trait Chainable

object Chainable {
  implicit val _chainableFromInt     = IntChainable.chainable _
  implicit val _chainableFromIntTrav = IntChainable.traversable _
  implicit val _chainableFromIntOpt  = IntChainable.optional _
  implicit val _chainableFromIntTry  = IntChainable.tried _

  implicit val _chainableFromDom     = DomChainable.chainable _
  implicit val _chainableFromDomTrav = DomChainable.traversable _
  implicit val _chainableFromDomOpt  = DomChainable.optional _
  implicit val _chainableFromDomTry  = DomChainable.tried _

  private object IntChainable extends ImplChainables[Int] {
    def chainable(n: Int) = Constant(n)
  }

  private object DomChainable extends ImplChainables[dom.Element] {
    def chainable(e: Element) = Insertion(e)
  }

  private trait ImplChainables[T] {
    def chainable(t: T): Chainable

    def traversable(trav: TraversableOnce[T]): Chainable =
      SeqChainable(trav.map(chainable).toList)

    def optional(opt: Option[T]): Chainable =
      opt match {
        case Some(t) => chainable(t)
        case None    => NoneChainable
      }

    def tried(tried: Try[T]): Chainable =
      optional(tried.toOption)
  }
}
In other words, never write:
trait P[T] {
  implicit def foo(i: T) = ...
}

object A extends P[X]
Because defining implicits in type-parameterized traits will lead to these naming conflicts. Incidentally, the trait mentioned in the link above is parameterized over many types, but the idea there is that none of the implementations of that trait are ever needed in the same scope (JsDom vs Text, and all._ vs short._, for those familiar with Scalatags).
I also recommend reading: http://www.lihaoyi.com/post/ImplicitDesignPatternsinScala.html
It does not address this issue specifically but is an excellent summary of how to use implicits.
However, putting all these pieces together, this still seems to be an issue:
trait ImplChainables[AnotherTypeClass] {
  type F[A] = A => AnotherTypeClass

  implicit def transitiveImplicit[A: F](t: A): Chainable
  implicit def traversable[A: F](trav: TraversableOnce[A]): Chainable = ...
}
What this trait would allow is:
object anotherImpl extends ImplChainables[AnotherTypeClass] { ... }

import anotherImpl._

implicit val string2another: String => AnotherTypeClass = ...

Seq("x"): Chainable
Because of the type parameter and the context bound (implicit parameter), those cannot be eta-expanded (i.e. Foo.bar _) into function values. The implicit-parameter part has been fixed in Dotty: http://dotty.epfl.ch/blog/2016/12/05/implicit-function-types.html
I do not know if a complete solution is possible; needless to say, this is a complex problem. It would be nice if same-name implicits just worked and the whole issue could be avoided. In any case, adding an unimport keyword would make much more sense than turning off implicits by shadowing them.