Scala: bring a value into implicit scope without naming it

When using spray-json, I need to bring a JsonFormat[A] into implicit scope for every domain type A that I want to serialize.
The recommended approach is to create a custom object with all the implicits as fields:
object MyJsonProtocol extends DefaultJsonProtocol {
implicit val colorFormat = jsonFormat4(Color)
}
import MyJsonProtocol._
My app has a great many domain types, some of which have long names. My MyJsonProtocol is getting long and unreadable:
object MyJsonProtocol extends DefaultJsonProtocol {
... // many more here
implicit val someClassWithALongNameFormat = jsonFormat4(SomeClassWithALongName)
implicit val someClassWithALongNameInnerFormat = jsonFormat4(SomeClassWithALongNameInner)
implicit val someClassWithALongNameVariantBFormat = jsonFormat4(SomeClassWithALongNameVariantB)
... // many more here
}
The long val names have various problems:
- they feel redundant (the names are never read)
- they make my lines very long
- they introduce a copy/paste risk that the name of the format won't match the type it formats
- they leave the right-hand sides misaligned, which hides the common pattern here
Is there any way to bring an object into implicit scope without naming it? Something like this would be much neater:
object MyJsonProtocol extends DefaultJsonProtocol {
... // many more here
implicit val _ = jsonFormat4(SomeClassWithALongName)
implicit val _ = jsonFormat4(SomeClassWithALongNameInner)
implicit val _ = jsonFormat4(SomeClassWithALongNameVariantB)
... // many more here
}
... but Scala doesn't allow multiple fields named "_".
Is there any way to bring these formats into implicit scope without naming them all? Is there another way to use spray-json that avoids this issue?

Normally, I define typeclass instances in the companion objects:
case class Foo()
object Foo {
implicit val jsonFormatter: JsonFormat[Foo] = new JsonFormat[Foo] { ... }
}
case class Bar()
object Bar {
implicit val jsonFormatter: JsonFormat[Bar] = new JsonFormat[Bar] { ... }
}
I don't have to import anything, as companion objects are by default included in the implicit search scope, and the implicit members can all have the same names.
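Applied to the spray-json case in the question, a minimal sketch of this pattern might look like the following (the explicit RootJsonFormat annotation and the Color.apply _ eta-expansion are my additions; the Color fields are assumed from the jsonFormat4 call above):
import spray.json._
import DefaultJsonProtocol._

case class Color(name: String, red: Int, green: Int, blue: Int)

object Color {
  // Found through the implicit scope of JsonFormat[Color] (the companion of Color),
  // so call sites only need import spray.json._, not a custom protocol object.
  implicit val colorFormat: RootJsonFormat[Color] = jsonFormat4(Color.apply _)
}

// Color("CadetBlue", 95, 158, 160).toJson  // resolves colorFormat with no extra import
With the format living in the companion, the big MyJsonProtocol object and its long val names can disappear entirely, and each format sits next to the type it serializes.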

Related

Scala resolving Class/Type at runtime + type class constraint

I have a generic function that requires an implicit HasMoveCapability instance for the type T (type class pattern):
import scala.reflect.runtime.universe.{TypeTag, typeTag}

trait HasMoveCapability[T]
def doLogic[T: TypeTag: HasMoveCapability](): Unit = println(typeTag[T].tpe)
Then I have these two classes, which provide implicit HasMoveCapability instances in their companion objects:
case class Bird()
object Bird {
implicit val hasMoveCapability = new HasMoveCapability[Bird]{}
}
case class Lion()
object Lion {
implicit val hasMoveCapability = new HasMoveCapability[Lion]{}
}
My question is the following:
I need to resolve the type (Lion or Bird) at runtime, depending on an argument, and call the function doLogic with the right type.
I tried
val input: String = "bird" // known at runtime
val resolvedType: TypeTag[_] = input match {
case "bird" => typeTag[Bird]
case "lion" => typeTag[Lion]
}
doLogic()(resolvedType) // doesn't compile
// `Unspecified value parameters: hasMoveCapability$T$1: HasMoveCapability[NotInferredT]`
What I would like to do is something like:
val resolvedType: TypeTag[_: HasMoveCapability] = input match{...}
The workaround that I am using so far is to call the function in the pattern match:
input match {
case "bird" => doLogic[Bird]
case "lion" => doLogic[Lion]
}
But with many such functions, the pattern match gets duplicated and hard to maintain.
I am open to changing the design if you have any suggestions :D
You should describe your problem better. Currently your type class HasMoveCapability doesn't seem to do anything useful. What you are doing seems like a hard way to transform the string "bird" into "Bird" and "lion" into "Lion".
If you control the code of doLogic, you don't seem to need TypeTag. TypeTag / ClassTag is a way to persist information from compile time to runtime. You seem to be going in the reverse direction.
Type classes / implicits are resolved at compile time. You can't resolve something at compile time based on runtime information (there is no time machine taking you from the future, i.e. runtime, to the past, i.e. compile time). Most probably you need ordinary pattern matching rather than type classes (TypeTag, HasMoveCapability).
In principle you can run the compiler at runtime; then you'll have a new compile time inside the runtime, and you'll be able to infer types, resolve implicits, etc.:
import scala.tools.reflect.ToolBox
import scala.reflect.runtime.currentMirror
import scala.reflect.runtime.universe.{TypeTag, typeTag}
object App {
trait HasMoveCapability[T]
def doLogic[T: TypeTag: HasMoveCapability](): Unit = println(typeTag[T].tpe)
case class Bird()
object Bird {
implicit val hasMoveCapability = new HasMoveCapability[Bird]{}
}
case class Lion()
object Lion {
implicit val hasMoveCapability = new HasMoveCapability[Lion]{}
}
val input: String = "bird" // known at runtime
val tb = currentMirror.mkToolBox()
tb.eval(tb.parse(s"import App._; doLogic[${input.capitalize}]")) //App.Bird
def main(args: Array[String]): Unit = ()
}
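If the point is just to avoid repeating the pattern match in front of every doLogic-style function, the "ordinary pattern matching" route can also be factored so the match is written exactly once. A minimal sketch, with hypothetical names (Program, withResolved) that are not in the original question or answer:
import scala.reflect.runtime.universe.{TypeTag, typeTag}

object ResolveSketch {
  trait HasMoveCapability[T]

  case class Bird()
  object Bird {
    implicit val hasMoveCapability: HasMoveCapability[Bird] = new HasMoveCapability[Bird] {}
  }
  case class Lion()
  object Lion {
    implicit val hasMoveCapability: HasMoveCapability[Lion] = new HasMoveCapability[Lion] {}
  }

  // each doLogic-style operation is written once, as a Program
  trait Program {
    def apply[T: TypeTag: HasMoveCapability](): Unit
  }

  // the single string-to-type pattern match, also written once
  def withResolved(input: String)(p: Program): Unit = input match {
    case "bird" => p.apply[Bird]()
    case "lion" => p.apply[Lion]()
  }

  val doLogic: Program = new Program {
    def apply[T: TypeTag: HasMoveCapability](): Unit = println(typeTag[T].tpe)
  }

  def main(args: Array[String]): Unit =
    withResolved("bird")(doLogic) // prints the resolved type, e.g. ResolveSketch.Bird
}
Adding a new type means touching only withResolved; adding a new operation means writing one more Program, with no new pattern match.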

When doing implicit resolution with type parameters, why does val placement matter?

In one file, I have:
trait JsonSchema[T] {
val propertyType: String
override def toString: String = propertyType
}
object JsonSchema {
implicit def stringSchema: JsonSchema[String] = new JsonSchema[String] {
override val propertyType: String = "string"
}
implicit def intSchema: JsonSchema[Int] = new JsonSchema[Int] {
override val propertyType: String = "integer"
}
implicit def booleanSchema: JsonSchema[Boolean] = new JsonSchema[Boolean] {
override val propertyType: String = "boolean"
}
}
In my main file:
case class MetaHolder[T](v: T)(implicit val meta: JsonSchema[T])
object JsonSchemaExample extends App {
println(MetaHolder(3).meta.toString)
println(MetaHolder("wow").meta.toString)
}
That works hunky-dory. Now suppose I do this instead:
case class MetaHolder[T](v: T) {
val meta: JsonSchema[T] = implicitly[JsonSchema[T]]
}
It no longer compiles. Why?
My goal is to modify the anonymous Endpoint classes in the Scala Finch library by adding a val meta to everything. I've been able to do this without any fancy business so far, but now I want to do some fancy implicit resolution with shapeless to provide a JsonSchema definition for arbitrary case classes. My question is how to do this while maintaining backward compatibility: provide the JsonSchema meta feature for people who want to opt in, without changing the compilation burden for anyone who does not want to use meta.
If I instead go the first route, with an added implicit parameter, wouldn't that require everyone to add a special import? Or am I missing something, and would backward compatibility still be maintained?
There is a big difference between an implicit x: X among the parameters and implicitly[X] inside the body.
When you say implicitly[X], this means "check now whether there is an implicit X in the current scope".
When you say def foo(...)(implicit x: X) = ..., this means "check later, when foo is called, that there will be an implicit X in the scope of the call site (and for now, inside foo, just assume without checking that there is)".
class Foo(...)(implicit x: X) is similar to the latter: "check when the constructor is called that there will be an implicit X".
Regarding whether users have to import or not: if you put implicits for type X into the companion object of X, they will be found automatically (implicits for X[Y] should be put into the companion object of either X or Y). If you put them somewhere else, they have to be imported into the current scope.
In order for implicitly[JsonSchema[T]] to compile, there must be a JsonSchema[T] in the implicit scope, which means that there must be a JsonSchema[T] (or something implicitly convertible to a JsonSchema[T]) passed through as an implicit argument, as you had with:
case class MetaHolder[T](v: T)(implicit val meta: JsonSchema[T])
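To make the contrast concrete, here is a minimal sketch assuming the JsonSchema trait and instances from the question; the parameter name schema is mine:
case class MetaHolder[T](v: T)(implicit schema: JsonSchema[T]) {
  // Compiles: the constructor's implicit parameter puts a JsonSchema[T] in scope
  // for the whole body, so both lines below resolve to the same instance.
  val meta: JsonSchema[T] = schema
  // val meta2: JsonSchema[T] = implicitly[JsonSchema[T]]
}
Without that parameter list, the body has no JsonSchema[T] to find for an abstract T, which is exactly why the second version in the question does not compile.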

Need to Reference Trait on Companion Object From Trait on Case Class

I need to access a companion object with a specified trait -- from a trait intended for case classes. I am almost certain that the Scala reflection library can accomplish this, but I haven't quite been able to piece it together.
I created test code below that requires one section of ??? to be filled in with some reflection magic. The code compiles and runs as is -- with a notification due to the missing functionality.
Some related answers that I have seen on Stack Overflow were for 2.10. Scala 2.12 compatible, please.
import scala.reflect.{ClassTag, classTag}
//for companion object
//accesses Fields of the associated case class to ensure the correctness
//note: abstract class -- not a trait due to issues using ClassTag on a trait
abstract class SupportsField1Companion[T: ClassTag] {
//gets the names of all Fields on the associated case class
val fieldNamesOfInstancedClass: Array[String] =
classTag[T].runtimeClass.getDeclaredFields.map(_.getName)
//prints the name and fields of the associated case class -- plus extra on success
def printFieldNames(extra: String = ""): Unit = {
val name = classTag[T].runtimeClass.getCanonicalName
val fields = fieldNamesOfInstancedClass.reduceLeft(_ + ", " + _)
println(s"Fields of $name: $fields" + extra)
}
}
//for case classes
//IMPORTANT -- please do not parameterize this if possible
trait SupportsField1 {
//some data for printing
val field1: String = this.getClass.getCanonicalName + ": field1"
//should get a reference to the associated companion object as an instance of SupportsField1Companion
def getSupportsFieldsCompanion: SupportsField1Companion[this.type] = //this.type may be wrong
??? //TODO reflection magic required -- need functionality to retrieve companion object cast as type
//calls a function on the associated Companion class
def callPrintFuncOnCompanion(): Unit =
getSupportsFieldsCompanion.printFieldNames(s" -- from ${this.getClass.getCanonicalName}")
}
//two case classes extending SupportsField1, whose companion objects extend SupportsField1Companion, to ensure data is accessed correctly
object ExampleA extends SupportsField1Companion[ExampleA] {}
case class ExampleA() extends SupportsField1 {
val fieldA: String = "ExampleA: fieldA"
}
object ExampleB extends SupportsField1Companion[ExampleB] {}
case class ExampleB() extends SupportsField1 {
val fieldB: String = "ExampleB: fieldB"
}
object Run extends App {
//create instanced classes and print some test data
val exampleA = ExampleA()
println(exampleA.field1) //prints "ExampleA: field1" due to trait SupportsField1
println(exampleA.fieldA) //prints "ExampleA: fieldA" due to being of class ExampleA
val exampleB = ExampleB()
println(exampleB.field1) //prints "ExampleB: field1" due to trait SupportsField1
println(exampleB.fieldB) //prints "ExampleB: fieldB" due to being of class ExampleB
//via SupportsField1Companion on the companion objects,
//call a function on each companion object to show that each companion is associated with the correct case class
ExampleA.printFieldNames() //prints "Fields of ExampleA: fieldA, field1"
ExampleB.printFieldNames() //prints "Fields of ExampleB: fieldB, field1"
//test access of printFieldNames on companion object from instanced class
try {
exampleA.callPrintFuncOnCompanion() //on success, prints "Fields of ExampleA: fieldA, field1 -- from ExampleA"
exampleB.callPrintFuncOnCompanion() //on success, prints "Fields of ExampleB: fieldB, field1 -- from ExampleB"
} catch {
case _: NotImplementedError => println("!!! Calling function on companion(s) failed.")
}
}
There are lots of ways you can do this, but the following is probably one of the simplest that doesn't involve making assumptions about how Scala's companion object class name mangling works:
def getSupportsFieldsCompanion: SupportsField1Companion[this.type] =
scala.reflect.runtime.ReflectionUtils.staticSingletonInstance(
this.getClass.getClassLoader,
this.getClass.getCanonicalName
).asInstanceOf[SupportsField1Companion[this.type]]
This works as desired, but I'd probably type it as SupportsField1Companion[_], and ideally I'd avoid having public methods on SupportsField1 that refer to SupportsField1Companion. Actually, ideally I'd avoid this approach altogether, but if you're committed, I think the ReflectionUtils solution above is probably reasonable.
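For what it's worth, here is a sketch of the reflection-free alternative hinted at above: have each case class name its companion explicitly. The SupportsField2 trait and the companion member are hypothetical names, not part of the question or the answer.
//for case classes willing to state their companion explicitly -- no reflection involved
trait SupportsField2 {
  //each concrete case class points at its own companion object
  def companion: SupportsField1Companion[_]
  def callPrintFuncOnCompanion(): Unit =
    companion.printFieldNames(s" -- from ${this.getClass.getCanonicalName}")
}
case class ExampleC() extends SupportsField2 {
  val fieldC: String = "ExampleC: fieldC"
  def companion: SupportsField1Companion[_] = ExampleC
}
object ExampleC extends SupportsField1Companion[ExampleC] {}
//ExampleC().callPrintFuncOnCompanion() prints the fields of ExampleC via its companion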

Importing generic implicits from class instances

I'm trying to make a generic implicit provider which can create an implicit value for a given type, something along the lines of:
trait Evidence[T]
class ImplicitProvider[T] {
class Implementation extends Evidence[T]
implicit val evidence: Evidence[T] = new Implementation
}
To use this implicit, I create a val provider = new ImplicitProvider[T] instance where necessary and import from it with import provider._. This works fine as long as there is just one instance. However, sometimes implicits for several types are needed in one place:
case class A()
case class B()
class Test extends App {
val aProvider = new ImplicitProvider[A]
val bProvider = new ImplicitProvider[B]
import aProvider._
import bProvider._
val a = implicitly[Evidence[A]]
val b = implicitly[Evidence[B]]
}
And this fails to compile with could not find implicit value for parameter and not enough arguments for method implicitly errors.
If I use implicit vals from providers directly, everything starts to work again.
implicit val aEvidence = aProvider.evidence
implicit val bEvidence = bProvider.evidence
However I'm trying to avoid importing individual values, as there are actually several implicits inside each provider and the goal is to abstract them if possible.
Can this be achieved somehow or do I want too much from the compiler?
The issue is that when you import from both objects, you're bringing in two entities that have colliding names: evidence in aProvider and evidence in bProvider. The compiler cannot disambiguate those, both because of how it's implemented, and because it'd be a bad idea for implicits, which can already be arcane, to be able to do things that cannot be done explicitly (disambiguating between clashing names).
What I don't understand is what the point of ImplicitProvider is. You can pull the Implementation class out to the top level and have an object somewhere that holds the implicit vals.
class Implementation[T] extends Evidence[T]
object Evidence {
implicit val aEvidence: Evidence[A] = new Implementation[A]
implicit val bEvidence: Evidence[B] = new Implementation[B]
}
// Usage:
import Evidence._
implicitly[Evidence[A]]
implicitly[Evidence[B]]
Now, there is no name clash.
If you need to have an actual ImplicitProvider, you can instead do this:
class ImplicitProvider[T] { ... }
object ImplicitProviders {
implicit val aProvider = new ImplicitProvider[A]
implicit val bProvider = new ImplicitProvider[B]
implicit def ImplicitProvider2Evidence[T: ImplicitProvider]: Evidence[T]
= implicitly[ImplicitProvider[T]].evidence
}
// Usage
import ImplicitProviders._
// ...
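Putting the second option together, a self-contained sketch (reusing Evidence, ImplicitProvider, A and B from the question and the bridging implicit from the answer; only the surrounding Test object is mine):
trait Evidence[T]
class ImplicitProvider[T] {
  class Implementation extends Evidence[T]
  implicit val evidence: Evidence[T] = new Implementation
}
case class A()
case class B()
object ImplicitProviders {
  implicit val aProvider: ImplicitProvider[A] = new ImplicitProvider[A]
  implicit val bProvider: ImplicitProvider[B] = new ImplicitProvider[B]
  // One bridging implicit turns whichever provider is in scope into the requested Evidence.
  implicit def ImplicitProvider2Evidence[T: ImplicitProvider]: Evidence[T] =
    implicitly[ImplicitProvider[T]].evidence
}
object Test extends App {
  import ImplicitProviders._
  implicitly[Evidence[A]] // resolved via aProvider, no name clash
  implicitly[Evidence[B]] // resolved via bProvider
}
The providers keep distinct names, so importing them together is fine, and the single bridging def is what actually satisfies the Evidence lookups.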

Scala - Add member variable to class from outside

Is it possible to add a member variable to a class from outside the class? (Or mimic this behavior?)
Here's an example of what I'm trying to do. I already use an implicit conversion to add additional functions to RDD, so I added a variable to ExtendedRDDFunctions. I'm guessing this doesn't work because the variable is lost after the conversion in an rdd.setMember(string) call.
Is there any way to get this kind of functionality? Is this the wrong approach?
implicit def toExtendedRDDFunctions(rdd: RDD[Map[String, String]]): ExtendedRDDFunctions = {
new ExtendedRDDFunctions(rdd)
}
class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) extends Logging with Serializable {
var member: Option[String] = None
def getMember(): String = member.getOrElse("")
def setMember(field: String): Unit = {
member = Some(field)
}
def queryForResult(query: String): String = {
// Uses member here
}
}
EDIT:
I am using these functions as follows: I first call rdd.setMember("state"), then rdd.queryForResult(expression).
Because the implicit conversion is applied each time you invoke a method defined in ExtendedRDDFunctions, there is a new instance of ExtendedRDDFunctions created for every call to setMember and queryForResult. Those instances do not share any member variables.
You have basically two options:
Maintain a Map[RDD, String] in ExtendedRDDFunctions's companion object, which you use in setMember to assign the member value to an RDD. This is the evil option, as you introduce global state and open the door to a whole range of errors.
Create a wrapper class that contains your member value and is returned by the setMember method:
case class RDDWithMember(rdd: RDD[Map[String, String]], member: String) extends RDD[Map[String, String]] {
def queryForResult(query: String): String = {
// Uses member here
}
// methods of the RDD interface, just delegate to rdd
}
implicit class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) {
def setMember(field: String): RDDWithMember = {
RDDWithMember(rdd, field)
}
}
Besides avoiding the global state, this approach is also more type safe because you cannot call queryForResult on instances that do not have a member. The only downsides are that you have to delegate all members of RDD and that queryForResult is not defined on RDD itself.
The first issue can probably be addressed with some macro magic (search for "delegate" or "proxy" and "macro").
The latter issue can be resolved by defining an additional extension method in ExtendedRDDFunctions that checks whether the RDD is an RDDWithMember:
implicit class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) {
def setMember(field: String): RDDWithMember = // ...
def queryForResult(query: String): Option[String] = rdd match {
case wm: RDDWithMember => Some(wm.queryForResult(query))
case _ => None
}
}
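A short usage sketch of the wrapper approach, assuming some existing rdd: RDD[Map[String, String]] and a query string expression (both hypothetical here):
val withMember: RDDWithMember = rdd.setMember("state")     // the member now travels with the value
val result: String = withMember.queryForResult(expression) // defined directly on the wrapper
// going through the extension method on a plain RDD yields an Option instead:
val none: Option[String] = rdd.queryForResult(expression)  // None, since no member was set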
import ExtendedRDDFunctions._
will import all attributes and functions from the companion object so they can be used in the body of your class.
For your usage, look into the delegate pattern.