I'm facing a strange error when trying to parse a JSON string into a generic case class. My case class looks like this:
final case class LocationAPIObject[F[_]](
  countryCode: F[String],
  partyId: F[String],
  uid: F[String]
)
For a JSON that comes in like this:
val js: JsValue = Json.parse("""{
  "countryCode": "us",
  "partyId": "123456"
}""")
It results in a compile error:
diverging implicit expansion for type play.api.libs.json.Reads[T]
starting with method Tuple22R in trait GeneratedReads
Here is the sample code that I'm working on: https://scastie.scala-lang.org/aBQIUCTvTECNPgSjCjDFlw
You get the diverging implicit expansion error because you haven't specified the type parameter. If you do
val locationAPI = Json.fromJson[LocationAPIObject[Option]](js)
then the error changes to implicit not found: play.api.libs.json.Reads[LocationAPIObject[scala.Option]]. No Json deserializer found for type LocationAPIObject[Option]. Try to implement an implicit Reads or Format for this type.
The case class LocationAPIObject being parametrized with a higher-kinded F ("fields which are encapsulated effects") shouldn't be a problem. The fields are of type F[String], so in order to derive instances of the type class Reads or Format for LocationAPIObject[F], it should be enough to know an instance for F[String].
But while this works for example in Circe
import io.circe.Decoder
import io.circe.parser.decode
import io.circe.generic.semiauto
implicit def locDec[F[_]](implicit ev: Decoder[F[String]]): Decoder[LocationAPIObject[F]] =
  semiauto.deriveDecoder[LocationAPIObject[F]]

decode[LocationAPIObject[Option]](
  """{
    "countryCode": "us",
    "partyId": "123456"
  }"""
)
// Right(LocationAPIObject(Some(us),Some(123456),None))
for some reason it doesn't in Play JSON:
implicit def locFormat[F[_]](implicit ev: Format[F[String]]): Format[LocationAPIObject[F]] =
  Json.format[LocationAPIObject[F]]

or

implicit def locFormat[F[_]](implicit ev: OFormat[F[String]]): OFormat[LocationAPIObject[F]] =
  Json.format[LocationAPIObject[F]]

// No instance of play.api.libs.json.Format is available for F, F, F in the implicit scope (Hint: if declared in the same file, make sure it's declared before)

or

implicit def locReads[F[_]](implicit ev: Reads[F[String]]): Reads[LocationAPIObject[F]] =
  Json.reads[LocationAPIObject[F]]

// No instance of play.api.libs.json.Reads is available for F, F, F in the implicit scope (Hint: if declared in the same file, make sure it's declared before)
So the issue seems to lie in Play JSON's derivation macros.
The easiest solution is to define the codec manually, as was advised in the comments.
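For reference, here is a minimal hand-written Reads for the Option-shaped case, as a sketch of that manual approach (readNullable maps missing keys to None):

import play.api.libs.json._
import play.api.libs.functional.syntax._

implicit val locationReads: Reads[LocationAPIObject[Option]] = (
  (__ \ "countryCode").readNullable[String] and
  (__ \ "partyId").readNullable[String] and
  (__ \ "uid").readNullable[String]
)(LocationAPIObject.apply[Option] _)

Json.fromJson[LocationAPIObject[Option]](js)
// expected: JsSuccess(LocationAPIObject(Some(us),Some(123456),None))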
def recordToType[A <: TBase[T, F]](record: ConsumerRecord[String, Array[Byte]]): A = {
  (new TDeserializer(new TCompactProtocol.Factory())).deserialize(new A[T, F](), record.value())
}
The above syntax doesn't work, but basically I want A to be restricted to TBase, and TBase requires two type parameters. If I leave those parameters out, it says the parameters are required; if I put the parameters in, it says they are unresolved... What is the correct way to write this?
You could write your method's signature like this:
import scala.language.higherKinds
def recordToType[T, F, A[_, _] <: TBase[T, F]](record: ConsumerRecord[String, Array[Byte]]): A[T,F]
but there's another problem with your code. You can't just create a new instance of a generic type like this:
new A[T, F]
It will show an error similar to: Error:(15, 9) class type required but A[T,F] found
You can, however, capture the class of A with an implicit ClassTag:
def recordToType[T, F, A[_, _] <: TBase[T, F]](record: ConsumerRecord[String, Array[Byte]])(
  implicit m: scala.reflect.ClassTag[A[_, _]]
): A[T, F] = {
  val a: A[T, F] = m.runtimeClass.getConstructors.head.newInstance().asInstanceOf[A[T, F]]
  ...
I am trying to create some simple custom aggregate operators in Spark using Scala.
I have created a simple hierarchy of operators, with the following super-class:
sealed abstract class Aggregator(val name: String) {
  type Key = Row // org.apache.spark.sql.Row
  type Value
  ...
}
I also have a companion object, which constructs the appropriate aggregator each time. Observe that each operator is allowed to specify the Value type it wants.
Now the problem is when I try to call combineByKey:
val agg = Aggregator("SUM")
val res = rdd
  .map(agg.mapper)
  .reduceByKey(agg.reducer(_: agg.Value, _: agg.Value))
The error is:
value reduceByKey is not a member of org.apache.spark.rdd.RDD[(agg.Key, agg.Value)]
For my needs, Value can either be a numeric type or a tuple, hence its unbounded definition. If I replace the Value type declaration with:
type Value = Double
in the Aggregator class, then everything works fine. Therefore, I suppose the error is related to reduceByKey not knowing the exact Value type at compile time.
Any ideas on how to get around this?
Your RDD cannot be implicitly converted into PairRDDFunctions, because all the implicit ClassTags for keys and values are missing.
You might want to include the class tags as implicit parameters in your Aggregator:
sealed abstract class Aggregator[K, V](name: String)(implicit val keyClassTag: ClassTag[K], val valueClassTag: ClassTag[V])
or maybe:
sealed abstract class Aggregator[K, V](name: String)(implicit kt: ClassTag[K], vt: ClassTag[V]) {
  implicit val keyClassTag: ClassTag[K] = kt
  implicit val valueClassTag: ClassTag[V] = vt
}
or maybe even:
sealed abstract class Aggregator(name: String) {
  type K
  type V
  implicit def keyClassTag: ClassTag[K]
  implicit def valueClassTag: ClassTag[V]
}
The last variant would shift the responsibility for providing the ClassTags to the implementor of the abstract class.
Now, when using an aggregator agg built this way in a reduceByKey, you would have to make sure that those implicitly provided class tags are in the current implicit scope:
val agg = Aggregator("SUM")
import agg._ // now the implicits should be visible
val res = rdd
  .map(agg.mapper)
  .reduceByKey(agg.reducer(_: agg.Value, _: agg.Value))
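For completeness, here is a self-contained sketch of the last variant with a hypothetical SUM implementor (the Double value type and the mapper reading column 0 are assumptions for illustration; rdd is the RDD from the question):

import scala.reflect.ClassTag
import org.apache.spark.sql.Row

sealed abstract class Aggregator(val name: String) {
  type Key = Row
  type Value
  implicit def keyClassTag: ClassTag[Key]
  implicit def valueClassTag: ClassTag[Value]
  def mapper(row: Row): (Key, Value)
  def reducer(a: Value, b: Value): Value
}

final class SumAggregator extends Aggregator("SUM") {
  type Value = Double
  // built explicitly, so these implicits cannot accidentally resolve to themselves
  implicit val keyClassTag: ClassTag[Key] = ClassTag(classOf[Row])
  implicit val valueClassTag: ClassTag[Value] = ClassTag.Double
  def mapper(row: Row): (Key, Value) = (row, row.getDouble(0)) // hypothetical extraction
  def reducer(a: Value, b: Value): Value = a + b
}

val agg = new SumAggregator
import agg._ // keyClassTag and valueClassTag are now in implicit scope
val res = rdd
  .map(agg.mapper)
  .reduceByKey(agg.reducer(_: agg.Value, _: agg.Value))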
I'm new to Scala and rusty with Java, and I'm tripping up on some semantics. I'm trying to get the class object of SequenceFileOutputFormat.
Other Stack Overflow posts say to simply do the following:
classOf[SequenceFileOutputFormat]
Which yields the error:
class SequenceFileOutputFormat takes type parameters
OK fine, so it requires type parameters, so I do the following:
classOf[SequenceFileOutputFormat[String, String]]
Which yields the error:
[error] found : Class[org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat[String,String]](classOf[org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat])
[error] required: Class[_ <: org.apache.hadoop.mapred.OutputFormat[_, _]]
What gives? How do I get the class object of a class that requires type parameters?
With hadoop-mapreduce-client-core "3.0.0", this here:
import org.apache.hadoop.mapreduce.lib.output._

object HadoopQuestion_48818781 {
  def main(args: Array[String]): Unit = {
    val c = classOf[SequenceFileOutputFormat[_, _]]
    println("Class: " + c)
  }
}
prints:
Class: class org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat
Explanation:
Even though classOf[T] is processed at compile time, the Class[T] value itself is a runtime construct, and it does not know anything about the generic type parameters of T. Therefore, you have to put underscores _ for the runtime-erased type parameters. These underscores are really just syntactic sugar for existential types in Scala, but in this particular case they fulfill a role similar to that of the wildcards in Java. Note also that the error in the question requires a Class[_ <: org.apache.hadoop.mapred.OutputFormat[_, _]] (the old MapReduce API), while SequenceFileOutputFormat here comes from org.apache.hadoop.mapreduce.lib.output (the new API); whichever method that Class is being passed to apparently expects the old-API hierarchy, which is a separate mismatch to resolve.
Here is my code straight & simple:
package scalaproj
import scala.reflect._
case class MyClass() {}
def bar[T](cls : Class[T]) = println(cls)
def foobar[T: ClassTag] = println(classTag[T])
bar(classOf[MyClass])
foobar[MyClass]
Results:
class scalaproj.GetFields$MyClass$2
scalaproj.GetFields$MyClass$2
Now I would like to do the following without the famous error: "class type required but T found"
def foo[T] = println(classOf[T])
foo[MyClass]
foo is just a function that takes a generic type parameter and does not need a value parameter. I think this is strange, given the two examples that work and all the flexibility built into the Scala language and its handling of generics.
Update:
Just to illustrate the strangeness further:
def foo1[T](t : T) = {} // no compile error
def foo2[T](): List[T] = { List[T]() } // no compile error
def foo3[T](): T = { T() } // compile error: "not found: value T"
A good explanation is appreciated.
You can't, as classOf will not work with arbitrary types (and your T is an arbitrary type).
For example:
scala> classOf[Int with String]
<console>:15: error: class type required but Int with String found
classOf[Int with String]
^
You can achieve the same thing with ClassTag#runtimeClass:
def foo[T: ClassTag] = println(classTag[T].runtimeClass)
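A quick self-contained check (run from a plain object rather than the REPL, so the printed class name has no $-mangled suffixes):

import scala.reflect.{classTag, ClassTag}

case class MyClass()

object Demo {
  def foo[T: ClassTag] = println(classTag[T].runtimeClass)

  def main(args: Array[String]): Unit = {
    foo[MyClass] // prints: class MyClass
    foo[Int]     // prints: int (primitives get their own runtime class)
  }
}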
Given a made-up F type-class:
scala> trait F[A] {}
defined trait F
and this definition, which uses a context bound to require that the input A has a type-class instance of F:
scala> def f[A : F](x: A) = ???
f: [A](x: A)(implicit evidence$1: F[A])Nothing
I defined a Person and type-class instance:
scala> case class Person(name: String)
defined class Person
scala> implicit val person: F[Person] = new F[Person] {}
person: F[Person] = $anon$1#262b2c86
And the following compiles (it then fails at runtime only because f's body is ???):
scala> f(Person("foo"))
scala.NotImplementedError: an implementation is missing
But there's no F[String] instance, so calling f with a String fails to compile:
scala> f("foobar")
<console>:17: error: could not find implicit value for evidence parameter of type F[String]
f("foobar")
^
I then defined an F[String] using:
scala> implicit def fInstance(x: String) = new F[String] {}
fInstance: (x: String)F[String]
But, I can't run:
scala> f("foobar")
<console>:18: error: could not find implicit value for evidence parameter of type F[String]
f("foobar")
^
since I do not have an implicit F[String], but rather a String => F[String].
What's the proper way to use such an implicit def to meet the F[String] constraint, i.e. call the f function successfully with a type of String?
I got it to work via:
scala> implicit val x: F[String] = implicitly[String => F[String]].apply("foobar")
x: F[String] = $anon$1#7b7fdc8
scala> f("foobar")
scala.NotImplementedError: an implementation is missing
at scala.Predef$.$qmark$qmark$qmark(Predef.scala:230)
at .f(<console>:12)
... 33 elided
But I'm not sure if it's the right/clean way to do it.
You defined an implicit conversion. If you want to use a def to provide typeclass instances, you write the same thing as you'd write for an implicit val, but replace val with def.
implicit def fInstance: F[String] = new F[String] {}

Normally you only use a def if you need type parameters, like here:

implicit def fInstance[A]: F[List[A]] = new F[List[A]] {}

or

implicit def fInstance[A](implicit ev: F[A]): F[List[A]] = new F[List[A]] {}
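A minimal self-contained sketch of how such def-based instances resolve and chain (the empty bodies are only because F is a made-up marker trait):

trait F[A]

object Instances {
  implicit def fString: F[String] = new F[String] {}
  // derives an F[List[A]] from any available F[A]
  implicit def fList[A](implicit ev: F[A]): F[List[A]] = new F[List[A]] {}
}

def f[A: F](x: A): Unit = ()

import Instances._
f("foobar")       // resolved by fString
f(List("a", "b")) // resolved by fList(fString)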
Your fInstance defines an implicit conversion, i.e. a way to turn a String into an F[String]. To generate a typeclass instance, a method accepting implicit parameters can be used instead:
implicit def fInstance(implicit x: String): F[String] = new F[String] {}
It is typically used in FP libraries to derive one typeclass instance from another:
implicit def optionMonoid[A](implicit S: Semigroup[A]): Monoid[Option[A]] = ???
// or, which is the same
// implicit def optionMonoid[A: Semigroup]: Monoid[Option[A]] = ???
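For illustration, here is a sketch of how such a derivation could be implemented, using minimal hand-rolled Semigroup/Monoid traits rather than any particular library's:

trait Semigroup[A] { def combine(x: A, y: A): A }
trait Monoid[A] extends Semigroup[A] { def empty: A }

implicit def optionMonoid[A](implicit S: Semigroup[A]): Monoid[Option[A]] =
  new Monoid[Option[A]] {
    def empty: Option[A] = None
    def combine(x: Option[A], y: Option[A]): Option[A] = (x, y) match {
      case (Some(a), Some(b)) => Some(S.combine(a, b))
      case (some, None)       => some
      case (None, some)       => some
    }
  }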
Returning to the example: the idea is that an F[String] should be able to operate on any String in general, without depending on the actual arguments passed to the function. Of course, you can always provide instances explicitly:
f("foobar")(new F[String] { })
As a follow-up, an important property of typeclasses is that you can define instances ad hoc, i.e. without having access to the definitions of F or String at all, and since Scala forces you to scope implicits and import them, this is totally OK.
Here is a simpler version of your definition (and you can remove implicit from fInstance):
implicit val singleFInstance: F[String] = fInstance("") // or fInstance("foobar"), etc.
Whether this is the right thing to do very much depends on what F and f are supposed to mean.
But generally speaking: if F is really a typeclass, fInstance(string) gives different results depending on the string (not just different instances, but different behavior), and f's signature is correct, then this is wrong, and you should accept that calling f("foobar") isn't meaningful.