Kafka Streams: SessionWindowedSerde Vs TimeWindowedSerde. Ambiguous implicit values - scala

I keep getting an 'ambiguous implicit values' error in the following code. I have tried several things (as can be seen from the couple of lines I've commented out). Any ideas on how to fix this? This is in Scala.
def createTopology(conf: Config, properties: Properties): Topology = {
  // implicit val sessionSerde = Serde[WindowedSerdes.SessionWindowedSerde[String]]
  // implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[WindowedSerdes.SessionWindowedSerde[String], Long]
  implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
  implicit val consumed: Consumed[String, String] = Consumed.`with`[String, String]

  val builder: StreamsBuilder = new StreamsBuilder()
  builder.stream("streams-plaintext-input")
    .groupBy((_, word) => word)
    .windowedBy(SessionWindows.`with`(Duration.ofMillis(60 * 1000)))
    .count()
    .toStream.to("streams-pipe-output")
  builder.build()
}
Compiler Errors:
Error:(52, 78) ambiguous implicit values:
both method timeWindowedSerde in object Serdes of type [T](implicit tSerde: org.apache.kafka.common.serialization.Serde[T])org.apache.kafka.streams.kstream.WindowedSerdes.TimeWindowedSerde[T]
and method sessionWindowedSerde in object Serdes of type [T](implicit tSerde: org.apache.kafka.common.serialization.Serde[T])org.apache.kafka.streams.kstream.WindowedSerdes.SessionWindowedSerde[T]
match expected type org.apache.kafka.common.serialization.Serde[org.apache.kafka.streams.kstream.Windowed[String]]
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
Error:(52, 78) could not find implicit value for parameter keySerde: org.apache.kafka.common.serialization.Serde[org.apache.kafka.streams.kstream.Windowed[String]]
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
Error:(52, 78) not enough arguments for method with: (implicit keySerde: org.apache.kafka.common.serialization.Serde[org.apache.kafka.streams.kstream.Windowed[String]], implicit valueSerde: org.apache.kafka.common.serialization.Serde[Long])org.apache.kafka.streams.kstream.Produced[org.apache.kafka.streams.kstream.Windowed[String],Long].
Unspecified value parameters keySerde, valueSerde.
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]

I just added the needed implicits via imports and it compiles:
import org.apache.kafka.common.serialization.Serde
import org.apache.kafka.streams.Topology
import org.apache.kafka.streams.kstream.{SessionWindows, Windowed}
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.kstream.{Consumed, Produced}
import java.time.Duration
import java.util.Properties
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes
import org.apache.kafka.streams.scala.Serdes.{Long, String}
def createTopology(conf: Config, properties: Properties): Topology = {
  // here there are two implicits to choose from; I pick sessionWindowedSerde because it matches your code
  // implicit val timeWindowedSerde: Serde[Windowed[String]] = Serdes.timeWindowedSerde[String]
  implicit val sessionSerde: Serde[Windowed[String]] = Serdes.sessionWindowedSerde[String]
  implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
  implicit val consumed: Consumed[String, String] = Consumed.`with`[String, String]

  val builder: StreamsBuilder = new StreamsBuilder()
  builder.stream("streams-plaintext-input")
    .groupBy((_, word) => word)
    .windowedBy(SessionWindows.`with`(Duration.ofMillis(60 * 1000)))
    .count()
    .toStream.to("streams-pipe-output")
  builder.build()
}
If you see the error:
ambiguous implicit values
it means that more than one implicit in your scope satisfies the required type. For example, the object org.apache.kafka.streams.scala.Serdes defines two such implicits:
implicit def timeWindowedSerde[T](implicit tSerde: Serde[T]): WindowedSerdes.TimeWindowedSerde[T] =
  new WindowedSerdes.TimeWindowedSerde[T](tSerde)

implicit def sessionWindowedSerde[T](implicit tSerde: Serde[T]): WindowedSerdes.SessionWindowedSerde[T] =
  new WindowedSerdes.SessionWindowedSerde[T](tSerde)
where TimeWindowedSerde and SessionWindowedSerde both extend the same type, Serdes.WrapperSerde<Windowed<T>>:
static public class TimeWindowedSerde<T> extends Serdes.WrapperSerde<Windowed<T>>
static public class SessionWindowedSerde<T> extends Serdes.WrapperSerde<Windowed<T>>
So in the line:
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
according to the signature of `with`:
def `with`[K, V](implicit keySerde: Serde[K], valueSerde: Serde[V]): ProducedJ[K, V] =
  ProducedJ.`with`(keySerde, valueSerde)
the compiler expects an implicit value of type Serde[Windowed[String]] and cannot pick one, because both candidates qualify as a Serde[Windowed[String]].
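The mechanism is not Kafka-specific. A minimal, self-contained sketch of the same failure (hypothetical names, for illustration only):

trait Greeting
object Demo {
  implicit def english: Greeting = new Greeting {}
  implicit def spanish: Greeting = new Greeting {}
  // does not compile: "ambiguous implicit values: both method english ...
  // and method spanish ... match expected type Greeting"
  val g: Greeting = implicitly[Greeting]
}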
So if you try to bring both of them into the same scope:
implicit val timeWindowedSerde: Serde[Windowed[String]] = Serdes.timeWindowedSerde[String]
implicit val sessionSerde: Serde[Windowed[String]] = Serdes.sessionWindowedSerde[String]
you will see
ambiguous implicit values
again.
Conclusion: be careful when importing large batches of implicits; the best practice is to import only the implicits you actually need.
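One further option, sketched here against the imports shown above (not part of the original answer): implicit parameters can always be passed explicitly, which sidesteps the ambiguity entirely:

// a sketch: hand the serdes to `with` directly instead of relying on implicit search
val windowedSerde: Serde[Windowed[String]] = Serdes.sessionWindowedSerde[String]
implicit val produced: Produced[Windowed[String], Long] =
  Produced.`with`[Windowed[String], Long](windowedSerde, Serdes.Long)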

Related

How to support marshalling and unmarshalling in Akka HTTP with JSON Spray for HashMap fields?

I have the following case classes:
import scala.collection.immutable.HashMap
final case class GuiApplication(testSuites: TestSuites)
final case class GuiApplications(var applications: HashMap[Id, GuiApplication])
final case class Id(val id: Long)
final case class TestSuite()
final case class TestSuites(var testSuites: HashMap[Id, TestSuite])
The marshalling defines:
implicit val applicationsMapFormat = jsonFormat0(HashMap[Id, GuiApplication])
implicit val testsuitesMapFormat = jsonFormat0(HashMap[Id, TestSuite])
implicit val idFormat = jsonFormat1(Id)
implicit val testSuiteFormat = jsonFormat0(TestSuite)
implicit val testSuitesFormat = jsonFormat1(TestSuites)
implicit val applicationFormat = jsonFormat1(GuiApplication)
implicit val applicationsFormat = jsonFormat1(GuiApplications)
But I still get these compile errors:
type mismatch;
[error] found : Seq[(Id, GuiApplication)] => scala.collection.immutable.HashMap[Id,GuiApplication]
[error] required: () => ?
[error] Note: implicit value applicationsMapFormat is not applicable here because it comes after the application point and it lacks an explicit result type
[error] implicit val applicationsMapFormat = jsonFormat0(HashMap[Id, GuiApplication])
[error] ^
If I remove the first two implicits I get other compile errors.
How can I define implicit JSON formats for map types in Scala?
You just need to write your custom RootJsonFormat[scala.collection.immutable.HashMap[Id,GuiApplication]] and add it to the scope of your routes:
implicit val hashMapFormat: RootJsonFormat[scala.collection.immutable.HashMap[Id, GuiApplication]] =
  new RootJsonFormat[scala.collection.immutable.HashMap[Id, GuiApplication]] {
    override def write(obj: scala.collection.immutable.HashMap[Id, GuiApplication]): JsValue = ???
    override def read(json: JsValue): scala.collection.immutable.HashMap[Id, GuiApplication] = ???
  }
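For concreteness, one hypothetical way to fill in the ??? bodies, assuming you are happy to serialize the map as a JSON object whose keys are the numeric ids rendered as strings (the names and key encoding here are illustrative choices, not something the original answer specifies):

import spray.json._
import scala.collection.immutable.HashMap

// sketch: a generic format for HashMap[Id, V], keyed by the id as a string
implicit def idMapFormat[V: JsonFormat]: RootJsonFormat[HashMap[Id, V]] =
  new RootJsonFormat[HashMap[Id, V]] {
    override def write(obj: HashMap[Id, V]): JsValue =
      JsObject(obj.map { case (id, value) => id.id.toString -> value.toJson })
    override def read(json: JsValue): HashMap[Id, V] = json match {
      case JsObject(fields) =>
        HashMap(fields.toSeq.map { case (k, v) => Id(k.toLong) -> v.convertTo[V] }: _*)
      case other => deserializationError(s"Expected a JSON object, got $other")
    }
  }

Because it is generic in V, the same definition covers both HashMap[Id, GuiApplication] and HashMap[Id, TestSuite], provided formats for the value types are in scope.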

scala circe encoders/decoders for an abstract class with case classes

I want to save a collection of FieldMapping classes as a json string -
abstract class Field {
  def clazz: Class[_]
  def name: String
}

case class StringField(name: String) extends Field {
  override def clazz: Class[_] = classOf[String]
}

case class DateField(name: String) extends Field {
  override def clazz: Class[_] = classOf[Date]
}
... etc - full code here:
https://github.com/alexeyOnGitHub/scala-typesafe/blob/master/src/main/scala/com/example/model/Field.scala
Circe code:
import com.example.model.{DateField, Field, FieldMapping, StringField}
import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}
import io.circe.{Decoder, Encoder}

object CirceBoilerplateForConfigs {
  implicit val fieldDecoder: Decoder[StringField] = deriveDecoder[StringField]
  implicit val fieldEncoder: Encoder[StringField] = deriveEncoder[StringField]
  implicit val dateDecoder: Decoder[DateField] = deriveDecoder[DateField]
  implicit val dateEncoder: Encoder[DateField] = deriveEncoder[DateField]
  implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
  implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
}
Error:(14, 65) could not find Lazy implicit value of type io.circe.generic.decoding.DerivedDecoder[com.example.model.FieldMapping]
  implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
Error:(14, 65) not enough arguments for method deriveDecoder: (implicit decode: shapeless.Lazy[io.circe.generic.decoding.DerivedDecoder[com.example.model.FieldMapping]])io.circe.Decoder[com.example.model.FieldMapping].
Unspecified value parameter decode.
  implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
Error:(15, 65) could not find Lazy implicit value of type io.circe.generic.encoding.DerivedObjectEncoder[com.example.model.FieldMapping]
  implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
Error:(15, 65) not enough arguments for method deriveEncoder: (implicit encode: shapeless.Lazy[io.circe.generic.encoding.DerivedObjectEncoder[com.example.model.FieldMapping]])io.circe.ObjectEncoder[com.example.model.FieldMapping].
Unspecified value parameter encode.
  implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
Field should be a sealed trait (with an abstract class, or with a trait that is not sealed, this won't work).
The following code compiles:
import java.util.Date

sealed trait Field {
  def clazz: Class[_]
  def name: String
}

case class StringField(name: String) extends Field {
  override def clazz: Class[_] = classOf[String]
}

case class DateField(name: String) extends Field {
  override def clazz: Class[_] = classOf[Date]
}

case class FieldMapping(fieldInConnector1: Option[Field],
                        fieldInConnector2: Option[Field],
                        selected: Boolean,
                        defaultValue: String)

import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}
import io.circe.{Decoder, Encoder}

object CirceBoilerplateForConfigs {
  implicit val stringDecoder: Decoder[StringField] = deriveDecoder[StringField]
  implicit val stringEncoder: Encoder[StringField] = deriveEncoder[StringField]
  implicit val dateDecoder: Decoder[DateField] = deriveDecoder[DateField]
  implicit val dateEncoder: Encoder[DateField] = deriveEncoder[DateField]
  implicit val fieldDecoder: Decoder[Field] = deriveDecoder[Field]
  implicit val fieldEncoder: Encoder[Field] = deriveEncoder[Field]
  implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
  implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
}
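As a quick sanity check, a hypothetical round trip through the derived codecs (assuming the circe-parser module is also on the classpath):

import io.circe.parser.decode
import io.circe.syntax._
import CirceBoilerplateForConfigs._

val mapping = FieldMapping(Some(StringField("title")), Some(DateField("created")), selected = true, defaultValue = "")
// semiauto derivation for a sealed trait wraps each case class in an object
// keyed by its class name, e.g. {"StringField": {"name": "title"}}
val json = mapping.asJson.noSpaces
val back = decode[FieldMapping](json) // Right(mapping) if the round trip succeeds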

Macwire, wireWith and implicit parameters

wireWith seems to have some issues with resolving implicit parameters.
Minimal example:
import com.softwaremill.macwire._

object A {
  def props(x: Int)(implicit y: String): A = new A(x)
}

class A(x: Int)(implicit y: String) {
  val sum: String = s"$x + $y"
}

object App {
  def main(): Unit = {
    val xVal: Int = 3
    implicit val yVal: String = "5"
    // val aInstance = wire[A] // works
    // val aInstance = A.props(xVal) // works
    val aInstance: A = wireWith(A.props _) // compile error
    println(aInstance.sum)
  }
}

App.main()
Error:
Error:(21, 33) type mismatch;
 found   : Int
 required: String
val aInstance: A = wireWith(A.props _)
                            ^
If the implicit modifier is omitted on yVal, it complains about the missing implicit:
Error:(18, 36) could not find implicit value for parameter y: String
val aInstance: A = wireWith(A.props _) // compile error
^
This is a simplified example though - in my production code I try to wire Actor props:
object Actor {
  def props(dependency: Dependency, refreshInterval: FiniteDuration @@ CacheRefreshInterval)
           (implicit ec: ExecutionContext): Props =
    Props(new Actor(dependency, refreshInterval))
}

// DI container
protected implicit def executionContext: ExecutionContext
protected lazy val dependency: Dependency = wire[Dependency]
protected lazy val refreshInterval = 2.second.taggedWith[CacheRefreshInterval]
protected lazy val actorProps: Props @@ ActorProps = wireWith(Actor.props _)
and get a different compile error:
too many arguments for method props: (implicit ec: scala.concurrent.ExecutionContext)akka.actor.Props
I've tried making the implicit parameter explicit and it worked just fine, except that it goes against the usual practice of passing an execution context implicitly.
Am I doing something wrong or is it an issue in macwire?
Issue on GitHub: https://github.com/adamw/macwire/issues/125
If you were using wireWith(A.props _) as a workaround for macwire's inability to wire actors whose constructors take implicit parameters (see macwire issue 121), you can switch to macwire 2.3.1 and wire your actors using wireActor[A], wireAnonymousActor[A] or wireProps[A]; issue 121 was fixed in macwire 2.3.1.
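For reference, a rough sketch of what the 2.3.1-style wiring could look like. The actor, its dependency values, and the system setup here are invented for illustration; the wireProps macro and the akkasupport import come from the macwireAkka module as described in macwire's docs:

import scala.concurrent.ExecutionContext
import akka.actor.{Actor, ActorSystem, Props}
import com.softwaremill.macwire.akkasupport._

class CacheActor(refreshMillis: Long)(implicit ec: ExecutionContext) extends Actor {
  override def receive: Receive = { case _ => () }
}

object Wiring {
  implicit val system: ActorSystem = ActorSystem("app")
  implicit val ec: ExecutionContext = system.dispatcher
  lazy val refreshMillis: Long = 2000L

  // wireProps generates Props from CacheActor's constructor, filling in
  // refreshMillis and the implicit ExecutionContext from the enclosing scope
  lazy val cacheProps: Props = wireProps[CacheActor]
  lazy val cacheActor = system.actorOf(cacheProps, "cache")
}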

Implicit parameters cannot be found when imported (from an Object)

Why can't the code below find the imported implicits from MyProducers? In my understanding this should work fine, because the implicits are in scope.
Error:(16, 34) could not find implicit value for evidence parameter of type MyProducers.ListProducer[Int]
val stuffInt:Int = getHead[Int]()
Error:(16, 34) not enough arguments for method getHead: (implicit evidence$2: MyProducers.ListProducer[Int])Int.
Unspecified value parameter evidence$2.
val stuffInt:Int = getHead[Int]()
Error:(18, 43) could not find implicit value for evidence parameter of type MyProducers.ListProducer[String]
val stuffString:String = getHead[String]()
Error:(18, 43) not enough arguments for method getHead: (implicit evidence$2: MyProducers.ListProducer[String])String.
Unspecified value parameter evidence$2.
val stuffString:String = getHead[String]()
Code:
object Resolver {
  import MyProducers._
  import MyProducers._

  def getList[T: ListProducer](): List[T] = implicitly[ListProducer[T]].produceList
  def getHead[T: ListProducer](): T = getList[T]().head

  val stuffInt: Int = getHead[Int]()
  val stuffString: String = getHead[String]()

  val im = ip // this compiles fine, so implicits are in scope
  val sm = sp
}

object MyProducers {
  trait ListProducer[T] {
    def produceList: List[T]
  }

  object IntProducer extends ListProducer[Int] {
    override def produceList = List(22, 42)
  }
  implicit val ip = IntProducer

  object StringProducer extends ListProducer[String] {
    override def produceList = List("stuff", "Shiraly")
  }
  implicit val sp = StringProducer
}
What is strange is that this code compiles fine:
object Resolver {
  import MyProducers._
  import MyProducers._

  trait ListProducer[T] {
    def produceList: List[T]
  }

  object IntProducer extends ListProducer[Int] {
    override def produceList = List(22, 42)
  }
  implicit val ip = IntProducer

  object StringProducer extends ListProducer[String] {
    override def produceList = List("stuff", "Shiraly")
  }
  implicit val sp = StringProducer

  def getList[T: ListProducer](): List[T] = implicitly[ListProducer[T]].produceList
  def getHead[T: ListProducer](): T = getList[T]().head

  val stuffInt: Int = getHead[Int]()
  val stuffString: String = getHead[String]()

  val im = ip
  val sm = sp
}
Any idea why the first code does not compile while the second does?
Implicit vals and defs need to have explicit type annotations on them. Add those, and the implicits will be found. Without a declared result type, an implicit can only be considered once the compiler has inferred its type, and here Resolver appears before MyProducers in the file, so at the point where getHead is called the types of ip and sp have not been inferred yet and the implicit search comes up empty.
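Concretely, the failing version compiles once the two implicit vals get explicit types:

object MyProducers {
  trait ListProducer[T] {
    def produceList: List[T]
  }

  object IntProducer extends ListProducer[Int] {
    override def produceList = List(22, 42)
  }

  object StringProducer extends ListProducer[String] {
    override def produceList = List("stuff", "Shiraly")
  }

  // explicit result types make these visible to the implicit search in Resolver
  implicit val ip: ListProducer[Int] = IntProducer
  implicit val sp: ListProducer[String] = StringProducer
}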

AggregateByKey fails to compile when it is in an abstract class

I'm new to both Scala and Spark, so I'm hoping someone can explain why aggregateByKey fails to compile when it is in an abstract class. This is about the simplest example I can come up with:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

abstract class AbstractKeyCounter[K] {
  def keyValPairs(): RDD[(K, String)]

  def processData(): RDD[(K, Int)] = {
    keyValPairs().aggregateByKey(0)(
      (count, key) => count + 1,
      (count1, count2) => count1 + count2
    )
  }
}

class StringKeyCounter extends AbstractKeyCounter[String] {
  override def keyValPairs(): RDD[(String, String)] = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("counter"))
    val data = sc.parallelize(Array("foo=A", "foo=A", "foo=A", "foo=B", "bar=C", "bar=D", "bar=D"))
    data.map(_.split("=")).map(v => (v(0), v(1)))
  }
}
Which gives:
Error:(11, 19) value aggregateByKey is not a member of org.apache.spark.rdd.RDD[(K, String)]
keyValPairs().aggregateByKey(0)(
^
If I instead use a single concrete class, it compiles and runs successfully:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

class StringKeyCounter {
  def processData(): RDD[(String, Int)] = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("counter"))
    val data = sc.parallelize(Array("foo=A", "foo=A", "foo=A", "foo=B", "bar=C", "bar=D", "bar=D"))
    val keyValPairs = data.map(_.split("=")).map(v => (v(0), v(1)))
    keyValPairs.aggregateByKey(0)(
      (count, key) => count + 1,
      (count1, count2) => count1 + count2
    )
  }
}
What am I missing?
If you change:
abstract class AbstractKeyCounter[K] {
To:
abstract class AbstractKeyCounter[K : ClassTag] {
This will compile.
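Put together, the question's abstract class with the fix applied looks like this (note the extra scala.reflect.ClassTag import it now needs):

import scala.reflect.ClassTag
import org.apache.spark.rdd.RDD

abstract class AbstractKeyCounter[K: ClassTag] {
  def keyValPairs(): RDD[(K, String)]

  // compiles now: the ClassTag[K] bound lets the compiler convert the
  // RDD[(K, String)] into PairRDDFunctions, where aggregateByKey lives
  def processData(): RDD[(K, Int)] =
    keyValPairs().aggregateByKey(0)(
      (count, _) => count + 1,
      (count1, count2) => count1 + count2
    )
}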
Why? aggregateByKey is a method of PairRDDFunctions (your RDD is implicitly converted into that class), which has the following signature:
class PairRDDFunctions[K, V](self: RDD[(K, V)])
  (implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null)
This means its constructor expects implicit values of types ClassTag[K] and ClassTag[V]. Your abstract class has no knowledge of what K is, and therefore cannot provide a matching implicit value. This means the implicit conversion into PairRDDFunctions "fails" (the compiler doesn't perform the conversion), and therefore the method aggregateByKey can't be found.
Adding [K : ClassTag] is shorthand for adding an implicit argument implicit kt: ClassTag[K] to the abstract class constructor, which is then used by the compiler and passed to the constructor of PairRDDFunctions.
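As a quick illustration of that shorthand, the two declarations below are equivalent; the context bound is just sugar for the implicit parameter:

import scala.reflect.ClassTag

abstract class WithContextBound[K: ClassTag]                   // sugared form
abstract class WithImplicitParam[K](implicit kt: ClassTag[K])  // what it desugars to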
For more about ClassTags and what they're good for, see this article.