I'm new to both Scala and Spark, so I'm hoping someone can explain why aggregateByKey fails to compile when it is in an abstract class. This is about the simplest example I can come up with:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD
abstract class AbstractKeyCounter[K] {
  def keyValPairs(): RDD[(K, String)]

  def processData(): RDD[(K, Int)] = {
    keyValPairs().aggregateByKey(0)(
      (count, key) => count + 1,
      (count1, count2) => count1 + count2
    )
  }
}

class StringKeyCounter extends AbstractKeyCounter[String] {
  override def keyValPairs(): RDD[(String, String)] = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("counter"))
    val data = sc.parallelize(Array("foo=A", "foo=A", "foo=A", "foo=B", "bar=C", "bar=D", "bar=D"))
    data.map(_.split("=")).map(v => (v(0), v(1)))
  }
}
Which gives:
Error:(11, 19) value aggregateByKey is not a member of org.apache.spark.rdd.RDD[(K, String)]
keyValPairs().aggregateByKey(0)(
^
If I instead use a single concrete class, it compiles and runs successfully:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD
class StringKeyCounter {
  def processData(): RDD[(String, Int)] = {
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("counter"))
    val data = sc.parallelize(Array("foo=A", "foo=A", "foo=A", "foo=B", "bar=C", "bar=D", "bar=D"))
    val keyValPairs = data.map(_.split("=")).map(v => (v(0), v(1)))
    keyValPairs.aggregateByKey(0)(
      (count, key) => count + 1,
      (count1, count2) => count1 + count2
    )
  }
}
What am I missing?
If you change:
abstract class AbstractKeyCounter[K] {
To:
abstract class AbstractKeyCounter[K : ClassTag] {
This will compile.
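For reference, here is the complete fixed class (a sketch; note the scala.reflect.ClassTag import, which the context bound needs):

import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

abstract class AbstractKeyCounter[K : ClassTag] {
  def keyValPairs(): RDD[(K, String)]

  def processData(): RDD[(K, Int)] = {
    keyValPairs().aggregateByKey(0)(
      (count, key) => count + 1,
      (count1, count2) => count1 + count2
    )
  }
}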
Why? aggregateByKey is a method of PairRDDFunctions (your RDD is implicitly converted into that class), which has the following signature:
class PairRDDFunctions[K, V](self: RDD[(K, V)])
(implicit kt: ClassTag[K], vt: ClassTag[V], ord: Ordering[K] = null)
This means its constructor expects implicit values of types ClassTag[K] and ClassTag[V]. Your abstract class has no knowledge of what K is, and therefore cannot provide a matching implicit value. This means the implicit conversion into PairRDDFunctions "fails" (the compiler doesn't perform the conversion), and therefore the method aggregateByKey can't be found.
Adding [K : ClassTag] is shorthand for adding an implicit argument implicit kt: ClassTag[K] to the abstract class constructor, which is then used by the compiler and passed to the constructor of PairRDDFunctions.
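In other words, the context bound desugars (roughly) into an implicit constructor parameter:

abstract class AbstractKeyCounter[K](implicit kt: ClassTag[K]) {
  // ... body unchanged ...
}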
For more about ClassTags and what they're good for, see this article.
Related
I keep getting an 'ambiguous implicit values' message in the following code. I tried several things (as can be seen from the couple of lines I've commented out). Any ideas on how to fix this? This is in Scala.
def createTopology(conf: Config, properties: Properties): Topology = {
  // implicit val sessionSerde = Serde[WindowedSerdes.SessionWindowedSerde[String]]
  // implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[WindowedSerdes.SessionWindowedSerde[String], Long]
  implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
  implicit val consumed: Consumed[String, String] = Consumed.`with`[String, String]

  val builder: StreamsBuilder = new StreamsBuilder()
  builder.stream("streams-plaintext-input")
    .groupBy((_, word) => word)
    .windowedBy(SessionWindows.`with`(Duration.ofMillis(60 * 1000)))
    .count()
    .toStream.to("streams-pipe-output")
  builder.build()
}
Compiler Errors:
Error:(52, 78) ambiguous implicit values:
both method timeWindowedSerde in object Serdes of type [T](implicit tSerde: org.apache.kafka.common.serialization.Serde[T])org.apache.kafka.streams.kstream.WindowedSerdes.TimeWindowedSerde[T]
and method sessionWindowedSerde in object Serdes of type [T](implicit tSerde: org.apache.kafka.common.serialization.Serde[T])org.apache.kafka.streams.kstream.WindowedSerdes.SessionWindowedSerde[T]
match expected type org.apache.kafka.common.serialization.Serde[org.apache.kafka.streams.kstream.Windowed[String]]
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
Error:(52, 78) could not find implicit value for parameter keySerde: org.apache.kafka.common.serialization.Serde[org.apache.kafka.streams.kstream.Windowed[String]]
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
Error:(52, 78) not enough arguments for method with: (implicit keySerde: org.apache.kafka.common.serialization.Serde[org.apache.kafka.streams.kstream.Windowed[String]], implicit valueSerde: org.apache.kafka.common.serialization.Serde[Long])org.apache.kafka.streams.kstream.Produced[org.apache.kafka.streams.kstream.Windowed[String],Long].
Unspecified value parameters keySerde, valueSerde.
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
I just added the missing implicits via imports, and it compiles:
import org.apache.kafka.common.serialization.Serde
import org.apache.kafka.streams.Topology
import org.apache.kafka.streams.kstream.{SessionWindows, Windowed}
import org.apache.kafka.streams.scala.StreamsBuilder
import org.apache.kafka.streams.scala.kstream.{Consumed, Produced}
import java.time.Duration
import java.util.Properties
import org.apache.kafka.streams.scala.ImplicitConversions._
import org.apache.kafka.streams.scala.Serdes
import org.apache.kafka.streams.scala.Serdes.{Long, String}
def createTopology(conf: Config, properties: Properties): Topology = {
  // Here we have two implicits to choose from; I picked sessionWindowedSerde because it was in your code.
  // implicit val timeWindowedSerde: Serde[Windowed[String]] = Serdes.timeWindowedSerde[String]
  implicit val sessionSerde: Serde[Windowed[String]] = Serdes.sessionWindowedSerde[String]
  implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
  implicit val consumed: Consumed[String, String] = Consumed.`with`[String, String]

  val builder: StreamsBuilder = new StreamsBuilder()
  builder.stream("streams-plaintext-input")
    .groupBy((_, word) => word)
    .windowedBy(SessionWindows.`with`(Duration.ofMillis(60 * 1000)))
    .count()
    .toStream.to("streams-pipe-output")
  builder.build()
}
If you see the error:
ambiguous implicit values
it means that more than one implicit in scope satisfies the required type. For example, the object org.apache.kafka.streams.scala.Serdes has two such implicits:
implicit def timeWindowedSerde[T](implicit tSerde: Serde[T]): WindowedSerdes.TimeWindowedSerde[T] =
  new WindowedSerdes.TimeWindowedSerde[T](tSerde)

implicit def sessionWindowedSerde[T](implicit tSerde: Serde[T]): WindowedSerdes.SessionWindowedSerde[T] =
  new WindowedSerdes.SessionWindowedSerde[T](tSerde)
where TimeWindowedSerde extends Serdes.WrapperSerde<Windowed<T>>:
static public class TimeWindowedSerde<T> extends Serdes.WrapperSerde<Windowed<T>>
and SessionWindowedSerde extends Serdes.WrapperSerde<Windowed<T>>:
static public class SessionWindowedSerde<T> extends Serdes.WrapperSerde<Windowed<T>>
both of them extend the same type, Serdes.WrapperSerde<Windowed<T>>. And in the line:
implicit val produced: Produced[Windowed[String], Long] = Produced.`with`[Windowed[String], Long]
according to the signature of `with`:
def `with`[K, V](implicit keySerde: Serde[K], valueSerde: Serde[V]): ProducedJ[K, V] =
  ProducedJ.`with`(keySerde, valueSerde)
we expect some implicit value of type Serde[Windowed[String]], and the compiler can't pick one, because both candidates are a Serde[Windowed[String]].
So if you just try to add both of them to the same scope:
implicit val timeWindowedSerde: Serde[Windowed[String]] = Serdes.timeWindowedSerde[String]
implicit val sessionSerde: Serde[Windowed[String]] = Serdes.sessionWindowedSerde[String]
you will see
ambiguous implicit values
again.
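Here is a minimal, Kafka-free sketch of the same failure mode (the Codec trait and value names are illustrative):

trait Codec[T]

object AmbiguityDemo {
  implicit val first: Codec[String] = new Codec[String] {}
  implicit val second: Codec[String] = new Codec[String] {}

  // implicitly[Codec[String]] // error: ambiguous implicit values:
  // both value first and value second match expected type Codec[String]
}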
Conclusion: be careful when importing large batches of implicits; the best practice is to import only the implicits you actually need.
I have a generic method that can accept a tuple of any size; the only constraint is that the first element of the tuple should be of type MyClass.
Something like this:
trait MyTrait[T <: (MyClass, _*)] {
  def getMyClass(x: T): MyClass = x._1
}
I've tried this:
trait MyTrait[T <: (MyClass, _) with (MyClass, _, _) with (MyClass, _, _, _) with ...] {
  def getMyClass(x: T): MyClass = x._1
}
but I get the error: unbound wildcard type
If you want to do this without either boilerplate or runtime reflection, Shapeless is your best bet. You can use the IsComposite type class to put type-level constraints on the first element of a tuple:
import shapeless.ops.tuple.IsComposite
trait MustBeFirst
class MyClass[P <: Product](p: P)(implicit ev: IsComposite[P] { type H = MustBeFirst }) {
  def getMustBeFirst(x: P): MustBeFirst = ev.head(x)
}
And then:
scala> val good2 = (new MustBeFirst {}, "")
good2: (MustBeFirst, String) = ($anon$1@7294acee,"")

scala> val good3 = (new MustBeFirst {}, "", 123)
good3: (MustBeFirst, String, Int) = ($anon$1@6eff9288,"",123)

scala> val good4 = (new MustBeFirst {}, "", 'xyz, 123)
good4: (MustBeFirst, String, Symbol, Int) = ($anon$1@108cdf99,"",'xyz,123)

scala> val bad2 = ("abc", 123)
bad2: (String, Int) = (abc,123)

scala> new MyClass(good2)
res0: MyClass[(MustBeFirst, String)] = MyClass@5297aa76

scala> new MyClass(good3)
res1: MyClass[(MustBeFirst, String, Int)] = MyClass@3f501844

scala> new MyClass(good4)
res2: MyClass[(MustBeFirst, String, Symbol, Int)] = MyClass@24e15478

scala> new MyClass(bad2)
<console>:15: error: could not find implicit value for parameter ev: shapeless.ops.tuple.IsComposite[(String, Int)]{type H = MustBeFirst}
       new MyClass(bad2)
       ^
If you need to use a trait, you can put the ev (for "evidence") requirement inside the definition instead of in the constructor:
trait MyTrait[P <: Product] {
  implicit def ev: IsComposite[P] { type H = MustBeFirst }
}
Now any class instantiating MyTrait will have to provide evidence that P is a tuple with MustBeFirst as its first element.
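For example, a hypothetical implementing class (the name is mine) can take the evidence as an implicit constructor parameter:

class FirstChecked[P <: Product](implicit val ev: IsComposite[P] { type H = MustBeFirst })
  extends MyTrait[P]

val ok = new FirstChecked[(MustBeFirst, String, Int)] // compiles
// val ko = new FirstChecked[(String, Int)]           // does not compile: no matching evidence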
It's a little bit unsafe, but you can use a structural type in this case:
import scala.language.reflectiveCalls // structural access uses run-time reflection

trait MyTrait {
  def getMyClass(x: { def _1: MyClass }): MyClass = x._1
}
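For example, any tuple whose first element is a MyClass is accepted, regardless of arity (a sketch; the object name is mine, and MyClass is assumed to be defined):

object StructuralDemo extends MyTrait {
  val m: MyClass = getMyClass((new MyClass, 1, "abc"))
  // getMyClass(("abc", 1)) // does not compile: _1 is not a MyClass
}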
Scala 2 can't handle a generic tuple of unknown size, because the TupleN classes don't inherit from one another. You can try Shapeless, or the Product utilities in the play-json library.
This is now possible in Scala 3 and very straightforward:
class MyClass

trait MyTrait[T <: MyClass *: _] {
  def getMyClass(x: T): MyClass = x.head
}
The *: infix operator is the type-level equivalent of the +: sequence prepend (or :: for lists). So we essentially require type T to be a tuple whose first member is of type MyClass. Note that the members of generic tuples cannot be retrieved using the usual _1, _2, ... accessors, but should be accessed using list-like methods (head, tail, apply, etc.). More surprisingly, these methods are type-safe, since they carry precise type information.
Demo:
object Test1 extends MyTrait[(MyClass, Int, String)] // Compiles
//object Test2 extends MyTrait[(Int, String)] // Does not compile!
//object Test3 extends MyTrait[EmptyTuple] // Neither does this
val myClass = Test1.getMyClass((new MyClass, 1, "abc"))
summon[myClass.type <:< MyClass] // Compiles
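To see the precise typing in action (a small sketch; the val names are mine):

val t: (MyClass, Int, String) = (new MyClass, 1, "abc")
val h: MyClass = t.head // typed as MyClass, not Any
val i: Int = t(1)       // apply with a literal index is precisely typed too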
Read more about match types and type inference.
You need to bound the type parameter by Product, which gives you productIterator, productArity, and productElement to inspect the value at run time. Here is an example:
case class MyClass()

trait MyTrait[T <: Product] {
  def getMyClass(x: T): Option[MyClass] =
    if (x.productIterator.hasNext && x.productIterator.next().isInstanceOf[MyClass]) {
      // productIterator returns a fresh iterator each time, so next() is the first element again
      Some(x.productIterator.next().asInstanceOf[MyClass])
    } else {
      None
    }
}
case class Test() extends MyTrait[Product]
And you can invoke it like this:
Test().getMyClass((MyClass(), 1,3,4,5))
//res1: Option[MyClass] = Some(MyClass())
Test().getMyClass((1,3,4,5))
//res2: Option[MyClass] = None
Hope this helps you.
If you are looking for a compile-time guarantee, then this is one of the use cases for Shapeless.
You need to add Shapeless to your build.sbt:
libraryDependencies ++= Seq(
  "com.chuusai" %% "shapeless" % "2.3.3"
)
Now, you can use Shapeless to define a typesafe getter which comes with compile-time guarantees:
scala> import shapeless._
// import shapeless._
scala> import ops.tuple.IsComposite
// import ops.tuple.IsComposite
scala> import syntax.std.tuple._
// import syntax.std.tuple._
scala> case class Omg(omg: String)
// defined class Omg
scala> val myStringTuple = ("All is well", 42, "hope")
// myStringTuple: (String, Int, String) = (All is well,42,hope)
scala> val myOmgTuple = (Omg("All is well"), 42, "hope")
// myOmgTuple: (Omg, Int, String) = (Omg(All is well),42,hope)
Now if you want to enrich your tuples with a "first" getter with a specific type then,
scala> implicit class GetterForProduct[B <: Product](b: B) {
| def getFirst[A](implicit comp: IsComposite[B] { type H = A }): A = b.head
| }
// defined class GetterForProduct
scala> val myString = myStringTuple.getFirst[String]
// myString: String = All is well
scala> val myOmgError = myOmgTuple.getFirst[String]
// <console>:24: error: could not find implicit value for parameter comp: shapeless.ops.tuple.IsComposite[(Omg, Int, String)]{type H = String}
// val myOmgError = myOmgTuple.getFirst[String]
// ^
scala> val myOmg = myOmgTuple.getFirst[Omg]
// myOmg: Omg = Omg(All is well)
If you don't need the implicit enrichment and are just looking for a way to "lock" the type in a getter and use it for corresponding types,
scala> trait FirstGetterInProduct[A] {
| def getFirst[B <: Product](b: B)(implicit comp: IsComposite[B] { type H = A }): A = b.head
| }
// defined trait FirstGetterInProduct
scala> object firstGetterInProductForString extends FirstGetterInProduct[String]
// defined object firstGetterInProductForString
scala> object firstGetterInProductForOmg extends FirstGetterInProduct[Omg]
// defined object firstGetterInProductForOmg
// Right tuple with right getter,
scala> val myString = firstGetterInProductForString.getFirst(myStringTuple)
// myString: String = All is well
// will fail at compile time for tuple with different type for first
scala> val myOmgError = firstGetterInProductForString.getFirst(myOmgTuple)
// <console>:23: error: could not find implicit value for parameter comp: shapeless.ops.tuple.IsComposite[(Omg, Int, String)]{type H = String}
// val myOmgError = firstGetterInProductForString.getFirst(myOmgTuple)
// ^
scala> val myOmg = firstGetterInProductForOmg.getFirst(myOmgTuple)
// myOmg: Omg = Omg(All is well)
I wrote a macro that reads class fields:
import scala.language.experimental.macros
import scala.reflect.macros.whitebox
object ArrayLikeFields {
  def extract[T]: Set[String] = macro extractImpl[T]

  def extractImpl[T: c.WeakTypeTag](c: whitebox.Context): c.Expr[Set[String]] = {
    import c.universe._
    val tree = weakTypeOf[T].decls
      .collectFirst {
        case m: MethodSymbol if m.isPrimaryConstructor => m
      }
      .map(y => y.paramLists.headOption.getOrElse(Seq.empty))
      .getOrElse(Seq.empty)
      .map(s => q"${s.name.decodedName.toString}")
    c.Expr[Set[String]] {
      q"""Set(..$tree)"""
    }
  }
}
I'm able to compile and run it for a concrete type:
object Main extends App {
  case class Person(name: String)

  val res: Set[String] = ArrayLikeFields.extract[Person]
}
But I want to use it with generic types, like this:
object Lib {
  implicit class SomeImplicit(s: String) {
    def toOrgJson[T]: JSONObject = {
      val arrayLikeFields: Set[String] = ArrayLikeFields.extract[T]
      //some code, that uses fields, etc
      null
    }
  }
}
Compilation error:
Error:(14, 65) type mismatch; found :
scala.collection.immutable.Set[Nothing] required: Set[String] Note:
Nothing <: String, but trait Set is invariant in type A. You may wish
to investigate a wildcard type such as _ <: String. (SLS 3.2.10)
val arrayLikeFields: Set[String] = ArrayLikeFields.extract[T]
I can't understand this. How can I solve my problem?
Update:
I read scala 2.10.2 calling a 'macro method' with generic type not work about materialization, but I have no instance of the class.
Try the approach of materializing a type class, like in the question linked above:
object Main extends App {
  case class Person(name: String)

  val res: Set[String] = ArrayLikeFields.extract[Person] // Set(name)

  import Lib._
  "abc".toOrgJson[Person] // prints Set(name)
}

object Lib {
  implicit class SomeImplicit(s: String) {
    def toOrgJson[T: ArrayLikeFields.Extract]: JSONObject = {
      val arrayLikeFields: Set[String] = ArrayLikeFields.extract[T]
      //some code, that uses fields, etc
      println(arrayLikeFields) // added
      null
    }
  }
}
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

object ArrayLikeFields {
  def extract[T](implicit extr: Extract[T]): Set[String] = extr()

  trait Extract[T] {
    def apply(): Set[String]
  }

  object Extract {
    implicit def materializeExtract[T]: Extract[T] = macro materializeExtractImpl[T]

    def materializeExtractImpl[T: c.WeakTypeTag](c: whitebox.Context): c.Expr[Extract[T]] = {
      import c.universe._
      val tree = weakTypeOf[T].decls
        .collectFirst {
          case m: MethodSymbol if m.isPrimaryConstructor => m
        }
        .map(y => y.paramLists.headOption.getOrElse(Seq.empty))
        .getOrElse(Seq.empty)
        .map(s => q"${s.name.decodedName.toString}")
      c.Expr[Extract[T]] {
        q"""new ArrayLikeFields.Extract[${weakTypeOf[T]}] {
              override def apply(): _root_.scala.collection.immutable.Set[_root_.java.lang.String] =
                _root_.scala.collection.immutable.Set(..$tree)
            }"""
      }
    }
  }
}
Actually, I don't think you need whitebox macros here; blackbox ones should be enough, so you can replace (c: whitebox.Context) with (c: blackbox.Context).
By the way, the same problem can be solved with Shapeless rather than hand-written macros (Shapeless itself uses macros under the hood); the needed imports are included below:
import shapeless.{::, HList, HNil, LabelledGeneric, Witness}
import shapeless.labelled.FieldType

object Main extends App {
  case class Person(name: String)

  val res: Set[String] = ArrayLikeFields.extract[Person] // Set(name)
}

object ArrayLikeFields {
  def extract[T: Extract]: Set[String] = implicitly[Extract[T]].apply()

  trait Extract[T] {
    def apply(): Set[String]
  }

  object Extract {
    def instance[T](strs: Set[String]): Extract[T] = () => strs

    implicit def genericExtract[T, Repr <: HList](implicit
      labelledGeneric: LabelledGeneric.Aux[T, Repr],
      extract: Extract[Repr]
    ): Extract[T] = instance(extract())

    implicit def hconsExtract[K <: Symbol, V, T <: HList](implicit
      extract: Extract[T],
      witness: Witness.Aux[K]
    ): Extract[FieldType[K, V] :: T] =
      instance(extract() + witness.value.name)

    implicit val hnilExtract: Extract[HNil] = instance(Set())
  }
}
The answer on the linked question, scala 2.10.2 calling a 'macro method' with generic type not work, also applies here.
You are trying to solve a run-time problem with a compile-time macro, which is not possible.
The called method toOrgJson[T] cannot know the concrete type that T represents at compile time, but only gets that information at run-time. Therefore, you will not be able to do any concrete operations on T (such as listing its fields) at compile-time, only at run-time.
You can implement an operation like ArrayLikeFields.extract[T] at run-time using Reflection, see e.g. Get field names list from case class
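For instance, here is a minimal run-time sketch of that idea using scala-reflect (it mirrors the macro body above; it assumes a TypeTag for T is available at the call site, hence the context bound):

import scala.reflect.runtime.universe._

def caseClassFields[T: TypeTag]: Set[String] =
  typeOf[T].decls
    .collectFirst { case m: MethodSymbol if m.isPrimaryConstructor => m }
    .map(_.paramLists.headOption.getOrElse(Nil))
    .getOrElse(Nil)
    .map(_.name.decodedName.toString)
    .toSet

// caseClassFields[Person] // Set(name)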
I don't have a very solid understanding of Macros, but it seems that the compiler does not understand that the return type of the macro function is Set[String].
The following trick worked for me in Scala 2.12.7:
def toOrgJson[T]: JSONObject = {
  val arrayLikeFields: Set[String] = ArrayLikeFields.extract[T].map(identity[String])
  //some code, that uses fields, etc
  null
}
EDIT
Actually, to get a non-empty Set, T needs an upper bound such as T <: Person, and that is not what you wanted.
Leaving the answer here since the code does compile, and it might help someone toward an answer.
I'm working on some libraries for design patterns I run into over and over while programming with Spark. One I'm trying to generalize is where you group a Dataset by some key, do some collation on each group, and then return the original type. A simple example:
case class Counter(id: String, count: Long)

// Let's say I have some Dataset...
val counters: Dataset[Counter]

// The operation I find myself doing quite often:
import sqlContext.implicits._
counters.groupByKey(_.id)
  .reduceGroups((a, b) => Counter(a.id, a.count + b.count))
  .map(_._2)
So to generalize this, I add a new type:
trait KeyedData[K <: Product, T <: KeyedData[K, T] with Product] { self: T =>
  def key: K
  def merge(other: T): T
}
Then I change the type definition of Counter to this:
case class Counter(id: String, count: Long) extends KeyedData[String, Counter] {
  override def key: String = id
  override def merge(other: Counter): Counter = Counter(id, count + other.count)
}
Then I made the following implicit class to add functionality to a Dataset:
implicit class KeyedDataDatasetWrapper[K <: Product, T <: KeyedData[K, T] with Product](ds: Dataset[T]) {
  def collapse(implicit sqlContext: SQLContext): Dataset[T] = {
    import sqlContext.implicits._
    ds.groupByKey(_.key).reduceGroups(_.merge(_)).map(_._2)
  }
}
Every time I compile, though, I get this:
Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
[error] ds.groupByKey(_.key).reduceGroups(_.merge(_)).map(_._2)
[error] ^
[error] one error found
Clearly something is not getting recognized as a Product type, so something must be wrong with my type parameters somewhere, but I'm not sure what. Any ideas?
UPDATE
I changed my implicit class to the following:
implicit class KeyedDataDatasetWrapper[K <: Product : TypeTag,
    T <: KeyedData[K, T] with Product : TypeTag](ds: Dataset[T]) {

  def merge(implicit sqlContext: SQLContext): Dataset[T] = {
    implicit val encK: Encoder[K] = Encoders.product[K]
    implicit val encT: Encoder[T] = Encoders.product[T]
    ds.groupByKey(_.key).reduceGroups(_.merge(_)).map(_._2)
  }
}
However, now when I try to compile this:
val ds: Dataset[Counter] = ...
val merged = ds.merge
I now get this compile error; it seems Dataset[Counter] is not matching Dataset[T] in the implicit class definition:
error: value merge is not a member of org.apache.spark.sql.Dataset[Counter]
[error] ds.merge
[error] ^
Why can't the code below find the imported implicits from MyProducers? In my understanding, this should work fine because the implicits are in scope.
Error:(16, 34) could not find implicit value for evidence parameter of type MyProducers.ListProducer[Int]
val stuffInt:Int = getHead[Int]()
Error:(16, 34) not enough arguments for method getHead: (implicit evidence$2: MyProducers.ListProducer[Int])Int.
Unspecified value parameter evidence$2.
val stuffInt:Int = getHead[Int]()
Error:(18, 43) could not find implicit value for evidence parameter of type MyProducers.ListProducer[String]
val stuffString:String = getHead[String]()
Error:(18, 43) not enough arguments for method getHead: (implicit evidence$2: MyProducers.ListProducer[String])String.
Unspecified value parameter evidence$2.
val stuffString:String = getHead[String]()
Code:
object Resolver {
  import MyProducers._

  def getList[T: ListProducer](): List[T] = implicitly[ListProducer[T]].produceList

  def getHead[T: ListProducer](): T = getList[T]().head

  val stuffInt: Int = getHead[Int]()
  val stuffString: String = getHead[String]()

  val im = ip // this compiles fine, so the implicits are in scope
  val sm = sp
}

object MyProducers {
  trait ListProducer[T] {
    def produceList: List[T]
  }

  object IntProducer extends ListProducer[Int] {
    override def produceList = List(22, 42)
  }
  implicit val ip = IntProducer

  object StringProducer extends ListProducer[String] {
    override def produceList = List("stuff", "Shiraly")
  }
  implicit val sp = StringProducer
}
What is strange is that this code compiles fine:
object Resolver {
  import MyProducers._

  trait ListProducer[T] {
    def produceList: List[T]
  }

  object IntProducer extends ListProducer[Int] {
    override def produceList = List(22, 42)
  }
  implicit val ip = IntProducer

  object StringProducer extends ListProducer[String] {
    override def produceList = List("stuff", "Shiraly")
  }
  implicit val sp = StringProducer

  def getList[T: ListProducer](): List[T] = implicitly[ListProducer[T]].produceList

  def getHead[T: ListProducer](): T = getList[T]().head

  val stuffInt: Int = getHead[Int]()
  val stuffString: String = getHead[String]()

  val im = ip
  val sm = sp
}
Any idea why the first code does not compile while the second does?
Implicit vals and defs need to have type annotations on them. Add those, and the implicits will be found.
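For example, in MyProducers above, a sketch of the fix is simply to annotate the two implicit vals:

implicit val ip: ListProducer[Int] = IntProducer
implicit val sp: ListProducer[String] = StringProducer

Roughly, an implicit definition without a declared type is only usable once the compiler has inferred its type, and that inference may not have happened yet when the implicit search inside Resolver runs; an explicit annotation removes that dependency.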