I am trying to use a custom accumulator in Apache Spark to accumulate values in a set. The result should have type Set[String]. For this I created a custom accumulator:
object SetAccumulatorParam extends AccumulatorParam[Set[String]] {
def addInPlace(r1: mutable.Set[String], r2: mutable.Set[String]): mutable.Set[String] = {
r1 ++= r2
}
def zero(initialValue: mutable.Set[String]): mutable.Set[String] = {
Set()
}
}
Yet I cannot instantiate a variable of this type.
val tags = sc.accumulator(Set(""))(SetAccumulatorParam)
results in the error below. Please help.
required: org.apache.spark.AccumulatorParam[Set[String]]
Adding to Traian's answer, here is a general-case SetAccumulator for Spark 2.x.
import org.apache.spark.util.AccumulatorV2
class SetAccumulator[T](private var _set: Set[T]) extends AccumulatorV2[T, Set[T]] {
  def this() = this(Set.empty[T])
  override def isZero: Boolean = _set.isEmpty
  override def copy(): AccumulatorV2[T, Set[T]] = new SetAccumulator[T](_set)
  override def reset(): Unit = _set = Set.empty[T]
  override def add(v: T): Unit = _set += v // reassigns the var: _set = _set + v
  override def merge(other: AccumulatorV2[T, Set[T]]): Unit = _set ++= other.value
  override def value: Set[T] = _set
}
And you can use it like this:
val accum = new SetAccumulator[String]()
spark.sparkContext.register(accum, "My Accum") // Optional, name it for SparkUI
spark.sparkContext.parallelize(Seq("a", "b", "a", "b", "c")).foreach(s => accum.add(s))
accum.value
Which outputs:
Set[String] = Set(a, b, c)
Update for Spark 1.6:
object StringSetAccumulatorParam extends AccumulatorParam[Set[String]] {
def zero(initialValue: Set[String]): Set[String] = { Set() }
def addInPlace(s1: Set[String], s2: Set[String]): Set[String] = { s1 ++ s2 }
}
val stringSetAccum = sc.accumulator(Set[String]())(StringSetAccumulatorParam)
sc.parallelize(Array("1", "2", "3", "1")).foreach(s => stringSetAccum += Set(s))
stringSetAccum.value.toString
res0: String = Set(2, 3, 1)
In Spark 2.0 you're probably fine with using the existing collectionAccumulator (if you care about distinct values, you can check and add only if they don't exist):
val collAcc = spark.sparkContext.collectionAccumulator[String]("myCollAcc")
collAcc: org.apache.spark.util.CollectionAccumulator[String] = CollectionAccumulator(id: 32154, name: Some(myCollAcc), value: [])
spark.sparkContext.parallelize(Array("1", "2", "3")).foreach(s => collAcc.add(s))
collAcc.value.toString
res0: String = [3, 2, 1]
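If you only need distinct values with collectionAccumulator, one simple approach (a sketch of mine, not from the original answer) is to deduplicate on the driver once the job has finished, since the accumulator's value is only reliably readable there:

import scala.collection.JavaConverters._
// CollectionAccumulator#value returns a java.util.List[String]; convert and dedupe
val distinct: Set[String] = collAcc.value.asScala.toSet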
More info: https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.util.AccumulatorV2
Related
How to make this code work?
sealed abstract class Addable[A] {
def sum(el: Seq[A]): A
}
class MyAddable[A]() extends Addable[A] {
override def sum(el: Seq[A]): A = {
el.sum
}
}
val myvec = Vector(1, 2, 3)
val mylist = List(1, 2, 3)
val inst = new MyAddable
val res0 = inst.sum(mylist) // should return 6
val res1 = inst.sum(myvec) // should return 6
println(s"res0 = $res0")
println(s"res1 = $res1")
I want to pass a generic data type (Vector/List[Int]) and get a sum of its elements using the described signatures and code structure.
At the moment I am getting:
found : immutable.this.List[scala.this.Int]
required: Seq[scala.this.Nothing]
Scalafiddle
The specific error is here:
val inst = new MyAddable
which should be
val inst = new MyAddable[Int]()
MyAddable is generic but you are not specifying a type, so the compiler infers Nothing, hence the error message. Note also that el.sum needs numeric evidence for A, which is why the fixed version below adds a Numeric context bound:
sealed abstract class Addable[A] {
def sum(el: Seq[A]): A
}
class MyAddable[A: Numeric]() extends Addable[A] {
override def sum(el: Seq[A]): A = {
el.sum
}
}
val myvec = Vector(1, 2, 3)
val mylist = List(1, 2, 3)
val inst = new MyAddable[Int]()
val res0 = inst.sum(mylist)
val res1 = inst.sum(myvec)
println(s"res0 = $res0")
println(s"res1 = $res1")
import cats.Semigroup
import cats.implicits._
// Specify a generic Reduce Function. Use Contravariant parameter to support reduce on derived types
trait Reduce[-F[_]] {
def reduce[A](fa:F[A])(f:(A,A) => A):A
}
object Reduce {
implicit val SeqReduce = new Reduce[Seq] {
def reduce[A] (data:Seq[A])(f:(A,A) => A ):A = data reduce f
}
implicit val OptReduce = new Reduce[Option] {
def reduce[A] (data:Option[A])(f:(A,A) => A ):A = data reduce f
}
}
// Generic sum function
def sum[A:Semigroup, F[_]](container: F[A])(implicit red:Reduce[F]):A = {
red.reduce(container)(Semigroup.combine(_,_))
}
val myvec = Vector(1, 2, 3)
val mylist = List (1, 2, 3)
val mymap = Map ( 1 -> "one",
2 -> "two",
3 -> "three"
)
val myopt = Some(1)
val res0 = sum(myvec)
val res1 = sum(mylist)
val res2 = sum(myopt)
println(s"res0 = $res0")
println(s"res1 = $res1")
println(s"res2 = $res2")
This gets a little more complicated for Maps.
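For example, one way to handle Maps (my own sketch, separate from the Reduce machinery above) is to use the Semigroup instance cats already provides for Map, which merges two maps and combines values for colliding keys; with the cats.implicits._ import above already in scope:

val merged = Map(1 -> "one") |+| Map(1 -> "ONE", 2 -> "two")
// merged: Map[Int,String] = Map(1 -> oneONE, 2 -> two)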
I have code where a class can provide modified copies of itself, like so:
case class A(i: Int, s: String) {
def foo(ii: Int): A = copy(i = ii)
def bar(ss: String): A = copy(s = ss)
}
I want to create a function that takes some optional arguments and creates these modified copies using these arguments if they are defined:
def subA(a: A, oi: Option[Int] = None, os: Option[String] = None): A = {
if (oi.isDefined && os.isDefined)
a.foo(oi.get).bar(os.get)
else if (oi.isDefined && !os.isDefined)
a.foo(oi.get)
else if (!oi.isDefined && os.isDefined)
a.bar(os.get)
else
a
}
This is clearly not sustainable: as I add new optional arguments, I have to create cases for every combination of arguments...
I also cannot do:
a.foo(oi.getOrElse(a.i)).bar(os.getOrElse(a.s))
Because in my actual code, if oi or os is not provided, I should NOT run their associated foo and bar functions. In other words, I have no default arguments for oi and os; rather, their existence defines whether I should run certain functions at all.
Current solution, extend the class:
implicit class A_extended(a: A) {
def fooOption(oi: Option[Int]): A = if (oi.isDefined) a.foo(oi.get) else a
def barOption(os: Option[String]): A = if (os.isDefined) a.bar(os.get) else a
}
def subA(a: A, oi: Option[Int] = None, os: Option[String] = None): A = {
a.fooOption(oi).barOption(os)
}
But this problem comes up often and it's a bit tedious to do this constantly. Is there something like:
// oi: Option[Int], foo: Int => A
oi.ifDefinedThen(a.foo(_), a) // returns a.foo(oi.get) if oi is not None, else just a
Or should I just extend Option to provide this functionality?
Use fold on Option, whose signature is: final def fold[B](ifEmpty: => B)(f: A => B): B
def subA(a: A, oi: Option[Int] = None, os: Option[String] = None): A = {
val oia = oi.fold(a)(a.foo)
os.fold(oia)(oia.bar)
}
Scala REPL
scala> def subA(a: A, oi: Option[Int] = None, os: Option[String] = None): A = {
val oia = oi.fold(a)(a.foo)
os.fold(oia)(oia.bar)
}
defined function subA
scala> subA(A(1, "bow"), Some(2), Some("cow"))
res10: A = A(2, "cow")
or
Use pattern matching to deal with options elegantly: create a tuple of options and then use pattern matching to extract the inner values.
val a = Some(1)
val b = Some("some string")
(a, b) match {
  case (Some(x), Some(y)) => // both defined
  case (Some(x), _)       => // only a defined
  case (_, Some(y))       => // only b defined
  case (_, _)             => // neither defined
}
Well... You can use reflection to create arbitrary copiers and even updaters for your case classes.
The difference is that an updater updates the case class instance in place, while a copier creates a new copy with updated fields.
An implementation of an updater can be done as below,
import scala.language.existentials
import scala.reflect.runtime.{universe => ru}
def copyInstance[C: scala.reflect.ClassTag](instance: C, mapOfUpdates: Map[String, T forSome {type T}]): C = {
val runtimeMirror = ru.runtimeMirror(instance.getClass.getClassLoader)
val instanceMirror = runtimeMirror.reflect(instance)
val tpe = instanceMirror.symbol.toType
val copyMethod = tpe.decl(ru.TermName("copy")).asMethod
val copyMethodInstance = instanceMirror.reflectMethod(copyMethod)
val updates = tpe.members
.filter(member => member.asTerm.isCaseAccessor && member.asTerm.isMethod)
.map(member => {
val term = member.asTerm
//check if we need to update it or use the instance value
val updatedValue = mapOfUpdates.getOrElse(
key = term.name.toString,
default = instanceMirror.reflectField(term).get
)
updatedValue
}).toSeq.reverse
val copyOfInstance = copyMethodInstance(updates: _*).asInstanceOf[C]
copyOfInstance
}
def updateInstance[C: scala.reflect.ClassTag](instance: C, mapOfUpdates: Map[String, T forSome {type T}]): C = {
val runtimeMirror = ru.runtimeMirror(instance.getClass.getClassLoader)
val instanceMirror = runtimeMirror.reflect(instance)
val tpe = instanceMirror.symbol.toType
tpe.members.foreach(member => {
val term = member.asTerm
term.isCaseAccessor && term.isMethod match {
case true =>
// it is a case class accessor, check if we need to update it
mapOfUpdates.get(term.name.toString).foreach(updatedValue => {
val fieldMirror = instanceMirror.reflectField(term.accessed.asTerm)
// field mirrors can even update immutable fields!!
fieldMirror.set(updatedValue)
})
case false => // Not a case class accessor, do nothing
}
})
instance
}
And since you wanted to use Options to copy, here is a copyUsingOptions that you define once and use with all case classes:
def copyUsingOptions[C: scala.reflect.ClassTag](instance: C, listOfUpdateOptions: List[Option[T forSome {type T}]]): C = {
val runtimeMirror = ru.runtimeMirror(instance.getClass.getClassLoader)
val instanceMirror = runtimeMirror.reflect(instance)
val tpe = instanceMirror.symbol.toType
val copyMethod = tpe.decl(ru.TermName("copy")).asMethod
val copyMethodInstance = instanceMirror.reflectMethod(copyMethod)
val updates = tpe.members.toSeq
.filter(member => member.asTerm.isCaseAccessor && member.asTerm.isMethod)
.reverse
.zipWithIndex
.map({ case (member, index) =>
listOfUpdateOptions(index).getOrElse(instanceMirror.reflectField(member.asTerm).get)
})
val copyOfInstance = copyMethodInstance(updates: _*).asInstanceOf[C]
copyOfInstance
}
Now you can use updateInstance or copyInstance to update or copy instances of any case class:
case class Demo(id: Int, name: String, alliance: Option[String], power: Double, lat: Double, long: Double)
// defined class Demo
val d1 = Demo(1, "player_1", None, 15.5, 78.404, 71.404)
// d1: Demo = Demo(1,player_1,None,15.5,78.404,71.404)
val d1WithAlliance = copyInstance(d1, Map("alliance" -> Some("Empires")))
// d1WithAlliance: Demo = Demo(1,player_1,Some(Empires),15.5,78.404,71.404)
val d2 = copyInstance(d1, Map("id" -> 2, "name" -> "player_2"))
// d2: Demo = Demo(2,player_2,None,15.5,78.404,71.404)
val d3 = copyUsingOptions(
d1, List(Some(3),
Some("player_3"), Some(Some("Vikings")), None, None, None)
)
// d3: Demo = Demo(3,player_3,Some(Vikings),15.5,78.404,71.404)
// Or you can update instance using updateInstance
val d4 = updateInstance(d1, Map("id" -> 4, "name" -> "player_4"))
// d4: Demo = Demo(4,player_4,None,15.5,78.404,71.404)
d1
// d1: Demo = Demo(4,player_4,None,15.5,78.404,71.404) (updateInstance modified d1 in place)
Another option (no pun intended, heh) would be to have foo and bar themselves take and fold over Options:
case class A(i: Int, s: String) {
def foo(optI: Option[Int]): A =
optI.fold(this)(ii => copy(i = ii))
def bar(optS: Option[String]): A =
optS.fold(this)(ss => copy(s = ss))
}
Then, subA can be minimal:
object A {
def subA(
a: A,
optI: Option[Int] = None,
optS: Option[String] = None): A =
a foo optI bar optS
}
You can also overload foo and bar to take plain Int and String as well if you have to maintain the API; in that case make the Option-taking methods call out to their corresponding non-Option-taking ones.
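A sketch of that overloading idea (reusing the names from this answer; untested):

case class A(i: Int, s: String) {
  def foo(ii: Int): A = copy(i = ii)
  def foo(optI: Option[Int]): A = optI.fold(this)(ii => foo(ii)) // delegates to the plain overload
  def bar(ss: String): A = copy(s = ss)
  def bar(optS: Option[String]): A = optS.fold(this)(ss => bar(ss))
}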
What is the simplest way to convert a java.util.IdentityHashMap[A,B] into a subtype of scala.immutable.Map[A,B]? I need to keep keys separate unless they are eq.
Here's what I've tried so far:
scala> case class Example()
scala> val m = new java.util.IdentityHashMap[Example, String]()
scala> m.put(Example(), "first!")
scala> m.put(Example(), "second!")
scala> m.asScala // got a mutable Scala equivalent OK
res14: scala.collection.mutable.Map[Example,String] = Map(Example() -> first!, Example() -> second!)
scala> m.asScala.toMap // doesn't work, since toMap() removes duplicate keys (testing with ==)
res15: scala.collection.immutable.Map[Example,String] = Map(Example() -> second!)
Here's a simple implementation of an identity map in Scala. In usage, it should be similar to the standard immutable Map.
Example usage:
val im = IdentityMap(
new String("stuff") -> 5,
new String("stuff") -> 10)
println(im) // IdentityMap(stuff -> 5, stuff -> 10)
Your case:
import scala.collection.JavaConverters._
import java.{util => ju}
val javaIdentityMap: ju.IdentityHashMap[String, Int] = ???
val scalaIdentityMap = IdentityMap.empty[String,Int] ++ javaIdentityMap.asScala
Implementation itself (for performance reasons, there may be some more methods that need to be overridden):
import scala.collection.generic.ImmutableMapFactory
import scala.collection.immutable.MapLike
import IdentityMap.{Wrapper, wrap}
class IdentityMap[A, +B] private(underlying: Map[Wrapper[A], B])
extends Map[A, B] with MapLike[A, B, IdentityMap[A, B]] {
def +[B1 >: B](kv: (A, B1)) =
new IdentityMap(underlying + ((wrap(kv._1), kv._2)))
def -(key: A) =
new IdentityMap(underlying - wrap(key))
def iterator =
underlying.iterator.map {
case (kw, v) => (kw.value, v)
}
def get(key: A) =
underlying.get(wrap(key))
override def size: Int =
underlying.size
override def empty =
new IdentityMap(underlying.empty)
override def stringPrefix =
"IdentityMap"
}
object IdentityMap extends ImmutableMapFactory[IdentityMap] {
def empty[A, B] =
new IdentityMap(Map.empty)
private class Wrapper[A](val value: A) {
override def toString: String =
value.toString
override def equals(other: Any) = other match {
case otherWrapper: Wrapper[_] =>
value.asInstanceOf[AnyRef] eq otherWrapper.value.asInstanceOf[AnyRef]
case _ => false
}
override def hashCode =
System.identityHashCode(value)
}
private def wrap[A](key: A) =
new Wrapper(key)
}
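As a quick sanity check against the original question (a sketch): the two Example() instances below are == but not eq, so the IdentityMap keeps both entries:

case class Example()
val im = IdentityMap(Example() -> "first!", Example() -> "second!")
im.size // 2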
One way to handle this would be to change what equality means for the class, e.g.
scala> case class Example() {
override def equals( that:Any ) = that match {
case that:AnyRef => this eq that
case _ => false
}
}
defined class Example
scala> val m = new java.util.IdentityHashMap[Example, String]()
m: java.util.IdentityHashMap[Example,String] = {}
scala> m.put(Example(), "first!")
res1: String = null
scala> m.put(Example(), "second!")
res2: String = null
scala> import scala.collection.JavaConverters._
import scala.collection.JavaConverters._
scala> m.asScala
res3: scala.collection.mutable.Map[Example,String] = Map(Example() -> second!, Example() -> first!)
scala> m.asScala.toMap
res4: scala.collection.immutable.Map[Example,String] = Map(Example() -> second!, Example() -> first!)
Or if you don't want to change equality for the class, you could make a wrapper.
Of course, this won't perform as well as a Map that uses eq instead of ==; it might be worth asking for one...
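A sketch of that wrapper idea (along the lines of the Wrapper class inside the IdentityMap answer above; the Ref name is mine):

final class Ref[A <: AnyRef](val value: A) {
  override def equals(other: Any) = other match {
    case r: Ref[_] => value eq r.value
    case _         => false
  }
  override def hashCode = System.identityHashCode(value)
}

// keys are now distinct unless the wrapped values are eq
val wrapped = m.asScala.map { case (k, v) => new Ref(k) -> v }.toMap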
I cannot find a way to define a MongoRecord with a Map[String, String] field inside it in Lift's MongoRecord.
The Lift documentation says:
All standard Record Fields are supported. There is also support for Mongo specific types; ObjectId, UUID, Pattern, List, and Map.
How can I define Map and List fields?
I defined a BsonRecordMapField:
// imports assumed for Lift 2.x mongodb-record and the MongoDB Java driver
import java.util.HashMap
import com.mongodb.{BasicDBObject, DBObject}
import net.liftweb.common.{Box, Empty, Full}
import net.liftweb.json.JsonAST._
import net.liftweb.mongodb.record.{BsonMetaRecord, BsonRecord}
import net.liftweb.mongodb.record.field.MongoMapField
import net.liftweb.record.FieldHelpers

class BsonRecordMapField[OwnerType <: BsonRecord[OwnerType], SubRecordType <: BsonRecord[SubRecordType]]
(rec: OwnerType, valueMeta: BsonMetaRecord[SubRecordType])(implicit mf: Manifest[SubRecordType])
extends MongoMapField[OwnerType, SubRecordType](rec: OwnerType) {
import scala.collection.JavaConversions._
override def asDBObject: DBObject = {
val javaMap = new HashMap[String, DBObject]()
for ((key, element) <- value) {
javaMap.put(key.asInstanceOf[String], element.asDBObject)
}
val dbl = new BasicDBObject(javaMap)
dbl
}
override def setFromDBObject(dbo: DBObject): Box[Map[String, SubRecordType]] = {
val mapResult: Map[String, SubRecordType] = (for ((key, dboEl) <- dbo.toMap.toSeq) yield (key.asInstanceOf[String], valueMeta.fromDBObject(dboEl.asInstanceOf[DBObject]))).toMap
setBox(Full(mapResult))
}
override def asJValue = {
val fieldList = (for ((key, elem) <- value) yield JField(key, elem.asJValue)).toList
JObject(fieldList)
}
override def setFromJValue(jvalue: JValue) = jvalue match {
case JNothing | JNull if optional_? => setBox(Empty)
case JObject(fieldList) => val retrievedMap = fieldList.map {
field =>
val key = field.name
val valRetrieved = valueMeta.fromJValue(field.value) openOr valueMeta.createRecord
(key, valRetrieved)
}.toMap
setBox(Full(retrievedMap))
case other => setBox(FieldHelpers.expectedA("JObject", other))
}
}
This is the implicit query for Rogue:
class BsonRecordMapQueryField[M <: BsonRecord[M], B <: BsonRecord[B]](val field: BsonRecordMapField[M, B])(implicit mf: Manifest[B]) {
def at(key: String): BsonRecordField[M, B] = {
val listBox = field.setFromJValue(JObject(List(JField("notExisting", JInt(0)))))
val rec = listBox.open_!.head._2
new BsonRecordField[M, B](field.owner, rec.meta)(mf) {
override def name = field.name + "." + key
}
}
}
object ExtendedRogue extends Rogue {
implicit def bsonRecordMapFieldToBsonRecordMapQueryField[M <: BsonRecord[M], B <: BsonRecord[B]](f: BsonRecordMapField[M, B])(implicit mf: Manifest[B]): BsonRecordMapQueryField[M, B] = new BsonRecordMapQueryField[M, B](f) (mf)
}
You can now use the at operator on the map field.
What about MongoMapField?
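For a plain Map[String, String] (rather than a map of sub-records), MongoMapField should indeed work directly. A minimal sketch (untested; the Item record and its field names are mine, the field types are Lift's standard ones from net.liftweb.mongodb.record.field):

import net.liftweb.mongodb.record.{MongoMetaRecord, MongoRecord}
import net.liftweb.mongodb.record.field.{MongoListField, MongoMapField, ObjectIdPk}

class Item extends MongoRecord[Item] with ObjectIdPk[Item] {
  def meta = Item
  object tags extends MongoListField[Item, String](this)  // List[String] field
  object attrs extends MongoMapField[Item, String](this)  // Map[String, String] field
}
object Item extends Item with MongoMetaRecord[Item]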
I'm looking to create a class that is basically a collection with an extra field. However, I keep running into problems and am wondering what the best way of implementing this is. I've tried to follow the pattern given in the Scala book. E.g.
import scala.collection.IndexedSeqLike
import scala.collection.mutable.Builder
import scala.collection.generic.CanBuildFrom
import scala.collection.mutable.ArrayBuffer
class FieldSequence[FT,ST](val field: FT, seq: IndexedSeq[ST] = Vector())
extends IndexedSeq[ST] with IndexedSeqLike[ST,FieldSequence[FT,ST]] {
def apply(index: Int): ST = seq(index)
def length = seq.length
override def newBuilder: Builder[ST,FieldSequence[FT,ST]]
= FieldSequence.newBuilder[FT,ST](field)
}
object FieldSequence {
def fromSeq[FT,ST](field: FT)(buf: IndexedSeq[ST])
= new FieldSequence(field, buf)
def newBuilder[FT,ST](field: FT): Builder[ST,FieldSequence[FT,ST]]
= new ArrayBuffer mapResult(fromSeq(field))
implicit def canBuildFrom[FT,ST]:
CanBuildFrom[FieldSequence[FT,ST], ST, FieldSequence[FT,ST]] =
new CanBuildFrom[FieldSequence[FT,ST], ST, FieldSequence[FT,ST]] {
def apply(): Builder[ST,FieldSequence[FT,ST]]
= newBuilder[FT,ST]( _ ) // What goes here?
def apply(from: FieldSequence[FT,ST]): Builder[ST,FieldSequence[FT,ST]]
= from.newBuilder
}
}
The problem is the CanBuildFrom that is implicitly defined needs an apply method with no arguments. But in these circumstances this method is meaningless, as a field (of type FT) is needed to construct a FieldSequence. In fact, it should be impossible to construct a FieldSequence, simply from a sequence of type ST. Is the best I can do to throw an exception here?
Then your class doesn't fulfill the requirements to be a Seq, and methods like flatMap (and hence for-comprehensions) can't work for it.
I'm not sure I agree with Landei about flatMap and map. If you replace apply() with a version that throws an exception, like this, most of the operations should work.
def apply(): Builder[ST,FieldSequence[FT,ST]] = sys.error("unsupported")
From what I can see in TraversableLike, map, flatMap, and most other methods use the apply(repr) version, so for-comprehensions seemingly work. It also feels like it should follow the monad laws (the field is just carried across).
Given the code you have, you can do this:
scala> val fs = FieldSequence.fromSeq("str")(Vector(1,2))
fs: FieldSequence[java.lang.String,Int] = FieldSequence(1, 2)
scala> fs.map(1 + _)
res3: FieldSequence[java.lang.String,Int] = FieldSequence(2, 3)
scala> val fs2 = FieldSequence.fromSeq("str1")(Vector(10,20))
fs2: FieldSequence[java.lang.String,Int] = FieldSequence(10, 20)
scala> for (x <- fs if x > 0; y <- fs2) yield (x + y)
res5: FieldSequence[java.lang.String,Int] = FieldSequence(11, 21, 12, 22)
What doesn't work is the following:
scala> fs.map(_ + "!")
// does not return a FieldSequence
scala> List(1,2).map(1 + _)(collection.breakOut): FieldSequence[String, Int]
java.lang.RuntimeException: unsupported
// this is where the apply() is used
For breakOut to work you would need to implement the apply() method. I suspect you could generate a builder with some default value for field, e.g. def apply() = newBuilder[FT, ST](getDefault), with some implementation of getDefault that makes sense for your use case.
To make fs.map(_ + "!") preserve the type, you need to modify your signature and implementation so that the compiler can find a CanBuildFrom[FieldSequence[String, Int], String, FieldSequence[String, String]]:
implicit def canBuildFrom[FT,ST_FROM,ST]:
CanBuildFrom[FieldSequence[FT,ST_FROM], ST, FieldSequence[FT,ST]] =
new CanBuildFrom[FieldSequence[FT,ST_FROM], ST, FieldSequence[FT,ST]] {
def apply(): Builder[ST,FieldSequence[FT,ST]]
= sys.error("unsupported")
def apply(from: FieldSequence[FT,ST_FROM]): Builder[ST,FieldSequence[FT,ST]]
= newBuilder[FT, ST](from.field)
}
In the end, my answer was very similar to the one in a previous question. The differences between that question, my original one, and this answer are slight, but the upshot is to allow anything that has a sequence to be treated as a sequence.
import scala.collection.SeqLike
import scala.collection.mutable.Builder
import scala.collection.mutable.ArrayBuffer
import scala.collection.generic.CanBuildFrom
trait SeqAdapter[+A, Repr[+X] <: SeqAdapter[X,Repr]]
extends Seq[A] with SeqLike[A,Repr[A]] {
val underlyingSeq: Seq[A]
def create[B](seq: Seq[B]): Repr[B]
def apply(index: Int) = underlyingSeq(index)
def length = underlyingSeq.length
def iterator = underlyingSeq.iterator
override protected[this] def newBuilder: Builder[A,Repr[A]] = {
val sac = new SeqAdapterCompanion[Repr] {
def createDefault[B](seq: Seq[B]) = create(seq)
}
sac.newBuilder(create)
}
}
trait SeqAdapterCompanion[Repr[+X] <: SeqAdapter[X,Repr]] {
def createDefault[A](seq: Seq[A]): Repr[A]
def fromSeq[A](creator: (Seq[A]) => Repr[A])(seq: Seq[A]) = creator(seq)
def newBuilder[A](creator: (Seq[A]) => Repr[A]): Builder[A,Repr[A]] =
new ArrayBuffer mapResult fromSeq(creator)
implicit def canBuildFrom[A,B]: CanBuildFrom[Repr[A],B,Repr[B]] =
new CanBuildFrom[Repr[A],B,Repr[B]] {
def apply(): Builder[B,Repr[B]] = newBuilder(createDefault)
def apply(from: Repr[A]) = newBuilder(from.create)
}
}
This fixes all the problems huynhjl brought up. For my original problem, to have a field and a sequence treated as a sequence, a simple class will now do.
trait Field[FT] {
val defaultValue: FT
class FieldSeq[+ST](val field: FT, val underlyingSeq: Seq[ST] = Vector())
extends SeqAdapter[ST,FieldSeq] {
def create[B](seq: Seq[B]) = new FieldSeq[B](field, seq)
}
object FieldSeq extends SeqAdapterCompanion[FieldSeq] {
def createDefault[A](seq: Seq[A]): FieldSeq[A] =
new FieldSeq[A](defaultValue, seq)
override implicit def canBuildFrom[A,B] = super.canBuildFrom[A,B]
}
}
This can be tested like so:
val StringField = new Field[String] { val defaultValue = "Default Value" }
StringField: java.lang.Object with Field[String] = $anon$1@57f5de73
val fs = new StringField.FieldSeq[Int]("str", Vector(1,2))
val fsfield = fs.field
fs: StringField.FieldSeq[Int] = (1, 2)
fsfield: String = str
val fm = fs.map(1 + _)
val fmfield = fm.field
fm: StringField.FieldSeq[Int] = (2, 3)
fmfield: String = str
val fs2 = new StringField.FieldSeq[Int]("str1", Vector(10, 20))
val fs2field = fs2.field
fs2: StringField.FieldSeq[Int] = (10, 20)
fs2field: String = str1
val ffor = for (x <- fs if x > 0; y <- fs2) yield (x + y)
val fforfield = ffor.field
ffor: StringField.FieldSeq[Int] = (11, 21, 12, 22)
fforfield: String = str
val smap = fs.map(_ + "!")
val smapfield = smap.field
smap: StringField.FieldSeq[String] = (1!, 2!)
smapfield: String = str
val break = List(1,2).map(1 + _)(collection.breakOut): StringField.FieldSeq[Int]
val breakfield = break.field
break: StringField.FieldSeq[Int] = (2, 3)
breakfield: String = Default Value
val x: StringField.FieldSeq[Any] = fs
val xfield = x.field
x: StringField.FieldSeq[Any] = (1, 2)
xfield: String = str