Scala: subclassing with factories

Let's say I've got two traits, one of them being a factory for another:
trait BaseT {
  val name: String
  def introduceYourself() = println("Hi, I am " + name)
  // some other members ...
}

trait BaseTBuilder {
  def build: BaseT
}
Now, I want to extend BaseT:
trait ExtendedT extends BaseT {
  val someNewCoolField: Int
  override def introduceYourself() = {
    super.introduceYourself()
    println(someNewCoolField)
  }
  // some other extra fields
}
Let's say I know how to initialize the new fields, but I'd like to use BaseTBuilder to initialize the members inherited from BaseT. Is it possible to create a trait that can somehow instantiate ExtendedT? This approach obviously fails:
trait ExtendedTBuilder { self: BaseTBuilder =>
  def build: ExtendedT = {
    val base = self.build
    val extended = base.asInstanceOf[ExtendedT] // this cannot work
    extended.someNewCoolField = 4 // this cannot work either, assignment to val
    extended
  }
  def buildDifferently: ExtendedT = {
    new ExtendedT(4) // this fails, we don't know anything about constructors of ExtendedT
  }
  def build3: ExtendedT = {
    self.build with { someNewCoolField = 5 } // that would be cool, but it cannot work either
  }
}
I'd like to have a set of traits (or objects) such that, when someone supplies a concrete implementation of BaseT and BaseTBuilder, I can instantiate ExtendedT by writing:
val extendedBuilder = new ConcreteBaseTBuilder with ExtendedTBuilder
val e: ExtendedT = extendedBuilder.build
ExtendedT could contain a field of type BaseT, but then it would require manually proxying all the necessary methods and fields, which is, in my opinion, a violation of the DRY principle. How can I solve that?

How about creating the ExtendedT instance directly in your ExtendedTBuilder?
trait ExtendedTBuilder { self: BaseTBuilder =>
  def build: ExtendedT = {
    new ExtendedT {
      val name = "some name" // BaseT's abstract `name` also needs a value here
      val someNewCoolField: Int = 3
    }
  }
}
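For completeness, here is a hedged end-to-end sketch of how the pieces could fit together with the BaseT, ExtendedT and BaseTBuilder definitions from the question. ConcreteBaseTBuilder and buildExtended are illustrative names, not part of the answer above; the extended build method is given its own name so it can call the base builder without clashing with it:
trait ExtendedTBuilder { self: BaseTBuilder =>
  def buildExtended: ExtendedT = {
    val base = self.build // reuse whatever the base builder produces
    new ExtendedT {
      val name = base.name // copy the members built by the base builder
      val someNewCoolField: Int = 4
    }
  }
}

class ConcreteBaseTBuilder extends BaseTBuilder {
  def build: BaseT = new BaseT { val name = "concrete" }
}

object BuilderDemo extends App {
  val extendedBuilder = new ConcreteBaseTBuilder with ExtendedTBuilder
  val e: ExtendedT = extendedBuilder.buildExtended
  e.introduceYourself() // prints "Hi, I am concrete" and then 4
}
Keeping a separate method name sidesteps the clash with the build inherited from the concrete base builder; if you insist on the name build, the usual alternative is the stackable-trait pattern (abstract override with super.build).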


Is there any way to rewrite the below code using Scala value class or other concept?

I need to write two functions that return the output format and the output index for a file conversion. As part of this, I wrote a TransformSettings class for these methods and set the default values. In the Transformer class, I create a new TransformSettings object to get the default values for each job run. I also have another class, ParquetTransformer, that extends Transformer and needs to change these default values. So I implemented it like below.
class TransformSettings {
  def getOuputFormat: String = {
    "orc"
  }
  def getOuputIndex(table: AWSGlueDDL.Table): Option[String] = {
    table.StorageDescriptor.SerdeInfo.Parameters.get("orc.column.index.access")
  }
}
class Transformer {
  def getTransformSettings: TransformSettings = {
    new TransformSettings
  }
  def posttransform(table: AWSGlueDDL.Table): DataFrame = {
    val indexAccess = getTransformSettings.getOuputIndex(table: AWSGlueDDL.Table)
    ........
  }
}
class ParquetTransformer extends Transformer {
  override def getTransformSettings: TransformSettings = {
    val transformSettings = new TransformSettings {
      override def getOuputFormat: String = {
        "parquet"
      }
      override def getOuputIndex(table: AWSGlueDDL.Table): Option[String] = {
        table.StorageDescriptor.SerdeInfo.Parameters.get("parquet.column.index.access")
      }
    }
    transformSettings
  }
}
Is there a way to avoid creating a brand new TransformSettings object in the Transformer class every time this is called?
Also, is there a way to rewrite the code using a Scala value class?
As @Dima proposed in the comments, try to make TransformSettings a field / constructor parameter (a val) of the Transformer class and instantiate it outside:
class TransformSettings {
  def getOuputFormat: String = {
    "orc"
  }
  def getOuputIndex(table: AWSGlueDDL.Table): Option[String] = {
    table.StorageDescriptor.SerdeInfo.Parameters.get("orc.column.index.access")
  }
}

class Transformer(val transformSettings: TransformSettings) {
  def posttransform(table: AWSGlueDDL.Table): DataFrame = {
    val indexAccess = transformSettings.getOuputIndex(table: AWSGlueDDL.Table)
    ???
  }
}
val parquetTransformSettings = new TransformSettings {
  override def getOuputFormat: String = {
    "parquet"
  }
  override def getOuputIndex(table: AWSGlueDDL.Table): Option[String] = {
    table.StorageDescriptor.SerdeInfo.Parameters.get("parquet.column.index.access")
  }
}

class ParquetTransformer extends Transformer(parquetTransformSettings)
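For illustration, a short usage sketch of the constructor-injected settings above (the comments show the values the getters return in this example):
val orcTransformer = new Transformer(new TransformSettings)
println(orcTransformer.transformSettings.getOuputFormat)     // "orc"

val parquetTransformer = new ParquetTransformer
println(parquetTransformer.transformSettings.getOuputFormat) // "parquet"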
You don't seem to need value classes (... extends AnyVal) here. They are about avoiding boxing, not about life-cycle management. TransformSettings and Transformer can't be value classes anyway, because they are not final (you're extending them in class ParquetTransformer extends Transformer... and new TransformSettings { ... }). By the way, value classes have many limitations:
https://failex.blogspot.com/2017/04/the-high-cost-of-anyval-subclasses.html
https://github.com/scala/bug/issues/12271
Besides value classes, there is the scala-newtype library in Scala 2 and there are opaque types in Scala 3.
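For reference, a minimal Scala 3 sketch of an opaque type wrapping an output-format string; the names here (Formats, OutputFormat, opaqueDemo) are illustrative, not part of the question:
object Formats:
  opaque type OutputFormat = String

  object OutputFormat:
    def apply(s: String): OutputFormat = s

  extension (f: OutputFormat)
    def value: String = f

@main def opaqueDemo(): Unit =
  import Formats.*
  val fmt: OutputFormat = OutputFormat("parquet")
  println(fmt.value) // prints "parquet"; no wrapper object is allocated at runtime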

Scala inner class typing

Suppose I have the following:
class Deck[+T] {
  class Card(value: T)
  class Pile(val cards: List[Card]) {
    val deck = Deck.this
    def shuffle(shuffler: Shuffler) = shuffler.shuffle(this)
  }
}

trait Shuffler {
  def shuffle[T](pile: Deck[T]#Pile): pile.type
}

object Shuffler {
  def randomShuffler(r: Random): Shuffler = new Shuffler {
    override def shuffle[T](pile: Deck[T]#Pile): pile.deck.Pile = {
      new pile.deck.Pile(r.shuffle(pile.cards))
    }
  }
}
Is it possible to do the same thing without having the val deck declaration in Pile? Also, is it possible to do the same thing without the T declaration in shuffle()?
I had been playing around with things such as pile: x.Pile forSome {val x: Deck[_]}, but they don't seem to compile due to typing issues (read: me not fully understanding semantics therein), and I'm trying to avoid rewriting Shuffler to, say, work with raw lists instead (how do I express that, anyways? List[Deck[T]#Card] is not quite there, since I want lists of Cards from the same Deck).
Is it possible to do the same thing without having the val deck declaration in Pile?
Not if we want to enforce a value-dependent type, which you seem to want (e.g. two piles of Int cards only being compatible if they refer to the same deck value).
Also, is it possible to do the same thing without the T declaration in shuffle()?
You can move type parameters to type members, which can sometimes save you from having to specify them when they are only used internally.
Here is an idea:
object Pile {
  def apply(deck0: Deck): Pile { type D = deck0.type } = new Pile {
    val deck = deck0
    type D = deck0.type
    val cards = deck.cards.toList
  }
}

trait Pile { self =>
  type D <: Deck
  type Self = Pile { type D = self.deck.type }
  val deck: D
  def cards: List[deck.Card]
  def shuffle(shuffler: Shuffler): Self = shuffler.shuffle(this)
}

object Deck {
  def apply[A1](values: Set[A1]): Deck { type A = A1 } = new Deck {
    type A = A1
    val cards = values.map(Card(_))
  }
}

trait Deck {
  type A
  case class Card(value: A)
  def cards: Set[Card]
}

trait Shuffler {
  def shuffle(pile: Pile): pile.Self
}

object Shuffler {
  def randomShuffler(r: util.Random): Shuffler = new Shuffler {
    def shuffle(pile: Pile): pile.Self = new Pile {
      type D = pile.deck.type
      val deck = pile.deck
      val cards = r.shuffle(pile.cards)
    }
  }
}
Test:
val deck = Deck(Set(1 to 10: _*))
val pile0 = Pile(deck)
pile0.cards
val sh = Shuffler.randomShuffler(util.Random)
val pile1 = pile0.shuffle(sh)
pile1.cards
As you can see, enforcing value-dependent types is not trivial, so the question is whether you really need them, or whether a simple type parameter for A is enough. For example, the above doesn't prevent you from accidentally putting the same card twice into a pile.
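For contrast, a minimal hedged sketch of the simpler route with a plain type parameter and no value-dependent deck type (SimpleDeck and SimplePile are illustrative names, not from the question):
import scala.util.Random

case class SimpleDeck[A](cards: Set[A])

case class SimplePile[A](cards: List[A]) {
  def shuffle(r: Random): SimplePile[A] = SimplePile(r.shuffle(cards))
}

val simpleDeck = SimpleDeck((1 to 10).toSet)
val simplePile = SimplePile(simpleDeck.cards.toList)
println(simplePile.shuffle(new Random).cards) // the ten values in a random order
This version is easier to work with, but nothing stops you from mixing cards from different decks into one pile, which is exactly what the value-dependent encoding above rules out.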

Scala Reflection Conundrum: Can you explain these weird results?

I wrote some Scala code, using reflection, that returns all vals in an object that are of a certain type. Below are three versions of this code. One of them works but is ugly. Two attempts to improve it don't work, in very different ways. Can you explain why?
First, the code:
import scala.reflect.runtime._
import scala.util.Try

trait ScopeBase[T] {
  // this version tries to generalize the type. The only difference
  // from the working version is [T] instead of [String]
  def enumerateBase[S: universe.TypeTag]: Seq[T] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[S].decls.map {
      decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
    }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
  }
}
trait ScopeString extends ScopeBase[String] {
  // This version works but requires passing the val type
  // (String, in this example) explicitly. I don't want to
  // duplicate the code for different val types.
  def enumerate[S: universe.TypeTag]: Seq[String] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[S].decls.map {
      decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
    }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
  }

  // This version tries to avoid passing the object's type
  // as the [S] type parameter. After all, the method is called
  // on the object itself; so why pass the type?
  def enumerateThis: Seq[String] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[this.type].decls.map {
      decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
    }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
  }
}
// The working example
object Test1 extends ScopeString {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerate[Test1.type]
}

// This shows how the attempt to generalize the type doesn't work
object Test2 extends ScopeString {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerateBase[Test2.type]
}

// This shows how the attempt to drop the object's type doesn't work
object Test3 extends ScopeString {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerateThis
}

val test1 = Test1.fields // List(test)
val test2 = Test2.fields // List(13, test)
val test3 = Test3.fields // List()
The "enumerate" method does work. However, as you can see from the Test1 example, it requires passing the object's own type (Test1.type) as a parameter, which should not have been necessary. The "enumerateThis" method tries to avoid that but fails, producing an empty list. The "enumerateBase" method attempts to generalize the "enumerate" code by passing the val type as a parameter. But it fails, too, producing the list of all vals, not just those of a certain type.
Any idea what's going on?
The problem in your generic implementation is the loss of the type information for T. Also, don't use exceptions as your primary control-flow mechanism (it's very slow!). Here's a working version of your base class.
abstract class ScopeBase[T : universe.TypeTag, S <: ScopeBase[T, S] : universe.TypeTag : scala.reflect.ClassTag] {
  self: S =>

  def enumerateBase: Seq[T] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[S].baseClasses.map(_.asType.toType).flatMap(
      _.decls
        .filter(_.typeSignature.resultType <:< universe.typeOf[T])
        .filter(_.isMethod)
        .map(_.asMethod)
        .filter(_.isAccessor)
        .map(decl => mirror.reflectMethod(decl).apply().asInstanceOf[T])
        .filter(_ != null)
    ).toSeq
  }
}
trait Inherit {
  val StringField2: String = "test2"
}

class Test1 extends ScopeBase[String, Test1] with Inherit {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerateBase
}

object Test extends App {
  println(new Test1().fields)
}
Instead of getting the type from universe.typeOf, you can use the runtime class via currentMirror.classSymbol(getClass).toType. Below is an example that works:
def enumerateThis: Seq[String] = {
  val mirror = currentMirror.reflect(this)
  currentMirror.classSymbol(getClass).toType.decls.map {
    decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
  }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
// prints List(test)
With everyone's help, here's the final version that works:
import scala.reflect.runtime.{currentMirror, universe}

abstract class ScopeBase[T: universe.TypeTag] {
  lazy val enumerate: Seq[T] = {
    val mirror = currentMirror.reflect(this)
    currentMirror.classSymbol(getClass).baseClasses.map(_.asType.toType).flatMap {
      _.decls
        .filter(_.typeSignature.resultType <:< universe.typeOf[T])
        .filter(_.isMethod)
        .map(_.asMethod)
        .filterNot(_.isConstructor)
        .filter(_.paramLists.size == 0)
        .map(decl => mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
        .filter(_ != null).toSeq
    }
  }
}

trait FieldScope extends ScopeBase[Field[_]]

trait DbFieldScope extends ScopeBase[DbField[_, _]] {
  // etc....
}
As you can see from the last few lines, my use cases are limited to scope objects for specific field types, which is why I parameterize the scope container. If I wanted to enumerate fields of multiple types in a single scope container, I would parameterize the enumerate method instead.
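For illustration, a small hedged usage sketch of the final ScopeBase with plain String fields (StringScopeDemo is an illustrative name; the Field and DbField types above are not shown in the question):
object StringScopeDemo extends ScopeBase[String] {
  val IntField: Int = 13
  val StringField: String = "test"
}

println(StringScopeDemo.enumerate) // expected: List(test)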

Scala Pickling: Writing a custom pickler / unpickler for nested structures

I'm trying to write a custom SPickler / Unpickler pair to work around some of the current limitations of scala-pickling.
The data type I'm trying to pickle is a case class, where some of the fields already have their own SPickler and Unpickler instances.
I'd like to use these instances in my custom pickler, but I don't know how.
Here's an example of what I mean:
// Here's a class for which I want a custom SPickler / Unpickler.
// One of its fields can already be pickled, so I'd like to reuse that logic.
case class MyClass[A: SPickler: Unpickler: FastTypeTag](myString: String, a: A)

// Here's my custom pickler.
class MyClassPickler[A: SPickler: Unpickler: FastTypeTag](
    implicit val format: PickleFormat) extends SPickler[MyClass[A]] with Unpickler[MyClass[A]] {

  override def pickle(
      picklee: MyClass[A],
      builder: PBuilder) {
    builder.beginEntry(picklee)
    // Here we save `myString` in some custom way.
    builder.putField(
      "mySpecialPickler",
      b => b.hintTag(FastTypeTag.ScalaString).beginEntry(
        picklee.myString).endEntry())
    // Now we need to save `a`, which has an implicit SPickler.
    // But how do we use it?
    builder.endEntry()
  }

  override def unpickle(
      tag: => FastTypeTag[_],
      reader: PReader): MyClass[A] = {
    reader.beginEntry()
    // First we read the string.
    val myString = reader.readField("mySpecialPickler").unpickle[String]
    // Now we need to read `a`, which has an implicit Unpickler.
    // But how do we use it?
    val a: A = ???
    reader.endEntry()
    MyClass(myString, a)
  }
}
I would really appreciate a working example.
Thanks!
Here is a working example:
case class MyClass[A](myString: String, a: A)
Note that the type parameter of MyClass does not need context bounds. Only the custom pickler class needs the corresponding implicits:
class MyClassPickler[A](implicit val format: PickleFormat, aTypeTag: FastTypeTag[A],
                        aPickler: SPickler[A], aUnpickler: Unpickler[A])
  extends SPickler[MyClass[A]] with Unpickler[MyClass[A]] {

  private val stringUnpickler = implicitly[Unpickler[String]]

  override def pickle(picklee: MyClass[A], builder: PBuilder) = {
    builder.beginEntry(picklee)
    builder.putField("myString",
      b => b.hintTag(FastTypeTag.ScalaString).beginEntry(picklee.myString).endEntry()
    )
    builder.putField("a",
      b => {
        b.hintTag(aTypeTag)
        aPickler.pickle(picklee.a, b)
      }
    )
    builder.endEntry()
  }

  override def unpickle(tag: => FastTypeTag[_], reader: PReader): MyClass[A] = {
    reader.hintTag(FastTypeTag.ScalaString)
    val tag = reader.beginEntry()
    val myStringUnpickled = stringUnpickler.unpickle(tag, reader).asInstanceOf[String]
    reader.endEntry()
    reader.hintTag(aTypeTag)
    val aTag = reader.beginEntry()
    val aUnpickled = aUnpickler.unpickle(aTag, reader).asInstanceOf[A]
    reader.endEntry()
    MyClass(myStringUnpickled, aUnpickled)
  }
}
In addition to the custom pickler class, we also need an implicit def which returns a pickler instance specialized for concrete type arguments:
implicit def myClassPickler[A: SPickler: Unpickler: FastTypeTag](implicit pf: PickleFormat) =
new MyClassPickler
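A hedged usage sketch, assuming the JSON pickle format from a scala-pickling build of the SPickler era (import paths and the API changed across versions of the now-unmaintained library, so treat this as illustrative only):
import scala.pickling._
import scala.pickling.json._

val original = MyClass("hello", 42)
val pickled  = original.pickle                 // should pick up the implicit myClassPickler defined above
val restored = pickled.unpickle[MyClass[Int]]
println(restored == original)                  // expected: true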

ZeroC Ice "checked casts" in Scala

ZeroC Ice for Java translates every Slice interface Simple into (among other things) a proxy interface SimplePrx and a helper class SimplePrxHelper. If I have an ObjectPrx (the base interface for all proxies), I can check whether it actually implements the Simple interface by using a static method on SimplePrxHelper:
val obj: Ice.ObjectPrx = ...  // Get a proxy from somewhere...
val simple: SimplePrx = SimplePrxHelper.checkedCast(obj)

if (simple != null)
  // Object supports the Simple interface...
else
  // Object is not of type Simple...
I wanted to write a method castTo so that I could replace the second line with
val simple = castTo[SimplePrx](obj)
or
val simple = castTo[SimplePrxHelper](obj)
So far as I can see, Scala's type system is not expressive enough to allow me to define castTo. Is this correct?
You should be able to do something with implicits, along these lines:
object Casting {
  trait Caster[A] {
    def checkedCast(obj: ObjectPrx): Option[A]
  }

  def castTo[A](obj: ObjectPrx)(implicit caster: Caster[A]) =
    caster.checkedCast(obj)

  implicit object SimplePrxCaster extends Caster[SimplePrx] {
    def checkedCast(obj: ObjectPrx) = Option(SimplePrxHelper.checkedCast(obj))
  }
}
Then you just bring things into scope where you want to use them:
package my.package

import Casting._

...

def whatever(prx: ObjectPrx) {
  castTo[SimplePrx](prx) foreach (_.somethingSimple())
}

...
You can get something like what you want with structural types:
def castTo[A](helper: { def checkedCast(o: Object): A })(o: Object) = {
  helper.checkedCast(o)
}

class FooPrx { }

object FooPrxHelper {
  def checkedCast(o: Object): FooPrx = o match {
    case fp: FooPrx => fp
    case _ => null
  }
}

scala> val o: Object = new FooPrx
o: java.lang.Object = FooPrx@da8742

scala> val fp = castTo(FooPrxHelper)(o)
fp: FooPrx = FooPrx@da8742
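As a small hedged follow-up, the same structural-type trick can return an Option instead of null, so call sites avoid null checks (castToOpt is an illustrative name; it reuses FooPrx / FooPrxHelper from above):
import scala.language.reflectiveCalls

def castToOpt[A](helper: { def checkedCast(o: Object): A })(o: Object): Option[A] =
  Option(helper.checkedCast(o))

val someObj: Object = new FooPrx
castToOpt(FooPrxHelper)(someObj) foreach { fp => println(s"got a FooPrx: $fp") }
Note that structural types are implemented with reflection, so each call goes through a reflective dispatch; the implicit Caster approach above avoids that cost.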