What are the use-cases for auxiliary constructors in Scala?

For example, how is this:
class Cat(name: String, val age: Int) {
def this() = this("Garfield", 20)
}
val someCat = new Cat
someCat.age
res0: Int = 20
Different from:
class Cat(name: String = "Garfield", val age: Int = 20)
val someCat = new Cat
someCat.age
res0: Int = 20
Note:
I have seen answers to other questions (e.g. here) that discuss the differences between Java and Scala in the implementation of auxiliary constructors. But I am mostly trying to understand why we need them in Scala in the first place.

Auxiliary constructors are good for more than just supplying defaults. For example, here's one that can take arguments of different types:
class MyBigInt(x: Int) {
def this(s: String) = this(s.toInt)
}
You can also hide the main constructor if it contains implementation details:
class MyBigInt private(private val data: List[Byte]) {
def this(n: Int) = this(...)
}
This allows you to have data clearly be the backing structure for your class while avoiding cluttering your class with the arguments to one of your auxiliary constructors.
Another use for auxiliary constructors could be migrating Java code to Scala (or refactoring to change a backing type, as in the example above) without breaking dependencies. In general though, it is often better to use a custom apply method in the companion object, as they are more flexible.
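For instance, a rough sketch of that companion-object alternative, reusing the public MyBigInt(x: Int) definition above (the validation is purely illustrative):
object MyBigInt {
  def apply(x: Int): MyBigInt = new MyBigInt(x)
  def apply(s: String): MyBigInt = {
    // unlike an auxiliary constructor, apply can validate, cache, or even return a subtype
    require(s.nonEmpty && s.forall(_.isDigit), s"not a number: $s")
    new MyBigInt(s.toInt)
  }
}
val a = MyBigInt(42)
val b = MyBigInt("42")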

A recurring use case I have noticed, along the lines of what Brian McCutchon already mentioned in his answer ("here's one that can take arguments of different types"), is a parameter of Option type in the primary constructor. For example:
class Person(val firstName: String, val middleName: Option[String], val lastName: String)
To create a new instance you need to do:
val person = new Person("Guido", Some("van"), "Rossum")
But with an auxiliary constructor, construction becomes much more pleasant:
class Person(val firstName: String, val middleName: Option[String], val lastName: String) {
def this(firstName: String, middleName: String, lastName: String) = this(firstName, Some(middleName), lastName)
}
val person = new Person("Guido", "van", "Rossum")
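The same convenience is also available without an auxiliary constructor by overloading apply in a companion object; here is a sketch (the two-argument overload for a missing middle name is my own addition):
object Person {
  def apply(firstName: String, middleName: String, lastName: String): Person =
    new Person(firstName, Some(middleName), lastName)
  def apply(firstName: String, lastName: String): Person =
    new Person(firstName, None, lastName)
}
val guido = Person("Guido", "van", "Rossum")
val alan = Person("Alan", "Kay")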

Related

Get only super class fields

case class Person(name: String,
override val age: Int,
override val address: String
) extends Details(age, address)
class Details(val age: Int, val address: String)
val person = Person("Alex", 33, "Europe")
val details = person.asInstanceOf[Details] // ???
println(details) // I want only Details class fields
I have these 2 classes. In reality, both have a lot of fields. In some places I need only the fields of the superclass, taken from a Person instance.
Is there a nice way to get only the superclass values without mapping them field by field?
*I'm pretty sure I'll have some problems with JSON writes for class Details (which is not a case class and has no companion object, but that is another subject).
If I understand your question correctly, you might be asking about runtime polymorphism (dynamic method dispatch), as in Java.
If so, you may have to make both of them plain classes rather than case classes:
class Details(val age: Int, val address: String)
class Person(name: String,
override val age: Int,
override val address: String
) extends Details(age, address)
Now create a Person instance and refer to it through the superclass type (Details):
val detail:Details = new Person("Alex", 33, "Europe")
println(detail.address)
println(detail.age)
This way you will be able to access only address and age.
Another way is to ask why we can't make Details a separate entity, using composition:
case class Details(age: Int, address: String)
case class Person(name: String,
details: Details
)
val detail = Person("Alex", Details(10, "Europe"))
Output:
println(detail.details)
Details(10,Europe)
I will post a solution that leverages the Scala macro system (the old kind, not the new one introduced with Scala 3). It may be overkill for your case...
BTW, if you want to access only the parent values (for example to get (key, value) pairs), you can:
given a type tag, get all parents;
from them, extract all the accessors (vals);
for each val, get its value;
and finally return a list with all the accessors collected.
So, let's solve each point step by step.
First of all, we have to write the macro definition (note that the implementation also receives the element expression, which it will splice later):
import scala.language.experimental.macros
import scala.reflect.macros.whitebox
object Macros {
def accessors[T](element: T): String = macro MacrosImpl.accessors[T]
}
object MacrosImpl {
def accessors[T: c.WeakTypeTag](c: whitebox.Context)(element: c.Expr[T]): c.Expr[String] = ...
}
For the first point, we can leverage the reflection API for macro programming through c.universe:
import c.universe._
val weakType = weakTypeTag[T] //thanks to the WeakTypeTag typeclass
val parents = weakType.tpe.baseClasses
For the second point, we can iterate over the parent classes and then keep only the public accessors:
val accessors = parents
.map(weakType.tpe.baseType(_))
.flatMap(_.members)
.filter(_.isPublic)
.filter(_.isMethod)
.map(_.asMethod)
.filter(_.isAccessor)
.toSet
So, for example, if we write Macros.accessors[Details](person), accessors will yield age and address.
To get the values, we can leverage quasiquoting. First, we take only the value names:
val names = accessors
.map(_.fullName)
.map(_.split("\\."))
.map(_.reverse.head)
Then we convert them into a TermName:
val terms = names.map(TermName(_))
And finally, we convert each term to a key value tuple containing the val name and its value:
val accessorValues = terms
.map(name => c.Expr[(String, Any)](q"(${name.toString}, ${element}.${name})"))
.toSeq
The last step consists in converting a Seq[Expr[(String, Any)]] into an Expr[Seq[(String, Any)]]. One way to do that is to leverage recursion, reify, and expression splicing:
def seqToExprs(seq: Seq[Expr[(String, Any)]]): c.Expr[Seq[(String, Any)]] =
seq.headOption match {
case Some(head) =>
c.universe.reify(
Seq((head.splice._1, head.splice._2)) ++
seqToExprs(seq.tail).splice
)
case _ => c.Expr[Seq[(String, Any)]](q"Seq.empty")
}
Finally, I decided to return a String representation (but you can manipulate the pairs however you want):
val elements = seqToExprs(accessorValues)
c.Expr[String](q"${elements}.mkString")
You can use it as:
import Macros._
class A(val a : Int)
class B(val b : Int) extends A(b)
class C(val c: Int) extends B(c)
val c = new C(10)
println(accessors[C](c)) // prints (a, 10)(b, 10)(c, 10)
println(accessors[B](c)) // prints (a, 10)(b, 10)
println(accessors[A](c)) // prints (a, 10)
And, using your example:
// Your example:
case class Person(name: String,
override val age: Int,
override val address: String
) extends Details(age, address)
class Details(val age: Int, val address: String)
val person = Person("Alex", 33, "Europe")
println(accessors[Details](person)) // prints (address,Europe)(age,33)
println(accessors[Person](person)) // prints (address,Europe)(age,33)(name,Alex)
Here is a repository with the macro implemented.
Scala 3 introduces a safer and cleaner macro system; if you use it and want to go further, you can read these articles:
macros tips and tricks
short tutorial
another tutorial

Passing different object models as a parameter to a method in scala

I've really struggled with type relationships in Scala and how to use them effectively. I am currently trying to understand how I would use them to edit only certain fields in a Mongo collection. This means passing a particular object containing only those fields to a method, which (after reading about variance) I thought I could do like this:
abstract class DocClass
case class DocPart1(oId: Option[BSONObjectID], name: String, other: String) extends DocClass
case class DocPart2(city: String, country: String) extends DocClass
With the method that calls a more generic method as:
def updateMultipleFields(oId: Option[BSONObjectID], dataModel: DocClass): Future[Result] = serviceClientDb.updateFields[T](collectionName, dataModel, oId)
// updateFields updates the collection by passing *dataModel* into the collection, i.e. Json.obj("$set" -> dataModel)
So dataModel can be a DocPart1 or a DocPart2 object. I'm eager not to use a type parameter on updateMultipleFields (as this interesting article may suggest), as this leads me to further issues when passing these to this method from other files in the project. I'm doing this to abide by DRY and to keep the database operations efficient.
I've gone round in circles with this one - can anyone shed any light on this?
Edited after #SerGr's comments
So to be completely clear; I'm using Play/Scala/ReactiveMongo Play JSON (as documented here) and I have a MongoDB collection with lots of fields.
case class Doc(oId: Option[BSONObjectID], name: String, city: String, country: String, continent: String, region: String, latitude: Long, longitude: Long)
To create a new document I have auto-mapped Doc (above) to the collection structure (in Play - like this) and created a form (to insert/update the collection) - all working well!
But when editing a document; I would like to update only some fields (so that all of the fields are not updated). I have therefore created multiple case classes to divide these fields into smaller models (like the examples of DocPart1 & DocPart2) and mapped the form data to just one. This has led me to pass these as a parameter to the updateMultipleFields method as shown above. I hope that this makes more sense.
I'm not sure I understand correctly what you need, but here is some code that might be it. Assume we have our FullDoc class defined as:
case class FullDoc(_id: Option[BSONObjectID], name: String, other: String)
and we have 2 partial updates defined as:
sealed trait BaseDocPart
case class DocPart1(name: String) extends BaseDocPart
case class DocPart2(other: String) extends BaseDocPart
Also assume we have an accessor to our Mongo collection:
def docCollection: Future[JSONCollection] = ...
So if I understand your requirements, what you need is something like this:
def update[T <: BaseDocPart](oId: BSONObjectID, docPart: T)(implicit format: OFormat[T]) = {
docCollection.flatMap(_.update(BSONDocument("_id" -> oId),
JsObject(Seq("$set" -> Json.toJson(docPart)))))
}
Essentially, the main trick is to use a generic T <: BaseDocPart and pass an implicit format: OFormat[T], so that we can convert our specific child of BaseDocPart to JSON even after type erasure.
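For instance, hypothetical call sites could look like this (someId stands in for an existing document's BSONObjectID, and the OFormat instances defined below must be in scope):
update(someId, DocPart1("new name"))   // $set only the name field
update(someId, DocPart2("new other"))  // $set only the other field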
And here is some additional test code (that I used in my console application)
// imports assumed by all the snippets here (paths may vary with your ReactiveMongo / Play JSON versions)
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.language.postfixOps
import scala.util.Random
import play.api.libs.json._
import reactivemongo.api.{Cursor, ReadPreference}
import reactivemongo.api.commands.WriteResult
import reactivemongo.bson.{BSONDocument, BSONObjectID}
import reactivemongo.play.json._
import reactivemongo.play.json.collection.JSONCollection
implicit val fullFormat = Json.format[FullDoc]
implicit val part1Format = Json.format[DocPart1]
implicit val part2Format = Json.format[DocPart2]
def insert(id: Int) = {
val fullDoc = FullDoc(None, s"fullDoc_$id", s"other_$id")
val insF: Future[WriteResult] = docCollection.flatMap(_.insert(fullDoc))
val insRes = Await.result(insF, 2 seconds)
println(s"insRes = $insRes")
}
def loadAndPrintAll() = {
val readF = docCollection.flatMap(_.find(Json.obj()).cursor[FullDoc](ReadPreference.primaryPreferred).collect(100, Cursor.FailOnError[Vector[FullDoc]]()))
val readRes = Await.result(readF, 2 seconds)
println(s"readRes =\n${readRes.mkString("\n")}")
}
def loadRandomDocument(): FullDoc = {
val readF = docCollection.flatMap(_.find(Json.obj()).cursor[FullDoc](ReadPreference.primaryPreferred).collect(100, Cursor.FailOnError[Vector[FullDoc]]()))
val readRes = Await.result(readF, 2 seconds)
readRes(Random.nextInt(readRes.length))
}
def updateWrapper[T <: BaseDocPart](oId: BSONObjectID, docPart: T)(implicit writer: OFormat[T]) = {
val updateRes = Await.result(update(oId, docPart), 2 seconds)
println(s"updateRes = $updateRes")
}
// pre-fill with some data
insert(1)
insert(2)
insert(3)
insert(4)
val newId: Int = ((System.currentTimeMillis() - 1511464148000L) / 100).toInt
println(s"newId = $newId")
val doc21: FullDoc = loadRandomDocument()
println(s"doc21 = $doc21")
updateWrapper(doc21._id.get, DocPart1(s"p1_modified_$newId"))
val doc22: FullDoc = loadRandomDocument()
println(s"doc22 = $doc22")
updateWrapper(doc22._id.get, DocPart2(s"p2_modified_$newId"))
loadAndPrintAll()

Scala adding an extra function to a Class

I encountered the following code while reading through some Scala code. I'm finding it difficult to understand what it does.
class Foo(val name: String, val age: Int, val sex: Symbol)
object Foo {
def apply(name: String, age: Int, sex: Symbol) = new Foo(name, age, sex)
}
Does it add a constructor method to the Class Foo which was already defined?
Is it possible to add extra methods to classes which are already defined using this syntax?
Does it add a constructor method to the class Foo which was already defined?
It adds syntactic sugar to the class. Meaning, you can create an instance of Foo like this:
val foo = Foo("Jason", 29, 'Male)
Instead of
val foo = new Foo("Jason", 29, 'Male)
Is it possible to add extra methods to classes which are already defined using this syntax?
In that regard, apply is special: the compiler knows about it and expands Foo(...) into Foo.apply(...). Any other method you add this way has to be called on the Foo companion object explicitly; it will not be available on Foo instances.
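A small sketch of that point (defaultFoo is an invented extra factory, not part of the original code):
object Foo {
  def apply(name: String, age: Int, sex: Symbol) = new Foo(name, age, sex)
  def defaultFoo: Foo = new Foo("Garfield", 20, 'Male) // helper invented for illustration
}
Foo.defaultFoo                         // fine: defaultFoo lives on the companion object
// Foo("Jason", 29, 'Male).defaultFoo  // does not compile: not a member of class Foo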
If you want to externally add methods to Foo, you can do so via an implicit class:
implicit class RichFoo(val foo: Foo) extends AnyVal {
def fooDetails(): String = s"{Name: ${foo.name}, Age: ${foo.age}}"
}
Now you can call it on an instance of Foo:
val f = Foo("Jason", 29, 'Male)
println(f.fooDetails())
In this sense, you can think of Foo.apply() as a static method.
Realistically, objects in Scala are implemented as singleton instances.
Here's the documentation on that.
You can invoke any class or object instance in Scala as if it were a function if it has an apply method. What you're doing here is adding a factory method to Foo's companion object, so that when you call it, it instantiates an instance of Foo.
It is not possible to add methods to an instance this way. For that, you might be interested in the Scala "Pimp My Library" pattern, which is implemented using implicits.
// the following are equivalent, given your code above
val x = new Foo("Jason", 29, 'Male)
val y = Foo.apply("Jason", 29, 'Male)
val z = Foo("Jason", 29, 'Male)
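As an aside, a minimal sketch of the "invoke any instance that has an apply method" point (Greeter is an invented example, unrelated to Foo):
class Greeter(greeting: String) {
  def apply(name: String): String = s"$greeting, $name"
}
val hello = new Greeter("Hello")
hello("Jason") // the compiler expands this to hello.apply("Jason")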
Please read about companion objects: http://docs.scala-lang.org/tutorials/tour/singleton-objects.html. Hope this helps.
It simplifies object creation for this type. Another way would be to create a case class.
Looks like a duplicate to me:
Scala: companion object purpose
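A sketch of the case-class alternative mentioned above; the compiler then generates the apply method for you:
case class Foo(name: String, age: Int, sex: Symbol)
val jason = Foo("Jason", 29, 'Male) // no hand-written apply and no `new` needed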
This pattern is commonly known as static factory methods. The code you provided is not very useful on its own, but consider these additional factory methods (think of them as "named constructors"):
class Foo(val name: String, val age: Int, val sex: Symbol)
object Foo {
def apply(name: String, age: Int, sex: Symbol) = new Foo(name, age, sex)
def newMaleFoo(name: String, age: Int) = new Foo(name, age, 'male)
def newPeterFoo(age: Int) = new Foo("Peter", age, 'male)
}
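With these in place, call sites read like named constructors:
val maleFoo = Foo.newMaleFoo("Jason", 29)
val peter = Foo.newPeterFoo(42)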

Type safety when optional field is guaranteed to be present

Let's say I have a following case class:
case class Product(name: String, categoryId: Option[Long]/*, other fields....*/)
Here you can see that categoryId is optional.
Now let's say I have a following method in my DAO layer:
def getCategoryProducts(): List[Product] = {
// query products that have categoryId defined
}
You see that this method returns products that are guaranteed to have categoryId defined with some value.
What I would like to do is something like this:
trait HasCategory {
def categoryId_!: Long
}
// and then specify it in the method signature
def getCategoryProducts(): List[Product with HasCategory]
This will work, but then such a product will have two methods, categoryId_! and categoryId, which smells bad.
Another way would be:
sealed trait Product {
def name: String
/*other fields*/
}
case class SimpleProduct(name: String/*, other fields....*/) extends Product
case class ProductWithCategory(name: String, categoryId: Long/*, other fields....*/) extends Product
def getCategoryProducts: List[ProductWithCategory] = ...
This approach avoids the duplicate methods categoryId and categoryId_!, but it requires you to create two case classes and a trait duplicating all the fields, which also smells.
My question: how can I use the Scala type system to express this specific case without duplicating fields?
Not sure how much this will scale for your particular case, but one solution that comes to mind is to parameterize over the Option type using a higher-kinded generic type:
object Example {
import scala.language.higherKinds
type Id[A] = A
case class Product[C[_]](name: String, category: C[Long])
def productsWithoutCategories: List[Product[Option]] = ???
def productsWithCategories: List[Product[Id]] = ???
}
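To make the difference concrete, a small usage sketch (the values are made up):
val withCategory: Example.Product[Example.Id] = Example.Product[Example.Id]("chair", 42L)
val categoryId: Long = withCategory.category // no Option unwrapping needed
val withoutCategory: Example.Product[Option] = Example.Product[Option]("stool", None)
val maybeCategory: Option[Long] = withoutCategory.category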
A way to do it is to use type classes:
import scala.language.implicitConversions
object Example {
sealed class CartId[T]
implicit object CartIdSomeWitness extends CartId[Some[Long]]
implicit object CartIdNoneWitness extends CartId[None.type]
implicit object CartIdPresentWitness extends CartId[Long]
case class Product[T: CartId](name: String, categoryId: T /*, other fields....*/)
val id: Long = 7
val withId = Product("dsds", id)
val withSomeId = Product("dsds", Some(id))
val withNoneId = Product("dsds", None)
val presentId: Long = withId.categoryId
val maybeId: Some[Long] = withSomeId.categoryId
val noneId: None.type = withNoneId.categoryId
val p = Product("sasa", true) //Error:(30, 18) could not find implicit value for evidence parameter of type com.novak.Program.CartId[Boolean]
}
This solution involves some code and dependent on implicits but does what you're trying to achieve.
Be aware that this solution is not completely sealed and can be 'hacked'. You can cheat and do something like -
val hack: Product[Boolean] = Product("a", true)(new CartId[Boolean])
val b: Boolean =hack.categoryId
For some more advanced solutions, see:
* Miles Sabin (@milessabin)'s "Unboxed union types in Scala via the Curry-Howard isomorphism"
* Scalaz's \/ operator: http://eed3si9n.com/learning-scalaz/Coproducts.html

scala: how to view subclass methods with a generic instantiation

I have the following where I set information and extractors for different schemes of data:
trait DataScheme {
type Type <: List[Any]
class ExtractorMethods(ticker: String, dataList: List[Type]) {
def getDatetime(datum: Type): Date = new Date(datum(columnIndex(Names.datetime)).toString)
def upperDatum(date: Date): Type = dataList.minBy(datum => getDatetime(datum) >= date)
def lowerDatum(date: Date): Type = dataList.maxBy(datum => getDatetime(datum) <= date)
}
}
trait IndexScheme extends DataScheme {
type Type = (Date, Double, Double, Double, Double, Long)
class ExtractorMethods(ticker: String, dataList: List[Type]) extends super.ExtractorMethods(ticker: String, dataList: List[Type]){
def testing12(int: Int):Int = 12
val test123 = 123
}
}
I want anything extending DataScheme to use its ExtractorMethods methods (e.g. lowerDatum) but also have its own methods (e.g. testing12).
There is a class definition for lists of data elements:
class Data[+T <: DataScheme](val ticker: String, val dataList: List[T#Type], val isSorted: Boolean)
(implicit m: Manifest[T], mm: Manifest[T#Type]) extends Symbols {
def this(ticker: String, dataList: List[T#Type])(implicit m: Manifest[T], mm: Manifest[T#Type]) = this(ticker, dataList, false)(m: Manifest[T], mm: Manifest[T#Type])
val dataScheme: T
val extractorMethods = new dataScheme.ExtractorMethods(ticker, dataList.asInstanceOf[List[dataScheme.Type]])
}
A Data class should make accessible the methods in ExtractorMethods of the scheme so they can be used in the main program through the instance of Data that has been defined. For example if sortedData is an instance of Data[IndexScheme], the following works:
val lowerDatum = sortedData.extractorMethods.lowerDatum(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2010-03-31 00:00:00"))
but this does not:
val testing = sortedData.extractorMethods.testing12(123)
because 'testing12 is not a member of sortedData.dataScheme.extractorMethods'. So my question is: how can the subclasses of ExtractorMethods in subtraits of DataScheme, like IndexScheme, be made accessible? How is it possible using Manifests and TypeTags? Thanks.
So you want the generic class Data[DataScheme] or Data[IndexScheme] to have access to the methods of whichever type Data has been parameterised with. You've tried to do this several different ways, from the evidence in your code.
To answer your last question - manifests can't help in this particular case and TypeTags are only part of the answer. If you really want to do this, you do it with mirrors.
However, you will have to make some changes to your code. Scala only has instance methods; there are no such things as static methods in Scala. This means that you can only use reflection to invoke a method on an instance of a class, trait or object. Your traits are abstract and can't be instantiated.
I can't really tell you how to clean up your code, because what you have pasted up here is a bit of a mess and is full of different things you have tried. What I can show you is how to do it with a simpler set of classes:
import scala.reflect.runtime.universe._
class t1 {
class Methods {
def a = "a"
def b = "b"
}
def methods = new Methods
}
class t2 extends t1 {
class Methods extends super.Methods {
def one = 1
def two = 2
}
override def methods = new Methods
}
class c[+T <: t1](implicit tag: TypeTag[T]) {
def generateT = {
val mirror = runtimeMirror(getClass.getClassLoader)
val cMirror = mirror.reflectClass(typeOf[T].typeSymbol.asClass)
cMirror.reflectConstructor(typeOf[T].declaration(nme.CONSTRUCTOR).asMethod)
}
val t = generateT().asInstanceOf[T]
}
val v1 = new c[t1]
val v2 = new c[t2]
If you run that, you'll find that v1.t.methods gives you a class with only methods a and b, but v2.t.methods gives a class with methods one and two as well.
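For instance, a quick check against the definitions above:
println(v1.t.methods.a)   // prints "a"
println(v2.t.methods.two) // prints 2
// v1.t.methods.one       // does not compile: one is not a member of t1's Methods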
This really is not how to do this - reaching for reflection for this kind of job shows a very broken model. But I guess that's your business.
I stick by what I said below, though. You should be using implicit conversions (and possibly implicit parameters) with companion objects. Use Scala's type system the way it's designed - you are fighting it all the way.
ORIGINAL ANSWER
Well, I'm going to start by saying that I would never do things the way you are doing this; it seems horribly over-complicated. But you can do what you want to do, roughly the way you are doing it, by:
* using mixins
* moving the extractorMethods creation code into the traits.
Here's a greatly simplified example:
trait t1 {
class Methods {
def a = "a"
def b = "b"
}
def methods = new Methods
}
trait t2 extends t1 {
class Methods extends super.Methods {
def one = 1
def two = 2
}
override def methods = new Methods
}
class c1 extends t1
val v1 = new c1
// v1.methods.a will return "a", but v1.methods.one does not exist
class c2 extends c1 with t2
val v2 = new c2
// v2.methods.a returns "a" and v2.methods.one returns 1
I could replicate your modus operandi more closely by defining c1 like this:
class c1 extends t1 {
val myMethods = methods
}
in which case v1.myMethods would only have methods a and b but v2.myMethods would have a, b, one and two.
You should be able to see how you can adapt this to your own class and trait structure. I know my example doesn't have any of your complex type logic in it, but you know better than I what you are trying to achieve there. I'm just trying to demonstrate a simple mechanism.
But dude, way to make your life difficult...
EDIT
There are so many things I could say about what is wrong with your approach here, both on the small and large scale. I'm going to restrict myself to saying two things:
You can't do what you are trying to do in the Data class because it is abstract. You cannot force Scala to magically replace an uninitialised, abstract method of a non-specific type with the specific type, just by littering everything with Type annotations. You can only solve this with a concrete class which provides the specific type.
You should be doing this with implicit conversions. Implicits would help you do it the wrong way you seem fixated on, but would also help you do it the right way. Oh, and use a companion object, either for the implicits or to hold a factory (or both).
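For what it's worth, here is a minimal sketch of that last piece of advice, with invented names unrelated to the original Data/DataScheme code: a companion object that holds a factory plus an implicit conversion supplying the extractor methods.
import scala.language.implicitConversions
class MyData(val dataList: List[Int])
class MyDataOps(data: MyData) {
  def lowest: Option[Int] = data.dataList.sorted.headOption
}
object MyData {
  def apply(dataList: List[Int]): MyData = new MyData(dataList) // factory
  // picked up automatically because it lives in MyData's companion object
  implicit def toOps(data: MyData): MyDataOps = new MyDataOps(data)
}
MyData(List(3, 1, 2)).lowest // Some(1), via the implicit conversion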