I need to group some values of an Enumeration.
Here is the code:
object PostType extends Enumeration {
// Documents
val BOOKMARK = Value("bookmark")
val FILE = Value("file")
val NOTE = Value("note")
val WIKIDOC = Value("wikidoc")
…
}
object PostTypes {
type PostTypes = List[PostType.Value]
val DOCUMENTS : PostTypes = List(PostType.BOOKMARK, PostType.FILE, PostType.NOTE, PostType.WIKIDOC)
val QUESTIONS : PostTypes = List(PostType.QUESTION, PostType.QUICKPOLL, PostType.SURVEY)
val EVENT : PostTypes = List(PostType.EVENT)
…
val ALL : PostTypes = PostType.values.toList
}
Is there a better way?
Here are the cons that I see: PostType.Value and PostTypes.PostTypes leak into client code!
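For illustration, this is roughly what client code looks like today (render is a hypothetical method, just to show the verbose names):
def render(post: PostType.Value, visible: PostTypes.PostTypes): Boolean =
  visible.contains(post)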
[update] Improved code with the help of both answers:
object PostType extends Enumeration {
// Documents
val Bookmark = Value("bookmark")
val File = Value("file")
val Note = Value("note")
val Wikidoc = Value("wikidoc")
…
}
object PostTypes {
import PostType._
implicit def toList(pt: Value) = List(pt)
type PostTypes = List[Value]
val Documents = List(Bookmark, File, Note, Wikidoc)
val Questions = List(Question, Quickpoll, Survey)
val All = values.toList
}
[update 2] Another bunch of improvements
object PostType extends Enumeration {
type PostType = Value
type PostTypes = List[Value]
implicit def toList(pt: Value) = List(pt)
// Documents
val Bookmark = Value(1, "bookmark")
val File = Value(2, "file")
val Note = Value(3, "note")
val Wikidoc = Value(12, "wikidoc")
// Declare the groups after the Values to avoid a runtime error
val Documents = List(Bookmark, File, Note, Wikidoc)
val Questions = List(Question, Quickpoll, Survey)
val All = values.toList.sorted
}
Below is an approach that does not use enumerations, but hopefully achieves your aims. The example shows how to instantiate PostType from a String instance. If the String does not match then a MatchError is thrown.
package rando
object PostType {
val all = Document.all ++ Question.all
def fromString(s: String): PostType = Document.fromString.orElse(Question.fromString)(s)
}
sealed trait PostType
object Document {
val all = Set(Bookmark, File, Note, Wikidoc)
val fromString: PartialFunction[String, Document] = {
case "bookmark" => Bookmark
case "file" => File
case "note" => Note
case "wikidoc" => Wikidoc
}
}
sealed trait Document extends PostType
case object Bookmark extends Document
case object File extends Document
case object Note extends Document
case object Wikidoc extends Document
object Question {
val all = Set(SlowPoll, QuickPoll, Survey)
val fromString: PartialFunction[String, Question] = {
case "slowpoll" => SlowPoll
case "quickpoll" => QuickPoll
case "survey" => Survey
}
}
sealed trait Question extends PostType
case object SlowPoll extends Question
case object QuickPoll extends Question
case object Survey extends Question
object Example extends App {
println(PostType.fromString("bookmark").getClass)
}
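If the MatchError on unknown input is a concern, the combined partial function can also be lifted to return an Option instead of throwing; a minimal sketch (the PostTypeOps name is just for illustration):
// Lifting the partial function turns unknown strings into None
// instead of a MatchError.
object PostTypeOps {
  def fromStringOpt(s: String): Option[PostType] =
    Document.fromString.orElse(Question.fromString).lift(s)
}
// PostTypeOps.fromStringOpt("bookmark") == Some(Bookmark)
// PostTypeOps.fromStringOpt("nope")     == None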
I think it's as good as it gets unless you want to use an alternative Enumeration implementation (see my answer here) or just use some constants.
The only thing I could suggest is to define a type alias type PostType = Value and use it instead of Value. I'm sure you know that by importing <package>.PostType._ you won't have to prefix your enum values with PostType anymore.
Finally, PostTypes seems a little bit like overkill, and it's easy to confuse with PostType when reading. These are just minor things though. I use the same approach as you do, and I'm not aware of anything better.
Here is the final code, with the help of the other answers.
You just need to import PostType._ in client code.
object PostType extends Enumeration {
type PostType = Value
type PostTypes = List[Value]
implicit def toList(pt: Value) = List(pt)
// Documents
val Bookmark = Value(1, "bookmark")
val File = Value(2, "file")
val Note = Value(3, "note")
val Wikidoc = Value(12, "wikidoc")
// Declare the groups after the Values to avoid a runtime error
val Documents = List(Bookmark, File, Note, Wikidoc)
val Questions = List(Question, Quickpoll, Survey)
val All = values.toList.sorted
}
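For reference, a quick sketch of what client code might look like after the single import (isDocument is a hypothetical client method, not part of the code above):
import PostType._
// Hypothetical client method: both PostType and PostTypes come from the single import.
def isDocument(pt: PostType): Boolean = Documents.contains(pt)
isDocument(Wikidoc)           // true
val several: PostTypes = Note // the implicit toList wraps a single value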
Related
I'm writing a Scala library to operate upon Spark DataFrames. I have a bunch of classes, each of which contains a function that operates upon the supplied DataFrame:
class Foo(){val func = SomeFunction(,,,)}
class Bar(){val func = SomeFunction(,,,)}
class Baz(){val func = SomeFunction(,,,)}
The user of my library passes a parameter operation: String to indicate the class to instantiate; the value passed has to be the name of one of those classes, hence I have code that looks something like this:
operation match {
  case "Foo" => new Foo().func
  case "Bar" => new Bar().func
  case "Baz" => new Baz().func
}
I'm a novice Scala developer, but this seems like a rather clunky way of achieving this. I'm hoping there is a simpler way to instantiate the desired class based on the value of operation, given that it will be the same as the name of the desired class.
The reason I want to do this is that I want external contributors to contribute their own classes, and I want to make it as easy as possible for them to do that; I don't want them to have to know that they also need to go and change a pattern match.
For
case class SomeFunction(s: String)
class Foo(){val func = SomeFunction("Foo#func")}
class Bar(){val func = SomeFunction("Bar#func")}
class Baz(){val func = SomeFunction("Baz#func")}
//...
reflection-based version of
def foo(operation: String) = operation match {
case "Foo" => new Foo().func
case "Bar" => new Bar().func
case "Baz" => new Baz().func
// ...
}
is
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._
def foo(operation: String): SomeFunction = {
val runtimeMirror = universe.runtimeMirror(getClass.getClassLoader)
val classSymbol = runtimeMirror.staticClass(operation)
val constructorSymbol = classSymbol.primaryConstructor.asMethod
val classMirror = runtimeMirror.reflectClass(classSymbol)
val classType = classSymbol.toType
val constructorMirror = classMirror.reflectConstructor(constructorSymbol)
val instance = constructorMirror()
val fieldSymbol = classType.decl(TermName("func")).asTerm
val instanceMirror = runtimeMirror.reflect(instance)
val fieldMirror = instanceMirror.reflectField(fieldSymbol)
fieldMirror.get.asInstanceOf[SomeFunction]
}
Testing:
foo("Foo") //SomeFunction(Foo#func)
foo("Bar") //SomeFunction(Bar#func)
foo("Baz") //SomeFunction(Baz#func)
I'm trying to figure out if a member field in any given case class is also a case class. Taken from this answer, given an instance or an object, I can pass it along and determine if it's a case class:
def isCaseClass(v: Any): Boolean = {
import reflect.runtime.universe._
val typeMirror = runtimeMirror(v.getClass.getClassLoader)
val instanceMirror = typeMirror.reflect(v)
val symbol = instanceMirror.symbol
symbol.isCaseClass
}
However, what I'd like, is to take a case class, extract all of its member fields, and find out which ones are case classes themselves. Something in this manner:
import scala.collection.mutable.ListBuffer
def innerCaseClasses[A](parentCaseClass:A): List[Class[_]] = {
val nestedCaseClasses = ListBuffer[Class[_]]()
val fields = parentCaseClass.getClass.getDeclaredFields
fields.foreach(field => {
if (??? /*field is case class */ ) {
nestedCaseClasses += field.getType
}
})
nestedCaseClasses.toList
}
I thought maybe I could extract the fields and their classes, and use reflection to instantiate a new instance of each member field as its own class. I'm not 100% sure how to do that, and it seems like perhaps there's an easier way. Is there?
Ah! I've figured it out (I simplified the function that makes the determination):
import reflect.runtime.universe._
case class MyThing(str:String, num:Int)
case class WithMyThing(name:String, aThing:MyThing)
val childThing = MyThing("Neat" , 293923)
val parentCaseClass = WithMyThing("Nate", childThing)
def isCaseClass(v: Any): Boolean = {
val typeMirror = runtimeMirror(v.getClass.getClassLoader)
val instanceMirror = typeMirror.reflect(v)
val symbol = instanceMirror.symbol
symbol.isCaseClass
}
def innerCaseClasses[A](parentCaseClass:A): Unit = {
val fields = parentCaseClass.asInstanceOf[Product].productIterator
fields.foreach(field => {
println(s"Field: ${field.getClass.getSimpleName} isCaseClass? " + isCaseClass(field))
})
}
innerCaseClasses(parentCaseClass)
printout:
Field: String isCaseClass? false
Field: MyThing isCaseClass? true
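If you want the List[Class[_]] from the original question rather than a printout, the same approach can collect instead of print; a small sketch (renamed only to avoid clashing with the version above):
def innerCaseClassTypes[A](parentCaseClass: A): List[Class[_]] =
  parentCaseClass.asInstanceOf[Product]
    .productIterator
    .filter(isCaseClass) // keep only the case-class fields
    .map(_.getClass)
    .toList
innerCaseClassTypes(parentCaseClass) // List(class MyThing)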
I wrote some Scala code, using reflection, that returns all vals in an object that are of a certain type. Below are three versions of this code. One of them works but is ugly. Two attempts to improve it don't work, in very different ways. Can you explain why?
First, the code:
import scala.reflect.runtime._
import scala.util.Try
trait ScopeBase[T] {
// this version tries to generalize the type. The only difference
// from the working version is [T] instead of [String]
def enumerateBase[S: universe.TypeTag]: Seq[T] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
}
trait ScopeString extends ScopeBase[String] {
// This version works but requires passing the val type
// (String, in this example) explicitly. I don't want to
// duplicate the code for different val types.
def enumerate[S: universe.TypeTag]: Seq[String] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
// This version tries to avoid passing the object's type
// as the [S] type parameter. After all, the method is called
// on the object itself; so why pass the type?
def enumerateThis: Seq[String] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[this.type].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
}
// The working example
object Test1 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerate[Test1.type]
}
// This shows how the attempt to generalize the type doesn't work
object Test2 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateBase[Test2.type]
}
// This shows how the attempt to drop the object's type doesn't work
object Test3 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateThis
}
val test1 = Test1.fields // List(test)
val test2 = Test2.fields // List(13, test)
val test3 = Test3.fields // List()
The "enumerate" method does work. However, as you can see from the Test1 example, it requires passing the object's own type (Test1.type) as a parameter, which should not have been necessary. The "enumerateThis" method tries to avoid that but fails, producing an empty list. The "enumerateBase" method attempts to generalize the "enumerate" code by passing the val type as a parameter. But it fails, too, producing the list of all vals, not just those of a certain type.
Any idea what's going on?
The problem in your generic implementation is the loss of type information for T. Also, don't use exceptions as your primary method of control flow (it's very slow!). Here's a working version of your base class.
abstract class ScopeBase[T : universe.TypeTag, S <: ScopeBase[T, S] : universe.TypeTag : scala.reflect.ClassTag] {
self: S =>
def enumerateBase: Seq[T] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].baseClasses.map(_.asType.toType).flatMap(
_.decls
.filter(_.typeSignature.resultType <:< universe.typeOf[T])
.filter(_.isMethod)
.map(_.asMethod)
.filter(_.isAccessor)
.map(decl => mirror.reflectMethod(decl).apply().asInstanceOf[T])
.filter(_ != null)
).toSeq
}
}
trait Inherit {
val StringField2: String = "test2"
}
class Test1 extends ScopeBase[String, Test1] with Inherit {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateBase
}
object Test extends App {
println(new Test1().fields)
}
Instead of getting the type from universe.typeOf, you can use the runtime class via currentMirror.classSymbol(getClass).toType. Below is an example that works:
def enumerateThis: Seq[String] = {
val mirror = currentMirror.reflect(this)
currentMirror.classSymbol(getClass).toType.decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
//prints List(test)
With everyone's help, here's the final version that works:
import scala.reflect.runtime.{currentMirror, universe}
abstract class ScopeBase[T: universe.TypeTag] {
lazy val enumerate: Seq[T] = {
val mirror = currentMirror.reflect(this)
currentMirror.classSymbol(getClass).baseClasses.map(_.asType.toType).flatMap {
_.decls
.filter(_.typeSignature.resultType <:< universe.typeOf[T])
.filter(_.isMethod)
.map(_.asMethod)
.filterNot(_.isConstructor)
.filter(_.paramLists.size == 0)
.map(decl => mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
.filter(_ != null).toSeq
}
}
}
trait FieldScope extends ScopeBase[Field[_]]
trait DbFieldScope extends ScopeBase[DbField[_, _]] {
// etc....
}
As you see from the last few lines, my use cases are limited to scope objects for specific field types. This is why I want to parameterize the scope container. If I wanted to enumerate the fields of multiple types in a single scope container, then I would have parameterized the enumerate method.
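For context, a rough sketch of how such a scope object might be used; the Field class here is hypothetical, only to make the sketch self-contained (the real Field/DbField types are not shown above):
class Field[T](val fieldName: String) // hypothetical stand-in for the real Field class
object UserFields extends FieldScope {
  val name = new Field[String]("name")
  val age = new Field[Int]("age")
  val tableName = "users" // not a Field, so enumerate filters it out
}
// UserFields.enumerate should contain the two Field values (name and age).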
I'm using Scala and Slick here, and I have a base repository that is responsible for doing the basic CRUD for my classes.
As a design decision, we have updatedTime and createdTime columns that are handled by the application, not by triggers in the database. Both of these fields are Joda DateTime instances.
Those fields are defined in two traits called HasUpdatedAt and HasCreatedAt, used by the tables:
trait HasCreatedAt {
val createdAt: Option[DateTime]
}
case class User(name:String,createdAt:Option[DateTime] = None) extends HasCreatedAt
I would like to know how I can use reflection to call the User copy method, to update the createdAt value during the database insertion method.
Edit, after #vptron's and #kevin-wright's comments:
I have a repo like this
trait BaseRepo[ID, R] {
def insert(r: R)(implicit session: Session): ID
}
I want to implement the insert just once, and there I want createdAt to be updated; that's why I'm not using the copy method directly, otherwise I would need to implement it everywhere I use the createdAt column.
This question was answered here to help others with this kind of problem.
I ended up using this code to execute the copy method of my case classes using Scala reflection.
import reflect._
import scala.reflect.runtime.universe._
import scala.reflect.runtime._
class Empty
val mirror = universe.runtimeMirror(getClass.getClassLoader)
// paramName is the name of the parameter whose value I want to replace
// paramValue is the new parameter value
def updateParam[R : ClassTag](r: R, paramName: String, paramValue: Any): R = {
val instanceMirror = mirror.reflect(r)
val decl = instanceMirror.symbol.asType.toType
val members = decl.members.map(method => transformMethod(method, paramName, paramValue, instanceMirror)).filter {
case _: Empty => false
case _ => true
}.toArray.reverse
val copyMethod = decl.declaration(newTermName("copy")).asMethod
val copyMethodInstance = instanceMirror.reflectMethod(copyMethod)
copyMethodInstance(members: _*).asInstanceOf[R]
}
def transformMethod(method: Symbol, paramName: String, paramValue: Any, instanceMirror: InstanceMirror) = {
val term = method.asTerm
if (term.isAccessor) {
if (term.name.toString == paramName) {
paramValue
} else instanceMirror.reflectField(term).get
} else new Empty
}
With this I can execute the copy method of my case classes, replacing a determined field value.
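For example, with the User case class from the question (and Joda Time on the classpath), a call might look like this:
import org.joda.time.DateTime
val user = User("john", None)
val saved = updateParam(user, "createdAt", Some(DateTime.now()))
// saved should be User("john", Some(<now>))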
As the comments have said, don't change a val using reflection. Would you do that with a Java final variable? It makes your code do really unexpected things. If you need to change the value of a val, don't use a val, use a var.
trait HasCreatedAt {
var createdAt: Option[DateTime] = None
}
case class User(name:String) extends HasCreatedAt
Note, though, that having a var in a case class may bring some unexpected behavior; e.g., copy will not carry the var's value over. This may lead you to prefer not using a case class for this.
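To make that caveat concrete, a quick sketch (using the var-based trait above and Joda Time):
import org.joda.time.DateTime
val u1 = User("john")
u1.createdAt = Some(DateTime.now())
val u2 = u1.copy(name = "jane")
// u2.createdAt is None: the var lives in the trait, not in the constructor
// parameter list, so copy does not carry it over.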
Another approach would be to make the insert method return an updated copy of the case class, e.g.:
trait HasCreatedAt[R <: HasCreatedAt[R]] {
  val createdAt: Option[DateTime]
  // Returning the concrete type R (instead of this.type) lets the
  // copy-based implementation below compile.
  def withCreatedAt(dt: DateTime): R
}
case class User(name: String, createdAt: Option[DateTime] = None) extends HasCreatedAt[User] {
  def withCreatedAt(dt: DateTime) = this.copy(createdAt = Some(dt))
}
trait BaseRepo[ID, R <: HasCreatedAt[R]] {
  def insert(r: R)(implicit session: Session): (ID, R) = {
    val id = ??? //insert into db
    (id, r.withCreatedAt(??? /*now*/))
  }
}
EDIT:
Since I didn't answer your original question, and you may know what you are doing, I am adding a way to do this.
import scala.reflect.runtime.universe._
val user = User("aaa", None)
val m = runtimeMirror(getClass.getClassLoader)
val im = m.reflect(user)
val decl = im.symbol.asType.toType.declaration("createdAt":TermName).asTerm
val fm = im.reflectField(decl)
fm.set(??? /*now*/)
But again, please don't do this. Read this Stack Overflow answer to get some insight into what it can cause (vals map to final fields).
I am quite new to the Scala programming language, and I currently need to do the following. I have a singleton object like this:
import scala.collection.immutable.HashMap
object MyObject extends Serializable {
val map: HashMap[String, Int] = null
val x: Int = -1
val foo: String = ""
}
Now I want to avoid having to serialize each field of this object separately, so I was considering writing the whole object to a file and then, in the next execution of the program, reading the file and initializing the singleton object from there. Is there any way to do this?
Basically, what I want is this: when the serialization file doesn't exist, those variables should be initialized to new structures, and when it does exist, the fields should be initialized from the ones in the file. But I want to avoid having to serialize/deserialize every field manually...
UPDATE:
I had to use a custom deserializer, as presented here: https://issues.scala-lang.org/browse/SI-2403, since I had issues with a custom class I use as values inside the HashMap.
UPDATE2:
Here is the code I use to serialize:
val store = new ObjectOutputStream(new FileOutputStream(new File("foo")))
store.writeObject(MyData)
store.close
And the code to deserialize (in a different file):
#transient private lazy val loadedData: MyTrait = {
if(new File("foo").exists()) {
val in = new ObjectInputStream(new FileInputStream("foo")) {
override def resolveClass(desc: java.io.ObjectStreamClass): Class[_] = {
try { Class.forName(desc.getName, false, getClass.getClassLoader) }
catch { case ex: ClassNotFoundException => super.resolveClass(desc) }
}
}
val obj = in.readObject().asInstanceOf[MyTrait]
in.close
obj
}
else null
}
Thanks,
No need to serialize an object with only immutable fields (because the compiler will do it for you...). I will assume that the object provides default values. Here is a way to do this:
Start by writing a trait with all the required fields:
trait MyTrait {
def map: HashMap[String, Int]
def x: Int
def foo: String
}
Then write an object with the defaults:
object MyDefaults extends MyTrait {
val map = HashMap.empty[String, Int]
val x = -1
val foo = ""
}
Finally, write an implementation that deserializes the data if it exists:
object MyData extends MyTrait {
private lazy val loadedData: Option[MyTrait] = {
if( /* filename exists */ ) Some( /*unserialize filename as MyTrait*/)
else None
}
lazy val map = loadedData.getOrElse(MyDefaults).map
lazy val x = loadedData.getOrElse(MyDefaults).x
lazy val foo = loadedData.getOrElse(MyDefaults).foo
}
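For completeness, the two placeholders could be filled in with plain Java serialization, in line with the asker's update; a sketch of a helper that could live inside MyData (it assumes the object was written with ObjectOutputStream to a file named "foo"):
import java.io.{File, FileInputStream, ObjectInputStream}
def unserialize(fileName: String): Option[MyTrait] =
  if (!new File(fileName).exists()) None
  else {
    val in = new ObjectInputStream(new FileInputStream(fileName))
    try Some(in.readObject().asInstanceOf[MyTrait])
    finally in.close()
  }
// in MyData: private lazy val loadedData: Option[MyTrait] = unserialize("foo")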