Is it possible to add a member variable to a class from outside the class? (Or mimic this behavior?)
Here's an example of what I'm trying to do. I already use an implicit conversion to add additional functions to RDD, so I added a variable to ExtendedRDDFunctions. I'm guessing this doesn't work because the variable is lost after the conversion in an rdd.setMember(string) call.
Is there any way to get this kind of functionality? Is this the wrong approach?
implicit def toExtendedRDDFunctions(rdd: RDD[Map[String, String]]): ExtendedRDDFunctions = {
  new ExtendedRDDFunctions(rdd)
}
class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) extends Logging with Serializable {
  var member: Option[String] = None

  def getMember(): String = {
    if (member.isDefined) {
      return member.get
    } else {
      return ""
    }
  }

  def setMember(field: String): Unit = {
    member = Some(field)
  }

  def queryForResult(query: String): String = {
    // Uses member here
  }
}
EDIT:
I am using these functions as follows: I first call rdd.setMember("state"), then rdd.queryForResult(expression).
Because the implicit conversion is applied each time you invoke a method defined in ExtendedRDDFunctions, there is a new instance of ExtendedRDDFunctions created for every call to setMember and queryForResult. Those instances do not share any member variables.
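For example, with your original code the two calls from your edit each go through a fresh wrapper (a sketch of what happens):

// Each line below triggers toExtendedRDDFunctions(rdd) again,
// producing a brand-new ExtendedRDDFunctions with member = None.
rdd.setMember("state")          // sets member on wrapper instance #1, which is then discarded
rdd.queryForResult(expression)  // runs on wrapper instance #2, whose member is still None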
You have basically two options:
Maintain a Map[RDD, String] in the companion object of ExtendedRDDFunctions which you use to assign the member value to an RDD in setMember. This is the evil option, as you introduce global state and open the door to a whole range of errors.
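A rough sketch of that first option, purely for illustration (the mutable map and its handling are my assumptions, and it carries all the downsides just mentioned):

object ExtendedRDDFunctions {
  // Global, mutable state keyed by the RDD instance: easy to leak and easy to misuse.
  private val members = scala.collection.mutable.Map.empty[RDD[Map[String, String]], String]
}

class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) {
  def setMember(field: String): Unit =
    ExtendedRDDFunctions.members.update(rdd, field)

  def getMember(): String =
    ExtendedRDDFunctions.members.getOrElse(rdd, "")
}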
Create a wrapper class that contains your member value and is returned by the setMember method:
case class RDDWithMember(rdd: RDD[Map[String, String]], member: String) extends RDD[Map[String, String]] {

  def queryForResult(query: String): String = {
    // Uses member here
  }

  // methods of the RDD interface, just delegate to rdd
}

implicit class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) {
  def setMember(field: String): RDDWithMember = {
    RDDWithMember(rdd, field)
  }
}
Besides avoiding the global state, this approach is also more type-safe because you cannot call queryForResult on instances that do not have a member. The only downsides are that you have to delegate all members of RDD and that queryForResult is not defined on RDD itself.
The first issue can probably be addressed with some macro magic (search for "delegate" or "proxy" and "macro").
The latter issue can be resolved by defining an additional extension method in ExtendedRDDFunctions that checks whether the RDD is an RDDWithMember:
implicit class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) {

  def setMember(field: String): RDDWithMember = // ...

  def queryForResult(query: String): Option[String] = rdd match {
    case wm: RDDWithMember => Some(wm.queryForResult(query))
    case _                 => None
  }
}
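With these definitions in place, the usage from your edit would look roughly like this (a sketch, reusing the expression value from your example):

val withMember = rdd.setMember("state")                     // returns an RDDWithMember carrying the member
val result: String = withMember.queryForResult(expression)  // member is guaranteed to be present

rdd.queryForResult(expression)                              // extension method: returns None for a plain RDD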
import ExtendedRDDFunctions._
will import all attributes and functions from the companion object so that they can be used in the body of your class.
For your use case, look into the delegate pattern.
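For illustration, a minimal, generic sketch of that pattern (the names here are made up, not taken from your code): the wrapper adds the extra member and forwards the rest of the interface to the wrapped instance.

trait DataSource {
  def rows: Seq[Map[String, String]]
}

// The delegate: adds `member` and forwards the interface to `underlying`.
class DataSourceWithMember(underlying: DataSource, val member: String) extends DataSource {
  def rows: Seq[Map[String, String]] = underlying.rows
}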
Related
I have a class like this -
class Cache(
  tableName: String,
  TTL: Int) {
  // Creates a cache
}
I have a companion object that returns different types of caches. It has functions that require a base table name and can construct the cache.
object Cache {
  def getOpsCache(baseTableName: String): Cache = {
    new Cache(s"${baseTableName}_ops", OpsTTL);
  }

  def getSnapshotCache(baseTableName: String): Cache = {
    new Cache(s"${baseTableName}_snaps", SnapshotTTL);
  }

  def getMetadataCache(baseTableName: String): Cache = {
    new Cache(s"${baseTableName}_metadata", MetadataTTL);
  }
}
The object does a few more things, and the Cache class has more parameters, which makes it useful to have a companion object to create the different types of caches. The baseTableName parameter is the same for all of the caches. Is there a way in which I can pass this parameter only once and then just call the functions to get the different types of caches?
An alternative to this is to create a factory class, pass the baseTableName parameter to its constructor and then call the functions. But I am wondering if it could be done in any way with the companion object.
The simplest way is to put your factory in a case class:
case class CacheFactory(baseTableName: String) {

  lazy val getOpsCache: Cache =
    Cache(s"${baseTableName}_ops", OpsTTL)

  lazy val getSnapshotCache: Cache =
    Cache(s"${baseTableName}_snaps", SnapshotTTL)

  lazy val getMetadataCache: Cache =
    Cache(s"${baseTableName}_metadata", MetadataTTL)
}
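Client code then supplies the base table name only once (a usage sketch; OpsTTL and the other TTL constants are assumed to be in scope):

val caches = CacheFactory("myBaseTable")
caches.getOpsCache       // Cache("myBaseTable_ops", OpsTTL)
caches.getSnapshotCache  // Cache("myBaseTable_snaps", SnapshotTTL)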
As I like case classes, I changed your Cache to a case class as well:
case class Cache(tableName: String, TTL: Int)
As you can see, I also adjusted your Java-style code to idiomatic Scala.
If you want to keep the factory methods in the companion object, you could use an implicit parameter, like:
object Cache {
  def getOpsCache(implicit baseTableName: String): Cache =
    Cache(s"${baseTableName}_ops", OpsTTL)

  def getSnapshotCache(implicit baseTableName: String): Cache =
    Cache(s"${baseTableName}_snaps", SnapshotTTL)

  def getMetadataCache(implicit baseTableName: String): Cache =
    Cache(s"${baseTableName}_metadata", MetadataTTL)
}
Then your client code looks like:
implicit val baseTableName: String = "baseName"

Cache.getSnapshotCache
Cache.getMetadataCache
Consider creating an algebraic data type, like so:
sealed abstract class Cache(tablePostfix: String, ttl: Int) {
  val tableName = s"baseTableName_$tablePostfix"
}

case object OpsCache extends Cache("ops", 60)
case object SnapshotCache extends Cache("snaps", 120)
case object MetadataCache extends Cache("metadata", 180)
OpsCache.tableName // res0: String = baseTableName_ops
I have a class
class MyClass {
  def apply(myRDD: RDD[String]) {
    val rdd2 = myRDD.map(myString => {
      // do String manipulation
    })
  }
}

object MyClass {
}
Since I have a block of code performing one task (the area that says "do String manipulation"), I thought I should break it out into its own method. Since the method is not changing the state of the class, I thought I should make it a static method.
How do I do that?
I thought that you could just put a method inside the companion object and it would be available as a static method, like this:
object MyClass {
  def doStringManipulation(myString: String) = {
    // do String manipulation
  }
}
but when I try val rdd2 = myRDD.map(myString => { doStringManipulation(myString) }), Scala doesn't recognize the method and forces me to write MyClass.doStringManipulation(myString) in order to call it.
What am I doing wrong?
In Scala there are no static methods: all methods are defined on an object, be it an instance of a class or a singleton like the one you defined in your question.
As you correctly pointed out, by having a class and an object with the same name in the same compilation unit, you make the object a companion of the class. This means the two have access to each other's private fields and methods, but it does not mean the object's members are available without specifying which object you are accessing.
What you want to do is either use the long form as mentioned (MyClass.doStringManipulation(myString)) or, if you think it makes sense, import the method into the class's scope, as follows:
import MyClass.doStringManipulation

class MyClass {
  def apply(myRDD: RDD[String]): Unit = {
    val rdd2 = myRDD.map(doStringManipulation)
  }
}

object MyClass {
  private def doStringManipulation(myString: String): String = {
    ???
  }
}
As a side note, for the MyClass.apply method you used a notation which is going to disappear in future versions of Scala:
// this is a shorthand for a method that returns `Unit`, but it is going to disappear
def method(parameter: Type) {
  // does things
}

// this means the same, but it's going to stay
// the `=` is enough, even without the explicit return type
// unless, that is, you want to force the method to discard the last value and return `Unit`
def method(parameter: Type): Unit = {
  // does things
}
You should follow Scala's advice:
val rdd2 = myRDD.map(MyClass.doStringManipulation)
Write this inside the class and then it will work as expected:
import MyClass._
I've built a microservice using Scala and Play and now I need to create a new version of the service that returns the same data as the previous version of the service but in a different JSON format. The service currently uses implicit Writes converters to do this. My controller looks something like this, where MyJsonWrites contains the implicit definitions.
class MyController extends Controller with MyJsonWrites {
  def myAction(query: String) = Action.async {
    getData(query).map { results =>
      Ok(Json.toJson(results))
    }
  }
}

trait MyJsonWrites {
  implicit val writes1: Writes[SomeDataType]
  implicit val writes2: Writes[SomeOtherDataType]
  ...
}
Now I need a new version of myAction where the JSON is formatted differently. The first attempt I made was to make MyController a base class and have subclasses extend it with their own trait that has the implicit values. Something like this.
class MyNewController extends MyController with MyNewJsonWrites
This doesn't work though because the implicit values defined on MyNewJsonWrites are not available in the methods of the super class.
It would be ideal if I could just create a new action on the controller that somehow used the converters defined in MyNewJsonWrites. Sure, I could change the trait to an object and import the implicit values in each method but then I'd have to duplicate the method body of myAction so that the implicits are in scope when I call Json.toJson. I don't want to pass them as implicit parameters to a base method because there are too many of them. I guess I could pass a method as a parameter to the base method that actually does the imports and Json.toJson call. Something like this. I just thought maybe there'd be a better way.
def myBaseAction(query: String, toJson: Seq[MyResultType] => JsValue) = Action.async {
  getData(query).map { results =>
    Ok(toJson(results))
  }
}
def myActionV1(query: String) = {
  def toJson(results: Seq[MyResultType]) = {
    import MyJsonWritesV2._
    Json.toJson(results)
  }
  myBaseAction(query, toJson)
}
Instead of relying on Scala implicit resolution, you can call your Writes instances directly:
def myBaseAction(query: String, writes: Writes[MyResultType]) = Action.async {
  getData(query).map { results =>
    val seqWrites: Writes[Seq[MyResultType]] = Writes.seq(writes)
    Ok(seqWrites.writes(results))
  }
}
def myActionV1(query: String) = myBaseAction(query, MyJsonWritesV1)
def myActionV2(query: String) = myBaseAction(query, MyJsonWritesV2)
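This assumes MyJsonWritesV1 and MyJsonWritesV2 are themselves Writes[MyResultType] instances rather than traits full of implicits, for example something along these lines (a sketch; the field name is invented):

import play.api.libs.json._

// Sketch: each versioned object is itself a Writes[MyResultType].
object MyJsonWritesV1 extends Writes[MyResultType] {
  def writes(result: MyResultType): JsValue = Json.obj(
    "name" -> result.name // hypothetical field of MyResultType
  )
}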
When using spray-json, I need to bring a JsonFormat[A] into implicit scope for every domain type A that I want to serialize.
The recommended approach is to create a custom object with all the implicits as fields:
object MyJsonProtocol extends DefaultJsonProtocol {
  implicit val colorFormat = jsonFormat4(Color)
}
import MyJsonProtocol._
My app has a great many domain types, some of which have long names. My MyJsonProtocol is getting long and unreadable:
object MyJsonProtocol extends DefaultJsonProtocol {
  ... // many more here
  implicit val someClassWithALongNameFormat = jsonFormat4(SomeClassWithALongName)
  implicit val someClassWithALongNameInnerFormat = jsonFormat4(SomeClassWithALongNameInner)
  implicit val someClassWithALongNameVariantBFormat = jsonFormat4(SomeClassWithALongNameVariantB)
  ... // many more here
}
The long val names have various problems:
they feel redundant (the names are never read)
they make my lines very long
they introduce a copy/paste risk that the name of the format won't match the type of the format
they make the RHS values not aligned, which hides the common pattern here
Is there any way to bring an object into implicit scope without naming it? Something like this would be much neater:
object MyJsonProtocol extends DefaultJsonProtocol {
  ... // many more here
  implicit val _ = jsonFormat4(SomeClassWithALongName)
  implicit val _ = jsonFormat4(SomeClassWithALongNameInner)
  implicit val _ = jsonFormat4(SomeClassWithALongNameVariantB)
  ... // many more here
}
... but Scala doesn't allow multiple fields named "_".
Is there any way to bring these formats into implicit scope without naming them all? Is there another way to use spray-json that avoids this issue?
Normally, I define typeclass instances in the companion objects:
case class Foo()

object Foo {
  implicit val jsonFormatter = new JsonFormat[Foo] { ... }
}

case class Bar()

object Bar {
  implicit val jsonFormatter = new JsonFormat[Bar] { ... }
}
I don't have to import anything, as companion objects are by default included in the implicit search scope, and the implicit members can all have the same names.
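At the call site the formats are then picked up automatically from the companion objects (a sketch, assuming spray-json's standard toJson syntax):

import spray.json._

// No MyJsonProtocol-style import is needed: the compiler finds
// Foo.jsonFormatter and Bar.jsonFormatter in the companion objects.
val fooJson: JsValue = Foo().toJson
val barJson: JsValue = Bar().toJson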
class PasswordCaseClass(val password: String)

trait PasswordTrait { self: PasswordCaseClass =>
  override def password = "blue"
}

val o = new PasswordCaseClass("flowers") with PasswordTrait
Is it possible to override PasswordCaseClass's password with what is provided in PasswordTrait? Right now, I receive this error:
e.scala:6: error: overriding value password in class PasswordCaseClass of type String;
 method password in trait PasswordTrait of type => java.lang.String needs to be a stable, immutable value
val o = new PasswordCaseClass("flowers") with PasswordTrait
^
one error found
I would like to be able to have something like this:
class User(val password: String) {
}

trait EncryptedPassword { u: User =>
  def password = SomeCriptographyLibrary.encrypt(u.password)
}

val u = new User("random_password") with EncryptedPassword
println(u.password) // see the encrypted version here
You can override a def with a val, but you can't do it the other way around. A val implies a guarantee -- that its value is stable and immutable -- that a def does not.
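A quick sketch illustrating the rule:

trait HasName {
  def name: String // abstract def in the base type
}

class Named extends HasName {
  val name = "Ada" // overriding a def with a val: allowed
}

// The reverse direction is rejected by the compiler:
// trait HasStableName { val name: String }
// class Unstable extends HasStableName {
//   def name = "Ada" // error: method name needs to be a stable, immutable value
// }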
This worked for me (with some modifications):
trait PasswordLike {
  val password: String
}

class PasswordCaseClass(val password: String) extends PasswordLike

trait PasswordTrait extends PasswordLike {
  override val password: String = "blue"
}
and then:
scala> val o = new PasswordCaseClass("flowers") with PasswordTrait
o: PasswordCaseClass with PasswordTrait = $anon$1@c2ccac

scala> o.password
res1: String = blue
You are trying to override a value with a method definition. It simply makes no sense, as they have different semantics: values are supposed to be computed once per object lifecycle (and stored in a final class attribute), while methods can be evaluated multiple times. So what you are trying to do would break the contract of the class in a number of ways.
That said, the compiler is also partly at fault here: the error explanation is quite unclear.