Is there a way to make sure that a class overrides hashCode - scala

I am creating a cache, and want to make sure that the key type overrides hashCode.
If hashCode were not already defined on Object, something like this would work:
trait Key {
  def hashCode: Int
}
If the keys are always case classes it is obviously not a problem, but I want to make sure that if somebody passes a regular class it will fail. Is there a way to do it in Scala?
On a side note: my keys are specifications for a SQL query, which are currently represented as case classes. For example:
case class Filter(age: Option[Int], gender: Option[String])
But eventually, I want to represent it using a cleaner specification pattern implementation (for example: https://gist.github.com/lbialy/912fad3c909374b81ce7)

If you want to explicitly whitelist classes that are allowed to use their hashCode, you cannot use inheritance for that, but you can provide your own typeclass:
trait HasApprovedHashCode[X] {
  def hashCode(x: X): Int
}
and then modify all the methods that crucially rely on a proper implementation of hashCode like this:
def methodRelyingOnHashCode[K: HasApprovedHashCode, V](...) = ...
Now you can explicitly whitelist only those classes that you deem to have a good enough implementation of hashCode.
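For illustration, here is a minimal sketch of how this might be wired up, reusing the HasApprovedHashCode trait above and the Filter case class from the question; the Cache class and its method are made up for the example:

implicit val filterHasApprovedHashCode: HasApprovedHashCode[Filter] =
  new HasApprovedHashCode[Filter] {
    // Case classes already get a structural hashCode, so simply delegate to it.
    def hashCode(x: Filter): Int = x.hashCode
  }

// A cache whose key type must have been explicitly approved.
class Cache[K: HasApprovedHashCode, V] {
  private val store = scala.collection.mutable.HashMap.empty[K, V]
  def getOrElseUpdate(key: K, compute: => V): V = store.getOrElseUpdate(key, compute)
}

val cache = new Cache[Filter, List[String]]   // compiles: an instance for Filter is in scope
// new Cache[SomeRegularClass, List[String]]  // would not compile: no instance provided

Passing a key type without an instance fails at compile time, which is exactly the "fail if somebody passes a regular class" behaviour asked for.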
Usually, I would say: the hash code of the key type is not your responsibility. If the user of your library insists on shooting themselves in the foot, you cannot prevent it. You shouldn't facilitate it, or create a situation where it is almost inevitable, but it's not your responsibility to hunt down every single class out there that could somehow misbehave when used as the key of a map.

Related

Is there a difference between extending a trait with a type or using a type parameter in your case class?

I recently had a coworker implement a trait like
trait CaseClassStuff {
  type T
  val value: T
}
and then used it to declare case classes such as
case class MyCaseClassString(value: String) extends CaseClassStuff { type T = String }
case class MyCaseClassDouble(value: Double) extends CaseClassStuff { type T = Double }
and I thought that was particularly whacky since it seemed reasonable enough to just do
case class MyCaseClass[T](value: T)
to get the exact same result. There was argument over how using the trait allowed us to avoid needing to update anything using that case class, since with the trait we just explicitly used MyCaseClassString and MyCaseClassDouble in different areas, but I wasn't sure how since they seemed to be ostensibly the same thing, especially since the only change between the two is their type. The program using them was set up to parse out the logic when it was a double or a string received.
So, my question is whether they are different as far as the compiler is concerned, and whether there is an actual benefit to the trait approach in general, or whether it was just specific to my situation. It wasn't clear to either of us whether it is better practice to use the trait or just the type parameter, since they seem like two ways to accomplish the same outcome.
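For what it's worth, here is a small sketch, reusing the definitions above, of where the two designs actually diverge at the use site: with the type member the element type can be left out of signatures and recovered as a path-dependent type, while with the type parameter it stays visible in the class's type.

// Type-member version: callers can take the base trait; the element type
// comes back as the path-dependent type s.T.
def describe(s: CaseClassStuff): String = s"value = ${s.value}"

// Type-parameter version: callers either stay generic in T or fix it explicitly.
def describeParam[T](s: MyCaseClass[T]): String = s"value = ${s.value}"

val hidden: CaseClassStuff = MyCaseClassString("hi")   // T does not appear in the type
val visible: MyCaseClass[String] = MyCaseClass("hi")   // T is part of the type

Functionally the two are interchangeable here; the difference only shows up in signatures and in how much type information callers have to carry around.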

Caching Scala Case Class Instances

Suppose we have the following case classes:
abstract sealed class Tree
case class Leaf(i: Int) extends Tree
case class Node(left: Tree, right: Tree) extends Tree
Every time we call a case class constructor, a new object is created in memory. For instance, in the code below:
val a = Leaf(0)
val b = Leaf(0)
a and b point to distinct objects in memory:
a == b // true
a eq b // false
I would like to override the "apply" method of the case classes, to make them return a cached object, in case it already exists, so that, in the minimal example above, "a eq b" would return true.
I found these two related answers in Stackoverflow:
How to override apply in a case class companion (shows how to override "apply" method)
Why do each new instance of case classes evaluate lazy vals again in Scala? (shows a simple way to cache case class instances)
I am planning to implement my overriding "apply" method with caching in a way that combines the two approaches linked above. But I am wondering if there are alternative ways that I should consider. If you know any, could you please share your solution here?
Caching instances of case classes seems to be a very useful and natural thing to do to reduce memory consumption. And yet, the solution I am planning to implement (based on the two answers linked above) seems quite convoluted, requiring a lot of boilerplate code that will compromise the elegance and succinctness of case classes. Does anyone know if future versions of the Scala language might allow us to achieve case class instance caching by writing something simple like this:
abstract sealed class Tree
cached case class Leaf(i: Int) extends Tree
cached case class Node(left: Tree, right: Tree) extends Tree
??
Caching instances of case classes seems to be a very useful and natural thing to do to reduce memory consumption.
Note that this isn't even remotely an automatic improvement; it very much depends on the usage pattern of the case class (not just yours, but that of anybody who uses your library):
- You need to take into account the memory the cache needs, and the inability to garbage collect instances referenced from the cache (note that using a WeakHashMap won't help: it requires "that value objects do not strongly refer to their own keys, either directly or indirectly").
- If the keys are primitives (as in Leaf), they need to be boxed before lookup, which is often already a constructor call.
- Lookup in a map is significantly slower than a trivial constructor call.
- Escape analysis will often ensure the objects aren't actually constructed, while making sure your program works as if they were. Of course, caching will ensure that objects do escape.
But neglecting all that, you can write a macro annotation which will allow you to write @cached case class Leaf(i: Int) extends Tree and generate the code you want (or at least @cachedCase class; I am not sure if you'll be able to override apply otherwise). Because of the above, I just wouldn't expect it to become a part of the language any time soon.
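For reference, here is one way to hand-roll the caching today, as a sketch only: it relies on the "abstract case class" trick, which suppresses the generated apply and copy so the companion can supply its own apply; the unbounded, non-thread-safe mutable map is an assumption made for brevity.

import scala.collection.mutable

abstract sealed class Tree

// Declaring the case class abstract stops the compiler from generating
// apply and copy, while equals/hashCode/toString are still generated.
abstract case class Leaf private (i: Int) extends Tree

object Leaf {
  private val cache = mutable.HashMap.empty[Int, Leaf]
  // Caching factory: returns the existing instance for a given i, if any.
  def apply(i: Int): Leaf = cache.getOrElseUpdate(i, new Leaf(i) {})
}

val a = Leaf(0)
val b = Leaf(0)
a == b // true, structural equality as before
a eq b // true, both references come from the cache

Node can be handled the same way with a (Tree, Tree) key; all of the caveats above (memory retention, boxing, lookup cost) still apply.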

Scala type alias with companion object

I'm a relatively new Scala user and I wanted to get an opinion on the current design of my code.
I have a few classes that are all represented as fixed length Vector[Byte] (ultimately they are used in a learning algorithm that requires a byte string), say A, B and C.
I would like these classes to be referred to as A, B and C elsewhere in the package for readability's sake, and I don't need to add any extra methods to Vector for these types. Hence, I don't think the enrich-my-library pattern is useful here.
However, I would like to include all the useful functional methods that come with Vector without having to 'drill' into a wrapper object each time. As efficiency is important here, I also didn't want the added weight of a wrapper.
Therefore I decided to define type aliases in the package object:
package object abc {
  type A = Vector[Byte]
  type B = Vector[Byte]
  type C = Vector[Byte]
}
However, each has its own fixed length, and I would like to include factory methods for their creation. It seems like this is what companion objects are for. This is how my final design looks:
package object abc {
  type A = Vector[Byte]

  object A {
    val LENGTH: Int = ...
    def apply(...): A = {
      Vector.tabulate...
    }
  }
  ...
}
Everything compiles and it allows me to do stuff like this:
val a: A = A(...)
a map {...} mkString(...)
I can't find anything specifically warning against writing companion objects for type aliases, but it seems to go against how type aliases should be used. It also means that all three of these classes are defined in the same file, when ideally they should be separated.
Are there any hidden problems with this approach?
Is there a better design for this problem?
Thanks.
I guess it is totally ok, because you are not really implementing a companion object.
If you were, you would have access to the private fields of immutable.Vector from inside object A (e.g. private var dirty), which you do not have.
Thus, although it somewhat feels like A is a companion object, it really isn't.
If it were possible to create a companion object for any type via a type alias, member visibility constraints would be moot (except maybe for private[this]/protected[this]).
Furthermore, naming the object like the type alias clarifies context and purpose of the object, which is a plus in my book.
Having them all in one file is pretty common in Scala as far as I know (e.g. when using the type class pattern).
Thus:
No pitfalls that I know of.
And, imho, no need for a different approach.
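For reference, a self-contained version of the design under discussion might look like the following sketch; the LENGTH value and the body of apply are made up purely for illustration:

package object abc {
  type A = Vector[Byte]

  object A {
    val LENGTH: Int = 16 // hypothetical fixed length
    // Hypothetical factory: a fixed-length vector filled with a single byte value.
    def apply(fill: Byte = 0): A = Vector.tabulate(LENGTH)(_ => fill)
  }
}

// Usage elsewhere in the package:
// val a: A = A()
// a.map(b => (b + 1).toByte).mkString(",") // the full Vector API is available on A

Because A is just an alias for Vector[Byte], there is no wrapper at runtime; the object A is an ordinary singleton that merely shares the alias's name.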

How to correctly implement a custom number-like class in Scala?

I am currently trying to implement my own UnsignedInt. I would like to implement this correctly so that it fits into the whole Scala type & class system. However, I am really confused by all the classes that fit into Number.
With which class(es) do I need to work: Numeric, Integral or ScalaNumber? Or something completely different? What classes and/or traits should my own class implement?
The short answer is: don't implement your own, use the Spire one :) Otherwise, you should implement Integral (which extends Numeric). Note that your type shouldn't extend it; you need implicit values in the companion object, i.e.
class UnsignedInt { ... }

object UnsignedInt {
  implicit val UIntIntegral: Integral[UnsignedInt] = ...
}
You should also consider making your class a value class.
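To make that concrete, here is a rough sketch along those lines; all names, the widening-to-Long arithmetic, and the parseString handling are illustrative assumptions, and for real use the Spire UInt mentioned above is the safer choice:

final class UnsignedInt(val bits: Int) extends AnyVal {
  def toLongValue: Long = bits & 0xFFFFFFFFL // reinterpret the 32 bits as unsigned
  override def toString: String = toLongValue.toString
}

object UnsignedInt {
  def apply(bits: Int): UnsignedInt = new UnsignedInt(bits)

  implicit val UIntIntegral: Integral[UnsignedInt] = new Integral[UnsignedInt] {
    private def wrap(l: Long): UnsignedInt = new UnsignedInt(l.toInt)
    def plus(x: UnsignedInt, y: UnsignedInt) = wrap(x.toLongValue + y.toLongValue)
    def minus(x: UnsignedInt, y: UnsignedInt) = wrap(x.toLongValue - y.toLongValue)
    def times(x: UnsignedInt, y: UnsignedInt) = wrap(x.toLongValue * y.toLongValue)
    def quot(x: UnsignedInt, y: UnsignedInt) = wrap(x.toLongValue / y.toLongValue)
    def rem(x: UnsignedInt, y: UnsignedInt) = wrap(x.toLongValue % y.toLongValue)
    def negate(x: UnsignedInt) = wrap(-x.toLongValue)
    def fromInt(x: Int) = new UnsignedInt(x)
    def toInt(x: UnsignedInt) = x.bits
    def toLong(x: UnsignedInt) = x.toLongValue
    def toFloat(x: UnsignedInt) = x.toLongValue.toFloat
    def toDouble(x: UnsignedInt) = x.toLongValue.toDouble
    def compare(x: UnsignedInt, y: UnsignedInt) = java.lang.Long.compare(x.toLongValue, y.toLongValue)
    // Required by Numeric on Scala 2.13+; a harmless extra method on 2.12.
    def parseString(str: String): Option[UnsignedInt] =
      scala.util.Try(new UnsignedInt(java.lang.Integer.parseUnsignedInt(str))).toOption
  }
}

With the instance in the companion, UnsignedInt values work with any code that is generic in Numeric or Integral, e.g. List(UnsignedInt(1), UnsignedInt(2)).sum.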

Is it appropriate to define a non-trivial Scala case class?

I'm defining a Scala class today, and I think "I need an equals method and a hashCode method; and a copy method would be handy too. I'll turn this into a case class." My class already has a bunch of other code, and is in no way trivial.
So fine, it all works and everything, but when the text books deal with case classes, all of the examples define them for use as value classes or 'data transfer objects'. Is it appropriate to define a non-trivial case class? Is the thought process described above OK, or do I need to think of case classes differently?
A case class provides equals, hashCode and toString methods based on the main constructor parameters, all of which are turned into vals too. In addition, the companion object gets apply and unapply methods, again based on the main constructor parameters.
Also, a case class inherits from Serializable and from Product, and should not be extended by other classes.
If all of these things are appropriate for your class, then feel free to declare it as a case class.
Feel free, provided it doesn't have descendants. Extending case classes is a bad idea.
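As a small illustration of the point about generated members (this example is not from the question): they are derived only from the primary constructor parameters, so adding non-trivial members of your own does not interfere with them.

final case class Account(id: Long, name: String) {
  // Extra members: they do not affect the generated equals/hashCode/toString/copy,
  // which only look at id and name.
  private val normalized: String = name.trim.toLowerCase
  def rename(newName: String): Account = copy(name = newName)
  def matches(query: String): Boolean = normalized.contains(query.trim.toLowerCase)
}

val a1 = Account(1L, "Alice")
val a2 = Account(1L, "Alice")
a1 == a2 // true: structural equality from the case class
a1.hashCode == a2.hashCode // true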