I know that "union types" are not supported in Scala, but what about intersection types?
In short, I would like a function like this:
def intersect[A,B,C](a: A, b: B): C = ??? // a & b
Or a method :
class A {
def intersect[B, C](b: B): C = ??? // this & b
}
A and B share a common superclass ensuring the validity of the intersect operation, and C would be a type at the intersection of A and B.
In my use case, A or B represent either variables or constants (of the same type). I want to distinguish, class-wise, a constant from a variable with singleton domain. If I try to intersect a set with a value, I return the value (or return the empty set/throw an exception if the value is not in the set).
Here is an example of expected output:
trait IntExpression {
// Correct signature to be determined
def intersect[A <: IntExpression, B <: A & this.type](that: A): B
}
case class IntVariable(domain: Seq[Int]) extends IntExpression
case class IntConstant(value: Int) extends IntExpression
val a = IntVariable(Seq(1, 2, 3))
val b = IntVariable(Seq(2, 3, 4))
val c = IntConstant(2)
And then:
a intersect b == b intersect a == IntVariable(Seq(2, 3))
a intersect c == c intersect a == IntConstant(2)
As ziggystar correctly stated above: A & B is A with B in Scala.
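For instance (my own minimal illustration), a compound type written with with acts as an intersection:
trait HasName { def name: String }
trait HasId   { def id: Int }

// The argument must satisfy both traits at once.
def describe(x: HasName with HasId): String = x.id.toString + ": " + x.name

case class User(id: Int, name: String) extends HasName with HasId

describe(User(1, "Ada"))   // "1: Ada"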
Regarding the creation of your combined type C at runtime: you would have to construct that type at runtime, based on the types of A and B. A solution, or at least a clue, can be found at How to mix-in a trait to instance?.
However, what your use case really calls for is a dependent type (http://en.wikipedia.org/wiki/Dependent_type). Since dependent types are not supported in Scala, you can always try Agda ;)
I use ++ as the implementation because it's there. And intersect on Set doesn't work because Set is invariant. If this doesn't make sense, ignore it.
def intersect[C, A <: C, B <: C](as: Seq[A], bs: Seq[B]): Seq[C] = as ++ bs
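For example (my own illustration), C ends up being a common supertype of A and B:
// C pinned explicitly here; left out, the compiler infers a common supertype.
val xs: Seq[AnyVal] = intersect[AnyVal, Int, Double](Seq(1, 2, 3), Seq(2.5, 3.5))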
I have been trying to determine how one might write a type parameter that restricts a function to types that support relational operators.
For example:
def biggerOf[A <: ???](a: A, b: A): A = { if (a > b) a else b }
Where ??? is my dilemma. Advanced type parameter bits are new to me, so I'm asking for a little help. I thought AnyVal might be a winner, but for the Unit type (and Boolean, which won't break but won't work either). Thanks for any ideas.
You want to bring the Ordering typeclass into play.
import scala.math.Ordering.Implicits.infixOrderingOps
def biggerOf[A:Ordering](a: A, b: A): A = { if (a > b) a else b }
A:Ordering restricts A to types in the Ordering typeclass and infixOrderingOps enables the convenience operators (methods) such as <, >=, etc.
You can use the ordering typeclass.
def biggerOf[A : Ordering](a: A, b: A): A = {
import Ordering.Implicits._
if (a > b) a else b
}
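Either definition can then be used like this (my example); any type with an Ordering instance works, while types without one are rejected at compile time:
biggerOf(3, 5)              // 5
biggerOf("apple", "pear")   // "pear" (lexicographic Ordering[String])

case class Point(x: Int, y: Int)
// biggerOf(Point(1, 2), Point(3, 4))   // does not compile: no Ordering[Point] in scope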
As Luis and jwvh have suggested, you can use the Ordering typeclass, but I'd like to introduce the typeclass pattern and the use of typeclasses for restricting a function in more depth.
A typeclass allows you to stay flexible in your generic types while restricting them enough that types without an instance of the typeclass cannot use your function.
Take this as an example: assuming the numerical datatypes in Scala had no operators on them, we could generalize the idea of operating on them by introducing a Num typeclass and then specifying the behavior of each operation per type, because these numerical datatypes handle algebraic operations differently (e.g. an integral vs. a real number).
trait Num[A] {
def add(l: A, r: A): A
def sub(l: A, r: A): A
def mul(l: A, r: A): A
def div(l: A, r: A): A
}
object NumTest {
def addThenMultiply[A](l: A, r: A)(implicit ev: Num[A]): A =
ev.mul(ev.add(l, r), r)
//then we create Num instances for numerical types
implicit val intNum: Num[Int] = new Num[Int] { /* implementation */ }
implicit val floatNum: Num[Float] = new Num[Float] { /* implementation */ }
}
Now in this case, the function addThenMultiply will only work for Int and Float datatypes since they're the only ones that have a Num instance but you can add instances for other custom datatypes also depending on your need.
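For completeness, here is the answer's NumTest with the placeholder comments filled in for Int, plus a usage check (my own sketch, assuming the Num trait above):
object NumTest {
  def addThenMultiply[A](l: A, r: A)(implicit ev: Num[A]): A =
    ev.mul(ev.add(l, r), r)

  implicit val intNum: Num[Int] = new Num[Int] {
    def add(l: Int, r: Int): Int = l + r
    def sub(l: Int, r: Int): Int = l - r
    def mul(l: Int, r: Int): Int = l * r
    def div(l: Int, r: Int): Int = l / r // integral division
  }
}

import NumTest._
addThenMultiply(2, 3)          // (2 + 3) * 3 = 15
// addThenMultiply("a", "b")   // does not compile: no Num[String] instance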
Noticing that my code was essentially iterating over a list and updating a value in a Map, I first created a trivial helper method which took a function for transforming the map value and returned an updated map. As the program evolved, it gained a few other Map-transformation functions, so it was natural to turn it into an implicit value class that adds methods to scala.collection.immutable.Map[A, B]. That version works fine.
However, there's nothing about the methods that require a specific map implementation and they would seem to apply to a scala.collection.Map[A, B] or even a MapLike. So I would like it to be generic in the map type as well as the key and value types. This is where it all goes pear-shaped.
My current iteration looks like this:
implicit class RichMap[A, B, MapType[A, B] <: collection.Map[A, B]](
val self: MapType[A, B]
) extends AnyVal {
def updatedWith(k: A, f: B => B): MapType[A, B] =
self updated (k, f(self(k)))
}
This code does not compile because self updated (k, f(self(k))) is a scala.collection.Map[A, B], which is not a MapType[A, B]. In other words, the return type of self.updated is as if self's type were the upper type bound rather than the actual declared type.
I can "fix" the code with a downcast:
def updatedWith(k: A, f: B => B): MapType[A, B] =
self.updated(k, f(self(k))).asInstanceOf[MapType[A, B]]
This does not feel satisfactory because downcasting is a code smell and indicates misuse of the type system. In this particular case it would seem that the value will always be of the cast-to type, and the fact that the whole program compiles and runs correctly with this downcast supports this view, but it still smells.
So, is there a better way to write this code to have scalac correctly infer types without using a downcast, or is this a compiler limitation and a downcast is necessary?
[Edited to add the following.]
My code which uses this method is somewhat more complex and messy as I'm still exploring a few ideas, but an example minimum case is the computation of a frequency distribution as a side-effect with code roughly like this:
var counts = Map.empty[Int, Int] withDefaultValue 0
for (item <- items) {
// loads of other gnarly item-processing code
counts = counts updatedWith (item, 1 + _)
}
There are three answers to my question at the time of writing.
One boils down to just letting updatedWith return a scala.collection.Map[A, B] anyway. Essentially, it takes my original version that accepted and returned an immutable.Map[A, B], and makes the type less specific. In other words, it's still insufficiently generic and sets policy on which types the caller uses. I can certainly change the type on the counts declaration, but that is also a code smell to work around a library returning the wrong type, and all it really does is move the downcast into the caller's code. So I don't really like this answer at all.
The other two are variations on CanBuildFrom and builders in that they essentially iterate over the map to produce a modified copy. One inlines a modified updated method, whereas the other calls the original updated and appends it to the builder and thus appears to make an extra temporary copy. Both are good answers which solve the type correctness problem, although the one that avoids an extra copy is the better of the two from a performance standpoint and I prefer it for that reason. The other is however shorter and arguably more clearly shows intent.
In the case of a hypothetical immutable Map that shares large trees in a similar vein to List, this copying would break the sharing and reduce performance, so it would be preferable to use the existing modified map without performing copies. However, Scala's immutable maps don't appear to do this, so copying (once) seems to be the pragmatic solution and is unlikely to make any difference in practice.
Yes! Use CanBuildFrom. This is how the Scala collections library infers the closest collection type to the one you want, using CanBuildFrom evidence. So long as you have implicit evidence of CanBuildFrom[From, Elem, To] (where From is the type of collection you're starting with, Elem is the type contained within the collection, and To is the end result you want), the CanBuildFrom will supply a Builder to which you can add elements, and when you're done you can call Builder#result() to get the completed collection of the appropriate type.
In this case:
From = MapType[A, B]
Elem = (A, B) // The type actually contained in maps
To = MapType[A, B]
Implementation:
import scala.collection.generic.CanBuildFrom
implicit class RichMap[A, B, MapType[A, B] <: collection.Map[A, B]](
val self: MapType[A, B]
) extends AnyVal {
def updatedWith(k: A, f: B => B)(implicit cbf: CanBuildFrom[MapType[A, B], (A, B), MapType[A, B]]): MapType[A, B] = {
val builder = cbf()
builder ++= self.updated(k, f(self(k)))
builder.result()
}
}
scala> val m = collection.concurrent.TrieMap(1 -> 2, 5 -> 3)
m: scala.collection.concurrent.TrieMap[Int,Int] = TrieMap(1 -> 2, 5 -> 3)
scala> m.updatedWith(1, _ + 10)
res1: scala.collection.concurrent.TrieMap[Int,Int] = TrieMap(1 -> 12, 5 -> 3)
Please note that the updated method returns a Map, rather than the generic type, so I would say you should be fine returning a Map as well. But if you really want to return the proper type, you could have a look at the implementation of updated in List.updated.
I've written a small example. I'm not sure it covers all the cases, but it works in my tests. I also used a mutable Map because it was harder for me to test the immutable one, but I guess it can easily be converted.
implicit class RichMap[A, B, MapType[x, y] <: Map[x, y]](val self: MapType[A, B]) extends AnyVal {
import scala.collection.generic.CanBuildFrom
def updatedWith[R >: B](k: A, f: B => R)(implicit bf: CanBuildFrom[MapType[A, B], (A, R), MapType[A, R]]): MapType[A, R] = {
val b = bf(self)
for ((key, value) <- self) {
if (key != k) {
b += (key -> value)
} else {
b += (key -> f(value))
}
}
b.result()
}
}
import scala.collection.immutable.{TreeMap, HashMap}
val map1 = HashMap(1 -> "s", 2 -> "d").updatedWith(2, _.toUpperCase()) // map1 type is HashMap[Int, String]
val map2 = TreeMap(1 -> "s", 2 -> "d").updatedWith(2, _.toUpperCase()) // map2 type is TreeMap[Int, String]
val map3 = HashMap(1 -> "s", 2 -> "d").updatedWith(2, _.asInstanceOf[Any]) // map3 type is HashMap[Int, Any]
Please also note that the CanBuildFrom pattern is much more powerful and this example doesn't use all of its power. Thanks to CanBuildFrom, some operations can change the type of the collection completely; for example, the type of BitSet(1, 3, 5, 7) map { _.toString } is actually SortedSet[String].
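To illustrate that last point in these pre-2.13 collections (my own check):
import scala.collection.immutable.BitSet

// Mapping to String changes the element type, so the CanBuildFrom machinery
// picks the closest collection that can hold Strings: an immutable
// SortedSet[String] (a TreeSet at runtime) rather than a BitSet.
val s = BitSet(1, 3, 5, 7) map { _.toString }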
So in scala we have the typical Lens signature as:
case class Lens[O,V](get: O => V, set: (O,V) => O)
But as you can see, it only updates and sets values of the same type, it does not set one type for another. What I have in mind is something more like this:
case class Lens[O[_],A,B](get: O[A] => A, set: (O[A],B) => O[B])
With A and B making sense for O[_]. My question is: does this stop being isomorphic? Is there a simpler way without breaking some rules?
I think that to figure out the right Lens abstraction, it would be helpful to have a concrete lens-able type in mind.
However, for your particular example, there is something we can say:
case class Lens[O[_],V[_],A,B](get: O[A] => V[A], set: (O[A],V[B]) => O[B])
I do not think that this kind of lens can be composed. In order to compose lenses, the result of a get has to be able to feed into set. But here, the result of a get is a V[_], whereas set needs an O[_].
As a further explanation, here is another possible kind of polymorphic lens, but it is probably not one that fits your needs:
trait Lens[T[_]] {
def get[A](t: T[A]): A
def set[A,B](t: T[A], x: B): T[B]
}
It can be composed like so:
def composeLenses[T[_],U[_]](lens1: Lens[T], lens2: Lens[U]) =
new Lens[({type x[A] = T[U[A]]})#x] {
def get[A](t: T[U[A]]): A = lens2.get(lens1.get(t))
def set[A,B](t: T[U[A]], x: B): T[U[B]] = lens1.set(t, lens2.set(lens1.get(t), x))
}
I wouldn't have been able to figure out the definition of Lens abstractly -- in order to do it I had to use this concrete case:
case class Box[A](x: A)
def boxBoxGet[A](b: Box[Box[A]]): A = ???
def boxBoxSet[A,B](b: Box[Box[A]], x: B): Box[Box[B]] = ???
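For what it's worth, one possible filling-in of those two signatures (mine):
// get unwraps both boxes; set rebuilds them around the new value.
def boxBoxGet[A](b: Box[Box[A]]): A = b.x.x
def boxBoxSet[A, B](b: Box[Box[A]], x: B): Box[Box[B]] = Box(Box(x))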
In haskell lens and in monocle, polymorphic lenses have 4 type parameters. They are equivalent to the following implementation:
case class PLens[S, T, A, B](get: S => A, set: B => S => T)
Then monomorphic lenses are simply a type alias:
type Lens[S, A] = PLens[S, S, A, A]
You can read the 4 type parameters as: if I change an A to B inside of S then I get a T.
e.g.
S = (Int, String)
T = (Long, String)
A = Int
B = Long
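As a small sketch of that assignment (my example, reusing the PLens above), a lens on the first element of a pair lets you swap an Int for a Long:
def first[A, B, C]: PLens[(A, C), (B, C), A, B] =
  PLens[(A, C), (B, C), A, B](s => s._1, b => s => (b, s._2))

val p: (Int, String) = (42, "answer")
first[Int, Long, String].get(p)        // 42
first[Int, Long, String].set(42L)(p)   // (42L, "answer"): a (Long, String)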
I have a pair of classes that look something like this. There's a Generator that generates a value based on some class-level values, and a GeneratorFactory that constructs a Generator.
import scala.collection.generic.CanBuildFrom

case class Generator[T, S](a: T, b: T, c: T) {
def generate(implicit bf: CanBuildFrom[S, T, S]): S =
bf() += (a, b, c) result
}
case class GeneratorFactory[T]() {
def build[S <% Seq[T]](seq: S) = Generator[T, S](seq(0), seq(1), seq(2))
}
You'll notice that GeneratorFactory.build accepts an argument of type S and Generator.generate produces a value of type S, but there is nothing of type S stored by the Generator.
We can use the classes like this. The factory works on a sequence of Char, and generate produces a String because build is given a String.
val gb = GeneratorFactory[Char]()
val g = gb.build("this string")
val o = g.generate
This is fine and handles the String type implicitly because we are using the GeneratorFactory.
The Problem
Now the problem arises when I want to construct a Generator without going through the factory. I would like to be able to do this:
val g2 = Generator('a', 'b', 'c')
g2.generate // error
But I get an error because g2 has type Generator[Char,Nothing] and Scala "Cannot construct a collection of type Nothing with elements of type Char based on a collection of type Nothing."
What I want is a way to tell Scala that the "default value" of S is something like Seq[T] instead of Nothing. Borrowing from the syntax for default parameters, we could think of this as being something like:
case class Generator[T, S=Seq[T]]
Insufficient Solutions
Of course it works if we explicitly tell the generator what its generated type should be, but I think a default option would be nicer (my actual scenario is more complex):
val g3 = Generator[Char, String]('a', 'b', 'c')
val o3 = g3.generate // works fine, o3 has type String
I thought about overloading Generator.apply to have a one-generic-type version, but this causes an error since apparently Scala can't distinguish between the two apply definitions:
object Generator {
def apply[T](a: T, b: T, c: T) = new Generator[T, Seq[T]](a, b, c)
}
val g2 = Generator('a', 'b', 'c') // error: ambiguous reference to overloaded definition
Desired Output
What I would like is a way to simply construct a Generator without specifying the type S and have it default to Seq[T] so that I can do:
val g2 = Generator('a', 'b', 'c')
val o2 = g2.generate
// o2 is of type Seq[Char]
I think that this would be the cleanest interface for the user.
Any ideas how I can make this happen?
Is there a reason you don't want to use a base trait and then narrow S as needed in its subclasses? The following for example fits your requirements:
import scala.collection.generic.CanBuildFrom
trait Generator[T] {
type S
def a: T; def b: T; def c: T
def generate(implicit bf: CanBuildFrom[S, T, S]): S = bf() += (a, b, c) result
}
object Generator {
def apply[T](x: T, y: T, z: T) = new Generator[T] {
type S = Seq[T]
val (a, b, c) = (x, y, z)
}
}
case class GeneratorFactory[T]() {
def build[U <% Seq[T]](seq: U) = new Generator[T] {
type S = U
val Seq(a, b, c, _*) = seq: Seq[T]
}
}
I've made S an abstract type to keep it a little more out of the way of the user, but you could just as well make it a type parameter.
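A quick usage check of the above (my sketch):
val g = Generator('a', 'b', 'c')
g.generate                                   // Seq[Char], e.g. List(a, b, c)

val g2 = GeneratorFactory[Char]().build("abc")
g2.generate                                  // String = "abc"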
This does not directly answer your main question, as I think others are handling that. Rather, it is a response to your request for default values for type arguments.
I have put some thought into this, even going so far as starting to write a proposal for instituting a language change to allow it. However, I stopped when I realized where the Nothing actually comes from. It is not some sort of "default value" like I expected. I will attempt to explain where it comes from.
In order to assign a type to a type argument, Scala uses the most specific possible/legal type. So, for example, suppose you have "class A[T](x: T)" and you say "new A[Int]". You directly specified the value of "Int" for T. Now suppose that you say "new A(4)". Scala knows that 4 and T have to have the same type. 4 can have a type anywhere between "Int" and "Any". In that type range, "Int" is the most specific type, so Scala creates an "A[Int]". Now suppose that you say "new A[AnyVal]". Now, you are looking for the most specific type T such that Int <: T <: Any and AnyVal <: T <: AnyVal. Luckily, Int <: AnyVal <: Any, so T can be AnyVal.
Continuing, now suppose that you have "class B[S >: String <: AnyRef]". If you say "new B", you won't get a B[Nothing]. Rather, you will find that you get a B[String]. This is because S is being constrained as String <: S <: AnyRef and String is at the bottom of that range.
So, you see, for "class C[R]", "new C" doesn't give you a C[Nothing] because Nothing is some sort of default value for type arguments. Rather, you get a C[Nothing] because Nothing is the lowest thing that R can be (if you don't specify otherwise, Nothing <: R <: Any).
This is why I gave up on my default type argument idea: I couldn't find a way to make it intuitive. In this system of restricting ranges, how do you implement a low-priority default? Or, does the default out-priority the "choose the lowest type" logic if it is within the valid range? I couldn't think of a solution that wouldn't be confusing for at least some cases. If you can, please let me know, as I'm very interested.
edit: Note that the logic is reversed for contravariant parameters. So if you have "class D[-Q]" and you say "new D", you get a D[Any].
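In code form, the cases described above look like this (the inferred types are the ones stated in the text):
class A[T](x: T)
new A(4)    // A[Int]:     the most specific type for T given the argument

class B[S >: String <: AnyRef]
new B       // B[String]:  the lower bound, not B[Nothing]

class C[R]
new C       // C[Nothing]: the implicit bounds are Nothing <: R <: Any

class D[-Q]
new D       // D[Any]:     the logic is reversed for a contravariant parameter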
One option is to move the summoning of the CanBuildFrom to a place where it (or, rather, its instances) can help to determine S,
case class Generator[T,S](a: T, b: T, c: T)(implicit bf: CanBuildFrom[S, T, S]) {
def generate : S =
bf() += (a, b, c) result
}
Sample REPL session,
scala> val g2 = Generator('a', 'b', 'c')
g2: Generator[Char,String] = Generator(a,b,c)
scala> g2.generate
res0: String = abc
Update
The GeneratorFactory will also have to be modified so that its build method propagates an appropriate CanBuildFrom instance to the Generator constructor,
case class GeneratorFactory[T]() {
def build[S](seq: S)(implicit conv: S => Seq[T], bf: CanBuildFrom[S, T, S]) =
Generator[T, S](seq(0), seq(1), seq(2))
}
Note that with Scala < 2.10.0 you can't mix view bounds and implicit parameter lists in the same method definition, so we have to translate the bound S <% Seq[T] into its equivalent implicit parameter S => Seq[T].
Okay, fair warning: this is a follow-up to my ridiculous question from last week. Although I think this question isn't as ridiculous. Anyway, here goes:
Previous ridiculous question:
Assume I have some base trait T with subclasses A, B and C. I can declare a collection, Seq[T] for example, that can contain values of type A, B and C. Making the subtyping more explicit, let's use the Seq[_ <: T] type bound syntax.
Now instead assume I have a typeclass TC[_] with members A, B and C (where "member" means the compiler can find some TC[A], etc. in implicit scope). Similar to above, I want to declare a collection of type Seq[_ : TC], using context bound syntax.
This isn't legal Scala, and attempting to emulate it may make you feel like a bad person. Remember that context bound syntax (when used correctly!) desugars into an implicit parameter list for the class or method being defined, which doesn't make any sense here.
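For reference, the desugaring in question (a minimal sketch):
trait TC[A]

// A context bound on a method or class...
def useTC[A: TC](a: A): A = a
// ...is just sugar for an extra implicit parameter list:
def useTCDesugared[A](a: A)(implicit ev: TC[A]): A = a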
New premise:
So let's assume that typeclass instances (i.e. implicit values) are out of the question, and instead we need to use implicit conversions in this case. I have some type V (the "v" is supposed to stand for "view," fwiw), and implicit conversions in scope A => V, B => V and C => V. Now I can populate a Seq[V], despite A, B and C being otherwise unrelated.
But what if I want a collection of things that are implicitly convertible both to views V1 and V2? I can't say Seq[V1 with V2] because my implicit conversions don't magically aggregate that way.
Intersection of implicit conversions?
I solved my problem like this:
// a sort of product or intersection, basically identical to Tuple2
final class &[A, B](val a: A, val b: B)
// implicit conversions from the product to its member types
implicit def productToA[A, B](ab: A & B): A = ab.a
implicit def productToB[A, B](ab: A & B): B = ab.b
// implicit conversion from A to (V1 & V2)
implicit def viewsToProduct[A, V1, V2](a: A)(implicit v1: A => V1, v2: A => V2) =
new &(v1(a), v2(a))
Now I can write Seq[V1 & V2] like a boss. For example:
trait Foo { def foo: String }
trait Bar { def bar: String }
implicit def stringFoo(a: String) = new Foo { def foo = a + " sf" }
implicit def stringBar(a: String) = new Bar { def bar = a + " sb" }
implicit def intFoo(a: Int) = new Foo { def foo = a.toString + " if" }
implicit def intBar(a: Int) = new Bar { def bar = a.toString + " ib" }
val s1 = Seq[Foo & Bar]("hoho", 1)
val s2 = s1 flatMap (ab => Seq(ab.foo, ab.bar))
// equal to Seq("hoho sf", "hoho sb", "1 if", "1 ib")
The implicit conversions from String and Int to type Foo & Bar occur when the sequence is populated, and then the implicit conversions from Foo & Bar to Foo and Bar occur when calling ab.foo and ab.bar.
The current ridiculous question(s):
Has anybody implemented this pattern anywhere before, or am I the first idiot to do it?
Is there a much simpler way of doing this that I've blindly missed?
If not, then how would I implement more general plumbing, such that I can write Seq[Foo & Bar & Baz]? This seems like a job for HList...
Extra mega combo bonus: in implementing the more general plumbing, can I constrain the types to be unique? For example, I'd like to prohibit Seq[Foo & Foo].
The appendix of fails:
My latest attempt (gist). Not terrible, but there are two things I dislike there:
The Seq[All[A :: B :: C :: HNil]] syntax (I want the HList stuff to be opaque, and prefer Seq[A & B & C])
The explicit type annotation (abc[A].a) required for conversion. It seems like you can either have type inference or implicit conversions, but not both... I couldn't figure out how to avoid it, anyhow.
I can give a partial answer for point 4. It can be obtained by applying a technique such as the one described at:
http://vpatryshev.blogspot.com/2012/03/miles-sabins-type-negation-in-practice.html
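For the record, here is a rough sketch of that technique applied to point 4 (my own sketch, not verified against the gist above): encode type inequality with ambiguous implicits and require it in the product conversion.
// Evidence that two types differ: when V1 and V2 are the same type, the two
// ambiguous instances collide and the implicit search (and compilation) fails.
trait =!=[A, B]
object =!= {
  implicit def neq[A, B]: A =!= B = new =!=[A, B] {}
  implicit def neqAmbig1[A]: A =!= A = sys.error("unreachable")
  implicit def neqAmbig2[A]: A =!= A = sys.error("unreachable")
}

// Replacing the viewsToProduct above: it now also demands V1 =!= V2,
// so a Seq[Foo & Foo] can no longer be populated through this conversion.
implicit def viewsToProduct[A, V1, V2](a: A)(
    implicit v1: A => V1, v2: A => V2, ev: V1 =!= V2): V1 & V2 =
  new &(v1(a), v2(a))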