Different outputs of logically identical programs in Scala

val a = List(1, 2, 3, 4, 5)
val b = a.grouped(2).filter(_.length == 2).map(x => (x(0), x(1)))
//b.foreach(x => println(x))
val r = b.foldLeft((0, 0)) {
  case ((m, n), (x, y)) => {
    (m + x, n + y)
  }
}
println(r)
This program gives the correct output (4, 6). But when I uncomment the foreach statement above, it outputs (0, 0). What's wrong here?

With val b = a.grouped(2).filter(_.length == 2).map(x => (x(0), x(1))), b's type is Iterator:
scala> :type b
Iterator[(Int, Int)]
So once you have iterated over b with b.foreach(x => println(x)), the iterator b is exhausted, since an Iterator can only be traversed once. The subsequent foldLeft therefore sees no elements and just returns its initial value (0, 0).
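If you need both the printout and the fold, one option (a minimal sketch of my own, not from the answer above) is to materialize the pairs into a strict collection first, or to split the iterator with Iterator.duplicate before consuming it:
// Materialize into a List so it can be traversed more than once
val pairs = a.grouped(2).filter(_.length == 2).map(x => (x(0), x(1))).toList
pairs.foreach(println)                                                       // (1,2) and (3,4)
println(pairs.foldLeft((0, 0)) { case ((m, n), (x, y)) => (m + x, n + y) })  // (4,6)

// Or, before b has been consumed, split it into two independent iterators
val (forPrinting, forFolding) = b.duplicate
forPrinting.foreach(println)
println(forFolding.foldLeft((0, 0)) { case ((m, n), (x, y)) => (m + x, n + y) })  // (4,6)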

Related

Optimal way to find neighbors of element of collection in circular manner

I have a Vector and I'd like to find neighbors of given element.
Say if we have Vector(1, 2, 3, 4, 5) and then:
for element 2, result must be Some((1, 3))
for element 5, result must be Some((4, 1))
for element 1, result must be Some((5, 2))
for element 6, result must be None
and so on..
I have not found any solution in the standard lib (please point me to one if it exists), so I came up with the following:
implicit class VectorOps[T](seq: Vector[T]) {
  def findNeighbors(elem: T): Option[(T, T)] = {
    val currentIdx = seq.indexOf(elem)
    val firstIdx = 0
    val lastIdx = seq.size - 1
    seq match {
      case _ if currentIdx == -1 || seq.size < 2 => None
      case _ if seq.size == 2 => seq.find(_ != elem).map(elem => (elem, elem))
      case _ if currentIdx == firstIdx => Some((seq(lastIdx), seq(currentIdx + 1)))
      case _ if currentIdx == lastIdx => Some((seq(currentIdx - 1), seq(firstIdx)))
      case _ => Some((seq(currentIdx - 1), seq(currentIdx + 1)))
    }
  }
}
The question is: how can this be simplified/optimized using the stdlib?
def neighbours[T](v: Seq[T], x: T): Option[(T, T)] =
  (v.last +: v :+ v.head)
    .sliding(3, 1)
    .find(_(1) == x)
    .map(x => (x(0), x(2)))
This uses sliding to create a 3-element window over the data and uses find to match the middle value of the 3. Prepending the last element and appending the first deals with the wrap-around case.
This will fail if the Vector is too short, so it needs some error checking.
This version is safe for all inputs:
def neighbours[T](v: Seq[T], x: T): Option[(T, T)] =
  (v.takeRight(1) ++ v ++ v.take(1))
    .sliding(3, 1)
    .find(_(1) == x)
    .map(x => (x(0), x(2)))
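For reference, a quick check of this safe version against the examples from the question:
val v = Vector(1, 2, 3, 4, 5)
neighbours(v, 2)  // Some((1, 3))
neighbours(v, 5)  // Some((4, 1))
neighbours(v, 1)  // Some((5, 2))
neighbours(v, 6)  // None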
Optimal when the number of calls with the same sequence is about seq.toSet.size or more:
val elementToPair = seq.indices.map(i =>
  seq(i) -> (seq((i - 1 + seq.length) % seq.length), seq((i + 1 + seq.length) % seq.length))
).toMap
elementToPair.get(elem)
// other calls
Optimal when the number of calls with the same sequence is less than seq.toSet.size:
Some(seq.indexOf(elem)).filterNot(_ == -1).map { i =>
  (seq((i - 1 + seq.length) % seq.length), seq((i + 1 + seq.length) % seq.length))
}
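Wrapping that second variant into a small self-contained helper (a sketch of my own; circularNeighbors is a hypothetical name, not from the answer), the modulo arithmetic alone handles both the wrap-around and the not-found case:
// Hypothetical helper: previous/next element in circular order, None if absent or sequence too short
def circularNeighbors[T](seq: Vector[T], elem: T): Option[(T, T)] =
  Some(seq.indexOf(elem))
    .filter(i => i >= 0 && seq.length >= 2)
    .map { i =>
      val n = seq.length
      (seq((i - 1 + n) % n), seq((i + 1) % n))
    }

circularNeighbors(Vector(1, 2, 3, 4, 5), 1)  // Some((5, 2))
circularNeighbors(Vector(1, 2, 3, 4, 5), 6)  // None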

Looking for an FP ranking implementation which handles ties (i.e. equal values)

Starting from a sorted sequence of values, my goal is to assign a rank to each value, using identical ranks for equal values (aka ties):
Input: Vector(1, 1, 3, 3, 3, 5, 6)
Output: Vector((0,1), (0,1), (1,3), (1,3), (1,3), (2,5), (3,6))
A few type aliases for readability:
type Rank = Int
type Value = Int
type RankValuePair = (Rank, Value)
An imperative implementation using a mutable rank variable could look like this:
var rank = 0
val ranked1: Vector[RankValuePair] = for ((value, index) <- values.zipWithIndex) yield {
  if ((index > 0) && (values(index - 1) != value)) rank += 1
  (rank, value)
}
// ranked1: Vector((0,1), (0,1), (1,3), (1,3), (1,3), (2,5), (3,6))
To hone my FP skills, I was trying to come up with a functional implementation:
val ranked2: Vector[RankValuePair] = values.sliding(2).foldLeft((0, Vector.empty[RankValuePair])) {
  case ((rank: Rank, rankedValues: Vector[RankValuePair]), Vector(currentValue, nextValue)) =>
    val newRank = if (nextValue > currentValue) rank + 1 else rank
    val newRankedValues = rankedValues :+ (rank, currentValue)
    (newRank, newRankedValues)
}._2
// ranked2: Vector((0,1), (0,1), (1,3), (1,3), (1,3), (2,5))
It is less readable, and – more importantly – is missing the last value (due to using sliding(2) on an odd number of values).
How could this be fixed and improved?
This works well for me:
// scala
val vs = Vector(1, 1, 3, 3, 3, 5, 6)
val rank = vs.distinct.zipWithIndex.toMap
val result = vs.map(i => (rank(i), i))
The same in Java 8 using Javaslang:
// java(slang)
Vector<Integer> vs = Vector(1, 1, 3, 3, 3, 5, 6);
Function<Integer, Integer> rank = vs.distinct().zipWithIndex().toMap(t -> t);
Vector<Tuple2<Integer, Integer>> result = vs.map(i -> Tuple(rank.apply(i), i));
The output of both variants is
Vector((0, 1), (0, 1), (1, 3), (1, 3), (1, 3), (2, 5), (3, 6))
*) Disclosure: I'm the creator of Javaslang
This is nice and concise but it assumes that your Values don't go negative. (Actually it just assumes that they can never start with -1.)
val vs: Vector[Value] = Vector(1, 1, 3, 3, 3, 5, 6)
val rvps: Vector[RankValuePair] =
  vs.scanLeft((-1, -1)) { case ((r, p), v) =>
    if (p == v) (r, v) else (r + 1, v)
  }.tail
Edit: a modification that makes no assumptions, as suggested by @Kolmar:
vs.scanLeft((0, vs.headOption.getOrElse(0))) { case ((r, p), v) =>
  if (p == v) (r, v) else (r + 1, v)
}.tail
Here's an approach with recursion, pattern matching and guards.
The interesting part is where the head and the head of the tail (h and ht respectively) are deconstructed from the list and an if checks whether they are equal. The logic for each case adjusts the rank and proceeds on the remaining part of the list.
def rank(xs: Vector[Value]): List[RankValuePair] = {
  def rankR(xs: List[Value], acc: List[RankValuePair], rank: Rank): List[RankValuePair] = xs match {
    case Nil => acc.reverse
    case h :: Nil => rankR(Nil, (rank, h) :: acc, rank)
    case h :: ht :: t if (h == ht) => rankR(xs.tail, (rank, h) :: acc, rank)
    case h :: ht :: t if (h != ht) => rankR(xs.tail, (rank, h) :: acc, rank + 1)
  }
  rankR(xs.toList, List[RankValuePair](), 0)
}
Output:
scala> rank(xs)
res14: List[RankValuePair] = List((0,1), (0,1), (1,3), (1,3), (1,3), (2,5), (3,6))
This is a modification of the solution by @jwvh, that doesn't make any assumptions about the values:
val vs = Vector(1, 1, 3, 3, 3, 5, 6)
vs.sliding(2).scanLeft((0, vs.head)) {
  case ((rank, _), Seq(a, b)) => (if (a != b) rank + 1 else rank, b)
}.toVector
Note that it would throw if vs is empty, so you'd have to use vs.headOption getOrElse 0, or check whether the input is empty beforehand: if (vs.isEmpty) Vector.empty else ...
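A sketch of that guarded variant (rankSafe is my own name; the length check covering empty and single-element input is my addition):
def rankSafe(vs: Vector[Int]): Vector[(Int, Int)] =
  if (vs.length < 2) vs.map((0, _))   // empty or single value: nothing to compare against
  else vs.sliding(2).scanLeft((0, vs.head)) {
    case ((r, _), Seq(a, b)) => (if (a != b) r + 1 else r, b)
  }.toVector

rankSafe(Vector(1, 1, 3, 3, 3, 5, 6))
// Vector((0,1), (0,1), (1,3), (1,3), (1,3), (2,5), (3,6))
rankSafe(Vector.empty)  // Vector()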
import scala.annotation.tailrec
type Rank = Int
// defined type alias Rank
type Value = Int
// defined type alias Value
type RankValuePair = (Rank, Value)
// defined type alias RankValuePair
def rankThem(values: List[Value]): List[RankValuePair] = {
  // Assumes that the "values" are sorted
  @tailrec
  def _rankThem(currentRank: Rank, currentValue: Value, ranked: List[RankValuePair], values: List[Value]): List[RankValuePair] = values match {
    case value :: tail if value == currentValue => _rankThem(currentRank, value, (currentRank, value) +: ranked, tail)
    case value :: tail if value > currentValue  => _rankThem(currentRank + 1, value, (currentRank + 1, value) +: ranked, tail)
    case Nil => ranked.reverse
  }
  _rankThem(0, Int.MinValue, List.empty[RankValuePair], values.sorted)
}
// rankThem: (values: List[Value])List[RankValuePair]
val valueList = List(1, 1, 3, 3, 5, 6)
// valueList: List[Int] = List(1, 1, 3, 3, 5, 6)
val rankValueList = rankThem(valueList)
// rankValueList: List[RankValuePair] = List((1,1), (1,1), (2,3), (2,3), (3,5), (4,6))
val list = List(1, 1, 3, 3, 5, 6)
val result = list
  .groupBy(identity)
  .mapValues(_.size)
  .toArray
  .sortBy(_._1)
  .zipWithIndex
  .flatMap(tuple => List.fill(tuple._1._2)((tuple._2, tuple._1._1)))
result: Array[(Int, Int)] = Array((0,1), (0,1), (1,3), (1,3), (2,5), (3,6))
The idea is to use groupBy to find identical elements and count their occurrences, then sort, then flatMap. Time complexity I would say is O(n log n): groupBy is O(n), sort is O(n log n), and flatMap is O(n).

How to remove 2 or more duplicates from list and maintain their initial order?

Let's assume we have a Scala list:
val l1 = List(1, 2, 3, 1, 1, 3, 2, 5, 1)
We can easily remove duplicates using the following code:
l1.distinct
or
l1.toSet.toList
But what if we want to remove duplicates only if there are more than 2 of them? So if there are more than 2 elements with the same value, we keep only two and remove the rest of them.
I could achieve it with following code:
l1.groupBy(identity).mapValues(_.take(2)).values.toList.flatten
that gave me the result:
List(2, 2, 5, 1, 1, 3, 3)
Elements are removed but the order of remaining elements is different from how these elements appeared in the initial list. How to do this operation and remain the order from original list?
So the result for l1 should be:
List(1, 2, 3, 1, 3, 2, 5)
Not the most efficient.
scala> val l1 = List(1, 2, 3, 1, 1, 3, 2, 5, 1)
l1: List[Int] = List(1, 2, 3, 1, 1, 3, 2, 5, 1)
scala> l1.zipWithIndex.groupBy( _._1 ).map(_._2.take(2)).flatten.toList.sortBy(_._2).unzip._1
res10: List[Int] = List(1, 2, 3, 1, 3, 2, 5)
My humble answer:
def distinctOrder[A](x: List[A]): List[A] = {
  @scala.annotation.tailrec
  def distinctOrderRec(list: List[A], covered: List[A]): List[A] = {
    (list, covered) match {
      case (Nil, _) => covered.reverse
      case (lst, c) if c.count(_ == lst.head) >= 2 => distinctOrderRec(list.tail, covered)
      case _ => distinctOrderRec(list.tail, list.head :: covered)
    }
  }
  distinctOrderRec(x, Nil)
}
With the results:
scala> val l1 = List(1, 2, 3, 1, 1, 3, 2, 5, 1)
l1: List[Int] = List(1, 2, 3, 1, 1, 3, 2, 5, 1)
scala> distinctOrder(l1)
res1: List[Int] = List(1, 2, 3, 1, 3, 2, 5)
On Edit: Right before I went to bed I came up with this!
l1.foldLeft(List[Int]())((total, next) => if (total.count(_ == next) >= 2) total else total :+ next)
With an answer of:
res9: List[Int] = List(1, 2, 3, 1, 3, 2, 5)
Not the prettiest. I look forward to seeing the other solutions.
def noMoreThan(xs: List[Int], max: Int) = {
  def op(m: Map[Int, Int], a: Int) = {
    m updated (a, m(a) + 1)
  }
  xs.scanLeft(Map[Int, Int]().withDefaultValue(0))(op).tail
    .zip(xs)
    .filter { case (m, a) => m(a) <= max }
    .map(_._2)
}
scala> noMoreThan(l1, 2)
res0: List[Int] = List(1, 2, 3, 1, 3, 2, 5)
More straightforward version using foldLeft:
l1.foldLeft(List[Int]()) { (acc, el) =>
  if (acc.count(_ == el) >= 2) acc else el :: acc
}.reverse
Similar to how distinct is implemented, with a multiset instead of a set:
def noMoreThan[T](list: List[T], max: Int) = {
  val b = List.newBuilder[T]
  val seen = collection.mutable.Map[T, Int]().withDefaultValue(0)
  for (x <- list) {
    if (seen(x) < max) {
      b += x
      seen(x) += 1
    }
  }
  b.result()
}
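For example, with the l1 from the question:
noMoreThan(List(1, 2, 3, 1, 1, 3, 2, 5, 1), 2)  // List(1, 2, 3, 1, 3, 2, 5)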
Based on experquisite's answer, but using foldLeft:
def noMoreThanBis(xs: List[Int], max: Int) = {
  val initialState: (Map[Int, Int], List[Int]) = (Map().withDefaultValue(0), Nil)
  val (_, result) = xs.foldLeft(initialState) { case ((count, res), x) =>
    if (count(x) >= max)
      (count, res)
    else
      (count.updated(x, count(x) + 1), x :: res)
  }
  result.reverse
}
distinct is defined for SeqLike as
/** Builds a new $coll from this $coll without any duplicate elements.
 *  $willNotTerminateInf
 *
 *  @return  A new $coll which contains the first occurrence of every element of this $coll.
 */
def distinct: Repr = {
  val b = newBuilder
  val seen = mutable.HashSet[A]()
  for (x <- this) {
    if (!seen(x)) {
      b += x
      seen += x
    }
  }
  b.result()
}
We can define our function in very similar fashion:
def distinct2[A](ls: List[A]): List[A] = {
  val b = List.newBuilder[A]
  val seen1 = mutable.HashSet[A]()
  val seen2 = mutable.HashSet[A]()
  for (x <- ls) {
    if (!seen2(x)) {
      b += x
      if (!seen1(x)) {
        seen1 += x
      } else {
        seen2 += x
      }
    }
  }
  b.result()
}
scala> distinct2(l1)
res4: List[Int] = List(1, 2, 3, 1, 3, 2, 5)
This version uses internal state, but is still pure. It is also quite easy to generalise for arbitrary n (currently 2), but the specific version is more performant.
You can implement the same function with folds, carrying the "what has been seen once and twice" state with you. Yet the for loop and mutable state do the same job.
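A fold-based sketch of that idea (distinct2Fold is my own name, not from the answer), carrying the two seen-sets in the accumulator instead of mutating them:
def distinct2Fold[A](ls: List[A]): List[A] = {
  val (kept, _, _) = ls.foldLeft((List.empty[A], Set.empty[A], Set.empty[A])) {
    case ((acc, seenOnce, seenTwice), x) =>
      if (seenTwice(x)) (acc, seenOnce, seenTwice)               // third or later occurrence: drop
      else if (seenOnce(x)) (x :: acc, seenOnce, seenTwice + x)  // second occurrence: keep, mark as seen twice
      else (x :: acc, seenOnce + x, seenTwice)                   // first occurrence: keep
  }
  kept.reverse
}

distinct2Fold(List(1, 2, 3, 1, 1, 3, 2, 5, 1))  // List(1, 2, 3, 1, 3, 2, 5)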
How about this:
list
  .zipWithIndex
  .groupBy(_._1)
  .toSeq
  .flatMap { _._2.take(2) }
  .sortBy(_._2)
  .map(_._1)
It's a bit ugly, but it's relatively fast:
val l1 = List(1, 2, 3, 1, 1, 3, 2, 5, 1)
l1.foldLeft((Map[Int, Int](), List[Int]())) { case ((m, ls), x) =>
  val z = m + ((x, m.getOrElse(x, 0) + 1))
  (z, if (z(x) <= 2) x :: ls else ls)
}._2.reverse
Gives: List(1, 2, 3, 1, 3, 2, 5)
Here is a recursive solution (it will stack overflow for large lists):
def filterAfter[T](l: List[T], max: Int): List[T] = {
  require(max > 1)
  //keep the state of seen values
  val seen = Map[T, Int]().withDefaultValue(0) //init to 0
  def filterAfter(l: List[T], seen: Map[T, Int]): (List[T], Map[T, Int]) = {
    l match {
      case x :: xs =>
        if (seen(x) < max) {
          //Update the state and pass to next
          val pair = filterAfter(xs, seen updated (x, seen(x) + 1))
          (x :: pair._1, pair._2)
        } else {
          //already seen more than max
          filterAfter(xs, seen)
        }
      case _ => (l, seen) //empty, terminate recursion
    }
  }
  //call inner recursive function
  filterAfter(l, seen)._1
}
Here is canonical Scala code to reduce three or more equal elements in a row down to two in a row:
def checkForTwo(candidate: List[Int]): List[Int] = {
  candidate match {
    case x :: y :: z :: tail if x == y && y == z =>
      checkForTwo(y :: z :: tail)
    case x :: tail =>
      x :: checkForTwo(tail)
    case Nil =>
      Nil
  }
}
It looks at the first three elements of the list, and if they are the same, drops the first one and repeats the process. Otherwise, it passes items on through.
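To illustrate with my own examples: it only collapses consecutive runs, so the l1 from the question (which never has three equal values in a row) passes through unchanged:
checkForTwo(List(1, 1, 1, 1, 2, 3, 3, 3))     // List(1, 1, 2, 3, 3)
checkForTwo(List(1, 2, 3, 1, 1, 3, 2, 5, 1))  // List(1, 2, 3, 1, 1, 3, 2, 5, 1) - unchanged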
Solution with groupBy and filter, without any sorting (so it's O(N); sorting would add another O(N log N) in the typical case):
val li = l1.zipWithIndex
val pred = li.groupBy(_._1).flatMap(_._2.lift(1)) //1 is your "2", but - 1
for ((x, i) <- li if !pred.get(x).exists(_ < i)) yield x
I prefer approach with immutable Map:
def noMoreThan[T](list: List[T], max: Int): List[T] = {
  def go(tail: List[T], freq: Map[T, Int]): List[T] = {
    tail match {
      case h :: t =>
        if (freq(h) < max)
          h :: go(t, freq + (h -> (freq(h) + 1)))
        else go(t, freq)
      case _ => Nil
    }
  }
  go(list, Map[T, Int]().withDefaultValue(0))
}

How to generate the power set of a set in Scala

I have a Set of items of some type and want to generate its power set.
I searched the web and couldn't find any Scala code that addresses this specific task.
This is what I came up with. It allows you to restrict the cardinality of the sets produced by the length parameter.
def power[T](set: Set[T], length: Int) = {
  var res = Set[Set[T]]()
  res ++= set.map(Set(_))
  for (i <- 1 until length)
    res = res.map(x => set.map(x + _)).flatten
  res
}
This will not include the empty set. To accomplish this you would have to change the last line of the method simply to res + Set()
Any suggestions how this can be accomplished in a more functional style?
Looks like no-one knew about it back in July, but there's a built-in method: subsets.
scala> Set(1,2,3).subsets foreach println
Set()
Set(1)
Set(2)
Set(3)
Set(1, 2)
Set(1, 3)
Set(2, 3)
Set(1, 2, 3)
Notice that if you have a set S and another set T where T = S ∪ {x} (i.e. T is S with one element added) then the powerset of T - P(T) - can be expressed in terms of P(S) and x as follows:
P(T) = P(S) ∪ { p ∪ {x} | p ∈ P(S) }
That is, you can define the powerset recursively (notice how this gives you the size of the powerset for free - i.e. adding one element doubles the size of the powerset). So, you can do this tail-recursively in Scala as follows:
scala> def power[A](t: Set[A]): Set[Set[A]] = {
| @annotation.tailrec
| def pwr(t: Set[A], ps: Set[Set[A]]): Set[Set[A]] =
| if (t.isEmpty) ps
| else pwr(t.tail, ps ++ (ps map (_ + t.head)))
|
| pwr(t, Set(Set.empty[A])) //Powerset of ∅ is {∅}
| }
power: [A](t: Set[A])Set[Set[A]]
Then:
scala> power(Set(1, 2, 3))
res2: Set[Set[Int]] = Set(Set(1, 2, 3), Set(2, 3), Set(), Set(3), Set(2), Set(1), Set(1, 3), Set(1, 2))
It actually looks much nicer doing the same with a List (i.e. a recursive ADT):
scala> def power[A](s: List[A]): List[List[A]] = {
| @annotation.tailrec
| def pwr(s: List[A], acc: List[List[A]]): List[List[A]] = s match {
| case Nil => acc
| case a :: as => pwr(as, acc ::: (acc map (a :: _)))
| }
| pwr(s, Nil :: Nil)
| }
power: [A](s: List[A])List[List[A]]
Here's one of the more interesting ways to write it:
import scalaz._, Scalaz._
def powerSet[A](xs: List[A]) = xs filterM (_ => true :: false :: Nil)
Which works as expected:
scala> powerSet(List(1, 2, 3)) foreach println
List(1, 2, 3)
List(1, 2)
List(1, 3)
List(1)
List(2, 3)
List(2)
List(3)
List()
See for example this discussion thread for an explanation of how it works.
(And as debilski notes in the comments, ListW also pimps powerset onto List, but that's no fun.)
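If you'd rather not pull in Scalaz, the same "each element is either kept or dropped" idea can be sketched with a plain foldRight (powerSetFold is my own name, not from the answer); it happens to produce the subsets in the same order as the filterM version:
def powerSetFold[A](xs: List[A]): List[List[A]] =
  xs.foldRight(List(List.empty[A])) { (x, subsets) =>
    subsets.map(x :: _) ++ subsets  // every existing subset either gains x or stays as it is
  }

powerSetFold(List(1, 2, 3))
// List(List(1, 2, 3), List(1, 2), List(1, 3), List(1), List(2, 3), List(2), List(3), List())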
Use the built-in combinations function:
val xs = Seq(1,2,3)
(0 to xs.size) flatMap xs.combinations
// Vector(List(), List(1), List(2), List(3), List(1, 2), List(1, 3), List(2, 3),
// List(1, 2, 3))
Note, I cheated and used a Seq, because for reasons unknown, combinations is defined on SeqLike. So with a set, you need to convert to/from a Seq:
val xs = Set(1,2,3)
(0 to xs.size).flatMap(xs.toSeq.combinations).map(_.toSet).toSet
//Set(Set(1, 2, 3), Set(2, 3), Set(), Set(3), Set(2), Set(1), Set(1, 3),
//Set(1, 2))
Can be as simple as:
def powerSet[A](xs: Seq[A]): Seq[Seq[A]] =
  xs.foldLeft(Seq(Seq[A]())) { (sets, set) => sets ++ sets.map(_ :+ set) }
Recursive implementation:
def powerSet[A](xs: Seq[A]): Seq[Seq[A]] = {
  def go(xsRemaining: Seq[A], sets: Seq[Seq[A]]): Seq[Seq[A]] = xsRemaining match {
    case Nil => sets
    case y :: ys => go(ys, sets ++ sets.map(_ :+ y))
  }
  go(xs, Seq[Seq[A]](Seq[A]()))
}
All the other answers seemed a bit complicated, here is a simple function:
def powerSet(l: List[_]): List[List[Any]] =
  l match {
    case Nil => List(List())
    case x :: xs =>
      val a = powerSet(xs)
      a.map(n => n ::: List(x)) ::: a
  }
so
powerSet(List('a','b','c'))
will produce the following result
res0: List[List[Any]] = List(List(c, b, a), List(b, a), List(c, a), List(a), List(c, b), List(b), List(c), List())
Here's another (lazy) version... since we're collecting ways of computing the power set, I thought I'd add it:
def powerset[A](s: Seq[A]) =
  Iterator.range(0, 1 << s.length).map(i =>
    Iterator.range(0, s.length).withFilter(j =>
      (i >> j) % 2 == 1
    ).map(s)
  )
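Both the outer result and each inner subset are iterators here, so they need to be materialized to inspect them, e.g.:
powerset(Seq(1, 2, 3)).map(_.toList).toList
// List(List(), List(1), List(2), List(1, 2), List(3), List(1, 3), List(2, 3), List(1, 2, 3))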
Here's a simple, recursive solution using a helper function:
def concatElemToList[A](a: A, list: List[A]): List[Any] = (a, list) match {
  case (x, Nil) => List(List(x))
  case (x, (h: List[_]) :: t) => (x :: h) :: concatElemToList(x, t)
  case (x, h :: t) => List(x, h) :: concatElemToList(x, t)
}
def powerSetRec[A](a: List[A]): List[Any] = a match {
  case Nil => List()
  case h :: t => powerSetRec(t) ++ concatElemToList(h, powerSetRec(t))
}
so the call of
powerSetRec(List("a", "b", "c"))
will give the result
List(List(c), List(b, c), List(b), List(a, c), List(a, b, c), List(a, b), List(a))

Expand a Set[Set[String]] into Cartesian Product in Scala

I have the following set of sets. I don't know ahead of time how long it will be.
val sets = Set(Set("a","b","c"), Set("1","2"), Set("S","T"))
I would like to expand it into a cartesian product:
Set("a&1&S", "a&1&T", "a&2&S", ..., "c&2&T")
How would you do that?
I think I figured out how to do that.
def combine(acc: Set[String], set: Set[String]) = for (a <- acc; s <- set) yield {
  a + "&" + s
}
val expanded = sets.reduceLeft(combine)
expanded: scala.collection.immutable.Set[java.lang.String] = Set(b&2&T, a&1&S,
a&1&T, b&1&S, b&1&T, c&1&T, a&2&T, c&1&S, c&2&T, a&2&S, c&2&S, b&2&S)
Nice question. Here's one way:
scala> val seqs = Seq(Seq("a","b","c"), Seq("1","2"), Seq("S","T"))
seqs: Seq[Seq[java.lang.String]] = List(List(a, b, c), List(1, 2), List(S, T))
scala> val seqs2 = seqs.map(_.map(Seq(_)))
seqs2: Seq[Seq[Seq[java.lang.String]]] = List(List(List(a), List(b), List(c)), List(List(1), List(2)), List(List(S), List(T)))
scala> val combined = seqs2.reduceLeft((xs, ys) => for {x <- xs; y <- ys} yield x ++ y)
combined: Seq[Seq[java.lang.String]] = List(List(a, 1, S), List(a, 1, T), List(a, 2, S), List(a, 2, T), List(b, 1, S), List(b, 1, T), List(b, 2, S), List(b, 2, T), List(c, 1, S), List(c, 1, T), List(c, 2, S), List(c, 2, T))
scala> combined.map(_.mkString("&"))
res11: Seq[String] = List(a&1&S, a&1&T, a&2&S, a&2&T, b&1&S, b&1&T, b&2&S, b&2&T, c&1&S, c&1&T, c&2&S, c&2&T)
Coming after the battle ;) but here's another one:
sets.reduceLeft((s0, s1) => s0.flatMap(a => s1.map(a + "&" + _)))
Expanding on dsg's answer, you can write it more clearly (I think) this way, if you don't mind the curried function:
def combine[A](f: A => A => A)(xs: Iterable[Iterable[A]]) =
  xs reduceLeft { (x, y) => x.view flatMap { y map f(_) } }
Another alternative (slightly longer, but much more readable):
def combine[A](f: (A, A) => A)(xs: Iterable[Iterable[A]]) =
  xs reduceLeft { (x, y) => for (a <- x.view; b <- y) yield f(a, b) }
Usage:
combine[String](a => b => a + "&" + b)(sets) // curried version
combine[String](_ + "&" + _)(sets) // uncurried version
Expanding on @Patrick's answer.
Now it's more general and lazier:
def combine[A](f: (A, A) => A)(xs: Iterable[Iterable[A]]) =
  xs.reduceLeft { (x, y) => x.view.flatMap { a => y.map(f(a, _)) } }
Having it be lazy allows you to save space, since you don't store the exponentially many items in the expanded set; instead, you generate them on the fly. But, if you actually want the full set, you can still get it like so:
val expanded = combine{(x:String, y:String) => x + "&" + y}(sets).toSet