Scala - two operand or/and functions

In Java there are static and/or methods that take two boolean operands and return the combined result: logicalAnd and logicalOr (both located in java.lang.Boolean) [1]:
public static boolean logicalAnd(boolean a, boolean b)
public static boolean logicalOr(boolean a, boolean b)
I'm looking for a similar concept in the Scala standard library. Is there any?
Edit
I know Scala can call Java functions, but this code won't compile since Java's functions are not higher-order, so it's not usable for me:
val op =
if (input == "or") java.lang.Boolean.logicalOr
else java.lang.Boolean.logicalAnd
[1] https://docs.oracle.com/javase/8/docs/api/java/lang/Boolean.html

You can just use a function literal:
val op: (Boolean, Boolean) => Boolean =
  if (input == "or")
    _ || _
  else
    _ && _
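Put together, a minimal runnable sketch (input here is just a hard-coded stand-in for whatever really selects the operator):

object BoolOpDemo extends App {
  val input = "or"  // stand-in value, purely for illustration

  val op: (Boolean, Boolean) => Boolean =
    if (input == "or") _ || _
    else _ && _

  println(op(true, false))  // true, because input == "or"
}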

The fact that
val op =
if (input == "or") java.lang.Boolean.logicalOr
else java.lang.Boolean.logicalAnd
doesn't compile has nothing to do with those methods being defined in Java or in Scala.
The code you posted actually compiles in Scala 3. In Scala 2 you have to be explicit when you want to turn a method into a function value:
val op =
if (input == "or") java.lang.Boolean.logicalOr _
else java.lang.Boolean.logicalAnd _
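Another way to be explicit in Scala 2 is to give op a function type; the expected type is enough for the compiler to eta-expand the method references, so the trailing underscore is not needed (a sketch reusing the same input value):

// Scala 2: the expected function type on the left triggers eta-expansion.
val op: (Boolean, Boolean) => Boolean =
  if (input == "or") java.lang.Boolean.logicalOr
  else java.lang.Boolean.logicalAnd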

In Scala you can avoid these methods when writing lambdas by using the _ placeholder syntax together with the & and | operators (on Boolean these are the non-short-circuiting logical operators):
Boolean::logicalAnd => _ & _
Boolean::logicalOr => _ | _
For example, if you have in Java something like this:
Stream.of(/* .. some objects .. */)
.map(/* some function that returns a boolean */)
.reduce(Boolean::logicalOr);
In Scala it is possible to write this using the _ syntax:
LazyList.fill(/* ... */)
.map(/* ... */)
.reduce(_ & _) // or | or ^
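For reference, a self-contained sketch of that style with plain Scala collections (the data and the predicate are made up):

val words = List("foo", "bar", "baz")

// Map each element to a Boolean, then combine with a two-operand operator.
val anyB = words.map(_.startsWith("b")).reduce(_ | _)   // true
val allB = words.map(_.startsWith("b")).reduce(_ & _)   // false

// The short-circuiting forms work just as well here:
val allB2 = words.map(_.startsWith("b")).reduce(_ && _) // false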

Related

Strange Scala Syntax wherein Future is mapped such that "==" and "!=" appear with only one operand (not two)

I came across a puzzling, but interesting, code construct that I whittled down to a small example, and that
I still have trouble wrapping my head around.
The example is shown below. Note that I have a simple Future that immediately returns a String. I map this
to a comparison of the Future itself using != and ==
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
object Dummy extends App {
val taskReturningString: Future[String] = Future{ "foo"}
val falseResult: Future[Boolean] = taskReturningString.map(taskReturningString ==)
System.out.println("false result:" + Await.result(falseResult, 0 nanos) )
val trueResult: Future[Boolean] = taskReturningString.map(taskReturningString !=)
System.out.println("true result:" + Await.result(trueResult, 0 nanos) )
}
The output is
false result:false
true result:true
But I'm not sure why I got those results. In the case of ==, and != the first item being compared is
'taskReturningString' -- the Future. But what is it being compared to ? I am assuming that what is happening
is a comparison, but I've never seen a case where the operators == and != appear with one operand instead of two.
That behavior is due to eta-expansion. Those map calls need a function String => Boolean (because of type inference), and taskReturningString == is a method call with its argument missing, which the compiler can expand into that kind of function.
Here is a simplified example.
val equals: String => Boolean = "foo" ==
println(equals("foo"))
// true
println(equals("bar"))
// false
or with +
val plusTwo: Int => Int = 2 +
println(plusTwo(2))
// 4
drop from String
val drop: Int => String = "abcd".drop
println(drop(2))
// cd
or ::: from List
val concat: List[Int] => List[Int] = List(1,2,3) :::
println(concat(List(4,5,6)))
// List(4,5,6,1,2,3)
The use of _ is not always necessary: the compiler can expand a method with missing parameters on its own when the expected type makes the types work out.
This code doesn't compile because there is no way for the compiler to know the type of equals:
val equals = "foo" ==
so I need to help it with _
val equals = "foo" == _
The answer lies in the following facts.
Fact 1: Even operators are methods in Scala,
// something like
val a = "abc" == "def"
// is actually
val a = "abc".==("def")
So, taskReturningString == is actually method taskReturningString.==
Fact 2: methods can be converted to functions (by using _),
val string1 = "abc"
val func1 = string1.== _
// func1: Any => Boolean = sof.A$A0$A$A0$$Lambda$1155/266767500@12f02be4
// or if you want more specific type,
val func2: String => Boolean = string1.== _
// func2: String => Boolean = sof.A$A0$A$A0$$Lambda$1156/107885703@19459210
Fact 3: The Scala compiler is smart. It supports eta-expansion, which is the conversion of a method into a function of an appropriate type (to match the requirement), when possible. So, if we tell the compiler that we want a function of type String => Boolean and give it a method, it will smartly convert it to such a function.
// so our func3 did not need that explicit conversion using `_`
val func3: String => Boolean = string1.==
// func3: String => Boolean = sof.A$A1$A$A1$$Lambda$1161/1899231632@4843e7f0
Now, since your taskReturningString is a Future[String], taskReturningString.map wants a function of type String => A for some type A.
Also, taskReturningString.== takes an argument of type Any and returns a Boolean, so the compiler will expand it to a function String => Boolean:
val future1 = Future("abc")
val func4: String => Boolean = future1.==
// func4: String => Boolean = sof.A$A4$A$A4$$Lambda$1237/1577797787@2e682ccb
// And since `func4` will compare the argument string with `future1` for equality
// it will always return `false`
// So what you are doing is actually,
val falseResult: Future[Boolean] = future1.map(func4)
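For contrast, a small sketch (names are illustrative) showing the difference between the eta-expanded comparison with the Future itself and a comparison with the value it produces:

import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

val future1 = Future("abc")

// Eta-expansion: compares the Future object with each String => always false.
val viaEta: Future[Boolean] = future1.map(future1 ==)

// Explicit lambda: compares the produced String with "abc" => true.
val viaValue: Future[Boolean] = future1.map(_ == "abc")

println(Await.result(viaEta, 1.second))   // false
println(Await.result(viaValue, 1.second)) // true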

Scala: decompose the filter parameter on a Spark DataSet?

I have the following code and I need to type x._1. and x._2. a lot of times.
case class T (Field1: String, Field2: Int, ....)
val j: DataSet[(T, T)] = ...
j.filter(x => x._1.Field1 == x._2.Field1
&& x._1.Field2 == x._2.Field2
&& ....)
Is there a way to decompose x into (l, r) so the expression can be a little bit shorter?
The following doesn't work on Spark's DataSet. Why? How can Spark's DataSet not support this Scala language construct?
filter{ case (l,r) => ...
In F#, you can write something like
j.filter((l, r) -> ....)
even
j.filter(({Field1 = l1; Field2 = l2; ....}, {Field1 = r1; Field2 = r2; ....}) -> ....)
The trick is to use the fact that PartialFunction[A,B] is a subclass of Function1[A,B], so you can use partial-function syntax wherever a Function1 is expected (filter, map, flatMap, etc.):
j.filter {
  case (l, r) if l.Field1 == r.Field1 && l.Field2 == r.Field2 => true
  case _ => false
}
UPDATE
As mentioned in the comments, unfortunately this does not work with Spark's Dataset. This seems to be due to the fact that filter is overloaded in Dataset, which throws the typer off (method overloading is generally discouraged in Scala and doesn't play well with its other features).
One workaround is to define a method with a different name that you can tack onto Dataset with an implicit conversion, and then use that method instead of filter:
import org.apache.spark.sql.Dataset

object PimpedDataset {
  implicit class It[T](val ds: Dataset[T]) extends AnyVal {
    def filtered(f: T => Boolean) = ds.filter(f)
  }
}
...
import PimpedDataset._
j.filtered {
  case (l, r) if l.Field1 == r.Field1 && l.Field2 == r.Field2 => true
  case _ => false
}
This will compile ...
Spark's Dataset class has multiple overloaded filter(...) methods, and the compiler isn't able to infer which one to use. You can explicitly specify the function type, but it's a bit ugly.
j.filter({
  case (l, r) => true
}: ((T, T)) => Boolean)
That syntax (without explicitly specifying the type) is still available for RDDs. Unfortunately, in the interest of supporting Python/R/Etc, the Spark developers decided to forsake users preferring to write idiomatic Scala. :(
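For comparison, with plain Scala collections (where filter is not overloaded) the case syntax needs no type ascription at all; a minimal sketch using a trimmed-down version of the question's case class:

case class T(Field1: String, Field2: Int)

val pairs: List[(T, T)] =
  List((T("a", 1), T("a", 1)), (T("a", 1), T("b", 2)))

val equalPairs = pairs.filter { case (l, r) =>
  l.Field1 == r.Field1 && l.Field2 == r.Field2
}
// List((T(a,1),T(a,1)))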

How to "de-sugar" this Scala statement?

LINQ-style queries in Scala with json4s look as follows:
val jvalue = parse(text) // (1)
val jobject = for(JObject(o) <- jvalue) yield o // (2)
I do not understand exactly how (2) works. How would you de-sugar this for-statement?
for-comprehensions of the form
for(v <- generator) yield expr
are translated into
generator.map(v => expr)
When you have a pattern match on the left, then any input values which do not match the pattern are filtered out. This means a partial function is created containing the match, and each input argument can be tested with isDefinedAt e.g.
val f: PartialFunction[JValue, JObject] = { case o @ JObject(_) => o }
f.isDefinedAt(JObject(List[JField]())) //true
f.isDefinedAt(JNull) //false
This means your example will be translated into something like:
val mfun: PartialFunction[JValue, List[JField]] = { case JObject(o) => o }
val jobject = jvalue.filter(mfun.isDefinedAt(_)).map(mfun)
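The filtering behaviour is easy to observe without json4s; here is a sketch (Scala 2 syntax) using a small sealed hierarchy standing in for JValue/JObject:

sealed trait Value
case class Obj(fields: List[(String, Int)]) extends Value
case object NullValue extends Value

val values: List[Value] = List(Obj(List("a" -> 1)), NullValue, Obj(Nil))

// The pattern on the left of <- filters out elements that do not match,
// so NullValue simply disappears from the result.
val objects = for (Obj(o) <- values) yield o
// List(List((a,1)), List())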

Functional code for looping with early exit

How can I refactor this code in a functional style (idiomatic Scala)?
def findFirst[T](objects: List[T]):T = {
for (obj <- objects) {
if (expensiveFunc(obj) != null) return obj
}
null.asInstanceOf[T]
}
This is almost exactly what the find method does, except that it returns an Option. So if you want this exact behavior, you can add a call to Option.orNull, like this:
objects.find(expensiveFunc(_) != null).orNull
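A quick sketch with a stand-in for expensiveFunc (which the question leaves undefined) showing that find also stops at the first hit:

// Hypothetical stand-in: returns a non-null result only for strings longer
// than two characters, and logs every call so the early exit is visible.
def expensiveFunc(s: String): String = {
  println(s"checking $s")
  if (s.length > 2) s.toUpperCase else null
}

val objects = List("a", "bb", "ccc", "dddd")
objects.find(expensiveFunc(_) != null).orNull
// prints: checking a / checking bb / checking ccc  (never checks "dddd")
// result: "ccc"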
First, don't use null in Scala (except when interacting with Java code); use Option instead. Second, replace loops with recursion. Third, have a look at the rich API of the Scala collections; the method you are looking for already exists, as pointed out by sepp2k.
For learning purposes, your example could be rewritten as:
def findFirst[T](objects: List[T]):Option[T] = objects match {
case first :: rest if expensiveFunc( first ) != null => Some( first )
case _ :: rest => findFirst( rest )
case Nil => None
}
How about a fold?
Here is a somewhat pseudo-expensive function:
scala> def divByFive (n: Int) : Option[Int] = {
| println ("processing " + n)
| if (n % 5 == 0) Some (n) else None }
divByFive: (n: Int)Option[Int]
Folding on an Option:
scala> ((None: Option[Int]) /: (1 to 11)) ((a, b) =>
| if (a != None) a else divByFive (b))
processing 1
processing 2
processing 3
processing 4
processing 5
res69: Option[Int] = Some(5)

Issues with maps and their entries in Scala

I have a recursive function that takes a Map as its single parameter. It then adds new entries to that Map and calls itself with this larger Map. Please ignore the return values for now. The function isn't finished yet. Here's the code:
def breadthFirstHelper( found: Map[AIS_State,(Option[AIS_State], Int)] ): List[AIS_State] = {
val extension =
for(
(s, v) <- found;
next <- this.expand(s) if (! (found contains next) )
) yield (next -> (Some(s), 0))
if ( extension.exists( (s -> (p,c)) => this.isGoal( s ) ) )
List(this.getStart)
else
breadthFirstHelper( found ++ extension )
}
extension contains the new entries that shall be added to the map. Note that the for-expression generates an Iterable, not a Map, but those entries shall later be added to the original map for the recursive call. In the break condition, I need to test whether a certain value has been generated inside extension. I try to do this by using the exists method on extension. But the syntax for extracting values from the map entries (the stuff following the yield) doesn't work.
Questions:
How do I get my break condition (the boolean statement to the if) to work?
Is it a good idea to do recursive work on an immutable Map like this? Is this good functional style?
When using a pattern-match (e.g. against a Tuple2) in a function, you need to use braces {} and the case statement.
if (extension.exists { case (s,_) => isGoal(s) } )
The above also uses the fact that, when matching, it is clearer to use the wildcard _ for any allowable value that you subsequently do not care about. The case xyz gets compiled into a PartialFunction, which in turn extends Function1 and hence can be used as an argument to the exists method.
As for the style, I am not a functional programming expert, but this seems like it will be compiled into an iterative form (i.e. it's tail-recursive) by scalac. There's nothing which says "recursion with Maps is bad", so why not?
Note that -> is a method on Any (via implicit conversion) which creates a Tuple2 - it is not a case class like :: or ! and hence cannot be used in a case pattern match statement. This is because:
val l: List[String] = Nil
l match {
case x :: xs =>
}
Is really shorthand/sugar for
case ::(x, xs) =>
Similarly a ! b is equivalent to !(a, b). Of course, you may have written your own case class ->...
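For illustration, a sketch of what such a user-defined -> case class could look like (purely hypothetical, distinct from the Tuple2-building -> method); because it is a case class, it gets an extractor and can be used infix both in types and in patterns:

// A hypothetical case class spelled ->
case class ->[A, B](left: A, right: B)

val pair: Int -> String = ->(1, "one")

pair match {
  case a -> b => println(s"$a maps to $b")  // prints: 1 maps to one
}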
Note2: as Daniel says below, you cannot in any case use a pattern-match in a function definition; so while the above partial function is valid, the following function is not:
(x :: xs) =>
This is a bit convoluted for me to follow, whatever Oxbow Lakes might think.
I'd like first to clarify one point: there is no break condition in for-comprehensions. They are not loops like C's (or Java's) for.
What an if in a for-comprehension means is a guard. For instance, let's say I do this:
for {i <- 1 to 10
j <- 1 to 10
if i != j
} yield (i, j)
The loop isn't "stopped" when the condition is false. It simply skips the iterations for which that condition is false, and proceed with the true ones. Here is another example:
for {i <- 1 to 10
j <- 1 to 10
if i % 2 != 0
} yield (i, j)
You said you don't have side-effects, so I can skip a whole chapter about side effects and guards on for-comprehensions. On the other hand, reading a blog post I made recently on Strict Ranges is not a bad idea.
So... give up on break conditions. They can be made to work, but they are not functional. Try to rephrase the problem in a more functional way, and the need for a break condition will be replaced by something else.
Next, Oxbow is correct in that (s -> (p,c)) => isn't allowed because there is no extractor defined on an object called ->; but, alas, even (a :: b) => would not be allowed, because there is no pattern matching going on in a function literal's parameter declaration. You must simply state the parameters on the left side of =>, without doing any kind of decomposition. You may, however, do this:
if ( extension.exists { t => val (s, (p, c)) = t; this.isGoal(s) } )
Note that I replaced -> with ,. This works because a -> b is syntactic sugar for (a, b), which is itself syntactic sugar for Tuple2(a, b). As you use neither p nor c, this works too:
if ( extension.exists { t => val (s, _) = t; this.isGoal(s) } )
Finally, your recursive code is perfectly fine, though probably not optimized for tail recursion. For that, you either make your method final, or you move the recursion into a helper function nested inside the method. Like this:
final def breadthFirstHelper
or
def breadthFirstHelper(...) {
def myRecursiveBreadthFirstHelper(...) { ... }
myRecursiveBreadthFirstHelper(...)
}
On Scala 2.8 there is an annotation called @tailrec (in scala.annotation) which will tell you if the function can be made tail recursive or not. And, in fact, it seems there will be a flag to display warnings about functions that could be made tail-recursive if slightly changed, such as above.
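A quick sketch of how the annotation is used; the compiler refuses to compile the method if the marked recursive call is not in tail position:

import scala.annotation.tailrec

object TailRecDemo {
  @tailrec
  def countDown(n: Int): Int =
    if (n <= 0) n else countDown(n - 1)
}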
EDIT
Regarding Oxbow's solution using case, that's a function or partial-function literal. Its type will depend on what type inference requires; in this case, because that is what exists takes, a function. However, one must be careful to ensure that there will always be a match, otherwise you get an exception. For example:
scala> List(1, 'c') exists { case _: Int => true }
res0: Boolean = true
scala> List(1, 'c') exists { case _: String => true }
scala.MatchError: 1
at $anonfun$1.apply(<console>:5)
... (stack trace elided)
scala> List(1, 'c') exists { case _: String => true; case _ => false }
res3: Boolean = false
scala> ({ case _: Int => true } : PartialFunction[AnyRef,Boolean])
res5: PartialFunction[AnyRef,Boolean] = <function1>
scala> ({ case _: Int => true } : Function1[Int, Boolean])
res6: (Int) => Boolean = <function1>
EDIT 2
The solution Oxbow proposes does use pattern matching, because it is based on function literals using case statements, which do use pattern matching. When I said it was not possible, I was speaking of putting a pattern in the parameter position of a plain function literal, i.e. the x => expr syntax.