What are named and default arguments in Scala?

I heard that Scala contains a feature called named and default arguments but I don't know what such parameters do or how to use them.
Can someone explain their usage?

These are two convenient ways of declaring and passing function parameters in Scala.
Named arguments:
Named arguments allow you to pass arguments to a function in a different order. For example:
def speed(distance: Float, time: Float): Float = distance / time
And then it can be used like this:
speed(distance = 100, time = 10)
or
speed(time = 10, distance = 100)
Default arguments:
Scala lets you specify default values for function parameters. For example:
def printTime(out: java.io.PrintStream = Console.out) =
  out.println("time = " + System.currentTimeMillis())
Then you can call printTime without giving any output stream like this:
printTime()
Repeated arguments:
Scala allows you to indicate that the last parameter to a function may be repeated. For example:
def echo(args: String*) =
  for (arg <- args)
    println(arg)
Then you can use it like this:
echo()
echo("one")
echo("hello", "world!")
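A detail worth knowing about repeated parameters: to pass an existing sequence to one, you have to expand it explicitly with `: _*`. A small self-contained sketch (using a variant of echo that returns a string instead of printing, so the result is easy to check):

```scala
// Inside the method, a repeated parameter is seen as a Seq[String].
def echo(args: String*): String = args.mkString(" ")

// Individual arguments work as before:
echo("hello", "world!")            // "hello world!"

// An existing sequence must be expanded with `: _*`:
val words = Seq("one", "two", "three")
echo(words: _*)                    // "one two three"
```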

Default arguments solve the problem other programming languages normally solve with method overloading. When there is a method
def addNumbers(a: Int, b: Int, c: Int, d: Int) = a+b+c+d
that takes multiple parameters, it can be useful to set some default values in overloaded methods to provide an API that is easier to use if one doesn't want to fill in all parameters:
def addNumbers(a: Int, b: Int, c: Int) = addNumbers(a, b, c, 0)
def addNumbers(a: Int, b: Int) = addNumbers(a, b, 0, 0)
With default arguments it is no longer necessary to overload such a method:
def addNumbers(a: Int, b: Int, c: Int = 0, d: Int = 0) = a+b+c+d
The compiler automatically calls the method with the default arguments filled in if they are not specified:
scala> addNumbers(1, 2, 3)
res2: Int = 6
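Named and default arguments also combine nicely: a named argument lets you skip over an earlier defaulted parameter. A small sketch based on the same addNumbers method:

```scala
def addNumbers(a: Int, b: Int, c: Int = 0, d: Int = 0) = a + b + c + d

addNumbers(1, 2)         // c and d both default to 0: 3
addNumbers(1, 2, 3)      // d defaults to 0: 6
addNumbers(1, 2, d = 4)  // skip c by naming d: 7
```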
A useful place for default arguments is in constructors. It is easier to write
class A(i: Int, s: String = "")
than
class A(i: Int, s: String) {
  def this(i: Int) = this(i, "")
}
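A quick sketch of the constructor case (the parameters are made vals here, an addition for illustration, so they can be inspected from outside):

```scala
class A(val i: Int, val s: String = "")

val a1 = new A(1)        // uses the default ""
val a2 = new A(1, "x")   // overrides the default
```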
Named arguments, on the other hand, can improve the readability of a method call:
def compute(xs: List[Int], executeInParallel: Boolean) = ???
compute(xs, executeInParallel = true) is easier to read than just compute(xs, true).
One can always specify the name of a parameter regardless of its order. This means compute(executeInParallel = true, xs = xs) is the same as compute(xs, true). The names give the compiler the hint it needs to match each argument to its parameter when the order changes.
A use case where both named and default arguments come together is the copy method of case classes, which is generated automatically by the compiler:
scala> case class Person(name: String, age: Int)
defined class Person
scala> val p = Person("Ruben", 43)
p: Person = Person(Ruben,43)
scala> val oneYearOlder = p.copy(age = p.age+1)
oneYearOlder: Person = Person(Ruben,44)
It may be important to mention that named arguments only work for methods defined in Scala. Parameters of methods defined in Java can't be referred to by name.
Furthermore named arguments don't work on function literals:
scala> val f = (i: Int) => i
f: Int => Int = <function1>
scala> f(i = 1)
<console>:9: error: not found: value i
f(i = 1)
^
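A common workaround, sketched below: wrap the function in a method (or use a method in the first place), since parameter names are a property of methods, not of function values:

```scala
val f = (i: Int) => i   // function literal: f(i = 1) does not compile
def g(i: Int) = f(i)    // a method wrapping it: named arguments work again

g(i = 1)                // OK: returns 1
```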
For further information on this feature, take a look at docs.scala-lang.org.

Related

Passing a function with default parameter to a higher order function

This may be a very basic question. I am new to Scala. In Scala, I have a function with a default parameter value for its second parameter:
def fun(number : Int, defaultNumber : Int = 0): Int = {number + defaultNumber}
I want to have a higherOrder function to which I can pass a lowerOrder function, such as the function above, whose second argument has a default value. My first attempt is as follows:
def higherOrder(lowerOrder: (Int, Int) => Int, number: Int) =
  lowerOrder(number)
This obviously gives an error:
error: not enough arguments for method apply: (v1: Int, v2: Int)Int in trait Function2.
Unspecified value parameter v2.
A way around this is to get the default value in the higherOrder function:
def higherOrder(lowerOrder: (Int, Int) => Int, number: Int, defaultNumber: Int = 0) =
  lowerOrder(number, defaultNumber)
But I don't want to do that, because I may want to pass different lowerOrder functions that have different default values for their second parameter, and I may not know what those default values are. Is there any solution for this?
Define a trait for your lower-order function:
trait Fun {
  def apply(number: Int, defaultNumber: Int = 0): Int
}
Now you can use the function literal notation to create a Fun value for your lower-order function (note that you'll need to have started the REPL with -Xexperimental for this step to work):
val fun: Fun = { (number : Int, defaultNumber : Int) => number + defaultNumber }
Now, you can use this lambda function in your higher order function.
def higherOrder(lowerOrder: Fun, number: Int) =
  lowerOrder(number)
Call the higher order function now.
higherOrder(fun, 10)
> result: Int = 10
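An alternative that needs no custom trait, sketched under the same setup: make higherOrder accept a one-argument function and let each caller wrap its own lowerOrder function in a lambda, so the lambda picks up whatever default that particular function declares:

```scala
def fun(number: Int, defaultNumber: Int = 0): Int = number + defaultNumber

// higherOrder only sees a one-argument function...
def higherOrder(lowerOrder: Int => Int, number: Int): Int = lowerOrder(number)

// ...and the lambda `n => fun(n)` uses fun's own default for defaultNumber.
higherOrder(n => fun(n), 10)   // 10
```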

Do default parameter values always trump implicits in Scala?

def foo(implicit i: Int = 1) = i
def bar(implicit j: Int) = foo()
bar(4)
This piece of code evaluates to 1. So the default value has priority over the implicit j, which is instantiated to 4. Therefore, it seems that, at least in this example, the default parameter value trumps implicit, making the definition of foo equivalent to def foo(i: Int = 1) = i.
Do default parameter values always trump implicits? If yes, why is this code legal (given that it is confusing)? If not, what is a counter-example?
Is there a way to get the other behavior, i.e. that a piece of code similar to the above (with a default value for i) would evaluate to 4, without having to pass the argument explicitly?
implicit is applied to the entire parameter list, not just a single parameter (e.g., in foo(implicit i: Int, j: Int) both parameters are implicit; if you wanted only one of them to be, you'd have to split them into two lists: def foo(i: Int)(implicit j: Int)).
So, to pass implicit parameters to the function, you have to omit the entire list: foo, not foo().
When you have def foo(implicit i: Int), foo() does not even compile, because you are trying to pass an empty argument list. foo does (as long as an implicit Int is in scope), because the list is passed implicitly.
With def foo(implicit i: Int = 1), both uses compile, but mean different things. foo() means "call foo with default values for all parameters", foo means "call foo, passing the implicit parameter list".
So, with bar(implicit j: Int) = foo it will evaluate to the value of j, while bar(implicit j: Int) = foo() evaluates to 1.
The Scala compiler distinguishes between the implicit and the default value here: writing foo() makes the compiler ignore the implicit, so just foo does the trick:
def foo(implicit i: Int = 1) = i
def bar(implicit j: Int) = foo
println(bar(5))
Result : 5
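Putting the two call shapes side by side as a runnable sketch makes the difference concrete:

```scala
def foo(implicit i: Int = 1) = i

def bar(implicit j: Int) = foo    // empty list omitted: the implicit j is passed
def baz(implicit j: Int) = foo()  // () forces the default: always 1

bar(4)  // 4
baz(4)  // 1
```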

Scala can't overload two methods [duplicate]

While there might be valid cases where such method overloadings could become ambiguous, why does the compiler disallow code which is neither ambiguous at compile time nor at run time?
Example:
// This fails:
def foo(a: String)(b: Int = 42) = a + b
def foo(a: Int) (b: Int = 42) = a + b
// This fails, too. Even if there is no position in the argument list,
// where the types are the same.
def foo(a: Int) (b: Int = 42) = a + b
def foo(a: String)(b: String = "Foo") = a + b
// This is OK:
def foo(a: String)(b: Int) = a + b
def foo(a: Int) (b: Int = 42) = a + b
// Even this is OK.
def foo(a: Int)(b: Int) = a + b
def foo(a: Int)(b: String = "Foo") = a + b
val bar = foo(42)_ // This complains obviously ...
Are there any reasons why these restrictions can't be loosened a bit?
Especially when converting heavily overloaded Java code to Scala, default arguments are very important, and it isn't nice to find out, after replacing plenty of Java methods with one Scala method, that the spec/compiler imposes arbitrary restrictions.
I'd like to cite Lukas Rytz (from here):
The reason is that we wanted a deterministic naming-scheme for the
generated methods which return default arguments. If you write
def f(a: Int = 1)
the compiler generates
def f$default$1 = 1
If you have two overloads with defaults on the same parameter
position, we would need a different naming scheme. But we want to keep
the generated byte-code stable over multiple compiler runs.
A solution for a future Scala version could be to incorporate type names of the non-default arguments (those at the beginning of a method, which disambiguate overloaded versions) into the naming scheme, e.g. in this case:
def foo(a: String)(b: Int = 42) = a + b
def foo(a: Int) (b: Int = 42) = a + b
it would be something like:
def foo$String$default$2 = 42
def foo$Int$default$2 = 42
Someone willing to write a SIP proposal?
It would be very hard to get a readable and precise spec for the interactions of overloading resolution with default arguments. Of course, for many individual cases, like the one presented here, it's easy to say what should happen. But that is not enough. We'd need a spec that decides all possible corner cases. Overloading resolution is already very hard to specify. Adding default arguments in the mix would make it harder still. That's why we have opted to separate the two.
I can't answer your question, but here is a workaround:
implicit def left2Either[A,B](a:A):Either[A,B] = Left(a)
implicit def right2Either[A,B](b:B):Either[A,B] = Right(b)
def foo(a: Either[Int, String], b: Int = 42) = a match {
  case Left(i)  => i + b
  case Right(s) => s + b
}
If you have two very long arg lists which differ in only one arg, it might be worth the trouble...
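A self-contained usage sketch of this workaround (the result type is Any here because the two branches return different types):

```scala
import scala.language.implicitConversions

implicit def left2Either[A, B](a: A): Either[A, B] = Left(a)
implicit def right2Either[A, B](b: B): Either[A, B] = Right(b)

def foo(a: Either[Int, String], b: Int = 42): Any = a match {
  case Left(i)  => i + b   // Int branch
  case Right(s) => s + b   // String branch
}

foo(15)        // converted to Left(15): 15 + 42 = 57
foo("Foo", 1)  // converted to Right("Foo"): "Foo1"
```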
What worked for me is to redefine the overloaded methods Java-style:
def foo(a: Int, b: Int) = a + b
def foo(a: Int, b: String) = a + b
def foo(a: Int) = a + "42"
def foo(a: String) = a + "42"
This makes clear to the compiler which resolution you want based on the parameters present.
Here is a generalization of @Landei's answer.
What you really want:
def pretty(tree: Tree, showFields: Boolean = false): String = // ...
def pretty(tree: List[Tree], showFields: Boolean = false): String = // ...
def pretty(tree: Option[Tree], showFields: Boolean = false): String = // ...
Workaround:
def pretty(input: CanPretty, showFields: Boolean = false): String = {
  input match {
    case TreeCanPretty(tree)       => prettyTree(tree, showFields)
    case ListTreeCanPretty(tree)   => prettyList(tree, showFields)
    case OptionTreeCanPretty(tree) => prettyOption(tree, showFields)
  }
}
sealed trait CanPretty
case class TreeCanPretty(tree: Tree) extends CanPretty
case class ListTreeCanPretty(tree: List[Tree]) extends CanPretty
case class OptionTreeCanPretty(tree: Option[Tree]) extends CanPretty
import scala.language.implicitConversions
implicit def treeCanPretty(tree: Tree): CanPretty = TreeCanPretty(tree)
implicit def listTreeCanPretty(tree: List[Tree]): CanPretty = ListTreeCanPretty(tree)
implicit def optionTreeCanPretty(tree: Option[Tree]): CanPretty = OptionTreeCanPretty(tree)
private def prettyTree(tree: Tree, showFields: Boolean): String = "fun ..."
private def prettyList(tree: List[Tree], showFields: Boolean): String = "fun ..."
private def prettyOption(tree: Option[Tree], showFields: Boolean): String = "fun ..."
One possible scenario is:
def foo(a: Int)(b: Int = 10)(c: String = "10") = a + b + c
def foo(a: Int)(b: String = "10")(c: Int = 10) = a + b + c
The compiler would be confused about which one to call. To prevent this and other possible ambiguities, the compiler allows at most one overloaded alternative to have default arguments.
Just my guess:-)
My understanding is that there can be name collisions in the compiled classes with default argument values. I've seen something along these lines mentioned in several threads.
The named argument spec is here:
http://www.scala-lang.org/sites/default/files/sids/rytz/Mon,%202009-11-09,%2017:29/named-args.pdf
It states:
Overloading If there are multiple overloaded alternatives of a method, at most one is
allowed to specify default arguments.
So, for the time being at any rate, it's not going to work.
You could do something like what you might do in Java, eg:
def foo(a: String)(b: Int) = a + (if (b > 0) b else 42)

What does treating a method as a function mean in Scala?

By assigning a variable (or value?) a method name followed by a space and an underscore, you tell Scala to treat the method as a function, which apparently means doing more than simply taking the value generated by a call to the method and assigning it to the variable. What else is/can go on through such an assignment?
Since Scala runs on the JVM, it's easier to understand in terms of simple Java-like classes without Scala's syntactic sugar.
Remember that Scala functions are essentially members of a class similar to the following (signature deliberately simplified):
class Function[X, Y] {
  def apply(x: X): Y
}
Application of a function f to an argument x is desugared into a method application f.apply(x).
Now suppose that you have another class Foo with method bar:
class Foo {
  def bar(x: Int): String
}
If you now have an instance foo of type Foo, then whenever its method bar is transformed into a function by writing:
val f = foo.bar(_)
a new instance of an anonymous subclass of Function is created:
val f = new Function[Int, String] {
  def apply(x: Int) = foo.bar(x)
}
If you use this syntax inside a class, this is closed over instead of an instance foo.
This is what all those weirdly named classes Main$$anon$1$$anonfun$1 are: they are the anonymous classes that represent functions. The functions can appear quite implicitly (for example, as blocks passed to the for-loops).
That's all there is to it semantically. The rest is just syntactic sugar.
Here is a complete runnable example that demonstrates the conversion of an instance method into a function:
with sugar, from the outside (a)
with sugar, from the inside (b)
without sugar, from the outside (c)
without sugar, from the inside (d)
You can save it into a file and execute with scala <filename.scala>:
/** A simple greeter that prints 'hello name' multiple times */
case class Hey(name: String) { thisHeyInst =>
  def hello(x: Int): String = ("hello " + name + " ") * x
  def withSugarFromInside = hello(_)
  def noSugarFromInside = new Function[Int, String] {
    def apply(y: Int) = thisHeyInst.hello(y)
  }
}
val heyAlice = Hey("Alice")
val heyBob = Hey("Bob")
val heyCharlie = Hey("Charlie")
val heyDonald = Hey("Donald")
val a = heyAlice.hello(_)
val b = heyBob.withSugarFromInside
val c = new Function[Int, String] { def apply(y: Int) = heyCharlie.hello(y) }
val d = heyDonald.noSugarFromInside
println(a(3))
println(b(3))
println(c(3))
println(d(3))
In all four cases, a greeting is printed three times.
What _ actually does is an eta-conversion. It takes a compile-time construct called a method and returns a runtime construct called an anonymous function, which is actually an instance of Scala's Function. The exact class depends on arity, so it might be Function1, Function2, Function3 and so on. The point here is to make the method a first-class citizen, which may act like a value.
OOP needs a little more than some new object. Before emitting the code that creates the instance, the compiler generates a new class (extending FunctionN) at compile time, though in theory a whole new class shouldn't be necessary. On Java 8 these could be native Java lambdas.
Btw, you may extend Function1 by yourself and even eta-abstract it again:
scala> object f extends (Int => Int) { def apply(a: Int) = a }
scala> f(1)
res0: Int = 1
scala> f.apply _
res1: Int => Int = <function1>
scala> res1(5)
res2: Int = 5
As a conclusion, a little copy-paste from @Daniel C. Sobral's answer:
the former can be easily converted into the latter:
val f = m _
Scala will expand that, assuming m's type is (List[Int])AnyRef, into (Scala 2.7):
val f = new AnyRef with Function1[List[Int], AnyRef] {
  def apply(x$1: List[Int]) = this.m(x$1)
}
On Scala 2.8, it actually uses an AbstractFunction1 class to reduce
class sizes.
Or simply saying val f = m _ is same as val f = (x: List[Int]) => m(x)
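That equivalence is easy to check directly; a small sketch with a hypothetical method m of the type mentioned above:

```scala
def m(xs: List[Int]): AnyRef = xs.mkString(",")

val f1 = m _                          // eta-expansion via underscore
val f2 = (xs: List[Int]) => m(xs)     // the explicit lambda it desugars to

f1(List(1, 2, 3))  // "1,2,3"
f2(List(1, 2, 3))  // "1,2,3"
```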
To make this answer more modern and precise let's see what's happening using scalac 2.11.2 and javap:
$ echo "object Z{def f(a: Int) = a}" > Z.scala //no eta-abstraction here
$ scalac Z.scala
$ ls
Z$.class Z.class Z.scala
$ echo "object Z{def f(a: Int) = a; val k = f _}" > Z.scala
$ scalac Z.scala
$ ls
Z$$anonfun$1.class Z.class //new anonfun class added for lambda
Z$.class Z.scala
$ javap -c Z\$\$anonfun\$1.class
Compiled from "Z.scala" // I've simplified output a bit
public final class Z$$anonfun$1 extends scala.runtime.AbstractFunction1$mcII$sp implements scala.Serializable {
public final int apply(int);
Code:
calling apply$mcII$sp(int)
public int apply$mcII$sp(int); //that's it
Code:
0: getstatic #25 // reading Field Z$.MODULE$:LZ$;, which points to `object Z`
3: iload_1
4: invokevirtual #28 // calling Method Z$.f
7: ireturn
public final java.lang.Object apply(java.lang.Object); //just boxed version of `apply`
Code:
unboxToInt
calling apply(int) method
boxToInteger
public Z$$anonfun$1();
Code:
AbstractFunction1$mcII$sp."<init>":()V //initialize
}
So it still extends AbstractFunction1
I'll try to provide some examples of how a function or method is assigned to a value with an underscore.
If you need to reference a zero-argument function:
scala> val uuid = java.util.UUID.randomUUID _
uuid: () => java.util.UUID = <function0>
scala> uuid()
res15: java.util.UUID = 3057ef51-8407-44c8-a09e-e2f4396f566e
scala> uuid()
res16: java.util.UUID = c1e934e4-e722-4279-8a86-004fed8b9090
Check how it's different when one does
scala> val uuid = java.util.UUID.randomUUID
uuid: java.util.UUID = 292708cb-14dc-4ace-a56b-4ed80d7ccfc7
In the first case, we assigned a reference to a function, so calling uuid() generates a new UUID every time.
In the second, the function randomUUID was called immediately and its result assigned to the val uuid.
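The difference is easy to verify: the function value produces a fresh UUID on every call, while the plain assignment captures a single value once. A minimal sketch:

```scala
val makeUuid = java.util.UUID.randomUUID _  // () => UUID: a function value
val fixed    = java.util.UUID.randomUUID()  // a single UUID, computed once

makeUuid() != makeUuid()  // true: a new UUID on every call
fixed == fixed            // true: always the same captured value
```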
There are some other cases why _ might be useful.
It's possible to take a function with two arguments and create a function with a single argument out of it.
scala> def multiply(n: Int)(m: Int) = n*m
multiply: (n: Int)(m: Int)Int
scala> val by2 = multiply(2) _
by2: Int => Int = <function1>
scala> by2(3)
res16: Int = 6
For this to work, it's crucial to define multiply with multiple parameter lists, i.e. in curried form. This technique is called currying.
