In Scala 2, as explained here, we had function types, which were instances of the FunctionX traits, and method types, which were non-value types. We could transform a method into a method value, which was an instance of a function type, like this:
class Sample {
  def method(x: Int) = x + 1
  val methodValue = method _
}
Now in Scala 3, we can leave out the underscore, so it looks more like this:
class Sample:
  def method(x: Int) = x + 1
  val methodValue = method
Doesn't the equals sign in val methodValue = method suggest semantic equivalence between the method and the function value? Also, in Scala 2 I couldn't call any methods, such as apply, on the created method (at least in Scastie with Scala 2.13.5), but in Scala 3 I can, which suggests that methods in Scala 3 are regular objects:
scala> val s = Sample()
val s: Sample = Sample@793c2cde
scala> s.method
val res13: Int => Int = Lambda$1530/856511870@ab595e8
scala> s.methodValue
val res14: Int => Int = Sample$$Lambda$1422/1191732945@1bbbede1
scala> s.method.
!=   ##   ->   ==   andThen   apply   asInstanceOf   clone   compose   ensuring
eq   equals   finalize   formatted   getClass   hashCode   isInstanceOf   ne
nn   notify   notifyAll   synchronized   toString   wait   →
So are functions and methods in Scala 3 the same objects, or at least very similar ones? Or has the difference at least been significantly reduced?
Doesn't the equals sign in val methodValue = method suggest semantic equivalence
between the method and the function value?
The key concept to understand is eta expansion, which converts methods into functions. Scala 3 has automated this process, so:
The syntax m _ is no longer needed and will be deprecated in the
future.
Hence methods and functions are not the same; however, Scala 3 transparently converts between them, so programmers do not usually have to worry about the distinction.
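A minimal sketch of what automatic eta expansion gives you in Scala 3 (the identifiers below are made up for illustration):
def increment(x: Int): Int = x + 1   // a method: not itself a value

val f: Int => Int = increment        // eta expansion: the method becomes a Function1 value
val g = increment                    // in Scala 3 this also works without the trailing underscore

println(f(41))                       // 42
println(g.andThen(f)(40))            // 42: g and f are ordinary function objects with apply, andThen, ...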
I'm looking to call the ATR function from this Scala wrapper for ta-lib, but I can't figure out how to use the wrapper correctly.
package io.github.patceev.talib

import com.tictactec.ta.lib.{Core, MInteger, RetCode}

import scala.concurrent.Future

object Volatility {

  def ATR(
    highs: Vector[Double],
    lows: Vector[Double],
    closes: Vector[Double],
    period: Int = 14
  )(implicit core: Core): Future[Vector[Double]] = {
    val arrSize = highs.length - period + 1

    if (arrSize < 0) {
      Future.successful(Vector.empty[Double])
    } else {
      val begin = new MInteger()
      val length = new MInteger()
      val result = Array.ofDim[Double](arrSize)

      core.atr(
        0, highs.length - 1, highs.toArray, lows.toArray, closes.toArray,
        period, begin, length, result
      ) match {
        case RetCode.Success =>
          Future.successful(result.toVector)
        case error =>
          Future.failed(new Exception(error.toString))
      }
    }
  }
}
Would someone be able to explain how to use the function and print the result to the console?
Many thanks in advance.
Regarding syntax, Scala is one of many languages where you call functions and methods by passing arguments in parentheses (mostly, but let's keep it simple for now):
def myFunction(a: Int): Int = a + 1
myFunction(1) // myFunction is called and returns 2
On top of this, Scala allows you to specify multiple parameter lists, as in the following example:
def myCurriedFunction(a: Int)(b: Int): Int = a + b
myCurriedFunction(2)(3) // myCurriedFunction returns 5
You can also partially apply myCurriedFunction, but again, let's keep it simple for the time being. The main idea is that you can have multiple lists of arguments passed to a function.
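As a quick aside, partially applying myCurriedFunction would look roughly like this (Scala 2 syntax; in Scala 3 the trailing underscore can be dropped):
val addTwo: Int => Int = myCurriedFunction(2) _   // the second parameter list is left open
addTwo(3)                                         // 5, same as myCurriedFunction(2)(3)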
Built on top of this, Scala allows you to define a list of implicit parameters, which the compiler will automatically fill in for you based on some scoping rules. Implicit parameters are used, for example, by Futures:
// this defines how and where callbacks are run
// the compiler will automatically "inject" them for you where needed
implicit val ec: ExecutionContext = concurrent.ExecutionContext.global
Future(4).map(_ + 1) // this will eventually result in a Future(5)
Note that both Future.apply and map have a second parameter list that lets you specify an implicit execution context. With one in scope, the compiler will "inject" it for you at the call site, without you having to write it explicitly. You could still have passed it explicitly, and the result would have been
Future(4)(ec).map(_ + 1)(ec)
That said, I don't know the specifics of the library you are using, but the idea is that you have to instantiate a value of type Core and either bind it to an implicit val or pass it explicitly.
The resulting code will be something like the following
val highs: Vector[Double] = ???
val lows: Vector[Double] = ???
val closes: Vector[Double] = ???
implicit val core: Core = ??? // instantiate core
val resultsFuture = Volatility.ATR(highs, lows, closes) // core is passed implicitly
for (results <- resultsFuture; result <- results) {
println(result)
}
Note that, depending on your situation, you may also have to use an implicit ExecutionContext to run this code (because you are extracting the Vector[Double] from a Future). Choosing the right execution context is a separate issue, but to play around you may want to use the global execution context.
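Putting it all together, a minimal self-contained sketch (assuming ta-lib's Core can be instantiated with new Core(); the sample data is made up, and the global execution context is used):
import com.tictactec.ta.lib.Core
import io.github.patceev.talib.Volatility
import scala.concurrent.ExecutionContext.Implicits.global

object AtrExample extends App {
  // made-up sample data; in practice these come from your price feed
  val highs  = Vector.tabulate(30)(i => 100.0 + i)
  val lows   = highs.map(_ - 2.0)
  val closes = highs.map(_ - 1.0)

  implicit val core: Core = new Core() // assumption: the Java ta-lib Core has a no-arg constructor

  Volatility.ATR(highs, lows, closes).foreach { results =>
    results.foreach(println) // print each ATR value to the console
  }

  Thread.sleep(1000) // crude: keep the JVM alive until the Future completes
}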
Extra
Regarding some of the points I've left open, here are some pointers that hopefully will turn out to be useful:
Operators
Multiple Parameter Lists (Currying)
Implicit Parameters
Scala Futures
I have developed a tool using PySpark. In that tool, the user provides a dict of model parameters, which is then passed to a spark.ml model such as Logistic Regression in the form of LogisticRegression(**params).
Since I am moving to Scala now, I was wondering how this can be done in Spark using Scala. Coming from Python, my intuition is to pass a Scala Map, such as:
val params = Map("regParam" -> 100)
val model = new LogisticRegression().set(params)
Obviously, it's not as trivial as that. It seems that in Scala we need to set every single parameter separately, like:
val model = new LogisticRegression()
.setRegParam(0.3)
I really want to avoid being forced to iterate over all user input parameters and set the appropriate parameters with tons of if clauses.
Any ideas how to solve this as elegantly as in Python?
According to the LogisticRegression API, you need to set each param individually via its setter:
Users can set and get the parameter values through setters and
getters, respectively.
An idea is to build your own mapping function to dynamically call the corresponding param setter using reflection.
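For illustration, here is a minimal sketch of that idea, assuming the user supplies a plain Map[String, Any] and that each key matches a setXxx setter on the estimator (the parameter map below is made up):
import org.apache.spark.ml.classification.LogisticRegression

// hypothetical user-supplied parameters
val params: Map[String, Any] = Map("regParam" -> 0.3, "maxIter" -> 100)

val lr = new LogisticRegression()

params.foreach { case (name, value) =>
  val setterName = "set" + name.capitalize            // e.g. "regParam" -> "setRegParam"
  val setter = lr.getClass.getMethods
    .find(_.getName == setterName)
    .getOrElse(throw new IllegalArgumentException(s"Unknown parameter: $name"))
  setter.invoke(lr, value.asInstanceOf[AnyRef])       // reflection unboxes Int/Double as needed
}

// lr now has regParam = 0.3 and maxIter = 100 set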
Scala is a statically typed language, so by design it doesn't have anything like Python's **params. As already considered, you can store them in a Map[K, Any], but type erasure would erase the types of the Map values due to the JVM's runtime constraints.
Shapeless provides some neat mixed-type features that can circumvent the problem. An alternative is to use Scala's TypeTag to preserve type information, as in the following example:
import scala.reflect.runtime.universe._

case class Params[K](m: Map[(K, TypeTag[_]), Any]) extends AnyVal {
  def add[V](k: K, v: V)(implicit vt: TypeTag[V]) = this.copy(
    m = this.m + ((k, vt) -> v)
  )
  def grab[V](k: K)(implicit vt: TypeTag[V]) = m((k, vt)).asInstanceOf[V]
}
val params = Params[String](Map.empty).
add[Int]("a", 100).
add[String]("b", "xyz").
add[Double]("c", 5.0).
add[List[Int]]("d", List(1, 2, 3))
// params: Params[String] = Params( Map(
// (a,TypeTag[Int]) -> 100, (b,TypeTag[String]) -> xyz, (c,TypeTag[Double]) -> 5.0,
// (d,TypeTag[scala.List[Int]]) -> List(1, 2, 3)
// ) )
params.grab[Int]("a")
// res1: Int = 100
params.grab[String]("b")
// res2: String = xyz
params.grab[Double]("c")
// res3: Double = 5.0
params.grab[List[Int]]("d")
// res4: List[Int] = List(1, 2, 3)
What's the more idiomatic way to write the following?
val startingValue = ...
val result1 = f1(startingValue)
val result2 = f2(result1)
...
val resultN = fN(resultN-1)
If startingValue were a list of items to which I wanted to apply these functions, I could write
startingList.map(f1).map(f2)...map(fN)
I can fake this by doing something like
Some(startingValue).map(f1)....map(fN).get
or
List(startingValue).map(f1)....map(fN).head
but this seems unnecessarily confusing.
Note: This question seems related, but appears to be about a downstream issue.
(f1 andThen f2 andThen ... fN) {
startingValue
}
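For example, a small sketch with made-up functions:
val f1: Int => Int    = _ + 1
val f2: Int => Int    = _ * 2
val f3: Int => String = n => s"result: $n"

val pipeline = f1 andThen f2 andThen f3
println(pipeline(20))   // "result: 42"
If f1 … fN are methods rather than function values, andThen still works once they are eta-expanded to functions (in Scala 2 you may need to write f1 _).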
Use the forward pipe operator. For example (for def f(s: String) = s.length):
scala> "abc" |> f |> 0.until
res16: scala.collection.immutable.Range = Range(0, 1, 2)
You can find it in Scalaz, but you can find its definition elsewhere as well.
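As an aside, since Scala 2.13 a similar forward-pipe method ships in the standard library as scala.util.chaining.pipe, so Scalaz isn't strictly required. A minimal sketch:
import scala.util.chaining._

def f(s: String): Int = s.length

val r = "abc".pipe(f).pipe(n => 0 until n)
// r: Range = Range 0 until 3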
You can define an implicit class (an extension-method wrapper) that is also a value class. A value class is a special Scala construct that wraps a single value in a new compile-time type while keeping the exact same runtime object, thus avoiding any new memory allocation. The class would look like:
implicit class Piper[A](val x: A) extends AnyVal {
  def |>[B](f: A => B): B = f(x)
}
Now when you do:
startingValue |> f1 |> f2 |> ... |> fN
Scala implicitly wraps up the starting value and each intermediate value in a Piper object at compile time and applies the Piper object's |> method, passing the intermediate values forward. At runtime no extra memory or time costs are incurred.
via /u/Baccata64
You could also consider wrapping your value in a Try and then applying map(fN) to it. That way you have a guarantee that if any function fails, you won't get an unexpected exception; you can then match on Success/Failure and do some recovery, or just print the exact failure.
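A minimal sketch of that idea (the functions and the starting value are made up):
import scala.util.{Failure, Success, Try}

def f1(i: Int): Int    = i - 1
def f2(i: Int): Int    = 100 / i           // throws ArithmeticException when i == 0
def f3(i: Int): String = s"result: $i"

val startingValue = 1

Try(startingValue).map(f1).map(f2).map(f3) match {
  case Success(value) => println(value)
  case Failure(ex)    => println(s"computation failed: ${ex.getMessage}")  // "computation failed: / by zero"
}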
(I'm using Scala nightlies, and see the same behaviour in 2.8.0b1 RC4. I'm a Scala newcomer.)
I have two SortedMaps that I'd like to form the union of. Here's the code I'd like to use:
import scala.collection._

object ViewBoundExample {
  class X
  def combine[Y](a: SortedMap[X, Y], b: SortedMap[X, Y]): SortedMap[X, Y] = {
    a ++ b
  }
  implicit def orderedX(x: X): Ordered[X] = new Ordered[X] { def compare(that: X) = 0 }
}
The idea here is that the implicit definition means Xs can be converted to Ordered[X]s, and then it makes sense to combine SortedMaps into another SortedMap, rather than just a Map.
When I compile, I get
sieversii:scala-2.8.0.Beta1-RC4 scott$ bin/scalac -version
Scala compiler version 2.8.0.Beta1-RC4 -- Copyright 2002-2010, LAMP/EPFL
sieversii:scala-2.8.0.Beta1-RC4 scott$ bin/scalac ViewBoundExample.scala
ViewBoundExample.scala:8: error: type arguments [ViewBoundExample.X] do not
conform to method ordered's type parameter bounds [A <: scala.math.Ordered[A]]
a ++ b
^
one error found
It seems my problem would go away if that type parameter bound was [A <% scala.math.Ordered[A]], rather than [A <: scala.math.Ordered[A]]. Unfortunately, I can't even work out where the method 'ordered' lives! Can anyone help me track it down?
Failing that, what am I meant to do to produce the union of two SortedMaps? If I remove the return type of combine (or change it to Map) everything works fine --- but then I can't rely on the return being sorted!
Currently, what you are using is the scala.collection.SortedMap trait, whose ++ method is inherited from the MapLike trait. Therefore, you see the following behaviour:
scala> import scala.collection.SortedMap
import scala.collection.SortedMap
scala> val a = SortedMap(1->2, 3->4)
a: scala.collection.SortedMap[Int,Int] = Map(1 -> 2, 3 -> 4)
scala> val b = SortedMap(2->3, 4->5)
b: scala.collection.SortedMap[Int,Int] = Map(2 -> 3, 4 -> 5)
scala> a ++ b
res0: scala.collection.Map[Int,Int] = Map(1 -> 2, 2 -> 3, 3 -> 4, 4 -> 5)
scala> b ++ a
res1: scala.collection.Map[Int,Int] = Map(1 -> 2, 2 -> 3, 3 -> 4, 4 -> 5)
The return type of ++ is Map[Int, Int], because that is the only type it makes sense for the ++ method of a MapLike object to return. It seems that ++ keeps the sorted property of the SortedMap, which I guess is because ++ uses abstract methods to do the concatenation, and those abstract methods are defined so as to keep the order of the map.
To have the union of two sorted maps, I suggest you use scala.collection.immutable.SortedMap.
scala> import scala.collection.immutable.SortedMap
import scala.collection.immutable.SortedMap
scala> val a = SortedMap(1->2, 3->4)
a: scala.collection.immutable.SortedMap[Int,Int] = Map(1 -> 2, 3 -> 4)
scala> val b = SortedMap(2->3, 4->5)
b: scala.collection.immutable.SortedMap[Int,Int] = Map(2 -> 3, 4 -> 5)
scala> a ++ b
res2: scala.collection.immutable.SortedMap[Int,Int] = Map(1 -> 2, 2 -> 3, 3 -> 4, 4 -> 5)
scala> b ++ a
res3: scala.collection.immutable.SortedMap[Int,Int] = Map(1 -> 2, 2 -> 3, 3 -> 4, 4 -> 5)
This implementation of the SortedMap trait declares a ++ method which returns a SortedMap.
Now a couple of answers to your questions about the type bounds:
Ordered[T] is a trait which, if mixed into a class, specifies that instances of that class can be compared using <, >, <=, and >=. You just have to define the abstract method compare(that: T), which returns a negative number for this < that, a positive number for this > that, and 0 for this == that. All the other methods are then implemented in the trait based on the result of compare.
T <% U represents a view bound in Scala. This means that type T is either a subtype of U or can be implicitly converted to U by an implicit conversion in scope. The code works if you put <% but not with <:, as X is not a subtype of Ordered[X] but can be implicitly converted to Ordered[X] using the orderedX implicit conversion.
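To make the view bound concrete, here is a small sketch (the method names are made up); the second definition is roughly what the compiler desugars the first one into. View bounds are gone in Scala 3, where only the explicit implicit-parameter form remains.
// A view bound: A is usable as if it were Ordered[A]
def smallest[A <% Ordered[A]](xs: Seq[A]): A =
  xs.reduceLeft((a, b) => if (a < b) a else b)

// Roughly equivalent desugaring: an extra implicit conversion parameter
def smallest2[A](xs: Seq[A])(implicit view: A => Ordered[A]): A =
  xs.reduceLeft((a, b) => if (a < b) a else b)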
Edit: Regarding your comment. If you are using scala.collection.immutable.SortedMap, you are still programming to an interface, not to an implementation, as the immutable SortedMap is defined as a trait. You can view it as a more specialised trait than scala.collection.SortedMap, which provides additional operations (like the ++ which returns a SortedMap) and the property of being immutable. This is in line with the Scala philosophy of preferring immutability, so I don't see any problem with using the immutable SortedMap. In this case you can guarantee that the result will definitely be sorted, and this can't change, as the collection is immutable.
Still, I find it strange that scala.collection.SortedMap does not provide a ++ method which returns a SortedMap as a result. All the limited testing I have done seems to suggest that concatenating two scala.collection.SortedMaps does indeed produce a map which keeps the sorted property.
Have you picked a tough nut to crack as a beginner to Scala! :-)
Ok, brief tour, don't expect to fully understand it right now. First, note that the problem happens at the method ++. Searching for its definition, we find it in the trait MapLike, with overloads receiving either an Iterator or a Traversable. Since b is a SortedMap, and hence a Traversable, it is the Traversable version being used.
Note in its extensive type signature that there is a CanBuildFrom being passed. It is being passed implicitly, so you don't normally need to worry about it. However, to understand what is going on, this time you do.
You can locate CanBuildFrom either by clicking on it where it appears in the definition of ++, or by filtering. As mentioned by Randall in the comments, there's an unmarked blank field in the upper left of the Scaladoc page. You just have to click there and type, and it will return matches for whatever you typed.
So, look up the trait CanBuildFrom on ScalaDoc and select it. It has a large number of subclasses, each one responsible for building a specific type of collection. Search for and click on the subclass SortedMapCanBuildFrom. This is the class of the object you need to produce a SortedMap from a Traversable. Note on the instance constructor (the constructor for the class) that it receives an implicit Ordering parameter. Now we are getting closer.
This time, use that filter field to search for Ordering. Its companion object (click on the small "o" next to the name) hosts an implicit that will generate Orderings, as companion objects are examined for implicits generating instances of, or conversions for, that class. It is defined inside the trait LowPriorityOrderingImplicits, which object Ordering extends, and looking at it you'll see the method ordered[A <: Ordered[A]], which will produce the Ordering required... or would produce it, if only there wasn't a problem.
One might assume the implicit conversion from X to Ordered[X] would be enough, just as I assumed before looking more carefully into this. That, however, is a conversion of objects, whereas ordered expects a type which is a subtype of Ordered[X]. While one can convert an object of type X into an object of type Ordered[X], X itself is not a subtype of Ordered[X], so it can't be passed as a type argument to ordered.
On the other hand, you can create an implicit val of type Ordering[X], instead of the implicit def returning Ordered[X], and you'll get around the problem. Specifically:
import scala.collection._

object ViewBoundExample {
  class X
  def combine[Y](a: SortedMap[X, Y], b: SortedMap[X, Y]): SortedMap[X, Y] = {
    a ++ b
  }
  implicit val orderingX: Ordering[X] = new Ordering[X] { def compare(x: X, y: X) = 0 }
}
I think most people's initial reaction to Ordered/Ordering must be one of perplexity: why have two classes for the same thing? The former extends java.lang.Comparable, whereas the latter extends java.util.Comparator. Alas, the type signatures for compare pretty much sum up the main difference:
def compare(that: A): Int // Ordered
def compare(x: T, y: T): Int // Ordering
Using Ordered[A] requires either that A extend Ordered[A], which means being able to modify A's definition, or that you pass along a method which can convert an A into an Ordered[A]. Scala is perfectly capable of doing the latter easily, but then you have to convert each instance before comparing.
On the other hand, using Ordering[A] requires the creation of a single object, such as the one demonstrated above. When you use it, you just pass the two objects of type A to compare; no new objects get created in the process.
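To make the contrast concrete, a small sketch (the Version and Price classes are made up): with Ordered you modify the class itself, whereas with Ordering you supply a single external comparator and leave the class alone.
// Ordered: the class itself must implement compare
class Version(val n: Int) extends Ordered[Version] {
  def compare(that: Version): Int = this.n compare that.n
}

// Ordering: one external comparator object, no change to the class
class Price(val cents: Long)
implicit val priceOrdering: Ordering[Price] = Ordering.by[Price, Long](_.cents)

List(new Version(3), new Version(1)).min.n       // 1, found via Ordered
List(new Price(300), new Price(150)).min.cents   // 150, found via the implicit Ordering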
So there are some performance gains to be had, but there is a much more important reason for Scala's preference for Ordering over Ordered. Look again at the companion object of Ordering. You'll note that there are several implicits for many of Scala's classes defined in there. You may recall I mentioned earlier that an implicit for a class T will be searched for inside the companion object of T, and that's exactly what is going on here.
This could be done for Ordered as well. However, and this is the sticking point, that would mean every method supporting both Ordering and Ordered would fail! That's because Scala would look for an implicit to make it work, and would find two: one for Ordering, one for Ordered. Being unable to decide which one you wanted, Scala gives up with an ambiguity error. So a choice had to be made, and Ordering had more going for it.
Duh, I forgot to explain why the signature isn't defined as ordered[A <% Ordered[A]] instead of ordered[A <: Ordered[A]]. I suspect doing so would cause the double-implicits failure I mentioned before, but I'll ask the person who actually worked on this and ran into the double-implicit problems whether this particular method would be problematic.