Scala: why can I not execute a lambda function passed as an argument to another function?

I am trying to create a concurrent method that saves the results of two blocks of code, task1 and task2, in two different variables:
def parallel[A, B](fun_a: => A, fun_b: => B): (A, B) = {
  var res_a: A = 1.asInstanceOf[A]
  var res_b: B = 1.asInstanceOf[B]
  val hilo = new Thread {
    override def run(): Unit = {
      res_a = fun_a
    }
  }
  val hilo1 = new Thread() {
    override def run(): Unit = {
      res_b = fun_b
    }
  }
  hilo.start()
  hilo1.start()
  hilo.join()
  hilo1.join()
  (res_a, res_b)
}
val task1 = { 3 + 4 }
val task2 = { 3 * 1 }
println(parallel(task1, task2))
This is the correct answer in theory, but I do not understand whether val task1/task2 are functions or just the result of the block inside the braces. I tried to edit the method parallel to accept plain generic values:
def parallel[A, B](fun_a: A, fun_b: B): (A, B)
and it still works fine. In this case, is the computation of task1 and task2 done before they are passed as parameters to the function, or when the result is assigned to the variables res_a and res_b in each thread?
I also tried passing task1 as a method and task2 as an anonymous function: the method works fine, but for the function the variable ends up holding the function value itself (printed as something like <function1>) instead of a result.
def task1(): Int = { 3 + 4 } // method
val task2: Int => Int = (x: Int) => { 3 * x } // anonymous function
println(parallel(task1, task2))
I changed the code so that the anonymous function gets executed, but now I get an error:
def parallel[A, B](fun_a: => A, fun_b: => B): (A, B) = {
  var res_a: A = 1.asInstanceOf[A]
  var res_b: B = 1.asInstanceOf[B]
  val hilo = new Thread {
    override def run(): Unit = {
      res_a = fun_a
    }
  }
  val hilo1 = new Thread() {
    override def run(): Unit = {
      res_b = fun_b(3) // task2 is a function (x: Int) => 3 * x, so I try to pass its argument here
    }
  }
  hilo.start()
  hilo1.start()
  hilo.join()
  hilo1.join()
  (res_a, res_b)
}
It says that B does not take parameters, but I defined task2 as a lambda that takes one parameter.
I do not understand why, having declared the method as one that takes two functions as arguments, it does not let me pass arguments to those functions.

The following code works:
def parallel[A, B](fun_a: => A, fun_b: Int => B): (A, B) = {
  var res_a: A = 1.asInstanceOf[A]
  var res_b: B = 1.asInstanceOf[B]
  val hilo = new Thread {
    override def run(): Unit = {
      res_a = fun_a
    }
  }
  val hilo1 = new Thread() {
    override def run(): Unit = {
      res_b = fun_b(3)
    }
  }
  hilo.start()
  hilo1.start()
  hilo.join()
  hilo1.join()
  (res_a, res_b)
}
println(
  parallel(
    { 3 * 3 },
    (x: Int) => { 3 + x }
  )
)
Note how the type declaration of fun_b is changed to fun_b: Int => B. In this case, fun_a doesn't do much, though.
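To see when each kind of parameter is evaluated, here is a minimal sketch (the names byValue, byName and byFunction are made up for illustration): a plain parameter is computed at the call site, a by-name parameter (=> A) only when it is used inside the method body, and a function parameter (Int => B) only when it is applied to an argument.
def byValue[A](a: A): A = { println("inside byValue"); a }
def byName[A](a: => A): A = { println("inside byName"); a }
def byFunction[B](f: Int => B): B = { println("inside byFunction"); f(3) }

byValue { println("block evaluated"); 3 + 4 }           // "block evaluated" prints before "inside byValue"
byName { println("block evaluated"); 3 + 4 }            // "inside byName" prints before "block evaluated"
byFunction { x => println("lambda evaluated"); 3 * x }  // the lambda body only runs when f(3) is applied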
And admittedly, the generic type doesn't make much sense if Int is hard-coded in there. So a more sensible solution would be something like this:
def parallel[A, B](fun_a: A => A, fun_b: B => B, a: A, b: B): (A, B) = {
  var res_a: A = 1.asInstanceOf[A]
  var res_b: B = 1.asInstanceOf[B]
  val hilo = new Thread {
    override def run(): Unit = {
      res_a = fun_a(a)
    }
  }
  val hilo1 = new Thread() {
    override def run(): Unit = {
      res_b = fun_b(b)
    }
  }
  hilo.start()
  hilo1.start()
  hilo.join()
  hilo1.join()
  (res_a, res_b)
}
println(
  parallel(
    (y: Int) => { y * 3 },
    (x: Int) => { x + 3 },
    3,
    3
  )
)
println(
  parallel(
    (y: Int) => { y * 3 },
    (x: String) => { x + 3 },
    3,
    "3"
  )
)
println(
  parallel(
    (y: String) => { y * 3 },
    (x: Int) => { x + 3 },
    "3",
    3
  )
)
This will print:
(9,6)
(9,33)
(333,6)
In the String cases, "3" + 3 concatenates to "33", and "3" * 3 repeats the string three times, giving "333".


How should I get B from A => B

I'm new to Scala, and I'm running into this strange situation.
def bar[A, B](implicit foo: A => B): B = {
  // do something
  foo
}
And then I got an error like:
required: B, found: A => B
How should I get B from A => B?
Here's the reason why I did this. I have two functions:
def funcA: String = {
  def getStrA: String = "A"
  // then there's the same operation in both functions
  Try { } match {
    case Success(_) => getStrA
    case Failure(_) => // exactly the same error handler in both functions
  }
}
def funcB: Int = {
  def doSomething(x: Int): Int = {
    // do something
    x / 1
  }
  val x = 1
  Try { } match {
    case Success(_) => doSomething(1)
    case Failure(_) => // exactly the same error handler in both functions
  }
}
Here's what I want to achieve
def funcA: String = {
  implicit def getStrA: String = "A"
  bar
}
def funcB: Int = {
  val x = 1
  implicit def doSomething(x: Int): Int = {
    // do something
    x / 1
  }
  bar
}
def bar[A, B](implicit foo: A => B): B = {
  Try { } match {
    case Success(_) => foo
    case Failure(_) => // exactly the same error handler in both functions
  }
}
You have a conversion from A to B. You need to return B. The only way to do this is to pass A into the function. This signature has an implied assumption that you have some valid A value (most likely hardcoded) that you will always use here.
def bar[A, B](implicit foo: A => B): B = {
  val a: A = ... // hmm...
  foo(a)
}
Considering that A is parametric, you are either missing some information, or this A is impossible to create (it cannot be null, because not all types can take null as a value), so you might need to throw an exception in that case. Probably you are either missing some provider of A, or this operation should always fail.
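A hedged sketch of that advice: if the caller really does have a concrete A, the simplest fix is to accept it explicitly (barWith is a made-up name):
// Take the A value from the caller; the implicit conversion then produces the B.
def barWith[A, B](a: A)(implicit foo: A => B): B = foo(a)

// e.g. barWith(1)(i => i.toString), passing the conversion explicitly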
UPDATE:
There is no need to use implicits at all in your code:
def bar[A, B](onSuccess: A => B) =
  Try { some operations } match {
    case Success(value) => onSuccess(value)
    case Failure(_) => // error handler
  }
def funcA = bar(_ => "A")
def funcB = bar(_ => 1)
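A self-contained version of that idea, with the elided pieces filled in purely for illustration (the concrete operation and error handler below are made up):
import scala.util.{Failure, Success, Try}

// Runnable sketch of the no-implicits approach: the shared Try/error handling
// lives in bar, and each caller passes its own success handler.
def bar[A, B](operation: => A)(onSuccess: A => B)(onFailure: Throwable => B): B =
  Try(operation) match {
    case Success(value) => onSuccess(value)
    case Failure(e)     => onFailure(e)
  }

def funcA: String = bar("some input")(_ => "A")(_ => "error")
def funcB: Int    = bar(42)(x => x / 1)(_ => -1)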

Wrap higher order function into progress bar

I have an iteration module which can apply an arbitrary function (Build generic reusable iteration module from higher order function) and would love to wrap it into a progressbar.
val things = Range(1, 10)
def iterationModule[A](
    iterationItems: Seq[A],
    functionToApply: A => Any
): Unit = {
  iterationItems.foreach(functionToApply)
}
def foo(s: Int) = println(s)
iterationModule[Int](things, foo)
A basic progressbar could look like:
import me.tongfei.progressbar.ProgressBar
val pb = new ProgressBar("Test", things.size)
things.foreach(t => {
  println(t)
  pb.step()
})
But how can the function that is passed to the iteration module be intercepted and surrounded with a progress bar, i.e. made to call pb.step?
An annoying possibility would be to pass the mutable pb object into each function (have it implement an interface).
But is it also possible to intercept the function being passed and surround it with this stepping logic?
However, when looping with Seq().par.foreach, this might be problematic.
I need the code to work in Scala 2.11.
edit
A more complex example:
val things = Range(1, 100).map(_.toString)
def iterationModule[A](
    iterationItems: Seq[A],
    functionToApply: A => Any,
    parallel: Boolean = false
): Unit = {
  val pb = new ProgressBar(functionToApply.toString(), iterationItems.size)
  if (parallel) {
    iterationItems.par.foreach(functionToApply)
  } else {
    iterationItems.foreach(functionToApply)
  }
}
def doStuff(inputDay: String, inputConfigSomething: String): Unit =
  println(inputDay + "__" + inputConfigSomething)
iterationModule[String](things, doStuff(_, "foo"))
The function should be able to take the iteration item and additional parameters.
edit 2
import me.tongfei.progressbar.ProgressBar
val things = Range(1, 100).map(_.toString)
def doStuff(inputDay: String, inputConfigSomething: String): Unit =
  println(inputDay + "__" + inputConfigSomething)
def iterationModulePb[A](items: Seq[A], f: A => Any, parallel: Boolean = false): Unit = {
  val pb = new ProgressBar(f.toString, items.size)
  val it = if (parallel) {
    items.par.iterator
  } else {
    items.iterator
  }
  it.foreach { x =>
    f(x)
    pb.step()
  }
}
iterationModulePb[String](things, doStuff(_, "foo"))
After a little discussion I figured out how to use a Seq with standard iterators.
For Scala 2.13 this would be the most general form.
import me.tongfei.progressbar.ProgressBar
def iterationModule[A](items: IterableOnce[A], f: A => Any): Unit = {
  val (it, pb) =
    if (items.knownSize != -1)
      items.iterator -> new ProgressBar("Test", items.knownSize)
    else {
      val (iter1, iter2) = items.iterator.duplicate
      iter1 -> new ProgressBar("Test", iter2.size)
    }
  it.foreach { x =>
    f(x)
    pb.step()
  }
}
Note: most of the changes are there to make the code more generic; the general idea is simply to create a function that wraps both the original function and the call to the ProgressBar.
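That wrapping idea can also be expressed on its own; a minimal sketch (the helper name withStep is made up):
import me.tongfei.progressbar.ProgressBar

// Wrap any A => Any so that every invocation also advances the progress bar.
def withStep[A](f: A => Any, pb: ProgressBar): A => Any = { a =>
  f(a)
  pb.step()
}

// Hypothetical usage with the iterationModule from the question:
// iterationModule[Int](things, withStep(foo, new ProgressBar("Test", things.size)))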
Edit
A simplified solution for 2.11
def iterationModule[A](items: Seq[A], parallel: Boolean = false)
                      (f: A => Any): Unit = {
  val pb = new ProgressBar("test", items.size)
  // Iterator has no .par in Scala 2.11, so switch between the plain and the parallel collection
  val it = if (parallel) {
    items.par
  } else {
    items
  }
  it.foreach { a =>
    f(a)
    pb.step()
  }
}
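For completeness, a call to this 2.11 version using the things and doStuff defined earlier might look like this:
// Sequential run, stepping the progress bar once per item.
iterationModule(things)(x => doStuff(x, "foo"))

// Parallel run; note that pb.step() is then called from multiple threads.
iterationModule(things, parallel = true)(x => doStuff(x, "foo"))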

How to pass a Scala class as an object into a function parameter?

How do I run the refint1 function? I've tried var x = new RefInt(5) and then doing argpass.refint1(x) in the REPL, but I get a "found: RefInt, required: argpass.RefInt => Unit" error in the console.
object argpass {
  class RefInt(initial: Int) {
    private var n: Int = initial
    def get(): Int = n
    def set(m: Int): Unit = { n = m }
  }

  def refint1(f: RefInt => Unit): (Int, Int, Int) = {
    var x = new RefInt(5)
    val first = f(x)
    val firstget = x.get
    val sec = f(x)
    val secget = x.get
    val third = f(x)
    val thirdget = x.get
    (firstget, secget, thirdget)
  }
}
// How do I run the refint1 function?
As Luis said in the comments, f returns Unit, which is basically void. This should solve your problem:
class RefInt(initial: Int) {
  var n: Int = initial
  def get(): Int = n
  def set(m: Int): Unit = { n = m }
}

def refint1(f: RefInt => Unit): (Int, Int, Int) = {
  var x = new RefInt(5)
  f(x)
  val firstget = x.get
  f(x)
  val secget = x.get
  f(x)
  val thirdget = x.get
  (firstget, secget, thirdget)
}
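To actually answer the "how do I run it" part: the call needs a RefInt => Unit function, not a RefInt value. For example (a hypothetical mutating function, assuming the fixed version above):
// Each call adds 5 to the stored value, so starting from 5 this returns (10, 15, 20).
println(refint1(r => r.set(r.get() + 5)))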
That being said, I think you can improve your design a little bit. Here's a different approach to solve the same problem:
case class RefInt(initial: Int)

def refInt1(initial: RefInt, f: RefInt => RefInt): (Int, Int, Int) = {
  val x0 = f(initial)
  val x1 = f(x0)
  val x2 = f(x1)
  (x0.initial, x1.initial, x2.initial)
}

println(refInt1(RefInt(5), ri => ri.copy(ri.initial * 2)))
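// prints (10,20,40): each application of the passed function doubles the previous value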

Debug a custom Pipeline Transformer in Flink

I am trying to implement a custom Transformer in Flink, following the indications in its documentation, but when I try to execute it, it seems the fit operation is never called. Here is what I've done so far:
class InfoGainTransformer extends Transformer[InfoGainTransformer] {
  import InfoGainTransformer._
  private[this] var counts: Option[collection.immutable.Vector[Map[Key, Double]]] = None
  // here setters for params, as Flink does
}
object InfoGainTransformer {

  // ====================================== Parameters =============================================
  // ...
  // ==================================== Factory methods ==========================================
  // ...
  // ========================================== Operations =========================================

  implicit def fitLabeledVectorInfoGain = new FitOperation[InfoGainTransformer, LabeledVector] {
    override def fit(instance: InfoGainTransformer, fitParameters: ParameterMap, input: DataSet[LabeledVector]): Unit = {
      val counts = collection.immutable.Vector[Map[Key, Double]]()
      input.map {
        v =>
          v.vector.map {
            case (i, value) =>
              println("INSIDE!!!")
              val key = Key(value, v.label)
              val cval = counts(i).getOrElse(key, .0)
              counts(i) + (key -> cval)
          }
      }
    }
  }

  implicit def fitVectorInfoGain[T <: Vector] = new FitOperation[InfoGainTransformer, T] {
    override def fit(instance: InfoGainTransformer, fitParameters: ParameterMap, input: DataSet[T]): Unit = {
      input
    }
  }

  implicit def transformLabeledVectorsInfoGain = {
    new TransformDataSetOperation[InfoGainTransformer, LabeledVector, LabeledVector] {
      override def transformDataSet(
          instance: InfoGainTransformer,
          transformParameters: ParameterMap,
          input: DataSet[LabeledVector]): DataSet[LabeledVector] = input
    }
  }

  implicit def transformVectorsInfoGain[T <: Vector : BreezeVectorConverter : TypeInformation : ClassTag] = {
    new TransformDataSetOperation[InfoGainTransformer, T, T] {
      override def transformDataSet(instance: InfoGainTransformer, transformParameters: ParameterMap, input: DataSet[T]): DataSet[T] = input
    }
  }
}
Then I tried to use it in two ways:
val scaler = StandardScaler()
val polyFeatures = PolynomialFeatures()
val mlr = MultipleLinearRegression()
val gain = InfoGainTransformer().setK(2)
// Construct the pipeline
val pipeline = scaler
  .chainTransformer(polyFeatures)
  .chainTransformer(gain)
  .chainPredictor(mlr)
val r = pipeline.predict(dataSet map (_.vector))
r.print()
And only my transformer:
pipeline.fit(dataSet)
In both cases, when I set a breakpoint inside fitLabeledVectorInfoGain, for example on the input.map line, the debugger stops there, but if I also set a breakpoint inside the nested map, for example below println("INSIDE!!!"), it never stops there.
Does anyone know how I could debug this custom transformer?
It seems it's working now. I think what was happening was that I wasn't implementing the FitOperation correctly, because nothing was being saved in the instance state. This is the implementation now:
implicit def fitLabeledVectorInfoGain = new FitOperation[InfoGainTransformer, LabeledVector] {
  override def fit(instance: InfoGainTransformer, fitParameters: ParameterMap, input: DataSet[LabeledVector]): Unit = {
    // val counts = collection.immutable.Vector[Map[Key, Double]]()
    val r = input.map {
      v =>
        v.vector.foldLeft(Map.empty[Key, Double]) {
          case (m, (i, value)) =>
            println("INSIDE fit!!!")
            val key = Key(value, v.label)
            val cval = m.getOrElse(key, .0) + 1.0
            m + (key -> cval)
        }
    }
    instance.counts = Some(r)
  }
}
Now the debugger hits all breakpoints correctly, and the TransformOperation is also being called.
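Stripped of the Flink-specific types, the counting pattern used in the fixed fit is just a foldLeft that builds a frequency map per vector; a plain-Scala sketch of that pattern (this Key case class is a stand-in for the one in the original code):
case class Key(value: Double, label: Double)

// One (index, value) sequence plays the role of v.vector; label plays the role of v.label.
def countKeys(vector: Seq[(Int, Double)], label: Double): Map[Key, Double] =
  vector.foldLeft(Map.empty[Key, Double]) {
    case (m, (_, value)) =>
      val key = Key(value, label)
      m + (key -> (m.getOrElse(key, 0.0) + 1.0))
  }

// countKeys(Seq(0 -> 1.0, 1 -> 1.0, 2 -> 2.0), label = 1.0)
// => Map(Key(1.0,1.0) -> 2.0, Key(2.0,1.0) -> 1.0)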

Scala worksheet not working for this code, no compilation error shown

I am trying something in a Scala worksheet in Eclipse. It is not showing any output, and doesn't show any error or warning either.
object stream {
  println("Welcome to the Scala worksheet")

  def cons[T](hd: T, t1: => Stream[T]): Stream[T] = new Stream[T] {
    def head = hd
    private var t1Opt: Option[Stream[T]] = None
    def tail: Stream[T] = t1Opt match {
      case Some(x) => x
      case None => t1Opt = Some(t1); tail
    }
  }

  def streamIncrementedby2(x: Int): Stream[Int] = x #:: streamIncrementedby2(x + 2)

  val x = this.cons(-1, this.streamIncrementedby2(5))
  println(x.head)
}
I am trying out the example from the Coursera Odersky course (Functional Program Design in Scala, week 3 video). Interestingly, in the above example, if I remove everything below the first println statement, I see an evaluated output.
******* Update ********
To help other readers, I am posting a corrected version of the above program, inspired by the answer.
def cons[T](hd: T, t1: => Stream[T]) = new Stream[T] {
  override def head = hd
  override def isEmpty = false
  private[this] var t1Opt: Option[Stream[T]] = None
  def tailDefined: Boolean = true
  override def tail: Stream[T] = t1Opt match {
    case Some(x) => x
    case None => { t1Opt = Some(t1); tail }
  }
}
If you just want to make a generic element the head of a Stream, you can use the existing method in the Stream companion object called cons:
def streamIncrementedby2(x: Int): Stream[Int] = x #:: streamIncrementedby2(x + 2)
val x = Stream.cons(-1, this.streamIncrementedby2(5))
println(x.head)
It works fine. However, if you want to make your own version, you have to dig deeper. With the following definition you are constructing a new anonymous subclass of Stream, not an ordinary function:
def cons[T](hd: T, t1: => Stream[T]): Stream[T] = new Stream[T] { ...
The key thing here is = new Stream[T]. Therefore you have to provide everything a proper Stream subclass needs: overriding the abstract members head, tail and isEmpty, and providing the necessary tailDefined method.
def classCons[T](hd: T, t1: => Stream[T]): Stream[T] = new Stream[T] {
  override def isEmpty = false
  override def head = hd
  private[this] var tlVal: Stream[T] = _
  def tailDefined: Boolean = tlVal ne null
  override def tail: Stream[T] = {
    if (!tailDefined)
      synchronized {
        if (!tailDefined)
          tlVal = t1
      }
    tlVal
  }
}
(Inspiration: the implementation of Stream.cons in the Scala standard library.)
You can also make your cons function a normal function and get the same result without messing around with constructors.
def funcCons[T](hd: T, t1: => Stream[T]): Stream[T] = {
  if (t1.isEmpty)
    Stream(hd)
  else
    hd #:: t1
}
results are the same:
val ccStream = classCons(-1, this.streamIncrementedby2(5))
val ffStream = funcCons(-1, this.streamIncrementedby2(5))
println(ccStream.head)
println(ffStream.head)
// Result => -1
// Result => -1
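As a further check (not in the original post), taking a few elements shows both versions build the same lazy stream, assuming the corrected funcCons above:
println(ccStream.take(4).toList) // List(-1, 5, 7, 9)
println(ffStream.take(4).toList) // List(-1, 5, 7, 9)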