I am porting this JavaScript function to Scala.js:
once: function (el, type, callback) {
    var typeArray = type.split(' ');
    for (var i = typeArray.length - 1; i >= 0; i--) {
        el.addEventListener(typeArray[i], function (e) {
            e.target.removeEventListener(e.type, arguments.callee);
            return callback(e);
        });
    }
},
Here is my attempt at writing the Scala code:
def once(element: TopNode, tpe: String, callback: Function1[Event, Any]): Unit = {
  tpe.split(" ").foreach(item => element.addEventListener(item, (e: Event) => {
    e.target.removeEventListener(e.`type`, ?) // <-- what do I put here?
    callback(e)
  }))
}
How can I reference my lambda in that placeholder?
Scala.js does not have an equivalent of JavaScript's arguments.callee. More generally, it does not have an equivalent of arguments, so a lambda cannot obtain a reference to itself unless that reference is provided through its captured environment. This can be achieved by storing the lambda in a val in the once method. Ideally, one would like to write this:
def once(element: TopNode, tpe: String,
    callback: Function1[Event, Any]): Unit = {
  val cb: js.Function1[Event, Any] = { (e: Event) =>
    e.target.removeEventListener(e.`type`, cb) // using cb here
    callback(e)
  }
  tpe.split(" ").foreach(item => element.addEventListener(item, cb))
}
However, this won't compile, because you cannot use a val (here cb) in the right-hand-side of its definition, even if that happens to be inside a lambda. Trying to do so results in the following compile error:
Main.scala:17: error: forward reference extends over definition of value cb
e.target.removeEventListener(e.`type`, cb) // using cb here
There is a simple solution, though: use a lazy val instead:
def once(element: TopNode, tpe: String,
    callback: Function1[Event, Any]): Unit = {
  lazy val cb: js.Function1[Event, Any] = { (e: Event) =>
    e.target.removeEventListener(e.`type`, cb) // using cb here
    callback(e)
  }
  tpe.split(" ").foreach(item => element.addEventListener(item, cb))
}
Make sure to declare cb as a js.Function1, not a Function1. Otherwise cb would refer to the Scala function rather than the JS function resulting from the automatic conversion, and the two have different identities. You have to make sure cb refers to the JS function, since that is the value addEventListener and removeEventListener actually see.
Fiddle: http://www.scala-js-fiddle.com/gist/2b848e2f01e7af522dc1
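The identity pitfall can be sketched on the JVM with a hypothetical JsFn wrapper standing in for js.Function1 (the names here are invented for the sketch): each implicit conversion produces a fresh object, so identity-based removal would never match.

```scala
object IdentityDemo {
  import scala.language.implicitConversions

  // Hypothetical stand-in for js.Function1: each conversion wraps anew
  class JsFn[A, B](val f: A => B)
  implicit def fromScala[A, B](f: A => B): JsFn[A, B] = new JsFn(f)

  val scalaCb: Int => Int = _ + 1
  val a: JsFn[Int, Int] = scalaCb // conversion #1
  val b: JsFn[Int, Int] = scalaCb // conversion #2: a different object
  // a eq b is false: registering `a` and later removing `b` would not match.
  // Typing the val as JsFn (like js.Function1 above) converts exactly once.
}
```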
Related
def map[U: ClassTag](f: T => U): RDD[U] = withScope {
  val cleanF = sc.clean(f)
  new MapPartitionsRDD[U, T](this, (context, pid, iter) => iter.map(cleanF))
}
This code snippet is from the Spark 2.2 source code. I am not proficient in Scala, so I am wondering whether anyone can explain this code from a programming perspective. I am not sure what the square brackets after map do. Also, according to https://www.tutorialspoint.com/scala/scala_functions.htm, a Scala function should just have curly brackets after '=', so why does this code snippet have a function named withScope after the '=' sign?
Actually, a Scala method body does not need braces after "=". For example:
def func(): Int = 1
So you can read withScope { ... } as a call to a method named withScope whose result type is RDD[U]; the body of map simply invokes withScope, passing it a block.
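As for the square brackets: [U: ClassTag] declares a type parameter U with a context bound, which is shorthand for an extra implicit ClassTag[U] argument (Spark needs it to build arrays of U at runtime, since type arguments are erased). A minimal illustration with an invented helper:

```scala
import scala.reflect.ClassTag

object ClassTagDemo {
  // [U: ClassTag] is sugar for an extra (implicit ct: ClassTag[U]) list;
  // the tag lets us instantiate an Array[U] despite erasure.
  def toArray[U: ClassTag](xs: List[U]): Array[U] = {
    val arr = new Array[U](xs.length)
    xs.zipWithIndex.foreach { case (x, i) => arr(i) = x }
    arr
  }
}
```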
Let's look at withScope's source code:
private[spark] def withScope[U](body: => U): U = RDDOperationScope.withScope[U](sc)(body)
See, it is a method. Let's go on:
private[spark] def withScope[T](
    sc: SparkContext,
    allowNesting: Boolean = false)(body: => T): T = {
  val ourMethodName = "withScope"
  val callerMethodName = Thread.currentThread.getStackTrace()
    .dropWhile(_.getMethodName != ourMethodName)
    .find(_.getMethodName != ourMethodName)
    .map(_.getMethodName)
    .getOrElse {
      // Log a warning just in case, but this should almost certainly never happen
      logWarning("No valid method name for this RDD operation scope!")
      "N/A"
    }
  withScope[T](sc, callerMethodName, allowNesting, ignoreParent = false)(body)
}
Let's continue with the withScope called at the end:
private[spark] def withScope[T](
    sc: SparkContext,
    name: String,
    allowNesting: Boolean,
    ignoreParent: Boolean)(body: => T): T = {
  // Save the old scope to restore it later
  val scopeKey = SparkContext.RDD_SCOPE_KEY
  val noOverrideKey = SparkContext.RDD_SCOPE_NO_OVERRIDE_KEY
  val oldScopeJson = sc.getLocalProperty(scopeKey)
  val oldScope = Option(oldScopeJson).map(RDDOperationScope.fromJson)
  val oldNoOverride = sc.getLocalProperty(noOverrideKey)
  try {
    if (ignoreParent) {
      // Ignore all parent settings and scopes and start afresh with our own root scope
      sc.setLocalProperty(scopeKey, new RDDOperationScope(name).toJson)
    } else if (sc.getLocalProperty(noOverrideKey) == null) {
      // Otherwise, set the scope only if the higher level caller allows us to do so
      sc.setLocalProperty(scopeKey, new RDDOperationScope(name, oldScope).toJson)
    }
    // Optionally disallow the child body to override our scope
    if (!allowNesting) {
      sc.setLocalProperty(noOverrideKey, "true")
    }
    body
  } finally {
    // Remember to restore any state that was modified before exiting
    sc.setLocalProperty(scopeKey, oldScopeJson)
    sc.setLocalProperty(noOverrideKey, oldNoOverride)
  }
}
In the end, it evaluates the body parameter. In this case, body is:
{
  val cleanF = sc.clean(f)
  new MapPartitionsRDD[U, T](this, (context, pid, iter) => iter.map(cleanF))
}
In conclusion, withScope is a method that takes a block of code as an argument: it first runs some setup code itself, then runs the block, and finally restores the previous state.
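A stripped-down sketch of that pattern (all names invented here), showing the by-name body evaluated between setup and teardown:

```scala
object ScopeDemo {
  val log = scala.collection.mutable.Buffer[String]()

  // Like Spark's withScope: setup, run the by-name body, always restore
  def withScope[T](name: String)(body: => T): T = {
    log += s"enter $name"          // Spark sets thread-local properties here
    try body                       // body only runs now, not at the call site
    finally log += s"exit $name"   // teardown runs even if body throws
  }

  def doubled(x: Int): Int = withScope("doubled") {
    x * 2
  }
}
```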
A simple function that accepts a File and a function that will be passed a PrintWriter for that file:
def printToFile(f: java.io.File)(op: java.io.PrintWriter => Unit) {
  val p = new java.io.PrintWriter(f)
  try { op(p) } finally { p.close() }
}
How to generalise this to any number of Files, while just passing the resulting PrintWriters to one function? I want to make the decision as to which PrintWriter to use in the client function.
I want a signature similar to (pseudocode):
def printToFile(f: java.io.File*)(op: (java.io.PrintWriter*) => Unit)
Here's how I'd like to write my client function:
printToFile(new File("file1.txt"), new File("file2.txt"), new File("file3.txt")) {
  (file1PrintWriter, file2PrintWriter, file3PrintWriter) =>
    // do stuff, decide which PrintWriter to write to
}
where the cardinalities of both *-ed types are the same.
Importantly, I want the client function to be able to declare the PrintWriter variables it receives and not just have a Seq[PrintWriter] or similar to deal with.
It looks like you need to create as many PrintWriter instances as there are File arguments. The following should work:
import java.io._

object Foo {
  def printToFile(files: File*)(op: (PrintWriter*) => Unit) {
    val printers: Seq[PrintWriter] = files.map(file => new PrintWriter(file))
    try {
      op(printers: _*)
    } finally {
      printers.foreach(_.close())
    }
  }
}
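For a known arity, the client can still get named writers by destructuring the sequence. Here is a self-contained variant, a sketch with the function parameter typed as an explicit Seq and with invented file names:

```scala
import java.io.{File, PrintWriter}

object MultiPrint {
  // Same shape as above, with the function parameter typed as a plain Seq
  def printToFile(files: File*)(op: Seq[PrintWriter] => Unit): Unit = {
    val printers = files.map(f => new PrintWriter(f))
    try op(printers) finally printers.foreach(_.close())
  }

  def run(dir: File): Seq[String] = {
    val f1 = new File(dir, "a.txt")
    val f2 = new File(dir, "b.txt")
    // A pattern-matching function gives each writer a name
    printToFile(f1, f2) { case Seq(w1, w2) =>
      w1.println("first")
      w2.println("second")
    }
    Seq(f1, f2).map(f => scala.io.Source.fromFile(f).getLines().mkString)
  }
}
```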
I need help trying to get my anonymous function to compile in Scala.
See below:
private def mapBlock(helper: Helper): (Any) => Block = {
  (original: Any) => {
    val block = original.asInstanceOf[Block]
    // logic with helper here
    return block
  }
}
However, when I compile this I get "Expression of type block does not conform to expected"
What am I doing wrong here?
The problem is that return block tries to return the value block from the mapBlock method itself, but mapBlock is declared to return a function of type (Any) => Block. To fix this, just remove the return and end the lambda with block:
private def mapBlock(helper: Helper): (Any) => Block = {
  (original: Any) => {
    val block = original.asInstanceOf[Block]
    // logic with helper here
    block
  }
}
If you really want a return, you could name your function value and return that, although in Scala we generally omit returns, so this would not be idiomatic Scala:
private def mapBlock(helper: Helper): (Any) => Block = {
  val function = (original: Any) => {
    val block = original.asInstanceOf[Block]
    // logic with helper here
    block
  }
  return function
}
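The underlying rule is that return inside a lambda always returns from the enclosing method, never from the lambda itself. A sketch (with an invented helper) of the one place that behaviour is actually useful:

```scala
object ReturnDemo {
  // `return` inside the foreach lambda exits firstEven itself; this is a
  // non-local return, implemented with an exception under the hood.
  def firstEven(xs: List[Int]): Option[Int] = {
    xs.foreach { x =>
      if (x % 2 == 0) return Some(x)
    }
    None
  }
}
```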
Why is a return statement required to allow this while statement to be evaluated properly? Consider the following code:
import java.io.File
import java.io.FileInputStream
import java.io.InputStream
import java.io.BufferedReader
import java.io.InputStreamReader

trait Closeable {
  def close()
}

trait ManagedCloseable extends Closeable {
  def use(code: () => Unit) {
    try {
      code()
    } finally {
      this.close()
    }
  }
}

class CloseableInputStream(stream: InputStream)
    extends InputStream with ManagedCloseable {
  def read = stream.read
}

object autoclose extends App {
  implicit def inputStreamToClosable(stream: InputStream): CloseableInputStream =
    new CloseableInputStream(stream)

  override def main(args: Array[String]) {
    val test = new FileInputStream(new File("test.txt"))
    test use {
      val reader = new BufferedReader(new InputStreamReader(test))
      var input: String = reader.readLine
      while (input != null) {
        println(input)
        input = reader.readLine
      }
    }
  }
}
This produces the following error from scalac:
autoclose.scala:40: error: type mismatch;
found : Unit
required: () => Unit
while (input != null) {
^
one error found
It appears that it's attempting to treat the block following use as an inline statement rather than a lambda, but I'm not exactly sure why. Adding return after the while alleviates the error:
test use {
  val reader = new BufferedReader(new InputStreamReader(test))
  var input: String = reader.readLine
  while (input != null) {
    println(input)
    input = reader.readLine
  }
  return
}
And the application runs as expected. Can anyone describe to me what is going
on there exactly? This seems as though it should be a bug. It's been
persistent across many versions of Scala though (tested 2.8.0, 2.9.0, 2.9.1)
That's because use's parameter is declared as () => Unit, so the compiler expects the block you are giving use to evaluate to something that satisfies that signature.
It seems that what you want is to turn the entire block into a by-name parameter; to do so, change def use (code: () => Unit) to def use (code: => Unit).
() => Unit is the type of a Function0 object, and you've required the use expression to be of that type, which it obviously isn't. => Unit is a by-name parameter, which you should use instead.
You might find my answer to this question useful.
To get to the heart of the matter: blocks are not lambdas. A block in Scala is a scope delimiter, nothing more.
If you had written
test use { () =>
  val reader = new BufferedReader(new InputStreamReader(test))
  var input: String = reader.readLine
  while (input != null) {
    println(input)
    input = reader.readLine
  }
}
Then you'd have a function (indicated by () =>) which is delimited by the block.
If use had been declared as
def use (code: => Unit) {
Then the syntax you used would work, but not because of any lambda thingy. That syntax indicates the parameter is passed by name, which, roughly speaking, means you'd take the whole expression passed as parameter (ie, the whole block), and substitute it for code inside the body of use. The type of code would be Unit, not a function, but the parameter would not be passed by value.
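The difference can be sketched side by side (names invented here): a () => Unit argument must be written as an explicit function literal, while a by-name => Unit accepts a bare block, deferring its evaluation until the parameter is used.

```scala
object UseDemo {
  val trace = scala.collection.mutable.Buffer[String]()

  // Takes an explicit Function0: the caller must write `() => ...`
  def useFn(code: () => Unit): Unit = { trace += "open"; code(); trace += "close" }

  // Takes a by-name parameter: the caller may pass a bare block
  def useByName(code: => Unit): Unit = { trace += "open"; code; trace += "close" }

  def run(): List[String] = {
    useFn(() => trace += "body1")
    useByName { trace += "body2" } // a plain block, evaluated inside useByName
    trace.toList
  }
}
```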
return or return expr has the type Nothing. You can substitute this for any type, as it never yields a value to the surrounding expression, instead it returns control to the caller.
In your program, it masquerades as the required type () => Unit.
Here's an occasionally convenient use for that (although you might be tarnished as unidiomatic if you use it too often, don't tell anyone you heard this from me!)
def foo(a: Option[Int]): Int = {
  val aa: Int = a.getOrElse(return 0)
  aa * 2
}
For the record, you should probably write:
def foo(a: Option[Int]): Int =
  a.map(_ * 2).getOrElse(0)
You can get an insight into the mind of the compiler by checking the output of scala -Xprint:typer -e <one-liner>. Add -Ytyper-debug if you like sifting through the reams of output!
scala210 -Ytyper-debug -Xprint:typer -e 'def foo: Any = {val x: () => Any = { return }}'
... elided ...
typed return (): Nothing
adapted return (): Nothing to () => Any,
Let's say I want to make a little wrapper along the lines of:
def wrapper(f: (Any) => Any): Any = {
  println("Executing now")
  val res = f
  println("Execution finished")
  res
}

wrapper {
  println("2")
}
Does this make sense? My wrapper method is obviously wrong, but I think the spirit of what I want to do is possible. Am I right in thinking so? If so, what's the solution? Thanks!
If you want your wrapper method to execute the wrapped method inside itself, you should change the parameter to be 'by name'. This uses the syntax => ResultType.
def wrapper(f: => Any): Any = {
  println("Executing now")
  val res = f
  println("Execution finished")
  res
}
You can now do this,
wrapper {
  println("2")
}
and it will print
Executing now
2
Execution finished
If you want to be able to use the return type of the wrapped function, you can make your method generic:
def wrapper[T](f: => T): T = {
  println("Executing now")
  val res: T = f
  println("Execution finished")
  res
}
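For completeness, a usage sketch (repeating the generic wrapper so the snippet stands alone): the wrapped block's value comes back typed as T.

```scala
object WrapperDemo {
  def wrapper[T](f: => T): T = {
    println("Executing now")
    val res: T = f
    println("Execution finished")
    res
  }

  // The block's last expression becomes the result, inferred as Int here
  val n: Int = wrapper { 40 + 2 }
}
```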
In your case, you are already executing println and then passing its result to your wrapper, while the wrapper expects a function with one argument (Any) that returns Any.
Not sure if this answers your question, but you can use a generic type parameter and accept a function with no arguments that returns that type:
def wrapper[T](f: () => T) = {
  println("Executing now")
  val res = f() // call the function
  println("Execution finished")
  res
}

wrapper {
  () => println("2") // create an anonymous function that will be called
}