How to implement a jQuery Plugin using Scala.js - scala.js

The Scala.js documentation has a nice example of how to implement a jQuery facade that also supports chaining:
@js.native
trait JQueryGreenify extends JQuery {
  def greenify(): this.type = js.native
}

object JQueryGreenify {
  implicit def jq2greenify(jq: JQuery): JQueryGreenify =
    jq.asInstanceOf[JQueryGreenify]
}
However, as far as I understand it, this example assumes that the greenify plugin has already been implemented in JavaScript, e.g. like this:
$.fn.greenify = function() {
  this.css( "color", "green" );
  return this;
};
How would this plugin be implemented in Scala.js?

There are two ways to define a "jQuery plugin" in Scala.js. The first one is a safe and idiomatic Scala-esque way, but it is only for consumption by other Scala.js code. The second is a bit ugly, but it can be used by JavaScript code.
For Scala.js
In Scala, and in Scala.js, the patch-the-prototype thing that actual jQuery plugins do is discouraged. Instead, we use implicit classes:
implicit class JQueryGreenify(val self: JQuery) extends AnyVal {
  def greenify(): self.type = {
    self.css("color", "green")
    self
  }
}
And then you can simply do
jQuery("#some-element").greenify()
when you have JQueryGreenify in scope (as an import, typically).
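For example, one common packaging is to put the implicit class into an object and import its members where needed. A minimal, self-contained sketch, assuming the scala-js-jquery facade library (the object name is illustrative):
import org.scalajs.jquery.{JQuery, jQuery} // assuming the scala-js-jquery facade

object JQueryExtensions {
  // Same implicit class as above, just packaged so it can be imported
  implicit class JQueryGreenify(val self: JQuery) extends AnyVal {
    def greenify(): self.type = {
      self.css("color", "green")
      self
    }
  }
}

// At the call site:
import JQueryExtensions._
jQuery("#some-element").greenify()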
This method does not pollute the actual prototype of jQuery objects. It is a pure-Scala abstraction. It is clean and safe, but that means it cannot be used by JavaScript code.
For JavaScript
If you need to call greenify from JavaScript, you actually need to add a function property on the jQuery.fn object. Since the function needs to use the this parameter, we have to explicitly ascribe it to a js.ThisFunction (see also Access the JS this from scala.js):
js.Dynamic.global.jQuery.fn.greenify = ({ (self: JQuery) =>
  self.css("color", "green")
  self
}: js.ThisFunction)
which is basically a transliteration of the original JavaScript code from the question.
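Once that assignment has run, the plugin can be called like any other jQuery plugin, both from plain JavaScript ($("#some-element").greenify()) and from Scala.js through the dynamic API. A small illustrative sketch (the selector is made up):
// Call the freshly installed plugin through js.Dynamic,
// exactly as JavaScript code would (the selector is illustrative).
js.Dynamic.global.jQuery("#some-element").greenify()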

Related

Spark: Custom operator for JavaPairRDD

I'm trying to declare a custom operator for JavaPairRDD, here is the code:
object CustomOperators {
  implicit class CustomRDDOperator[K: ClassTag, V: ClassTag](rdd: JavaPairRDD[K, V]) {
    def customOp = {
      // logic
    }
  }
}
But I'm not able to call this function from my JavaPairRDD.
I'm very new to Scala, so there is a good chance that I'm doing something fundamentally wrong. Need some guidance.
What would be the best way to add a custom function to JavaPairRDD?
You should just need to add import CustomOperators._ in the file where you are using it. But if you are working from Scala, you shouldn't end up with a JavaPairRDD in the first place (unless you are using a third-party library intended to be used primarily from Java).
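For illustration, here is a minimal, self-contained sketch of that setup; the body of customOp and the Usage object are made up, and the import is the part that matters:
import scala.reflect.ClassTag
import org.apache.spark.api.java.JavaPairRDD

object CustomOperators {
  implicit class CustomRDDOperator[K: ClassTag, V: ClassTag](rdd: JavaPairRDD[K, V]) {
    // Illustrative body only; count() is an existing JavaPairRDD method
    def customOp: Long = rdd.count()
  }
}

object Usage {
  import CustomOperators._ // this import is what makes customOp resolve

  def useIt(pairs: JavaPairRDD[String, Int]): Long =
    pairs.customOp
}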

How do I create "options objects" in Scala.Js?

In idiomatic JavaScript it's common to have a function accept an "options object" as the last parameter. This is where you'd usually put all the optional/seldom-used parameters, e.g.
jQuery.ajax({
  url: "http://www.example.com/foo",
  success: function() {
    // ...
  }
})
The current documentation for Scala.js recommends using a Scala trait to represent the options object, but that leads to a problem when you have to create the options, since you cannot pass an anonymous class into JavaScript code.
How can such option objects be created from Scala code?
We recommend the following (if you choose to create facade types):
@js.native
trait AjaxOptions extends js.Object {
  val url: String = js.native
  val success: js.Function0[Unit] = js.native
}

object AjaxOptions {
  def apply(url: String, success: js.Function0[Unit]): AjaxOptions = {
    js.Dynamic.literal(url = url, success = success).asInstanceOf[AjaxOptions]
  }
}
The advantage is that the type-unsafe cast is contained in a single location. Further, if you ever decide to add/remove fields to/from AjaxOptions, you will (hopefully) think of also adapting the companion's apply method. As a result, the typer will inform you where you have to change your invocations (rather than just having the new field set to undefined).
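For example, a call site could then look like the following sketch (the callback body is illustrative, and the call goes through js.Dynamic so as not to assume a particular ajax facade signature):
val opts = AjaxOptions("http://www.example.com/foo", () => println("done"))
// Hand the options object to jQuery; a typed facade accepting a js.Object works too
js.Dynamic.global.jQuery.ajax(opts)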
Please refer to What is the suggested way to instantiate a js.Object for API wrappers for more.
Since Scala.js has evolved, I'm amending my answer with the current best practice:
At this point, you should use a trait to describe the options object, like this:
trait AjaxOptions extends js.Object {
  var url: String
  var success: js.UndefOr[js.Function0[Unit]] = js.undefined
}
That UndefOr[T] means "this field might contain T, or might be undefined"; note that you are initializing those to js.undefined, so they have a default value.
Then, at the call site, you simply override the values that you need to set:
val ajaxResult = new AjaxOptions {
  var url = "http://www.example.com/foo"
}
Note the curly braces: you're actually creating an anonymous subclass of AjaxOptions here that does what you want (the var on url is needed because url is abstract in the trait). Any fields you don't override (such as success above) are left undefined, so the library will use its default.
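If you also want to set the optional success callback, the same pattern applies. A sketch, reusing the AjaxOptions trait above (the println is purely illustrative):
val fullOptions = new AjaxOptions {
  var url = "http://www.example.com/foo"
  // Ascribing the lambda to js.Function0[Unit] lets the implicit
  // conversion to js.UndefOr kick in for the optional field
  success = ({ () => println("request finished") }: js.Function0[Unit])
}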
Old Answer:
This question is pretty old, but since people are still coming here:
If you have a large and complex options object (as is typical of, say, jQuery UI classes), you may want to build a facade for that using JSOptionBuilder, which is found in the jsext library. JSOptionBuilder isn't quite a panacea, but it's a not-too-much boilerplate mechanism for constructing and using arbitrarily complex options objects.
Here's a method that I've found to work quite well:
val fooOptions = js.Dynamic.literal().asInstanceOf[FooOptions]
fooOptions.url = ...
fooOptions.bar = ...
jsFunc(..., fooOptions)
Of course this assumes that the FooOptions trait has declared the fields as var. If not, you'll have to use
val fooOptions = js.Dynamic.literal(
  url = ...,
  bar = ...
)
jsFunc(..., fooOptions)
but that is less type-safe.
If you're declaring your own options trait, you could also add a companion object with an appropriate apply method:
trait FooOptions extends js.Object {
  var foo: String = ???
  var bar: String = ???
}

object FooOptions {
  def apply(): FooOptions =
    js.Dynamic.literal().asInstanceOf[FooOptions]
}
That'll make calling code a lot prettier:
val fooOptions = FooOptions()
fooOptions.foo = ...
fooOptions.bar = ...
jsFunc(..., fooOptions)

Best Practice to Load Class in Scala

I'm new to Scala (and to functional programming as well), and I'm developing a plugin-based application to learn and study.
I've created a trait to be the interface of a plugin. So when my app starts, it will load all the classes that implement this trait.
trait Plugin {
  def init(config: Properties)
  def execute(parameters: Map[String, Array[String]])
}
In my learning of Scala, I've read that if I want to program in a functional way, I should avoid using vars. Here's my problem:
The init method will be called after the class is loaded, and I will probably want to use values from the config parameter in the execute method.
How do I store this without using a var? Is there a better practice for doing what I want here?
Thanks
There is more to programming in a functional way than just avoiding vars. One key concept is also to prefer immutable objects. In that respect your Plugin API is already breaking functional principles as both methods are only executed for their side-effects. With such an API using vars inside the implementation does not make a difference.
For an immutable plugin instance you could split plugin creation:
trait PluginFactory {
  def createPlugin(config: Properties): Plugin
}

trait Plugin {
  def execute ...
}
Example:
class MyPluginFactory extends PluginFactory {
  def createPlugin(config: Properties): Plugin = {
    val someValue = ... // extract from config
    new MyPlugin(someValue)
  }
}
class MyPlugin(someValue: String) extends Plugin {
  def execute ... // using someValue
}
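To make the lifecycle concrete, a hypothetical host application could wire this up roughly as follows (assuming execute keeps the signature from the question's Plugin trait; PluginHost and the parameters are made up):
import java.util.Properties

object PluginHost {
  def run(config: Properties): Unit = {
    val factory: PluginFactory = new MyPluginFactory
    val plugin: Plugin = factory.createPlugin(config) // the plugin itself stays immutable
    plugin.execute(Map("arg" -> Array("value")))      // illustrative parameters
  }
}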
You can use a val! It's basically the same thing, but the value of a val field cannot be modified later on. If you were using a class, you could write, for example:
class Plugin(val config: Properties) {
  def init {
    // do init stuff...
  }
  def execute = { /* ... */ }
}
Unfortunately, a trait cannot have class parameters. If you want to have a config field in your trait, you won't be able to set its value immediately, so it will have to be a var.

How can I add new methods to a library object?

I've got a class from a library (specifically, com.twitter.finagle.mdns.MDNSResolver). I'd like to extend the class (I want it to return a Future[Set], rather than a Try[Group]).
I know, of course, that I could sub-class it and add my method there. However, I'm trying to learn Scala as I go, and this seems like an opportunity to try something new.
The reason I think this might be possible is the behavior of JavaConverters. The following code:
class Test {
  var lst: Buffer[Nothing] = (new java.util.ArrayList()).asScala
}
does not compile, because there is no asScala method on Java's ArrayList. But if I import some new definitions:
class Test {
  import collection.JavaConverters._
  var lst: Buffer[Nothing] = (new java.util.ArrayList()).asScala
}
then suddenly there is an asScala method. So that looks like the ArrayList class is being extended transparently.
Am I understanding the behavior of JavaConverters correctly? Can I (and should I) duplicate that methodology?
Scala supports something called implicit conversions. Look at the following:
val x: Int = 1
val y: String = x
The second assignment does not work, because String is expected, but Int is found. However, if you add the following into scope (just into scope, can come from anywhere), it works:
implicit def int2String(x: Int): String = "asdf"
Note that the name of the method does not matter.
What is usually done is called the pimp-my-library pattern:
class BetterFoo(x: Foo) {
  def coolMethod() = { ... }
}

implicit def foo2Better(x: Foo) = new BetterFoo(x)
That allows you to call coolMethod on Foo. This is used so often that, since Scala 2.10, you can write:
implicit class BetterFoo(x: Foo) {
  def coolMethod() = { ... }
}
which does the same thing but is obviously shorter and nicer.
So you can do:
implicit class MyMDNSResolver(x: com.twitter.finagle.mdns.MDNSResolver) {
  def awesomeMethod = { ... }
}
And you'll be able to call awesomeMethod on any MDNSResolver, if MyMDNSResolver is in scope.
This is achieved using implicit conversions; this feature allows you to automatically convert one type to another when a method that's not recognised is called.
The pattern you're describing in particular is referred to as "enrich my library", after an article Martin Odersky wrote in 2006. It's still an okay introduction to what you want to do: http://www.artima.com/weblogs/viewpost.jsp?thread=179766
The way to do this is with an implicit conversion. These can be used to define views, and their use to enrich an existing library is called "pimp my library".
I'm not sure whether you need to write a conversion from Try[Group] to Future[Set], or whether you can write one from Try to Future and another from Group to Set and have them compose.

Generating a Scala class automatically from a trait

I want to create a method that generates an implementation of a trait. For example:
trait Foo {
  def a
  def b(i: Int): String
}

object Processor {
  def exec(instance: AnyRef, method: String, params: AnyRef*) = {
    // whatever
  }
}
class Bar {
  def wrap[T] = {
    // Here create a new instance of the implementing class, i.e. if T is Foo,
    // generate a new FooImpl(this)
  }
}
I would like to dynamically generate the FooImpl class like so:
class FooImpl(val wrapped: AnyRef) extends Foo {
  def a = Processor.exec(wrapped, "a")
  def b(i: Int) = Processor.exec(wrapped, "b", i)
}
Manually implementing each of the traits is not something we would like (lots of boilerplate) so I'd like to be able to generate the Impl classes at compile time. I was thinking of annotating the classes and perhaps writing a compiler plugin, but perhaps there's an easier way? Any pointers will be appreciated.
java.lang.reflect.Proxy could do something quite close to what you want:
import java.lang.reflect.{InvocationHandler, Method, Proxy}

class Bar {
  def wrap[T: ClassManifest]: T = {
    val theClass = classManifest[T].erasure.asInstanceOf[Class[T]]
    theClass.cast(
      Proxy.newProxyInstance(
        theClass.getClassLoader(),
        Array(theClass),
        new InvocationHandler {
          def invoke(target: AnyRef, method: Method, params: Array[AnyRef]) =
            // Forward to the wrapping Bar instance, like `wrapped` in FooImpl
            Processor.exec(Bar.this, method.getName, params: _*)
        }))
  }
}
With that, you have no need to generate FooImpl.
A limitation is that it will work only for traits where no methods are implemented. More precisely, if a method is implemented in the trait, calling it will still route to the processor and ignore the implementation.
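For completeness, a hypothetical use of wrap could look like this (assuming Foo from the question, whose methods are all abstract and therefore compile to a plain interface, and a Processor.exec that tolerates the reflective call):
val bar = new Bar
val foo = bar.wrap[Foo] // the ClassManifest[Foo] is materialized implicitly
foo.b(42)               // dispatched through invoke to Processor.exec(..., "b", 42)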
You can write a macro (macros are officially a part of Scala since 2.10.0-M3), something along the lines of Mixing in a trait dynamically. Unfortunately I don't have time right now to compose an example for you, but feel free to ask questions on our mailing list at http://groups.google.com/group/scala-internals.
You can see three different ways to do this in ScalaMock.
ScalaMock 2 (the current release version, which supports Scala 2.8.x and 2.9.x) uses java.lang.reflect.Proxy to support dynamically typed mocks and a compiler plugin to generate statically typed mocks.
ScalaMock 3 (currently available as a preview release for Scala 2.10.x) uses macros to support statically typed mocks.
Assuming that you can use Scala 2.10.x, I would strongly recommend the macro-based approach over a compiler plugin. You can certainly make the compiler plugin work (as ScalaMock demonstrates) but it's not easy and macros are a dramatically superior approach.