Code generation using sbt - Scala

In my Scala source files
File.scala
@casejsTraitNative trait Variables extends js.Object {
  val first: Int
}
case class Model(in: String)
I want to replace traits annotated with @casejsTraitNative with some boilerplate at compile time.
Expected result: File.scala
trait Variables extends js.Object {
  val first: Int
}

object Variables {
  @inline def apply(first: Int): Variables = {
    val p = FunctionObjectNativeMacro()
    p.asInstanceOf[Variables]
  }

  def copy(source: Variables, first: OptionalParam[Int] = OptDefault): Variables = {
    val p = FunctionCopyObjectNativeMacro()
    p.asInstanceOf[Variables]
  }
}

case class Model(in: String)
I can create the expected source string using Scalameta, but I don't know which sbt task I need to hook into to modify the source files before they are passed to the compiler...

As pointed out by Seth Tisue, the way to fix this is to hook into sbt's source generators (the sourceGenerators key) in your project: http://www.scala-sbt.org/1.x/docs/Howto-Generating-Files.html. There you can use Scalameta if you are on sbt 1.x. Otherwise you can't, because Scalameta does not cross-compile to 2.10.x, the Scala version sbt 0.13.x uses.
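A minimal sketch of what that hook could look like in build.sbt (sbt 1.x); rewriteWithScalameta is a hypothetical helper that parses a file with Scalameta and returns the expanded source text:

// build.sbt (sbt 1.x) -- sketch only
Compile / sourceGenerators += Def.task {
  val outDir  = (Compile / sourceManaged).value / "expanded"
  val sources = ((Compile / scalaSource).value ** "*.scala").get
  sources.map { src =>
    val out = outDir / src.getName
    // rewriteWithScalameta: hypothetical Scalameta-based rewrite that keeps the
    // trait and adds the generated companion object
    IO.write(out, rewriteWithScalameta(IO.read(src)))
    out
  }
}.taskValue

Note that in a real build you would also need to exclude the original files from compilation (for example via excludeFilter on unmanaged sources) so the expanded copies do not clash with them.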
Another way of solving this problem is annotation macros. Have a look at http://docs.scala-lang.org/overviews/macros/annotations.html, but that's more complicated.
All in all, I think the best solution is to use Paiges (https://github.com/typelevel/paiges). It's a little bit more constrained than Scalameta, but it should allow you to generate that code and more.
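For a flavour of what that could look like, here is a rough sketch using Paiges' Doc combinators (the shape of the generated apply body is simplified; treat it as an assumption rather than the exact expected output above):

import org.typelevel.paiges.Doc

def companion(name: String, params: String): Doc = {
  val applyDef =
    Doc.text(s"@inline def apply($params): $name = ") +
      Doc.text(s"FunctionObjectNativeMacro().asInstanceOf[$name]")
  // object <name> { <applyDef> } with two-space indentation
  Doc.text(s"object $name {") +
    (Doc.line + applyDef).nested(2) +
    Doc.line + Doc.char('}')
}

println(companion("Variables", "first: Int").render(80))

The rendered string can then be written out by a source generator like the one sketched above.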

Related

Option and null in Scala

If I have the following function:
def getOrNull[T >: Null](f: => T): T = {
  try { f } catch { case _: NullPointerException => null }
}
And I want to use it with Option like so:
val r = Option(getOrNull(someString.split("/")(0)))
I get:
Error:(25, 19) Option.type does not take parameters
What is going on, and how can I overcome this?
You might wonder which Option you are referring to.
From the sbt console, use //print<tab>:
scala> Option //print
scala.Option // : Option.type
For better context:
package nooption

class Option(arg: String) // some other option on class path
object Option

object Test {
  import scala.reflect.internal.util.ScalaClassLoader
  def main(args: Array[String]): Unit = println {
    //Option(null)
    //ScalaClassLoader.originOfClass(classOf[Option])
    ScalaClassLoader.originOfClass(classOf[Option$])
  }
}
The class name for the companion object has a dollar at the end.
Your IDE's "go to definition" can also show you which Option is being picked up.
If you started a REPL at the command line, class files in the current directory are on its class path. If you previously compiled an Option in the default or "empty" package, it will hide scala.Option.
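If that is what happened, removing the stray class files or fully qualifying the name resolves the ambiguity:

// bypass whatever Option is in scope and use the standard library one explicitly
val r = scala.Option(getOrNull(someString.split("/")(0)))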
As noted in the comments, this code does compile OK, but you really shouldn't use null in Scala unless you are interfacing with Java.
This is a better way of implementing your code:
val r = Try{ someString.split("/")(0) }.toOption
Try is widely used in Scala, so this code is clear to anyone experienced with the language, and there is no need for a separate function.
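For completeness, a self-contained version (the only extra piece is the import; the sample input is made up):

import scala.util.Try

val someString = "a/b/c"
val r: Option[String] = Try(someString.split("/")(0)).toOption // Some("a")

// a null receiver or any other non-fatal failure becomes None instead of throwing
val broken: String = null
val none: Option[String] = Try(broken.split("/")(0)).toOption // None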

How to handle different package names in different versions?

I have a 3rd party library with package foo.bar
I normally use it as:
import foo.bar.{Baz => MyBaz}
object MyObject {
val x = MyBaz.getX // some method defined in Baz
}
The new version of the library has renamed the package from foo.bar to newfoo.newbar. I now have another version of my code with a slight change, as follows:
import newfoo.newbar.{Baz => MyBaz}
object MyObject {
val x = MyBaz.getX // some method defined in Baz
}
Notice that only the first import is different.
Is there any way I can keep the same version of my code and still switch between different versions of the 3rd party library as and when needed?
I need something like conditional imports, or an alternative way.
The other answer is on the right track but doesn't really get you all the way there. The most common way to do this kind of thing in Scala is to provide a base compatibility trait that has different implementations for each version. In my little abstracted library, for example, I have the following MacrosCompat for Scala 2.10:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.Context

  def resultType(c: Context)(tpe: c.Type)(implicit
    tag: ClassTag[c.universe.MethodType]
  ): c.Type = {
    import c.universe.MethodType

    tpe match {
      case MethodType(_, res) => resultType(c)(res)
      case other => other
    }
  }
}
And this one for 2.11:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.whitebox.Context

  def resultType(c: Context)(tpe: c.Type): c.Type = tpe.finalResultType
}
And then my classes, traits, and objects that use the macro reflection API can just extend MacrosCompat and they'll get the appropriate Context and an implementation of resultType for the version we're currently building (this is necessary because of changes to the macros API between 2.10 and 2.11).
(This isn't originally my idea or pattern, but I'm not sure who to attribute it to. Probably Eugene Burmako?)
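For illustration, a hypothetical consumer could look like this; the object name and method are made up, only the MacrosCompat pattern itself is from the answer:

package io.travisbrown.abstracted.internal

private[abstracted] object ReturnTypes extends MacrosCompat {
  // compiles against either Context; on 2.10 the wildcard import also brings
  // the implicit ClassTag that resultType requires there into scope
  def describe(c: Context)(tpe: c.Type): String = {
    import c.universe._
    resultType(c)(tpe).toString
  }
}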
If you're using SBT, there's special support for version-specific source trees—you can have a src/main/scala for your shared code and e.g. src/main/scala-2.10 and src/main/scala-2.11 directories for version-specific code, and SBT will take care of the rest.
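The same trick works for library versions rather than Scala versions; a build.sbt sketch (sbt 1.x; the setting name and directory names are assumptions):

val useOldLib = settingKey[Boolean]("compile against the old foo.bar version of the library")

useOldLib := false

Compile / unmanagedSourceDirectories += {
  val base = (Compile / sourceDirectory).value // src/main
  if (useOldLib.value) base / "scala-oldlib" else base / "scala-newlib"
}

Each of the two directories would then hold its own implementation of the compatibility trait (or of the mybar alias object from the answer below).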
You can try to use type aliases:
package myfoo

object mybar {
  type MyBaz = newfoo.newbar.Baz
  // val MyBaz = newfoo.newbar.Baz // if Baz is a case class/object, then it needs to be aliased twice - as a type and as a value
}
Then you can simply import myfoo.mybar._ and swap out the mybar object to switch to a different version of the library.
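Client code then never mentions the library's package directly (this assumes the value alias above is uncommented, since getX is called on the object):

import myfoo.mybar._

object MyObject {
  val x = MyBaz.getX // resolves to whichever Baz the alias points at
}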

Best Practice to Load Class in Scala

I'm new to Scala (and to functional programming as well) and I'm developing a plugin-based application to learn and study.
I've created a trait to be the interface of a plugin, so when my app starts, it will load all the classes that implement this trait.
import java.util.Properties

trait Plugin {
  def init(config: Properties): Unit
  def execute(parameters: Map[String, Array[String]]): Unit
}
In my learning of Scala, I've read that if I want to program in a functional way, I should avoid using var. Here's my problem:
The init method will be called after the class is loaded, and I will probably want to use the values from the config parameter later in the execute method.
How do I store them without using a var? Is there a better practice for what I want to do here?
Thanks
There is more to programming in a functional way than just avoiding vars. One key concept is also to prefer immutable objects. In that respect your Plugin API is already breaking functional principles as both methods are only executed for their side-effects. With such an API using vars inside the implementation does not make a difference.
For an immutable plugin instance you could split plugin creation:
trait PluginFactory {
  def createPlugin(config: Properties): Plugin
}

trait Plugin {
  def execute ...
}
Example:
class MyPluginFactory extends PluginFactory {
  def createPlugin(config: Properties): Plugin = {
    val someValue = ... // extract from config
    new MyPlugin(someValue)
  }
}

class MyPlugin(someValue: String) extends Plugin {
  def execute ... // using someValue
}
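Hypothetical wiring code in the host application (names invented here, and execute is assumed to keep the signature from the question): configuration is consumed exactly once, and the resulting plugin instance is immutable:

val config = new java.util.Properties()
config.setProperty("someValue", "hello")

val plugin: Plugin = new MyPluginFactory().createPlugin(config)
plugin.execute(Map("names" -> Array("a", "b")))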
You can use a val! It's basically the same thing as a var, except that the value of a val field cannot be modified later on. If you were using a class, you could write, for example:
class Plugin(val config: Properties) {
  def init {
    // do init stuff...
  }
  def execute = // ...
}
Unfortunately, a trait cannot have class parameters. If you want to have a config field in your trait, you won't be able to set its value immediately, so it will have to be a var.

Is there any way to create a new Scala object from a Java Class

I have a number of use cases for this, all around the idea of interop between existing Java libraries and new Scala code. The use case I've selected is, I think, the easiest.
Use Case:
I'm working on providing a JUnit Runner for some Scala tests (so that I can get my lovely red/green bar in Eclipse).
The runner needs to have a constructor with a Java class as a parameter. So in Scala I can do the following:
class MyRunner(val clazz: Class[Any]) extends Runner {
  def getDescription(): Description
  def run(notifier: RunNotifier)
}
When I use either
@RunWith(MyRunner)
object MyTestObject
or
@RunWith(MyRunner)
class MyTestClass
then the runner is indeed instantiated correctly and is passed a suitable class object.
Unfortunately, what I want to do now is to "get hold of" the object MyTestObject, or create a MyTestClass, which are both Scala entities. I would prefer to use Scala reflection, but I also want to use the standard JUnit jar.
What I have done
The following Stack Overflow questions were educational, but they do not cover the same problem. They were the nearest questions I could find:
How to create a TypeTag manually?
Any way to obtain a Java class from a Scala (2.10) type tag or symbol?
Using Scala reflection with Java reflection
The discussion on Environments, Universes and Mirrors in http://docs.scala-lang.org/overviews/reflection/environment-universes-mirrors.html was good, and the similar documents on other parts of Scala reflection also helped. Mostly, though, it is about Scala reflection itself.
I browsed the Scaladocs, but my knowledge of Scala reflection wasn't enough (yet) to let me get what I wanted out of them.
Edit:
As asked, here is the code of the class that is being created by reflection:
@RunWith(classOf[MyRunner])
object Hello2 extends App {
  println("starting")
  val x = "xxx"
}
The interesting thing is that the solution proposed below, using the field called MODULE$, doesn't print anything, and the value of x is null.
This solution works fine if you want to use plain old Java reflection. I'm not sure whether you can use Scala reflection, given that all you will have is a Class[_] to work with:
object ReflectTest {

  def main(args: Array[String]): Unit = {
    val fooObj = instantiate(MyTestObject.getClass())
    println(fooObj.foo)
    val fooClass = instantiate(classOf[MyTestClass])
    println(fooClass.foo)
  }

  def instantiate(clazz: Class[_]): Foo = {
    val declaredFields = clazz.getDeclaredFields().toList
    // A Scala object compiles to a class whose static MODULE$ field holds the
    // singleton instance; if that field exists, return it, otherwise create a
    // new instance of the class.
    val obj = declaredFields.find(field => field.getName() == "MODULE$") match {
      case Some(modField) => modField.get(clazz)
      case None => clazz.newInstance()
    }
    obj.asInstanceOf[Foo]
  }
}
trait Foo {
  def foo: String
}

object MyTestObject extends Foo {
  def foo = "bar"
}

class MyTestClass extends Foo {
  def foo = "baz"
}
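For what it's worth, Scala runtime reflection (2.10+) can also resolve the singleton from a bare Class[_]; a sketch, not part of the original answer:

import scala.reflect.runtime.{universe => ru}

def instanceFor(clazz: Class[_]): Any = {
  val mirror = ru.runtimeMirror(clazz.getClassLoader)
  if (clazz.getName.endsWith("$")) // runtime class of a Scala object
    mirror.reflectModule(mirror.moduleSymbol(clazz)).instance
  else
    clazz.newInstance()
}

// instanceFor(MyTestObject.getClass).asInstanceOf[Foo].foo  // "bar"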

Generating a Scala class automatically from a trait

I want to create a method that generates an implementation of a trait. For example:
trait Foo {
  def a
  def b(i: Int): String
}

object Processor {
  def exec(instance: AnyRef, method: String, params: AnyRef*) = {
    //whatever
  }
}

class Bar {
  def wrap[T] = {
    // Here create a new instance of the implementing class, i.e. if T is Foo,
    // generate a new FooImpl(this)
  }
}
I would like to dynamically generate the FooImpl class like so:
class FooImpl(val wrapped: AnyRef) extends Foo {
  def a = Processor.exec(wrapped, "a")
  def b(i: Int) = Processor.exec(wrapped, "b", i)
}
Manually implementing each of the traits is not something we would like (lots of boilerplate), so I'd like to be able to generate the Impl classes at compile time. I was thinking of annotating the classes and perhaps writing a compiler plugin, but perhaps there's an easier way? Any pointers would be appreciated.
java.lang.reflect.Proxy could do something quite close to what you want:
import java.lang.reflect.{InvocationHandler, Method, Proxy}

class Bar {
  def wrap[T: ClassManifest]: T = {
    val theClass = classManifest[T].erasure.asInstanceOf[Class[T]]
    theClass.cast(
      Proxy.newProxyInstance(
        theClass.getClassLoader(),
        Array(theClass),
        new InvocationHandler {
          // every call made through the proxy is routed to the Processor
          def invoke(target: AnyRef, method: Method, params: Array[AnyRef]) =
            Processor.exec(this, method.getName, params: _*)
        }))
  }
}
With that, you have no need to generate FooImpl.
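A hypothetical call site (this assumes Processor.exec returns a value compatible with the invoked method's result type):

val bar = new Bar
val foo: Foo = bar.wrap[Foo] // dynamic proxy, no hand-written FooImpl
foo.b(42)                    // routed to Processor.exec(_, "b", 42)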
A limitation is that it will work only for traits where no methods are implemented. More precisely, if a method is implemented in the trait, calling it will still route to the processor and ignore the implementation.
You can write a macro (macros are officially a part of Scala since 2.10.0-M3), something along the lines of Mixing in a trait dynamically. Unfortunately, right now I don't have time to compose an example for you, but feel free to ask questions on our mailing list at http://groups.google.com/group/scala-internals.
You can see three different ways to do this in ScalaMock.
ScalaMock 2 (the current release version, which supports Scala 2.8.x and 2.9.x) uses java.lang.reflect.Proxy to support dynamically typed mocks and a compiler plugin to generate statically typed mocks.
ScalaMock 3 (currently available as a preview release for Scala 2.10.x) uses macros to support statically typed mocks.
Assuming that you can use Scala 2.10.x, I would strongly recommend the macro-based approach over a compiler plugin. You can certainly make the compiler plugin work (as ScalaMock demonstrates) but it's not easy and macros are a dramatically superior approach.