Does Scala have an equivalent to Haskell's undefined?

When coding in Haskell it is helpful to define function results as "undefined" while you write the skeleton of your application. That way the executable compiles and lets you work on the other parts/cases that currently need your attention.
Is there any equivalent thing in Scala? I'd like to write something similar to:
def notAbleToWriteThisYet = undefined

def notAbleToWriteThisYet = sys.error("todo")
Also see this thread on the mailing list.
Scala 2.10.0-M1:
def notAbleToWriteThisYet = ???
(defined in Predef.scala as def ??? : Nothing = throw new NotImplementedError.)
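For example, a skeleton like this compiles fine and only fails when a stubbed method is actually called at runtime (the object and method names are just illustrative):
object Skeleton {
  def parseConfig(path: String): Map[String, String] = ???  // not written yet
  def run(config: Map[String, String]): Unit = ???          // not written yet

  def main(args: Array[String]): Unit =
    run(parseConfig("app.conf"))  // compiles; throws scala.NotImplementedError when executed
}
This works because ??? has type Nothing, which conforms to any return type.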

Related

How to obtain a tree for a higher-kinded type parameter in a scala macro

I'm trying to write a macro to simplify some monad-related code (I'm using cats 1.6.0 for the Monads). For now I just want to be able to write lift[F](a) where F is a unary type constructor, and have that expand to a.pure[F]. Seems simple enough, but I can't get it working.
For now I have this code to help with type inference:
import scala.language.experimental.macros

object Macros {
  class LiftPartiallyApplied[F[_]] {
    def apply[A](a: A): F[A] = macro MacroImpl.liftImpl[F, A]
  }
  def lift[F[_]] = new LiftPartiallyApplied[F]
}
And for the actual implementation of the macro:
import scala.reflect.macros.blackbox

object MacroImpl {
  def liftImpl[F[_], A](c: blackbox.Context)(a: c.Tree)(implicit tt: c.WeakTypeTag[F[_]]): c.Tree = {
    import c.universe._
    q"$a.pure[${tt.tpe.typeConstructor}]"
  }
}
Now I can call the macro like this: lift[List](42), and it'll expand to 42.pure[List], great. But when I call it with a more complicated type, like lift[({type F[A] = Either[String, A]})#F](42), it'll expand to 42.pure[Either], which is obviously broken, because Either is a binary type constructor and not a unary one. The problem is I just don't know what to put there instead of ${tt.tpe.typeConstructor}…
// edit: since people apparently have trouble reproducing the problem, I've made a complete repository:
https://github.com/mberndt123/macro-experiment
I will now try to figure out what the difference between Dmytro's and my own project is.
Don't put Main and Macros in the same compilation unit.
But when I call it with a more complicated type, like lift[({type F[A] = Either[String, A]})#F](42), It'll expand to 42.pure[Either]
Can't reproduce.
For me lift[List](42) produces (with scalacOptions += "-Ymacro-debug-lite")
Warning:scalac: 42.pure[List]
TypeApply(Select(Literal(Constant(42)), TermName("pure")), List(TypeTree()))
at compile time and List(42) at runtime.
lift[({ type F[A] = Either[String, A] })#F](42) produces
Warning:scalac: 42.pure[[A]scala.util.Either[String,A]]
TypeApply(Select(Literal(Constant(42)), TermName("pure")), List(TypeTree()))
at compile time and Right(42) at runtime.
This is my project https://gist.github.com/DmytroMitin/334c230a4f2f1fd3fe9e7e5a3bb10df5
Why do you need macros? Why can't you write
import cats.Applicative
import cats.syntax.applicative._
class LiftPartiallyApplied[F[_]: Applicative] {
  def apply[A](a: A): F[A] = a.pure[F]
}
def lift[F[_]: Applicative] = new LiftPartiallyApplied[F]
?
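For what it's worth, that non-macro version is called the same way; a quick sketch assuming the definitions above are in scope and cats is on the classpath:
import cats.implicits._   // brings the Applicative instances into scope

val xs: List[Int]   = lift[List](42)    // List(42)
val o:  Option[Int] = lift[Option](42)  // Some(42)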
Alright, I found out what the problem was.
Macros need to be compiled separately from their use sites. I thought this meant that Macros had to be compiled separately from MacroImpl, so I put those in separate sbt subprojects and called the macro in the project where Macros is defined. But what it in fact means is that calls to the macro need to be compiled separately from its definition. So I put MacroImpl and Macros in one subproject, called the macro from another, and it worked perfectly.
Thanks to Dmytro for taking the time to demonstrate how to do it right!
// edit: looks like Dmytro beat me to it with his comment :-)
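For reference, a minimal build.sbt along those lines might look like this (project names and versions are only illustrative):
lazy val macros = (project in file("macros"))   // contains Macros and MacroImpl
  .settings(
    libraryDependencies ++= Seq(
      "org.scala-lang" % "scala-reflect" % scalaVersion.value,
      "org.typelevel" %% "cats-core" % "1.6.0"
    )
  )

lazy val app = (project in file("app"))          // contains the lift[...](...) call sites
  .dependsOn(macros)                              // so macro calls compile after the macro definition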

Can Scala infer the actual type from the return type actually expected by the caller?

I have the following question. Our project has a lot of test code in Scala, and there is a lot of code that fills in fields like this:
production.setProduct(new Product)
production.getProduct.setUuid("b1253a77-0585-291f-57a4-53319e897866")
production.setSubProduct(new SubProduct)
production.getSubProduct.setUuid("89a877fa-ddb3-3009-bb24-735ba9f7281c")
Eventually I grew tired of this code. Since all those fields are actually subclasses of a base class that has the uuid field, after thinking a while I wrote an auxiliary function like this:
def createUuid[T <: GenericEntity](uuid: String)(implicit m: Manifest[T]): T = {
  val constructor = m.runtimeClass.getConstructors()(0)
  val instance = constructor.newInstance().asInstanceOf[T]
  instance.setUuid(uuid)
  instance
}
Now my code is half as long, since I can write something like this:
production.setProduct(createUuid[Product]("b1253a77-0585-291f-57a4-53319e897866"))
production.setSubProduct(createUuid[SubProduct]("89a877fa-ddb3-3009-bb24-735ba9f7281c"))
That's good, but I am wondering if I could somehow implement createUuid so the last bit would look like this:
// Is that really possible?
production.setProduct(createUuid("b1253a77-0585-291f-57a4-53319e897866"))
production.setSubProduct(createUuid("89a877fa-ddb3-3009-bb24-735ba9f7281c"))
Can the Scala compiler guess that setProduct expects not just a generic entity, but actually something like Product (or its subclass)? Or is there no way in Scala to make this even shorter?
The Scala compiler won't infer/propagate the type outside-in. You could, however, create implicit conversions like:
import scala.language.implicitConversions

implicit def stringToSubProduct(uuid: String): SubProduct = {
  val n = new SubProduct
  n.setUuid(uuid)
  n
}
and then just call
production.setSubProduct("89a877fa-ddb3-3009-bb24-735ba9f7281c")
and the compiler will automatically use the stringToSubProduct because it has applicable types on the input and output.
Update: To have the code better organized I suggest wrapping the implicit defs in a companion object, like:
case class EntityUUID(uuid: String) {
  require(uuid.matches("[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}")) // possible uuid format check
}
object EntityUUID {
  implicit def toProduct(e: EntityUUID): Product = {
    val p = new Product
    p.setUuid(e.uuid)
    p
  }
  implicit def toSubProduct(e: EntityUUID): SubProduct = {
    val p = new SubProduct
    p.setUuid(e.uuid)
    p
  }
}
and then you'd do
production.setProduct(EntityUUID("b1253a77-0585-291f-57a4-53319e897866"))
so anyone reading this could have an intuition where to find the conversion implementation.
Regarding your comment about some generic approach (having 30 types), I won't say it's not possible, but I just don't see how to do it. The reflection you used bypasses the type system. If all 30 cases are the same piece of code, maybe you should reconsider your object design. You can still implement the 30 implicit defs by calling some method that uses reflection, similar to what you have provided (see the sketch below), but then you have the option to change that in the future in just this one place (or those 30 one-liners).
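A rough illustration of that last point, using the same Manifest-based reflection as the question's createUuid (names like EntityConversions are just for the sketch):
import scala.language.implicitConversions

object EntityConversions {
  // the reflection lives in one place; swap it out later without touching call sites
  private def fromUuid[T <: GenericEntity](uuid: String)(implicit m: Manifest[T]): T = {
    val instance = m.runtimeClass.getConstructors()(0).newInstance().asInstanceOf[T]
    instance.setUuid(uuid)
    instance
  }

  implicit def stringToProduct(uuid: String): Product = fromUuid[Product](uuid)
  implicit def stringToSubProduct(uuid: String): SubProduct = fromUuid[SubProduct](uuid)
  // ... one short line per entity type, 30 times
}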

How to compile/eval a Scala expression at runtime?

New to Scala and looking for pointers to an idiomatic solution, if there is one.
I'd like to have arbitrary user-supplied Scala functions (which are allowed to reference functions/classes I have defined in my code) applied to some data.
For example: I have foo(s: String): String and bar(s: String): String functions defined in my myprog.scala. The user runs my program like this:
$ scala myprog data.txt --func='(s: String) => foo(bar(s)).reverse'
This would run line by line through the data file and emit the result of applying the user-specified function to that line.
For extra points, can I ensure that there are no side-effects in the user-defined function? If not, can I restrict the function to use only a restricted subset of functions (which I can assure to be safe)?
@kenjiyoshida has a nice gist that shows how to eval Scala code. Note that when using Eval from that gist, not specifying a return type will result in a runtime failure when Scala defaults to inferring Nothing.
scala> Eval("println(\"Hello\")")
Hello
java.lang.ClassCastException: scala.runtime.BoxedUnit cannot be cast to scala.runtime.Nothing$
... 42 elided
vs
scala> Eval[Unit]("println(\"Hello\")")
Hello
It nicely handles whatever's in scope as well.
import java.io.File
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

object Thing {
  val thing: Int = 5
}
object Eval {
  def apply[A](string: String): A = {
    val toolbox = currentMirror.mkToolBox()
    val tree = toolbox.parse(string)
    toolbox.eval(tree).asInstanceOf[A]
  }
  def fromFile[A](file: File): A =
    apply(scala.io.Source.fromFile(file).mkString(""))
  def fromFileName[A](file: String): A =
    fromFile(new File(file))
}
object Thing2 {
  val thing2 = Eval[Int]("Thing.thing") // 5
}
Twitter's util package used to have util-eval, but that seems to have been deprecated now (and also triggers a compiler bug when compiled).
As for the second part of your question, the answer seems to be no. Even if you disable default Predef and imports yourself, a user can always get to those functions with the fully qualified package name. You could perhaps use Scala's scala.tools.reflect.ToolBox to first parse your string and then compare against a whitelist, before passing to eval, but at that point things could get pretty hairy since you'll be manually writing code to sanitize the Scala AST (or at the very least reject dangerous input). It definitely doesn't seem to be an "idiomatic solution."
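A very rough sketch of that parse-then-inspect idea (the blacklist here is purely illustrative and nowhere near a real sandbox; scala-compiler must be on the classpath for the ToolBox):
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

object GuardedEval {
  private val toolbox = currentMirror.mkToolBox()
  // names we refuse to see anywhere in the user code; a real sandbox would need far more than this
  private val forbidden = Set("java.io", "java.nio", "sys", "System", "Runtime")

  def apply[A](code: String): A = {
    val tree = toolbox.parse(code)
    val rendered = tree.toString  // crude: inspect the parsed tree's rendering
    require(!forbidden.exists(f => rendered.contains(f)), "code references a forbidden API")
    toolbox.eval(tree).asInstanceOf[A]
  }
}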
This should be possible by using the standard Java JSR 223 Scripting Engine.
See https://issues.scala-lang.org/browse/SI-874
(It also mentions using scala.tools.nsc.Interpreter, but I'm not sure that is still available.)
import javax.script.*;
ScriptEngine e = new ScriptEngineManager().getEngineByName("scala");
e.getContext().setAttribute("label", new Integer(4), ScriptContext.ENGINE_SCOPE);
try {
  e.eval("println(2+label)");
} catch (ScriptException ex) {
  ex.printStackTrace();
}
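Roughly the same thing from Scala, assuming the Scala script engine (shipped with scala-compiler) is actually registered on the classpath; the cast is there because the binding may come through typed as Object:
import javax.script.{ScriptContext, ScriptEngineManager, ScriptException}

val engine = new ScriptEngineManager().getEngineByName("scala")
engine.getContext.setAttribute("label", Int.box(4), ScriptContext.ENGINE_SCOPE)
try engine.eval("println(2 + label.asInstanceOf[Int])")
catch { case ex: ScriptException => ex.printStackTrace() }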

How to getClass implicitly

I'm writing a Scala wrapper over some Java code with a method signature like method(cl: Class[_], name: String), and the many getClass calls in the code don't look good:
Creator.create(getClass, "One")
Creator.create(getClass, "Two")
Creator.create(getClass, "Three")
Creator.create(getClass, "Four")
So can we somehow get the enclosing class implicitly, like Creator.create("some name")?
Answer 1.
In general I strongly discourage reflection. However, if you really want to do it, in Scala 2.9.1 you can use Manifest:
def create[T: Manifest](name: String) = {
  val klass: Class[_] = manifest[T].erasure
}
In Scala 2.10 you should have a look at TypeTag (or ClassTag).
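For example, on 2.10+ the same idea with ClassTag would look roughly like this (Creator is the Java class from the question; the rest of the names are illustrative):
import scala.reflect.ClassTag

// sketch only: wraps the Java-style Creator.create(cl: Class[_], name: String)
def create[T](name: String)(implicit ct: ClassTag[T]) =
  Creator.create(ct.runtimeClass, name)

// create[MyService]("One")   // MyService standing in for the enclosing class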
Answer 2.
However, as I already said, the right approach would be not to use classes at all but implicit builders:
trait Builder[T] {
  def build(name: String): T
}
def create[T](name: String)(implicit builder: Builder[T]) = {
  builder.build(name)
}
That way you can limit what can be created by importing only the right implicits into scope, you won't rely on reflection, and you won't risk getting horrible RuntimeExceptions.
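A quick usage sketch of that approach (Widget is a made-up type):
case class Widget(name: String)

implicit val widgetBuilder: Builder[Widget] = new Builder[Widget] {
  def build(name: String): Widget = Widget(name)
}

val w: Widget = create[Widget]("hello")  // Widget("hello")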
Post comment answer
If your point is to avoid calling getClass at each invocation, you can do the following
def create(name: String)(implicit klass: Class[_]): Unit = {
  // use klass (the implicitly supplied Class[_]) and name here
}
And you can call it this way
implicit val klass1 = myInstance.getClass
create("hello")
create("goodbye")

Scala and Python's pass

I was wondering, is there an equivalent of Python's pass statement? The idea is to write method signatures without implementations and compile them, just to type-check those signatures while prototyping a library. I was able to more or less simulate such behavior using this:
def pass[A]:A = {throw new Exception("pass"); (new Object()).asInstanceOf[A]}
now when i write:
def foo():Int = bar()
def bar() = pass[Int]
it works (it type-checks but explodes at runtime, which is fine), but my implementation doesn't feel right (for example the use of java.lang.Object). Is there a better way to simulate such behavior?
In Scala 2.10, there is the ??? method in Predef.
scala> ???
scala.NotImplementedError: an implementation is missing
at scala.Predef$.$qmark$qmark$qmark(Predef.scala:252)
...
In 2.9, you can define your own one like this:
def ???[A]:A = throw new Exception("not implemented")
If you use this version without an explicit type parameter, A will be inferred to be Nothing.
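With that in place, the example from the question reduces to:
def ???[A]: A = throw new Exception("not implemented")

def foo(): Int = bar()
def bar(): Int = ???  // type-checks; throws only when bar() is actually called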