Generating import statements with scala macros - scala

I have the following code:
#mymacro #imports
val _1 = { import scala.collection.mutable.ListBuffer }
#mymacro
val _2 = { val yy: ListBuffer[Int] = ListBuffer.empty }
#mymacro is a Scala macro that checks whether it has been annotated with the #imports annotation. Part of the implementation is as follows:
case q"${mods: Modifiers} val $tname: ${tpt: Tree} = ${expr: Tree}" :: Nil =>
  if (tname.toString().startsWith("_"))
    if (checkImports(mods, expr)) {
      q"import scala.collection.mutable.ListBuffer"
    }
    else
      q"{$expr}"
Currently the macro is able to transform the whole val _1 = ... statement into import scala.collection.mutable.ListBuffer (without the {} brackets!). But when compilation continues, I keep getting the not found: type ListBuffer compilation error. Now I wonder whether it is possible to fix this error without having to define the import statement at the top of the file.
I am using the Scala 2.10 macro paradise plugin.
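For reference, the scoping rule behind the error can be reproduced without any macros: an import inside a block is only visible until that block ends, so a generated import has to land in the same scope as the code that uses it. A minimal sketch (the object name is mine, for illustration only):

```scala
object ImportScopeDemo {
  // An import inside a block dies with the block:
  val _1 = { import scala.collection.mutable.ListBuffer }

  // So this would fail here with "not found: type ListBuffer":
  // val yy: ListBuffer[Int] = ListBuffer.empty

  // Placing the import and its use in the same block compiles fine:
  val _2 = {
    import scala.collection.mutable.ListBuffer
    val yy: ListBuffer[Int] = ListBuffer.empty
    yy
  }
}
```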

Related

How to fix the problem "Fatal error: java.lang.Object is missing called from core module analyzer" in ScalaJS Linking

I'm trying to defer ScalaJS Linking to runtime, which allows multi-stage compilation to be more flexible and less dependent on sbt.
The setup looks like this:
Instead of using scalajs-sbt plugin, I chose to invoke scalajs-compiler directly as a scala compiler plugin:
scalaCompilerPlugins("org.scala-js:scalajs-compiler_${vs.scalaV}:${vs.scalaJSV}")
This can successfully generate the "sjsir" files under project output directory, but no further.
Use the solution in this post:
Build / Compile latest ScalaJS (1.3+) using gradle on a windows machine?
("Linking scala.js yourself") to invoke the linker on all the compiled sjsir files and produce js files. This is my implementation:
In the compile-time & runtime dependencies, add the Scala.js basics and scalajs-linker:
bothImpl("org.scala-js:scalajs-library_${vs.scalaBinaryV}:${vs.scalaJSV}")
bothImpl("org.scala-js:scalajs-linker_${vs.scalaBinaryV}:${vs.scalaJSV}")
bothImpl("org.scala-js:scalajs-dom_${vs.scalaJSSuffix}:2.1.0")
Write the following code:
import org.scalajs.linker.interface.{Report, StandardConfig}
import org.scalajs.linker.{PathIRContainer, PathOutputDirectory, StandardImpl}
import org.scalajs.logging.{Level, ScalaConsoleLogger}
import java.nio.file.{Path, Paths}
import java.util.Collections
import scala.concurrent.duration.Duration
import scala.concurrent.{Await, ExecutionContext}
object JSLinker {

  implicit def gec = ExecutionContext.global

  def link(classpath: Seq[Path], outputDir: Path): Report = {
    val logger = new ScalaConsoleLogger(Level.Warn)
    val linkerConfig = StandardConfig() // look at the API of this, lots of options.
    val linker = StandardImpl.linker(linkerConfig)

    // Same as scalaJSModuleInitializers in sbt, add if needed.
    val moduleInitializers = Seq()

    val cache = StandardImpl.irFileCache().newCache
    val result = PathIRContainer
      .fromClasspath(classpath)
      .map(_._1)
      .flatMap(cache.cached _)
      .flatMap(linker.link(_, moduleInitializers, PathOutputDirectory(outputDir), logger))

    Await.result(result, Duration.Inf)
  }

  def linkClasses(outputDir: Path = Paths.get("./")): Report = {
    import scala.jdk.CollectionConverters._
    val cl = Thread.currentThread().getContextClassLoader
    val resources = cl.getResources("")
    val rList = Collections.list(resources).asScala.toSeq.map { v =>
      Paths.get(v.toURI)
    }
    link(rList, outputDir)
  }

  lazy val linkOnce = {
    linkClasses()
  }
}
The resource detection was successful; all roots containing sjsir files were detected:
rList = {$colon$colon#1629} "::" size = 4
0 = {UnixPath#1917} "/home/peng/git-scaffold/scaffold-gradle-kts/build/classes/scala/test"
1 = {UnixPath#1918} "/home/peng/git-scaffold/scaffold-gradle-kts/build/classes/scala/testFixtures"
2 = {UnixPath#1919} "/home/peng/git-scaffold/scaffold-gradle-kts/build/classes/scala/main"
3 = {UnixPath#1920} "/home/peng/git-scaffold/scaffold-gradle-kts/build/resources/main"
But linking still fails:
Fatal error: java.lang.Object is missing
called from core module analyzer
There were linking errors
org.scalajs.linker.interface.LinkingException: There were linking errors
at org.scalajs.linker.frontend.BaseLinker.reportErrors$1(BaseLinker.scala:91)
at org.scalajs.linker.frontend.BaseLinker.$anonfun$analyze$5(BaseLinker.scala:100)
at scala.concurrent.impl.Promise$Transformation.run$$$capture(Promise.scala:467)
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala)
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1402)
at java.util.concurrent.ForkJoinTask.doExec$$$capture(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
I wonder what this error message entails. Clearly java.lang.Object is not compiled into sjsir. Does this error message make sense? How do I fix it?
Thanks to #sjrd I now have the correct runtime compilation stack. There were two problems in my old settings:
It turns out that cl.getResources("") is indeed not able to discover the full classpath, so I switched to the system property java.class.path, which contains the classpaths of all dependencies.
moduleInitializers has to be set manually to point to a main method, which will be invoked when the js function is called.
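The java.class.path fix can be sketched in isolation. Note that the code below in getClassPaths splits on ':', which is Unix-specific; java.io.File.pathSeparatorChar also works on Windows. The helper name ClassPathUtil is mine, not from the original:

```scala
import java.io.File
import java.nio.file.{Path, Paths}

object ClassPathUtil {
  // Split the JVM's class path on the platform separator
  // (':' on Unix, ';' on Windows).
  def classPathEntries: Seq[Path] =
    System.getProperty("java.class.path")
      .split(File.pathSeparatorChar)
      .filter(_.nonEmpty)
      .map(Paths.get(_))
      .toSeq
}
```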
After correcting them, the compilation class becomes:
import org.scalajs.linker.interface.{ModuleInitializer, Report, StandardConfig}
import org.scalajs.linker.{PathIRContainer, PathOutputDirectory, StandardImpl}
import org.scalajs.logging.{Level, ScalaConsoleLogger}
import java.nio.file.{Files, Path, Paths}
import scala.concurrent.duration.Duration
import scala.concurrent.{Await, ExecutionContext, ExecutionContextExecutor}
object JSLinker {

  implicit def gec: ExecutionContextExecutor = ExecutionContext.global

  val logger = new ScalaConsoleLogger(Level.Info) // TODO: cannot be lazy val, why?

  lazy val linkerConf: StandardConfig = {
    StandardConfig()
  } // look at the API of this, lots of options.

  def link(classpath: Seq[Path], outputDir: Path): Report = {
    val linker = StandardImpl.linker(linkerConf)

    // Same as scalaJSModuleInitializers in sbt, add if needed.
    val moduleInitializers = Seq(
      ModuleInitializer.mainMethodWithArgs(SlinkyHelloWorld.getClass.getName.stripSuffix("$"), "main")
    )

    Files.createDirectories(outputDir)
    val cache = StandardImpl.irFileCache().newCache
    val result = PathIRContainer
      .fromClasspath(classpath)
      .map(_._1)
      .flatMap(cache.cached _)
      .flatMap { v =>
        linker.link(v, moduleInitializers, PathOutputDirectory(outputDir), logger)
      }

    Await.result(result, Duration.Inf)
  }

  def linkClasses(outputDir: Path = Paths.get("./ui/build/js")): Report = {
    val rList = getClassPaths
    link(rList, outputDir)
  }

  def getClassPaths: Seq[Path] = {
    val str = System.getProperty("java.class.path")
    val paths = str.split(':').map { v =>
      Paths.get(v)
    }
    paths
  }

  lazy val linkOnce: Report = {
    val report = linkClasses()
    logger.info(
      s"""
         |=== [Linked] ===
         |${report.toString()}
         |""".stripMargin
    )
    report
  }
}
This is all it takes to convert sjsir artefacts to a single main.js file.

compiler crash when I use macros and playframework

I wrote a macro that parses JSON into a matching case class.
def parse(jsonTree: JsValue): BaseType = macro parserImpl

def parserImpl(c: blackbox.Context)(jsonTree: c.Tree) = {
  import c.universe._
  val q"$json" = jsonTree
  val cases = List("X", "Y").map { caseClassName =>
    val caseClass = c.parse(caseClassName)
    val reader = c.parse(s"JSONHelp.${caseClassName}_reads")
    val y = cq"""$caseClassName => (($json \ "config").validate[$caseClass]($reader)).get"""
    println(showCode(y))
    y
  }.toList
  val r =
    q"""
    import play.api.libs.json._
    import JSONHelp._
    println($json)
    ($json \ "type").as[String] match { case ..$cases }
    """
  println(showCode(r))
  r
}
The following is the code it generates (printed by the last println):
{
  import play.api.libs.json._;
  import JSONHelp._;
  println(NodeParser.this.json);
  NodeParser.this.json.\("type").as[String] match {
    case "X" => NodeParser.this.json.\("config").validate[X](JSONHelp.X_reads).get
    case "Y" => NodeParser.this.json.\("config").validate[Y](JSONHelp.Y_reads).get
  }
}
The compilation of the subproject containing the macro definition works fine. But when I compile the project that uses the macro (with sbt 0.13.11 and Scala 2.11.8), I get the following error:
java.lang.NullPointerException
at play.routes.compiler.RoutesCompiler$GeneratedSource$.unapply(RoutesCompiler.scala:37)
at play.sbt.routes.RoutesCompiler$$anonfun$11$$anonfun$apply$2.isDefinedAt(RoutesCompiler.scala:180)
at play.sbt.routes.RoutesCompiler$$anonfun$11$$anonfun$apply$2.isDefinedAt(RoutesCompiler.scala:179)
at scala.Option.collect(Option.scala:250)
at play.sbt.routes.RoutesCompiler$$anonfun$11.apply(RoutesCompiler.scala:179)
at play.sbt.routes.RoutesCompiler$$anonfun$11.apply(RoutesCompiler.scala:178)
I'm not a Play user, but I see that it wants tree positions with a source file:
val routesPositionMapper: Position => Option[Position] = position => {
  position.sourceFile collect {
    case GeneratedSource(generatedSource) => {
It's typical to use atPos(pos)(tree). You might assume the incoming tree.pos for synthetic trees.
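A sketch of that advice applied to a macro shaped like parserImpl above. It is untested, the method name positionedParserImpl is mine, and it uses c.internal.setPos from the public macro API to achieve what atPos(pos)(tree) does in compiler internals:

```scala
import scala.reflect.macros.blackbox

// Sketch only: the real body stays as in the question; here a small
// placeholder tree stands in for the generated match expression.
def positionedParserImpl(c: blackbox.Context)(jsonTree: c.Tree): c.Tree = {
  import c.universe._
  val r = q"""println($jsonTree)""" // stand-in for the tree built in the question
  // Reuse the incoming tree's position for the synthetic result so that
  // downstream tooling (e.g. Play's routes position mapper) sees a source file.
  c.internal.setPos(r, jsonTree.pos)
}
```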

Need workaround for scala breeze matrix slicing and vector indexing

Because of the odd behaviour in method foo I cannot write methods like bar,
which I need:
import breeze.linalg.DenseMatrix
import breeze.linalg.DenseVector

class Test {
  val dim = 3
  val X: DenseMatrix[Double] = DenseMatrix.rand(dim, dim)
  val u: DenseVector[Double] = DenseVector.fill(dim){1.0}

  def foo: Unit = {
    val i = 0
    val row_i: DenseVector[Double] = X(i, ::).t // OK
    val s = u(i) + u(i)                         // OK
    val j: Integer = 0
    val row_j: DenseVector[Double] = X(j, ::).t // does not compile (A)
    val a = u(j) + u(j)                         // does not compile (B)
  }

  def bar(i: Integer): Double = u(i) + u(i) // does not compile (C)
}
Is there a workaround?
Thanks in advance for all replies.
Compilation errors:
(A) could not find implicit value for parameter canSlice:
breeze.linalg.support.CanSlice2[breeze.linalg.DenseMatrix[Double],Integer,collection.immutable.::.type,Result]
not enough arguments for method apply: (implicit canSlice:
breeze.linalg.support.CanSlice2[breeze.linalg.DenseMatrix[Double],Integer,collection.immutable.::.type,Result])
Result in trait TensorLike. Unspecified value parameter canSlice.
(B), (C)
could not find implicit value for parameter canSlice:
breeze.linalg.support.CanSlice[breeze.linalg.DenseVector[Double],Integer,Result]
not enough arguments for method apply: (implicit canSlice: breeze.linalg.support.CanSlice[breeze.linalg.DenseVector[Double],Integer,Result])Result
in trait TensorLike. Unspecified value parameter canSlice.
First off: convert your Integer to Int. That takes care of at least the first compilation error. The following update to your code does compile:
import breeze.linalg.DenseMatrix
import breeze.linalg.DenseVector

class Test {
  val dim = 3
  val X: DenseMatrix[Double] = DenseMatrix.rand(dim, dim)
  val u: DenseVector[Double] = DenseVector.fill(dim){1.0}

  def foo: Unit = {
    val i = 0
    val row_i: DenseVector[Double] = X(i, ::).t // OK
    val s = u(i) + u(i)                         // OK
    val j: Int = 0
    val row_j: DenseVector[Double] = X(j, ::).t // compiles now (A)
    val a = u(j) + u(j)                         // compiles now (B)
  }

  def bar(i: Int): Double = u(i) + u(i) // compiles now (C)
}
From the repl:
// Exiting paste mode, now interpreting.
import breeze.linalg.DenseMatrix
import breeze.linalg.DenseVector
defined class Test
So your other errors also disappear for me (scala 2.11.8). Let me know if you have further issues.
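If a boxed java.lang.Integer genuinely arrives from Java code, one option is to convert it at the boundary and keep the Breeze calls on Int. A sketch (the class name Boundary is mine; the Integer-to-Int assignment relies on Predef's implicit Integer2int conversion):

```scala
import breeze.linalg.DenseVector

class Boundary {
  val u: DenseVector[Double] = DenseVector.fill(3){1.0}

  // Accept a boxed Integer from Java callers, unbox it to Int up front,
  // then index the DenseVector with the plain Int.
  def bar(i: Integer): Double = {
    val idx: Int = i // Predef.Integer2int unboxes here
    u(idx) + u(idx)
  }
}
```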

How to add symbols into a parsed AST?

(This is loosely related to Scala script in 2.11 and Generating a class from string and instantiating it in Scala 2.10).
In the code below, a string of code is parsed at runtime using a Toolbox into the corresponding AST. How can I add symbol definitions (prefix in the code below) to the AST so that those symbols can be used in the expression tree?
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

object Main extends App {
  val cm = runtimeMirror(getClass.getClassLoader)
  val tb = cm.mkToolBox()
  val expr = tb.parse("println(i)")

  val build = internal.reificationSupport
  val prefix = build.setInfo(build.newFreeTerm("i", 2), typeOf[Int])

  // TODO: add prefix before expr by some AST manipulation
  tb.eval(expr)
}
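One way to make the TODO concrete is to sidestep the free-term machinery entirely and splice an ordinary definition of i in front of the parsed tree with a quasiquote. This is a sketch of that alternative approach, not the free-term solution the question asks about:

```scala
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

object Main extends App {
  val cm = runtimeMirror(getClass.getClassLoader)
  val tb = cm.mkToolBox()
  val expr = tb.parse("println(i)")

  // Prepend a plain `val i` to the parsed tree by splicing both
  // into a single Block; the toolbox then resolves `i` normally.
  val wrapped = q"{ val i: Int = 2; $expr }"

  tb.eval(wrapped) // prints 2
}
```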

Does TypeSafe Slick work on Scala 2.9.3?

Does TypeSafe Slick work on Scala 2.9.3? I get
[ERROR] exception when typing query.list
scala.tools.nsc.symtab.Types$TypeError: class file needed by StatementInvoker is missing.
[INFO] class file needed by StatementInvoker is missing.
[INFO] reference type Either of object package refers to nonexisting symbol.
which goes away when I use Scala 2.10.x, but I'm too new to Scala to understand why.
import slick.session.Database
import scala.slick.jdbc.StaticQuery
import Database.threadLocalSession
import com.typesafe.config.ConfigFactory

object PostgresDao {
  protected val conf = ConfigFactory.load

  def findFoo(a: Int, b: String): Option[Int] = {
    Database.forURL("jdbc:postgresql://localhost/bar", driver = "org.postgresql.Driver") withSession {
      val query = StaticQuery.query[(Int, String), Int](
        """
        select some_int
        from some_table t
        where t.a = ? and t.b = ?
        """.stripMargin)
      val list: List[Int] = query.list((a, b))
      if (list.isEmpty) None
      else Some(list.head)
    }
  }
}
I shamelessly copy-paste the official doc here:
Slick requires Scala 2.10. (For Scala 2.9 please use ScalaQuery, the predecessor of Slick).
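In other words, the fix is to move the build to Scala 2.10 rather than to patch the 2.9.3 setup. A minimal build.sbt sketch under that constraint (the version numbers are illustrative, not from the original):

```scala
// build.sbt (sketch): pin a Scala 2.10.x release, since Slick
// does not publish artifacts for Scala 2.9.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick"      % "1.0.1",
  "org.postgresql"      % "postgresql" % "9.3-1100-jdbc41"
)
```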