I am writing a plugin for SBT that requires a list of the class files generated by the last run of the Scala compiler.
This list of class files is then passed into a program that performs some bytecode transformations. Since this transformation process can be slow, I only want the class files written by the last run of the Scala compiler (i.e. those that were modified), not all class files in the output directory.
How can I obtain a list of the files last generated by the compile task?
I think you cannot get this information directly from the Analysis object returned by the compile task.
However, what you could do is check analysis.relations.allProducts for changes. If any of the files have been modified, you can execute your task, which performs the bytecode transformations.
You could use a modified version of FileFunction.cached to check for changes.
def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style)(action: Set[File] => Unit): Set[File] => Unit = {
  import Path._
  lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle)
  inputs => {
    inCache(inputs) { inReport =>
      if (!inReport.modified.isEmpty) action(inReport.modified)
    }
  }
}
The function takes the following parameters:
cacheBaseDirectory - the location of the cache
inStyle - a description of how changes should be detected (see sbt.FilesInfo for the available options)
action - a function that is run when the files have been modified; it takes the set of modified files as an argument
The function returns another function that runs its action only if the set of files passed to it has been modified.
Example
val transformBytecode = taskKey[Unit]("Transforms bytecode of modified files")
def cached(cacheBaseDirectory: File, inStyle: FilesInfo.Style)(action: Set[File] => Unit): Set[File] => Unit = {
  import Path._
  lazy val inCache = Difference.inputs(cacheBaseDirectory / "in-cache", inStyle)
  inputs => {
    inCache(inputs) { inReport =>
      if (!inReport.modified.isEmpty) action(inReport.modified)
    }
  }
}
transformBytecode <<= Def.task {
  val analysis = (compile in Compile).value
  val cachedFunction = cached(streams.value.cacheDirectory / "transform-bytecode", FilesInfo.lastModified) { modified =>
    // here you want to run the bytecode transformations on the `modified` files
    println(s"Modified files $modified")
  }
  cachedFunction(analysis.relations.allProducts.toSet)
}.triggeredBy(compile in Compile)
I have the following information:
Scalameta: has the ability to produce ASTs from a source file
SemanticDB: contains information about symbols from a parsed source file
ScalaFix: is based on Scalameta and SemanticDB, so it has the ability to access symbol information and traverse ASTs
Loading a source file using Scalameta is as easy as the following:
import scala.meta._

val path = java.nio.file.Paths.get("path to source file")
val bytes = java.nio.file.Files.readAllBytes(path)
val text = new String(bytes, "UTF-8")
val input = Input.VirtualFile(path.toString, text)
val tree = input.parse[Source].get
As you can observe from the above code snippet, ScalaMeta parses the source file as type Source.
Now consider the below code snippet where ScalaFix uses a tree of type SemanticDocument:
class NamedLiteralArguments extends SemanticRule("NamedLiteralArguments") {
  override def fix(implicit doc: SemanticDocument): Patch = {
    doc.tree
      .collect {
        case Term.Apply(fun, args) =>
          args.zipWithIndex.collect {
            case (t @ Lit.Boolean(_), i) =>
              fun.symbol.info match {
                case Some(info) =>
                  info.signature match {
                    case method: MethodSignature if method.parameterLists.nonEmpty =>
                      val parameter = method.parameterLists.head(i)
                      val parameterName = parameter.displayName
                      Patch.addLeft(t, s"$parameterName = ")
                    case _ =>
                      // Do nothing, the symbol is not a method
                      Patch.empty
                  }
                case None =>
                  // Do nothing, we don't have information about this symbol
                  Patch.empty
              }
          }
      }
      .flatten
      .asPatch
  }
}
Inspecting the two code snippets above shows that Scalameta can parse a Scala source into type Source, while ScalaFix appears to parse the same source into an implicit SemanticDocument. The SemanticDocument has a tree field, implemented by Scalameta, which yields a traversable AST data structure just like the one produced by parsing a source file as Source. This shows the relationship between Scalameta and ScalaFix. However, my concern is that I need to load Scala source code and use ScalaFix on it to access symbol.info, and the ScalaFix documentation does not show how to do this.
When I attempt to load a source file as a SemanticDocument instead of a Source in the first code snippet:
val tree = input.parse[SemanticDocument].get
I get an error that no parameters were found for the parameter parse in parse[SemanticDocument]. Also note that trying to use symbol.info in the first code snippet produces errors about missing implicits. This is not the case in the second code snippet, where doc is an implicit parameter of type SemanticDocument.
So how does ScalaFix load source files as SemanticDocument?
In a Scala.js unit test, what is the easiest solution to load test data from a file residing in test/resources?
It turns out that, at least with recent Scala.js (0.6.14 and 0.6.15 tested) and Node.js (7.8.0 tested), the situation is simple. As tests are run using the Node.js runner by default, one can use Node.js synchronous file operations and read the file with fs.readFileSync. A function handling this can look like:
def rscPath(path: String): String = "src/test/resources/" + path
def rsc(path: String): String = {
  import scalajs.js.Dynamic.{global => g}
  val fs = g.require("fs")
  def readFile(name: String): String = {
    fs.readFileSync(name).toString
  }
  readFile(rscPath(path))
}
val testInput = rsc("package/test-input.txt")
Instead of loading them directly from src/test/resources, one could also load them from target/scala-2.12/test-classes, as the files are copied there by the SBT build. I would prefer this if I could find some simple API to obtain that path, so that it does not have to be hardcoded in the rscPath function.
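One possible way to avoid hardcoding the path is to have the build itself write it into generated code, for example with the sbt-buildinfo plugin. This is only a sketch, assuming sbt-buildinfo is added to project/plugins.sbt; the key name testClassesDir and the package myproject.build are made up for this example:

```scala
// build.sbt (sketch; assumes sbt-buildinfo is enabled in project/plugins.sbt)
enablePlugins(BuildInfoPlugin)

// "testClassesDir" is a hypothetical key name chosen for this example
buildInfoKeys := Seq[BuildInfoKey](
  "testClassesDir" -> (classDirectory in Test).value.getAbsolutePath
)
buildInfoPackage := "myproject.build"
```

The test code could then use myproject.build.BuildInfo.testClassesDir instead of the hardcoded "src/test/resources/" prefix in rscPath.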
I want to create SBT-task to generate documentation for my classes based on annotations inside these classes.
So I am creating a task:
val genToolsDocs = TaskKey[Unit]("gendoc-tools", "gen doc")
genToolsDocs in Runtime <<=
  (compile in Compile) map { (compiled: Analysis) ⇒
    ???
  }
But then I don't actually know how to properly get the generated classes from this compiled: Analysis object.
There is compiled.apis.internal.values.map(_.api()...), from which I can get some information about my classes, but not much.
There are also compiled.relations.classes, compiled.stamps.allBinaries, and many other things, but I can't figure out which of them I should use to get the compiled class files.
Or maybe this is entirely the wrong approach?
Finally I've ended up with
genToolsDocs <<= (compile in Compile) map { (compiled: Analysis) ⇒
  val files = compiled.stamps.allProducts.filter(_.getPath.contains(???))
  ???
}
and files now contains all my compiled classes.
Using scopt https://github.com/scopt/scopt
I have a very simple Scala CLI driver that errors out on the first line of .parse. The line is var i = 0, and I can't imagine why that would fail; maybe it's in how I instantiated the OptionParser?
def parse(args: Seq[String], init: C): Option[C] = {
  var i = 0  // <-- this line throws the error below
  val pendingOptions = ListBuffer() ++ (nonArgs filterNot { _.hasParent })
Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
at scopt.OptionParser.parse(options.scala:306)
at org.apache.mahout.drivers.ItemSimilarityDriver$.main(ItemSimilarityDriver.scala:47)
at org.apache.mahout.drivers.ItemSimilarityDriver.main(ItemSimilarityDriver.scala)
The full code is here; sorry, but I'm new to Scala, so this may be a really stupid question.
object ItemSimilarityDriver {
  /**
   * @param args Command line args; if empty a help message is printed.
   */
  def main(args: Array[String]): Unit = {
    val parser = new OptionParser[Config]("ItemSimilarity") {
      head("ItemSimilarity", "Spark")
      opt[Unit]('r', "recursive") action { (_, c) =>
        c.copy(recursive = true) } text ("The input path should be searched recursively for files that match the filename pattern (optional). Default: false.")
      opt[String]('o', "output") required () action { (x, c) =>
        c.copy(output = x) } text ("Output is a path for all output (required)")
      opt[String]('i', "input") required () action { (x, c) =>
        c.copy(input = x) } text ("Input is a path for input; it may be a filename or dir name. If a directory, it will be searched for files matching the -p pattern. (required)")
      note("some notes.\n")
      help("help") text ("prints this usage text")
    }
    // parser.parse returns Option[C]
    // the parser was created fine, but this call fails inside parse
    parser.parse(args, Config()) map { config =>
      // do stuff
      //val didIt = true
    } getOrElse {
      // arguments are bad, an error message will have been displayed; throw an exception and run away!
    }
  }

  case class Config(recursive: Boolean = false, input: String = null, output: String = null)
}
I've also tried the mutable options method with the same error.
The problem seems to be a mismatch between the Scala library version and scopt. The current stable scopt, 3.2.0, is cross-published against:
Scala 2.9.1
Scala 2.9.2
Scala 2.9.3
Scala 2.10
Scala 2.11
The Scala 2.10 and 2.11 artifacts use sbt 0.12's cross-versioning convention, with a _2.10 suffix, because Scala 2.10.x minor releases are binary compatible with Scala 2.10.0. In other words, scopt_2.11 is NOT a later version of scopt_2.10: one is compiled against Scala 2.11.x, the other against Scala 2.10.x.
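When the project is built with sbt, the %% operator sidesteps this mismatch entirely: it appends the binary Scala version suffix for you, so the resolved artifact always matches the project's scalaVersion. A minimal build.sbt sketch (the scopt version is the one from the answer above; the scalaVersion is just an example):

```scala
// build.sbt
scalaVersion := "2.10.4"

// %% expands to scopt_2.10 here; switching scalaVersion to 2.11.x
// would resolve scopt_2.11 instead
libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"
```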
I'd recommend you give sbt a try to manage external libraries. sbt has a plugin that can generate an IntelliJ project for you.
I want my Scala code to take a Scala class as input, compile and execute that class. How can I programmatically invoke a Scala compiler? I will be using the latest Scala version, i.e. 2.10.
ToolBox
I think the proper way of invoking the Scala compiler is via the Reflection API, documented in the Overview. Specifically, the section 'Tree Creation via parse on ToolBoxes' in 'Symbols, Trees, and Types' describes parsing a String into a Tree using a ToolBox. You can then invoke eval() etc.
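A minimal, self-contained sketch of the ToolBox route (one assumption: scala-reflect and scala-compiler must be on the runtime classpath, since ToolBox drives the compiler at runtime):

```scala
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

object ToolBoxDemo extends App {
  // a ToolBox backed by the runtime mirror can parse, typecheck and compile trees
  val toolbox = currentMirror.mkToolBox()

  // parse a String into a Tree...
  val tree = toolbox.parse("(1 to 3).map(_ * 2).sum")

  // ...then compile and evaluate it
  val result = toolbox.eval(tree)
  println(result) // prints 12
}
```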
scala.tools.nsc.Global
But as Shyamendra Solanki wrote, in reality you can drive scalac's Global to get more done. I've written CompilerMatcher so I can compile generated code together with sample code to do integration tests, for example.
scala.tools.nsc.interpreter.IMain
You can invoke the REPL's IMain to evaluate code (this is also available in the above CompilerMatcher if you want something that works with Scala 2.10):
// `s` is a Settings instance; `files`, `code` and `toSourceFile`
// come from the surrounding CompilerMatcher code
val main = new IMain(s) {
  def lastReq = prevRequestList.last
}
main.compileSources(files.map(toSourceFile(_)): _*)
code map { c =>
  main.interpret(c) match {
    case IR.Error => sys.error("Error interpreting %s" format (c))
    case _ =>
  }
}
val holder = allCatch opt {
  main.lastReq.lineRep.call("$result")
}
This was demonstrated in Embedding the Scala Interpreter post by Josh Suereth back in 2009.
The class to be compiled and run (in file test.scala)
class Test {
println ("Hello World!")
}
// compileAndRun.scala (in the same directory)
import scala.tools.nsc._
import java.io._

val g = new Global(new Settings())
val run = new g.Run
run.compile(List("test.scala")) // invoke the compiler; it creates Test.class

val classLoader = new java.net.URLClassLoader(
  Array(new File(".").toURI.toURL), // using the current directory
  this.getClass.getClassLoader)
val clazz = classLoader.loadClass("Test") // load the class
clazz.newInstance // create an instance, which will print "Hello World!"