I want to create an Action to automate GCJ compilation. Since I couldn't make it work with Ant, I decided to try SBT. The docs say how to create an Action and how to run an external process. What I don't yet see is how to reuse the directory tree traversal which exists for java and scala compiler Actions. In this case my input files would be all the .class files under a certain root folder. I would also need to specify a specific classpath for GCJ. Any pointers for this would be appreciated too.
I haven't used GCJ much at all and I'm still pretty new at SBT, but this is how I believe you could write a quick task to do exactly what you are looking for with SBT 0.7.1. You can use a PathFinder to grab all of the class files like so:
val allClasses = (outputPath ##) ** "*.class"
Using that PathFinder and the "compileClasspath" top-level method, you can construct a task like this, which will run gcj with the current project's classpath and combine all of the .class files into a single object file (gcjFile):
val gcj = "/usr/local/bin/gcj"
val gcjFile = "target/my_executable.o"
val allClasses = (outputPath ##) ** "*.class"
lazy val gcjCompile = execTask {
<x>{gcj} --classpath={compileClasspath.get.map(_.absolutePath).mkString(":")} -c {allClasses.get.map(_.absolutePath).mkString(" -c ")} -o {gcjFile}</x>
} dependsOn(compile) describedAs("Create a GCJ executable object")
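If I remember the sbt 0.7 conventions correctly, camelCase task names are exposed in hyphenated form at the interactive prompt, so this should be runnable as:

gcj-compile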
I'm trying to run certain tasks and start up servers after launching sbt. I want to be able to run commands in the terminal to do this. How can I define them? Are plugins the right way to do this?
I see some code like this:
object DoThing extends AutoPlugin {
  object autoImport {
    val vpnCheck = taskKey[Boolean]("Check for a VPN connection.")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    vpnCheck := {
      doVpnCheck()
    }
  )
}
What is the projectSettings method doing? Are plugins the way?
From the plugins page:
A plugin is a way to use external code in a build definition. A plugin can be a library used to implement a task (you might use Knockoff to write a markdown processing task). A plugin can define a sequence of sbt settings that are automatically added to all projects or that are explicitly declared for selected projects. For example, a plugin might add a proguard task and associated (overridable) settings. Finally, a plugin can define new commands (via the commands setting).
But I can't seem to figure this out.
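To the projectSettings question: an AutoPlugin's projectSettings is the sequence of settings that sbt automatically adds to every project the plugin is enabled on, so the Seq above is what makes vpnCheck available as a task. A minimal sketch of enabling it in build.sbt (assuming DoThing is on the build classpath):

lazy val root = (project in file("."))
  .enablePlugins(DoThing)

After that, vpnCheck can be run from the sbt prompt like any other task.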
For your scenario, maybe you can just create a task in your sbt file to do this, like:
val hello = taskKey[Unit]("hello world")
hello := {
  println("hello")
}
and to run it automatically at startup, you can create an .sbtrc file in the project directory, with an alias like:
alias boot = ;reload ;hello ;iflast shell
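If the goal is also to start servers, the task body can launch an external process. A minimal sketch using scala.sys.process (the key name and the script path are placeholders, not from your build):

import scala.sys.process._

val startServer = taskKey[Unit]("Start the dev server.")

startServer := {
  // starts the process detached; sbt does not wait for it to finish
  Process("scripts/start-server.sh").run()
  ()
}

You can then include it in the alias, e.g. alias boot = ;reload ;startServer ;iflast shell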
I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only a subset of classes that make up an API, which we need to be able to share with other teams so they can write plugins for our application. So the end result will be two jars: one with the full application and a second one with the subset of classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within it so that the original definition of packageBin is not changed. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
which generates the complete jar in the same way compile:packageBin would. I then try to modify the mappings of the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
  // ...all of packageBin's functionality would have to be reimplemented here...
}
However, I would have to rewrite all of packageBin's functionality instead of reusing existing code. Also, in case rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You could do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)
inConfig(PluginApi)(Defaults.compileSettings) // you need the standard compile settings in the new configuration

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}
unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala, you'll have to override unmanagedSourceDirectories in the newly created configuration, as shown above. Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
I have had similar problems (with more than one jar per project). Our project uses Ant; there you can do it, but you will repeat yourself a lot.
However, I have come to the conclusion that this scenario (2 JARs for one project) actually can be simplified by splitting the project - i.e. making 2 modules out of it.
This way, I don't have to "fight" tools which assume project==artifact (like sbt, maybe maven?, IDEA's default setting,...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and classes are only split apart in the JAR step, you run the risk of an invalid dependency in your setup which you would only notice when testing.
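For reference, a minimal sketch of that split in sbt 0.13 terms (module and directory names are placeholders):

lazy val api = project in file("api")

lazy val app = (project in file("app")).dependsOn(api)

Each module then gets its own packageBin, and an accidental dependency from the API module on application code already fails at compile time.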
I use sbt in the following fashion: I run ~ test:compile in sbt and then work in the IDE, occasionally checking whether the project still compiles, because the IDE's presentation compiler tends to be buggy. When I git pull some code, there might be changes in the project/ files, so I want sbt to reload. Is there a way to watch both source files and project files, so that when there is a change in the project files, I actually get the reload?
As jsuereth explained, this isn't something SBT can handle in a single instance. What's required is a reboot of SBT to abort the watching process and reload its own configuration.
The following Scala script does exactly this, it uses Java NIO WatchService and Scala Process to monitor a path and restart a process. The code should be fairly simple to understand:
#!/usr/bin/env scala
import java.nio.file._
import scala.sys.process._
val file = Paths.get(args(0))
val cmd = args(1)
val watcher = FileSystems.getDefault.newWatchService
file.register(
  watcher,
  StandardWatchEventKinds.ENTRY_CREATE,
  StandardWatchEventKinds.ENTRY_MODIFY,
  StandardWatchEventKinds.ENTRY_DELETE
)
// start the command asynchronously; run(true) connects its stdin to ours
def exec = cmd run true

@scala.annotation.tailrec
def watch(proc: Process): Unit = {
  val key = watcher.take
  val events = key.pollEvents
  val newProc =
    if (!events.isEmpty) {
      // the watched path changed: kill the running process and start a fresh one
      proc.destroy()
      exec
    } else proc
  if (key.reset) watch(newProc)
  else println("aborted")
}
watch(exec)
Usage in your sbt dir would be:
watchr.scala project/ "sbt ~ test:compile"
If anything is unclear don't hesitate to ask, of course any scripting language could be used to implement this behavior.
You actually can't use ~ <task> and have it rebuild the project itself right now, because ~ <task> needs to read the build definition itself to determine:
What source files to watch
How to run the task.
What you're doing is altering the config when project/ changes. This requires a full reload or reboot of sbt to pull in the new configuration.
So, as of sbt 0.13, this isn't possible. You can have it rebuild your source code when project/ changes, but without rebuilding the build definition itself, that's not much help.
You could create a new sbt prompt, or task, that when run could check to see if source files in project/ are updated and issue a warning/error so you know to reboot. That's probably the best option right now.
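A minimal sketch of that idea for build.sbt in sbt 0.13 (both keys are made up for illustration, not part of sbt): record a timestamp when the build definition loads and compare it against the files under project/:

val loadTime = settingKey[Long]("Time the build definition was loaded")
val checkBuildDef = taskKey[Unit]("Warn if project/ changed since the last reload")

loadTime := System.currentTimeMillis

checkBuildDef := {
  // hypothetical staleness check: newest build file vs. the recorded load time
  val stale = ((baseDirectory.value / "project") ** ("*.sbt" || "*.scala")).get
    .exists(_.lastModified > loadTime.value)
  if (stale) streams.value.log.warn("project/ has changed; run 'reload'")
}

Running checkBuildDef (or chaining it into an alias) would then at least tell you when a reboot is due.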
I'm just starting with Scala and have run into a problem that has me stumped, but I'm guessing that I'm missing something easy.
I was following instructions to use the Clapper ClassFinder:
http://thoughts.inphina.com/2011/09/15/building-a-plugin-based-architecture-in-scala/
import java.io.File
import org.clapper.classutil.ClassFinder

val classpath = List("./plugins").map(new File(_))
val finder = ClassFinder(classpath)
val classes = finder.getClasses
val classMap = ClassFinder.classInfoMap(classes)
After executing the first line, I see that classpath is set simply to
List(.\plugins)
I'm running this on windows, so the swapping of the slash seems to be OK.
But I expected to see a list of File objects, although I am not sure about this Scala syntax, and perhaps I'm missing something in the Scala IDE. The value for classes shows an "empty iterator".
It seems not to be finding any files in the path that I specified. I tried using an absolute path, but I had the same results. I have a single jar file in the plugins directory that I'm hoping it will find. The plugins directory is at the root of the Play2 project I'm using.
Edit ---
I did find that when I explicitly list the path to one jar, it is able to find it:
val classpath = List("./plugins/myPlugin.jar").map(new File(_))
But I want to find all jar files in the directory.
The following didn't work:
val classpath = List("./plugins/*").map(new File(_))
Nor did this:
val classpath = List("./plugins/*.jar").map(new File(_))
Judging by this issue on the ClassFinder repo on Github it may be a bug.
I think you need to create an explicit list of jar files, or list the ones contained in your folder, like:
val classpath = (new File("./plugins")).listFiles.filter(_.getName.endsWith(".jar"))
EDIT: from a cursory glance at ClassFinder's source on GitHub, I think it's not a bug. ClassFinder searches for .class files either in jars, in zip files, or directly in folders, but it looks like it does not mix these things recursively (i.e. if you give it a folder, it will look for classes directly in the folder, but it won't look for classes in jars inside that folder).
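Given that, a workaround is to expand the folder into its jars yourself before handing them to ClassFinder; a sketch along the lines of the question's code (assumes org.clapper.classutil on the classpath):

import java.io.File
import org.clapper.classutil.ClassFinder

// listFiles returns null when the directory does not exist, hence the Option
val jars = Option(new File("./plugins").listFiles)
  .getOrElse(Array.empty[File])
  .filter(_.getName.endsWith(".jar"))
  .toList

val finder = ClassFinder(jars)
val classMap = ClassFinder.classInfoMap(finder.getClasses)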
If your objective is to list all the jar files, you can use the following code:

val classpath = List("./plugins")
  .map { path =>
    Option(new File(path).listFiles)
      .getOrElse(Array.empty[java.io.File])
      .filter(file => file.isFile && file.getName.endsWith(".jar"))
  }
  .flatten
Following the sbt tutorial on changing paths, I'm trying to change the "target" output directory to "someother":
override def outputDirectoryName = "someother"
Everything goes fine except for one thing: sbt automatically creates the target directory with a ".history" file inside. Why does sbt do this when it is supposed to create only the "someother" dir? I tried to override all the methods inherited from BasicProjectPaths (I use sbt.DefaultProject as the superclass of my project definition):
override def mainCompilePath = ...
override def testCompilePath = ...
...
But sbt still creates the "target" folder in spite of these overrides.
It certainly seems that it should use the overridden outputDirectoryName in trunk...
/** The path to the file that provides persistence for history. */
def historyPath: Option[Path] = Some(outputRootPath / ".history")
def outputPath = crossPath(outputRootPath)
def outputRootPath: Path = outputDirectoryName
def outputDirectoryName = DefaultOutputDirectoryName
(from SBT's current trunk).
It may have been different in a previous version. Have you considered raising a new bug?
In sbt 0.13.5, I found a way to change the target folder by just re-assigning target in the build.sbt file:
target := file("someotherParent") / "someotherSubdir"
This only modifies the directory for the built classes and artifacts; the .history file, however, always ends up in the project root directory.
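The .history location is controlled by its own setting, so it can apparently be relocated too; a sketch, assuming sbt 0.13.x where historyPath is a SettingKey[Option[File]]:

historyPath := Some(target.value / ".history")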
Unfortunately, some other plugins (xsbt-web-plugin) seem to have problems with that: running the webapp via the SBT console produced weird errors, and when I switched back to the standard directory layout, these problems disappeared.
A better way to achieve my goal (all JARs in one directory, with names containing the Java VM version) seems to be to specify an appropriate target for publishing: there are fewer restrictions on "sbt publish", and other plugins are not disturbed by a different directory layout.
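A sketch of that publishing approach for sbt 0.13 (the repository name and directory layout are just an illustration):

// publish artifacts into a local directory named after the JVM version
publishTo := Some(Resolver.file("local-jars",
  file("dist") / ("jvm-" + System.getProperty("java.specification.version"))))

Then sbt publish drops the jars into that directory without touching the build's normal target layout.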