I am using sbt to build my Scala project. After compiling a submodule that uses fastOptJS, I need to copy the compiled files to another module within the same project, so I designed a custom command, fastOptCopy, to do so.
lazy val copyjs = TaskKey[Unit]("copyjs", "Copy javascript files to public directory")
copyjs := {
val outDir = baseDirectory.value / "public/js"
val inDir = baseDirectory.value / "js/target/scala-2.11"
val files = Seq("js-fastopt.js", "js-fastopt.js.map", "js-jsdeps.js") map { p => (inDir / p, outDir / p) }
IO.copy(files, true)
}
addCommandAlias("fastOptCopy", ";fastOptJS;copyjs")
However, when I enter the sbt console and type
~fastOptCopy
it keeps compiling, copying, compiling, copying, ... in an infinite loop. I guess that, because I am copying the files, sbt thinks the sources have changed and retriggers compilation.
How can I prevent this?
You can exclude specific files from watchSources in your sbt configuration:
http://www.scala-sbt.org/0.13/docs/Triggered-Execution.html
watchSources defines the files for a single project that are monitored
for changes. By default, a project watches resources and Scala and
Java sources.
Here is a similar question:
How to not watch a file for changes in Play Framework
watchSources := watchSources.value.filter { _.getName != "BuildInfo.scala" }
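Applied to the question above, a minimal sketch (my adaptation, assuming the files copied to public/js are what retriggers the watch) would filter that directory out:
watchSources := watchSources.value.filterNot { f =>
  // ignore anything under public/js, the destination of copyjs,
  // so that copying output does not retrigger ~fastOptCopy
  // (simple path-string check; adjust the separator for your OS)
  f.getPath.contains("public/js")
}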
Related
I would like to have an sbt task that I can run to generate some code. I don't want to generate this with each run, just manually run this task once in a while. I created a skeleton project to explain (https://github.com/jinyk/sbtmanagedsrc).
build.sbt:
lazy val root = (project in file("."))
.settings(scalaVersion := "2.11.8")
.settings(gensomecode := genSomeCodeTask.value)
/////////////////////////////////////////////////////////////
// fugly way to get managed sources compiled along with main
.settings(unmanagedSourceDirectories in Compile += baseDirectory.value / "target/scala-2.11/src_managed/")
/////////////////////////////////////////////////////////////
lazy val gensomecode = taskKey[Seq[File]]("gen-code")
lazy val genSomeCodeTask = Def.task {
val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
println("file: " + file)
IO.write(file, """object SomeGenCode {
| def doSomething() {
| println("Hi!")
| }
|}""".stripMargin)
Seq(file)
}
So with the build.sbt above I can run sbt gensomecode, which creates
target/scala-2.11/src_managed/main/SomeGenCode.scala, the default place where sbt puts "managed sources".
I would like to make this SomeGenCode available to the root project.
src/main/scala/Main.scala:
object Main extends App {
SomeGenCode.doSomething()
}
The only thing I can figure out to do is to include the default sourceManaged directory in the root project's unmanagedSourceDirectories (see build.sbt line 4, i.e. the line below the "fugly way" comment). This is ugly as hell and doesn't seem like how managed sources are supposed to be handled.
I'm probably not understanding something basic about sbt's managed sources concept or how to handle the situation of creating an sbt task to generate sources.
What am I missing?
There are three options that I am familiar with:
Generate into the unmanaged source directories.
Generate on every run, by adding sourceGenerators in Compile <+= gensomecode
Similar to (2), but use caching so it doesn't generate the file on every compile. Full example below.
In this example, the cache is based on the content of build.sbt, so whenever that file is changed it will regenerate the file.
lazy val root = (project in file("."))
.settings(scalaVersion := "2.11.8")
.settings(gensomecode <<= genSomeCodeTask)
sourceGenerators in Compile <+= genSomeCodeTask
lazy val gensomecode = taskKey[Seq[File]]("gen-code")
def generateFile(sourceManaged: java.io.File) = {
val file = sourceManaged / "main" / "SomeGenCode.scala"
println("file: " + file)
IO.write(file, """object SomeGenCode {
| def doSomething() {
| println("Hi!")
| }
|}""".stripMargin)
Set(file)
}
def genSomeCodeTask = (sourceManaged in Compile, streams).map {
(sourceManaged, streams) =>
val cachedCompile = FileFunction.cached(
streams.cacheDirectory / "mything",
inStyle = FilesInfo.lastModified,
outStyle = FilesInfo.exists) {
(in: Set[java.io.File]) =>
generateFile(sourceManaged)
}
cachedCompile(Set(file("build.sbt"))).toSeq
}
I hope I'm not too late with an answer, but let's look at this documentation section about unmanaged vs. managed files:
Classpaths, sources, and resources are separated into two main categories: unmanaged and managed. Unmanaged files are manually created files that are outside of the control of the build. They are the inputs to the build. Managed files are under the control of the build. These include generated sources and resources as well as resolved and retrieved dependencies and compiled classes.
It seems that the key difference between "unmanaged vs. managed" is "manually vs. automatically". Now, if we look at the documentation for "generating files", we will notice immediately that it means "generating files automatically", since the generation happens at sbt compile.
Compile / sourceGenerators += <task of type Seq[File]>.taskValue
It makes sense: anything produced during sbt compile should be removed during sbt clean.
Now, from your code below, it seems that you were trying to generate an unmanaged source file (you were not using sourceGenerators, were you?) into the managed source directory. The most obvious problem with this is that your source file will be removed every time you call sbt clean, so you have to run the task again to get the file back (worse, you have to run the task manually, as opposed to having sbt compile do it for you), thus defeating your purpose of doing it manually once in a while.
val file = (sourceManaged in Compile).value / "SomeGenCode.scala"
To fix this, you have to manually generate files into an unmanaged source directory, which is basically your source code directory (it depends -- mine is "/app"). Yet you should somehow mark that these files are generated by some means. My solution is something like:
val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
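Putting the pieces together, a minimal sketch of the whole manually-run task (my own assembly of the snippets above, reusing the gensomecode key from the question):
lazy val gensomecode = taskKey[Seq[File]]("gen-code")
gensomecode := {
  // write into the unmanaged source tree, under a "generated" marker
  // directory, so the file survives sbt clean and is compiled like any
  // other source under src/main/scala
  val file = (scalaSource in Compile).value / "generated" / "SomeGenCode.scala"
  IO.write(file, """object SomeGenCode {
                   |  def doSomething() {
                   |    println("Hi!")
                   |  }
                   |}""".stripMargin)
  Seq(file)
}
Run sbt gensomecode once in a while; sbt compile then picks the file up like any other source, and sbt clean leaves it alone.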
Hope this helps!
I am using Play 2.2.3.
I am new to Play and sbt. I want to make Play not watch a folder during development; in this case, a node_modules folder under the public folder.
I tried the settings below but they did not seem to work. Also, I don't know what the difference is between watchSources and playMonitoredFiles.
.settings(
playMonitoredFiles <<= playMonitoredFiles map { (files: Seq[String]) =>
files.filterNot(file => file.contains("node_modules"))
}
)
.settings(
watchSources <<= watchSources map { (sources: Seq[java.io.File]) =>
sources
.filterNot(source => source.isFile && source.getPath.contains("node_modules") )
.filterNot(source => source.isDirectory && source.getPath.contains("node_modules"))
}
)
.settings(
watchSources := watchSources.value.filter { !_.getPath.contains("node_modules") }
)
Note: The original question was about a legacy release (2.2.3); however, I found the same question useful for modern versions.
So basically watchSources is an sbt key used for the triggered-execution functionality.
On the contrary, playMonitoredFiles is a TaskKey defined by the Play Framework sbt plugin, which controls the set of directories watched by the development server (sbt run) for the auto-reloading functionality.
When you launch sbt run, the directories defined by playMonitoredFiles are checked on EVERY REQUEST, and if any change is found a silent re-compilation is triggered. This means shutting down the server, recompiling, and starting again.
The default values for playMonitoredFiles are calculated in playMonitoredFilesTask.
For example, we had a version file added to our resources and recomputed on each compilation. We needed Play to stop checking this file for changes in development, because on every request the auto-reloading regenerated the file and Play marked the sources for recompilation again. Since no Java files were actually recompiled and nothing was printed out, we had to debug the application to find the issue.
This setting only applies to development server (sbt run).
Within a project's settings in build.sbt, the task can be overridden to exclude a directory (in the example, the target/version directory):
.settings(
...,
playMonitoredFiles := playMonitoredFiles.value.filterNot {
_.getPath() endsWith String.format("target%sversion", java.io.File.separator)
}
...,
)
The value is a Seq[File] and contains only directories.
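To confirm that the exclusion took effect, you can print the task's value from the sbt shell:
show playMonitoredFiles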
I've been writing an SBT plugin that generates resources into resource_managed. I'm now looking to include these generated resources in the generated jar as the SBT docs detail:
Generating resources:
By default, generated resources are not included in the packaged source artifact. To do so, add them as you would other mappings. See Adding files to a package
I've read the docs, but honestly I can't figure out how to do this. Can anyone explain it, or point me to another project that does this so I can see how they do it?
First just to clarify, they are included in jars containing compiled classes. They are not included in jars containing sources.
By default, generated resources are not included in the packaged
source artifact.
For packageBin the generated files should already be included - just make sure you return all generated files from the generator method. Assuming you want to package them in the sources artifact, this is what you have to do.
Let's assume you have a generator that generates a property file.
lazy val generatePropertiesTask = Def.task {
val file = (resourceManaged in Compile).value / "stack-overflow" / "res.properties"
val contents = s"name=${name.value}\nversion=${version.value}"
IO.write(file, contents)
Seq(file)
}
resourceGenerators in Compile += generatePropertiesTask.taskValue
To include the generated resources in the packaged sources, you have to tell sbt where res.properties must be copied in the packaged sources artifact. The task which generates the packaged sources is called packageSrc, therefore you have to set mappings scoped to that task.
mappings in (Compile, packageSrc) += {
((resourceManaged in Compile).value / "stack-overflow" / "res.properties") -> "path/in/jar/res.properties"
}
Because your generator can generate many files, and mapping each by hand would be tedious, sbt gives you a utility to map multiple paths at once.
mappings in (Compile, packageSrc) ++= {
val allGeneratedFiles = ((resourceManaged in Compile).value ** "*") filter { _.isFile }
allGeneratedFiles.get pair relativeTo((resourceManaged in Compile).value)
}
The first line finds all generated files using path finders, and the second line maps them to their paths in the target jar.
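You can inspect the resulting mappings from the sbt shell before publishing; if I have the 0.13 shell syntax right, it is:
show compile:packageSrc::mappings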
Generating boilerplate source code with sbt works fine:
sourceGenerators in Compile <+= sourceManaged in Compile map { srcDir =>
DslBoilerplate.generate(srcDir, Seq(
"path/to/a/definition/file"
))
}
When I run sbt compile this also compiles the generated source code files, thus producing some class files. I just don't want the generated source code to be re-compiled every time I re-compile the project during development.
So, from the class files I made a jar file and used it instead of the generated source/class files (which I deleted). This worked fine, giving me access to the generated code through the jar file. Is there a way, though, to let sbt do these 4 steps (if needed) in the initial project build?:
generate source code files
compile those files
create a jar from the produced class files
delete source and class files
(In this question they use the sbt.IO.jar method to create a jar but there they already have existing files...)
Or is there another better approach than making a jar to avoid re-compiling generated source code?
Update 1 (see update 2 below)
Thanks, Seth, for your answer! It worked great to avoid generating the source files with each project compilation since the cache now remembers that they have been created. I'll certainly use this feature, thanks!
But this was actually not what I had in mind with my original question. Sorry for not being clear enough. It might be clearer if we think of this as 2 transformations happening:
Input file ---1---> Source file (*.scala) ---2---> Target file (*.class)
where the transformations are
generation of source code (from some information in an input file) and
compilation of the generated source code
This all works fine when I compile the project with sbt compile.
But then if I "rebuild the project" (in IntelliJ), the generated source code (from the sbt compilation) will compile again, and that's what I want to avoid - but at the same time have access to that code. Is there any other way to avoid compilation than placing this code in a jar and then delete the source and target files?
So I tried to continue along that line of thought, wrestling with sbt to make it create a source jar and a target jar - I still can't make the target jar. This is what I came up with so far (with help from this):
sourceGenerators in Compile += Def.task[Seq[File]] {
val srcDir = (sourceManaged in Compile).value
val targetDir = (classDirectory in Compile).value
// Picking up inputs for source generation
val inputDirs = Seq("examples/src/main/scala/molecule/examples/seattle")
// generate source files
val srcFiles = DslBoilerplate.generate(srcDir, inputDirs)
// prepare data to create jars
val srcFilesData = files2TupleRec("", srcDir)
val targetFilesData = files2TupleRec("", targetDir)
// debug
println("### srcDir: " + srcDir)
println("### srcFilesData: \n" + srcFilesData.mkString("\n"))
println("### targetDir: " + targetDir)
println("### targetFilesData: \n" + targetFilesData.mkString("\n"))
// Create jar from generated source files - works fine
val srcJar = new File("lib/srcFiles.jar/")
println("### sourceJar: " + srcJar)
sbt.IO.jar(srcFilesData, srcJar, new java.util.jar.Manifest)
// Create jar from target files compiled from generated source files
// Oops - those haven't been created yet, so this jar becomes empty... :-(
// Could we use dependsOn to have the source files compiled first?
val targetJar = new File("lib/targetFiles.jar/")
println("### targetJar: " + targetJar)
sbt.IO.jar(targetFilesData, targetJar, new java.util.jar.Manifest)
val cache = FileFunction.cached(
streams.value.cacheDirectory / "filesCache",
inStyle = FilesInfo.hash,
outStyle = FilesInfo.hash
) {
in: Set[File] => srcFiles.toSet
}
cache(srcFiles.toSet).toSeq
}.taskValue
def files2TupleRec(pathPrefix: String, dir: File): Seq[Tuple2[File, String]] = {
sbt.IO.listFiles(dir) flatMap {
f => {
if (f.isFile && f.name.endsWith(".scala")) Seq((f, s"${pathPrefix}${f.getName}"))
else files2TupleRec(s"${pathPrefix}${f.getName}/", f)
}
}
}
Maybe I still don't need to create jars? Maybe they shouldn't be created in the source generation task? I need help...
Update 2
Silly me!!! No wonder I can't make a jar with class files if I filter them with f.name.endsWith(".scala"), dohh
Since my initial question was not that clear, and Seth's answer is addressing an obvious interpretation, I'll accept his answer (after investigating more, I see that I should probably ask another question).
You want to use FileFunction.cached so that the source files aren't regenerated every time.
Here's an example from my own build:
Compile / sourceGenerators += Def.task[Seq[File]] {
val src = (Compile / sourceManaged).value
val base = baseDirectory.value
val s = streams.value
val cache =
FileFunction.cached(s.cacheDirectory / "lexers", inStyle = FilesInfo.hash, outStyle = FilesInfo.hash) {
in: Set[File] =>
Set(flex(s.log.info(_), base, src, "ImportLexer"),
flex(s.log.info(_), base, src, "TokenLexer"))
}
cache(Set(base / "project" / "flex" / "warning.txt",
base / "project" / "flex" / "ImportLexer.flex",
base / "project" / "flex" / "TokenLexer.flex")).toSeq
}.taskValue
Here the .txt and .flex files are input files to the generator. The actual work of generating the source files is farmed out to my flex method, which returns a java.io.File:
def flex(log: String => Unit, base: File, dir: File, kind: String): File =
...
You should be able to adapt this technique to your build.
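Adapted to the build in the question, that might look roughly like the sketch below (assuming DslBoilerplate.generate returns the Seq of generated files and depends only on the listed definition file, whose path is the question's placeholder):
Compile / sourceGenerators += Def.task[Seq[File]] {
  val srcDir = (Compile / sourceManaged).value
  val s = streams.value
  // the input file(s) the generator reads; placeholder path from the question
  val defFiles = Set(baseDirectory.value / "path" / "to" / "a" / "definition" / "file")
  val cache = FileFunction.cached(
    s.cacheDirectory / "dslBoilerplate",
    inStyle = FilesInfo.hash,
    outStyle = FilesInfo.hash) { in =>
    // regenerate only when a definition file's hash changes
    DslBoilerplate.generate(srcDir, in.map(_.getPath).toSeq).toSet
  }
  cache(defFiles).toSeq
}.taskValue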
FileFunction.cached is described in the API doc and in the sbt FAQ under "How can a task avoid redoing work if the input files are unchanged?" (http://www.scala-sbt.org/0.13/docs/Faq.html). (It would be nice if the material on caching was referenced from http://www.scala-sbt.org/0.13/docs/Howto-Generating-Files.html as well; currently it isn't.)
My project file structure looks like this:
build.sbt
lib
project
src
target
test
Inside the lib folder I have sub-folders that contain additional jar files. How can I get sbt to recognize the sub-folders, or to pick up the jar files recursively?
EDIT:
thanks to @Jhonny Everson I was able to get this working. Here is how:
I added the following lines to my build.sbt:
unmanagedJars in Compile <++= baseDirectory map { base =>
val baseDirectories = (base / "lib" / "mycustomlib" )
val customJars = (baseDirectories ** "*.jar")
customJars.classpath
}
Note that the base directory is where build.sbt is located.
If you put jars in the lib folder, sbt will use them automatically. You can use the unmanagedJars setting to specify multiple directories in which jar files can be found. See https://github.com/harrah/xsbt/wiki/Library-Management#manual-dependency-management
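For example, a minimal sketch (the vendor and internal sub-folder names are placeholders) that collects jars from several directories:
unmanagedJars in Compile ++= {
  val libs = baseDirectory.value / "lib"
  // gather all jars from lib/vendor and lib/internal
  (((libs / "vendor") ** "*.jar") +++ ((libs / "internal") ** "*.jar")).classpath
}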
I wanted to implement something like vim's pathogen and here's what I came up with:
unmanagedJars in Compile ++= {
val libs = baseDirectory.value / "lib"
val subs = (libs ** "*") filter { _.isDirectory }
val targets = ( (subs / "target") ** "*" ) filter {f => f.name.startsWith("scala-") && f.isDirectory}
((libs +++ subs +++ targets) ** "*.jar").classpath
}
Using sbt 0.13.x or any Typesafe Activator relying on it, this will check /lib, /lib/* and /lib/*/target/scala-* for JARs and load them into the classpath. If the example isn't clear enough to understand what's going on, it might help to know that subs and targets are sbt.PathFinder instances, while libs is a plain File that is implicitly converted to a PathFinder when combined with +++.
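To verify what actually got picked up, you can inspect the key from the sbt shell:
show compile:unmanagedJars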