Suppose I have a Scala program that creates files ending in .foo.
I'm building with sbt and want to remove these files whenever sbt clean is called.
Add additional directory to clean task in SBT build shows that a single file can be added by
cleanFiles <+= baseDirectory { _ / "test.foo" }
However, it's unclear how to extend this to do:
cleanFiles <append> <*.foo>
All .foo files will be in the same directory, so I don't need to recursively check directories.
Though, that would also be interesting to see.
How can I configure sbt to clean files matching a wildcard, or regex?
Is it a bad design decision to have sbt clean remove files my program generates?
Should I instead use a flag in my program? Using sbt clean seems cleaner to me than having to call sbt clean and then sbt "run --clean".
This will find anything that matches *.foo in the base directory (but not child directories):
cleanFiles <++= baseDirectory (_ * "*.foo" get)
This works because the base directory (a File) is implicitly converted to a PathFinder, which has methods such as * (match the pattern directly in that directory) and ** (match the pattern in that directory and all child directories). get then converts the result back to a Seq[File].
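For newer sbt versions (1.x), a roughly equivalent setting using the slash/value syntax might look like the sketch below; it assumes your sbt version still honors cleanFiles:
// sbt 1.x sketch: add every *.foo file in the base directory to cleanFiles.
// Note the glob is evaluated when the build loads, so files created later
// are only picked up after a reload.
cleanFiles ++= (baseDirectory.value * "*.foo").get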
Related
I'm having trouble concatenating and fingerprinting all the CoffeeScript files in a Play application. Everything works fine for JavaScript files with a build.sbt like this one:
pipelineStages := Seq(concat, digest)
Concat.groups := Seq(
  "javascripts/app.js" -> group(((sourceDirectory in Assets).value / "javascripts") * "*.js")
)
But when sourceDirectory is changed to resourceManaged, which supposedly contains the compiled CoffeeScript files, sbt-concat doesn't pick them up.
sbt-coffeescript, and all the other official source task plugins, don't put their files in resourceManaged in Assets, but rather into their own sub-directory under target/web/<taskname>. They scope the resourceManaged setting to their main task; in this case that means resourceManaged in (Assets, coffeescript) and resourceManaged in (TestAssets, coffeescript).
When you run sbt coffeescript you can see the files are output to target/web/coffeescript/main. You can verify this by running show web-assets:coffeescript::resourceManaged from the sbt console.
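So, as a sketch (not verified against a real project), one option is to point the concat group at that output directory instead of the raw CoffeeScript sources; the ** glob is used because the compiled files may sit under a javascripts/ sub-directory mirroring the source layout:
// Sketch: read the compiled .js files from sbt-coffeescript's output directory
// (target/web/coffeescript/main, per the answer above) instead of sourceDirectory.
Concat.groups := Seq(
  "javascripts/app.js" -> group((target.value / "web" / "coffeescript" / "main") ** "*.js")
)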
This should be very easy, but I am missing something. I apologize for the too-basic question.
I am reorganizing some code. I'd like to get the main package fixed, and then I'll have to modify code in some packages that depend upon the main package. Temporarily, I'd like those dependent packages not to be compiled while I iterate with sbt ~compile.
I know there is a setting, excludeFilter in Compile in unmanagedSources, but I don't know what syntax to use to keep whatever default exclusions are there while adding new exclusions for the (deeply nested) source directories that correspond to the dependent packages.
Many thanks for any help!
Here's a working example that excludes anything with any parent directory named foo:
Compile / unmanagedSources / excludeFilter ~= { _ ||
  new FileFilter {
    def accept(f: File) = f.getPath.containsSlice("/foo/")
  }
}
(Updated to use sbt 1 style syntax.)
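To check what the filter is excluding, you can list the sources sbt will actually compile from the shell:
show Compile/unmanagedSources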
I would like to add a symlink from the .git/hooks directory to a file in my working tree during a regular Play! framework 2.0 build. According to the Play documentation, all sbt functionality is available as normal in a Play build. Based on google searches, I'm trying to add this code to the ApplicationBuild object in my project/Build.scala file:
val symlinkGitPrepushHookTask = compile in Compile <<= compile in Compile map { comp =>
  val output = "ln -sf ../../.hooks/pre-push.py .git/hooks/pre-push".!!
  print(output)
  comp
}
From my reading of the sbt docs, this should be adding a dependency to the compile task in the Compile scope. The dependency is on its existing value, but with my additional function mapped to it. Now when the compile task runs, my anonymous function should be run too. This does not successfully create the symlink, and does not even seem to run.
Immediately after posting this, I thought I would try adding the example I had found to the project/plugins.sbt file. This works, although it feels like an abuse of a file that is meant to specify plugins.
... existing plugins.sbt content ...
compile in Compile <<= compile in Compile map { comp =>
  "ln -sf ../../.hooks/pre-push.py .git/hooks/pre-push".!!
  comp
}
The blank line is critical, as this is the delimiter in the .sbt format.
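For reference, on current sbt versions the same hack can live in build.sbt with the newer syntax. This is only a sketch and keeps the same hard-coded relative paths:
import scala.sys.process._

Compile / compile := {
  val analysis = (Compile / compile).value
  // side effect after compilation: (re)create the git pre-push hook symlink
  "ln -sf ../../.hooks/pre-push.py .git/hooks/pre-push".!!
  analysis
}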
I am converting a legacy jar project to SBT and, for strange reasons that are not easily solved, this project comes with "javax/servlet/Servlet.class" inside it. So I need to somehow exclude this class from the jar file generated by package-bin. How do I accomplish this? Preferably I would like to exclude using a wildcard (i.e. javax.*).
The SBT assembly plugin does look like it has features that will do this, but I am worried that relying on sbt assembly means that my jar project will not work in a multi-module project (i.e. if I include it as a dependency in a war file, then the war project needs to be told to run assembly on the dependent jar project rather than package-bin, but I may be mistaken here).
Each task declares the other tasks and settings that it uses. You can use inspect to determine these inputs as described on Inspecting Settings and in a recent tutorial-style blog post by John Cheng.
In this case, the relevant task used by packageBin is mappings. The mappings task collects the files to be included in the jar and maps them to the path in the jar. Some background is explained on Mapping Files, but the result is that mappings produces a value of type Seq[(File, String)]. Here, the File is the input file providing the content and the String is the path in the jar.
So, to modify the mappings for the packageBin task, filter out the paths from the default mappings that you don't want to include:
mappings in (Compile, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath != "javax/servlet/Servlet.class"
  }
}
mappings in (Compile, packageBin) selects the mappings for the main package task (as opposed to test sources or the packageSrc task).
x ~= f means "set x to the result of applying function f to the previous value of x". (See More About Settings for details.)
The filter drops all pairs where the path corresponds to the Servlet class.
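Since the question also asks for a wildcard such as javax.*, the same technique works with a prefix test instead of an exact match; a sketch:
mappings in (Compile, packageBin) ~= { (ms: Seq[(File, String)]) =>
  // drop every entry whose path inside the jar starts with "javax/"
  ms filterNot { case (_, toPath) => toPath startsWith "javax/" }
}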
I came up with this solution. It redefines the compile task to depend on the previous compile task, which effectively lets me hook in right after the sources are compiled and before they are packaged:
def mySettings = {
  // add functionality to the standard compile task
  inConfig(Compile)(Seq(
    compile in Compile <<= (target, streams, compile in Compile) map {
      (targetDirectory, taskStream, analysis) =>
        import taskStream.log
        // this runs after compile but before package-bin
        recursiveListFiles(targetDirectory, ".*javax.*".r) foreach { file =>
          log.warn("deleting matched resource: " + file.getAbsolutePath())
          IO.delete(file)
        }
        analysis
    }
  )) ++
    Seq(name := "MyProject", version := "1.0", exportJars := true)
}
import scala.util.matching.Regex

def recursiveListFiles(f: File, r: Regex): Array[File] = {
  val these = f.listFiles
  val good = these.filter(f => r.findFirstIn(f.getName).isDefined)
  good ++ these.filter(_.isDirectory).flatMap(recursiveListFiles(_, r))
}
It's a little more complicated than I had hoped, but it lets me do all sorts of modifications prior to packaging (in this case, searching the target folder and deleting all class files that match a regular expression). It also accomplishes my second goal of sticking with the default SBT lifecycle.
I am trying to setup SBT to compile an existing project which does not use the maven directory structure. I am using the full configuration and have set my javaSource & resourceDirectory settings as follows:
def settings = Defaults.defaultSettings ++ Seq(
  resourceDirectory in Compile <<= baseDirectory( _ / "java" ),
  javaSource in Compile <<= baseDirectory( _ / "java" )
)
Now I want to be able to filter the resources we include in the jar artifact, as we currently do with Ant, and to exclude .java files, since our resources are mixed in with the source code. For example:
<fileset dir="java" includes="**/*.txt, **/*.csv" excludes="**/*.java" />
Is there any way to do this?
Use defaultExcludes scoped for the unmanagedResources task and optionally the configuration. For example, this setting excludes .java files from the main resources:
defaultExcludes in Compile in unmanagedResources := "*.java"
in Compile restricts this setting to only apply to main resources. By using in Test instead, it would apply only to test resources. By omitting a configuration (that is, no in Compile or in Test), the setting would apply to both main and test resources.
in unmanagedResources applies these excludes for resources only. To apply excludes to sources, for example, the scope would be in unmanagedSources. The reason for the unmanaged part is to emphasize that these apply to unmanaged (or manually edited) sources only.
The defaultExcludes key has type sbt.FileFilter, so the setting value must be of this type. In the example above, "*.java" is implicitly converted to a FileFilter. * is interpreted as a wildcard and so the filter accepts files with a name that ends in '.java'. To combine filters, you use || and &&. For example, if .scala files needed to be excluded as well, the argument to := would be:
"*.java" || "*.scala"
In the original Ant fileset, the include and exclude filters select mutually exclusive sets of files, so only one is necessary.
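If you prefer something closer to the Ant include/exclude pair, later sbt versions use includeFilter and excludeFilter in place of defaultExcludes, and they can be scoped the same way. A sketch, untested against this project; the exclude line is redundant for the reason just given:
includeFilter in (Compile, unmanagedResources) := "*.txt" || "*.csv"
excludeFilter in (Compile, unmanagedResources) := "*.java"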
It is also possible to directly build the Seq[File] for unmanagedResources. For example:
unmanagedResources in Compile <<=
  unmanagedResourceDirectories in Compile map { (dirs: Seq[File]) =>
    ( dirs ** ("*.txt" || "*.csv" -- "*.java") ).get
  }
The ** method selects all descendants that match the FileFilter argument. You can verify that the files are selected as you expect by running show unmanaged-resources.
For sbt 1.4.8 I needed this style:
.settings(
  Compile / sources := Seq(file("/path/to/your/file"))
)