I have a project with several objects that contain main functions. I would like to package several JARs out of this project, where each JAR contains only certain packages. For example, here are my src folder contents:
src
  main
    scala
      com.myproject
        package1
          SomeClass1.scala
        package2
          SomeClass2.scala
        package3
          SomeClass3.scala
        package4
          SomeClass4.scala
Now what I want to do is run several SBT commands where I can specify command-line arguments that would then exclude either package1 or package2 and so on.
I understand that I could do so with some filtering like below, but how do I combine that with a command-line argument to check which package I should exclude?
// Filter when packaging
excludeFilter in (Compile, unmanagedSources) ~= { _ ||
  ((f: File) => f.getPath.containsSlice("/package1/"))
}
unmanagedSources / excludeFilter := "Main.scala"
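As a plain-Scala aside (no sbt needed), the predicate above is just a `File => Boolean` test on the path; the helper name `excludesPackage` below is made up for illustration:

```scala
// Plain Scala illustration of the path predicate used above
def excludesPackage(pkg: String)(path: String): Boolean =
  path.containsSlice(s"/$pkg/")

println(excludesPackage("package1")("src/main/scala/com/myproject/package1/SomeClass1.scala")) // true
println(excludesPackage("package1")("src/main/scala/com/myproject/package2/SomeClass2.scala")) // false
```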
So I would like to check the command-line argument passed to build.sbt and use it to exclude the respective packages altogether. Any inputs on how I could do this?
Here is what I tried, but it does not work as expected:
import complete.DefaultParsers._

val kfp = inputKey[Unit]("A demo input task.")

kfp := {
  // get the result of parsing
  val args: Seq[String] = spaceDelimited("<arg>").parsed
  // Here, we also use the value of the `scalaVersion` setting
  println("The current Scala version is " + scalaVersion.value)
  println("The arguments to demo were:")
  args foreach println
  if (args(0) == "package1")
    // Filter when packaging
    excludeFilter in (Compile, unmanagedSources) ~= { _ ||
      ((f: File) => f.getPath.containsSlice("/package2/"))
    }
  else if (args(0) == "package2")
    // Filter when packaging
    excludeFilter in (Compile, unmanagedSources) ~= { _ ||
      ((f: File) => f.getPath.containsSlice("/package3/"))
    }
  else if (args(0) == "package3")
    // Filter when packaging
    excludeFilter in (Compile, unmanagedSources) ~= { _ ||
      ((f: File) => f.getPath.containsSlice("/package4/"))
    }
  else
    // Filter when packaging
    excludeFilter in (Compile, unmanagedSources) ~= { _ ||
      ((f: File) => f.getPath.containsSlice("/package1/"))
    }
}
Call it like
sbt package "kfp package1"
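One approach worth sketching (untested, and the property name `exclude.package` is made up for illustration): a setting like `excludeFilter` cannot be reassigned from inside a task body, so instead of an input task you could drive the filter from a JVM system property that is read when the build is loaded:

```scala
// build.sbt (sketch) -- invoke as: sbt -Dexclude.package=package1 package
excludeFilter in (Compile, unmanagedSources) ~= { base =>
  sys.props.get("exclude.package") match {
    case Some(pkg) =>
      // extend the existing filter to also drop files under the named package
      base || new SimpleFileFilter(f => f.getPath.containsSlice(s"/$pkg/"))
    case None => base
  }
}
```

Because the property is evaluated at load time, each `sbt -Dexclude.package=… package` invocation produces a JAR without that package.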
Related
I have a Play 2.8.x application that uses Scala.
The sbt project has a play web project and another library module.
Is it possible to interact with the other module in a REPL?
I have ammonite installed on my system also, but not sure how to load my module. Do I just have to build and then reference the library in my /target build folder? Or is there a better way?
Can I do this in sbt by itself or ammonite is the only way?
Every sbt project has a REPL; you just have to run:
sbt> console
for the root project, or for the name project:
sbt> name/console
But this is the normal Scala REPL. If you want Ammonite, there are instructions on ammonite.io:
You can also try out Ammonite 2.1.4 in an existing SBT project. To do so, add the following to your build.sbt
libraryDependencies += {
  val version = scalaBinaryVersion.value match {
    case "2.10" => "1.0.3"
    case _      => "2.1.4"
  }
  "com.lihaoyi" % "ammonite" % version % "test" cross CrossVersion.full
}

sourceGenerators in Test += Def.task {
  val file = (sourceManaged in Test).value / "amm.scala"
  IO.write(file, """object amm extends App { ammonite.Main.main(args) }""")
  Seq(file)
}.taskValue

// Optional, required for the `source` command to work
(fullClasspath in Test) ++= {
  (updateClassifiers in Test).value
    .configurations
    .find(_.configuration.name == Test.name)
    .get
    .modules
    .flatMap(_.artifacts)
    .collect { case (a, f) if a.classifier == Some("sources") => f }
}
After that, simply hit
sbt projectName/test:run
or if there are other main methods in the Test scope
sbt projectName/test:run-main amm
How can I use a moduleID: ModuleID for a "sibling" project to access settings keys?
I'm writing an SBT plugin for multi-module builds.
I have project A (which dependsOn B) and project B.
Both projects have my own generate and mybuild tasks as settings keys.
The mybuild task consumes the value from generate - this works fine.
B doesn't depend upon anything, so B's mybuild only needs the key for B:generate and all is well.
I want A's mybuild to consume both A:generate and B:generate based on the fact that A dependsOn B in the build.sbt file.
The only promising keys I've found return the projects as ModuleID instances, so is there some way to get a list of settings keys from a ModuleID?
... or should I be doing this another way?
Solution (Kind of)
With #himos' help, this ...
(myTaskKey in myConfig) := {
  loadedBuild.value.allProjectRefs.find(_._1 == thisProjectRef.value).map(_._2) match {
    case Some(myCurrentProject) =>
      if (myCurrentProject.dependencies.nonEmpty)
        sys.error {
          myCurrentProject.dependencies
            .map { myDependsOnProject: ClasspathDep[ProjectRef] =>
              (myDependsOnProject.project / myConfig / myTaskKey).value
              // https://www.scala-sbt.org/0.13/docs/Tasks.html#Dynamic+Computations+with
            }
            .foldLeft("mine.dependencies:")(_ + "\n\t" + _)
        }
  }
}
... sort of works.
It causes an error that implies I've accessed the correct object, even if the SBT macros don't like it.
I think the ModuleID you mention relates to dependency management, not subprojects.
To access a subproject's setting/task keys, project scope can be used:
(generate in A).value
(generate in B).value
More comprehensive example:
name := "A"
version := "1.0"
scalaVersion := "2.12.5"

val generate = TaskKey[String]("generate")
val myBuild = TaskKey[String]("myBuild")

val a = (project in file(".")).settings(Seq(
  generate := "A_generate"
))

val b = (project in file("proj_b")).settings(Seq(
  generate := "B_generate",
  myBuild := (generate in a).value + "_" + generate.value
)).dependsOn(a)
Sbt console output:
sbt:A> show b/myBuild
[info] A_generate_B_generate
Having a bit of trouble with the following scenario, where one of the sub-modules in the aggregate list is an SBT plugin, but the aggregate project is cross built for 2.10.x and 2.11.x.
The exclusion logic works well in general, except for any command that begins with a +. This looks like an SBT bug: when the following block is triggered with, say, +publish, the compilation and doc generation in the SBT plugin fail for Scala 2.11, even though the module shouldn't be included in the first place.
Somehow SBT ignores that, and I've failed to find a different solution for said problem.
How can one define either an exclusion filter or module exclusion conditional on the Scala version being 2.10.x in such a way that it doesn't cause problems with + commands?
lazy val sbtPlugin = (project in file("projectx-sbt"))
  .settings(sharedSettings: _*)
  .settings(
    scalaVersion := "2.10.6",
    publish := {
      CrossVersion.partialVersion(scalaVersion.value).map {
        case (2, scalaMajor) if scalaMajor >= 11 => false
        case _ => true
      }
    },
    publishMavenStyle := false,
    excludeFilter := {
      CrossVersion.partialVersion(scalaVersion.value) match {
        case Some((2, scalaMajor)) if scalaMajor >= 11 => NothingFilter
        case _ => AllPassFilter
      }
    },
    sbtPlugin := true,
    ..
  )
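One alternative worth trying (a sketch, not verified against this build): rather than fighting the exclusion filter, pin the plugin module's crossScalaVersions to 2.10 only. In later sbt versions, + iterates over each subproject's own crossScalaVersions, so +publish would simply skip 2.11 for this module:

```scala
// Sketch: restrict cross-building for the plugin module only
lazy val projectxSbt = (project in file("projectx-sbt"))
  .settings(sharedSettings: _*)
  .settings(
    sbtPlugin := true,
    scalaVersion := "2.10.6",
    crossScalaVersions := Seq("2.10.6") // `+publish` builds this module for 2.10 only
  )
```

Whether this helps depends on the sbt version in use; older 0.13.x releases applied the root project's crossScalaVersions to every aggregated module.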
I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings (
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands which will compile only some files inside this project. The testScalalib command added above for instance is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._
  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(
      _ filter (f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)
  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations like compile and test, and have the appropriate settings values that would suit your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands; I would create an extra configuration, as #JacekLaskowski suggested, based on the answer he cited.
This is how you can do it (using sbt 0.13.2 and Build.scala; you could of course do the same in build.sbt, or in older sbt versions with different syntax):
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile, everything will get compiled, but when you call api:compile, only the classes matching the filter predicate.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or an includeFilter.
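For instance (a sketch with hypothetical paths), the Api configuration could read from its own source directory instead of filtering files out of the shared one:

```scala
// Sketch: give the Api config a dedicated source tree
unmanagedSourceDirectories in Api := Seq(baseDirectory.value / "src" / "api" / "scala"),
includeFilter in (Api, unmanagedSources) := "*.scala"
```

This keeps the two compilation scopes physically separated, so no per-file predicate is needed at all.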
I have a particular task I'd like to automate as part of a build process, but I'm stuck at the grammar stage with sbt. I'm trying to do a hello-world-ish task using two local projects, one the plugin and one a test using that plugin, but I can't get the new task in the plugin (sampleIntTask) to be available when using sbt on the test project.
I have the following in the filesystem:
/plugin/
  Plugin.scala
  build.sbt
/test-using-plugin/
  build.sbt
  /project/plugins.sbt
For my hello-world-ish plugin, in Plugin.scala:
import sbt._
import Keys._

object MyPlugin extends Plugin {
  val sampleIntTask = taskKey[Int]("sum 1 and 2")
  sampleIntTask := {
    val sum = 1 + 2
    println("sum: " + sum)
    sum
  }
}
in plugin/build.sbt:
sbtPlugin := true
name := "myPlugin"
version := "0.1"
scalaVersion := "2.10.3"
and for testing it: in test-using-plugin/build.sbt:
name := "test-test-test"
version := "0.1"
scalaVersion := "2.10.3"
and in test-using-plugin/project/plugins.sbt:
lazy val root = project.in( file(".") ).dependsOn( testPlugin )
lazy val testPlugin = file("/Users/cap10/gitprojects/processing")
When I run sbt sampleIntTask in the test project, I get:
[info] Set current project to test-using-plugin (in build ...)
> sampleIntTask
[error] Not a valid command: sampleIntTask
[error] Not a valid project ID: sampleIntTask
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: sampleIntTask (similar: compileInputs)
[error] sampleIntTask
[error] ^
I feel like this is about the right level of complexity for this test (define plugin project config, define plugin project behavior, define test project config, add dependency on plugin project), but I'd be unsurprised if I'm totally off base on the grammar, as I can't make heads or tails of the sbt intro.
build.sbt
If you do not need to share the settings across multiple builds, you can just add your settings to test-using-plugin/custom.sbt:
val sampleIntTask = taskKey[Int]("sum 1 and 2")

sampleIntTask := {
  val sum = 1 + 2
  println("sum: " + sum)
  sum
}
and forget about the plugin.
Local plugin way
I haven't tested the other parts, but your Plugin.scala is wrong.
The setting expression needs to be in a setting sequence:
import sbt._
import Keys._

object MyPlugin extends Plugin {
  val sampleIntTask = taskKey[Int]("sum 1 and 2")
  lazy val baseMyPluginSettings: Seq[sbt.Def.Setting[_]] = Seq(
    sampleIntTask := {
      val sum = 1 + 2
      println("sum: " + sum)
      sum
    }
  )
  lazy val myPluginSettings: Seq[sbt.Def.Setting[_]] = baseMyPluginSettings
}
And in your test-using-plugin/build.sbt add:
myPluginSettings
If you have to share settings across builds, you can make a plugin like this or put them in the global sbt file. The use of the global sbt file should be limited to user-specific settings and commands, so that's out. Personally, I would publish the plugin locally using publishLocal so it doesn't depend on a specific file path. You can use the locally published plugin like any other plugin:
addSbtPlugin("com.example" % "myPlugin" % "0.1" changing())
By using a "-SNAPSHOT" version or by calling changing(), sbt will check for the latest version.
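As a sketch of the publishLocal route (the organization com.example is assumed to match the addSbtPlugin coordinates), the plugin's own build.sbt needs full coordinates:

```scala
// plugin/build.sbt (sketch)
sbtPlugin := true
organization := "com.example"   // must match the addSbtPlugin coordinates
name := "myPlugin"
version := "0.1-SNAPSHOT"       // SNAPSHOT so consumers pick up republished builds
```

Then run sbt publishLocal in the plugin project, and reference it from test-using-plugin/project/plugins.sbt with the addSbtPlugin line above.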