I have developed a custom sbt plugin that imports sbt-native-packager and a bunch of other plugins.
This helps bootstrap a Scala project, easing the process of importing all the plugins we use across our projects.
I also have custom tasks for custom purposes, such as substituteTemplates, which implements custom behaviour.
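For illustration, such a task key might be declared roughly like this (a sketch; the Seq[File] result type is an assumption, since the question doesn't show the actual declaration):
val substituteTemplates = taskKey[Seq[File]]("Substitutes placeholders in template files") // hypothetical signature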
I'm also changing the behaviour of the plugin I'm importing to cope with my customizations, such as:
packageZipTarball in Universal <<= (packageZipTarball in Universal) dependsOn substituteTemplates
Now, everything works, but the user of my plugin has to do:
val example = (project in file(".")).enablePlugins(JavaAppPackaging).settings(...
If he does not, the build will not compile, because sbt finds references to things (mappings in Universal and such...) that do not exist.
Here is how I tried to solve it.
TAKE 1
Now, I'd like to be able to cope with the fact that some plugins need to be explicitly enabled. If a plugin (for example JavaAppPackaging) is enabled, I make a bunch of other changes; otherwise I leave the settings untouched.
At first I tried a custom setting key, such as:
val enableJavaAppPackaging = SettingKey[Boolean]("enableJavaAppPackaging","")
then, I tried to use it:
object MyWrapperPlugin extends AutoPlugin {
  override def projectSettings: Seq[Setting[_]] =
    if (enableJavaAppPackaging.value) {
      Seq(
        packageZipTarball in Universal <<= (packageZipTarball in Universal) dependsOn substituteTemplates,
        mappings in Universal <++= templates.map {
          _.keySet.map { k => file(s"target/templates/$k") -> k }.toList
        }
        // a lot of other stuff....
      )
    } else Seq.empty
}
but this cannot work as sbt complains:
`value` can only be used within a task or setting macro, such as :=, +=, ++=, Def.task, or Def.setting
Is there a way to do this kind of logic that is also OK for sbt?
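For what it's worth, the quoted error only forbids branching around the whole settings list; a condition on the setting's value may legally live inside an individual setting's macro body. A sketch of that per-key style, reusing the keys from the question (this sidesteps the macro error only, not the plugin-enabling problem):
// legal: the if sits inside the ++= macro, where .value is allowed
mappings in Universal ++= {
  if (enableJavaAppPackaging.value)
    templates.value.keySet.map(k => file(s"target/templates/$k") -> k).toList
  else Nil
}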
TAKE 2
If I cannot do this, I'd at least like to turn on the JavaAppPackaging plugin by default, so there is no if/else-ing at all.
object MyWrapperPlugin extends AutoPlugin {
override def requires: Plugins = AssemblyPlugin && DependencyGraphPlugin && JavaAppPackaging
but this does not work either.
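For comparison, the usual shape of such a wrapper plugin is to declare the required plugins and have the user enable only the wrapper; a minimal sketch (the JavaAppPackaging import path is sbt-native-packager's; whether this resolves the failure above depends on what exactly went wrong in TAKE 2):
import sbt._
import com.typesafe.sbt.packager.archetypes.JavaAppPackaging

object MyWrapperPlugin extends AutoPlugin {
  // the user writes enablePlugins(MyWrapperPlugin); sbt then enables
  // JavaAppPackaging as well, so its keys exist when these settings load
  override def requires: Plugins = JavaAppPackaging
  override def trigger = noTrigger
}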
Related
I have a bunch of SBT 0.13 project definitions that look like this:
lazy val coreBase = crossProject.crossType(CrossType.Pure).in(file("core"))
  .settings(...)
  .jvmConfigure(_.copy(id = "core"))
  .jsConfigure(_.copy(id = "coreJS"))
lazy val core = coreBase.jvm
lazy val coreJS = coreBase.js
(Mostly because I'm resentful about having to maintain Scala.js builds and don't want to have to type the JVM suffix every time I'm changing projects, etc.)
This doesn't compile in SBT 1.0 because Project no longer has a copy method.
Okay, let's check the migration guide.
Many of the case classes are replaced with pseudo case classes generated using Contraband. Migrate .copy(foo = xxx) to withFoo(xxx).
Cool, let's try it.
build.sbt:100: error: value withId is not a member of sbt.Project
.jvmConfigure(_.withId("core"))
^
So I asked on Gitter and got crickets.
The links for the 1.0 API docs actually point to something now, which is nice, but they're not very helpful in this case, and trying to read the SBT source gives me a headache. I'm not in a rush to update to 1.0, but I'm going to have to at some point, I guess, and maybe some helpful person will have answered this by then.
(This answer has been edited with information about sbt 1.1.0+ and sbt-crossproject 0.3.1+, which significantly simplify the whole thing.)
With sbt 1.1.0 and later, you can use .withId("core"). But there's better with sbt-crossproject 0.3.1+, see below.
I don't know about changing the ID of a Project, but here is also a completely different way to solve your original issue, i.e., have core/coreJS instead of coreJVM/coreJS. The idea is to customize crossProject to use the IDs you want to begin with.
First, you'll need to use sbt-crossproject. It is the new "standard" for compilation across several platforms, co-designed by #densh from Scala Native and myself (from Scala.js). Scala.js 1.x will always use sbt-crossproject, but it is also possible to use sbt-crossproject with Scala.js 0.6.x. For that, follow the instructions in the readme. In particular, don't forget the "shadowing" part:
// Shadow sbt-scalajs' crossProject and CrossType from Scala.js 0.6.x
import sbtcrossproject.{crossProject, CrossType}
sbt-crossproject is more flexible than Scala.js' hard-coded crossProject. This means you can customize it more easily. In particular, it has a generic notion of Platform, defining how any given platform behaves.
For a cross JVM/JS project, the new-style crossProject invocation would be
lazy val coreBase = crossProject(JVMPlatform, JSPlatform)
  .crossType(CrossType.Pure)
  .in(file("core"))
  .settings(...)
  .jvmConfigure(_.withId("core"))
  .jsConfigure(_.withId("coreJS"))
lazy val core = coreBase.jvm
lazy val coreJS = coreBase.js
Starting with sbt-crossproject 0.3.1, you can simply tell it not to add the platform suffix for one of your platforms. In your case, you want to avoid the suffix for the JVM platform, so you would write:
lazy val coreBase = crossProject(JVMPlatform, JSPlatform)
  .withoutSuffixFor(JVMPlatform)
  .crossType(CrossType.Pure)
  ...
lazy val core = coreBase.jvm
lazy val coreJS = coreBase.js
and that's all you need to do!
Old answer, applicable to sbt-crossproject 0.3.0 and before
JVMPlatform and JSPlatform are not an ADT; they are designed in an OO style. This means you can create your own. In particular, you can create your own JVMPlatformNoSuffix that would do the same as JVMPlatform but without adding a suffix to the project ID:
import sbt._
import sbtcrossproject._
case object JVMPlatformNoSuffix extends Platform {
  def identifier: String = "jvm"
  def sbtSuffix: String = "" // <-- here is the magical empty string
  def enable(project: Project): Project = project

  val crossBinary: CrossVersion = CrossVersion.binary
  val crossFull: CrossVersion = CrossVersion.full
}
Now that's not quite enough yet, because .jvmSettings(...) and friends are defined to act on a JVMPlatform, not on any other Platform such as JVMPlatformNoSuffix. You'll therefore have to redefine that as well:
implicit def JVMNoSuffixCrossProjectBuilderOps(
    builder: CrossProject.Builder): JVMNoSuffixCrossProjectOps =
  new JVMNoSuffixCrossProjectOps(builder)

implicit class JVMNoSuffixCrossProjectOps(project: CrossProject) {
  def jvm: Project = project.projects(JVMPlatformNoSuffix)

  def jvmSettings(ss: Def.SettingsDefinition*): CrossProject =
    jvmConfigure(_.settings(ss: _*))

  def jvmConfigure(transformer: Project => Project): CrossProject =
    project.configurePlatform(JVMPlatformNoSuffix)(transformer)
}
Once you have all of that in your build (hidden away in a project/JVMPlatformNoSuffix.scala in order not to pollute the .sbt file), you can define the above cross-project as:
lazy val coreBase = crossProject(JVMPlatformNoSuffix, JSPlatform)
  .crossType(CrossType.Pure)
  .in(file("core"))
  .settings(...)
lazy val core = coreBase.jvm
lazy val coreJS = coreBase.js
without any need to explicitly patch the project IDs.
I'm working with a huge project with lots of subprojects, some of them with subprojects of their own. On top of that, I'd like some of them to be dynamic - given a List somewhere in the project build, I'd like to create one project for each of the elements.
For those reasons, having to define a lazy val for each project in build.sbt is very cumbersome. Is there another way to declare projects, like an addProject-like method we could call anywhere? Is there some SBT plugin that helps with that?
Sbt uses macros to turn top-level vals into projects, so I don't think you will be able to escape that part. However, you can define all of your build in Project => Project functions (note that you also get composability "for free" with function composition):
def myConf: Project => Project =
  _.enablePlugins(ScalaJSPlugin)
    .settings(scalaVersion := "2.12.0")
Then simply use project.configure(myConf) for single-line project definitions:
lazy val subProject1 = project.configure(myConf)
lazy val subProject2 = project.configure(myConf)
lazy val subProject3 = project.configure(myConf)
lazy val subProject4 = project.configure(myConf)
...
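Since these are plain Project => Project functions, they also compose; for example (jsConf is hypothetical, shown only to illustrate the composition):
def jsConf: Project => Project =
  _.settings(scalaJSUseMainModuleInitializer := true)

// .configure accepts several transformations and applies them in order
lazy val subProjectJs = project.configure(myConf, jsConf)
// equivalently, compose the functions yourself
lazy val subProjectJs2 = project.configure(myConf andThen jsConf)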
Suppose my entire project configuration is this simple build.sbt:
scalaVersion := "2.11.4"
libraryDependencies += "org.scalaz" %% "scalaz-core" % "7.1.0"
And this is my code:
import scalaz.Equal
import scalaz.syntax.equal._
object Foo {
  def whatever[A: Equal](a: A, b: A) = a === b
}
Now when I run sbt doc and open the API docs in the browser, I see the scalaz package in the ScalaDoc root package listing, together with my Foo:
object Foo
package scalaz
I've noticed this with Scalaz before, and I'm not the only one it happens to (see for example the currently published version of the Argonaut API docs). I'm not sure I've seen it happen with any library other than Scalaz.
If I don't actually use anything from Scalaz in my project code, it doesn't show up. The same thing happens on at least 2.10.4 and 2.11.4.
Why is the scalaz package showing up here and how can I make it stop?
I noticed this too. It also happens with the akka.pattern package from Akka, as well as, for example, the upickle package from the upickle project.
Those three packages have two things in common:
They are package objects
They define at least one type in a mixin trait.
So I did a little experiment with two projects:
Project A:
trait SomeFunctionality {
  class P(val s: String)
}
package object projectA extends SomeFunctionality
Project B (depends on Project A):
package projectB
import projectA._
object B extends App {
  val p = new P("Test")
}
And voila: in the ScalaDoc of Project B, two packages appear in the root package:
projectA
projectB
It seems like both criteria above have to be met, since eliminating one solves the issue.
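For instance, inlining the type into the package object (so the second criterion no longer holds) makes the spurious entry disappear in the experiment above:
// Project A without the mixin trait: projectA no longer shows up
// in the root package listing of Project B's ScalaDoc
package object projectA {
  class P(val s: String)
}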
I believe this is a bug in the Scala compiler, so I cannot help you avoid it, since the only way would be to change the sources of scalaz in this case. Changing anything in Project B other than removing all references to Project A didn't help.
Update: It seems like you can instruct the compiler to exclude specific packages from scaladoc.
scaladoc -skip-packages <pack1>:<pack2>:...:<packN> files.scala
So this would be a workaround here.
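In an sbt build like the one in the question, the flag can be passed to the doc task, e.g. (assuming scalaz is the package to hide):
scalacOptions in (Compile, doc) ++= Seq("-skip-packages", "scalaz")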
Scopes matter in sbt, and I'm completely OK with that. But there are also delegation rules that allow you to build a hierarchical structure of settings. I'd like to use them to bring extra settings to more specific scopes.
import sbt._
import Keys._
object TestBuild extends Build {
  val sourceExample = settingKey[Seq[String]]("example source for setting dependency")
  val targetExample = settingKey[Seq[String]]("example of a dependent setting")

  override lazy val settings = super.settings ++ Seq(
    sourceExample := Seq("base"),
    targetExample := "extended" +: sourceExample.value,
    sourceExample in Test += "testing"
  )
}
The example gives me unexpected output:
> show compile:sourceExample
[info] List(base)
> show test:sourceExample
[info] List(base, testing)
> show compile:targetExample
[info] List(extended, base)
> show test:targetExample
[info] List(extended, base)
I expect test:targetExample to be List(extended, base, testing), not List(extended, base). Once I got the result, I immediately figured out why it works as shown: test:targetExample delegates to *:targetExample for the calculated value, but not for the rule that calculates it in the nested scope.
This behavior creates two difficulties for me when writing my own plugin. As a plugin developer, I have the extra work of defining the same rules in every scope. And as a user, I have to memorize the scope definitions of internal tasks to use them correctly.
How can I overcome this inconvenience? I'd like to introduce settings with call-by-name semantics instead of call-by-value. What tricks might work for that?
P.S. libraryDependencies in Test looks much more concise than using % test.
I should make clear that I perfectly understand that sbt derives values just as described in the documentation. It works as its creator intended it to work.
But why should I obey those rules? I find them completely counter-intuitive: sbt introduces inheritance semantics that work unlike inheritance as it is usually defined. When you write
trait A { lazy val x : Int = 5 }
trait B extends A { lazy val y : Int = x * 2}
trait C extends A { override lazy val x : Int = 3 }
you expect (new B with C).y to be 6, not 10. If it actually were 10, as it would be under sbt's delegation rules, knowing that would still let you use this kind of inheritance correctly, but it would leave you with the desire to find more conventional means of implementing inheritance. You might even write your own implementation based on a name-to-value dictionary, and proceed further according to the tenth rule of programming.
So I'm searching for a hack that would bring the inheritance semantics in line with the common one. As a starting point, I might suggest writing a command that scans all settings and pushes them from parents to children explicitly, and then invoking that command automatically each time sbt runs.
But that seems too dirty to me, so I'm curious whether there is a more graceful way to achieve similar semantics.
The reason for the "incorrect" value is that targetExample depends on sourceExample in the Compile scope, as in:
targetExample := "extended" +: sourceExample.value
If it should use the sourceExample value from the Test scope, use in Test to be explicit about your wish, as follows:
targetExample := "extended" +: (sourceExample in Test).value
Use inspect to know the dependency chain.
BTW, I strongly advise using a build.sbt file for such a simple build definition.
You could have default settings and reuse them in different configurations, as described in Plugins Best Practices - Playing nice with configurations. I believe this should give you semantics similar to what you're looking for.
You can define your base settings and reuse them in different configurations.
import sbt._
import Keys._
object TestBuild extends Build {
  val sourceExample = settingKey[Seq[String]]("example source for setting dependency")
  val targetExample = settingKey[Seq[String]]("example of a dependent setting")

  override lazy val settings = super.settings ++
    inConfig(Compile)(basePluginSettings) ++
    inConfig(Test)(basePluginSettings ++ Seq(
      sourceExample += "testing" // we are already "in Test" here
    ))

  lazy val basePluginSettings: Seq[Setting[_]] = Seq(
    sourceExample := Seq("base"),
    targetExample := "extended" +: sourceExample.value
  )
}
PS. Since you're talking about writing your plugin, you may also want to look at the new way of writing sbt plugins, namely AutoPlugin, as the old mechanism is now deprecated.
I often want to do some customization before one of the standard tasks is run. I realize I can make new tasks that execute existing tasks in the order I want, but I find that cumbersome, and the chance that a developer misses that they are supposed to run my-compile instead of compile is big and leads to hard-to-fix errors.
So I want to define a custom task (say prepare-app) and inject it into the dependency tree of the existing tasks (say package-bin), so that every time someone invokes package-bin, my custom task is run right before it.
I tried doing this
def mySettings = {
  inConfig(Compile)(Seq(prepareAppTask <<= packageBin in Compile map { (pkg: File) =>
    // fiddle with the /target folder before package-bin makes it into a jar
  })) ++
  Seq(name := "my project", version := "1.0")
}
lazy val prepareAppTask = TaskKey[Unit]("prepare-app")
but it's not executed automatically by package-bin right before it packages the compile output into a jar. So how do I alter the above code so that it runs at the right time?
More generally, where do I find info about hooking into other tasks like compile, and is there a general way to ensure that your own tasks are run before and after a standard task is invoked?
Extending an existing task is documented in the SBT documentation for Tasks (look at the section Modifying an Existing Task).
Something like this:
compile in Compile <<= (compile in Compile) map { analysis =>
  // what you want to happen after compile goes here
  analysis
}
Actually, there is another way - define your task to depend on compile
prepareAppTask := (whatever you want to do) dependsOn compile
and then modify packageBin to depend on that:
packageBin <<= packageBin dependsOn prepareAppTask
(all of the above non-tested, but the general thrust should work, I hope).
As an update to the previous answer by @Paul Butcher, this can be done in a slightly different way in SBT 1.x versions, since <<= is no longer supported. I took as an example a sample task to run before compilation that I use in one of my projects:
lazy val wsdlImport = TaskKey[Unit]("wsdlImport", "Generates Java classes from WSDL")
wsdlImport := {
  import sys.process._
  "./wsdl/bin/wsdl_import.sh".!
  // or do whatever stuff you need
}
(compile in Compile) := ((compile in Compile) dependsOn wsdlImport).value
This is very similar to how it was done before 1.x.
Also, there is another way suggested by the official SBT docs, which is basically a composition of tasks (instead of a dependency hierarchy). Taking the same example as above:
(compile in Compile) := {
  val w = wsdlImport.value
  val c = (compile in Compile).value
  // you can add more tasks to the composition or perform some actions with them
  c
}
This feels like it gives more flexibility in some cases, though to me the first example looks a bit neater.
Tested on SBT 1.2.3 but should work with other 1.x as well.
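For completeness: on sbt 1.1+ the same wiring is usually written with the slash syntax, which is equivalent to the in-style examples above:
Compile / compile := ((Compile / compile) dependsOn wsdlImport).value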