How to override libraryDependencies in an sbt plugin?

How would one override libraryDependencies?
I tried:
Keys.libraryDependencies in Compile := {
  val libraryDependencies = (Keys.libraryDependencies in Compile).value
  val allLibraries = UpdateDependencies(libraryDependencies)
  allLibraries
}
That seems to work: when I add a print statement, allLibraries is correct.
However, in the next step, it doesn't seem to have the right values:
Keys.update in Compile := Def.taskDyn {
  val u = (Keys.update in Compile).value
  Def.task {
    val allModules = u.configurations.flatMap(_.allModules)
    streams.value.log.info(s"Read ${allModules.size} modules:")
    u
  }
}.value
The log statement only reports a few modules instead of all the ones I added in the previous step.
Anyone have a solution? Thanks!

So I understand where my problem was: I was not understanding correctly how settings and tasks work together.
Settings are evaluated only once, when sbt starts.
Tasks are evaluated once each time sbt runs a task or command that requires them.
So you cannot read and then rewrite settings like that.
It was so convoluted, I even wrote a whole article about it.
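For reference, a minimal sketch of the setting-level alternative (assuming UpdateDependencies is a pure function over the dependency list): the ~= operator transforms a setting in place while the build is loading, before any task ever reads it, so no read-then-rewrite is needed.
// Sketch: rewrite the setting itself during load instead of reading it
// back inside a task. UpdateDependencies is the question's own function.
Keys.libraryDependencies in Compile ~= { deps =>
  UpdateDependencies(deps)
}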

Related

How to filter/disable a scalac option for all subprojects in SBT in a DRY way

My project has multiple subprojects, and I use sbt-tpolecat in this project.
I use a Java framework in my code. This framework uses fluent interfaces heavily, so I need to suppress many "discarded non-Unit value" warnings in my code.
sbt-tpolecat provides a lot of useful scalac options out of the box, and I just want to exclude the -Wvalue-discard scalac option for my use case.
The problem is that I have 4-5 subprojects in this project, and right now I need to add the below to every subproject's settings:
sub_project_name.settings(
  scalacOptions ~= (_.filterNot(Set("-Wvalue-discard")))
)

// or

sub_project_name.settings(valueDiscardSetting)

lazy val valueDiscardSetting =
  Seq(scalacOptions ~= (_.filterNot(Set("-Wvalue-discard"))))
Is there a way to exclude this option in all subprojects in a DRY way?
My current subprojects hierarchy is similar to this:
App -> Frontend -> Common
    -> Backend  -> Common
Common settings val
There is a common practice of factoring out common settings in multi-project builds: define a sequence of common settings in a val and add it to each project. Fewer concepts to learn that way.
For example:
lazy val commonSettings = Seq(
  scalacOptions ~= (_.filterNot(Set("-Wvalue-discard"))),
  ...
)

lazy val util = (project in file("util")).settings(commonSettings)
lazy val core = (project in file("core")).settings(commonSettings)
Common settings auto plugin
Auto plugins can set settings for every project. Create the following small plugin under project/CommonSettingsPlugin.scala:
object CommonSettingsPlugin extends AutoPlugin {
  override def requires = plugins.JvmPlugin
  override def trigger = allRequirements
  override lazy val projectSettings = Seq(
    scalacOptions ~= (_.filterNot(Set("-Wvalue-discard")))
  )
}
The override
override def requires = plugins.JvmPlugin
together with trigger = allRequirements should effectively enable the plugin on every JVM project without having to explicitly call enablePlugins in build.sbt.
Override settings with onLoad
onLoad happens at the end, after all projects are built and loaded.
lazy val settingsAlreadyOverridden = SettingKey[Boolean]("settingsAlreadyOverridden", "Has overrideSettings command already run?")

settingsAlreadyOverridden := false

commands += Command.command("removeScalacOptions") { state =>
  val extracted = Project.extract(state)
  // A command cannot call .value; read the setting through Project.extract instead.
  if (extracted.get(settingsAlreadyOverridden)) {
    state
  } else {
    extracted.appendWithSession(
      Seq(
        settingsAlreadyOverridden := true,
        scalacOptions ~= (_.filterNot(Set("-Wvalue-discard")))
      ),
      state
    )
  }
}

onLoad in Global := (onLoad in Global).value andThen ("removeScalacOptions" :: _)
Also consider how they addressed the problem in community-build via removeScalacOptions.

sbt: publish generated sources

I have a project where part of the sources are generated (sourceGenerators in Compile). I noticed that (reasonably, in most scenarios) these sources are not published with publishLocal or publishSigned. In this case this is unfortunate, because when you use this project/library as a dependency, you cannot look up the sources, for example in IntelliJ, even if the other sources of the project have been downloaded.
Can I configure sbt's publishing settings to include the generated sources in the Maven -sources.jar?
So, just to be complete, this was my solution based on @pfn's answer:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
}
mappings in (Compile, packageSrc) := (managedSources in Compile).value map (s => (s, s.getName))
Just like @0__'s answer, but ported to the 'new' sbt syntax, i.e. without deprecation warnings.
Compile/packageSrc/mappings ++= {
  val base = (Compile/sourceManaged).value
  val files = (Compile/managedSources).value
  files.map(f => (f, f.relativeTo(base).get.getPath))
}
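To sanity-check what will end up in the -sources.jar, running show Compile/packageSrc/mappings in the sbt shell should list each generated file together with the path it gets inside the artifact.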

sbt test:doc Could not find any member to link

I'm attempting to run sbt test:doc and I'm seeing a number of warnings similar to below:
[warn] /Users/tleese/code/my/stuff/src/test/scala/com/my/stuff/common/tests/util/NumberExtractorsSpecs.scala:9: Could not find any member to link for "com.my.stuff.common.util.IntExtractor".
The problem appears to be that Scaladoc references from test sources to main sources are not able to link correctly. Any idea what I might be doing wrong or need to configure?
Below are the relevant sections of my Build.scala:
val docScalacOptions = Seq("-groups", "-implicits", "-external-urls:[urls]")
scalacOptions in (Compile, doc) ++= docScalacOptions
scalacOptions in (Test, doc) ++= docScalacOptions
autoAPIMappings := true
Not sure if this is a satisfactory solution, but...
Scaladoc currently expects pairs of jar and URL to get the external linking to work. You can force sbt to link internal dependencies as JARs by setting exportJars. Compare the value of
$ show test:fullClasspath
before and after setting exportJars. Next, grab the name of the JAR that's being used and link it to the URL you'll be uploading it to.
scalaVersion := "2.11.0"

autoAPIMappings := true

exportJars := true

scalacOptions in (Test, doc) ++= Opts.doc.externalAPI(
  (file(s"${(packageBin in Compile).value}") -> url("http://example.com/")) :: Nil
)
Now I see that test:doc generates Scaladoc with links to http://example.com/index.html#foo.IntExtractor from my foo.IntExtractor.
Using ideas from Eugene's answer, I made the following snippet.
It uses the apiMappings sbt key, as advised in the sbt manual.
Unfortunately the manual doesn't explain how to deal with managed dependencies, even though the subsection title says so.
// External documentation

/* You can print the computed classpath with `show compile:fullClasspath`.
 * From that list you can check the jar name (which is not so obvious with Play dependencies etc.).
 */
val documentationSettings = Seq(
  autoAPIMappings := true,
  apiMappings ++= {
    // Look up the path to the jar (it's probably somewhere under ~/.ivy2/cache) on the computed classpath
    val classpath = (fullClasspath in Compile).value
    def findJar(name: String): File = {
      val regex = ("/" + name + "[^/]*\\.jar$").r
      classpath.find { jar => regex.findFirstIn(jar.data.toString).nonEmpty }.get.data // fail hard if not found
    }
    // Define external documentation paths
    // (currentScalaVersion is a val defined elsewhere in the original build)
    Map(
      findJar("scala-library") -> url("http://scala-lang.org/api/" + currentScalaVersion + "/"),
      findJar("play-json") -> url("https://playframework.com/documentation/2.3.x/api/scala/index.html")
    )
  }
)
This is a modification of the answer by @phadej. Unfortunately, that answer only works on Unix/Linux, because it assumes that the path separator is a /. On Windows, the path separator is \.
The following works on all platforms, and is slightly more idiomatic IMHO:
/* You can print the classpath with `show compile:fullClassPath` in the SBT REPL.
 * From that list you can find the name of the jar for the managed dependency.
 */
lazy val documentationSettings = Seq(
  autoAPIMappings := true,
  apiMappings ++= {
    // Look up the path to the jar on the classpath
    val classpath = (fullClasspath in Compile).value
    def findJar(nameBeginsWith: String): File = {
      classpath.find { attributed: Attributed[java.io.File] =>
        (attributed.data ** s"$nameBeginsWith*.jar").get.nonEmpty
      }.get.data // fail hard if not found
    }
    // Define external documentation paths
    // (currentScalaVersion is a val defined elsewhere in the original build)
    Map(
      findJar("scala-library") -> url("http://scala-lang.org/api/" + currentScalaVersion + "/"),
      findJar("play-json") -> url("https://playframework.com/documentation/2.3.x/api/scala/index.html")
    )
  }
)

Play framework: Running separate module of multi-module application

I'm trying to create a multi-module application and run one of its modules separately from the others (from another machine).
Project structure looks like this:
      main
     /    \
module1  module2
I want to run module1 as a separate jar file (or is there a better way of doing this?), which I will run from another machine (I want to connect it to the main app using Akka remoting).
What I'm doing:
1. Running the "play dist" command.
2. Unzipping module1.zip from the universal folder.
3. Setting +x mode on the bin/module1 executable.
4. Setting my main class (will paste it below): instead of play.core.server.NettyServer I'm putting my main class: declare -r app_mainclass="module1.foo.Launcher"
5. Running with an external application.conf file.
Here is my main class:
class LauncherActor extends Actor {
  def receive = {
    case a => println(s"Received msg: $a ")
  }
}

object Launcher extends App {
  val system = ActorSystem("testsystem")
  val listener = system.actorOf(Props[LauncherActor], name = "listener")
  println(listener.path)
  listener ! "hi!"
  println("Server ready")
}
Here is the console output:
#pavel bin$ ./module1 -Dconfig.file=/Users/pavel/projects/foobar/conf/application.conf
[WARN] [10/18/2013 18:56:03.036] [main] [EventStream(akka://testsystem)] [akka.event-handlers] config is deprecated, use [akka.loggers]
akka://testsystem/user/listener
Server ready
Received msg: hi!
#pavel bin$
So the system shuts down as soon as it gets to the last line of the main method. If I run this code without Play, it works as expected: the object is loaded and it waits for messages.
Maybe I'm doing something wrong? Or should I set some options in the module1 executable? Other ideas?
Thanks in advance!
Update:
Versions:
Scala - 2.10.3
Play! - 2.2.0
SBT - 0.13.0
Akka - 2.2.1
Java 1.7 and 1.6 (tried both)
Build properties:
lazy val projectSettings = buildSettings ++ play.Project.playScalaSettings ++ Seq(
  resolvers := buildResolvers,
  libraryDependencies ++= dependencies
) ++ Seq(
  scalacOptions += "-language:postfixOps",
  javaOptions in run ++= Seq(
    "-XX:MaxPermSize=1024m",
    "-Xmx4048m"
  ),
  Keys.fork in run := true
)

lazy val common = play.Project("common", buildVersion, dependencies, path = file("modules/common"))

lazy val root = play.Project(appName, buildVersion, settings = projectSettings).settings(
  resolvers ++= buildResolvers
).dependsOn(common, module1, module2).aggregate(common, module1, module2)

lazy val module1 = play.Project("module1", buildVersion, path = file("modules/module1")).dependsOn(common).aggregate(common)

lazy val module2: Project = play.Project("module2", buildVersion, path = file("modules/module2")).dependsOn(common).aggregate(common)
So I found a dirty workaround and I will use it until I find a better solution. In case someone is interested, I've added this code at the bottom of the Server object:
import scala.concurrent.{ Await, Future }
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

val shutdown = Future {
  readLine("Press 'ENTER' key to shutdown")
}.map { q =>
  println("**** Shutting down ****")
  System.exit(0)
}

Await.result(shutdown, 100.days)
And now the system runs until I hit the ENTER key in the console. Dirty, I agree, but I didn't find a better solution.
If something better comes up, of course I will mark it as the answer.
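For what it's worth, a slightly cleaner sketch under the Akka 2.2-era blocking API listed above: block the main thread on the actor system's own termination instead of on stdin, so the process stays alive exactly as long as the actors do.
object Launcher extends App {
  val system = ActorSystem("testsystem")
  val listener = system.actorOf(Props[LauncherActor], name = "listener")
  listener ! "hi!"
  println("Server ready")
  // Keeps the JVM alive until system.shutdown() is invoked,
  // e.g. from an actor reacting to a shutdown message.
  system.awaitTermination()
}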

Parallel execution of tests

I've noticed that sbt is running my specs2 tests in parallel. This seems good, except that one of my tests involves reading and writing from a file and hence fails unpredictably, e.g. see below.
Are there any better options than
setting all tests to run in serial, or
using separate file names and tear-downs for each test?
class WriteAndReadSpec extends Specification {
  val file = new File("testFiles/tmp.txt")

  "WriteAndRead" should {
    "work once" in {
      new FileWriter(file, false).append("Foo").close()
      Source.fromFile(file).getLines().toList(0) must_== "Foo"
    }
    "work twice" in {
      new FileWriter(file, false).append("Bar").close()
      Source.fromFile(file).getLines().toList(0) must_== "Bar"
    }
  }

  trait TearDown extends After {
    def after = if (file.exists) file.delete
  }
}
In addition to what is written about sbt above, you must know that specs2 runs all the examples of your specifications concurrently by default.
You can still declare that, for a given specification, the examples must be executed sequentially. To do that, you simply add sequential at the beginning of your specification:
class WriteAndReadSpec extends Specification {
  val file = new File("testFiles/tmp.txt")
  sequential

  "WriteAndRead" should {
    ...
  }
}
A fixed sequence of tests in a suite can lead to interdependent test cases and a maintenance burden.
I would prefer to test without touching the file system (no matter whether it is business logic or serialization code), or, if that is inevitable (as when testing integration with file feeds), to create temporary files:
// Create temp file.
File temp = File.createTempFile("pattern", ".suffix");
// Delete temp file when program exits.
temp.deleteOnExit();
The wiki link Pablo Fernandez gave in his answer is pretty good, though there's a minor error in the example that might throw one off (being a wiki, I could and did correct it). Here's a project/Build.scala that actually compiles and produces the expected filters, though I didn't actually try it out with tests.
import sbt._
import Keys._

object B extends Build {
  lazy val root =
    Project("root", file("."))
      .configs(Serial)
      .settings(inConfig(Serial)(Defaults.testTasks): _*)
      .settings(
        libraryDependencies ++= specs,
        testOptions in Test := Seq(Tests.Filter(parFilter)),
        testOptions in Serial := Seq(Tests.Filter(serialFilter))
      )
      .settings(parallelExecution in Serial := false)

  def parFilter(name: String): Boolean = !(name startsWith "WriteAndReadSpec")
  def serialFilter(name: String): Boolean = name startsWith "WriteAndReadSpec"

  lazy val Serial = config("serial") extend (Test)

  lazy val specs = Seq(
    "org.specs2" %% "specs2" % "1.6.1",
    "org.specs2" %% "specs2-scalaz-core" % "6.0.1" % "test"
  )
}
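With a setup like this, plain test should run everything except WriteAndReadSpec in parallel, while serial:test should pick up the filtered spec with parallelExecution disabled; show serial:testOptions is a quick way to double-check the filters.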
There seems to be a third option, which is grouping the serial tests in a configuration and running them separately while running the rest in parallel.
Check this wiki, look for "Application to parallel execution".
Other answers explained how to make tests run sequentially.
While those are valid answers, in my opinion it's better to change your tests so that they can run in parallel, if possible.
In your example, use a different file for each test, as sketched below.
If you have a DB involved, use different (or random) users (or whatever isolation you can get), and so on.
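For instance, a minimal sketch of the original WriteAndReadSpec reworked so that each example writes to its own temporary file (the withTempFile helper is hypothetical, not part of the question):
import java.io.{ File, FileWriter }
import scala.io.Source
import org.specs2.mutable.Specification

class WriteAndReadSpec extends Specification {
  // Each example gets its own temp file, so examples no longer race on shared state.
  def withTempFile[A](f: File => A): A = {
    val file = File.createTempFile("write-and-read", ".txt")
    file.deleteOnExit()
    f(file)
  }

  "WriteAndRead" should {
    "work once" in withTempFile { file =>
      new FileWriter(file, false).append("Foo").close()
      Source.fromFile(file).getLines().toList(0) must_== "Foo"
    }
    "work twice" in withTempFile { file =>
      new FileWriter(file, false).append("Bar").close()
      Source.fromFile(file).getLines().toList(0) must_== "Bar"
    }
  }
}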