sbt - deep child modules - scala

I'm new to sbt and I want to reproduce a complex project structure with many nested modules.
For example, I have the following structure:
.
|_ build.sbt
|_ web
|  |_ api
|  |_ dto
|_ domain
build.sbt is as follows:
name := "myProject"
version := "1.0"
scalaVersion := "2.12.4"
resolvers += Resolver.sonatypeRepo("public")
libraryDependencies += "com.typesafe.play" %% "play" % "2.6.10"
lazy val commonSettings = Seq(
organization := "com.example",
version := "0.1",
scalaVersion := "2.12.4"
)
// root module
lazy val root = (project in file("."))
.aggregate(domain, web)
// domain module
lazy val domain = project.settings(commonSettings)
// web module
lazy val web = project.settings(
commonSettings,
libraryDependencies := Seq("com.typesafe.play" %% "play" % "2.6.10"),
name := "myproj-web"
).dependsOn(domain)
// web api module
lazy val webApi = (project in file("./web/api")).settings(
commonSettings,
libraryDependencies := Seq("com.typesafe.play" %% "play" % "2.6.10"),
name := "myproj-web-api"
).dependsOn(domain)
The first problem is that I can't access my libraries in web/api, though I can in web/.
The second problem is that I don't like file("./web/api"). Is it possible to make sbt understand nested folders the same way it understands top-level folders (like web or domain)?
Also, is it possible to have a build.sbt for each module, for example for web to contain the build files for api and dto, while preserving the aggregation so that building only the root project still builds all the other projects?
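For reference, a minimal sketch of one way such a layout is usually wired up (not a verified drop-in replacement; the webDto name is illustrative, and paths given to file(...) are resolved against the root build, so the "./" prefix can simply be dropped):
lazy val root = (project in file("."))
  .aggregate(domain, web, webApi, webDto)

lazy val domain = project.settings(commonSettings)

lazy val web = (project in file("web"))
  .settings(commonSettings, name := "myproj-web")
  .dependsOn(domain)

// nested folders work the same as top-level ones: just pass the relative path
lazy val webApi = (project in file("web/api"))
  .settings(commonSettings, name := "myproj-web-api")
  .dependsOn(domain)

lazy val webDto = (project in file("web/dto"))
  .settings(commonSettings)
  .dependsOn(domain)

// web/api/build.sbt and web/dto/build.sbt may then hold module-specific settings
// (e.g. libraryDependencies); the project declarations and the aggregate wiring
// above stay in the root build.sbt.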

Related

Can't resolve docker related sbt tags

I'm trying to add sbt-docker to the sbt build of my Play website, but I'm running into an issue: for some reason none of the Docker-related settings at the bottom resolve.
project/plugins.sbt
logLevel := Level.Warn
resolvers ++= Seq(
  "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.9")
build.sbt
name := "personal_site"
version := "1.1"
lazy val `personal_site` = (project in file(".")).enablePlugins(PlayScala,DockerPlugin)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(jdbc, cache, ws, specs2 % Test)

unmanagedResourceDirectories in Test <+= baseDirectory(_ / "target/web/public/test")

resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"

dockerfile in docker := {
  val targetDir = "/usr/src"
  new Dockerfile {
    from("flurdy/activator")
    //More goes here
  }
}

imageNames in docker := Seq(
  // Sets the latest tag
  ImageName(s"${name.value}:latest"),

  // Sets a name with a tag that contains the project version
  ImageName(
    namespace = None,
    repository = name.value,
    tag = Some("v" + version.value)
  )
)
Here's an image of what it looks like in IntelliJ
I've also tried adding addSbtPlugin("se.marcuslonnberg" % "sbt-docker" % "1.4.0") to my project/plugins.sbt but I get this error about DockerPlugin being imported twice.
~/Sync/Projects/Programming/Personal_Site (master ✘)✹ ᐅ sbt clean
[info] Loading project definition from /home/ryan/Sync/Projects/Programming/Personal_Site/project
/home/ryan/Sync/Projects/Programming/Personal_Site/build.sbt:5: error: reference to DockerPlugin is ambiguous;
it is imported twice in the same scope by
import _root_.sbtdocker.DockerPlugin
and import _root_.com.typesafe.sbt.packager.docker.DockerPlugin
lazy val `personal_site` = (project in file(".")).enablePlugins(PlayScala,DockerPlugin)
Try changing your build.sbt config to this.
lazy val root = (project in file(".")).enablePlugins(sbtdocker.DockerPlugin, PlayScala)
This removes the ambiguity by using the fully qualified name of DockerPlugin; sbt-native-packager uses the same name for its Docker plugin, I believe.
It might be worth raising a GitHub issue on the author's repo so they can document this in the project docs.
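For completeness, a sketch of how the two files might look once both plugins are declared (versions taken from the question; treat this as an outline rather than a verified configuration):
// project/plugins.sbt
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.9")
addSbtPlugin("se.marcuslonnberg" % "sbt-docker" % "1.4.0")

// build.sbt -- the fully qualified sbtdocker.DockerPlugin avoids the clash with
// the DockerPlugin that sbt-native-packager (pulled in by the Play plugin) also defines
lazy val `personal_site` = (project in file("."))
  .enablePlugins(PlayScala, sbtdocker.DockerPlugin)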

SBT: How to define dependencies of subprojects in subprojects' build.sbt files?

The following build.sbt file works, but it defines the dependencies of all subprojects:
name := "myproject"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"org.scalafx" %% "scalafx" % "8.0.60-R9"
)
lazy val aLib = (project in file("lib/a"))
lazy val bLib = (project in file("lib/b"))
.dependsOn(aLib)
.dependsOn(cLib)
lazy val cLib = (project in file("lib/c"))
.dependsOn(aLib)
lazy val myApp = (project in file("myapp"))
.dependsOn(aLib)
.dependsOn(bLib)
.dependsOn(cLib)
.aggregate(aLib, bLib, cLib)
Since each subproject (directories lib/a, lib/b, lib/c, myapp) has its own build.sbt file, I would like to use those build files to define the individual dependencies of each project.
I tried to move the dependsOn/aggregate statements to the subprojects' build files, but I am not able to make it work that way. What is the recommended way?
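For reference, a common split is sketched below (the dependency shown in the subproject file is only illustrative): per-project settings such as libraryDependencies can move into each subproject's own build.sbt, while dependsOn and aggregate have to stay where the Project values are declared, i.e. in the root build.
// lib/b/build.sbt -- settings only, no project wiring
libraryDependencies += "org.scalafx" %% "scalafx" % "8.0.60-R9"

// root build.sbt -- keeps the project declarations and the wiring
lazy val aLib = (project in file("lib/a"))
lazy val bLib = (project in file("lib/b")).dependsOn(aLib, cLib)
lazy val cLib = (project in file("lib/c")).dependsOn(aLib)

lazy val myApp = (project in file("myapp"))
  .dependsOn(aLib, bLib, cLib)
  .aggregate(aLib, bLib, cLib)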

Adding module dependency information in sbt's build.sbt file

I have a multi-module project in IntelliJ; as the screen capture shows, the contextProcessor module depends on the contextSummary module.
IntelliJ takes care of everything once I set up the dependencies in Project Structure.
However, when I run sbt test with the following setup in build.sbt, I get an error complaining that it can't find the packages in the contextSummary module.
name := "contextProcessor"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
How do I tell sbt where to find the missing modules?
I could use the build.sbt file in the main root directory.
lazy val root = (project in file(".")).aggregate(contextSummary, contextProcessor)
lazy val contextSummary = project
lazy val contextProcessor = project.dependsOn(contextSummary)
Reference: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html
For testing only one project, I can use the project command in sbt:
> sbt
[info] Set current project to root (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> project contextProcessor
[info] Set current project to contextProcessor (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> test
For batch mode, as in "How to pass command line args to program in SBT 0.13.1?":
sbt "project contextProcessor" test
I think a simple build.sbt might not be enough for that.
You would need to create a more sophisticated project/Build.scala like this:
import sbt._
import sbt.Keys._

object Build extends Build {

  lazy val root = Project(
    id = "root",
    base = file("."),
    aggregate = Seq(module1, module2)
  )

  lazy val module1 = Project(
    id = "module1",
    base = file("module1-folder"),
    settings = Seq(
      name := "Module 1",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )

  lazy val module2 = Project(
    id = "module2",
    base = file("module2-folder"),
    dependencies = Seq(module1),
    settings = Seq(
      name := "Module 2",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )
}

How to set up jacoco4sbt to process classes in main and submodules in Play?

I'm having some problems making jacoco4sbt work with my Play 2.3.4 project.
My project is composed of three submodules (common, api and frontend) and has no code in the root app folder. When I run Jacoco it does not find the submodules' classes.
Inspecting target/scala-VERSION/classes I only find some routing classes (which is in fact the only code I have in my "root" project, although I expected that, because I aggregate all those projects, their classes would end up there too).
If I copy the classes from MODULE_NAME/target/scala-VERSION/classes to target/scala-VERSION/classes and then run Jacoco I get the expected result.
So what is the best way to make it work? I can't find any config in jacoco4sbt to specify additional class locations.
My build.sbt file
import Keys._
// Dummy value to deal with bug in sbt 0.13.5
val k = 0
name := "PlayApp"
version := "0.5.0"
// omitted resolvers part
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  "com.edulify" %% "play-hikaricp" % "1.5.0" exclude("com.jolbox", "bonecp"),
  "com.novocode" % "junit-interface" % "0.11" % "test"
)
lazy val common = project.in(file("common")).enablePlugins(PlayJava)
lazy val frontend = project.in(file("frontend")).enablePlugins(PlayJava).dependsOn(common)
lazy val api = project.in(file("api")).enablePlugins(PlayJava).dependsOn(common)
lazy val main = project.in(file(".")).enablePlugins(PlayJava)
  .aggregate(frontend, api)
  .dependsOn(frontend, api)
parallelExecution in Test := false
javaOptions in Test += "-Dconfig.resource=test.conf"
jacoco.sbt
import de.johoop.jacoco4sbt._
import JacocoPlugin._
jacoco.settings
Keys.fork in jacoco.Config := true
parallelExecution in jacoco.Config := false
jacoco.outputDirectory in jacoco.Config := file("target/jacoco")
jacoco.reportFormats in jacoco.Config := Seq(XMLReport("utf-8"), HTMLReport("utf-8"))
jacoco.excludes in jacoco.Config := Seq("views*", "*Routes*", "controllers*routes*", "controllers*Reverse*", "controllers*javascript*", "controller*ref*")
javaOptions in jacoco.Config += "-Dconfig.resource=test.conf"
Add jacoco.sbt to every subproject with the following content:
jacoco.settings
P.S. I've been looking for a way to have jacoco.settings applied to every subproject from the top-level root build.sbt, but to no avail.
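Concretely, with the question's modules that means three one-line files (common/jacoco.sbt, api/jacoco.sbt and frontend/jacoco.sbt), each containing just jacoco.settings.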

Factoring libraryDependencies in multi project Build.sbt

I'm trying to write a concise multi-project Build.sbt, so I tried to put all library dependencies in the root project and then make the others depend on it. My Build.sbt looks like the following:
object KataBuild extends Build {

  lazy val fizzBuzz = Project(
    id = "fizzBuzz",
    base = file("fizzBuzz"),
    settings = Project.defaultSettings ++ Seq(
      name := "fizzBuzz",
      version := "1.0",
      scalaVersion := "2.10.3"
    )
  )

  lazy val kata = Project(
    id = "scala-kata",
    base = file("."),
    settings = Project.defaultSettings ++ Seq(
      name := "scala-kata",
      version := "1.0",
      scalaVersion := "2.10.3",
      libraryDependencies ++= Seq(
        "org.scalatest" %% "scalatest" % "2.1.0" % "test"
      )
    )
  ) aggregate(fizzBuzz)

  fizzBuzz dependsOn(kata)
}
But running test from the main project (scala-kata) fails to build the tests for fizzBuzz. What am I missing?
Your question is similar to this one. In short, fizzBuzz.dependsOn(kata) means that its compile configuration depends on kata's compile configuration, but you want to link the test configurations.
The 'Per-configuration classpath dependencies' section of the sbt docs shows how to make a test->test dependency instead.
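A minimal sketch of that per-configuration dependency, reusing the question's project names (not a verified drop-in change):
// fizzBuzz's test configuration now depends on kata's test configuration
lazy val fizzBuzz = Project(
  id = "fizzBuzz",
  base = file("fizzBuzz"),
  settings = Project.defaultSettings ++ Seq(
    name := "fizzBuzz",
    version := "1.0",
    scalaVersion := "2.10.3"
  )
).dependsOn(kata % "test->test")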
However, if you are not going to use kata's test sources but are just looking for a way to include ScalaTest in fizzBuzz, just add it explicitly to fizzBuzz's library dependencies, too. You can define a helper value:
lazy val scalaTest = "org.scalatest" %% "scalatest" % "2.1.0" % "test"
Then you can add it to each subproject's library dependencies (libraryDependencies += scalaTest).
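For example, fizzBuzz's definition from the question would pick it up like this (a sketch):
lazy val fizzBuzz = Project(
  id = "fizzBuzz",
  base = file("fizzBuzz"),
  settings = Project.defaultSettings ++ Seq(
    name := "fizzBuzz",
    version := "1.0",
    scalaVersion := "2.10.3",
    libraryDependencies += scalaTest // shared helper value defined once in the Build object
  )
)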