I'm trying to come up with a structure for a large, multi-module sbt project that satisfies the following requirements:
When the root project is built, dependencies are first resolved from the modules available under the root (i.e. if module A depends on module B version 2, and that is the version of B available under the root, the dependency is satisfied by whatever the build of B produces)
When modules are built individually, dependencies are resolved from repositories (local, cache, remote)
I am aware of two vehicles for declaring dependencies in an sbt project: dependsOn() and the libraryDependencies setting key.
So far, in my naive structure, where all build information for the modules (A, B) was tracked at the root, I simply passed the project references to .dependsOn, and the inter-module dependencies were correctly resolved in the build of R
What I would like to do, is to move/track this relationship in the build.sbt file of the modules themselves, which are then hosted in separate repositories (and pulled back occasionally to an "aggregate" tag of the parent project's repo via git submodule)
I've never had any problem doing this with Maven (I assume because a module's pom can refer to its parent explicitly, and because there is only one way to establish a dependency), but I can't yet wrap my head around how to get it going in sbt
So my question is: will I have to write a custom resolver for this? Is there anything obvious I'm missing here?
Thanks.
I have a similar setup, with an aggregate project containing 100+ sub-projects. Sub-projects also live in their own repositories and can be built/published stand-alone or as part of the aggregate project. I don't need any special resolver for this to work.
I'm just combining both approaches you described:
project A:
organization := "groupId"
version := "1.0.0-SNAPSHOT"
libraryDependencies += "groupId" %% "B" % version.value
project B:
organization := "groupId"
version := "1.0.0-SNAPSHOT"
project R:
lazy val a = (project in file("a")).dependsOn(b)
lazy val b = (project in file("b"))
I noticed that sbt is clever enough not to include the dependency on b twice.
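To make the combination concrete, here is a sketch of what the root project R's build.sbt might look like; the project names and paths are assumptions for illustration, and the substitution behavior described in the comments is how sbt's inter-project resolution is generally understood to work:

```scala
// Hypothetical build.sbt for the root project R.
// Sub-project A declares B in its own libraryDependencies; when built under
// the root, dependsOn(b) makes sbt substitute the local project for the
// external artifact with the matching organization/name, so B is not
// resolved from a repository and is not included twice.
lazy val b = (project in file("b"))   // picks up b's own build.sbt

lazy val a = (project in file("a"))   // picks up a's own build.sbt
  .dependsOn(b)                       // overrides the external dependency on B

lazy val root = (project in file("."))
  .aggregate(a, b)                    // compile/test/publish run on both sub-projects
```

When A is built stand-alone (from its own repository), there is no dependsOn in scope, so the libraryDependencies entry takes effect and B is resolved from a repository as usual.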
Related
I am trying to create a local library which contains a class
myproject.scala:
object test {
def info(message: String): Unit = println(s"INFO: $message")
}
build.sbt:
name := "MyProject"
version := "0.1"
organization := "MyCorp"
scalaVersion := "2.11.0"
sbtVersion := "0.13"
I ran sbt clean compile publishLocal and I see the jar in my local ivy2 directory. What I'm unsure about is how to now use that library in another project.
So I added libraryDependencies += "MyCorp" % "myproject_2.11" % "0.1"
to the second project's build.sbt, and I see it on the classpath when I print it out in the REPL. The problem is when I try
import MyCorp.myproject
I get an error not found. I'm sure I'm missing something simple, but it's driving me nuts.
I ran sbt clean compile and I see the jar in my local ivy2 directory.
That's weird. sbt clean compile does not publish the artifact to the local repository. (Have you copied it there manually?) That should have been done with the publishLocal command, after which the artifact should be available at {path_to_.ivy2}/local/MyCorp/MyProject/0.1/jars/MyProject.jar.
Now in your second project, it can be added as
libraryDependencies += "MyCorp" % "MyProject" % "0.1"
// or in libraryDependencies ++= Seq(...)
Please note that the _2.11 suffix you used in the name depends on how the first project was built, i.e. whether its build was cross-versioned by Scala version. If it was, the suffix will usually be present in the artifact's .jar file name. It is preferable not to hard-code the suffix in the dependency declaration, but to use %% instead, which appends it automatically.
After checking that, also try restarting the sbt CLI, because unfortunately changes in build.sbt are sometimes not picked up on the fly.
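To illustrate the point about %%: assuming the first project was published cross-versioned for Scala 2.11 (the names below are the ones from the question), the following two declarations resolve the same artifact:

```scala
// Explicit suffix: brittle, breaks as soon as the Scala version changes.
libraryDependencies += "MyCorp" % "myproject_2.11" % "0.1"

// %% appends the project's Scala binary version ("_2.11") automatically.
libraryDependencies += "MyCorp" %% "myproject" % "0.1"
```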
Update
I assume it's mycorp.myproject.test, but I tried every possible combination. @Brian
Following the comments, I think there must still be something misconfigured in the project and/or missing from the description.
Assuming there is a file {path/to/project}/src/main/scala/mycorp/myproject/Test.scala, with the following contents:
package mycorp.myproject
object Test {
def info(message: String): Unit = println(s"INFO: $message")
}
When the artifact is published, the .jar file should contain the folders mycorp/myproject with Test.class and Test$.class files.
After adding the .jar to the dependencies of the second project, importing Test into another class should look like:
package mycorp.myproject2
import mycorp.myproject.Test
object AnotherTest extends App {
Test.info("hello")
}
I hope this helps.
End-of-update
I'd like to know how to convert a regular Scala project into an sbt project. I've tried manually creating an sbt file in the root directory, correctly implemented, but IntelliJ still doesn't recognize this as an sbt project, i.e., it won't show the "SBT" option under "View -> Tool Windows".
How should I go about this? What I'm actually attempting to do is to create an empty project with multiple (independent) modules.
From what I've gathered, there seems to be no way to add a module directly with sbt support; am I right?
Thanks
Here is an example of a multi-project build. The root project "aggregates" them all in case you want to compile them all together or package them all together, etc. The "coreLibrary" project depends on the code of "coreA" and "coreB".
import sbt.Keys._
import sbt._
name := "MultiProject"
lazy val root = project.in(file(".")).aggregate(coreA, coreB, coreLibrary)
lazy val coreA = Project("CoreA", file("core-a")).settings(
organization := "org.me",
version := "0.1-SNAPSHOT"
)
lazy val coreB = Project("CoreB", file("core-b")).settings(
organization := "org.me",
libraryDependencies += "org.apache.kafka" %% "kafka" % "0.8.2-beta",
version := "0.3-SNAPSHOT"
)
lazy val coreLibrary = Project("UberCore", file("core-main")).dependsOn(coreA, coreB).settings(
organization := "org.me",
version := "0.2-SNAPSHOT"
)
You can (for example) compile each project from the command line:
>sbt CoreB/compile
Or you can do this interactively:
>sbt
>project CoreB
>compile
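Note that tasks run on the root propagate to all aggregated projects. If you want to opt a particular task out of that propagation, sbt lets you disable aggregation per task; a sketch, reusing the root project from the example above (sbt 0.13-style syntax):

```scala
lazy val root = project.in(file("."))
  .aggregate(coreA, coreB, coreLibrary)
  .settings(
    // Running "publish" on the root will no longer also publish
    // the aggregated projects; compile/test still propagate.
    aggregate in publish := false
  )
```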
I recommend using a single multi-module sbt project. sbt is a great build tool for Scala; you can do a lot of things with it, including checking out a single module from the repository and building it.
sbt
projects
project <helloProject>
Actually, this feature allows multiple people to work on the same project in parallel. Please take a look at this: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html.
I have an SBT Scala multi-project with the following structure:
multiprojectRoot
project/SharedProjectBuildCode.scala
project1
src/sourceFiles
project1-build.sbt
project2
src/sourceFiles
project2-build.sbt
projectN
src/sourceFiles
projectN-build.sbt
multiprojectRoot/project/SharedProjectBuildCode.scala: contains multi-project definitions that use dependsOn to create dependencies on local projects. For example:
lazy val project2 = Project(
...
).dependsOn(project1)
multiprojectRoot/project2/project2-build.sbt: Contains the settings and dependencies for a given project. For example:
name := "project2"
libraryDependencies ++= Seq(
...
"my.company" % "project1" % "1.0"
)
The first dependency on project1 is declared with dependsOn in the SharedProjectBuildCode.scala file, and the second is created in the standalone project2-build.sbt build definition file.
So, the project2 definition contains either:
an ambiguous dependency to project1 or
a double dependency to project1
We want to keep this project structure, because it is the best fit for our current workflow:
Independent .sbt files serve standalone deployment purposes for each project on our continuous delivery server.
Multi-project .scala file with dependsOn is used to facilitate development, allowing us to avoid things such as continuous publishLocal.
We need some way to control such dependency ambiguities. Can you help me?
I think you should have this in SharedProjectBuildCode.scala:
lazy val root = Project(id = "Main-Project",
base = file(".")) aggregate(project1, project2, ...)
lazy val project2 = Project(id = "project2",
base = file("project2")).dependsOn(project1)
...
Then you don't need to add it as a dependency in build.sbt anymore.
I was able to control which dependency set is loaded in each use case by using sbt's build-file loading rules.
When you launch sbt from a given root directory, it looks for *.sbt files in the root directory and for *.scala files in the root/project directory. In a multi-project build, it also reads the .sbt files found in child projects, but it does not use the project/*.scala files of child projects:
.sbt build definition
Multi-project builds
So, I changed my multi-project build the following way:
multiprojectRoot
project/SharedProjectBuildCode.scala
project1
src/sourceFiles
project/DeploymentOnlyCode.scala
project1-build.sbt
project2
src/sourceFiles
project/DeploymentOnlyCode.scala
project2-build.sbt
projectN
src/sourceFiles
project/DeploymentOnlyCode.scala
projectN-build.sbt
This way, depending on the use case I run SBT from the multi-project root or a project internal directory:
Development: sbt is run from the multiprojectRoot directory. It takes advantage of the multi-project build (such as using dependsOn and avoiding continuous publishLocal).
Production: sbt is run from within a concrete project directory, such as multiprojectRoot/project2. This allows the project to be built stand-alone, with all dependencies explicitly external (useful for declaring a sequence of dependencies on a production continuous-integration server).
Now, a project has three pieces of code whose attributes are aggregated for the final build:
multiprojectRoot/project/SharedProjectBuildCode.scala: Contains local dependencies and other code relevant for multi-project build.
multiprojectRoot/project1/project1-build.sbt: Contains project build attributes common to both the multi-project and the standalone build, such as name, or dependencies that are always external. The same should be done for the other projects at the same level of the multi-project build, so they are explicitly treated as external dependency artifacts.
multiprojectRoot/project1/project/DeploymentOnlyCode.scala: Contains build attributes that will only be taken into consideration for stand-alone build. The same can be done on other sub-projects, if these require to define deployment specific attributes.
This also gives maximum control over how a project is built, whether or not it is a releasable artifact, and lets source code relevant only to a given project be handled as a complete and independent piece.
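As a sketch of what the stand-alone-only piece might contain (the plugin name, repository name, and URL below are assumptions for illustration, not from the original setup):

```scala
// multiprojectRoot/project1/project/DeploymentOnlyCode.scala
// Only loaded when sbt is started from within multiprojectRoot/project1,
// so these settings never leak into the multi-project development build.
import sbt._
import Keys._

object DeploymentOnlyCode extends AutoPlugin {
  // Activate automatically for the stand-alone build.
  override def trigger = allRequirements

  override def projectSettings: Seq[Setting[_]] = Seq(
    // Hypothetical deployment target, only relevant on the CI server.
    publishTo := Some("company-releases" at "https://repo.example.com/releases")
  )
}
```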
I am new to sbt (using sbt.version=0.13.5) and created a multi-project build definition as follows (build.sbt):
name := "hello-app"
version in ThisBuild := "1.0.0"
organization in ThisBuild := "com.jaksky.hello"
scalaVersion := "2.10.4"
ideaExcludeFolders ++= Seq (
".idea",
".idea_modules"
)
lazy val common = (
Project("common",file("common"))
)
lazy val be_services = (
Project("be-services",file("be-services"))
dependsOn(common)
)
My expectation was that sbt would generate the directory layout for the projects (based on the documentation). What actually happened was that only the top-level directories were generated (common and be-services), each containing just a target folder.
I tried batch mode (sbt compile) and interactive mode; neither generated the expected folder structure, e.g. src/{main,test}/{scala,java,resources}.
So either my expectations are wrong or there is some problem in my definition or some speciall setting, plugin etc.
Could some more experienced user clarify that, please?
Thanks
As @vptheron correctly points out, sbt does not generate any project directories, with the exception of the target directory when it produces compiled class files.
You might find that functionality in plugins, e.g. np. Also, if you use an IDE such as IntelliJ IDEA, creating a new sbt-based project will initialize a couple of directories (such as src).
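Alternatively, you can create the standard layout by hand from each project's directory; a minimal shell sketch (the directory names follow the Maven/sbt convention):

```shell
# Create the conventional sbt source directories for one project.
mkdir -p src/{main,test}/{scala,java,resources}
```

Run it once inside common/ and once inside be-services/, then sbt will pick the sources up without further configuration.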
I am trying to figure out how IntelliJ IDEA will recognize third-party dependencies when using sbt. When I use the sbt plugin gen-idea, it seems to download all the necessary dependencies, which get put into my ~/.ivy2/ directory as expected. How can IntelliJ use these deps?
EDIT:
One thing I noticed is that if I make a new IDEA project instead of just a module, this works. Any idea why that would be? I would like to be able to have multiple sbt modules in the same project.
The sbt-idea plugin works with multi-module sbt projects. We have been using it since somewhere around sbt 0.10.0, and are currently at sbt 0.11.2. It seems like you have the dependency part of the build file set up OK, so here's an example of how we do the project setup from a full Build.scala specification:
object Vcaf extends Build {
import Resolvers._
import Dependencies._
import BuildSettings._
lazy val vcafDb = Project(
id = "vcaf-db",
base = file("./vcaf-db"),
dependencies = Seq(),
settings = buildSettings ++ /* proguard */ SbtOneJar.oneJarSettings ++ Seq(libraryDependencies := dbDeps, resolvers := cseResolvers)
)
lazy val vcaf = Project(
"vcaf",
file("."),
dependencies = Seq(vcafDb),
aggregate = Seq(vcafDb),
settings = buildSettings ++ Seq(libraryDependencies := vcafDeps, resolvers := cseResolvers) ++ webSettings
)
}
In the example, the vcaf-db project is in a folder within the vcaf project folder. The vcaf-db project does not have its own build.sbt or Build.scala file. You'll notice that we specify libraryDependencies for each project, which may or may not be your missing link.
As ChrisJamesC mentioned, you need to do a reload from within sbt (or exit sbt and come back in) to pick up changes to your build definition. After the project is reloaded, you should be able to run gen-idea no-classifiers no-sbt-classifiers and get an IntelliJ project that has the main project, modules, and library access as defined in the build file.
Hope it helps!
If you want multiple SBT modules in one IDEA project, you can use sbt multi-project builds (aka subprojects). Just create a master project that refers to the modules as sub-projects, then run gen-idea on the master. To specify dependencies among the modules you have to use Build.scala (not build.sbt), as in jxstanford's answer or like this:
lazy val foo = Project(id = "foo", base = file("foo"))
lazy val bar = Project(id = "bar", base = file("bar")) dependsOn(foo)
One level of subprojects works fine (with the dependencies correctly reflected in the resulting IDEA project), but nested subprojects don't seem to work. Also, sbt seems to require that the subprojects live in subdirectories of the master project (i.e., file("../foo") is not allowed).
See also How to manage multiple interdependent modules with SBT and IntelliJ IDEA?.