Accumulate subprojects' `discoveredMainClasses` - scala

How can I accumulate all the discoveredMainClasses of a project, along with those of its dependent subprojects, in SBT?
For example, I have a project that looks like
├── bar
│   └── src
│       └── main
│           └── scala
│               └── BarMain.scala
├── build.sbt
├── foo
│   └── src
│       └── main
│           └── scala
│               └── FooMain.scala
├── project
│   └── build.properties
└── root
With one root project that aggregate(foo, bar), I get the following for discoveredMainClasses:
[info] foo/compile:discoveredMainClasses
[info] List(MainFoo)
[info] bar/compile:discoveredMainClasses
[info] List(MainBar)
[info] root/compile:discoveredMainClasses
[info] List()
With one root that only dependsOn(foo, bar) I get
> show discoveredMainClasses
[info] *
How can I have show root/discoveredMainClasses contain both MainFoo and MainBar?
For context, I have other tasks that depend on the output of discoveredMainClasses, namely makeBashScripts in native-packager.

The core idea is to create a module that depends on all the submodules you want to include, and to configure all settings on this module.
This results in a build.sbt like this:
lazy val root = project.in(file("."))
  // package the root module, but not the sub modules
  .enablePlugins(JavaAppPackaging)
  .settings(
    name := "application",
    // add the discoveredMainClasses to this project
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (client, Compile)).value,
    discoveredMainClasses in Compile ++= (discoveredMainClasses in (server, Compile)).value
  )
  // include these modules in the resulting package
  .dependsOn(client, server)

lazy val client = project.in(file("client"))
  .settings(
    name := "client"
  )

lazy val server = project.in(file("server"))
  .settings(
    name := "server"
  )
The (discoveredMainClasses in (client, Compile)).value accesses the discoveredMainClasses from the client project in the Compile scope.
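On newer sbt versions (1.1+), the same cross-project lookup can be written with the unified slash syntax; a sketch of the equivalent settings (not part of the original answer):

```scala
// inside root's .settings(...), equivalent to the `in` syntax above (sbt 1.1+)
Compile / discoveredMainClasses ++= (client / Compile / discoveredMainClasses).value
Compile / discoveredMainClasses ++= (server / Compile / discoveredMainClasses).value
```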
You can build and run your applications with
$ sbt universal:stage
$ ./target/universal/stage/bin/client-app
$ ./target/universal/stage/bin/server-app
A running example can be found here.
cheers,
Muki

An alternative to Muki's answer would be to define a ScopeFilter that includes everything but root, and accumulate the main classes that way. This has the advantage of not having to repeat client, server everywhere.
The resulting build.sbt:
lazy val allCompileButRootFilter =
  ScopeFilter(inAggregates(ThisProject, includeRoot = false), inConfigurations(Compile))

lazy val root = project.in(file("."))
  .aggregate(client, server)
  .enablePlugins(JavaAppPackaging)
  .settings(
    discoveredMainClasses in Compile ++=
      discoveredMainClasses.all(allCompileButRootFilter).value.flatten,
    ...
  )

Related

Define subprojects in another file than build.sbt

I'm trying to define a multi-project build with a consequent number of subprojects:
.
├── build.sbt
├── project/
│   ├── dependencies.scala
│   ├── tasks.scala
│   └── settings.scala
├── lib_1/
│   └── src/
├── ...
└── lib_n/
    └── src/
Those subprojects are currently defined in build.sbt:
val outputJarFolder = "/some/path/"
lazy val commonSettings = /* ... */

lazy val lib_1 = (project in file("lib1")).settings(
  name := "LibOne",
  commonSettings,
  libraryDependencies ++= Seq(scalaTest, jsonLib, scalaXML, commonsIo),
  Compile/packageBin/artifactPath := file(outputJarFolder + "lib1.jar")
)
// ... more libs ...
lazy val lib_n = (project in file("libn"))
  .settings(
    name := "LibLast",
    commonSettings,
    Compile/packageBin/artifactPath := file(outputJarFolder + "libn.jar")
  )
  .dependsOn(lib_2, lib_12)
How can I define those subprojects in a file other than build.sbt, in order to "unclog" that file? I still want to be able to define them in lexicographic order (so lazy is a must). I'm working with sbt version 1.2.8 and Scala 2.10.
I've tried:
Putting the declarations of those lib_k variables in a Scala file and importing it --> sbt says: "classes cannot be lazy".
Putting those declarations in an object (or in a class instantiated in build.sbt) --> sbt projects doesn't list any subprojects.
The sbt documentation mentions this, but doesn't emphasize it much (perhaps to avoid encouraging too much variation in how builds are defined, in the absence of a common convention):
The build definition is described in build.sbt (actually any files named *.sbt) in the project’s base directory.
So you can split your build.sbt file into several separate .sbt files in the root of the project with different names.
I also recommend reading documentation on Organizing the build.
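For example, the subproject definitions could move into a sibling file (the name projects.sbt here is just an illustration); in recent sbt versions, vals and defs defined in one .sbt file are visible to the other .sbt files in the same base directory, so build.sbt can still reference these projects:

```scala
// projects.sbt -- any *.sbt file in the base directory is loaded alongside build.sbt
lazy val commonSettings = Seq(scalacOptions += "-deprecation")

lazy val lib_1 = (project in file("lib1"))
  .settings(name := "LibOne", commonSettings)

lazy val lib_2 = (project in file("lib2"))
  .settings(name := "LibTwo", commonSettings)
  .dependsOn(lib_1)
```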

Cannot load grand-child routing from child module

Play Version
PlayScala#2.7.2
JDK (Oracle 1.8.0_72, OpenJDK 1.8.x, Azul Zing)
$ java -version
openjdk version "1.8.0_212"
OpenJDK Runtime Environment (Zulu 8.38.0.13-CA-linux64) (build 1.8.0_212-b04)
OpenJDK 64-Bit Server VM (Zulu 8.38.0.13-CA-linux64) (build 25.212-b04, mixed mode)
Expected Behavior
Make your sbt project 'root'.
Add a sub-project 'buildA'.
Add a sub-project 'appA'.
buildA includes appA.routes.
-> Yay, you can create several builds in one root project!
Actual Behavior
My child project (buildA) can NOT load the grand-child project's (appA) routing.
$ tree -L 2
.
├── build.sbt
├── build-app
│   ├── app
│   ├── build.sbt
│   ├── conf
│   │   ├── application.conf
│   │   ├── logback.xml
│   │   └── routes
├── core
│   ├── app
│   ├── build.sbt
│   ├── conf
│   │   └── core.routes
│   ├── src
└── project
    └── plugins.sbt
build.sbt
lazy val buildApp = project.in(file("build-app"))
lazy val root = project.in(file("."))
build-app/build.sbt
lazy val core = project.in(file("../core")).enablePlugins(PlayScala)
lazy val buildApp = (project in file("."))
  .enablePlugins(PlayScala)
  .dependsOn(core)
build-app/conf/routes
GET / controllers.app.HomeController.index
-> /core core.Routes
core/conf/core.routes
GET / controllers.core.HomeController.index
$ sbt "project buildApp" compile
[error] /home/sizer/go/src/github.com/sizer/hello-sbt-multiproject/build-app/conf/routes:3:1: not found: value core
[error] -> /core core.Routes
Can NOT load core.routes :sob:
Am I wrong, or is this the correct behaviour?
My project is below:
https://github.com/sizer/hello-sbt-multiproject/tree/playframework_failedExample

How to exclude libraries from different tasks in sbt?

I have the following project structure:
/
├── project
│   └── ...
├── src
│   └── ...
├── lib
│   ├── prod-lib.jar
│   └── test-lib.jar
└── build.sbt
And I need to compile with test-lib.jar for deploying into a testing environment and with prod-lib.jar for deploying into a production environment.
Both of them have the same API for accessing the things I need, so my source code works with either of them, but they have subtle differences in how they execute in the background.
Is there a way to create an sbt task (or maybe something else) that can ignore one jar or the other, but in both cases still perform the assembly task?
Put your jars in different folders and set the unmanagedBase key in the Compile and Test scopes correspondingly:
> set unmanagedBase in Compile := baseDirectory.value / "lib-compile"
> show compile:assembly::unmanagedBase
[info] /project/foo/lib-compile
> set unmanagedBase in Test := baseDirectory.value / "lib-test"
> show test:assembly::unmanagedBase
[info] /project/foo/lib-test
But don't forget to call assembly task in the corresponding scope then (compile:assembly or test:assembly), because in Global it's still the default:
> show assembly::unmanagedBase
[info] /project/foo/lib
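If the overrides should persist instead of being set interactively each session, the same two settings can live in build.sbt (a sketch using the folder names from the set commands above):

```scala
// build.sbt
unmanagedBase in Compile := baseDirectory.value / "lib-compile"
unmanagedBase in Test := baseDirectory.value / "lib-test"
```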

How to access scala project file from project module

I have created a project foo_proj with IntelliJ (using the SBT template) and added a module test_mod to it. The abbreviated directory layout looks like this:
foo_proj
├── src
│   └── main
│       └── scala-2.11
│           └── proj_obj.scala
└── test_mod
    └── src
        └── tmod.scala
The contents of proj_obj.scala are:
package com.base.proj
object proj_obj {
}
I would like to be able to import this object (proj_obj) into the module file tmod.scala, but when I try import com.base.proj, it can't find it.
I am new to Scala, so if I want to use stuff from the project src directory in other project modules, how else should I be structuring things? Or is this an Intellij IDEA configuration that I need to set?
Edit
The contents of the generated build.sbt are
name := "test_proj"
version := "1.0"
scalaVersion := "2.11.6"
To enable "submodules" (aka multi-project builds), all you need to do is add the following to your build.sbt file (or use a Scala file under the project dir):
lazy val root = project in file(".")
lazy val testModule = project in file("test_mod") dependsOn(root)
Also, you should change the test_mod dir structure:
either drop the src dir and put all your sources directly under the test_mod dir,
or use the sbt convention: src/main/scala or src/test/scala.

sbteclipse doesn't create a Scala project by default?

I just want to create a directory layout for my scala project with sbt and sbteclipse. Following is my sbt file.
import com.typesafe.sbteclipse.plugin.EclipsePlugin.EclipseKeys
name := "BGS"
organization := "com.example"
version := "1.0.0"
scalaVersion := "2.9.2"
scalacOptions ++= Seq("-deprecation")
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource
EclipseKeys.projectFlavor := EclipseProjectFlavor.Scala
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
libraryDependencies += "org.scalatest" %% "scalatest" % "1.8" % "test"
libraryDependencies += "junit" % "junit" % "4.10" % "test"
unmanagedBase <<= baseDirectory { base => base / "lib" }
unmanagedJars in Compile <<= baseDirectory map { base => (base ** "*.jar").classpath }
In this sbt file, I had to use the following two lines to force creation of the Scala directories:
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
Furthermore, after running "eclipse" from the sbt console, I imported the project into Eclipse, but I could not create a Scala class. The Eclipse project icon has a "J" attached to it, indicating a Java project :-?
Why do sbt and sbteclipse default to Java?
I am running sbt version 0.12 (latest version as of Nov 2012), scala 2.9.2
For your information, what I am aiming to do is use sbt to create a working project with the following directory structure:
├── build.sbt
├── lib
│   ├── biojava3-core-3.0.4.jar
├── project
│   ├── plugins.sbt
│   ├── project
│   │   └── target
│   └── target
│       ├── config-classes
│       ├── scala-2.9.2
│       └── streams
├── src
│   ├── main
│   │   ├── java
│   │   ├── resources
│   │   └── scala
│   └── test
│       ├── java
│       ├── resources
│       └── scala
├── target
│   ├── scala-2.9.2
│   │   ├── cache
│   │   ├── classes
│   │   └── test-classes
│   └── streams
│       └── compile
└── test
Since I haven't gotten any desirable answers so far, I am moving to the Typesafe Stack. I removed the manually-installed scala and sbt and am using everything from the Typesafe Stack. I will update when I am done testing this approach.
Here is the final update as I promised:
I followed instruction in this link and it worked perfectly:
http://typesafe.com/resources/typesafe-stack/downloading-installing.html
Detailed information in case somebody wants to follow along on Mac OSX:
Basically, I first switched from MacPorts to Homebrew following the instructions here:
http://bitboxer.de/2010/06/03/moving-from-macports-to-homebrew/
Then I did:
brew install scala sbt maven giter8
Then create a Scala project from command line:
g8 typesafehub/scala-sbt
Finally, I followed the instructions here to add sbteclipse and convert the project to Eclipse:
https://github.com/typesafehub/sbteclipse
Everything works as I expect.
Why do you want sbt to create those directories? Why not just create them yourself? It's a one-time event.
You shouldn't need any of the following lines:
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource
EclipseKeys.projectFlavor := EclipseProjectFlavor.Scala
scalaSource in Compile <<= (sourceDirectory in Compile)(_ / "scala")
scalaSource in Test <<= (sourceDirectory in Test)(_ / "scala")
I normally just do mkdir -p src/main/scala whenever I start a new project.
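For the full conventional layout, a single mkdir -p run from the project root does it (directory names follow the standard sbt convention):

```shell
# create the standard sbt source layout in one go
mkdir -p src/main/scala src/main/java src/main/resources \
         src/test/scala src/test/java src/test/resources \
         lib project
```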