sbt assembly for multiproject build - scala

I have a multiproject sbt build file like this
import sbt._
import Keys._

object TestBuild extends Build {
  lazy val root = Project(id = "test",
    base = file(".")) aggregate(core, handlers)

  lazy val core = Project(id = "test-core",
    base = file("core"))

  lazy val handlers = Project(id = "test-handlers",
    base = file("handlers")) dependsOn (core)
}
How can I build an assembly jar that includes all the dependencies plus core and handlers?

OK, I solved this problem using:
import sbt._
import Keys._

object TestBuild extends Build {
  lazy val root = Project(id = "test",
    base = file(".")) aggregate(core, handlers) dependsOn(core, handlers)

  lazy val core = Project(id = "test-core",
    base = file("core"))

  lazy val handlers = Project(id = "test-handlers",
    base = file("handlers")) dependsOn (core)
}
I put the assembly settings in the build.sbt file.
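For reference, the build.sbt wiring for the old 0.x plugin line looked roughly like this (a sketch only; the exact imports vary between sbt-assembly versions, and recent releases are auto plugins that need no explicit settings):

import sbtassembly.Plugin._
import AssemblyKeys._

seq(assemblySettings: _*)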

You can use the sbt-assembly plugin:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.7.3")
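This line goes in the build's plugin definition (project/plugins.sbt in current sbt). Because root both aggregates and depends on core and handlers, running the plugin's assembly task on the root project should then produce a single jar containing both subprojects along with their library dependencies.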

Related

Sbt nested dependsOn

I have a root project that depends on subproject1, and subproject1 depends on subproject2.
Does that imply that I can use subproject2's source code directly in root?
lazy val root =
  Project(id = "root", base = file(".")).dependsOn(sub1)

lazy val sub1 =
  Project(id = "sub1").dependsOn(sub2)

lazy val sub2 =
  Project(id = "sub2")
Yes.
This can easily be checked.
build.sbt
name := "sbtdemo"
version := "0.1"
ThisBuild / scalaVersion := "2.13.4"
lazy val root =
Project(id = "root", base = file(".")).dependsOn(sub1)
lazy val sub1 =
Project(id = "sub1", base = file("sub1")).dependsOn(sub2)
lazy val sub2 =
Project(id = "sub2", base = file("sub2"))
sub2/src/main/scala/App.scala
object App {
  def foo() = println("foo")
}
src/main/scala/Main.scala
object Main {
  def main(args: Array[String]): Unit = {
    App.foo() // foo
  }
}
Yes. From the Classpath dependencies section of the sbt documentation:
lazy val core = project.dependsOn(util)
Now code in core can use classes from util. This also creates an ordering between the projects when compiling them; util must be updated and compiled before core can be compiled.

IntelliJ doesn't see changes in multi-project sbt

I have a multi-module sbt project. When I change some source code in one module, the other modules don't see the changes in IntelliJ.
When I try to navigate, it goes to the declaration in the compiled jar file instead of navigating to the source.
It works fine when I remove the jar from the library dependencies in the project settings; I think that forces a recompile, so it works until the next change. Compiling with sbt works fine, so I suspect the problem is in the Build.scala settings, where the project dependencies may have ordering issues. Here are the dependencies:
lazy val root = Project(id = "xx-main", base = file("."), settings = commonSettings)
.aggregate(utils, models, commons, dao, te)
.dependsOn(utils, models, commons, dao)
lazy val utils = Project(id = "xx-utils", base = file("xx-utils"))
.settings(commonSettings: _*)
lazy val commons = Project(id = "xx-commons", base = file("xx-commons"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val models =
Project(id = "xx-models", base = file("xx-models"), settings = commonSettings)
.dependsOn(utils)
lazy val dao = Project(id = "xx-dao", base = file("xx-dao"))
.settings(commonSettings: _*)
.dependsOn(utils, models)
lazy val te = Project(id = "xx-te", base = file("xx-te"))
.settings(commonSettings: _*)
.dependsOn(utils, models, dao, commons)

How do you share a custom task in an sbt multi-project

I have a project set up as an sbt multi-build that looks like this:
- project
    Dependencies.scala
- core
    build.sbt
- server
    build.sbt
- build.sbt
I want to use Dependencies.scala as a container for version numbers of libraries that are shared between the sub-projects.
sealed trait Dependencies {
  val commonsIo = "2.4"
}

object DependencyVersions extends Dependencies
In the root build.sbt I added a Setting that is given to each sub-project.
lazy val dependencies = settingKey[Dependencies]("versions")

val defaultSettings = Defaults.coreDefaultSettings ++ Seq(
  dependencies := DependencyVersions)

def projectFolder(name: String, theSettings: Seq[Def.Setting[_]] = Nil) =
  Project(name, file(name), settings = theSettings)

lazy val core = projectFolder("core", defaultSettings)
I can't access the dependencies setting in core/build.sbt:

"commons-io" % "commons-io" % dependencies.value.commonsIo  // <-- doesn't work

How can I get this to work?
You can define common settings (dependencies) in an object Common extends AutoPlugin (in project/Common.scala), and then use .enablePlugins(Common) on the sub-project descriptor (see it in Anorm).
Thanks @cchantep, I got it working now using the AutoPlugin below:
import sbt._

sealed trait Dependencies {
  val commonsIo = "2.4"
}

object DependencyVersions extends Dependencies

object DependencyVersionsPlugin extends AutoPlugin {
  override def trigger = allRequirements

  object autoImport {
    lazy val dependencies = settingKey[Dependencies]("Bundles dependency versions")
  }
  import autoImport._

  override def projectSettings = Seq(
    dependencies := DependencyVersions
  )
}
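Since the plugin is triggered for all projects, the auto-imported dependencies setting can then be read straight from a sub-project's build.sbt, for example (using the commons-io coordinates from the question):

libraryDependencies += "commons-io" % "commons-io" % dependencies.value.commonsIo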

Adding multiple subprojects to Play 2 framework

I am using the Play 2 framework with Java. I have added a project dependency as a sub-project by following the tutorials.
Now I want a second sub-project, but I am kind of new to the Scala code in the Build.scala file.
Can someone tell me how to add a second sub-project?
Below is my code for the Build.scala file and the sub-project.
import sbt._
import Keys._
import PlayProject._

object ApplicationBuild extends Build {

  val appName = "Rub_Server"
  val appVersion = "1.0-SNAPSHOT"

  val appDependencies = Seq(
    // These are the project dependencies
    "mysql" % "mysql-connector-java" % "5.1.18"
  )

  val subProject = Project("Com-RubineEngine-GesturePoints",
    file("modules/Com-RubineEngine-GesturePoints"))

  val main = PlayProject(appName, appVersion, appDependencies, mainLang = JAVA).settings(
    // Add your own project settings here
  ).dependsOn(subProject)
}
Now I want to add a second project to the Build.scala file. How do I do that? Thanks.
.dependsOn(subProject, anotherProject)
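In other words, declare the second sub-project next to the first one and pass both to dependsOn. A minimal sketch, assuming the second module lives under modules (the project name and path below are placeholders):

val anotherProject = Project("Another-Module", file("modules/Another-Module"))

val main = PlayProject(appName, appVersion, appDependencies, mainLang = JAVA).settings(
  // Add your own project settings here
).dependsOn(subProject, anotherProject)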

Play Framework and scala.tools.nsc

I have to use the Scala parser inside a Play Framework application.
import scala.tools.nsc._

trait Foo

class Parser {
  def parse(code: String) = {
    val settings = new Settings
    settings.embeddedDefaults[Foo]
    val interpreter = new Interpreter(settings)
    interpreter.parse(code)
  }
}
I have the following dependency in Build.scala:
"org.scala-lang" % "scala-compiler" % "2.9.1"
This code works when built with sbt. In Play it ends with a NullPointerException and:
Failed to initialize compiler: object scala not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
Build.scala
import sbt._
import Keys._
import PlayProject._

object ApplicationBuild extends Build {

  val appName = "com.qwerty.utils"
  val appVersion = "1.0-SNAPSHOT"
  val scalaVersion = "2.9.1"

  val appDependencies = Seq(
    "org.scala-lang" % "scala-compiler" % "2.9.1"
  )

  val main = PlayProject(appName, appVersion, appDependencies, mainLang = SCALA).settings(
    // Add your own project settings here
  )
}
For background on embeddedDefaults, see the original proposal.
The container (Play) must define the 'app.class.path' and 'boot.class.path' resources and then embeddedDefaults will use them to configure the interpreter properly for the environment. So, this is an enhancement for Play.
If you can pass the necessary classpaths into your application, you can configure classpaths and classloaders explicitly yourself with something like:
import java.io.File

val settings = new Settings
settings.classpath.value = "<classpath>"
settings.bootclasspath.value =
  settings.bootclasspath.value + File.pathSeparator + "<extra-bootclasspath>"

val interpreter = new Interpreter(settings) {
  override def parentClassLoader = classOf[Foo].getClassLoader
}
interpreter.parse(code)
The bootclasspath should generally contain scala-library.jar and the classpath should contain the application jars.