Simple build tool (SBT) package WAR - scala

I have a multi-project configuration. One of the projects needs to be built as a WAR file for future deployment. I used this sbt plugin to build the WAR: https://github.com/JamesEarlDouglas/xsbt-web-plugin. However, I need it to include the dependent projects as JARs.
In Maven I included the other modules as dependencies of my WAR module and they eventually appeared in the WAR's lib directory. It seems that the xsbt-web-plugin's default behavior is not to include them.
What I mean is:
This is part of my parent.scala file:
lazy val dataPopulator = Project(
  "data-populator",
  file("data-populator"),
  settings = buildSettings ++ Seq(libraryDependencies ++= dataPopulatorDeps)
)

lazy val warProject = Project(
  id = "rest-ws",
  base = file("rest-ws"),
  settings = buildSettings ++ Seq(libraryDependencies)
) dependsOn(dataPopulator)
The dataPopulator project above creates a JAR when packaged.
The warProject project above has its own build.sbt in its directory:
rest-ws/build.sbt:
seq(webSettings :_*)
name := "main-ws"
libraryDependencies += "org.mortbay.jetty" % "jetty" % "6.1.22" % "container"
When I run the package command (added by the web plugin) it creates a WAR; the problem is that this WAR doesn't include the dataPopulator JAR, even though the project depends on it at compile time.
Does anyone have a suggestion for how to include the generated JAR artifacts from some modules in another project that is packaged as a WAR?
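One approach worth trying (my own sketch, not something confirmed in this thread): set exportJars := true on the dependent project, so that sbt exports its packaged JAR instead of its classes directory; the web plugin should then copy that JAR into WEB-INF/lib along with the other classpath JARs.
lazy val dataPopulator = Project(
  "data-populator",
  file("data-populator"),
  settings = buildSettings ++ Seq(
    libraryDependencies ++= dataPopulatorDeps,
    // sketch: export the packaged JAR rather than the class directory,
    // so the WAR project sees the data-populator JAR on its runtime classpath
    exportJars := true
  )
)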

sbt: set the base-directory of a remote RootProject

Disclaimer: I am new to sbt and Scala so I might be missing obvious things.
My objective here is to use the Scala compiler as a library from my main project. I was initially doing that by manually placing the Scala JARs in a libs directory in my project and then including that directory in my classpath. Note that at the time I wasn't using sbt. Now, I want to use sbt and also download the Scala sources from GitHub, build the Scala JARs, and then build my project. I start by creating two directories: myProject and myProject/project. I then create the following 4 files:
The sbt version file:
// File 1: project/build.properties
sbt.version=0.13.17
The plugins file (not relevant to this question):
// File 2: project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.7.0")
The build.sbt file:
// File 3: build.sbt
lazy val root = (project in file(".")).
  settings(
    inThisBuild(List(
      organization := "me",
      scalaVersion := "2.11.12",
      version := "0.1.0-SNAPSHOT"
    )),
    name := "a name"
  ).dependsOn(ScalaDep)
lazy val ScalaDep = RootProject(uri("https://github.com/scala/scala.git"))
My source file:
// File 4: Test.scala
import scala.tools.nsc.MainClass
object Test extends App {
  println("Hello World !")
}
If I run sbt inside myProject then sbt will download the Scala sources from GitHub and then try to compile them. The problem is that the base directory is still myProject. This means that if the Scala build's sbt source files refer to something that is in the Scala base directory, they won't find it. For example, the scala/project/VersionUtil.scala file tries to open the scala/versions.properties file that lies in the Scala base directory.
Question: How can I set sbt to download a GitHub repo and then build it using that project's base directory instead of mine (by which I mean the base directory of myProject in the above example)?
Hope that makes sense.
I would really appreciate any feedback on this.
Thanks in advance!
In the Scala ecosystem you usually depend on binary artifacts (libraries) that are published in Maven or Ivy repositories. Virtually all Scala projects publish binaries, including the compiler. So all you have to do is add the line below to your project settings:
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
dependsOn is used for dependencies between sub-projects in the same build.
For browsing sources you could use an IDE. IntelliJ IDEA can readily import sbt projects and download/attach sources for library dependencies. Eclipse has an sbt plugin that does the same, and so does Ensime. Or just git clone the repository.
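For illustration (my own sketch, not part of the answer), the asker's build.sbt could drop the RootProject entirely and pull the compiler in as a published binary:
// sketch of the asker's build.sbt using the published scala-compiler artifact
lazy val root = (project in file(".")).
  settings(
    inThisBuild(List(
      organization := "me",
      scalaVersion := "2.11.12",
      version := "0.1.0-SNAPSHOT"
    )),
    name := "a name",
    // resolves scala-compiler 2.11.12 from Maven Central; no GitHub checkout needed
    libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
  )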

Including a Spark Package JAR file in a SBT generated fat JAR

The spark-daria project is uploaded to Spark Packages and I'm accessing spark-daria code in another SBT project with the sbt-spark-package plugin.
I can include spark-daria in the fat JAR file generated by sbt assembly with the following code in the build.sbt file.
spDependencies += "mrpowers/spark-daria:0.3.0"

val requiredJars = List("spark-daria-0.3.0.jar")
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter { f =>
    !requiredJars.contains(f.data.getName)
  }
}
This code feels like a hack. Is there a better way to include spark-daria in the fat JAR file?
N.B. I want to build a semi-fat JAR file here. I want spark-daria to be included in the JAR file, but I don't want all of Spark in the JAR file!
The README for version 0.2.6 states the following:
In any case where you really can't specify Spark dependencies using sparkComponents (e.g. you have exclusion rules) and configure them as provided (e.g. standalone jar for a demo), you may use spIgnoreProvided := true to properly use the assembly plugin.
You should then use this flag in your build definition and set your Spark dependencies as provided, as I do with spark-sql:2.2.0 in the following example:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
Please note that with this setting your IDE may no longer have the dependency references it needs to compile and run your code locally, which means you would have to add the necessary JARs to the classpath by hand. I do this often in IntelliJ: I keep a Spark distribution on my machine and add its jars directory to the IntelliJ project definition (this question may help you with that, should you need it).
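Putting those pieces together, the relevant build.sbt lines would look roughly like this (a sketch only; the spark-daria coordinate and Spark version are the ones from this thread):
// rely on sbt-spark-package for spark-daria, keep Spark itself provided
spIgnoreProvided := true
spDependencies += "mrpowers/spark-daria:0.3.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
With this arrangement, sbt assembly should include spark-daria in the output JAR while leaving the provided Spark artifacts out.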

File of one of the sbt plugin's dependencies

I need to get hold of the File reference to a specific artifact during the setup phase of my sbt plugin.
I've tried:
- obtaining the Ivy home directory, but that basically means assuming where Ivy will place the files (they could even be in a local Maven repository)
- parsing System.getProperty("java.class.path"), but it only contains the sbt-launch JAR
- obtaining the resolved JARs from the update.value task, but it doesn't have any of the plugin's JARs in the list! (only the JARs for the application being compiled)
Short of invoking the Ivy API manually, is there any way to get the File to the plugin's jar dependency?
NOTE: This is a very specific part of "how to write an sbt plugin to launch the app with an agent", factored out into a separate question.
Got it! Adding the dependency explicitly within the plugin source reveals its resolved path:
override val projectSettings = Seq(
  libraryDependencies += "com.github.fommil.lion" %% "agent" % "1.0-SNAPSHOT",
  // rendering update.value into a system property exposes the resolved artifact paths
  javaOptions ++= Seq(s"-Dhack=${update.value}")
)
The update report rendered into that system property has the resolved file reference in it!
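As a follow-up sketch (my own, not part of the original answer): once the dependency is on libraryDependencies, the resolved File can be pulled out of the update report with sbt's moduleFilter instead of rendering the whole report into a string. The "agent*" glob is an assumption to cover the cross-version suffix:
// sketch: select the resolved agent JAR from the project's update report
javaOptions ++= {
  val agentJar: Option[File] =
    update.value
      .matching(moduleFilter(organization = "com.github.fommil.lion", name = "agent*"))
      .headOption
  // emit no extra option if the artifact is somehow not resolved
  agentJar.toSeq.map(jar => s"-Dhack=${jar.getAbsolutePath}")
}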

How to get Intellij to use dependencies from SBT scala

I am trying to figure out how IDEA will recognize third-party dependencies when using sbt. When I use the sbt plugin's gen-idea command it seems to download all the necessary dependencies, which get put into my ~/.ivy/ directory as expected. How can IntelliJ use these dependencies?
EDIT:
One thing I noticed is that if I make a new IDEA project, instead of just a module, then this works. Any idea why that would be? I would like to be able to have multiple sbt modules in the same project.
The sbt-idea plugin works with multi-module sbt projects. We have been using it since somewhere around sbt 0.10.0, and are currently at sbt 0.11.2. It seems like you have the dependency part of the build file set up OK, so here's an example of how we do the project setup in a full Build.scala specification:
object Vcaf extends Build {
  import Resolvers._
  import Dependencies._
  import BuildSettings._

  lazy val vcafDb = Project(
    id = "vcaf-db",
    base = file("./vcaf-db"),
    dependencies = Seq(),
    settings = buildSettings ++ /* proguard */ SbtOneJar.oneJarSettings ++
      Seq(libraryDependencies := dbDeps, resolvers := cseResolvers)
  )

  lazy val vcaf = Project(
    "vcaf",
    file("."),
    dependencies = Seq(vcafDb),
    aggregate = Seq(vcafDb),
    settings = buildSettings ++ Seq(libraryDependencies := vcafDeps, resolvers := cseResolvers) ++ webSettings
  )
}
In the example, the vcaf-db project is in a folder within the vcaf project folder. The vcaf-db project does not have its own build.sbt or Build.scala file. You'll notice that we are specifying libraryDependencies for each project, which may or may not be your missing link.
As ChrisJamesC mentioned, you need to do a "reload" from within sbt (or exit sbt and come back in) to pick up changes to your build definition. After the project is reloaded, you should be able to run "gen-idea no-classifiers no-sbt-classifiers" and get an IntelliJ project that has the main project, modules, and library access as defined in the build file.
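For reference, the sequence at the sbt prompt would be roughly (using the exact gen-idea options quoted above):
> reload
> gen-idea no-classifiers no-sbt-classifiers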
Hope it helps!
If you want multiple SBT modules in one IDEA project, you can use sbt multi-project builds (aka subprojects). Just create a master project that refers to the modules as sub-projects, then run gen-idea on the master. To specify dependencies among the modules you have to use Build.scala (not build.sbt), as in jxstanford's answer or like this:
lazy val foo = Project(id = "foo", base = file("foo"))
lazy val bar = Project(id = "bar", base = file("bar")) dependsOn(foo)
One level of subprojects works fine (with the dependencies correctly reflected in the resulting IDEA project), but nested subprojects don't seem to work. Also, it seems to be an sbt restriction that the subprojects must live in subdirectories of the master project (i.e., file("../foo") is not allowed).
See also How to manage multiple interdependent modules with SBT and IntelliJ IDEA?.

Remove entry from classpath after compile

I have a legacy WAR project that depends on a JAR project; the JAR project needs to add a few unmanaged JARs to the classpath for compilation, but these JARs should not be packaged in the WAR. So my question is: how do I remove these entries from the fullClasspath? The following doesn't work:
val excludeFilter = "(servlet-api.jar)|(gwt-dev.jar)|(gwt-user.jar)"
val filteredCP = cp.flatMap({ entry =>
  val jar = entry.data.getName()
  if (jar.matches(excludeFilter)) {
    Nil
  } else {
    Seq(entry)
  }
})
fullClasspath in Runtime = filteredCP
I am pretty sure there must be a simple way to do this, but so far it has eluded me.
Edit: Based on Pablo's suggestion to use the managed classpath instead of the unmanaged one, I can rephrase the question as: how do you add local JARs to the managedClasspath? My JARs are placed in a local folder with a (very) nonstandard layout:
lib/testng.jar
lib/gwt/2.3/gwt-user.jar
lib/jetty/servlet.jar
So basically I am looking for something like:
libraryDependencies += "testng" % "provided->test"
libraryDependencies += "gwt" % "2.3" % "gwt-user" % "provided->compile"
libraryDependencies += "jetty" % "servlet" % "provided->default"
allowing me to grab jars from my own local lib folder.
Some information is provided on the Classpaths page, but it is not very clear or detailed. The information is also available using the inspect command, described on the Inspecting Settings page.
Basically, for a configuration X, in a short-hand notation:
// complete, exported classpath, such as used by
// 'run', 'test', 'console', and the war task
fullClasspath in X =
  dependencyClasspath in X ++ exportedProducts in X

// classpath only containing dependencies,
// used by 'compile' or 'console-quick', for example
dependencyClasspath in X =
  externalDependencyClasspath in X ++ internalDependencyClasspath in X

// classpath containing only dependencies external to the build
// (as opposed to the inter-project dependencies in internalDependencyClasspath)
externalDependencyClasspath in X =
  unmanagedClasspath in X ++ managedClasspath in X

// the manually provided classpath
unmanagedClasspath in X =
  unmanagedJars for X and all configurations X extends, transitively
So, normally, when you want to add unmanaged libraries, you add them to unmanagedJars. For example, if you add libraries to unmanagedJars in Compile, then sbt will correctly include the libraries on the unmanagedClasspath for Compile, Runtime, and Test.
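For comparison, a sketch of that default approach with the asker's lib layout (sbt 0.11-style syntax; the files are wrapped with Attributed.blankSeq because the classpath keys hold attributed files):
// sketch: jars added to unmanagedJars in Compile also flow to the Runtime classpath,
// which is exactly what the asker wants to avoid here
unmanagedJars in Compile <++= baseDirectory map { base =>
  val lib = base / "lib"
  Attributed.blankSeq(Seq(
    lib / "testng.jar",
    lib / "gwt/2.3/gwt-user.jar",
    lib / "jetty/servlet.jar"
  ))
}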
However, you want explicit control here. Add the libraries only to the unmanagedClasspath you want the jars on, which is unmanagedClasspath in Compile. For example, in sbt 0.11.0+:
unmanagedClasspath in Compile <++= baseDirectory map { base =>
  val lib = base / "lib"
  Seq(
    lib / "testng.jar",
    lib / "gwt/2.3/gwt-user.jar",
    lib / "jetty/servlet.jar"
  )
}
Assuming the war plugin uses the Runtime classpath, those jars will only show up on the compile classpath and not in the war.
sbt supports Ivy-like configurations and implements Maven's basic scopes.
If you want to use some JARs on your compilation classpath but don't want to ship them, I guess the provided scope is for you:
libraryDependencies += "org.example" % "example" % "1.0" % "provided->compile"