When using oneJar to package a multi-project sbt build, project dependencies are not bundled into the jar. My setup is the following:
foo/build.sbt (top-level build.sbt)
foo/src/ (sources of the root project)
foo/gui/build.sbt (build definition of the 'gui' project)
foo/gui/src (sources of the 'gui' project)
The build definitions are:
// foo/build.sbt
name := "foo"
version := "0.0.1"
scalaVersion := "2.10.4"
lazy val root = project.in( file(".") )
lazy val gui = project.in( file("gui") ).dependsOn( root )
[...]
//foo/gui/build.sbt
name := "foo-gui"
seq(com.github.retronym.SbtOneJar.oneJarSettings: _*)
[...]
When calling oneJar on the gui project everything seems to run fine, but the classes of the root project are not included in the jar (although the library dependencies are). Is there any fix?
I never tried a light configuration like yours, but shouldn't you put the oneJar settings in the root sbt file? You want to package the root and include gui, right?
I tried something similar for the first time today and started with oneJar, but when using a full sbt configuration the compiler complained that the settings were a Seq[_] while sbt expected a single setting, or something like that. I switched to sbt-assembly and it worked.
sbt-oneJar has not been updated for two years, while sbt-assembly was updated recently. I'm not sure which one is preferred, but I'd rather use an actively maintained tool.
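For what it's worth, a minimal sbt-assembly setup for a layout like the one in the question could look roughly like this; the plugin version and the main class name are assumptions, so adjust them to your build:
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
// foo/build.sbt
lazy val root = project.in(file("."))
lazy val gui = project.in(file("gui"))
  .dependsOn(root)
  .settings(
    // because of dependsOn, the root project's classes end up in the fat jar
    mainClass in assembly := Some("foo.gui.Main"), // hypothetical main class
    test in assembly := {}                         // optional: skip tests when assembling
  )
Running assembly in the gui project (e.g. project gui, then assembly, in the sbt shell) should then produce a single jar containing the classes of both projects plus their library dependencies.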
I'm trying to create a relatively simple sbt plugin to wrap the grpc-swagger artifact.
Therefore, I've created a project with the following structure:
projectDir/
build.sbt
lib/grpc-swagger.jar <- the artifact I've downloaded
src/...
where build.sbt looks like the following:
ThisBuild / version := "0.0.1-SNAPSHOT"
ThisBuild / organization := "org.testPlugin"
ThisBuild / organizationName := "testPlugin"
lazy val root = (project in file("."))
  .enablePlugins(SbtPlugin)
  .settings(name := "grpc-swagger-test-plugin")
According to the sbt docs, that's all I have to do in order to include an unmanaged dependency, that is:
create a lib folder;
store the artifact in there;
However, when I execute sbt compile publishLocal, the published plugin lacks that external artifact.
So far I've tried to:
set the exportJars := true flag
add Compile / unmanagedJars += file("lib/grpc-swagger.jar") (also with variations of the path)
manually fiddle with libraryDependencies using the from file("lib/grpc-swagger.jar") specifier
but none so far seemed to work.
So how am I supposed to add an external artifact to an sbt plugin?
The proper solution to this problem is to publish the grpc-swagger library. If for whatever reason that can't be done from that library's build system, you can do it with sbt. Just add a simple subproject whose only job is to publish that jar. It should work like so:
...
lazy val `grpc-swagger` = (project in file("."))
  .settings(
    name := "grpc-swagger",
    Compile / packageBin := baseDirectory.value / "lib" / "grpc-swagger.jar",
    // maybe other settings, such as grpc-swagger's libraryDependencies
  )

lazy val root = (project in file("."))
  .enablePlugins(SbtPlugin)
  .settings(name := "grpc-swagger-test-plugin")
  .dependsOn(`grpc-swagger`)
...
The pom file generated for the root project should now specify a dependency on grpc-swagger, and running the publish task in the grpc-swagger project will publish that jar along with a pom file.
That should be enough to make things work, but honestly, it's still a hack. The proper solution is to fix grpc-swagger's build system so you can publish an artifact from there and then just use it via libraryDependencies.
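For comparison, once grpc-swagger is published somewhere (locally or to a real repository), consuming it from the plugin build is the usual one-liner; the organization and version below are made up purely for illustration:
// build.sbt of the plugin, assuming hypothetical published coordinates
libraryDependencies += "org.example" % "grpc-swagger" % "0.1.0"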
I have the following build.sbt with two sub-projects. Everything compiles and runs fine.
One is a thin Scala Play project; dataextractor has a lot of util classes I want to call in the Play project.
However, the configuration below still leads to the following compilation error:
[error] /Users/foo.bar/_vws/com.corp.enablement.scripts/sirf_extract_serve/tools_sirf_server/app/corp/tools/es_result_server/service/SystemInitializer.scala:6:21: object dataextraction is not a member of package corp.tools
[error] import corp.tools.dataextraction.providers.confluence
This is my first sbt multi-project build. Advice would be genuinely appreciated.
lazy val tools_dataextractor = (project in file("tools_dataextractor")).settings(
  Defaults.itSettings,
  libraryDependencies += scalatest % "it,test",
  name := "corp_tools_dataextractor",
  version := "0.1",
  mainClass in Compile := Some("corp.tools.ExtractionStartUp")
)

lazy val tools_sirf_server = (project in file("tools_sirf_server")).settings(
).enablePlugins(PlayScala).dependsOn(tools_dataextractor)

lazy val root = (project in file("."))
  .aggregate(tools_dataextractor, tools_sirf_server)
The configuration looks good.
Two possibilities for what the problem might be:
You are in an sbt console and haven't reloaded it after changing build.sbt
You work with IntelliJ and did not reload the sbt projects
If this does not help, add the steps you take to your question.
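For reference, the reload is just the standard sbt shell command sequence:
> reload
> compile
reload re-reads build.sbt (and everything under project/), so changes to the build definition are picked up before compiling.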
OK, the answer is a neophyte error.
I had a build.sbt in the root and a build.sbt in each sub-project (which is permissible).
Everything built fine... until I started adding dependencies from one sub-project to another. In that case, the "dependsOn" in the top-level build.sbt is ignored and compilation errors occur.
Side note: the main reason for keeping the sub-project build.sbt files was simply laziness. It took half a day to get everything working in a single build.sbt at the root level. However, it is definitely worth the effort.
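For anyone hitting the same thing, the consolidation just means folding each sub-project's build.sbt into the matching .settings(...) block of the root build.sbt; a rough sketch using only the settings already shown in the question (anything else that lived in the sub-project files moves the same way):
lazy val tools_dataextractor = (project in file("tools_dataextractor"))
  .settings(
    name := "corp_tools_dataextractor",
    version := "0.1",
    mainClass in Compile := Some("corp.tools.ExtractionStartUp")
    // plus whatever was previously in tools_dataextractor/build.sbt
  )
lazy val tools_sirf_server = (project in file("tools_sirf_server"))
  .enablePlugins(PlayScala)
  .dependsOn(tools_dataextractor)
lazy val root = (project in file("."))
  .aggregate(tools_dataextractor, tools_sirf_server)
With a single root build.sbt, the dependsOn between the two sub-projects is honoured and the import compiles.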
Following the hints of the post explaining the basics of migrating to Scala.js and this page about cross-compilation, I decided to add cross-compilation to my standalone, dependency-free Scala library by making the following changes:
I added a file project/plugins.sbt with the content
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.16")
I added scalaVersion in ThisBuild := "2.11.8" to build.sbt, because otherwise plain scalaVersion defaulted to 2.10.
I also added the following content to build.sbt to keep the same directory structure, since I don't have any JVM- or JavaScript-specific files:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {},
    publishLocal := {}
  )

lazy val foo = crossProject.crossType(CrossType.Pure).in(file(".")).
  settings(version := "0.1").
  jvmSettings(
    // Add JVM-specific settings here
  ).
  jsSettings(
    // Add JS-specific settings here
  )

lazy val fooJVM = foo.jvm
lazy val fooJS = foo.js
But now, after publishing the project locally using sbt publish-local, the projects depending on this library do not work anymore, i.e. they don't see the classes this library was offering and raise errors.
I looked into .ivy2/local/.../foo/0.1/jars and the JAR went from 1MB to 1KB, so the errors make sense.
However, how can I make sure the JVM jar file is compiled correctly?
Further information
The jar's timestamp does not change anymore; it looks like there had been some miscompilation. I deleted the .ivy2 cache, but now sbt publish-local always finishes successfully yet does not regenerate the files.
OK, I found the solution myself.
I needed to remove the publishLocal := {} from the build, and now all the projects depending on my library work fine.
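For completeness, a sketch of what the root definition looks like after that change, assuming the publish override is kept:
lazy val root = project.in(file(".")).
  aggregate(fooJS, fooJVM).
  settings(
    publish := {}
  )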
I'd like to know how to convert a regular Scala project into an sbt project. I've tried manually creating an sbt file in the root directory, correctly implemented, but IntelliJ still doesn't recognize this as an sbt project, i.e., it won't show me the "SBT" option under "View -> Tool Windows".
How should I go about this? What I'm actually attempting to do is to create an empty project with multiple (independent) modules.
From what I've gathered there seems to be no way to add a module directly with sbt support, am I right?
Thanks
Here is an example of a multi-project build. The root project "aggregates" them all in case you want to compile them all together or package them all together, etc. The "coreLibrary" project depends on the code of "coreA" and "coreB".
import sbt.Keys._
import sbt._
name := "MultiProject"
lazy val root = project.in(file(".")).aggregate(coreA, coreB, coreLibrary)
lazy val coreA = Project("CoreA", file("core-a")).settings(
  organization := "org.me",
  version := "0.1-SNAPSHOT"
)

lazy val coreB = Project("CoreB", file("core-b")).settings(
  organization := "org.me",
  libraryDependencies += "org.apache.kafka" %% "kafka" % "0.8.2-beta",
  version := "0.3-SNAPSHOT"
)

lazy val coreLibrary = Project("UberCore", file("core-main")).dependsOn(coreA, coreB).settings(
  organization := "org.me",
  version := "0.2-SNAPSHOT"
)
You can (for example) compile each project from the command line:
>sbt CoreB/compile
Or you can do this interactively:
>sbt
>project CoreB
>compile
I recommend using a single multi-module sbt project. sbt is a great build tool for Scala; you can do a lot of things with it, including checking out a single module from the repository and building it.
sbt
projects
project <helloProject>
Actually, this feature allows multiple people to work on the same project in parallel. Please take a look at this: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html.
I am trying to figure out how IDEA will recognize third-party dependencies when using sbt. When I use the sbt plugin's gen-idea task it seems to download all the necessary dependencies, which get put into my ~/.ivy/ directory as expected. How can IntelliJ use these dependencies?
EDIT:
One thing I noticed is that if I make a new IDEA project instead of just a module, then this works. Any idea why that would be? I would like to be able to have multiple sbt modules in the same project.
The sbt-idea plugin works with multi-module sbt projects. We have been using it since somewhere around sbt 0.10.0 and are currently at sbt 0.11.2. It seems like you have the dependency part of the build file set up OK, so here's an example of how we do the project setup in a full Build.scala specification:
object Vcaf extends Build {
  import Resolvers._
  import Dependencies._
  import BuildSettings._

  lazy val vcafDb = Project(
    id = "vcaf-db",
    base = file("./vcaf-db"),
    dependencies = Seq(),
    settings = buildSettings ++ /* proguard */ SbtOneJar.oneJarSettings ++ Seq(libraryDependencies := dbDeps, resolvers := cseResolvers)
  )

  lazy val vcaf = Project(
    "vcaf",
    file("."),
    dependencies = Seq(vcafDb),
    aggregate = Seq(vcafDb),
    settings = buildSettings ++ Seq(libraryDependencies := vcafDeps, resolvers := cseResolvers) ++ webSettings
  )
}
In the example, the vcaf-db project is in a folder within the vcaf project folder. The vcaf-db project does not have its own build.sbt or Build.scala file. You'll notice that we are specifying libraryDependencies for each project, which may or may not be your missing link.
As ChrisJamesC mentioned, you need to do a "reload" from within sbt (or exit sbt and come back in) to pick up changes to your build definition. After the project is reloaded, you should be able to run "gen-idea no-classifiers no-sbt-classifiers" and get an IntelliJ project that has the main project, modules, and library access as defined in the build file.
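In case the plugin itself is the missing piece: gen-idea comes from the sbt-idea plugin, which is typically wired up in project/plugins.sbt along these lines (the version here is only an example from that sbt era; use whichever release matches your sbt version):
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.5.2")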
Hope it helps!
If you want multiple SBT modules in one IDEA project, you can use sbt multi-project builds (aka subprojects). Just create a master project that refers to the modules as sub-projects, then run gen-idea on the master. To specify dependencies among the modules you have to use Build.scala (not build.sbt), as in jxstanford's answer or like this:
lazy val foo = Project(id = "foo", base = file("foo"))
lazy val bar = Project(id = "bar", base = file("bar")) dependsOn(foo)
One level of subprojects works fine (with the dependencies correctly reflected in the resulting IDEA project), but nested subprojects don't seem to work. Also, it seems to be an sbt restriction that the subprojects must live in subdirectories of the master project (i.e., file("../foo") is not allowed).
See also How to manage multiple interdependent modules with SBT and IntelliJ IDEA?.