Publish SBT project with sub-projects as a single Maven artifact - scala

I have an SBT project for a Scala library. The implementation is spread across multiple SBT sub-projects but I would like to publish the library as a single JAR. I've constructed the following example:
val subProject = project

val myLibrary = project
  .in(file("."))
  .dependsOn(subProject)
The project myLibrary is the main project, which depends on all the other sub-projects. If I run sbt myLibrary/package, I get a JAR file with just the class files of project myLibrary, and likewise only its sources and rendered documentation in the source and doc JARs. I found the following workaround to include the files from the other sub-projects, by adding settings like the following to myLibrary:
val myLibrary = project
  .in(file("."))
  .dependsOn(subProject)
  .settings(
    Compile / packageBin / mappings ++= (subProject / Compile / packageBin / mappings).value,
    Compile / packageSrc / mappings ++= (subProject / Compile / packageSrc / mappings).value,
    Compile / packageDoc / mappings ++= (subProject / Compile / packageDoc / mappings).value)
What is the intended way to produce the same result?

You can take a look at the sbt-assembly plugin, which creates a fat JAR (one that includes all the dependencies) of your project.
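For illustration, a minimal sketch of that approach, assuming sbt-assembly 1.x or newer is on the plugin classpath (the withIncludeScala call and the packageBin override follow that version's API):
lazy val myLibrary = project
  .in(file("."))
  .dependsOn(subProject)
  .settings(
    // A library jar should not bundle the Scala runtime itself
    assembly / assemblyOption := (assembly / assemblyOption).value.withIncludeScala(false),
    // Ship the assembled (merged) jar in place of the regular packageBin output
    Compile / packageBin := assembly.value
  )
Keep in mind that sbt-assembly is aimed at deployable fat JARs; for a library published to Maven, bundling third-party dependencies is usually undesirable, so the includeScala and dependency-filtering options matter here.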

Related

sbt plugin: add an unmanaged jar file

I'm trying to create a relatively simple sbt plugin to wrap the grpc-swagger artifact.
To that end, I've created a project with the following structure:
projectDir/
  build.sbt
  lib/grpc-swagger.jar  <- the artifact I've downloaded
  src/...
where build.sbt looks like the following:
ThisBuild / version := "0.0.1-SNAPSHOT"
ThisBuild / organization := "org.testPlugin"
ThisBuild / organizationName := "testPlugin"

lazy val root = (project in file("."))
  .enablePlugins(SbtPlugin)
  .settings(name := "grpc-swagger-test-plugin")
According to the sbt docs, that's all I have to do in order to include an unmanaged dependency, that is:
create a lib folder;
store the artifact in there;
However, when I execute sbt compile publishLocal, the published plugin lacks that external artifact.
So far I've tried to:
set the exportJars := true flag
add Compile / unmanagedJars += file("lib/grpc-swagger.jar") (along with variations of the path)
fiddle manually with libraryDependencies using the from file("lib/grpc-swagger.jar") specifier
but none of these has worked so far.
So how am I supposed to add an external artifact to an sbt plugin?
The proper solution to this problem is to publish the grpc-swagger library. If for whatever reason that can't be done from the library's own build system, you can do it with sbt. Just add a simple subproject whose only job is to publish that jar. It should work like so:
...
lazy val `grpc-swagger` = (project in file("."))
  .settings(
    name := "grpc-swagger",
    Compile / packageBin := baseDirectory.value / "lib" / "grpc-swagger.jar"
    // maybe other settings, such as grpc-swagger's libraryDependencies
  )

lazy val root = (project in file("."))
  .enablePlugins(SbtPlugin)
  .settings(name := "grpc-swagger-test-plugin")
  .dependsOn(`grpc-swagger`)
...
The pom file generated for the root project should now specify a dependency on grpc-swagger, and running the publish task in the grpc-swagger project will publish that jar along with a pom file.
That should be enough to make things work, but honestly, it's still a hack. The proper solution is to fix grpc-swagger's build system so you can publish an artifact from there and then just use it via libraryDependencies.
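Once it is published for real, consuming it reduces to an ordinary dependency declaration. The coordinates below are purely placeholders, since nothing in this thread says what grpc-swagger would actually be published under:
// Hypothetical coordinates -- replace with whatever grpc-swagger actually ships as
libraryDependencies += "io.github.grpc-swagger" % "grpc-swagger" % "0.1.0"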

How do I make sbt include non-Java sources in the published artifact?

I'm using the Kotlin plugin and can't figure out how to force sbt to include .kt files in the published sources jar. It only includes .java files.
A lot of people online suggest adding the following code to the sbt script, but it doesn't help:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
},
I also tried
includeFilter in (Compile, packageSrc) := "*.scala" || "*.java" || "*.kt",
Here is the output of some variables in the sbt console:
sbt:collections> show unmanagedSourceDirectories
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/scala
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/kotlin
sbt:collections> show unmanagedSources
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java/com/walkmind/extensions/collections/TestSomething.java
Which plugin do you use for Kotlin?
https://github.com/pfn/kotlin-plugin has the option kotlinSource to configure where the source directory is located.
sbt packageBin compiles Kotlin files and includes them in the output jar.
build.sbt
// define kotlin source directory
kotlinSource in Compile := baseDirectory.value / "src/main/kotlin",
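For completeness, the plugin itself is added in project/plugins.sbt. The coordinates below are taken from that plugin's README; the version may have moved on, so check before copying:
// pfn/kotlin-plugin, per its README -- verify the current version
addSbtPlugin("com.hanhuy.sbt" % "kotlin-plugin" % "2.0.0")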
src/main/kotlin/org/test
package org.test

fun main(args: Array<String>) {
  println("Hello World!")
}
console
sbt compile
sbt packageBin
The jar produced under target/scala-2.13 includes MainKt.class, and the folder org/test contains MainKt.class too.
Would this solve your problem?
I found a workaround for this in my project https://github.com/makiftutuncu/e. I did the following: https://github.com/makiftutuncu/e/blob/master/project/Settings.scala#L105
Basically, I added the following setting in SBT to properly generate the sources artifact:
// Include Kotlin files in sources
packageConfiguration in (Compile, packageSrc) := {
  val old = (packageConfiguration in (Compile, packageSrc)).value
  val newSources = (sourceDirectories in Compile).value.flatMap(dir => (dir ** "*.kt").get)

  new Package.Configuration(
    old.sources ++ newSources.map(f => f -> f.getName),
    old.jar,
    old.options
  )
}
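A lighter-weight route to the same result is to append the Kotlin files to the source-jar mappings directly. Unlike the managedSources-based snippet quoted in the question, this one globs for .kt files explicitly (a sketch; like the setting above, it flattens each file to its bare name inside the jar):
// Add every .kt file under the compile source directories to the sources jar
mappings in (Compile, packageSrc) ++=
  (sourceDirectories in Compile).value
    .flatMap(dir => (dir ** "*.kt").get)
    .map(f => f -> f.getName)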
For the documentation artifact, I added a Gradle build to my Kotlin module, set up as shown here: https://github.com/makiftutuncu/e/blob/master/e-kotlin/build.gradle.kts. This way, the Gradle build generates the Dokka documentation. Finally, I added the following setting in SBT to run Gradle while building docs:
// Delegate doc generation to Gradle and Dokka
doc in Compile := {
  import sys.process._
  Process(Seq("./gradlew", "dokkaJavadoc"), baseDirectory.value).!
  target.value / "api"
}
I admit, this is a lot of work just to get two artifacts, but it did the trick for me. 🤷🏻 Hope this helps.

Scala source external to SBT project folder

When creating an SBT Scala project in IntelliJ, it wants me to put all source code in the src/main/scala folder of the project folder. However, I want to put the source code somewhere else, independent of particular projects, build tools, IDEs, etc.
How do I tell it to get the source from arbitrary locations?
You can use the following settings to change the location where SBT looks for your Scala sources:
// Change Scala source directory. Default is:
// scalaSource in Compile := baseDirectory.value / "src" / "main" / "scala"
scalaSource in Compile := file("<Path to your sources...>")
// Change Scala test source directory. Default is:
// scalaSource in Test := baseDirectory.value / "src" / "test" / "scala"
scalaSource in Test := file("<Path to your test sources...>")
For further information (including how to specify other source, library and resource directories), refer to the official SBT documentation for customizing paths.
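If you want to add an extra location rather than replace the default one, appending to unmanagedSourceDirectories is the less invasive variant (the path is a placeholder, as above):
// Keep src/main/scala and additionally compile sources from an external folder
unmanagedSourceDirectories in Compile += file("<Path to your shared sources...>")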

How to use SBT to run ScalaTest tests against a fat jar?

I have a simple SBT project, consisting of some Scala code in src/main/scala and some test code in src/test/scala. I use the sbt-assembly plugin to create a fat jar for deployment onto remote systems. The fat jar includes all the dependencies of the Scala project, including the Scala runtime itself. This all works great.
Now I'm trying to figure out a way to run the Scala tests against the fat jar. I tried the obvious thing: creating a new config extending the Test config and modifying the dependencyClasspath to be the fat JAR instead of the default value. However, this fails, I assume because the Scala runtime included in the fat jar somehow collides with the already-loaded Scala runtime.
My solution right now works, but it has serious drawbacks. I just use Fork.java to invoke Java on the org.scalatest.tools.Runner runner, with a classpath set to include the test code, the fat jar, and all of the test dependencies. The downside is that none of the SBT test richness works: there's no testQuick, there's no testOnly, and the test failure reporting is on stdout.
My question boils down to this: how does one use SBT's test commands to run tests when those tests are dependent not on their corresponding SBT compile output, but on a fat JAR file which itself includes all the Scala runtimes?
This is what I landed on (for specs2, but can be adapted). This is basically what you said was your Fork solution, but I figured I'd leave this here in case someone wanted to know what that might be. Unfortunately I don't think you can run this "officially" as a SBT test runner. I should also add that you still want Fork.java even though this is Scala, because Fork.scala depends on a runner class that I don't seem to have.
test.sbt (or build.sbt, if you want to put a bunch of stuff there - SBT reads all .sbt files in the root if you want to organize):
// Set up configuration for building a test assembly
Test / assembly / assemblyJarName := s"${name.value}-test-${version.value}.jar"
Test / assembly / assemblyMergeStrategy := (assembly / assemblyMergeStrategy).value
Test / assembly / assemblyOption := (assembly / assemblyOption).value
Test / assembly / assemblyShadeRules := (assembly / assemblyShadeRules).value
Test / assembly / mainClass := Some("org.specs2.runner.files")

Test / test := {
  (Test / assembly).value
  val assembledFile: String = (Test / assembly / assemblyOutputPath).value.getAbsolutePath
  val minimalClasspath: Seq[String] = (Test / assembly / fullClasspath).value
    .filter(_.metadata.get(moduleID.key).get.organization.matches("^(org\\.(scala-lang|slf4j)|log4j).*"))
    .map(_.data.getAbsolutePath)
  val runClass: String = (Test / assembly / mainClass).value.get
  val classPath: Seq[String] = Seq(assembledFile) ++ minimalClasspath
  val args: Seq[String] = Seq("-cp", classPath.mkString(":"), runClass)
  val exitCode = Fork.java((Test / assembly / forkOptions).value, args)
  if (exitCode != 0) {
    throw new TestsFailedException()
  }
}

Test / assembly / test := {}
Change in build.sbt:
lazy val root = (project in file("."))
  .settings(/* your original settings are here */)
  .settings(inConfig(Test)(baseAssemblySettings): _*) // enable assembling in test
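With those settings in place, the normal entry points drive the whole flow. As I read the redefined tasks above, the behavior should be:
sbt test          # builds the test fat jar, then forks org.specs2.runner.files against it
sbt Test/assembly # just builds the test fat jar (running tests for it is disabled via Test / assembly / test := {})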

How to exclude resources during packaging with SBT but not during testing

I have a bunch of conf files in a project structure like this:
```
src/main/resources/live.conf
src/test/resources/test.conf
```
I want to exclude live.conf from the artifact that is built when I run sbt one-jar (using the one-jar plugin). I added this line, which unfortunately also excludes test.conf when running sbt test:compile:
excludeFilter in Runtime in unmanagedResources := "*.conf"
How can I exclude live.conf in the artifact jar but not for the tests?
This should help:
mappings in (Compile, packageBin) ~= { _.filter(!_._1.getName.endsWith(".conf")) }
packageBin is the task that produces your jar artifact, and mappings denotes the files which are used for compilation and project packaging in the Compile scope.
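If the goal is to drop only live.conf rather than every .conf file, the same trick works with a tighter predicate:
// Exclude just live.conf from the packaged jar; test.conf on the test classpath is untouched
mappings in (Compile, packageBin) ~= { _.filterNot(_._1.getName == "live.conf") }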