How do I make sbt include non-Java sources in a published artifact?
I'm using the Kotlin plugin and can't figure out how to force sbt to include .kt files in the published source jar. It only includes .java files.
A lot of people online suggest adding the following code to the sbt build, but it doesn't help:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
},
I also tried:
includeFilter in (Compile, packageSrc) := "*.scala" || "*.java" || "*.kt",
Here is the output of some relevant settings in the sbt console:
sbt:collections> show unmanagedSourceDirectories
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/scala
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/kotlin
sbt:collections> show unmanagedSources
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java/com/walkmind/extensions/collections/TestSomething.java
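Given that output, sbt sees the Kotlin directory among the unmanaged source directories, but the .kt files never make it into unmanagedSources, so packageSrc never sees them. A minimal sketch of a manual workaround (assuming the sbt 0.13-style scoping used in the snippets above) is to map the .kt files into the source jar yourself:

// Sketch: pick up *.kt files from every unmanaged source directory and map
// them into the source jar with paths relative to their source directory.
mappings in (Compile, packageSrc) ++= {
  val dirs = (unmanagedSourceDirectories in Compile).value
  dirs.flatMap { dir =>
    (dir ** "*.kt").get.map(f => f -> f.relativeTo(dir).get.getPath)
  }
}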
Which plugin do you use for Kotlin?
https://github.com/pfn/kotlin-plugin has the option kotlinSource to configure where the source directory is located.
sbt packageBin compiles the Kotlin files and includes them in the output jar.
build.sbt
// define kotlin source directory
kotlinSource in Compile := baseDirectory.value / "src/main/kotlin",
src/main/kotlin/org.test
package org.test
fun main(args: Array<String>) {
    println("Hello World!")
}
console
sbt compile
sbt packageBin
target/scala-2.13
The jar includes MainKt.class, and the folder org/test contains MainKt.class too.
Would this solve your problem?
I found a workaround for this in my project https://github.com/makiftutuncu/e. I did the following: https://github.com/makiftutuncu/e/blob/master/project/Settings.scala#L105
Basically, I added the following setting in SBT to properly generate the sources artifact:
// Include Kotlin files in sources
// note: scoped to the packageSrc task so the source-jar packaging actually picks it up
packageConfiguration in (Compile, packageSrc) := {
  val old = (packageConfiguration in (Compile, packageSrc)).value
  val newSources = (sourceDirectories in Compile).value.flatMap(dir => (dir ** "*.kt").get)
  new Package.Configuration(
    old.sources ++ newSources.map(f => f -> f.getName),
    old.jar,
    old.options
  )
}
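One caveat worth noting: mapping each file to f.getName flattens everything to the root of the source jar, so same-named files in different packages would clash. A variant sketch (same assumptions as above) that preserves package-relative paths instead:

packageConfiguration in (Compile, packageSrc) := {
  val old = (packageConfiguration in (Compile, packageSrc)).value
  // map each .kt file to its path relative to the source directory containing it
  val kt = (sourceDirectories in Compile).value.flatMap { dir =>
    (dir ** "*.kt").get.map(f => f -> f.relativeTo(dir).get.getPath)
  }
  new Package.Configuration(old.sources ++ kt, old.jar, old.options)
}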
For the documentation artifact, I added a Gradle build to my Kotlin module, set up as shown here: https://github.com/makiftutuncu/e/blob/master/e-kotlin/build.gradle.kts. This way, the Gradle build generates the Dokka documentation. Finally, I added the following setting in SBT to run Gradle while building docs:
// Delegate doc generation to Gradle and Dokka
doc in Compile := {
  import sys.process._
  Process(Seq("./gradlew", "dokkaJavadoc"), baseDirectory.value).!
  target.value / "api"
}
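One possible hardening, not part of the original setup: the exit code of the Gradle process is ignored above, so a Dokka failure would pass silently. A sketch that fails the build instead:

// Sketch: delegate doc generation to Gradle, but abort if Gradle fails.
doc in Compile := {
  import sys.process._
  val exitCode = Process(Seq("./gradlew", "dokkaJavadoc"), baseDirectory.value).!
  if (exitCode != 0) sys.error(s"gradlew dokkaJavadoc failed with exit code $exitCode")
  target.value / "api"
}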
I admit, this is a lot of work just to get two artifacts, but it did the trick for me. 🤷🏻 Hope this helps.
Related
I have a simple SBT project, consisting of some Scala code in src/main/scala and some test code in src/test/scala. I use the sbt-assembly plugin to create a fat jar for deployment onto remote systems. The fat jar includes all the dependencies of the Scala project, including the Scala runtime itself. This all works great.
Now I'm trying to figure out a way to run the Scala tests against the fat jar. I tried the obvious thing: creating a new config extending the Test config and modifying the dependencyClasspath to be the fat JAR instead of the default value. However, this fails, I assume because the Scala runtime is included in the fat jar and somehow collides with the already-loaded Scala runtime.
My solution right now works but it has serious drawbacks. I just use Fork.java to invoke Java on the org.scalatest.tools.Runner runner with a classpath set to include the test code, the fat jar, and all of the test dependencies. The downside is that none of SBT's test richness works: there's no testQuick, there's no testOnly, and the test failure reporting is on stdout.
My question boils down to this: how does one use SBT's test commands to run tests when those tests are dependent not on their corresponding SBT compile output, but on a fat JAR file which itself includes all the Scala runtimes?
This is what I landed on (for specs2, but it can be adapted). It's basically the Fork solution you described, but I figured I'd leave it here in case someone wants to know what that might look like. Unfortunately, I don't think you can run this "officially" as an SBT test runner. I should also add that you still want Fork.java even though this is Scala, because Fork.scala depends on a runner class that I don't seem to have.
test.sbt (or build.sbt, if you want to put a bunch of stuff there - SBT reads all .sbt files in the root, if you want to organize):
// Set up configuration for building a test assembly
Test / assembly / assemblyJarName := s"${name.value}-test-${version.value}.jar"
Test / assembly / assemblyMergeStrategy := (assembly / assemblyMergeStrategy).value
Test / assembly / assemblyOption := (assembly / assemblyOption).value
Test / assembly / assemblyShadeRules := (assembly / assemblyShadeRules).value
Test / assembly / mainClass := Some("org.specs2.runner.files")

Test / test := {
  (Test / assembly).value
  val assembledFile: String = (Test / assembly / assemblyOutputPath).value.getAbsolutePath
  val minimalClasspath: Seq[String] = (Test / assembly / fullClasspath).value
    .filter(_.metadata.get(moduleID.key).get.organization.matches("^(org\\.(scala-lang|slf4j)|log4j).*"))
    .map(_.data.getAbsolutePath)
  val runClass: String = (Test / assembly / mainClass).value.get
  val classPath: Seq[String] = Seq(assembledFile) ++ minimalClasspath
  val args: Seq[String] = Seq("-cp", classPath.mkString(":"), runClass)
  val exitCode = Fork.java((Test / assembly / forkOptions).value, args)
  if (exitCode != 0) {
    throw new TestsFailedException()
  }
}

Test / assembly / test := {}
Change in build.sbt:
lazy val root = (project in file("."))
  .settings(/* your original settings are here */)
  .settings(inConfig(Test)(baseAssemblySettings): _*) // enable assembling in test
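For reference, and assuming the settings above, the forked process that Test / test launches is roughly equivalent to invoking the specs2 files runner by hand, with the test assembly plus the filtered scala-lang/slf4j/log4j jars on the classpath (jar name and dependency paths are placeholders here):

java -cp <name>-test-<version>.jar:<scala-lang, slf4j and log4j jars> org.specs2.runner.files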
I have a Scala project, using SBT. I have a directory html inside my project which needs to be copied when the project is run with sbt run, or when I package it into a jar using sbt-assembly. Either way, I expect to end up with the html directory copied to target/scala-2.11/classes/html.
I have tried:
resourceDirectory in Compile := file("html")
...which moves each of the files inside html to target/scala-2.11/classes without the intermediate html directory.
and:
unmanagedResources in Compile := Seq(file("html"))
...which copies the directory, but none of the files inside it!
Maybe not so nice, but working:
val html = "html"

lazy val compileCopyTask = taskKey[Unit](s"Copy $html.")

compileCopyTask := {
  println(s"Start copying $html")
  val mainVersion = scalaVersion.value.split("""\.""").take(2).mkString(".")
  // copy into classes/html (not html/classes) so the files end up on the classpath
  val to = target.value / ("scala-" + mainVersion) / "classes" / html
  to.mkdirs()
  val from = baseDirectory.value / html
  IO.copyDirectory(from, to)
  println(s"$from -> $to...done.")
}

compile in Compile := {
  compileCopyTask.value
  (compile in Compile).value
}
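A variant of the same idea that leans on sbt's own resource handling (an untested sketch, reusing the html val from above): generate the files under resourceManaged with an html/ prefix, and let sbt's copyResources place them in classes/html for run, packageBin, and sbt-assembly alike:

// Sketch: copy html/** into resource_managed/main/html; sbt then copies it
// into target/scala-2.11/classes/html along with the other resources.
resourceGenerators in Compile += Def.task {
  val from = baseDirectory.value / html
  val to   = (resourceManaged in Compile).value / html
  IO.copyDirectory(from, to)
  (to ** "*").get.filter(_.isFile)
}.taskValue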
If you run sbt copy-resources after sbt compile, you might have some luck. Ran into this recently.
Your html folder will have to be in src/main/resources, or wherever your resourceDirectory is set in your build.
I have a build.sbt that references a child project's main class as its own main class:
lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging)

lazy val api = project in file("api")

scalaVersion := "2.11.6"

// This is referencing API code
mainClass in (Compile, run) := Some("maslow.akka.cluster.node.ClusterNode")

artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  s"""${artifact.name}.${artifact.extension}"""
}

name in Universal := name.value
packageName in Universal := name.value
However, each time I run sbt run I get the following error:
> run
[info] Updating {file:/Users/mark/dev/Maslow-Akka/}api...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Updating {file:/Users/mark/dev/Maslow-Akka/}akka...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Running maslow.akka.cluster.node.ClusterNode
[error] (run-main-0) java.lang.ClassNotFoundException: maslow.akka.cluster.node.ClusterNode
java.lang.ClassNotFoundException: maslow.akka.cluster.node.ClusterNode
at java.lang.ClassLoader.findClass(ClassLoader.java:530)
While researching the problem, I first switched from the akka project to the api project and then opened up console. From there, it couldn't find the maslow package even though it most certainly exists. After that, I went into the api folder itself, ran sbt console, and it accessed the aforementioned package just fine. After I do this, sbt run from the akka project works. Why?
The folder api is pulled in via git read-tree. There shouldn't be anything special about it. I'm using sbt 0.13.5
I think in a multi-project build a global line such as
mainClass in (Compile, run) := ...
will just be swallowed without consequences, as it doesn't refer to any project.
Probably the following works:
mainClass in (Compile, run) in ThisBuild := ...
Or you add it to the root project's settings:
lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging)
  .settings(
    mainClass in (Compile, run) := ...
  )
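Concretely, with the main class from the question filled in, the root project's settings would look something like this (a sketch):

lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging)
  .settings(
    mainClass in (Compile, run) := Some("maslow.akka.cluster.node.ClusterNode")
  )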
The problem was this line: lazy val api = project in file("api")
From the docs:
When defining a dependency on another project, you provide a ProjectReference. In the simplest case, this is a Project object. (Technically, there is an implicit conversion Project => ProjectReference) This indicates a dependency on a project within the same build.
This indicates a dependency within the same build. Instead, what I needed was to use RootProject since api is an external build:
It is possible to declare a dependency on a project in a directory separate from the current build, in a git repository, or in a project packaged into a jar and accessible via http/https. These are referred to as external builds and projects. You can reference the root project in an external build with RootProject:
To solve this problem, I moved the project declarations out of build.sbt into project/Build.scala:
import sbt._

object MyBuild extends Build {
  lazy val akka = Project("akka", file(".")).aggregate(api).dependsOn(api)
  lazy val api = RootProject(file("api"))
}
To be clear, the problem was that my api sub-project was a ProjectRef and not a RootProject.
I'm using the IvyDE Eclipse plugin to resolve the dependencies of my project (let's call it A). One of those dependencies is another project (call it B) which I build using sbt. The final part of the build is to invoke sbt's publishLocal task to publish the B project's artefacts to the local Ivy repository.
It all works fine up to this point. However, when I then try to resolve the project A dependencies, IvyDE generates an error message saying that it failed to resolve the "doc" and "src" jars. I can force it to work in the end because it doesn't fail to resolve the main jar containing the actual code. Still, I find it somewhat annoying.
After spending a considerable amount of time looking for a solution, I have come up with the following sbt code to change my project B configuration:
import java.io.File

import sbt.Build
import sbt.Project
import sbt.PathExtra
import sbt.SettingKey
import sbt.Artifact
import sbt.Keys.{ artifact, artifactName, artifactPath, packageSrc, packageDoc, crossTarget, projectID, scalaVersion, scalaBinaryVersion, moduleName }
import sbt.ScalaVersion
import sbt.Configurations.Compile

/**
 * The objective here is to make the artefacts published to the local repository by the publishLocal task
 * compatible with the local resolver used by IvyDE.
 * This is achieved by dropping the classifiers from the "doc" and "source" artefacts, and by adding
 * an extra directory level to their paths to avoid clashing with the main "jar" artefact.
 */
object SbtIvyFix extends Build with PathExtra {
  lazy override val projects = Seq(root)

  lazy val root: Project = Project("xlstocsv", new File(".")) settings (
    artifact in (Compile, packageSrc) := {
      (artifact in (Compile, packageSrc)).value.copy(classifier = None)
    },
    artifact in (Compile, packageDoc) := {
      (artifact in (Compile, packageDoc)).value.copy(classifier = None)
    },
    artifactPath in (Compile, packageSrc) <<= myArtifactPathSetting(artifact in (Compile, packageSrc)),
    artifactPath in (Compile, packageDoc) <<= myArtifactPathSetting(artifact in (Compile, packageDoc)))

  // Lifted from the sbt artifactPathSetting() function in Defaults.scala
  def myArtifactPathSetting(art: SettingKey[Artifact]) =
    (crossTarget, projectID, art, scalaVersion in artifactName, scalaBinaryVersion in artifactName, artifactName) {
      (t, module, a, sv, sbv, toString) =>
        t / a.`type` / toString(ScalaVersion(sv, sbv), module, a)
    }
}
This works quite well but feels somewhat over the top. Can someone suggest a simpler way of achieving the same result?
My code (Java) reads an image from a jar:
Main.class.getResourceAsStream("/res/logo.png")
Everything runs fine (if I start the app after packaging it into a jar). But when I run it using sbt's run task, it returns null instead of the needed stream.
Running this from sbt console also gives null:
getClass.getResourceAsStream("/res/logo.png")
Is there a way to tell sbt to put my resources on classpath?
EDIT:
I set the resources dir to be the same as the source dir:
build.sbt:
resourceDirectory <<= baseDirectory { _ / "src" }
When I loaded sbt's console and ran the following:
classOf[Main].getProtectionDomain().getCodeSource()
I got the location of my classes, but it contains neither the res folder nor any of my resource files.
It seems that sbt copies resources only to the resulting jar and does not copy them to the classes dir. Should I modify the compile task to move these resource files to the classes dir?
EDIT2:
Yes, when I manually copy the resource file to the classes dir, I can easily access it from the console. So, how should I automate this process?
EDIT3:
It seems that sbt is just unable to see my resource folder - it does not actually add the files to the resulting jar file!
Solution:
resourceDirectory in Compile <<= baseDirectory { _ / "src" }
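(Side note: the <<= operator is old sbt syntax and was removed in sbt 1.x; the equivalent with := would be the following sketch.)

resourceDirectory in Compile := baseDirectory.value / "src"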
I can't give you a full solution right now, but there is a setting called resourceDirectories to which you could add the res folder.
[EDIT]
For me it also didn't work when the resource was in the standard resource folder. Please try it this way:
Main.class.getClassLoader().getResourceAsStream("icon.png")
[EDIT2] This is the full build script (build.scala) which works if your resource is in src/main/java:
import sbt._
import Keys._

object TestBuild extends Build {
  lazy val buildSettings = Seq(
    organization := "com.test",
    version := "1.0-SNAPSHOT",
    scalaVersion := "2.9.1"
  )

  lazy val test = Project(
    id = "test",
    base = file("test"),
    // buildSettings is included here so the common settings above actually apply
    settings = Defaults.defaultSettings ++ buildSettings ++ Seq(resourceDirectory in Compile <<= javaSource in Compile)
  )
}
}