How to make the sbt publishLocal task compatible with IvyDE?

I'm using the IvyDE Eclipse plugin to resolve my project's (let's call it A) dependencies. One of those dependencies is another project (call it B) which I build using sbt. The final part of the build is to invoke sbt's publishLocal task to publish the B project's artefacts to the local Ivy repository.
It all works fine up to this point. However, when I then try to resolve the project A dependencies, IvyDE generates an error message saying that it failed to resolve the "doc" and "src" jars. I can eventually force it to work, since resolution of the main jar containing the actual code succeeds, but I still find it somewhat annoying.
After spending a considerable amount of time looking for a solution, I have come up with the following sbt code to change my project B configuration:
import java.io.File

import sbt.{ Artifact, Build, PathExtra, Project, ScalaVersion, SettingKey }
import sbt.Configurations.Compile
import sbt.Keys.{ artifact, artifactName, artifactPath, packageSrc, packageDoc,
  crossTarget, projectID, scalaVersion, scalaBinaryVersion }
/**
 * The objective here is to make the artefacts published to the local repository by the
 * publishLocal task compatible with the local resolver used by IvyDE.
 * This is achieved by dropping the classifiers from the "doc" and "source" artefacts, and by
 * adding an extra directory level to their paths to avoid clashing with the main "jar" artefact.
 */
object SbtIvyFix extends Build with PathExtra {
  override lazy val projects = Seq(root)

  lazy val root: Project = Project("xlstocsv", new File(".")) settings (
    artifact in (Compile, packageSrc) := {
      (artifact in (Compile, packageSrc)).value.copy(classifier = None)
    },
    artifact in (Compile, packageDoc) := {
      (artifact in (Compile, packageDoc)).value.copy(classifier = None)
    },
    artifactPath in (Compile, packageSrc) <<= myArtifactPathSetting(artifact in (Compile, packageSrc)),
    artifactPath in (Compile, packageDoc) <<= myArtifactPathSetting(artifact in (Compile, packageDoc))
  )

  // Lifted from the artifactPathSetting() function in sbt's Defaults.scala
  def myArtifactPathSetting(art: SettingKey[Artifact]) =
    (crossTarget, projectID, art, scalaVersion in artifactName, scalaBinaryVersion in artifactName, artifactName) {
      (t, module, a, sv, sbv, toString) =>
        t / a.`type` / toString(ScalaVersion(sv, sbv), module, a)
    }
}
This works quite well but feels somewhat over the top. Can someone suggest a simpler way of achieving the same result?
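One simpler option, if the doc and source jars aren't actually needed in the local repository (an assumption; the question doesn't say), is to stop publishing them altogether, so there is nothing for IvyDE to fail on:
// A minimal sketch, assuming the doc/source jars are dispensable locally:
// skip publishing them instead of renaming them.
publishArtifact in (Compile, packageDoc) := false
publishArtifact in (Compile, packageSrc) := false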


How do I make sbt include non-Java sources in the published artifact?
I'm using the Kotlin plugin and can't figure out how to force sbt to include .kt files in the published sources jar. It only includes .java files.
A lot of people online suggest adding the following code to the sbt script, but it doesn't help:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
}
I also tried
includeFilter in (Compile, packageSrc) := "*.scala" || "*.java" || "*.kt"
Here is the output of some variables in the sbt console:
sbt:collections> show unmanagedSourceDirectories
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/scala
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/kotlin
sbt:collections> show unmanagedSources
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java/com/walkmind/extensions/collections/TestSomething.java
Which plugin do you use for Kotlin?
https://github.com/pfn/kotlin-plugin has the option kotlinSource to configure where the source directory is located.
sbt packageBin compiles the Kotlin files and includes them in the output jar.
build.sbt
// define kotlin source directory
kotlinSource in Compile := baseDirectory.value / "src/main/kotlin"
src/main/kotlin/org.test
package org.test
fun main(args: Array<String>) {
println("Hello World!")
}
console
sbt compile
sbt packageBin
Under target/scala-2.13, the generated jar includes MainKt.class, and the folder org/test contains MainKt.class too.
Would this solve your problem?
I found a workaround for this in my project https://github.com/makiftutuncu/e. I did the following: https://github.com/makiftutuncu/e/blob/master/project/Settings.scala#L105
Basically, I added the following setting in sbt to properly generate the sources artifact:
// Include Kotlin files in sources
packageConfiguration in (Compile, packageSrc) := {
  val old = (packageConfiguration in (Compile, packageSrc)).value
  val newSources = (sourceDirectories in Compile).value.flatMap(dir => (dir ** "*.kt").get)
  new Package.Configuration(
    old.sources ++ newSources.map(f => f -> f.getName),
    old.jar,
    old.options
  )
}
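For reference, the snippet the asker tried maps managedSources, which covers generated code only; .kt files under src/main/kotlin are unmanaged sources, which is why it has no effect. A hedged alternative sketch (my own, not the answerer's code) that adds them to the sources jar with their relative paths preserved:
// A sketch: collect .kt files from the unmanaged source directories and
// add them to the sources jar, keeping paths relative to each directory.
mappings in (Compile, packageSrc) ++= {
  val dirs = (unmanagedSourceDirectories in Compile).value
  dirs.flatMap { dir =>
    (dir ** "*.kt").get.flatMap { f =>
      f.relativeTo(dir).map(rel => f -> rel.getPath)
    }
  }
}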
For the documentation artifact, I added a Gradle build to my Kotlin module, set up as shown here: https://github.com/makiftutuncu/e/blob/master/e-kotlin/build.gradle.kts. This way, the Gradle build generates the Dokka documentation. Finally, I added the following setting in sbt to run Gradle while building docs:
// Delegate doc generation to Gradle and Dokka
doc in Compile := {
  import sys.process._
  Process(Seq("./gradlew", "dokkaJavadoc"), baseDirectory.value).!
  target.value / "api"
}
I admit this is a lot of work just to get two artifacts, but it did the trick for me. 🤷🏻 Hope this helps.

Intertwined dependencies between sbt plugin and projects within multi-project build that uses the plugin itself

I'm developing a library that includes an sbt plugin. Naturally, I'm using sbt to build this (multi-project) library. My (simplified) project looks as follows:
myProject/                 # Top level of library
 -> models                 # One project in the multi-project sbt build.
    -> src/main/scala/...  # Defines common models for both sbt-plugin and framework
 -> sbt-plugin             # The sbt plugin build
    -> src/main/scala/...
 -> framework              # The framework. Ideally, the sbt plugin is run as part of
    -> src/main/scala/...  # compiling this directory.
 -> project/               # Multi-project build configuration
Is there a way to have the sbt-plugin defined in myProject/sbt-plugin be hooked into the build for myProject/framework all in a unified build?
Note: similar (but simpler) question: How to develop sbt plugin in multi-project build with projects that use it?
Is there a way to have the sbt-plugin defined in myProject/sbt-plugin be hooked into the build for myProject/framework all in a unified build?
I have a working example on Github eed3si9n/plugin-bootstrap. It's not super pretty, but it kind of works. We can take advantage of the fact that sbt is recursive.
The project directory is another build inside your build, which knows how to build your build. To distinguish the builds, we sometimes use the term proper build to refer to your build, and meta-build to refer to the build in project. The projects inside the metabuild can do anything any other project can do. Your build definition is an sbt project.
By extension, we can think of sbt plugins as library or inter-project dependencies of the root project of your metabuild.
meta build definition (project/plugins.sbt)
In this example, think of the metabuild as a parallel universe or shadow world whose multi-project structure mirrors the proper build (root, model, sbt-plugin).
To reuse the source code from the model and sbt-plugin subprojects in the proper build, we can re-create the multi-project build in the metabuild. This way we don't run into a circular dependency.
addSbtPlugin("com.eed3si9n" % "sbt-doge" % "0.1.5")
lazy val metaroot = (project in file(".")).
dependsOn(metaSbtSomething)
lazy val metaModel = (project in file("model")).
settings(
sbtPlugin := true,
scalaVersion := "2.10.6",
unmanagedSourceDirectories in Compile :=
mirrorScalaSource((baseDirectory in ThisBuild).value.getParentFile / "model")
)
lazy val metaSbtSomething = (project in file("sbt-plugin")).
dependsOn(metaModel).
settings(
sbtPlugin := true,
scalaVersion := "2.10.6",
unmanagedSourceDirectories in Compile :=
mirrorScalaSource((baseDirectory in ThisBuild).value.getParentFile / "sbt-plugin")
)
def mirrorScalaSource(baseDirectory: File): Seq[File] = {
val scalaSourceDir = baseDirectory / "src" / "main" / "scala"
if (scalaSourceDir.exists) scalaSourceDir :: Nil
else sys.error(s"Missing source directory: $scalaSourceDir")
}
When sbt loads up, it will build metaModel and metaSbtSomething first, and use metaSbtSomething as a plugin to your proper build.
If you need any other plugins, you can just add them to project/plugins.sbt normally, as I've done with sbt-doge.
proper build (build.sbt)
The proper build looks like a normal multi-project build.
As you can see, the framework subproject uses SomethingPlugin. The important thing is that they share the source code, but the target directories are completely separate, so there is no interference between the builds once the proper build is loaded and you are changing code.
import Dependencies._

lazy val root = (project in file(".")).
  aggregate(model, framework, sbtSomething).
  settings(inThisBuild(List(
      scalaVersion := scala210,
      organization := "com.example"
    )),
    name := "Something Root"
  )

// Defines common models for both sbt-plugin and framework
lazy val model = (project in file("model")).
  settings(
    name := "Something Model",
    crossScalaVersions := Seq(scala211, scala210)
  )

// The framework. Ideally, the sbt plugin is run as part of building this.
lazy val framework = (project in file("framework")).
  enablePlugins(SomethingPlugin).
  dependsOn(model).
  settings(
    name := "Something Framework",
    crossScalaVersions := Seq(scala211, scala210),
    // using sbt-something
    somethingX := "a"
  )

lazy val sbtSomething = (project in file("sbt-plugin")).
  dependsOn(model).
  settings(
    sbtPlugin := true,
    name := "sbt-something",
    crossScalaVersions := Seq(scala210)
  )
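The build above imports Dependencies._ for the scala210 and scala211 version constants. That file isn't shown in the answer; a minimal reconstruction (the exact version numbers are my assumption) would live in project/Dependencies.scala:
// project/Dependencies.scala: hypothetical reconstruction of the
// version constants imported by build.sbt above.
object Dependencies {
  val scala210 = "2.10.6"
  val scala211 = "2.11.8"
}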
demo
In the SomethingPlugin example, I'm defining a something task that uses foo.Model.x.
package foo

import sbt._

object SomethingPlugin extends AutoPlugin {
  override def requires = sbt.plugins.JvmPlugin

  object autoImport {
    lazy val something = taskKey[Unit]("")
    lazy val somethingX = settingKey[String]("")
  }
  import autoImport._

  override def projectSettings = Seq(
    something := { println(s"something! ${Model.x}") }
  )
}
Here's how we can invoke something task from the build:
Something Root> framework/something
something! 1
[success] Total time: 0 s, completed May 29, 2016 3:01:07 PM
The 1 comes from foo.Model.x, so this demonstrates that we are using the sbt-something plugin in the framework subproject, and that the plugin is using metaModel.
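As a side note, framework sets somethingX := "a" but the plugin above never reads it. A hedged sketch of how the task could consume that setting (my extension, not part of the original demo):
// Hypothetical variant of projectSettings in SomethingPlugin: the task
// also reads the somethingX setting from the project it is enabled on.
override def projectSettings = Seq(
  something := { println(s"something! ${Model.x} ${somethingX.value}") }
)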

sbt-assembly include test classes

I followed "sbt-assembly: including test classes" using the config described in https://github.com/sbt/sbt-assembly, which works OK when running assembly.
When I load sbt, I get:
assembly.sbt:5: error: reference to jarName is ambiguous;
it is imported twice in the same scope by
import sbtassembly.AssemblyKeys._
and import _root_.sbtassembly.AssemblyPlugin.autoImport._
jarName in (Test, assembly) := s"${name.value}-test-${version.value}.jar"
^
So I commented out the import line and ran test:assembly; the tests ran, but no -test-.jar was generated.
Does anyone know how to generate the jar that includes the test classes?
Thanks
I had to remove this line (I think it is now auto-imported, based on https://github.com/sbt/sbt-assembly/blob/546d200477b64e2602beeb65bfa04306122cd9f5/Migration.md):
import sbtassembly.AssemblyKeys._
And I added the rest (i.e. the two lines below) to build.sbt instead of assembly.sbt:
Project.inConfig(Test)(baseAssemblySettings)
jarName in (Test, assembly) := s"${name.value}-test-${version.value}.jar"
After taking those steps, test:assembly does produce a test jar for me. However, I expected the jar to only include test classes (similar to test:package), but it seems to include non-test classes as well. In other words, if I have src/main/scala/Foo.scala and src/test/scala/FooTest.scala, I thought the jar produced by test:assembly would only include FooTest.class, but it also includes Foo.class. Hopefully that's not an issue for you, as I'm not yet sure how to work around it.
EDIT: If you want the jar to only include classes from src/test (like I did), then you can add the following to your build.sbt to filter out everything else that may be on your classpath:
fullClasspath in (Test, assembly) := {
  val cp = (fullClasspath in (Test, assembly)).value
  cp.filter(_.data.getPath.contains("test-classes"))
}
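On newer setups, a hedged equivalent of this approach in slash syntax (assuming sbt 1.x and a recent sbt-assembly, where jarName became assemblyJarName and baseAssemblySettings is still exposed via the plugin's autoImport):
// A sketch for sbt 1.x + recent sbt-assembly: enable assembly in the
// Test configuration and give the test jar a distinct name.
inConfig(Test)(baseAssemblySettings)
Test / assembly / assemblyJarName := s"${name.value}-test-${version.value}.jar"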
This works for me:
lazy val root = project.settings(
  assembly / fullClasspath := (assembly / fullClasspath).value ++ (Test / fullClasspath).value
)

Running a sub-project main class

I have a build.sbt that references a child project's main class as its own main class:
lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging)

lazy val api = project in file("api")

scalaVersion := "2.11.6"

// This is referencing API code
mainClass in (Compile, run) := Some("maslow.akka.cluster.node.ClusterNode")

artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  s"""${artifact.name}.${artifact.extension}"""
}

name in Universal := name.value
packageName in Universal := name.value
However, each time I run sbt run I get the following error:
> run
[info] Updating {file:/Users/mark/dev/Maslow-Akka/}api...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Updating {file:/Users/mark/dev/Maslow-Akka/}akka...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Running maslow.akka.cluster.node.ClusterNode
[error] (run-main-0) java.lang.ClassNotFoundException: maslow.akka.cluster.node.ClusterNode
java.lang.ClassNotFoundException: maslow.akka.cluster.node.ClusterNode
at java.lang.ClassLoader.findClass(ClassLoader.java:530)
While researching the problem, I first switched from the akka project to api and then opened up a console. From there, it couldn't find the maslow package even though it most certainly exists. After that, I went into the api folder, ran sbt console, and it accessed the aforementioned package just fine. After doing this, sbt run from the akka project works. Why?
The folder api is pulled in via git read-tree. There shouldn't be anything special about it. I'm using sbt 0.13.5.
I think that in a multi-project build, a global line such as
mainClass in (Compile, run) := ...
will just be swallowed without consequences, as it doesn't refer to any project.
Probably the following works:
mainClass in (Compile, run) in ThisBuild := ...
Or you add it to the root project's settings:
lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging)
  .settings(
    mainClass in (Compile, run) := ...
  )
The problem was this line: lazy val api = project in file("api")
From the docs:
When defining a dependency on another project, you provide a ProjectReference. In the simplest case, this is a Project object. (Technically, there is an implicit conversion Project => ProjectReference) This indicates a dependency on a project within the same build.
This indicates a dependency within the same build. Instead, what I needed was to use RootProject since api is an external build:
It is possible to declare a dependency on a project in a directory separate from the current build, in a git repository, or in a project packaged into a jar and accessible via http/https. These are referred to as external builds and projects. You can reference the root project in an external build with RootProject:
To solve this, I moved the project declarations out of build.sbt into project/Build.scala:
import sbt._

object MyBuild extends Build {
  lazy val akka = Project("akka", file(".")).aggregate(api).dependsOn(api)
  lazy val api = RootProject(file("api"))
}
To be clear, the problem was that my api sub-project was a ProjectRef and not a RootProject.
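For what it's worth, on sbt 0.13.13+ (where the Build trait is deprecated) the same fix should work directly in build.sbt; a sketch under that assumption:
// RootProject marks api as an external build rather than a subproject
// of this one; aggregate/dependsOn accept it as a ProjectReference.
lazy val api = RootProject(file("api"))

lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)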

How to configure sbt to load resources when running application?

My code (Java) reads an image from jar:
Main.class.getResourceAsStream("/res/logo.png")
Everything runs fine if I start the app after packaging it into a jar. But when I run it using sbt's run task, it returns null instead of the needed stream.
Running this from sbt console also gives null:
getClass.getResourceAsStream("/res/logo.png")
Is there a way to tell sbt to put my resources on classpath?
EDIT:
I set the resources dir to be the same as the source dir in build.sbt:
resourceDirectory <<= baseDirectory { _ / "src" }
When I loaded sbt's console and ran the following:
classOf[Main].getProtectionDomain().getCodeSource()
I got the location of my classes, but it contains neither the res folder nor any of my resource files.
It seems that sbt copies resources only to the resulting jar and does not copy them to the classes dir. Should I modify the compile task to move these resource files to the classes dir?
EDIT2:
Yes, when I manually copy the resource file to the classes dir, I can easily access it from the console. So how should I automate this process?
EDIT3:
It seems that sbt is just unable to see my resource folder; it does not actually add the files to the resulting jar file!
Solution:
resourceDirectory in Compile <<= baseDirectory { _ / "src" }
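For readers on newer sbt, the same setting in the non-deprecated := form (a mechanical translation of the <<= syntax, so treat it as a sketch):
// Equivalent := form (sbt 0.13.13+ / 1.x), replacing the deprecated <<=:
resourceDirectory in Compile := baseDirectory.value / "src"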
I can't give you a full solution right now, but there is a setting called resourceDirectories to which you could add the res folder.
[EDIT]
For me it also didn't work when the resource was in the standard resource folder. Please try it this way:
Main.class.getClassLoader().getResourceAsStream("icon.png")
[EDIT2] This is the full build script (build.scala), which works if your resource is in src/main/java:
import sbt._
import Keys._

object TestBuild extends Build {
  lazy val buildSettings = Seq(
    organization := "com.test",
    version := "1.0-SNAPSHOT",
    scalaVersion := "2.9.1"
  )

  lazy val test = Project(
    id = "test",
    base = file("test"),
    settings = Defaults.defaultSettings ++ Seq(resourceDirectory in Compile <<= javaSource in Compile)
  )
}
}