SBT how to add unmanaged jars to subproject tests?

I have an sbt multi-project build, but for some reason sbt does not pick up the unmanaged jar when tests are run. The objective is to have a stub subproject with no actual implementation and to add it as a "Provided" dependency of the MainClass project; the jars in the /lib folder should supply the real implementation when it is provided. The actual implementation in the .jar file simply prints a string.
Here is the project structure.
./MainClass
./MainClass/lib/printsomethingexample_2.13-0.1.0-SNAPSHOT.jar
./MainClass/src/test/scala/Main.scala
./MainClass/src/main/scala/Main.scala
./stub/lib/printsomethingexample_2.13-0.1.0-SNAPSHOT.jar
./stub/src/main/scala/test/stub/Example.scala
./build.sbt
File MainClass/src/main/scala/Main.scala:
import test.stub.Example

object Main extends App {
  Example.printSomething()
}
File MainClass/src/test/scala/Main.scala:
import org.scalatest.flatspec._
import test.stub.Example

class Main extends AnyFlatSpec {
  "Unmanaged jar" should "work" in {
    Example.printSomething()
  }
}
File stub/src/main/scala/test/stub/Example.scala:
package test.stub

object Example {
  def printSomething(): Unit = ??? // stub only: calling this throws scala.NotImplementedError
}
build.sbt:
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.10"

lazy val root = (project in file("."))
  .settings(
    name := "stubExample"
  )
  .aggregate(MainClass, stub)

lazy val stub = project
  .in(file("stub"))
  .settings(
    name := "stub"
  )

lazy val MainClass = project
  .in(file("MainClass"))
  .settings(
    name := "MainClass",
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.14" % Test
  )
  .dependsOn(stub % Provided)
When I run sbt "MainClass/runMain Main" it finds the unmanaged jars as expected and prints the string, but when I run sbt "MainClass/test" I get the following error:
[info] Main:
[info] Unmanaged jar
[info] - should work *** FAILED ***
[info] scala.NotImplementedError: an implementation is missing
Does anyone understand why the jars are not found?
*** EDIT ***
Here is the output of sbt "MainClass/runMain Main"
▶ sbt "MainClass/runMain Main"
[info] welcome to sbt 1.6.2 (Azul Systems, Inc. Java 1.8.0_312)
[info] loading global plugins from /Users/riccardo/.sbt/1.0/plugins
[info] loading project definition from /Users/riccardo/Documents/Projects/stubExample/project
[info] loading settings for project root from build.sbt ...
[info] set current project to stubExample (in build file:/Users/riccardo/Documents/Projects/stubExample/)
[info] running Main
Printed Something
The .jar contains an implementation of test.stub.Example#printSomething (it is built in a different project):
package test.stub

object Example {
  def printSomething(): Unit = println("Printed Something")
}
My expectation was that, since the stub subproject is marked as Provided, it would be excluded from the classpath and the printSomething implementation would be taken from the jar on the classpath instead of from the stub subproject (maybe I don't understand how Provided works).
That is exactly what happens when running the main class as shown above: it uses printSomething from lib/printsomethingexample_2.13-0.1.0-SNAPSHOT.jar on the classpath instead of the unimplemented stub. In my real-life scenario, I need the dependency on the stub subproject.
If I remove the dependsOn, sbt test works, but the project would not compile without the unmanaged jars in the /lib folder; that is why we use a stub subproject as a dependency, just to make the project compile.
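One way to compare the two cases (a diagnostic only, using the project and configuration names from the build above) is to dump the classpaths sbt actually uses and check whether the stub project's classes directory appears ahead of the jar in MainClass/lib:
sbt "show MainClass/Runtime/fullClasspath"
sbt "show MainClass/Test/fullClasspath"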

Related

How do I make sbt include non-Java sources to published artifact?

I'm using the Kotlin plugin and can't figure out how to force sbt to include .kt files in the published source jar. It only includes .java files.
A lot of people online suggest adding the following code to the sbt build, but it doesn't help:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
},
I also tried
includeFilter in (Compile, packageSrc) := "*.scala" || "*.java" || "*.kt",
Here is the output of some variables in the sbt console:
sbt:collections> show unmanagedSourceDirectories
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/scala
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/kotlin
sbt:collections> show unmanagedSources
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java/com/walkmind/extensions/collections/TestSomething.java
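(A possible thing to check, though I have not verified it against the Kotlin plugin: sbt's default include filter for unmanagedSources only picks up *.java and *.scala files, which would be consistent with the .kt file missing from the unmanagedSources listing above. Widening that filter would look roughly like the line below; treat it as a sketch, not a confirmed fix.)
includeFilter in (Compile, unmanagedSources) := "*.scala" || "*.java" || "*.kt"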
Which plugin do you use for Kotlin?
https://github.com/pfn/kotlin-plugin has the option kotlinSource to configure where the source directory is located.
sbt packageBin compiles the Kotlin files and includes them in the output jar.
build.sbt
// define kotlin source directory
kotlinSource in Compile := baseDirectory.value / "src/main/kotlin",
src/main/kotlin/org.test
package org.test
fun main(args: Array<String>) {
    println("Hello World!")
}
console:
sbt compile
sbt packageBin
The jar under target/scala-2.13 includes MainKt.class, and the folder org/test contains MainKt.class too.
Would this solve your problem?
I found a workaround for this in my project https://github.com/makiftutuncu/e. I did the following: https://github.com/makiftutuncu/e/blob/master/project/Settings.scala#L105
Basically, I added the following setting in sbt to properly generate the sources artifact:
// Include Kotlin files in sources
packageConfiguration in Compile := {
  val old = (packageConfiguration in (Compile, packageSrc)).value
  val newSources = (sourceDirectories in Compile).value.flatMap(dir => (dir ** "*.kt").get)
  new Package.Configuration(
    old.sources ++ newSources.map(f => f -> f.getName),
    old.jar,
    old.options
  )
}
For the documentation artifact, I added a Gradle build to my Kotlin module. I set it up as shown here: https://github.com/makiftutuncu/e/blob/master/e-kotlin/build.gradle.kts. This way, the Gradle build generates the Dokka documentation. And finally, I added the following setting in sbt to run Gradle while building the docs:
// Delegate doc generation to Gradle and Dokka
doc in Compile := {
  import sys.process._
  Process(Seq("./gradlew", "dokkaJavadoc"), baseDirectory.value).!
  target.value / "api"
}
I admit, this is a lot of work just to get 2 artifacts but it did the trick for me. 🤷🏻 Hope this helps.
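As a variation on the packageConfiguration setting above, the same idea can be sketched by adding the .kt files straight to the packageSrc mappings; this is untested and uses only standard sbt keys:
mappings in (Compile, packageSrc) ++= {
  val dirs = (unmanagedSourceDirectories in Compile).value
  dirs.flatMap { dir =>
    // collect Kotlin sources and map them to paths relative to their source directory
    (dir ** "*.kt").get.map(f => f -> f.relativeTo(dir).get.getPath)
  }
}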

Exclude dependency from project in multi-module SBT project [duplicate]

In Build.scala I have a dependency between projects:
val coreLib = Projects.coreLib()
val consoleApp = Projects.consoleApp().dependsOn(coreLib)
val androidApp = Projects.androidProject().dependsOn(coreLib/*, exclusions = xpp */)
The core library project defines a library in its libraryDependencies (an XPP parser), which I want to exclude in androidApp, since the Android framework has its own XPP implementation out of the box.
How can I exclude XPP library from transitive dependencies of coreLib in androidApp project?
EDIT:
According to my research, exclusion is possible ONLY on a ModuleID, which is used in conjunction with libraryDependencies. Meanwhile, dependsOn puts all transitive dependencies on the classpath; there is no way in the API to exclude some transitive dependencies of a project you dependsOn.
DETAILS:
I'm currently running sbt 0.13.5.
The libraryDependencies of coreLib, as well as its various settings, are supplied in build.sbt so that the project can be reused standalone, and because that feels like the right and natural way of supplying settings in sbt.
This appears to work for me:
val someApp = project.settings(
  libraryDependencies += "junit" % "junit" % "4.11"
)

val androidApp = project.dependsOn(someApp).settings(
  projectDependencies := {
    Seq(
      (projectID in someApp).value.exclude("junit", "junit")
    )
  }
)
What projectDependencies is doing here is what sbt, by default, attempts to do: it converts any inter-project dependencies into ModuleIDs which Ivy will use during resolution. Because the Project API currently has no way to specify excludes, we bypass this automatic layer and declare the Ivy dependency manually.
Result:
> show someApp/update
...
[info] Update report:
...
[info] compile:
[info] org.scala-lang:scala-library:2.10.4 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/home/jsuereth/.sbt/boot/scala-2.10.4/lib/scala-library.jar)
[info] junit:junit:4.11: (Artifact(junit,jar,jar,None,ArraySeq(master),None,Map()),/home/jsuereth/.ivy2/cache/junit/junit/jars/junit-4.11.jar)
[info] org.hamcrest:hamcrest-core:1.3: (Artifact(hamcrest-core,jar,jar,None,ArraySeq(master),None,Map()),/home/jsuereth/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar)
...
And the dependent project with junit/hamcrest excluded:
> show androidApp/update
...
[info] Update report:
...
[info] compile:
[info] org.scala-lang:scala-library:2.10.4 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/home/jsuereth/.sbt/boot/scala-2.10.4/lib/scala-library.jar)
[info] someapp:someapp_2.10:0.1-SNAPSHOT:
...

Running a sub-project main class

I have a build.sbt that references a child project's main class as its own main class:
lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging)
lazy val api = project in file("api")
scalaVersion := "2.11.6"
// This is referencing API code
mainClass in (Compile, run) := Some("maslow.akka.cluster.node.ClusterNode")
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  s"""${artifact.name}.${artifact.extension}"""
}
name in Universal := name.value
packageName in Universal := name.value
However, each time I run sbt run I get the following error:
> run
[info] Updating {file:/Users/mark/dev/Maslow-Akka/}api...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Updating {file:/Users/mark/dev/Maslow-Akka/}akka...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Running maslow.akka.cluster.node.ClusterNode
[error] (run-main-0) java.lang.ClassNotFoundException: maslow.akka.cluster.node.ClusterNode
java.lang.ClassNotFoundException: maslow.akka.cluster.node.ClusterNode
at java.lang.ClassLoader.findClass(ClassLoader.java:530)
While researching the problem, I first switched from the akka project to the api project and then opened up console. From there, it could not find the maslow package even though it most certainly exists. After that, I went into the api folder itself, ran sbt console, and it accessed the aforementioned package just fine. After I do this, sbt run from the akka project works. Why?
The api folder is pulled in via git read-tree. There shouldn't be anything special about it. I'm using sbt 0.13.5.
I think in a multi-project build a global line such as
mainClass in (Compile, run) := ...
will just be swallowed without consequences, as it doesn't refer to any project.
Probably the following works:
mainClass in (Compile, run) in ThisBuild := ...
Or you add it to the root project's settings:
lazy val akka = (project in file("."))
  .aggregate(api)
  .dependsOn(api)
  .enablePlugins(JavaAppPackaging)
  .settings(
    mainClass in (Compile, run) := ...
  )
The problem was this line: lazy val api = project in file("api")
From the docs:
When defining a dependency on another project, you provide a ProjectReference. In the simplest case, this is a Project object. (Technically, there is an implicit conversion Project => ProjectReference) This indicates a dependency on a project within the same build.
This indicates a dependency within the same build. Instead, what I needed was to use RootProject since api is an external build:
It is possible to declare a dependency on a project in a directory separate from the current build, in a git repository, or in a project packaged into a jar and accessible via http/https. These are referred to as external builds and projects. You can reference the root project in an external build with RootProject:
In order to solve this problem, I moved the project declarations out of build.sbt and into project/Build.scala:
import sbt._

object MyBuild extends Build {
  lazy val akka = Project("akka", file(".")).aggregate(api).dependsOn(api)
  lazy val api = RootProject(file("api"))
}
To be clear, the problem was that my api sub-project was a ProjectRef and not a RootProject.

sbt 0.13.1 multi-project module not found when I change the sbt default scala library to scala 2.11.2

I used sbt 0.13.1 to create two modules, and I created project/MyBuild.scala to build the two modules together.
MyBuild.scala:
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val task = project.in(file("task"))
  lazy val root = project.in(file(".")) aggregate(task) dependsOn task
}
When I change the Scala library to 2.11.2 by setting scalaHome, sbt goes to Maven to download task.jar and fails, which is very strange. Is it an sbt bug?
Here is the GitHub test project: test-sbt-0.13.1
This is because you have defined a custom scalaVersion in your main build.sbt which means it is defined for the root project only. The task project will use the default value:
jjst#ws11:test-sbt-0.13.1$ sbt
[...]
> projects
[info] In file:/home/users/jjost/dev/test-sbt-0.13.1/
[info] * root
[info] task
> show scalaVersion
[info] task/*:scalaVersion
[info] 2.10.4
[info] root/*:scalaVersion
[info] 2.11.2-local
As a consequence, artifacts generated by the task subproject won't be available for the root project to use.
You can solve this by making sure that your projects use the same scalaVersion. The easiest way to do so while keeping the same project structure would be to share common settings like the scala version across projects like so:
object MyBuild extends Build {
  val commonSettings = Seq(
    scalaVersion := "2.11.2-local",
    scalaHome := Some(file("/usr/local/scala-2.11.2/"))
  )

  lazy val task = (project.in(file("task"))).
    settings(commonSettings: _*)

  lazy val root = (project.in(file("."))).
    settings(commonSettings: _*).
    aggregate(task).
    dependsOn(task)
}
In practice, you might want to put common settings shared between projects in a dedicated file under project/common.scala, as recommended in Effective sbt.
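A minimal sketch of that layout, assuming a file project/common.scala and an object name of your choosing (both are illustrative, not part of the original build):
import sbt._
import Keys._

object Common {
  // shared settings, referenced from Build.scala as Common.settings
  val settings = Seq(
    scalaVersion := "2.11.2-local",
    scalaHome := Some(file("/usr/local/scala-2.11.2/"))
  )
}
Projects would then pull them in with, for example, project.in(file("task")).settings(Common.settings: _*).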
I'd recommend moving the sources of the root project from the root into a subfolder (task2) and adding it as an aggregated project. That removes aggregating and depending on the same project:
object MyBuild extends Build {
  lazy val task = project.in(file("task"))
  lazy val task2 = project.in(file("task2")) dependsOn task
  lazy val root = project.in(file(".")) aggregate(task, task2)
}
This works for me, but it's strange that such a non-standard project structure works fine with the default scalaHome. It seems to be a problem with the way sbt resolves such a dependency as an external one.
P.S. This answer doesn't cover the whole story. See jjst's answer for further clarification.

MissingResourceException: Can't find bundle for base name com/sun/rowset/RowSetResourceBundle (SBT project)

Running sbt test via the console (v0.13.1 on Windows) throws a MissingResourceException: Can't find bundle for base name com/sun/rowset/RowSetResourceBundle, locale en_CA when I try to create an instance of com.sun.rowset.CachedRowSetImpl. The same code runs fine if I use IntelliJ to run a Specs test; it only fails when run via the SBT console.
Here is the specs2 test I'm trying to run:
import org.specs2.mutable.SpecificationWithJUnit
import javax.sql.rowset.CachedRowSet
import com.sun.rowset.CachedRowSetImpl

class DatabaseTest extends SpecificationWithJUnit {
  "CachedRowSet Test" should {
    "Create a new CachedRowSetImpl instance" in {
      val rowSet: CachedRowSet = new CachedRowSetImpl()
      rowSet must_!= null
    }
  }
}
And the resulting exception:
[error] MissingResourceException: : Can't find bundle for base name com/sun/rowset/RowSetResourceBundle, locale en_CA (null:-1)
[error] com.sun.rowset.JdbcRowSetResourceBundle.<init>(Unknown Source)
[error] com.sun.rowset.JdbcRowSetResourceBundle.getJdbcRowSetResourceBundle(Unknown Source)
[error] com.sun.rowset.CachedRowSetImpl.<init>(Unknown Source)
[error] test.DatabaseTest$$anonfun$1$$anonfun$apply$1.apply(DatabaseTest.scala:10)
[error] test.DatabaseTest$$anonfun$1$$anonfun$apply$1.apply(DatabaseTest.scala:9)
Update: build.sbt contents:
scalacOptions ++= Seq("-deprecation", "-unchecked", "-feature")
scalaVersion := "2.10.3"
scalacOptions ++= Seq("-Yrangepos")
javacOptions ++= Seq("-source", "1.6", "-target", "1.6", "-Xlint:deprecation", "-Xlint:unchecked")
libraryDependencies in ThisBuild ++= Seq(
  "postgresql" % "postgresql" % "9.1-901.jdbc4" withSources(),
  "org.specs2" %% "specs2" % "2.3.7" withSources()
)
lazy val root = project.in(file("."))
I had this issue when running ScalaTest using SBT. In IntelliJ the tests ran clean, but in SBT the late-bound implementation for CachedRowSet did not get resolved and the following exception was thrown:
MissingResourceException: : Can't find bundle for base name
com/sun/rowset/RowSetResourceBundle, locale en_GB
The problem appears to be that the SBT test thread does not resolve run-time class loads back to the JVM (the missing resources are provided by rt.jar and resources.jar).
This can be fixed by forcing SBT to fork a separate JVM for the tests by putting the following instruction into the main body of the SBT build file:
fork in Test := true
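In newer sbt versions the same setting is written with slash syntax:
Test / fork := true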
CachedRowSetImpl is an internal class in some versions of the JDK runtime but is not officially part of the JDK. You can tell it is internal because its package starts with "com.sun", not "java". It's likely that your Windows machine has a completely different JDK that lacks this internal class.
I highly recommend avoiding the use of "com.sun" classes and find the idiomatic way to get the job done with the existing JDK.
Provided you use Java 7...
As of Java 7 there are a few classes and methods that shield you from errors like yours. Develop against interfaces (with some help from providers) and life becomes easier:
import org.specs2.mutable.SpecificationWithJUnit
import javax.sql.rowset._

class DatabaseTest extends SpecificationWithJUnit {
  "CachedRowSet Test" should {
    "Create a new CachedRowSetImpl instance" in {
      val rsf = RowSetProvider.newFactory
      val rowSet: CachedRowSet = rsf.createCachedRowSet
      rowSet must_!= null
    }
  }
}
With this Specs2 test, it worked fine for me:
> test
[info] DatabaseTest
[info]
[info] CachedRowSet Test should
[info] + Create a new CachedRowSetImpl instance
[info]
[info] Total for specification DatabaseTest
[info] Finished in 41 ms
[info] 1 example, 0 failure, 0 error
[info] Passed: Total 1, Failed 0, Errors 0, Passed 1
BTW: the missing resources are in lib/resources.jar in the JRE, but I couldn't figure out why they were reported as missing (even when they ended up in the lib directory of the project).
I'm having the same issue, not in Scala but in Java.
com.sun.rowset.JdbcRowSetResourceBundle loads the resource with the context classloader.
In my case the context classloader is a URLClassLoader.
And it looks like a really weird limitation of that URLClassLoader: when looking for resources it does not consult the parent classloader, it checks only its own URL list.
Hope that helps.