How to resolve non-jar (dll/jnilib) library dependencies in sbt?

In an sbt build.sbt project file, is it possible to retrieve library dependencies that are not bundled as a jar?
In my case, I am trying to use QTSampledSP, which requires .dll and .jnilib libraries.

To download the artifact, you need to make Ivy (and hence sbt) explicitly aware of the DLL/jnilib artifacts. Add the following to build.sbt in your project.
lazy val QtSampledJniLibArt = Artifact("qtsampledsp-osx", "jnilib", "jnilib")
libraryDependencies += "com.tagtraum" % "qtsampledsp-osx" % "0.9.6" artifacts(QtSampledJniLibArt)
resolvers += "beatunes" at "http://www.beatunes.com/repo/maven2"
Then you need to tell sbt to pay attention to these artifacts (again build.sbt):
classpathTypes ++= Set("jnilib", "dll")
By default, sbt will only add a few types into the classpath (and jnilib and dll are not amongst them).
[sbt-0-13-1]> help classpathTypes
Artifact types that are included on the classpath.
[sbt-0-13-1]> show classpathTypes
[info] Set(eclipse-plugin, bundle, hk2-jar, orbit, jar)
Since these DLLs/jnilibs need to be on the classpath at runtime, adding the extra types to classpathTypes as shown above corrects things, as you can see below (don't forget to reload when in the sbt console).
[sbt-0-13-1]> show classpathTypes
[info] Set(eclipse-plugin, bundle, hk2-jar, jnilib, orbit, jar, dll)
If you need to look at these files in more detail, check out the update report (from the update task), where you can inspect all configurations/modules/artifacts. Run show update in the sbt console and look at the files under target/resolution-cache/reports.
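If you want the resolved files programmatically rather than digging through the reports, a small helper task along these lines should work in build.sbt (the task name listNativeLibs is just an illustration; this is a sketch assuming sbt 0.13.x):
lazy val listNativeLibs = taskKey[Unit]("Prints resolved non-jar native artifacts")

listNativeLibs := {
  val log = streams.value.log
  // select every resolved artifact whose Ivy type is jnilib or dll
  val natives = update.value.select(artifactFilter(`type` = "jnilib" | "dll"))
  natives.foreach(f => log.info(s"native artifact: ${f.getAbsolutePath}"))
}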

Sbt Plugins vs Compiler Plugins

I am trying to understand why adding sbt plugins in plugins.sbt under project/ works perfectly fine, but adding compiler plugins in that file does not work.
I thought any .sbt or .scala file in project/ is made available to the build definition.
The only place where compiler plugins work is build.sbt, hence I am confused as to why.
In particular, I am working with kind-projector:
addCompilerPlugin("org.typelevel" %% "kind-projector" % "0.11.3" cross CrossVersion.full)
I see the following alias for the function
/** Adds `dependency` to `libraryDependencies` in the auto-compiler plugin configuration. */
def addCompilerPlugin(dependency: ModuleID): Setting[Seq[ModuleID]] =
  libraryDependencies += compilerPlugin(dependency)
Hence I am just trying to understand what makes it so that it can only be added to build.sbt and not to plugins.sbt in project/.
Remember that sbt is recursive.
.sbt files define things that are available in the current layer.
.scala files define things that will be available in the next layer.
Adding an sbt plugin in project/bar.sbt adds that plugin to the meta layer; the meta layer that compiles the sbt build you are using then makes those plugins available to the next sbt layer.
So if you add a compiler plugin in project/foo.sbt, you are adding that compiler plugin to the compiler used to compile the project (meta) layer of sbt, but it will not be available in the current layer. That is why compiler plugins are added in the build.sbt file: there they are added to the compiler used to compile your code.
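As a quick illustration of which file each kind of plugin belongs in (the sbt-assembly coordinates below are only an example of an sbt plugin, not something your build needs):
// build.sbt — compiler plugin, handed to scalac when compiling your code (the current layer)
addCompilerPlugin("org.typelevel" %% "kind-projector" % "0.11.3" cross CrossVersion.full)

// project/plugins.sbt — sbt plugin, loaded by the meta layer that builds your build
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
Conversely, if you put addCompilerPlugin in project/plugins.sbt, the compiler plugin would apply to the compilation of the build definition itself (the .scala files under project/), not to your application code.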

Including a Spark Package JAR file in an sbt-generated fat JAR

The spark-daria project is uploaded to Spark Packages and I'm accessing spark-daria code in another SBT project with the sbt-spark-package plugin.
I can include spark-daria in the fat JAR file generated by sbt assembly with the following code in the build.sbt file.
spDependencies += "mrpowers/spark-daria:0.3.0"
val requiredJars = List("spark-daria-0.3.0.jar")
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter { f =>
    !requiredJars.contains(f.data.getName)
  }
}
This code feels like a hack. Is there a better way to include spark-daria in the fat JAR file?
N.B. I want to build a semi-fat JAR file here. I want spark-daria to be included in the JAR file, but I don't want all of Spark in the JAR file!
The README for version 0.2.6 states the following:
In any case where you really can't specify Spark dependencies using sparkComponents (e.g. you have exclusion rules) and configure them as provided (e.g. standalone jar for a demo), you may use spIgnoreProvided := true to properly use the assembly plugin.
You should then use this flag in your build definition and set your Spark dependencies as provided, as I do with spark-sql 2.2.0 in the following example:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
Please note that by setting this, your IDE may no longer have the dependency references it needs to compile and run your code locally, which means you would have to add the necessary JARs to the classpath by hand. I do this often in IntelliJ: I keep a Spark distribution on my machine and add its jars directory to the IntelliJ project definition (this question may help you with that, should you need it).
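Putting it together, a rough build.sbt sketch might look like this (the spName value is only a placeholder; spIgnoreProvided and spDependencies are keys provided by sbt-spark-package):
spName := "myorg/my-app"

sparkVersion := "2.2.0"

spIgnoreProvided := true

spDependencies += "mrpowers/spark-daria:0.3.0"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
With this, the provided Spark artifacts should stay out of the fat JAR produced by sbt assembly.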

sbt assembly package dependencies in multiple artifacts

I am trying to generate several jars from a single project with sbt assembly, each containing some of the dependencies.
So far I have found only this Q&A, which is close to what I am looking for. However, I don't need separate configs; when I run assembly, I just want to generate all the different jars.
To be more concrete. I want to generate:
One jar with my code and some general dependencies
One jar with the Hadoop dependencies <- this is the problem, as I don't know how to tell sbt to generate another jar that contains only those dependencies.
One jar with scala
Without going deep into complex sbt configurations, you could try another approach. The Hadoop dependencies being standard, you could mark them as provided in your build to exclude them.
"org.apache.hadoop" % "hadoop-client" % "2.6.0" % "provided"
For Scala, the library jar is also standard and can be downloaded separately by your "user". To remove it from the fat jar, use the following setting (sbt-assembly 0.13.0):
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
The user of your fat jar is then asked to provide both the Scala and Hadoop libraries on the classpath.
For example, when using Spark this is the correct approach as these two libraries are both provided by the Spark running environment. The same logic applies for the Hadoop MapReduce environment.
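In other words, a minimal build.sbt along these lines should produce a single jar containing only your code and the remaining general dependencies (the version numbers here are just illustrative):
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.6.0" % "provided"

// keep the Scala library out of the fat jar as well
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)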

How to use external dependencies in sbt's .scala files?

This is for Scala 2.11.1 and sbt 0.13.5.
Say I have a Scala/sbt project with the following directory structure:
root/
  build.sbt
  src/ ..
  project/
    plugins.sbt
    build.properties
    LolUtils.scala
and I want to use some external library in LolUtils.scala. How is this generally accomplished in sbt?
If I simply add the libs I need into build.sbt via libraryDependencies += .. then it doesn't find them and fails on the import line with not found: object ...
If I add a separate project/build.sbt, for some reason it starts failing to resolve my plugins, plus I need to manually specify the Scala version in the nested project/build.sbt, which is unnecessary duplication.
What's the best way to accomplish this?
sbt is recursive, which means that it uses itself to compile the build definition, i.e. the *.sbt files and the *.scala files under the project directory. To add extra dependencies for use in the build definition, you have to declare them in project/build.sbt.
There is one caveat to that. You can set any scalaVersion for your project, that is, in build.sbt, but you should not modify scalaVersion in project/build.sbt, as it might conflict with the version sbt itself uses (which may or may not lead to binary incompatibility for plugins).
sbt 0.13.5 uses Scala 2.10.4, and the library you're going to use must be compatible with that particular version of Scala.
> about
[info] This is sbt 0.13.5
...
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.4
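For example, to make a library available to LolUtils.scala you could add something like this to project/build.sbt (commons-lang3 is only an illustrative dependency; being a plain Java library, it also sidesteps the Scala 2.10.4 compatibility concern above):
// project/build.sbt
libraryDependencies += "org.apache.commons" % "commons-lang3" % "3.3.2"
LolUtils.scala can then import org.apache.commons.lang3.StringUtils (or whatever it needs) and use it inside the build definition.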

File of one of the sbt plugin's dependencies

I need to get hold of the File reference to a specific artifact during the setup phase of my sbt's plugin.
I've tried:
obtaining the Ivy home directory, but that basically means assuming where Ivy will place the files (they could even be in a local Maven repository)
parsing System.getProperty("java.class.path"), but it only contains the sbt-launch jar
obtaining the resolved jars from update.value, but it doesn't have any of the plugin's jars in the list! (only the jars for the application being compiled)
Short of invoking the Ivy API manually, is there any way to get the File to the plugin's jar dependency?
NOTE: This is a very specific part of how to write an sbt plugin to launch the app with an agent factored out into a separate question.
Got it! Adding the dependency explicitly within the source reveals its resolved path:
override val projectSettings = Seq(
  libraryDependencies += "com.github.fommil.lion" %% "agent" % "1.0-SNAPSHOT",
  javaOptions ++= Seq(s"-Dhack=${update.value}")
)
The rendered update report then has a reference to the resolved jar in it!
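Building on that, here is a sketch of how the resolved file can be pulled out of the update report directly (the moduleFilter arguments and the -javaagent usage are only illustrative; adjust them to your plugin's coordinates):
javaOptions ++= {
  // find the files resolved for the explicitly added dependency
  val agentJars = update.value.select(
    moduleFilter(organization = "com.github.fommil.lion", name = "agent*"))
  agentJars.map(jar => s"-javaagent:${jar.getAbsolutePath}")
}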