Have sbt put javadocs and sources of dependencies on the class path - scala

When using a managed dependency, I can tell sbt to download the javadocs and sources:
"mygroup" % "mymodule" % "myversion" withJavadoc() withSources()
But these jars don't seem to be on the runtime classpath.
What I would like to do is access the javadocs and sources from my application. Can I make these jars appear as managed resources, such that I could do
ClassLoader.getSystemClassLoader.getResource("my/package/MyDependency.scala")
?

You can do this by adding a classifier.
For a given library dependency, add a javadoc or sources classifier:
libraryDependencies += "org.scalaz" %% "scalaz-core" % "7.0.6" classifier "javadoc"
libraryDependencies += "org.scalaz" %% "scalaz-core" % "7.0.6" classifier "sources"
Then, access its contents from the classpath:
val docStream = getClass.getResourceAsStream("""/scalaz/Monad$.html""")
val doc = io.Source.fromInputStream(docStream).mkString
println(doc)
Here's a working example: https://earldouglas.com/ext/stackoverflow.com/questions/22160701/
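The same approach works for the sources classifier. Here is a minimal sketch, assuming the sources jar lays out files by package (so scalaz/Monad.scala is an assumed path):
// Read a dependency's source file off the classpath (path is an assumption):
val srcStream = getClass.getResourceAsStream("/scalaz/Monad.scala")
val src = io.Source.fromInputStream(srcStream).mkString
println(src)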

Related

can't import kamon-play-26 using SBT

I updated Play to 2.6.0. I have a Kamon dependency, but sbt can't resolve it.
Did anyone encounter this problem too?
Below is my libraryDependencies in the build.sbt:
libraryDependencies ++=
  Seq(
    ws,
    "com.google.inject" % "guice" % "3.0",
    "com.typesafe.play" %% "play-json" % "2.6.0",
    "io.kamon" %% "kamon-play-26" % "0.6.7"
  )
But I get an error that kamon-play-26 is not found...
Kamon for Play 2.6 is available for Scala 2.11 and 2.12 with:
"io.kamon" %% "kamon-play-2.6" % "0.6.8"
Note the period in 2.6.
Searching through the Kamon repositories on Maven reveals that there is no kamon-play-26 package.
The GitHub page https://github.com/kamon-io/kamon-play indicates that it does exist, however. Perhaps it's been pulled because the build is failing. Compile your own package from source, perhaps?
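Applied to the build above, the corrected dependency block would look like this (a sketch using the coordinates from the answer):
libraryDependencies ++= Seq(
  ws,
  "com.google.inject" % "guice" % "3.0",
  "com.typesafe.play" %% "play-json" % "2.6.0",
  "io.kamon" %% "kamon-play-2.6" % "0.6.8"  // note the period: "2.6", not "26"
)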

How to copy local cached jars to a folder via SBT?

I want a task that copies all the jars specified in libraryDependencies to a folder.
For example, I have the following dependencies defined for the project.
libraryDependencies ++= Seq(
  "org.neo4j" % "neo4j" % neo4j_version,
  "org.scala-lang.modules" %% "scala-java8-compat" % "0.8.0",
  "org.scala-lang" %% "scala-pickling" % "0.9.1",
  "org.neo4j.test" % "neo4j-harness" % neo4j_version % "test",
  "org.neo4j.driver" % "neo4j-java-driver" % "1.0.4" % "test"
)
Now I want to create a task so that every time I run the task, it will copy the jars in the dependencies to a folder.
I know I can manually specify the absolute paths for the jars to copy. I want a task that can automatically derive the paths to the jars, so that later, when I add a new dependency, I do not need to find the path in the .ivy cache again.
Thanks.
You can use managedClasspath to figure this out. See below for an example.
val copyJarsTask = TaskKey[Unit]("copy-jars", "Copies jars")
libraryDependencies ++= Seq(
  "org.scala-lang" %% "scala-pickling" % "0.9.1"
)
copyJarsTask := {
  val folder = new File("my-jars")
  IO.createDirectory(folder)  // make sure the target folder exists
  (managedClasspath in Compile).value.files.foreach { f =>
    IO.copyFile(f, folder / f.getName)
  }
}
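In newer sbt versions the "in" scoping syntax is deprecated in favor of the slash syntax; an equivalent sketch:
copyJarsTask := {
  val folder = new File("my-jars")
  IO.createDirectory(folder)
  // Slash syntax, available since sbt 1.1:
  (Compile / managedClasspath).value.files.foreach { f =>
    IO.copyFile(f, folder / f.getName)
  }
}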
Another option is to use the sbt-native-packager plugin with the Java Archetype and run:
sbt stage
The result will be that all your application dependencies and the JAR of the application itself will end up in the target/universal/stage/lib/ directory.
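A minimal setup sketch for that approach (the plugin version here is illustrative; check the plugin documentation for a current one):
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.3.4")
// build.sbt
enablePlugins(JavaAppPackaging)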

What is the difference between "container" and "provided" in SBT dependencies?

When reading build.sbt of many web applications, one can often see dependencies marked as "provided", see e.g. sbt-assembly documentation:
"org.apache.spark" %% "spark-core" % "0.8.0-incubating" % "provided"
I was unable to find any mention in the sbt documentation; however, the Maven documentation says the following about provided:
provided
This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime
Sometimes, however, I have also seen "container" in the same position, as in this build.sbt. Is this the same thing?
val tomcatVersion = "7.0.53"
libraryDependencies ++= Seq(
  "org.apache.tomcat.embed" % "tomcat-embed-core" % tomcatVersion % "container",
  "org.apache.tomcat.embed" % "tomcat-embed-logging-juli" % tomcatVersion % "container",
  "org.apache.tomcat.embed" % "tomcat-embed-jasper" % tomcatVersion % "container",
  "org.apache.tomcat" % "tomcat-catalina" % tomcatVersion % "provided",
  "org.apache.tomcat" % "tomcat-coyote" % tomcatVersion % "provided"
)
That fourth element of the dependency associates the dependency with a configuration, establishing a configuration dependency. The concept originates with Ivy, which sbt uses internally.
The "container" configuration is defined by
xsbt-web-plugin version 0.9, which is brought into the project you reference here.
It is being used to establish the container/hosting runtime for sbt container:start.
As an aside - that runtime would necessarily provide the runtime libraries corresponding to the "provided" configuration, which were used during the compile phase but not included in the transitive dependencies for the resulting artifacts.
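For illustration, here is a minimal sketch of how a custom Ivy configuration such as "container" can be declared in a build; this is roughly what xsbt-web-plugin does internally, not its actual code:
// Declare a custom configuration and register it with the build:
val Container = config("container")
ivyConfigurations += Container
// Dependencies scoped to it land on that configuration's classpath,
// not on the compile or runtime classpath:
libraryDependencies += "org.apache.tomcat.embed" % "tomcat-embed-core" % "7.0.53" % Container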

How to add Java dependencies to a Scala project's sbt file

I have a Spark Streaming Scala project which uses the Apache NiFi receiver. The project runs fine under Eclipse/Scala IDE, and now I want to package it for deployment.
When I add it as
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
sbt assumes it's a Scala library and tries to resolve it with a Scala version suffix (nifi-spark-receiver_2.10), which fails.
How do I add the NiFi receiver and all its dependencies to the project's sbt file?
Also, is it possible to point dependencies to local directories instead of having sbt resolve them?
Thanks in advance.
Here is my sbt file contents:
name := "NiFi Spark Test"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
The double %% adds the Scala version as a suffix to the Maven artifact name. It is required because different Scala compiler versions produce incompatible bytecode. If you would like to use a Java library from Maven, then you should use a single % character:
libraryDependencies += "org.apache.nifi" % "nifi-spark-receiver" % "0.3.0"
I also found that I can put libraries the project depends on into the lib folder and they will be picked up during assembly.
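Those jars in lib are sbt's unmanaged dependencies. The directory can be relocated if needed; a sketch with an illustrative folder name:
// Point sbt's unmanaged dependency directory somewhere else:
unmanagedBase := baseDirectory.value / "custom_lib"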

Adding a third-party library to a Scala project (IDEA 12 with SBT plugin)

I'm developing a Scala application in IntelliJ IDEA 12. I have the sbt plugin for IDEA installed (Settings -> Plugins -> Browse repositories...). Now I want to use some extra libraries for Scala; let's say one of them is https://github.com/stevej/scala-json. So I downloaded a zip file of its source code from its GitHub repository.
What do I do next? What is the standard way of adding a third-party library to a Scala project using IntelliJ IDEA 12 with the sbt plugin installed?
Try something like this in the .sbt file:
libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "1.0.0",
  "postgresql" % "postgresql" % "9.1-901-1.jdbc4",
  "org.scalatest" %% "scalatest" % "1.9.1",
  "net.sf.opencsv" % "opencsv" % "2.3",
  "org.apache.commons" % "commons-math3" % "3.0"
)
You have to create a .sbt file in your project directory if you don't have one.
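A minimal build.sbt sketch to start from (name, version, and Scala version are illustrative):
name := "my-project"

version := "0.1.0"

scalaVersion := "2.10.1"

libraryDependencies += "net.sf.opencsv" % "opencsv" % "2.3"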
This is a quick tutorial on sbt (and another one).