Resolution failure for plugin when added to libraryDependencies in project?

When I run sbt publishLocal, the plugin is published to <ivy-repository>/<org>/<plugin>/<scala-version>/<sbt-version>/<plugin-version>/...
For example:
[info] published sbt-cloudengine to /Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1/jars/sbt-cloudengine.jar
How can I exclude <scala-version> and <sbt-version> from the output path?
This path is causing a resolution failure when I add the plugin as a dependency in build.sbt:
[warn] ==== Local Ivy Repository: tried
[warn] file:///Users/hanxue/.ivy2/local/net/entrypass/sbt-cloudengine/0.2.1/sbt-cloudengine-0.2.1.pom
Plugin's build.sbt is:
sbtPlugin := true
name := "sbt-cloudengine"
organization := "net.entrypass"
version := "0.2.1"
description := "sbt plugin for managing Google Cloud Engine resources"
licenses := Seq("BSD License" -> url("https://github.com/hanxue/sbt-cloudengine/blob/master/LICENSE"))
scalacOptions := Seq("-deprecation", "-unchecked")
publishArtifact in (Compile, packageBin) := true
publishArtifact in (Test, packageBin) := false
publishArtifact in (Compile, packageDoc) := false
publishArtifact in (Compile, packageSrc) := false
publishMavenStyle := false
Update 1
This is how the plugin is referenced in a project's <rootdir>/build.sbt
resolvers += "Local Ivy Repository" at "file://"+Path.userHome.absolutePath+"/.ivy2/local"
libraryDependencies ++= Seq(
"net.entrypass" % "sbt-cloudengine" % "0.2.1"
)
This is the directory listing
$ ls -R ~/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1/
ivys jars poms
/Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1//ivys:
ivy.xml ivy.xml.md5 ivy.xml.sha1
/Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1//jars:
sbt-cloudengine.jar sbt-cloudengine.jar.sha1
sbt-cloudengine.jar.md5
/Users/hanxue/.ivy2/local/net.entrypass/sbt-cloudengine/scala_2.10/sbt_0.13/0.2.1//poms:
sbt-cloudengine.pom sbt-cloudengine.pom.sha1
sbt-cloudengine.pom.md5

Since you're publishing an sbt plugin and not a regular library, the path will correctly contain the sbt version and the Scala version.
Your problem comes from the fact that you're trying to load the plugin through libraryDependencies. Instead, declare it in the file project/plugins.sbt with the following line:
addSbtPlugin("net.entrypass" % "sbt-cloudengine" % "0.2.1")
This way, sbt will look the plugin up under the correct path, using the current Scala and sbt versions.
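For reference, the whole change on the consuming project's side is small; a sketch using the coordinates from the question:
// project/plugins.sbt
// addSbtPlugin appends the sbt and Scala cross-version suffixes, so the lookup
// matches the .../scala_2.10/sbt_0.13/... layout produced by publishLocal.
addSbtPlugin("net.entrypass" % "sbt-cloudengine" % "0.2.1")
The libraryDependencies entry and the "Local Ivy Repository" resolver in build.sbt can then be removed; ~/.ivy2/local is already on sbt's default resolver chain.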

Related

SBT not finding resolvers added in another sbt file

I have an SBT project where I have added the resolvers in build.sbt. Now, instead of adding the resolvers in the build.sbt file, I am trying to put them in a new file, resolvers.sbt.
But SBT is unable to find the artifacts if I put the resolvers in a separate file. However, while SBT starts up I can see a message saying that my resolvers.sbt is being considered.
If I add the resolvers to the global file in the .sbt directory, the artifacts are resolved.
sbt version: 1.2.6
Has anyone else faced the same issue?
build.sbt
import sbt.Credentials
name := "sbt-sample"
version := "0.1"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"com.reactore" %% "reactore-infra" % "1.0.0.0-DEV-SNAPSHOT"
)
resolvers.sbt
credentials += Credentials("Artifactory Realm", "192.168.1.120", "yadu", "password")
resolvers ++= Seq(
"Reactore-snapshots" at "http://192.168.1.120:8182/artifactory/libs-snapshot-local"
)
SBT Log
sbt:sbt-sample> reload
[info] Loading settings for project global-plugins from idea.sbt ...
[info] Loading global plugins from /home/administrator/.sbt/1.0/plugins
[info] Loading settings for project sbt-sample-build from resolvers.sbt ...
[info] Loading project definition from /home/administrator/source/poc/sbt-sample/project
[info] Loading settings for project sbt-sample from build.sbt ...
[info] Set current project to sbt-sample (in build file:/home/administrator/source/poc/sbt-sample/)
I am answering my own question here, hoping that it will be useful for someone else.
Found out the issue: I was keeping resolvers.sbt inside the project directory. I moved it to the project's root directory (where build.sbt is present), and now the artifacts are resolved.
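In other words, the layout that works looks like this (a sketch based on the paths in the log above):
sbt-sample/
  build.sbt        <- project settings
  resolvers.sbt    <- extra .sbt files at this level are merged into the project's settings
  project/         <- .sbt files placed here configure the meta-build (the sbt-sample-build project in the log), not sbt-sample itself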

SBT: How to exclude source files and documentation from the assembly?

I am using the plain assembly plugin with SBT, but along with the assembled package I get extra JARs like:
mypackage_2.11.jar
mypackage_2.11-javadoc.jar
mypackage_2.11-sources.jar
Is there any way to skip those packages with SBT?
This should disable the generation of these JARs (see http://www.scala-sbt.org/0.13/docs/Detailed-Topics/Artifacts.html):
publishArtifact in (Compile, packageBin) := false
publishArtifact in (Compile, packageDoc) := false
publishArtifact in (Compile, packageSrc) := false
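On newer sbt versions (1.x), the same settings can be written with slash syntax (a sketch; the behavior is unchanged):
Compile / packageBin / publishArtifact := false
Compile / packageDoc / publishArtifact := false
Compile / packageSrc / publishArtifact := false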

How to add "provided" dependencies back to run/test tasks' classpath?

Here's an example build.sbt:
import AssemblyKeys._
assemblySettings
buildInfoSettings
net.virtualvoid.sbt.graph.Plugin.graphSettings
name := "scala-app-template"
version := "0.1"
scalaVersion := "2.9.3"
val FunnyRuntime = config("funnyruntime") extend(Compile)
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"
sourceGenerators in Compile <+= buildInfo
buildInfoPackage := "com.psnively"
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)
assembleArtifact in packageScala := false
val root = project.in(file(".")).
configs(FunnyRuntime).
settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
)): _*)
The goal is to have spark-core "provided" so it and its dependencies are not included in the assembly artifact, but to reinclude them on the runtime classpath for the run- and test-related tasks.
It seems that using a custom scope will ultimately be helpful, but I'm stymied on how to actually cause the default/global run/test tasks to use the custom libraryDependencies and hopefully override the default. I've tried things including:
(run in Global) := (run in FunnyRuntime)
and the like to no avail.
To summarize: this feels essentially a generalization of the web case, where the servlet-api is in "provided" scope, and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.
For a similar case, I used the following in assembly.sbt:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
Update:
@rob's solution seems to be the only one working on the latest sbt versions; just add this to the settings in build.sbt:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
Adding to @douglaz's answer,
runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
is the corresponding fix for the runMain task.
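For completeness, on sbt 1.x the same two settings can be written with slash syntax (a sketch; the behavior is unchanged):
Compile / run := Defaults.runTask(Compile / fullClasspath, Compile / run / mainClass, Compile / run / runner).evaluated,
Compile / runMain := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated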
Another option is to create separate sbt projects for assembly vs. run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run for running directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project in order to, e.g., specify the Spark master in code when running with sbt.
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion
val commonSettings = Seq(
name := "Project",
libraryDependencies ++= Seq(...) // Common deps
)
// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
.settings(
commonSettings,
assembly / mainClass := Some("com.example.Main"),
libraryDependencies += sparkDep % "provided"
)
// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
.settings(
// Projects' target dirs can't overlap
target := target.value.toPath.resolveSibling("target-runtest").toFile,
commonSettings,
// If separate main file needed, e.g. for specifying spark master in code
Compile / run / mainClass := Some("com.example.RunMain"),
libraryDependencies += sparkDep
)
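With this setup, building the fat jar for spark-submit and running with embedded Spark become, from the sbt shell (project names as defined above):
> assemblyProj/assembly
> runTestProj/run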
If you use the sbt-revolver plugin, here is a solution for its "reStart" task:
fullClasspath in Revolver.reStart <<= fullClasspath in Compile
UPD: for sbt 1.0 you may use the new assignment form:
fullClasspath in Revolver.reStart := (fullClasspath in Compile).value
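With slash syntax, the same setting would read (a sketch, same behavior):
Revolver.reStart / fullClasspath := (Compile / fullClasspath).value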

Why doesn't an sbt resolver defined with the Resolver method work?

I've installed jarjar:jarjar:1.0 in my local Maven repo, and my build.sbt is:
name := "test"
version := "1.0"
libraryDependencies += "jarjar" % "jarjar" % "1.0"
resolvers += Resolver.file("maven-l", file("/Volumes/Data/Repo/Maven"))
It says jarjar cannot be resolved, and show resolvers gives:
> show resolvers
[info] List(FileRepository(maven-l,FileConfiguration(true,None),sbt.Patterns#30ea6dbc))
This does not look right.
Using resolvers += "maven-l" at "/Volumes/Data/Repo/Maven" instead, it works just fine:
> show resolvers
[info] List(maven-l: /Volumes/Data/Repo/Maven)
I'm wondering why this is. Is it a bug, or is it specified behavior?
I'm using sbt 0.12.1.
Reference: http://www.scala-sbt.org/release/docs/Detailed-Topics/Resolvers.html

How do I get sbt 0.10.0 to compile files in a subdirectory?

I have a build.sbt file in my project root; all my source files live in the subdirectory src (and src/irc, src/xmpp).
Here is my build.sbt
name := "mrtoms"
organization := "chilon"
scalaVersion := "2.9.0"
version := "0.1"
libraryDependencies ++= Seq("commons-httpclient" % "commons-httpclient" % "3.1")
crossPaths := false
scalaHome := Some(file("/usr/share/scala"))
target := file("project/target")
sourceDirectory := file("src")
mainClass := Some("org.chilon.mrtoms.MrToms")
However, sbt always just makes an empty jar file.
I tried putting build.sbt inside the "src" directory, but then it missed all the Scala files in subdirectories of "src".
It seems that you need to provide the path relative to the base directory. This should work for you (it replaces sourceDirectory := file("src")):
scalaSource in Compile <<= baseDirectory(_ / "src")
You can find some more information in this thread:
http://groups.google.com/group/simple-build-tool/browse_thread/thread/095e87247d146fa7?fwc=1
If you want to replace the default convention, then you need to override both the Scala and Java source locations:
scalaSource in Compile <<= baseDirectory(_ / "src")
javaSource in Compile <<= baseDirectory(_ / "src")
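On sbt versions where the <<= operator is no longer available (sbt 1.x), an equivalent sketch would be:
Compile / scalaSource := baseDirectory.value / "src"
Compile / javaSource := baseDirectory.value / "src"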