Why sbt compile doesn't copy unmanaged resources to classpath? - scala

Could you tell me why sbt compile doesn't copy unmanaged resources to the classpath, while sbt package does? As a result, I can't start debugging unless I invoke package manually :(
I'm using SBT 0.12.1
Below is my build.sbt.
import AssemblyKeys._ // put this at the top of the file
net.virtualvoid.sbt.graph.Plugin.graphSettings
assemblySettings
organization := "com.zzz"
version := "0.1"
scalaVersion := "2.10.2"
scalacOptions := Seq("-unchecked", "-language:reflectiveCalls,postfixOps,implicitConversions", "-deprecation", "-feature", "-encoding", "utf8")
unmanagedResourceDirectories in Compile <++= baseDirectory { base =>
  Seq( base / "src/main/webapp" )
}
jarName in assembly := "zzz.jar"
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case "rootdoc.txt" => MergeStrategy.discard
    case x => old(x)
  }
}
mainClass in assembly := Some("com.zzz.Boot")
name := "zzz"
// disable using the Scala version in output paths and artifacts
crossPaths := false
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  artifact.name + "." + artifact.extension
}
resolvers ++= Seq(
  "spray repo" at "http://repo.spray.io/"
)
libraryDependencies ++= {
  val sprayVersion = "1.2-M8"
  val akkaVersion = "2.2.0-RC1"
  Seq(
    "io.spray" % "spray-servlet" % sprayVersion withSources(),
    "io.spray" % "spray-can" % sprayVersion withSources(),
    "io.spray" % "spray-routing" % sprayVersion withSources(),
    "com.typesafe.akka" %% "akka-actor" % akkaVersion,
    "org.eclipse.jetty" % "jetty-webapp" % "8.1.7.v20120910" % "container",
    "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "container" artifacts Artifact("javax.servlet", "jar", "jar"),
    "net.sourceforge.htmlcleaner" % "htmlcleaner" % "2.2",
    "org.apache.httpcomponents" % "httpclient" % "4.2.3",
    "org.apache.commons" % "commons-lang3" % "3.1",
    "org.mongodb" %% "casbah" % "2.6.0",
    "com.romix.scala" % "scala-kryo-serialization" % "0.2-SNAPSHOT",
    "org.codehaus.jettison" % "jettison" % "1.3.3",
    "com.osinka.subset" %% "subset" % "2.0.1",
    "io.spray" %% "spray-json" % "1.2.5" intransitive()
  )
}
seq(Revolver.settings: _*)
seq(webSettings: _*)
seq(Twirl.settings: _*)

The job of compile is to compile sources, so it won't typically do anything related to processing resources. However, the resources need to be in the class directory for run, package, test, console, and anything else that uses the fullClasspath. This is done by fullClasspath combining exportedProducts, which are the classes and resources generated by the current project, and dependencyClasspath, which are the classpath entries from dependencies.
The appropriate solution depends on what needs the resources. From the command line, run exported-products to do a compile as well as copy-resources. Programmatically, you will typically want to depend on fullClasspath or exportedProducts.
As a side note, you can typically find out what tasks do what using the inspect command.
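If you want compile itself to also place the resources in the class directory (for example so a debugger that launches the classes directly can see them), a minimal sketch in sbt 0.12 syntax is to make compile depend on copyResources; treat this as a workaround for that specific workflow, not something the default build needs:
// Sketch: have compile also run copy-resources, so unmanaged resources
// end up next to the classes under target/.../classes.
compile in Compile <<= (compile in Compile) dependsOn (copyResources in Compile)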

Related

Dependency duplication when creating a fat jar for playframework with play-ws

I noticed that when trying to build a fat jar for Play Framework 2.7 with play-ws (sbt assembly), dependency duplications occur. I get a lot of errors related to javax.activation-api and shaded-asynchttpclient, e.g.:
[error] deduplicate: different file contents found in the following:
[error] /home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/javax/activation/javax.activation-api/1.2.0/javax.activation-api-1.2.0.jar:javax/activation/UnsupportedDataTypeException.class
[error] /home/user/.cache/coursier/v1/https/repo1.maven.org/maven2/com/typesafe/play/shaded-asynchttpclient/2.0.6/shaded-asynchttpclient-2.0.6.jar:javax/activation/UnsupportedDataTypeException.class
The problem turns out to be play-ws; without it, sbt assembly completes correctly. The only place in my code where I explicitly use javax is dependency injection. Using Guice instead gives the same result. Here is my build.sbt (which is based on https://www.playframework.com/documentation/2.7.x/Deploying):
name := "My-App"
version := "1.0"
scalaVersion := "2.11.12"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaSource in ThisScope := baseDirectory.value
mainClass in assembly := Some("play.core.server.ProdServerStart")
fullClasspath in assembly += Attributed.blank(PlayKeys.playPackageAssets.value)
libraryDependencies ++= Seq(
  ws,
  specs2 % Test,
  "com.typesafe.play" %% "play-json" % "2.7.4",
  "com.typesafe.play" %% "play-slick" % "4.0.2",
  "com.typesafe.play" %% "play-slick-evolutions" % "4.0.2",
  "com.typesafe" % "config" % "1.4.0",
  "org.mindrot" % "jbcrypt" % "0.4",
  "mysql" % "mysql-connector-java" % "8.0.17",
  "org.mindrot" % "jbcrypt" % "0.4",
  "com.iheart" %% "ficus" % "1.4.7",
  "com.typesafe.scala-logging" % "scala-logging_2.11" % "3.9.0"
)
assemblyMergeStrategy in assembly := {
  case manifest if manifest.contains("MANIFEST.MF") =>
    // We don't need manifest files since sbt-assembly will create
    // one with the given settings
    MergeStrategy.discard
  case referenceOverrides if referenceOverrides.contains("reference-overrides.conf") =>
    // Keep the content for all reference-overrides.conf files
    MergeStrategy.concat
  case x =>
    // For all the other files, use the default sbt-assembly merge strategy
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
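One possible workaround, offered only as a sketch that has not been verified against this exact dependency graph, is to keep a single copy of the duplicated javax.activation classes before falling back to the cases above (the PathList pattern is an assumption about where the duplicates live):
assemblyMergeStrategy in assembly := {
  // Prefer one copy of the duplicated javax.activation classes instead of
  // failing with a deduplicate error.
  case PathList("javax", "activation", xs @ _*) => MergeStrategy.first
  case manifest if manifest.contains("MANIFEST.MF") => MergeStrategy.discard
  case referenceOverrides if referenceOverrides.contains("reference-overrides.conf") => MergeStrategy.concat
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}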

Transitive dependency errors in SBT multi-project

I am building an sbt multi-project build, which has a common module and a logic module, with logic.dependsOn(common).
In common, Spark SQL 2.2.1 ("org.apache.spark" %% "spark-sql" % "2.2.1") is introduced. Spark SQL is also used in logic, but there I get compilation errors saying "object spark is not a member of package org.apache".
Now, if I add the Spark SQL dependency to logic as "org.apache.spark" %% "spark-sql" % "2.2.1", it works. However, if I add it as "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided, I get the same error.
I don't understand why this happens: why isn't the dependency transitive from common to logic?
Here is the root sbt file:
lazy val commonSettings = Seq(
  organization := "...",
  version := "0.1.0",
  scalaVersion := "2.11.12",
  resolvers ++= Seq(
    clojars,
    maven_local,
    novus,
    twitter,
    spark_packages,
    artima
  ),
  test in assembly := {},
  assemblyMergeStrategy in assembly := {...}
)
lazy val root = (project in file(".")).aggregate(common, logic)
lazy val common = (project in file("common")).settings(commonSettings:_*)
lazy val logic = (project in file("logic")).dependsOn(common).settings(commonSettings:_*)
Here is the logic module's sbt file:
libraryDependencies ++= Seq(
  spark_sql.exclude("io.netty", "netty"),
  embedded_elasticsearch % "test",
  scalatest % "test"
)
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-annotation" % "2.6.5",
  "org.json4s" %% "json4s-jackson" % "3.2.11"
)
assemblyJarName in assembly := "***.jar"
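As a side note, a common way to avoid relying on transitivity at all is to declare the shared Spark dependency once in commonSettings so that both common and logic compile against the same artifact. A minimal sketch (whether Provided is appropriate here depends on how the assembled jar is deployed):
lazy val commonSettings = Seq(
  organization := "...",
  version := "0.1.0",
  scalaVersion := "2.11.12",
  // declared once, picked up by every project that uses commonSettings
  libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided
)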

Scala/Spark: Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging

I am pretty new to Scala and Spark and am trying to fix my Spark/Scala development setup. I am confused by the versions and missing jars. I searched on Stack Overflow but am still stuck on this issue. Maybe something is missing or misconfigured.
Running commands:
me#Mycomputer:~/spark-2.1.0$ bin/spark-submit --class ETLApp /home/me/src/etl/target/scala-2.10/etl-assembly-0.1.0.jar
Output:
...
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
...
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
build.sbt:
name := "etl"
version := "0.1.0"
scalaVersion := "2.10.5"
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
mainClass := Some("ETLApp")
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.5.2" % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.5.2" % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2";
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-M2";
libraryDependencies += "org.apache.curator" % "curator-recipes" % "2.6.0"
libraryDependencies += "org.apache.curator" % "curator-test" % "2.6.0"
libraryDependencies += "args4j" % "args4j" % "2.32"
java -version
java version "1.8.0_101"
scala -version
2.10.5
spark version
2.1.0
Any hints welcome. Thanks.
In that case, your jar must bring all dependent classes along when it is submitted to Spark.
In Maven this would be possible with the assembly plugin and the jar-with-dependencies descriptor. With sbt, a quick Google search finds this: https://github.com/sbt/sbt-assembly
You can change your build.sbt as follows:
name := "etl"
version := "0.1.0"
scalaVersion := "2.10.5"
scalacOptions ++= Seq("-deprecation",
"-feature",
"-Xfuture",
"-encoding",
"UTF-8",
"-unchecked",
"-language:postfixOps")
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "1.5.2" % Provided,
"org.apache.spark" %% "spark-sql" % "1.5.2" % Provided,
"org.apache.spark" %% "spark-streaming" % "1.5.2" % Provided,
"org.apache.spark" %% "spark-streaming-kafka" % "1.5.2" % Provided,
"com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-M2",
"org.apache.curator" % "curator-recipes" % "2.6.0",
"org.apache.curator" % "curator-test" % "2.6.0",
"args4j" % "args4j" % "2.32")
mainClass in assembly := Some("your.package.name.ETLApp")
assemblyJarName in assembly := s"${name.value}-${version.value}.jar"
assemblyMergeStrategy in assembly := {
case m if m.toLowerCase.endsWith("manifest.mf") => MergeStrategy.discard
case m if m.toLowerCase.matches("meta-inf.*\\.sf$") => MergeStrategy.discard
case "reference.conf" => MergeStrategy.concat
case x: String if x.contains("UnusedStubClass.class") => MergeStrategy.first
case _ => MergeStrategy.first
}
Add the sbt-assembly plugin to your plugins.sbt file under the project directory in your project's root directory. Running sbt assembly in the terminal (Linux) or CMD (Windows) from the root directory of your project will download all the dependencies for you and create an uber JAR.
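For reference, a minimal project/plugins.sbt entry could look like the following; the version number is an assumption, so pick whichever sbt-assembly release matches your sbt version:
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")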

container:start stopped working in Scalatra sbt project : "invalid key"

I have a Scalatra project built the standard way using giter8.
I am uncertain why container:start no longer functions in my Scalatra project: no change was made to build.sbt. Here is the error:
Using /home/stephen/.sbt/0.12.0 as sbt dir, -sbt-dir to override.
[info] Set current project to wfdemo (in build file:/home/stephen/wfdemo/)
> container:start
[error] Not a valid key: start (similar: state, startYear, target)
[error] container:start
[error]
Here is the sbt:
object KeywordsservletBuild extends Build {
  val Organization = "com.astralync"
  val Name = "KeywordsServlet"
  val Version = "0.1.0-SNAPSHOT"
  val ScalaVersion = "2.11.7"
  val ScalatraVersion = "2.4.0-RC2-2"

  lazy val project = Project(
    "keywordsservlet",
    file("."),
    settings = ScalatraPlugin.scalatraSettings ++ scalateSettings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      resolvers += Classpaths.typesafeReleases,
      resolvers += "Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
      libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % "1.3.1",
        "org.scalatra" %% "scalatra" % ScalatraVersion,
        "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
        "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
        "ch.qos.logback" % "logback-classic" % "1.1.2" % "runtime",
        "org.eclipse.jetty" % "jetty-webapp" % "9.2.13.v20150730" % "container",
        "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided",
        "net.databinder" % "unfiltered-netty_2.11" % "0.8.4"
      ),
      scalateTemplateConfig in Compile <<= (sourceDirectory in Compile) { base =>
        Seq(
          TemplateConfig(
            base / "webapp" / "WEB-INF" / "templates",
            Seq.empty, /* default imports should be added here */
            Seq(
              Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
            ), /* add extra bindings here */
            Some("templates")
          )
        )
      }
    )
  )
}
Suggestions appreciated.
A small but not necessarily obvious fix for this:
; container:start ; shell
The leading semicolon is required; it makes sbt parse the line as a sequence of commands rather than a single key.
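If you run this sequence often, one option (assuming sbt 0.12 or later) is to wrap it in a command alias in the build definition:
// The alias name "startContainer" is just an example.
addCommandAlias("startContainer", "; container:start ; shell")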

java.lang.NoSuchMethodError in Scalatra using Scalate with Markdown

So I have a Scalatra app (using Scalatra 2.2.1). I'm building views using Scalate; I've decided to go with the Jade/Markdown one-two. Only one problem: if I try to use markdown in a jade template (started with the :markdown tag), I get this:
scala.Predef$.any2ArrowAssoc(Ljava/lang/Object;)Lscala/Predef$ArrowAssoc;
java.lang.NoSuchMethodError: scala.Predef$.any2ArrowAssoc(Ljava/lang/Object;)Lscala/Predef$ArrowAssoc;
at org.fusesource.scalamd.Markdown$.<init>(md.scala:119)
at org.fusesource.scalamd.Markdown$.<clinit>(md.scala:-1)
at org.fusesource.scalate.filter.ScalaMarkdownFilter$.filter(ScalaMarkdownFilter.scala:32)
at org.fusesource.scalate.RenderContext$class.filter(RenderContext.scala:276)
at org.fusesource.scalate.DefaultRenderContext.filter(DefaultRenderContext.scala:30)
at org.fusesource.scalate.RenderContext$class.value(RenderContext.scala:235)
at org.fusesource.scalate.DefaultRenderContext.value(DefaultRenderContext.scala:30)
at templates.views.$_scalate_$about_jade$.$_scalate_$render(about_jade.scala:37)
at templates.views.$_scalate_$about_jade.render(about_jade.scala:48)
at org.fusesource.scalate.DefaultRenderContext.capture(DefaultRenderContext.scala:92)
at org.fusesource.scalate.layout.DefaultLayoutStrategy.layout(DefaultLayoutStrategy.scala:45)
at org.fusesource.scalate.TemplateEngine$$anonfun$layout$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(TemplateEngine.scala:559)
at org.fusesource.scalate.TemplateEngine$$anonfun$layout$1$$anonfun$apply$mcV$sp$1.apply(TemplateEngine.scala:559)
at org.fusesource.scalate.TemplateEngine$$anonfun$layout$1$$anonfun$apply$mcV$sp$1.apply(TemplateEngine.scala:559)
So that's pretty cool. The error vanishes as soon as I remove the :markdown flag, and beyond that everything compiles (aside from the markdown not being rendered correctly).
Things I know and have found so far:
There's some thought that this error is the byproduct of incompatible Scala versions somewhere in the build. My build.scala defines the Scala version as 2.10.0, which Scalatra is explicitly compatible with.
...that said, I have no idea which version of Scalate Scalatra pulls in, and my attempts to override it have not worked so far. I know the current stable release of Scalate (1.6.1) is only compatible up to Scala 2.10.0, but that's what I'm using.
I am, however, sure that my classpath is clean. I have no conflicting Scala versions. Everything is 2.10.0 in the dependencies.
Has anybody worked with this one before? Any ideas?
EDIT
Per request, here are my build definitions:
//build.sbt
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0.M5b" % "test"
libraryDependencies += "org.twitter4j" % "twitter4j-core" % "3.0.3"
libraryDependencies += "org.fusesource.scalamd" % "scalamd" % "1.5"
//build.properties
sbt.version=0.12.3
//build.scala
import sbt._
import Keys._
import org.scalatra.sbt._
import org.scalatra.sbt.PluginKeys._
import com.mojolly.scalate.ScalatePlugin._
import ScalateKeys._
object TheRangeBuild extends Build {
  val Organization = "com.gastove"
  val Name = "The Range"
  val Version = "0.1.0-SNAPSHOT"
  val ScalaVersion = "2.10.0"
  val ScalatraVersion = "2.2.1"

  lazy val project = Project(
    "the-range",
    file("."),
    settings = Defaults.defaultSettings ++ ScalatraPlugin.scalatraWithJRebel ++ scalateSettings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      resolvers += Classpaths.typesafeReleases,
      libraryDependencies ++= Seq(
        "org.scalatra" %% "scalatra" % ScalatraVersion,
        "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
        "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
        "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime",
        "org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "compile;container",
        "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "compile;container;provided;test" artifacts (Artifact("javax.servlet", "jar", "jar"))
      ),
      scalateTemplateConfig in Compile <<= (sourceDirectory in Compile) { base =>
        Seq(
          TemplateConfig(
            base / "webapp" / "WEB-INF" / "templates",
            Seq.empty, /* default imports should be added here */
            Seq(
              Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
            ), /* add extra bindings here */
            Some("templates")
          )
        )
      }
    ) ++ seq(com.typesafe.startscript.StartScriptPlugin.startScriptForClassesSettings: _*)
  )
}
When using the markdown filter, you need to add the scalamd library as a runtime dependency:
"org.fusesource.scalamd" %% "scalamd" % "1.6"
The most recent version can be found on Maven Central.
You can also delete the build.sbt file and put the dependencies into the build.scala file, which makes things a bit simpler.
import sbt._
import Keys._
import org.scalatra.sbt._
import org.scalatra.sbt.PluginKeys._
import com.mojolly.scalate.ScalatePlugin._
import ScalateKeys._
object TheRangeBuild extends Build {
  val Organization = "com.gastove"
  val Name = "The Range"
  val Version = "0.1.0-SNAPSHOT"
  val ScalaVersion = "2.10.0"
  val ScalatraVersion = "2.2.1"

  lazy val project = Project(
    "the-range",
    file("."),
    settings = Defaults.defaultSettings ++ ScalatraPlugin.scalatraWithJRebel ++ scalateSettings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      resolvers += Classpaths.typesafeReleases,
      libraryDependencies ++= Seq( // adding this Seq to the libraryDependencies
        "org.scalatra" %% "scalatra" % ScalatraVersion,
        "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
        "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
        "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime",
        "org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "compile;container",
        "org.scalatest" %% "scalatest" % "2.0.M5b" % "test",
        "org.twitter4j" % "twitter4j-core" % "3.0.3",
        "org.fusesource.scalamd" %% "scalamd" % "1.6",
        "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "compile;container;provided;test" artifacts (Artifact("javax.servlet", "jar", "jar"))
      ),
      scalateTemplateConfig in Compile <<= (sourceDirectory in Compile) { base =>
        Seq(
          TemplateConfig(
            base / "webapp" / "WEB-INF" / "templates",
            Seq.empty, /* default imports should be added here */
            Seq(
              Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
            ), /* add extra bindings here */
            Some("templates")
          )
        )
      }
    ) ++ seq(com.typesafe.startscript.StartScriptPlugin.startScriptForClassesSettings: _*)
  )
}
This comes from:
libraryDependencies += "org.fusesource.scalamd" % "scalamd" % "1.5"
Looking at the scalamd-1.5 pom.xml, it is built against Scala 2.8.1, which is not binary compatible with 2.10.
Dependency resolution keeps 2.10 and discards the 2.8.1 dependency, and you end up with this classpath issue.
The only solution you have is to build a new scalamd version against Scala 2.10, potentially fixing a few things to get it there, and then publishing it (at least locally).
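A minimal sketch of that last step: after cloning scalamd, bumping its scalaVersion to 2.10.0 and running publish-local in the clone, depend on the locally published build; the version string below is only a placeholder:
// Use whatever version you gave the locally published rebuild of scalamd.
libraryDependencies += "org.fusesource.scalamd" % "scalamd" % "1.5.1-SNAPSHOT"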