NoClassDefFoundError running tests in SBT with scoverage plugin - scala

I have an SBT project structured like the one described here: https://orrsella.com/2014/09/24/integration-and-end-to-end-test-configurations-in-sbt-for-scala-java-projects/. It has the standard main and test directories plus additional it and e2e configurations. There is also a test-all task that runs all tests. Everything works correctly except when I run e2e or test-all together with the coverage plugin; then I get: java.lang.NoClassDefFoundError: scoverage/Invoker$
Using show it:dependencyClasspath and show e2e:dependencyClasspath, I can see that the e2e classpath is missing the scoverage plugin jars. Any idea what's wrong and how to solve it?
Build.scala
import org.scalatra.sbt._
import sbt.Keys._
import sbt._

object MaAppBuild extends Build {
  val Organization = "com.my-org"
  val Name = "My App"
  val Version = "0.1.0-SNAPSHOT"
  val ScalaVersion = "2.11.6"
  val AkkaVersion = "2.3.4"
  val ScalatraVersion = "2.3.0"

  lazy val project = Project(
    "My-App",
    file("."),
    configurations = Configurations.default ++ Testing.configs,
    settings = Defaults.coreDefaultSettings ++ ScalatraPlugin.scalatraSettings ++ Testing.settings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      resolvers += "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
      resolvers += "Akka Repo" at "http://repo.akka.io/repository",
      libraryDependencies ++= Seq(
        "com.typesafe.akka" %% "akka-actor" % AkkaVersion,
        "com.typesafe.akka" % "akka-testkit_2.11" % AkkaVersion % "test;it;e2e",
        "net.databinder.dispatch" %% "dispatch-core" % "0.11.1",
        "org.scalatra" %% "scalatra" % ScalatraVersion,
        "com.typesafe.akka" %% "akka-testkit" % AkkaVersion % "test;it;e2e",
        "org.scalatra" %% "scalatra-scalatest" % ScalatraVersion % "test;it;e2e",
        "com.github.tomakehurst" % "wiremock" % "1.55" % "test;it;e2e",
        "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime",
        "org.scalatra" %% "scalatra-json" % "2.4.0.RC1",
        "org.json4s" %% "json4s-jackson" % "3.2.11",
        "com.typesafe" % "config" % "1.2.1",
        "org.json4s" %% "json4s-native" % "3.2.11",
        "org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "container",
        "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "container;provided;test" artifacts Artifact("javax.servlet", "jar", "jar")
      )
    )
  )
}
Integration and e2e tests configuration:
import sbt.Keys._
import sbt._

object Testing {
  val IntegrationTest = config("it").extend(Runtime)
  val EndToEndTest = config("e2e").extend(Runtime)
  val configs = Seq(IntegrationTest, EndToEndTest)

  lazy val testAll = TaskKey[Unit]("test-all")

  private lazy val itSettings =
    inConfig(IntegrationTest)(Defaults.testSettings) ++
      Seq(
        fork in IntegrationTest := false,
        parallelExecution in IntegrationTest := false,
        scalaSource in IntegrationTest := baseDirectory.value / "src/it/scala",
        resourceDirectory in IntegrationTest := baseDirectory.value / "src/test/resources")

  private lazy val e2eSettings =
    inConfig(EndToEndTest)(Defaults.testSettings) ++
      Seq(
        fork in EndToEndTest := false,
        parallelExecution in EndToEndTest := false,
        scalaSource in EndToEndTest := baseDirectory.value / "src/e2e/scala",
        resourceDirectory in EndToEndTest := baseDirectory.value / "src/test/resources")

  lazy val settings = e2eSettings ++ itSettings ++ Seq(
    testAll <<= (test in EndToEndTest) dependsOn (test in IntegrationTest) dependsOn (test in Test)
  )
}
The error, again, is java.lang.NoClassDefFoundError: scoverage/Invoker$, and my sbt plugins (plugins.sbt) are:
addSbtPlugin("com.mojolly.scalate" % "xsbt-scalate-generator" % "0.5.0")
addSbtPlugin("org.scalatra.sbt" % "scalatra-sbt" % "0.3.5")
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.1.0")

It seems that you need to add a setting to your sbt project. The following works for me with "org.scoverage" % "sbt-scoverage" % "1.5.0":
coverageEnabled in Test := true
For plugin versions below 1.4.0 I have found that a different form was needed:
coverageEnabled.in(ThisBuild, Test, test) := true
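As a minimal sketch of where that setting could live in the build posted above (my assumption, using sbt 0.13 syntax and sbt-scoverage 1.5.x; in a project/*.scala build the plugin keys are not auto-imported the way they are in .sbt files):

import scoverage.ScoverageKeys._  // coverageEnabled should be available here in the 1.x plugin

// in MaAppBuild, extend the existing settings Seq:
settings = Defaults.coreDefaultSettings ++ ScalatraPlugin.scalatraSettings ++ Testing.settings ++ Seq(
  coverageEnabled in Test := true,
  // ... the existing organization/name/version/libraryDependencies settings ...
)

After the change, re-running show e2e:dependencyClasspath is a quick way to check whether the scoverage runtime jar now shows up for the custom configurations.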

Related

"datasource not a member of org.apache.phoenix" when trying to Save DataFrames to Phoenix using DataSourceV2

I am trying to save DataFrames to Phoenix using DataSourceV2, following this source: Apache Spark plugin.
I created a DataFrame and I want to save it to Phoenix in the following way:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SaveMode
import org.apache.phoenix.spark.datasource.v2.PhoenixDataSource
val conf = new SparkConf().setAppName("Spark sql to convert rdd to df")
val sc = new SparkContext(conf)
val sqlContext= new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
val MasterDF = MasterRecordSeq.toDF()
MasterDF.write
.format("phoenix")
.mode(SaveMode.Overwrite)
.options(Map("table" -> masterTableName, PhoenixDataSource.ZOOKEEPER_URL -> "phoenix-server:2181"))
.save()
But the import org.apache.phoenix.spark.datasource.v2.PhoenixDataSource is not recognized. It throws the following error:
object datasource is not a member of package org.apache.phoenix.spark
I have searched a lot on the internet but I'm not able to find what the problem is.
The following are the dependencies I added in build.sbt:
libraryDependencies += "org.apache.phoenix" % "phoenix-spark" % "5.0.0-HBase-2.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
libraryDependencies += "org.apache.phoenix" % "phoenix-core" % "5.0.0-HBase-2.0"
The following is the complete build file:
import NativePackagerHelper._
import java.util.Properties
import com.typesafe.sbt.packager.MappingsHelper._
//import sbtrelease.ReleaseStateTransformations._
name := """gavel"""
//scapegoatVersion in ThisBuild := "1.1.0"
//version := sys.env.get("BUILD_NUMBER").getOrElse("3.0-LOCAL")
version := "3.0"
scalaVersion := "2.11.12"
//crossScalaVersions := Seq("2.11.11", "2.12.3")
//scapegoatVersion in ThisBuild := "1.3.5"
scalaBinaryVersion in ThisBuild := "2.12"
javacOptions ++= Seq("-source", "1.6", "-target", "1.6")
scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature")
scalacOptions in (Compile, doc) ++= Seq("-unchecked", "-deprecation", "-diagrams", "-implicits", "-skip-packages", "samples")
lazy val root = (project in file(".")).enablePlugins(PlayScala,sbtdocker.DockerPlugin,JavaAppPackaging).settings(
watchSources ++= (baseDirectory.value / "public/frontend" ** "*").get
)
mainClass := Some("play.core.server.ProdServerStart")
fullClasspath in assembly += Attributed.blank(PlayKeys.playPackageAssets.value)
mappings in Universal ++= directory(baseDirectory.value / "public")
unmanagedBase := baseDirectory.value / "libs"
routesGenerator := InjectedRoutesGenerator
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
libraryDependencies ++= Seq(
"com.typesafe" % "config" % "1.3.1",
"mysql" % "mysql-connector-java" % "5.1.34",
"com.typesafe.play" %% "play-slick" % "3.0.0",
"com.typesafe.play" %% "play-slick-evolutions" % "3.0.0",
"com.typesafe.play" %% "play-json" % "2.6.0",
"org.scalatestplus.play" %% "scalatestplus-play" % "3.0.0" % "test",
specs2 % Test,
// "io.rest-assured" % "rest-assured" % "3.0.0" % "test",
// "io.rest-assured" % "scala-support" % "3.0.0" % "test",
// "com.squareup.okhttp" % "mockwebserver" % "2.5.0" % "test",
"javax.mail" % "mail" % "1.4",
"io.swagger" %% "swagger-play2" % "1.6.1",
"com.fasterxml.jackson.core" % "jackson-databind" % "2.4.0",
"com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.0",
"com.google.code.gson" % "gson" % "1.7.1",
"commons-io" % "commons-io" % "2.4",
"com.typesafe.akka" %% "akka-actor" % "2.4.16",
"com.typesafe.akka" %% "akka-testkit" % "2.4.16" % "test",
"org.typelevel" %% "macro-compat" % "1.1.1",
"org.scala-lang" % "scala-reflect" % scalaVersion.value % "provided",
"org.scalatest" %% "scalatest" % "3.0.0" % "test",
compilerPlugin("org.scalamacros" %% "paradise" % "2.1.0" cross CrossVersion.full),
guice
)
libraryDependencies ++= Seq(
"com.101tec" % "zkclient" % "0.4",
"org.apache.kafka" % "kafka_2.10" % "0.8.1.1"
exclude("javax.jms", "jms")
exclude("com.sun.jdmk", "jmxtools")
exclude("com.sun.jmx", "jmxri")
)
libraryDependencies += ws
libraryDependencies += ehcache
// https://mvnrepository.com/artifact/org.apache.phoenix/phoenix-spark
libraryDependencies += "org.apache.phoenix" % "phoenix-spark" % "5.0.0-HBase-2.0"
libraryDependencies += "com.google.protobuf" % "protobuf-java" % "2.4.0"
libraryDependencies += "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
libraryDependencies += "com.google.code.gson" % "gson" % "2.3"
libraryDependencies += "org.apache.phoenix" % "phoenix-queryserver-client" % "4.13.1-HBase-1.2"
libraryDependencies += "com.github.takezoe" %% "solr-scala-client" % "0.0.19"
libraryDependencies += "com.squareup.okhttp" % "okhttp" % "2.7.0"
libraryDependencies += "org.threeten" % "threetenbp" % "1.2"
libraryDependencies += "io.gsonfire" % "gson-fire" % "1.0.1"
libraryDependencies += "au.com.bytecode" % "opencsv" % "2.4"
libraryDependencies += "org.simplejavamail" % "simple-java-mail" % "5.0.8"
libraryDependencies += "org.apache.solr" % "solr-solrj" % "6.6.2"
libraryDependencies += "com.jcraft" % "jsch" % "0.1.55"
libraryDependencies += "com.vmware" % "vijava" % "5.1"
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre8" % Test
//libraryDependencies += "com.microsoft.sqlserver" % "sqljdbc4" % "4.0"
libraryDependencies += "org.apache.poi" % "poi" % "3.17"
libraryDependencies += "org.apache.poi" % "poi-ooxml" % "3.17"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
libraryDependencies += "org.apache.phoenix" % "phoenix-core" % "5.0.0-HBase-2.0"
crossSbtVersions := Seq("0.13.17", "1.1.6")
publishTo := {
val isSnapshotValue = isSnapshot.value
val nexus = "https://oss.sonatype.org/"
if(isSnapshotValue) Some("snapshots" at nexus + "content/repositories/snapshots")
else Some("releases" at nexus + "service/local/staging/deploy/maven2")
}
publishMavenStyle := true
publishArtifact in Test := false
parallelExecution in Test := false
dockerfile in docker := {
// The assembly task generates a fat JAR file
val artifact: File = assembly.value
val artifactTargetPath = s"/app/${artifact.name}"
new Dockerfile {
from("java")
from("mysql:5.7")
add(artifact, artifactTargetPath)
entryPoint("java", "-jar", artifactTargetPath)
}
}
val appProperties = settingKey[Properties]("The application properties")
appProperties := {
val prop = new Properties()
IO.load(prop, new File("./conf/database.conf"))
prop
}
javaOptions in Test += "-Dconfig.file=conf/application.test.conf"
resolvers += "Sonatype snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/"
sourceDirectories in (Compile, TwirlKeys.compileTemplates) :=
(unmanagedSourceDirectories in Compile).value
flywayDriver := "com.mysql.jdbc.Driver"
flywayUrl := appProperties.value.getProperty("slick.dbs.default.db.url").replaceAll("\"", "")
flywayUser := appProperties.value.getProperty("slick.dbs.default.db.user")
flywayPassword := appProperties.value.getProperty("slick.dbs.default.db.password").replaceAll("\"", "")
flywayLocations := Seq("filesystem:conf/db/default")
fork in run := true
//coverageEnabled := false
//coverageMinimum := 70
//coverageFailOnMinimum := true
//coverageHighlighting := true
publishArtifact in Test := false
parallelExecution in Test := false
enablePlugins(SbtProguard)
import com.lightbend.sbt.SbtProguard._
javaOptions in (Proguard, proguard) := Seq("-Xmx2G")
proguardOptions in Proguard ++= Seq("-dontnote", "-dontwarn", "-ignorewarnings")
proguardOptions in Proguard += ProguardOptions.keepMain("some.MainClass")
proguardMergeStrategies in Proguard += ProguardMerge.append("*.conf")
proguardMergeStrategies in Proguard ++= Seq(
ProguardMerge.discard("\\.zip$".r),
ProguardMerge.discard("\\.xml$".r),
ProguardMerge.discard("\\.txt$".r),
ProguardMerge.discard("\\.conf$".r),
ProguardMerge.discard("\\.jar$".r)
)
My Phoenix version is 5.0. My HBase version is 2.0.2.3.1.0.0-78. Am I missing any configuration?
I had the same problem (error), but in my specific case it was for a Scala script executed by Spark on a Hortonworks big data cluster.
I managed to solve it by compiling the phoenix-spark connector repository available on GitHub and copying the jar into the Spark jars directory.
Here are the commands I ran to build the jar; I hope it helps.
sudo yum install maven
wget https://github.com/apache/phoenix-connectors/archive/master.zip
unzip master.zip
cd phoenix-connectors/phoenix-spark
mvn clean compile
mvn package
cd target/scala-2.12/
cp phoenix-spark-1.0.0-SNAPSHOT.jar /usr/hdp/current/spark2-client/jars
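A small sbt-side alternative (my sketch, not part of the answer above): the posted build.sbt already sets unmanagedBase := baseDirectory.value / "libs", so dropping the locally built connector jar into that libs directory also puts the org.apache.phoenix.spark.datasource.v2 classes on the compile classpath:

// already present in the posted build.sbt: every jar under ./libs is an unmanaged dependency
unmanagedBase := baseDirectory.value / "libs"

// then copy the jar produced by the Maven build into ./libs, for example
// (the exact source path depends on where you ran the build):
//   cp target/scala-2.12/phoenix-spark-1.0.0-SNAPSHOT.jar <project-root>/libs/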

Import sbt and typesafe in build (IntelliJ)

I can't import the sbt and typesafe libraries into build.sbt in IntelliJ.
The sbt and typesafe dependencies are declared in the plugin.sbt file, where the addSbtPlugin calls are also shown in red:
plugin.sbt (screenshot)
while the library imports are inside the build.sbt file:
build.sbt (screenshot)
How can I fix this?
Update
The build.sbt file is this:
import com.typesafe.sbt.license.{DepModuleInfo, LicenseCategory, LicenseInfo}
import sbt._
import scala.io.Source
// Core library versions (the ones that are used multiple times)
val sparkVersion: String = "2.3.1"
val slf4jVersion: String = "1.7.25"
val logbackVersion: String = "1.2.3"
// Artifactory settings
val artifactoryRealm: String = "artifactory-espoo1.int.net.nokia.com"
val artifactoryUrl: String = s"https://$artifactoryRealm/artifactory/"
val artifactoryUser: Option[String] = sys.env.get("ARTIFACTORY_USER")
val artifactoryPassword: Option[String] = sys.env.get("ARTIFACTORY_PASSWORD")
// Project variables
val organizationId: String = "com.nokia.gs.npo.ae"
val rootPackage: String = organizationId + ".rfco"
// Base settings shared across modules
val baseSettings: Seq[SettingsDefinition] = Seq(
organization := organizationId,
version := Source.fromFile(file("VERSION")).mkString.trim + sys.env.getOrElse("VERSION_TAG", ""),
scalaVersion := "2.11.12",
buildInfoUsePackageAsPath := true,
scalafmtOnCompile in ThisBuild := false, // just invoke `sbt scalafmt` before commits!
parallelExecution in ThisBuild := false,
fork in Test := true,
testForkedParallel in Test := true,
logLevel in test := util.Level.Info,
coverageMinimum := sys.env.getOrElse("COVERAGE_MINIMUM", "80.0").toDouble,
coverageFailOnMinimum := true,
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
"org.apache.spark" %% "spark-hive" % sparkVersion % Provided,
"org.slf4j" % "slf4j-api" % slf4jVersion % Compile,
"com.nokia.gs.ncs.chubs.common" %% "spark-commons" % "0.5.10" % Compile,
"com.nokia.gs.ncs.chubs.common" %% "lang" % "0.2.0" % Compile,
"com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.8" % Compile,
"com.typesafe.play" %% "play-json" % "2.7.1" % Compile,
"org.apache.commons" % "commons-csv" % "1.7" % Compile,
"org.scalatest" %% "scalatest" % "3.0.5" % Test,
"ch.qos.logback" % "logback-classic" % logbackVersion % Test,
"ch.qos.logback" % "logback-core" % logbackVersion % Test,
"org.apache.spark" %% "spark-hive-thriftserver" % sparkVersion % Test,
"com.github.tomakehurst" % "wiremock-standalone" % "2.22.0" % Test
),
excludeDependencies ++= Seq(
"com.fasterxml.jackson.module" % "jackson-module-scala",
"org.slf4j" % "slf4j-log4j12",
"org.hamcrest" % "hamcrest-core",
"javax.servlet" % "servlet-api"
),
publishTo := {
Some("Artifactory Realm" at artifactoryUrl + sys.env.getOrElse("ARTIFACTORY_LOCATION", "ava-maven-snapshots-local"))
},
packagedArtifacts in publish ~= { m =>
val classifiersToExclude = Set(Artifact.SourceClassifier)
m.filter({ case (art, _) => art.classifier.forall(c => !classifiersToExclude.contains(c)) })
},
(artifactoryUser, artifactoryPassword) match {
case (Some(user), Some(password)) =>
credentials += Credentials("Artifactory Realm", artifactoryRealm, user, password)
case _ =>
println("[info] USERNAME and/or PASSWORD is missing for publishing to Artifactory")
credentials := Seq()
}
)
Looking at the build.sbt, your plugins.sbt should contain at least these lines:
addSbtPlugin("com.typesafe.sbt" % "sbt-license-report" % "1.2.0")
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.9.0")
addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.2.1")
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.6.1")
I solved the sbt import by creating a new project in IntelliJ IDEA and importing the source files into the project I created. Oddly, it now performs the import; before, I had opened the source folder directly in IntelliJ, which never triggered the sbt import.
build.sbt (screenshot)
I still have to resolve the typesafe dependencies, but that is a problem with external and private dependencies.
Thanks for your help

container:start stopped working in Scalatra sbt project : "invalid key"

I have a Scalatra project built the standard way using giter8.
I am uncertain why container:start no longer functions in my Scalatra project: no change was made to build.sbt. Here is the error:
Using /home/stephen/.sbt/0.12.0 as sbt dir, -sbt-dir to override.
[info] Set current project to wfdemo (in build file:/home/stephen/wfdemo/)
> container:start
[error] Not a valid key: start (similar: state, startYear, target)
[error] container:start
[error]
Here is the sbt:
object KeywordsservletBuild extends Build {
val Organization = "com.astralync"
val Name = "KeywordsServlet"
val Version = "0.1.0-SNAPSHOT"
val ScalaVersion = "2.11.7"
val ScalatraVersion = "2.4.0-RC2-2"
lazy val project = Project (
"keywordsservlet",
file("."),
settings = ScalatraPlugin.scalatraSettings ++ scalateSettings ++ Seq(
organization := Organization,
name := Name,
version := Version,
scalaVersion := ScalaVersion,
resolvers += Classpaths.typesafeReleases,
resolvers += "Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "1.3.1",
"org.scalatra" %% "scalatra" % ScalatraVersion,
"org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
"org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
"ch.qos.logback" % "logback-classic" % "1.1.2" % "runtime",
"org.eclipse.jetty" % "jetty-webapp" % "9.2.13.v20150730" % "container",
"javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided",
"net.databinder" % "unfiltered-netty_2.11" % "0.8.4"
),
scalateTemplateConfig in Compile <<= (sourceDirectory in Compile){ base =>
Seq(
TemplateConfig(
base / "webapp" / "WEB-INF" / "templates",
Seq.empty, /* default imports should be added here */
Seq(
Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
), /* add extra bindings here */
Some("templates")
)
)
}
)
)
}
Suggestions appreciated.
Small but not necessarily obvious fix for this:
; container:start ; shell
The leading semicolon is required.
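If you run this combination often, one option (a sketch, assuming an sbt version that provides addCommandAlias) is to bake it into the build as an alias:

// in build.sbt: typing `startContainer` at the sbt prompt expands to the combined command
addCommandAlias("startContainer", "; container:start ; shell")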

unresolved dependency: com.kyleu#jdub-async_2.11;1.0: not found

plugin.sbt: I define resolvers and plugins here.
logLevel := Level.Debug
resolvers += "JCenter repo" at "https://bintray.com/bintray/jcenter/"
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
resolvers += "mvnrepository" at "http://mvnrepository.com/artifact/"
resolvers += "Atlassian Releases" at "https://maven.atlassian.com/public/"
resolvers += "Typesafe repository" at "https://repo.typesafe.com/typesafe/releases/"
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.2")
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.3")
addSbtPlugin("com.vmunier" % "sbt-play-scalajs" % "0.2.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-gzip" % "1.0.0")
build.sbt: There are 3 projects in this file: shared, client and server. Client is a Scala.js application, server is a Play application, and shared is a project whose code is used by both the server and client projects.
import sbt.Project.projectToRef
lazy val clients = Seq(client)
lazy val scalaV = "2.11.6"
lazy val server = (project in file("server")).settings(
scalaVersion := scalaV,
scalaJSProjects := clients,
pipelineStages := Seq(scalaJSProd, gzip),
libraryDependencies ++= Seq(
"com.vmunier" %% "play-scalajs-scripts" % "0.3.0",
"org.webjars" % "jquery" % "1.11.1",
"org.json4s" %% "json4s-jackson" % "3.2.11",
"com.github.benhutchison" %% "prickle" % "1.1.7",
"com.kyleu" %% "jdub-async" % "1.0",
specs2 % Test
)
).enablePlugins(PlayScala).
aggregate(clients.map(projectToRef): _*).
dependsOn(exampleSharedJvm)
lazy val client = (project in file("client")).settings(
scalaVersion := scalaV,
persistLauncher := true,
persistLauncher in Test := false,
sourceMapsDirectories += exampleSharedJs.base / "..",
libraryDependencies ++= Seq(
"org.scala-js" %%% "scalajs-dom" % "0.8.0",
"org.scala-js" %%% "scala-parser-combinators" % "1.0.2",
"com.github.benhutchison" %%% "prickle" % "1.1.7"
)
).enablePlugins(ScalaJSPlugin, ScalaJSPlay).
dependsOn(exampleSharedJs)
lazy val shared = (crossProject.crossType(CrossType.Pure) in file("shared")).
settings(scalaVersion := scalaV).
jsConfigure(_ enablePlugins ScalaJSPlay).
jsSettings(sourceMapsBase := baseDirectory.value / "..")
lazy val exampleSharedJvm = shared.jvm
lazy val exampleSharedJs = shared.js
onLoad in Global := (Command.process("project server", _: State)) compose (onLoad in Global).value
During compilation of the server project I get the error unresolved dependency: com.kyleu#jdub-async_2.11;1.0: not found. How can I fix this issue?

java.lang.NoSuchMethodError in Scalatra using Scalate with Markdown

So I have a Scalatra app (using Scalatra 2.2.1). I'm building views using Scalate; I've decided to go with the Jade/Markdown one-two. Only one problem: if I try to use markdown in a jade template (started with the :markdown tag), I get this:
scala.Predef$.any2ArrowAssoc(Ljava/lang/Object;)Lscala/Predef$ArrowAssoc;
java.lang.NoSuchMethodError: scala.Predef$.any2ArrowAssoc(Ljava/lang/Object;)Lscala/Predef$ArrowAssoc;
at org.fusesource.scalamd.Markdown$.<init>(md.scala:119)
at org.fusesource.scalamd.Markdown$.<clinit>(md.scala:-1)
at org.fusesource.scalate.filter.ScalaMarkdownFilter$.filter(ScalaMarkdownFilter.scala:32)
at org.fusesource.scalate.RenderContext$class.filter(RenderContext.scala:276)
at org.fusesource.scalate.DefaultRenderContext.filter(DefaultRenderContext.scala:30)
at org.fusesource.scalate.RenderContext$class.value(RenderContext.scala:235)
at org.fusesource.scalate.DefaultRenderContext.value(DefaultRenderContext.scala:30)
at templates.views.$_scalate_$about_jade$.$_scalate_$render(about_jade.scala:37)
at templates.views.$_scalate_$about_jade.render(about_jade.scala:48)
at org.fusesource.scalate.DefaultRenderContext.capture(DefaultRenderContext.scala:92)
at org.fusesource.scalate.layout.DefaultLayoutStrategy.layout(DefaultLayoutStrategy.scala:45)
at org.fusesource.scalate.TemplateEngine$$anonfun$layout$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(TemplateEngine.scala:559)
at org.fusesource.scalate.TemplateEngine$$anonfun$layout$1$$anonfun$apply$mcV$sp$1.apply(TemplateEngine.scala:559)
at org.fusesource.scalate.TemplateEngine$$anonfun$layout$1$$anonfun$apply$mcV$sp$1.apply(TemplateEngine.scala:559)
So that's pretty cool. The error vanishes as soon as I remove the :markdown filter, and apart from that everything compiles (aside from the markdown not being rendered correctly, of course).
Things I know and have found so far:
There's some thought that this error is the byproduct of incompatible Scala versions somewhere in the build. My build.scala defines the Scala version as 2.10.0, which Scalatra is explicitly compatible with.
...that said, I have no idea which version of Scalate Scalatra pulls in, and my attempts to override it have not worked so far. I know the current stable of Scalate (1.6.1) is only compatible up to Scala 2.10.0 -- but that's what I'm using.
I am, however, sure that my classpath is clean. I have no conflicting Scala versions. Everything is 2.10.0 in the dependencies.
Has anybody worked with this one before? Any ideas?
EDIT
Per request, here are my build definitions:
//build.sbt
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0.M5b" % "test"
libraryDependencies += "org.twitter4j" % "twitter4j-core" % "3.0.3"
libraryDependencies += "org.fusesource.scalamd" % "scalamd" % "1.5"
//build.properties
sbt.version=0.12.3
//build.scala
import sbt._
import Keys._
import org.scalatra.sbt._
import org.scalatra.sbt.PluginKeys._
import com.mojolly.scalate.ScalatePlugin._
import ScalateKeys._
object TheRangeBuild extends Build {
val Organization = "com.gastove"
val Name = "The Range"
val Version = "0.1.0-SNAPSHOT"
val ScalaVersion = "2.10.0"
val ScalatraVersion = "2.2.1"
lazy val project = Project (
"the-range",
file("."),
settings = Defaults.defaultSettings ++ ScalatraPlugin.scalatraWithJRebel ++ scalateSettings ++ Seq(
organization := Organization,
name := Name,
version := Version,
scalaVersion := ScalaVersion,
resolvers += Classpaths.typesafeReleases,
libraryDependencies ++= Seq(
"org.scalatra" %% "scalatra" % ScalatraVersion,
"org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
"org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
"ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime",
"org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "compile;container",
"org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "compile;container;provided;test" artifacts (Artifact("javax.servlet", "jar", "jar"))
),
scalateTemplateConfig in Compile <<= (sourceDirectory in Compile){ base =>
Seq(
TemplateConfig(
base / "webapp" / "WEB-INF" / "templates",
Seq.empty, /* default imports should be added here */
Seq(
Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
), /* add extra bindings here */
Some("templates")
)
)
}
) ++ seq(com.typesafe.startscript.StartScriptPlugin.startScriptForClassesSettings: _*)
)
}
When using the markdown filter, you need to add the scalamd library as a runtime dependency:
"org.fusesource.scalamd" %% "scalamd" % "1.6"
The most recent version can be found on Maven Central
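Applied to the question's build.sbt this would be (note the double %%, which makes sbt append the Scala binary version to the artifact name):

// replaces: libraryDependencies += "org.fusesource.scalamd" % "scalamd" % "1.5"
libraryDependencies += "org.fusesource.scalamd" %% "scalamd" % "1.6"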
Also, you can delete the build.sbt file and move its dependencies into the build.scala file, which makes things a bit simpler:
import sbt._
import Keys._
import org.scalatra.sbt._
import org.scalatra.sbt.PluginKeys._
import com.mojolly.scalate.ScalatePlugin._
import ScalateKeys._
object TheRangeBuild extends Build {
val Organization = "com.gastove"
val Name = "The Range"
val Version = "0.1.0-SNAPSHOT"
val ScalaVersion = "2.10.0"
val ScalatraVersion = "2.2.1"
lazy val project = Project(
"the-range",
file("."),
settings = Defaults.defaultSettings ++ ScalatraPlugin.scalatraWithJRebel ++ scalateSettings ++ Seq(
organization := Organization,
name := Name,
version := Version,
scalaVersion := ScalaVersion,
resolvers += Classpaths.typesafeReleases,
libraryDependencies ++= Seq( // adding this Seq to the libraryDependencies
"org.scalatra" %% "scalatra" % ScalatraVersion,
"org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
"org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
"ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime",
"org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "compile;container",
"org.scalatest" %% "scalatest" % "2.0.M5b" % "test",
"org.twitter4j" % "twitter4j-core" % "3.0.3",
"org.fusesource.scalamd" % "scalamd" % "1.6",
"org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "compile;container;provided;test" artifacts (Artifact("javax.servlet", "jar", "jar"))
),
scalateTemplateConfig in Compile <<= (sourceDirectory in Compile){ base =>
Seq(
TemplateConfig(
base / "webapp" / "WEB-INF" / "templates",
Seq.empty, /* default imports should be added here */
Seq(
Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
), /* add extra bindings here */
Some("templates")
)
)
}
) ++ seq(com.typesafe.startscript.StartScriptPlugin.startScriptForClassesSettings: _*)
)
}
This comes from:
libraryDependencies += "org.fusesource.scalamd" % "scalamd" % "1.5"
Looking at the scalamd-1.5 pom.xml, it is built against Scala 2.8.1, which is not compatible with 2.10.
Dependency resolution keeps the Scala 2.10 standard library and discards the 2.8.1 one, so scalamd 1.5 (compiled against 2.8.1) runs against a binary-incompatible scala-library, and you end up with this NoSuchMethodError at runtime.
The only solution you have is to try and build a new scalamd version against Scala 2.10, potentially fix a few things to get it there, and then publish it (at least locally).
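A sketch of how the build could then consume such a locally published rebuild (the version string below is hypothetical, whatever you choose when publishing):

// after running `publishLocal` from the rebuilt scalamd sources
// (sbt resolves ~/.ivy2/local by default, so no extra resolver is needed):
libraryDependencies += "org.fusesource.scalamd" % "scalamd" % "1.5.1-local" // hypothetical version; use %% if you publish it cross-built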