Problems compiling a Scala project using SBT

I am trying to run a Scala project using SBT.
When compiling the project I get the error below:
play.sbt.PlayExceptions$CompilationException: Compilation error[Symbol 'type scalaxb.XMLStandardTypes' is missing from the classpath.
This symbol is required by 'trait connectors.concept.XMLProtocol'.
Make sure that type XMLStandardTypes is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'XMLProtocol.class' was compiled against an incompatible version of scalaxb.]
at play.sbt.PlayExceptions$CompilationException$.apply(PlayExceptions.scala:34)
at play.sbt.PlayExceptions$CompilationException$.apply(PlayExceptions.scala:34)
at scala.Option.map(Option.scala:230)
at play.sbt.run.PlayReload$.$anonfun$taskFailureHandler$1(PlayReload.scala:40)
at scala.Option.map(Option.scala:230)
at play.sbt.run.PlayReload$.taskFailureHandler(PlayReload.scala:35)
at play.sbt.run.PlayReload$.compileFailure(PlayReload.scala:28)
at play.sbt.run.PlayReload$.$anonfun$compile$3(PlayReload.scala:63)
at scala.util.Either$LeftProjection.map(Either.scala:573)
at play.sbt.run.PlayReload$.compile(PlayReload.scala:63)
My project uses scalaVersion = "2.13.3" and sbt version 1.3.13.
Any pointers on how to get this to compile?
Below are my build.sbt and plugins.sbt files -
build.sbt file -
name := "test-manager"
organization := "zest"
resolvers in ThisBuild ++= Seq(
  Resolver.sonatypeRepo("releases"),
  Resolver.sonatypeRepo("snapshots"),
  "Zest releases" at "https://test/test-releases",
  "Zest snapshots" at "https://test/test-snapshots"
)
scalaVersion := "2.13.3"
scalacOptions += "-Ylog-classpath"
lazy val compactEvolutions = TaskKey[Unit]("compactEvolutions")
lazy val yarnBuild = taskKey[Unit]("Build frontend dependencies using yarn")
lazy val yarnWatch = taskKey[Unit]("Watch frontend dependencies using yarn")
lazy val supportedLanguages = Seq("en")
DefaultOptions.addCredentials
lazy val packageSettings = Seq(
  maintainer in Linux := "test#test",
  packageSummary in Linux := "Custom startscript parameters",
  packageName in Linux := name.value,
  packageDescription := "Custom startscript parameters",
  daemonUser in Linux := "test_api",
  daemonGroup in Linux := "test_api",
  // rpm specific
  rpmRelease := "1",
  rpmVendor := "test",
  rpmUrl := Some("http://test"),
  rpmLicense := Some("Private"),
  rpmGroup := Some("permission"),
  // File mappings setting. Exclude all resource files
  mappings in (Compile, packageBin) := {
    val resourceFilenames = (resources in Compile).value.map(_.getName)
    (mappings in (Compile, packageBin)).value.filterNot {
      case (_, name) => resourceFilenames.contains(name)
    }
  },
  scriptClasspath := "/test/conf" +: scriptClasspath.value
)
lazy val root = (project in file("."))
  .enablePlugins(PlayScala, JavaAppPackaging, SystemdPlugin, RpmDeployPlugin)
  .settings(packageSettings: _*)
  .settings(
    libraryDependencies ++= Seq(
      guice,
      ws,
      filters,
      "com.typesafe.play" %% "play-json" % "2.8.1",
      "org.mockito" % "mockito-all" % "1.10.19" % Test,
      "org.scalatestplus.play" %% "scalatestplus-play" % "5.1.0" % Test,
      "com.norbitltd" %% "spoiwo" % "1.8.0",
      "com.univocity" % "univocity-parsers" % "2.9.0",
      "org.tpolecat" %% "doobie-core" % "0.9.0",
      "org.tpolecat" %% "doobie-hikari" % "0.9.0",
      "org.tpolecat" %% "doobie-postgres" % "0.9.0",
      "org.tpolecat" %% "doobie-scalatest" % "0.9.0" % Test,
      "com.github.tomakehurst" % "wiremock-standalone" % "2.27.1" % Test,
      "org.flywaydb" % "flyway-core" % "6.5.5",
      "ch.qos.logback" % "logback-classic" % "1.2.3",
      "org.dispatchhttp" %% "dispatch-core" % "1.1.3",
      "org.scala-lang.modules" %% "scala-xml" % "1.3.0",
      "org.scala-lang.modules" %% "scala-parser-combinators" % "1.1.2",
      "commons-io" % "commons-io" % "2.7",
      "org.typelevel" %% "cats-laws" % "2.1.1",
      "de.svenkubiak" % "jBCrypt" % "0.4.1",
      "io.github.jyllands-posten" %% "play-prometheus-filters" % "0.6.0"
    ),
    javaOptions in Test += "-Dconfig.resource=application.test.conf",
    javacOptions ++= Seq("-source", "1.8", "-target", "1.8"),
    scalacOptions += "-target:jvm-1.8",
    maxErrors := 50,
    routesImport += "common.Bindables._",
    compactEvolutions := {
      DbCompactor.compactEvolutions()
    },
    yarnBuild := {
      scala.sys.process.Process("yarn" :: "build" :: Nil) ! new scala.sys.process.ProcessLogger {
        override def out(s: => String): Unit = println(s)
        override def err(s: => String): Unit = println(s)
        override def buffer[T](f: => T): T = f
      }
    },
    yarnWatch := {
      scala.sys.process.Process("yarn" :: "watch" :: Nil) run new scala.sys.process.ProcessLogger {
        override def out(s: => String): Unit = println(s)
        override def err(s: => String): Unit = println(s)
        override def buffer[T](f: => T): T = f
      }
    }
  )
addCommandAlias("generate-bundles", "test:runMain common.sheets.BundleGenerator")
addCommandAlias("run-local", "run -Dconfig.resource=local.conf")
addCommandAlias("run-dev", "; yarnWatch; run-local")
and plugins.sbt file -
libraryDependencies += "commons-io" % "commons-io" % "2.7"
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.8.2")
addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.4.0")
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.6.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.4")
addSbtPlugin("com.lightbend.paradox" % "sbt-paradox" % "0.8.0")
addSbtPlugin("io.github.jonas" % "sbt-paradox-material-theme" % "0.6.0")
addSbtPlugin("com.github.gseitz" % "sbt-release" % "1.0.13")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.10.0-RC1")
resolvers += Resolver.sonatypeRepo("public")
I have tried to add the scalaxb dependency in both the build.sbt and plugins.sbt files, but I still get the same error.
I have also compiled the project with scalacOptions += "-Ylog-classpath", and only one scalaxb jar is being used.
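For reference, this is roughly what I tried (the coordinates and version below are simply the ones I picked; they may well need to match the scalaxb release that XMLProtocol was originally generated with):
// in build.sbt - the scalaxb runtime library (version is my guess)
libraryDependencies += "org.scalaxb" %% "scalaxb" % "1.8.0"
// in project/plugins.sbt - the code-generation plugin
addSbtPlugin("org.scalaxb" % "sbt-scalaxb" % "1.8.0")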

Related

"datasource not a member of org.apache.phoenix" when trying to Save DataFrames to Phoenix using DataSourceV2

I am trying to save DataFrames to Phoenix using DataSourceV2, following the source mentioned below:
Apache Spark plugin
I created a dataframe and I want to save it to phoenix in the following way:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SaveMode // needed for SaveMode.Overwrite below
import org.apache.phoenix.spark.datasource.v2.PhoenixDataSource
val conf = new SparkConf().setAppName("Spark sql to convert rdd to df")
val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
val MasterDF = MasterRecordSeq.toDF()
MasterDF.write
  .format("phoenix")
  .mode(SaveMode.Overwrite)
  .options(Map("table" -> masterTableName, PhoenixDataSource.ZOOKEEPER_URL -> "phoenix-server:2181"))
  .save()
But the import org.apache.phoenix.spark.datasource.v2.PhoenixDataSource is not recognized. It throws the following error:
object datasource is not a member of package org.apache.phoenix.spark
I have searched the internet a lot, but I'm not able to find what the bug is.
The following are the dependencies I added in build.sbt:
libraryDependencies += "org.apache.phoenix" % "phoenix-spark" % "5.0.0-HBase-2.0"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
libraryDependencies += "org.apache.phoenix" % "phoenix-core" % "5.0.0-HBase-2.0"
The following is the complete build file:
import NativePackagerHelper._
import java.util.Properties
import com.typesafe.sbt.packager.MappingsHelper._
//import sbtrelease.ReleaseStateTransformations._
name := """gavel"""
//scapegoatVersion in ThisBuild := "1.1.0"
//version := sys.env.get("BUILD_NUMBER").getOrElse("3.0-LOCAL")
version := "3.0"
scalaVersion := "2.11.12"
//crossScalaVersions := Seq("2.11.11", "2.12.3")
//scapegoatVersion in ThisBuild := "1.3.5"
scalaBinaryVersion in ThisBuild := "2.12"
javacOptions ++= Seq("-source", "1.6", "-target", "1.6")
scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature")
scalacOptions in (Compile, doc) ++= Seq("-unchecked", "-deprecation", "-diagrams", "-implicits", "-skip-packages", "samples")
lazy val root = (project in file("."))
  .enablePlugins(PlayScala, sbtdocker.DockerPlugin, JavaAppPackaging)
  .settings(
    watchSources ++= (baseDirectory.value / "public/frontend" ** "*").get
  )
mainClass := Some("play.core.server.ProdServerStart")
fullClasspath in assembly += Attributed.blank(PlayKeys.playPackageAssets.value)
mappings in Universal ++= directory(baseDirectory.value / "public")
unmanagedBase := baseDirectory.value / "libs"
routesGenerator := InjectedRoutesGenerator
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
libraryDependencies ++= Seq(
  "com.typesafe" % "config" % "1.3.1",
  "mysql" % "mysql-connector-java" % "5.1.34",
  "com.typesafe.play" %% "play-slick" % "3.0.0",
  "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0",
  "com.typesafe.play" %% "play-json" % "2.6.0",
  "org.scalatestplus.play" %% "scalatestplus-play" % "3.0.0" % "test",
  specs2 % Test,
  // "io.rest-assured" % "rest-assured" % "3.0.0" % "test",
  // "io.rest-assured" % "scala-support" % "3.0.0" % "test",
  // "com.squareup.okhttp" % "mockwebserver" % "2.5.0" % "test",
  "javax.mail" % "mail" % "1.4",
  "io.swagger" %% "swagger-play2" % "1.6.1",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.0",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.0",
  "com.google.code.gson" % "gson" % "1.7.1",
  "commons-io" % "commons-io" % "2.4",
  "com.typesafe.akka" %% "akka-actor" % "2.4.16",
  "com.typesafe.akka" %% "akka-testkit" % "2.4.16" % "test",
  "org.typelevel" %% "macro-compat" % "1.1.1",
  "org.scala-lang" % "scala-reflect" % scalaVersion.value % "provided",
  "org.scalatest" %% "scalatest" % "3.0.0" % "test",
  compilerPlugin("org.scalamacros" %% "paradise" % "2.1.0" cross CrossVersion.full),
  guice
)
libraryDependencies ++= Seq(
  "com.101tec" % "zkclient" % "0.4",
  "org.apache.kafka" % "kafka_2.10" % "0.8.1.1"
    exclude("javax.jms", "jms")
    exclude("com.sun.jdmk", "jmxtools")
    exclude("com.sun.jmx", "jmxri")
)
libraryDependencies += ws
libraryDependencies += ehcache
// https://mvnrepository.com/artifact/org.apache.phoenix/phoenix-spark
libraryDependencies += "org.apache.phoenix" % "phoenix-spark" % "5.0.0-HBase-2.0"
libraryDependencies += "com.google.protobuf" % "protobuf-java" % "2.4.0"
libraryDependencies += "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13"
libraryDependencies += "com.google.code.gson" % "gson" % "2.3"
libraryDependencies += "org.apache.phoenix" % "phoenix-queryserver-client" % "4.13.1-HBase-1.2"
libraryDependencies += "com.github.takezoe" %% "solr-scala-client" % "0.0.19"
libraryDependencies += "com.squareup.okhttp" % "okhttp" % "2.7.0"
libraryDependencies += "org.threeten" % "threetenbp" % "1.2"
libraryDependencies += "io.gsonfire" % "gson-fire" % "1.0.1"
libraryDependencies += "au.com.bytecode" % "opencsv" % "2.4"
libraryDependencies += "org.simplejavamail" % "simple-java-mail" % "5.0.8"
libraryDependencies += "org.apache.solr" % "solr-solrj" % "6.6.2"
libraryDependencies += "com.jcraft" % "jsch" % "0.1.55"
libraryDependencies += "com.vmware" % "vijava" % "5.1"
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre8" % Test
//libraryDependencies += "com.microsoft.sqlserver" % "sqljdbc4" % "4.0"
libraryDependencies += "org.apache.poi" % "poi" % "3.17"
libraryDependencies += "org.apache.poi" % "poi-ooxml" % "3.17"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
libraryDependencies += "org.apache.phoenix" % "phoenix-core" % "5.0.0-HBase-2.0"
crossSbtVersions := Seq("0.13.17", "1.1.6")
publishTo := {
  val isSnapshotValue = isSnapshot.value
  val nexus = "https://oss.sonatype.org/"
  if (isSnapshotValue) Some("snapshots" at nexus + "content/repositories/snapshots")
  else Some("releases" at nexus + "service/local/staging/deploy/maven2")
}
publishMavenStyle := true
publishArtifact in Test := false
parallelExecution in Test := false
dockerfile in docker := {
  // The assembly task generates a fat JAR file
  val artifact: File = assembly.value
  val artifactTargetPath = s"/app/${artifact.name}"
  new Dockerfile {
    from("java")
    from("mysql:5.7")
    add(artifact, artifactTargetPath)
    entryPoint("java", "-jar", artifactTargetPath)
  }
}
val appProperties = settingKey[Properties]("The application properties")
appProperties := {
  val prop = new Properties()
  IO.load(prop, new File("./conf/database.conf"))
  prop
}
javaOptions in Test += "-Dconfig.file=conf/application.test.conf"
resolvers += "Sonatype snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/"
sourceDirectories in (Compile, TwirlKeys.compileTemplates) :=
  (unmanagedSourceDirectories in Compile).value
flywayDriver := "com.mysql.jdbc.Driver"
flywayUrl := appProperties.value.getProperty("slick.dbs.default.db.url").replaceAll("\"", "")
flywayUser := appProperties.value.getProperty("slick.dbs.default.db.user")
flywayPassword := appProperties.value.getProperty("slick.dbs.default.db.password").replaceAll("\"", "")
flywayLocations := Seq("filesystem:conf/db/default")
fork in run := true
//coverageEnabled := false
//coverageMinimum := 70
//coverageFailOnMinimum := true
//coverageHighlighting := true
publishArtifact in Test := false
parallelExecution in Test := false
enablePlugins(SbtProguard)
import com.lightbend.sbt.SbtProguard._
javaOptions in (Proguard, proguard) := Seq("-Xmx2G")
proguardOptions in Proguard ++= Seq("-dontnote", "-dontwarn", "-ignorewarnings")
proguardOptions in Proguard += ProguardOptions.keepMain("some.MainClass")
proguardMergeStrategies in Proguard += ProguardMerge.append("*.conf")
proguardMergeStrategies in Proguard ++= Seq(
  ProguardMerge.discard("\\.zip$".r),
  ProguardMerge.discard("\\.xml$".r),
  ProguardMerge.discard("\\.txt$".r),
  ProguardMerge.discard("\\.conf$".r),
  ProguardMerge.discard("\\.jar$".r)
)
My Phoenix version is 5.0 and my HBase version is 2.0.2.3.1.0.0-78. Am I missing any configuration?
I had the same problem (error), but in my specific case it was for a Scala script executed by Spark on a Hortonworks big data cluster.
I managed to solve it by compiling the phoenix-spark connector from the phoenix-connectors repository available on GitHub and copying the jar into the Spark directory.
Here are the commands I ran to build the jar; I hope it helps.
sudo yum install maven
wget https://github.com/apache/phoenix-connectors/archive/master.zip
unzip master.zip
cd phoenix-connectors-master/phoenix-spark
mvn clean compile
mvn package
cd target/scala-2.12/
cp phoenix-spark-1.0.0-SNAPSHOT.jar /usr/hdp/current/spark2-client/jars
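If you want sbt to pick up the jar as well, note that the build file in the question already sets unmanagedBase := baseDirectory.value / "libs", so copying the built jar into libs/ should be enough; making it explicit would look roughly like this (the jar name comes from the maven build above, and the path is an assumption):
// sketch: reference the locally built connector jar as an unmanaged dependency
unmanagedJars in Compile += Attributed.blank(baseDirectory.value / "libs" / "phoenix-spark-1.0.0-SNAPSHOT.jar")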

Import sbt and typesafe in build (IntelliJ)

I can't import the sbt and typesafe libraries into build.sbt in IntelliJ.
The sbt and typesafe dependencies are in the plugin.sbt file, where the addSbtPlugin calls are also shown in red:
(screenshot: plugin.sbt)
The library imports, meanwhile, are inside the build.sbt file:
(screenshot: build.sbt)
How can I fix this?
Update
The build.sbt file is this:
import com.typesafe.sbt.license.{DepModuleInfo, LicenseCategory, LicenseInfo}
import sbt._
import scala.io.Source
// Core library versions (the ones that are used multiple times)
val sparkVersion: String = "2.3.1"
val slf4jVersion: String = "1.7.25"
val logbackVersion: String = "1.2.3"
// Artifactory settings
val artifactoryRealm: String = "artifactory-espoo1.int.net.nokia.com"
val artifactoryUrl: String = s"https://$artifactoryRealm/artifactory/"
val artifactoryUser: Option[String] = sys.env.get("ARTIFACTORY_USER")
val artifactoryPassword: Option[String] = sys.env.get("ARTIFACTORY_PASSWORD")
// Project variables
val organizationId: String = "com.nokia.gs.npo.ae"
val rootPackage: String = organizationId + ".rfco"
// Base settings shared across modules
val baseSettings: Seq[SettingsDefinition] = Seq(
organization := organizationId,
version := Source.fromFile(file("VERSION")).mkString.trim + sys.env.getOrElse("VERSION_TAG", ""),
scalaVersion := "2.11.12",
buildInfoUsePackageAsPath := true,
scalafmtOnCompile in ThisBuild := false, // just invoke `sbt scalafmt` before commits!
parallelExecution in ThisBuild := false,
fork in Test := true,
testForkedParallel in Test := true,
logLevel in test := util.Level.Info,
coverageMinimum := sys.env.getOrElse("COVERAGE_MINIMUM", "80.0").toDouble,
coverageFailOnMinimum := true,
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
"org.apache.spark" %% "spark-hive" % sparkVersion % Provided,
"org.slf4j" % "slf4j-api" % slf4jVersion % Compile,
"com.nokia.gs.ncs.chubs.common" %% "spark-commons" % "0.5.10" % Compile,
"com.nokia.gs.ncs.chubs.common" %% "lang" % "0.2.0" % Compile,
"com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.8" % Compile,
"com.typesafe.play" %% "play-json" % "2.7.1" % Compile,
"org.apache.commons" % "commons-csv" % "1.7" % Compile,
"org.scalatest" %% "scalatest" % "3.0.5" % Test,
"ch.qos.logback" % "logback-classic" % logbackVersion % Test,
"ch.qos.logback" % "logback-core" % logbackVersion % Test,
"org.apache.spark" %% "spark-hive-thriftserver" % sparkVersion % Test,
"com.github.tomakehurst" % "wiremock-standalone" % "2.22.0" % Test
),
excludeDependencies ++= Seq(
"com.fasterxml.jackson.module" % "jackson-module-scala",
"org.slf4j" % "slf4j-log4j12",
"org.hamcrest" % "hamcrest-core",
"javax.servlet" % "servlet-api"
),
publishTo := {
Some("Artifactory Realm" at artifactoryUrl + sys.env.getOrElse("ARTIFACTORY_LOCATION", "ava-maven-snapshots-local"))
},
packagedArtifacts in publish ~= { m =>
val classifiersToExclude = Set(Artifact.SourceClassifier)
m.filter({ case (art, _) => art.classifier.forall(c => !classifiersToExclude.contains(c)) })
},
(artifactoryUser, artifactoryPassword) match {
case (Some(user), Some(password)) =>
credentials += Credentials("Artifactory Realm", artifactoryRealm, user, password)
case _ =>
println("[info] USERNAME and/or PASSWORD is missing for publishing to Artifactory")
credentials := Seq()
}
)
Looking at the build.sbt, your project/plugins.sbt should at least contain these lines:
addSbtPlugin("com.typesafe.sbt" % "sbt-license-report" % "1.2.0")
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.9.0")
addSbtPlugin("org.scalameta" % "sbt-scalafmt" % "2.2.1")
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.6.1")
I solved the sbt import by creating a new project in IntelliJ IDEA and importing the source files into it. Oddly, now the import works. Before, I had opened the source folder directly from IntelliJ, and the sbt import was never performed.
I still have to resolve the Typesafe dependencies, but that is a problem with external, private dependencies.
Thanks for your help

Why do subprojects not recognize dependencies?

I've defined two subprojects that look as follows:
val Http4sVersion = "0.21.0-M4"
val CirceVersion = "0.12.1"
val Specs2Version = "4.7.0"
val LogbackVersion = "1.2.3"
val ScalaTestVersion = "3.0.8"
val TestContainerVersion = "1.11.3"
val KafkaTestContainerVersion = "1.11.3"
val ConfigVersion = "1.3.4"
val SpringVersion = "5.1.8.RELEASE"
val CatsVersion = "2.0.0"
lazy val settings = Seq(
  organization := "com.sweetsoft",
  name := "connector",
  scalaVersion := "2.13.0",
  addCompilerPlugin("org.typelevel" %% "kind-projector" % "0.10.3"),
  addCompilerPlugin("com.olegpy" %% "better-monadic-for" % "0.3.0"),
  scalacOptions ++= Seq(
    "-deprecation",
    "-encoding", "UTF-8",
    "-language:higherKinds",
    "-language:postfixOps",
    "-feature",
    "-Xfatal-warnings",
  ),
  scalacOptions in (Compile, console) ~= {
    _.filterNot(Set("-Xlint"))
  }
)
lazy val dependencies = Seq(
  "org.http4s" %% "http4s-blaze-server" % Http4sVersion,
  "org.http4s" %% "http4s-blaze-client" % Http4sVersion,
  "org.http4s" %% "http4s-circe" % Http4sVersion,
  "org.http4s" %% "http4s-dsl" % Http4sVersion,
  "io.circe" %% "circe-generic" % CirceVersion,
  "ch.qos.logback" % "logback-classic" % LogbackVersion,
  "org.typelevel" %% "cats-core" % CatsVersion,
  "com.typesafe" % "config" % ConfigVersion % "test",
  "org.scalactic" %% "scalactic" % ScalaTestVersion % "test",
  "org.scalatest" %% "scalatest" % ScalaTestVersion % "test",
  "org.testcontainers" % "testcontainers" % TestContainerVersion % "test",
  "org.testcontainers" % "kafka" % KafkaTestContainerVersion % "test",
  "org.springframework" % "spring-core" % SpringVersion % "test",
  "org.typelevel" %% "cats-laws" % CatsVersion % "test",
  "com.github.alexarchambault" %% "scalacheck-shapeless_1.14" % "1.2.3" % "test",
  "org.scalacheck" %% "scalacheck" % "1.14.0" % "test"
)
lazy val global = project
  .in(file("."))
  .settings(
    settings,
    libraryDependencies ++= dependencies
  )
  .aggregate(core, serversupervisor)
lazy val core = (project in file("core"))
  .settings(settings)
lazy val serversupervisor = (project in file("serversupervisor"))
  .settings(settings)
  .dependsOn(core)
As you can see, the two subprojects are core and serversupervisor.
The problem is that those two subprojects do not recognize the dependencies:
I am using IntelliJ and, as you can see, it does not recognize the dependencies.
What am I doing wrong?
Put libraryDependencies ++= dependencies into settings.
global, core and serversupervisor are three different subprojects. They can have different library dependencies. Currently you add the dependencies to global, but not to core and serversupervisor (see the sketch below).
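A minimal sketch of that first option, reusing the settings and dependencies values from the question (abbreviated):
lazy val settings = Seq(
  organization := "com.sweetsoft",
  name := "connector",
  scalaVersion := "2.13.0",
  // ... compiler plugins and scalacOptions as in the question ...
  libraryDependencies ++= dependencies // every project that uses `settings` now gets them
)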
Alternatively, you can move libraryDependencies ++= dependencies to the Global or ThisBuild scope rather than a specific subproject's scope. You can add at the top:
ThisBuild / libraryDependencies ++= dependencies
or even
Global / libraryDependencies ++= dependencies
https://www.scala-sbt.org/1.x/docs/Multi-Project.html
https://www.scala-sbt.org/1.x/docs/Scopes.html

container:start stopped working in Scalatra sbt project: "invalid key"

I have a Scalatra project built the standard way using giter8.
I am uncertain why container:start no longer functions in my Scalatra project: no change was made to build.sbt. Here is the error:
Using /home/stephen/.sbt/0.12.0 as sbt dir, -sbt-dir to override.
[info] Set current project to wfdemo (in build file:/home/stephen/wfdemo/)
> container:start
[error] Not a valid key: start (similar: state, startYear, target)
[error] container:start
[error]
Here is the sbt build definition:
object KeywordsservletBuild extends Build {
  val Organization = "com.astralync"
  val Name = "KeywordsServlet"
  val Version = "0.1.0-SNAPSHOT"
  val ScalaVersion = "2.11.7"
  val ScalatraVersion = "2.4.0-RC2-2"
  lazy val project = Project(
    "keywordsservlet",
    file("."),
    settings = ScalatraPlugin.scalatraSettings ++ scalateSettings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      resolvers += Classpaths.typesafeReleases,
      resolvers += "Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
      libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % "1.3.1",
        "org.scalatra" %% "scalatra" % ScalatraVersion,
        "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
        "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
        "ch.qos.logback" % "logback-classic" % "1.1.2" % "runtime",
        "org.eclipse.jetty" % "jetty-webapp" % "9.2.13.v20150730" % "container",
        "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided",
        "net.databinder" % "unfiltered-netty_2.11" % "0.8.4"
      ),
      scalateTemplateConfig in Compile <<= (sourceDirectory in Compile) { base =>
        Seq(
          TemplateConfig(
            base / "webapp" / "WEB-INF" / "templates",
            Seq.empty, /* default imports should be added here */
            Seq(
              Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
            ), /* add extra bindings here */
            Some("templates")
          )
        )
      }
    )
  )
}
Suggestions appreciated.
Small but not necessarily obvious fix for this:
; container:start ; shell
The leading semicolon is required.
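In other words, the session from the question becomes (same sbt prompt as above; the only change is the leading semicolon, which makes sbt treat the line as a command batch):
> ; container:start ; shell
From the OS shell, the equivalent would be sbt "; container:start ; shell" (quoted, so sbt receives the whole batch as one argument).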

NoClassDefFoundError running tests in SBT with scoverage plugin

I have an SBT project with a structure like the one described here: https://orrsella.com/2014/09/24/integration-and-end-to-end-test-configurations-in-sbt-for-scala-java-projects/. It includes the standard main and test directories, plus it and e2e. There is also a "test-all" task which runs all tests. Everything works correctly unless I run e2e or test-all together with the scoverage plugin; then I get: java.lang.NoClassDefFoundError: scoverage/Invoker$
Using show it:dependencyClasspath and show e2e:dependencyClasspath, I can see that the e2e classpath is missing the scoverage plugin jars. Any idea what's wrong and how to solve it?
Build.sbt
import org.scalatra.sbt._
import sbt.Keys._
import sbt._
object MaAppBuild extends Build {
  val Organization = "com.my-org"
  val Name = "My App"
  val Version = "0.1.0-SNAPSHOT"
  val ScalaVersion = "2.11.6"
  val AkkaVersion = "2.3.4"
  val ScalatraVersion = "2.3.0"
  lazy val project = Project(
    "My-App",
    file("."),
    configurations = Configurations.default ++ Testing.configs,
    settings = Defaults.coreDefaultSettings ++ ScalatraPlugin.scalatraSettings ++ Testing.settings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      resolvers += "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
      resolvers += "Akka Repo" at "http://repo.akka.io/repository",
      libraryDependencies ++= Seq(
        "com.typesafe.akka" %% "akka-actor" % AkkaVersion,
        "com.typesafe.akka" % "akka-testkit_2.11" % AkkaVersion % "test;it;e2e",
        "net.databinder.dispatch" %% "dispatch-core" % "0.11.1",
        "org.scalatra" %% "scalatra" % ScalatraVersion,
        "com.typesafe.akka" %% "akka-testkit" % AkkaVersion % "test;it;e2e",
        "org.scalatra" %% "scalatra-scalatest" % ScalatraVersion % "test;it;e2e",
        "com.github.tomakehurst" % "wiremock" % "1.55" % "test;it;e2e",
        "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime",
        "org.scalatra" %% "scalatra-json" % "2.4.0.RC1",
        "org.json4s" %% "json4s-jackson" % "3.2.11",
        "com.typesafe" % "config" % "1.2.1",
        "org.json4s" %% "json4s-native" % "3.2.11",
        "org.eclipse.jetty" % "jetty-webapp" % "8.1.8.v20121106" % "container",
        "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "container;provided;test" artifacts Artifact("javax.servlet", "jar", "jar")
      )
    )
  )
}
Integration and e2e tests configuration:
import sbt.Keys._
import sbt._
object Testing {
  val IntegrationTest = config("it").extend(Runtime)
  val EndToEndTest = config("e2e").extend(Runtime)
  val configs = Seq(IntegrationTest, EndToEndTest)
  lazy val testAll = TaskKey[Unit]("test-all")
  private lazy val itSettings =
    inConfig(IntegrationTest)(Defaults.testSettings) ++
      Seq(
        fork in IntegrationTest := false,
        parallelExecution in IntegrationTest := false,
        scalaSource in IntegrationTest := baseDirectory.value / "src/it/scala",
        resourceDirectory in IntegrationTest := baseDirectory.value / "src/test/resources")
  private lazy val e2eSettings =
    inConfig(EndToEndTest)(Defaults.testSettings) ++
      Seq(
        fork in EndToEndTest := false,
        parallelExecution in EndToEndTest := false,
        scalaSource in EndToEndTest := baseDirectory.value / "src/e2e/scala",
        resourceDirectory in EndToEndTest := baseDirectory.value / "src/test/resources")
  lazy val settings = e2eSettings ++ itSettings ++ Seq(
    testAll <<= (test in EndToEndTest) dependsOn (test in IntegrationTest) dependsOn (test in Test)
  )
}
plugins.sbt:
addSbtPlugin("com.mojolly.scalate" % "xsbt-scalate-generator" % "0.5.0")
addSbtPlugin("org.scalatra.sbt" % "scalatra-sbt" % "0.3.5")
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.1.0")
It seems that you need to add a setting to your sbt project:
coverageEnabled in Test := true
This works for me with "org.scoverage" % "sbt-scoverage" % "1.5.0". I have also found that for versions < 1.4.0 there was another solution:
coverageEnabled.in(ThisBuild, Test, test) := true
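For the custom it and e2e configurations from the question, the scoverage runtime (which provides the missing scoverage.Invoker class) also has to be visible on those classpaths. A sketch of one way to try that; adding the runtime artifact manually per configuration is my assumption rather than a documented plugin feature, and the version must match whatever your sbt-scoverage release pulls in:
// assumption: expose scoverage's runtime to the custom test configurations
libraryDependencies += "org.scoverage" %% "scalac-scoverage-runtime" % "1.1.0" % "it,e2e"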