Getting started with sbt.
I am getting this error during sbt assembly:
deduplicate: different file contents found in the following:
../.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.6.jar:org/scalactic/SeqEqualityConstraints$.class
../.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-3.0.0.jar:org/scalactic/SeqEqualityConstraints$.class
This is my build.sbt:
scalaVersion := "2.11.8"
scalacOptions := Seq("-unchecked", "-feature", "-deprecation", "-encoding", "utf8")
libraryDependencies ++= {
val phantomV = "1.29.5"
val scalaTestV = "2.2.6"
val scalaMockV = "3.4.2"
val elastic4sV = "2.4.0"
val akkaStreamVersion = "2.4.10"
val akkaVersion = "2.3.12"
Seq(
"com.websudos" %% "phantom-dsl" % phantomV,
"com.websudos" %% "phantom-reactivestreams" % phantomV,
"com.websudos" %% "util-testing" % "0.13.0" % "test, provided",
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-http-spray-json-experimental" % akkaStreamVersion,
"com.typesafe.akka" %% "akka-http-core" % akkaStreamVersion,
"com.typesafe.akka" %% "akka-http-experimental" % akkaStreamVersion,
"com.typesafe.akka" %% "akka-http-testkit" % akkaStreamVersion,
"com.typesafe.akka" %% "akka-stream" % akkaStreamVersion,
"com.typesafe.akka" %% "akka-stream-testkit" % akkaStreamVersion,
"org.scalatest" %% "scalatest" % scalaTestV % "test, provided",
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test, provided",
"com.typesafe.play" %% "play-streams-experimental" % "2.4.6" % "provided",
"com.sksamuel.elastic4s" %% "elastic4s-core" % elastic4sV,
"com.sksamuel.elastic4s" %% "elastic4s-streams" % elastic4sV,
"org.scalamock" %% "scalamock-scalatest-support" % scalaMockV % "test, provided",
"com.typesafe" % "config" % "1.3.1"
)
}
lazy val root = project.in(file("."))
.settings(mainClass in assembly := Some("com.ind.Main"))
initialCommands := """|import akka.actor._
|import akka.pattern._
|import akka.util._
|import scala.concurrent._
|import scala.concurrent.duration._""".stripMargin
fork in run := true
test in assembly := {}
Any idea why I am getting that and how I can solve it?
===== UPDATE =====
I did manage to solve it by adding:
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
The ScalaTest and Scalactic version numbers need to match. You have ScalaTest 2.2.6 trying to use Scalactic 3.0.0, so that's the problem. One of your dependencies is probably pulling in Scalactic 3.0.0 transitively.
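For example, a minimal hedged sketch of aligning the two (pinning both to 3.0.0 is an assumption; pick whichever release line the offending dependency expects, or downgrade the transitive Scalactic instead):
//Hedged sketch: keep ScalaTest and Scalactic on the same release so only one
//copy of the org/scalactic classes reaches the assembly classpath
libraryDependencies ++= Seq(
  "org.scalactic" %% "scalactic" % "3.0.0" % "test, provided",
  "org.scalatest" %% "scalatest" % "3.0.0" % "test, provided"
)
A dependency-graph plugin can show which of your dependencies brings Scalactic 3.0.0 in.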
Related
I've defined two subprojects that look as follows:
val Http4sVersion = "0.21.0-M4"
val CirceVersion = "0.12.1"
val Specs2Version = "4.7.0"
val LogbackVersion = "1.2.3"
val ScalaTestVersion = "3.0.8"
val TestContainerVersion = "1.11.3"
val KafkaTestContainerVersion = "1.11.3"
val ConfigVersion = "1.3.4"
val SpringVersion = "5.1.8.RELEASE"
val CatsVersion = "2.0.0"
lazy val settings = Seq(
organization := "com.sweetsoft",
name := "connector",
scalaVersion := "2.13.0",
addCompilerPlugin("org.typelevel" %% "kind-projector" % "0.10.3"),
addCompilerPlugin("com.olegpy" %% "better-monadic-for" % "0.3.0"),
scalacOptions ++= Seq(
"-deprecation",
"-encoding", "UTF-8",
"-language:higherKinds",
"-language:postfixOps",
"-feature",
"-Xfatal-warnings",
),
scalacOptions in(Compile, console) ~= {
_.filterNot(Set("-Xlint"))
}
)
lazy val dependencies = Seq(
"org.http4s" %% "http4s-blaze-server" % Http4sVersion,
"org.http4s" %% "http4s-blaze-client" % Http4sVersion,
"org.http4s" %% "http4s-circe" % Http4sVersion,
"org.http4s" %% "http4s-dsl" % Http4sVersion,
"io.circe" %% "circe-generic" % CirceVersion,
"ch.qos.logback" % "logback-classic" % LogbackVersion,
"org.typelevel" %% "cats-core" % CatsVersion,
"com.typesafe" % "config" % ConfigVersion % "test",
"org.scalactic" %% "scalactic" % ScalaTestVersion % "test",
"org.scalatest" %% "scalatest" % ScalaTestVersion % "test",
"org.testcontainers" % "testcontainers" % TestContainerVersion % "test",
"org.testcontainers" % "kafka" % KafkaTestContainerVersion % "test",
"org.springframework" % "spring-core" % SpringVersion % "test",
"org.typelevel" %% "cats-laws" % CatsVersion % "test",
"com.github.alexarchambault" %% "scalacheck-shapeless_1.14" % "1.2.3" % "test",
"org.scalacheck" %% "scalacheck" % "1.14.0" % "test"
)
lazy val global = project
.in(file("."))
.settings(
settings,
libraryDependencies ++= dependencies
)
.aggregate(core, serversupervisor)
lazy val core = (project in file("core"))
.settings(settings)
lazy val serversupervisor = (project in file("serversupervisor"))
.settings(settings)
.dependsOn(core)
As you can see, the two subprojects are core and serversupervisor.
The problem is that those two subprojects do not recognize the dependencies:
I am using IntelliJ, and as you can see, it does not recognize the dependencies.
What am I doing wrong?
Put libraryDependencies ++= dependencies into settings.
global, core and serversupervisor are three different subprojects. They can have different library dependencies. Currently you add them to global but not to core and serversupervisor.
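For example, a minimal sketch of the first option (the rest of settings stays as in the question):
lazy val settings = Seq(
  organization := "com.sweetsoft",
  name := "connector",
  scalaVersion := "2.13.0",
  //compiler plugins and scalacOptions as before
  libraryDependencies ++= dependencies
)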
Alternatively, you can move libraryDependencies ++= dependencies to the Global or ThisBuild scope rather than a specific subproject scope. You can add at the top:
ThisBuild / libraryDependencies ++= dependencies
or even
Global / libraryDependencies ++= dependencies
https://www.scala-sbt.org/1.x/docs/Multi-Project.html
https://www.scala-sbt.org/1.x/docs/Scopes.html
I have an sbt project that I am trying to build into a jar with the sbt-assembly plugin.
build.sbt:
name := "project-name"
version := "0.1"
scalaVersion := "2.11.12"
val sparkVersion = "2.4.0"
libraryDependencies ++= Seq(
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test",
// spark-hive dependencies for DataFrameSuiteBase. https://github.com/holdenk/spark-testing-base/issues/143
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
"com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
"com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
"com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",
//"org.apache.hadoop" % "hadoop-aws" % "3.1.1"
"org.json" % "json" % "20180813"
)
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
test in assembly := {}
// https://github.com/holdenk/spark-testing-base
fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
parallelExecution in Test := false
When I build the project with sbt assembly, the resulting jar contains /org/junit/... and /org/opentest4j/... files.
Is there any way to not include these test-related files in the final jar?
I have tried replacing the line:
"org.scalatest" %% "scalatest" % "3.0.5" % "test"
with:
"org.scalatest" %% "scalatest" % "3.0.5" % "provided"
I am also wondering how the files end up in the jar, since junit is not referenced inside build.sbt (there are junit tests in the project, however).
Updated:
name := "project-name"
version := "0.1"
scalaVersion := "2.11.12"
val sparkVersion = "2.4.0"
val excludeJUnitBinding = ExclusionRule(organization = "junit")
libraryDependencies ++= Seq(
// Provided
"org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
"com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
"com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
"com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",
// Test
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
// Necessary
"org.json" % "json" % "20180813"
)
excludeDependencies += excludeJUnitBinding
// https://stackoverflow.com/questions/25144484/sbt-assembly-deduplication-found-error
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
// https://github.com/holdenk/spark-testing-base
fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
parallelExecution in Test := false
To exclude certain transitive dependencies of a dependency, use the excludeAll or exclude methods.
The exclude method should be used when a pom will be published for the project. It requires the organization and module name to exclude.
For example:
libraryDependencies +=
"log4j" % "log4j" % "1.2.15" exclude("javax.jms", "jms")
The excludeAll method is more flexible, but because it cannot be represented in a pom.xml, it should only be used when a pom doesn’t need to be generated.
For example,
libraryDependencies +=
"log4j" % "log4j" % "1.2.15" excludeAll(
ExclusionRule(organization = "com.sun.jdmk"),
ExclusionRule(organization = "com.sun.jmx"),
ExclusionRule(organization = "javax.jms")
)
In certain cases a transitive dependency should be excluded from all dependencies. This can be achieved by setting up ExclusionRules in excludeDependencies (for sbt 0.13.8 and above).
excludeDependencies ++= Seq(
ExclusionRule("commons-logging", "commons-logging")
)
The JUnit jar file is downloaded as part of the dependencies below.
"org.apache.spark" %% "spark-core" % sparkVersion % "provided" //(junit)
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided"// (junit)
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test" //(org.junit)
To exclude the junit file, please update your dependencies as below.
val excludeJUnitBinding = ExclusionRule(organization = "junit")
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
"org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "test" excludeAll(excludeJUnitBinding)
Update:
Please update your build.sbt as below.
resolvers += Resolver.url("bintray-sbt-plugins",
url("https://dl.bintray.com/eed3si9n/sbt-plugins/"))(Resolver.ivyStylePatterns)
val excludeJUnitBinding = ExclusionRule(organization = "junit")
libraryDependencies ++= Seq(
// Provided
"org.apache.spark" %% "spark-core" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"com.holdenkarau" %% "spark-testing-base" % "2.3.1_0.10.0" % "provided" excludeAll(excludeJUnitBinding),
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
//"com.amazonaws" % "aws-java-sdk" % "1.11.513" % "provided",
//"com.amazonaws" % "aws-java-sdk-sqs" % "1.11.513" % "provided",
//"com.amazonaws" % "aws-java-sdk-s3" % "1.11.513" % "provided",
// Test
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
// Necessary
"org.json" % "json" % "20180813"
)
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
fork in Test := true
javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled")
parallelExecution in Test := false
project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
I have tried this and it no longer downloads the junit jar file.
First of all, apologies if this question has been answered already, or if there's complete documentation or a tutorial about it; I'm a Scala newbie and I've surely missed something.
I have a C# background, and I'm trying to expand my knowledge by learning Scala. I would like to include a Scala microservice in a microservices architecture that's currently built from .NET projects. To implement the shared architectural code in the .NET projects I've developed a series of base projects, referenced by all the microservice projects; doing this in VS was pretty straightforward, but I have failed to grasp how to achieve the same thing in Scala.
In particular, I would like to understand whether there's a way to reference a Project A located in a local folder from Project B, similar to the way a dependency is referenced from Maven like this, for example:
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-http-spray-json" % "10.1.1",
"com.typesafe.akka" %% "akka-http" % "10.1.1",
"com.typesafe.akka" %% "akka-stream" % "2.5.11"
)
and what's the best way of doing it (referencing the jars, for example).
Yes, you can reference another Scala project using sbt (the simple build tool).
Let's say your architecture has
1) ChatMicroserviceApi.jar (depends on two local jars 2 and 3)
2) ChatMicroserviceSchema.jar
3) ChatMicroserviceExternalDeps.jar
Then your sbt script would look like the one below. I use sbt-assembly to create fat jars, for which you have to add addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6") in project/plugins.sbt, as shown next.
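For reference, project/plugins.sbt would then contain just that single plugin line (version taken from this answer):
//project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")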
//build.sbt
name := "ChatMicroserviceParent"
organization in ThisBuild := "com.microservice.chat"
scalaVersion in ThisBuild := "2.12.5"
version := "1.0-SNAPSHOT"
//define what parent consists of in aggregate section
lazy val ChatMicroserviceParent = project
.in(file("."))
.settings(settings)
.aggregate(ChatMicroserviceSchema,
ChatMicroserviceExternalDeps,
ChatMicroserviceApi)
//creates schema.jar
lazy val ChatMicroserviceSchema =
project.settings(name := "ChatMicroserviceSchema",
publishMavenStyle := true,
settings,
apiSchemaAssemblySettings)
//creates external-deps.jar
lazy val ChatMicroserviceExternalDeps =
project.settings(
name := "ChatMicroserviceExternalDeps",
publishMavenStyle := true,
settings,
apiSchemaAssemblySettings,
libraryDependencies ++= Seq(
"com.softwaremill.sttp" %% "core" % "1.1.12",
"com.softwaremill.sttp" %% "async-http-client-backend-future" % "1.1.12"
exclude ("org.asynchttpclient", "async-http-client"),
"org.asynchttpclient" % "async-http-client" % "2.4.4"
excludeAll( ExclusionRule(organization = "io.netty") ),
"com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.5",
"com.fasterxml.jackson.module" % "jackson-modules-java8" % "2.9.5",
"com.fasterxml.jackson.datatype" % "jackson-datatype-jsr310" % "2.9.5",
"com.fasterxml.jackson.datatype" % "jackson-datatype-jdk8" % "2.9.5"
)
)
val NettyVersion = "4.1.16.Final"
//creates api.jar which depends on schema.jar and external-deps.jar
lazy val ChatMicroserviceApi = project
.settings(
name := "ChatMicroserviceApi",
settings,
apiAssemblySettings,
libraryDependencies ++= Seq(
"com.github.finagle" %% "finch-core" % "0.18.0"
excludeAll( ExclusionRule(organization = "io.netty") )
exclude ("com.fasterxml.jackson.core", "jackson-databind"),
"com.github.finagle" %% "finch-circe" % "0.18.0",
"io.netty" % "netty-codec" % NettyVersion,
"io.netty" % "netty-codec-http" % NettyVersion,
"io.netty" % "netty-codec-http2" % NettyVersion,
"io.netty" % "netty-transport" % NettyVersion,
"io.netty" % "netty-buffer" % NettyVersion,
"io.netty" % "netty-common" % NettyVersion,
"io.netty" % "netty-resolver" % NettyVersion,
"io.netty" % "netty-handler" % NettyVersion,
"io.netty" % "netty-handler-proxy" % NettyVersion,
"io.netty" % "netty-transport-native-unix-common" % NettyVersion,
"io.netty" % "netty-transport-native-epoll" % NettyVersion,
"io.netty" % "netty-tcnative-boringssl-static" % "2.0.6.Final",
"io.netty" % "netty-codec-socks" % NettyVersion,
"io.circe" %% "circe-generic" % "0.9.3",
"com.typesafe" % "config" % "1.3.3",
"ch.qos.logback" % "logback-classic" % "1.2.3",
"com.typesafe.akka" %% "akka-actor" % "2.5.12",
"com.typesafe.akka" %% "akka-stream" % "2.5.12",
"com.typesafe.akka" %% "akka-http" % "10.1.1",
"com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.5",
"com.fasterxml.jackson.module" % "jackson-modules-java8" % "2.9.5",
"com.fasterxml.jackson.datatype" % "jackson-datatype-jsr310" % "2.9.5",
"com.fasterxml.jackson.datatype" % "jackson-datatype-jdk8" % "2.9.5",
"io.circe" %% "circe-java8" % "0.9.3",
"com.softwaremill.sttp" %% "core" % "1.1.12",
"org.asynchttpclient" % "async-http-client" % "2.4.4"
excludeAll( ExclusionRule(organization = "io.netty") ),
"com.softwaremill.sttp" %% "async-http-client-backend-future" % "1.1.12"
exclude ("org.asynchttpclient", "async-http-client"),
"joda-time" % "joda-time" % "2.9.9"
)
)
.dependsOn(ChatMicroserviceExternalDeps, ChatMicroserviceSchema)
lazy val commonDependencies = Seq()
lazy val settings =
commonSettings
lazy val compilerOptions = Seq(
"-unchecked",
"-feature",
"-language:existentials",
"-language:higherKinds",
"-language:implicitConversions",
"-language:postfixOps",
"-deprecation",
"-encoding",
"utf8"
)
lazy val commonSettings = Seq(
scalacOptions ++= compilerOptions,
resolvers ++= Seq(
Resolver.sonatypeRepo("releases"),
Resolver.sonatypeRepo("snapshots")
)
)
lazy val apiAssemblySettings = Seq(
assemblyJarName in assembly := name.value + "-" + version.value + ".jar",
mainClass in assembly := Some(
"com.microservice.MainServerClass"),
assemblyMergeStrategy in assembly := {
case PathList("reference.conf") => MergeStrategy.concat
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case _ => MergeStrategy.first
}
)
lazy val apiSchemaAssemblySettings = Seq(
assemblyJarName in assembly := name.value + "-" + version.value + ".jar",
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case _ => MergeStrategy.first
})
lazy val apiExternalDepsAssemblySettings = Seq(
assemblyJarName in assembly := name.value + "-" + version.value + ".jar",
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs @ _*) => MergeStrategy.discard
case _ => MergeStrategy.first
})
Here's a working example which might help: https://github.com/duwamish-os/chat-server
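If Project A lives in its own folder outside Project B's build (closer to what the question describes), a minimal hedged sketch of Project B's build.sbt could point at it with RootProject; the paths and val names below are illustrative assumptions, not part of the example above:
//Project B's build.sbt (hypothetical layout: ../project-a contains Project A's own sbt build)
lazy val projectA = RootProject(file("../project-a"))

lazy val projectB = (project in file("."))
  .dependsOn(projectA)
Alternatively, you can run publishLocal in Project A and then add it to Project B's libraryDependencies like any other artifact.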
I am using sbt to build a multi module project, and I want to use sbt-buildinfo to allow one of the services to have this info in it for debugging and versioning purposes.
The problem with this is that the sbt-buildinfo plugin seems to pick up only the library dependencies of the root project, not all the dependencies of the aggregated subprojects. It seems to me that if the plugin runs after the aggregation, the root project would have those dependencies and they would show up, but they do not. Maybe I'm just not understanding the process here, but that seems reasonable to me.
Now, this worked perfectly fine with one project, but with many subprojects it no longer works. In addition, it now no longer even generates the file; I don't know what I'm doing wrong at this point. Here is my build. Any help is appreciated.
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Managed
lazy val sharedSettings = Seq(
organization := "com.planalytics",
version := "0.0.1-SNAPSHOT",
name := "Ingestions",
scalaVersion := "2.11.7",
exportJars := true
) ++ net.virtualvoid.sbt.graph.Plugin.graphSettings ++ Defaults.defaultSettings
lazy val sparkVersion = "1.5.1"
lazy val akkaVersion = "2.3.12"
lazy val sprayVersion = "1.3.3"
lazy val hadoopVersion = "2.7.1"
lazy val exclusionRules = Seq(ExclusionRule(organization = "org.slf4j"),
ExclusionRule(organization = "org.apache.hadoop"))
lazy val sharedDeps = Seq(libraryDependencies ++= Seq(
"org.slf4j" % "slf4j-log4j12" % "1.7.10",
"org.scalacheck" %% "scalacheck" % "1.12.0" % "test",
"org.scalatest" %% "scalatest" % "2.2.4" % "test",
"org.scalanlp" %% "breeze-natives" % "0.11.2"))
lazy val sparkDeps = Seq(libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion excludeAll(exclusionRules: _*),
"org.apache.spark" %% "spark-sql" % sparkVersion excludeAll(exclusionRules: _*),
"org.apache.spark" %% "spark-mllib" % sparkVersion excludeAll(exclusionRules: _*),
"com.databricks" %% "spark-csv" % "1.2.0" excludeAll(exclusionRules: _*),
"org.apache.hadoop" % "hadoop-common" % hadoopVersion,
"org.apache.hadoop" % "hadoop-client" % hadoopVersion,
"org.apache.hadoop" % "hadoop-hdfs" % hadoopVersion,
"org.apache.hadoop" % "hadoop-aws" % hadoopVersion))
lazy val akkaDeps = Seq(libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster" % akkaVersion,
"com.typesafe.akka" %% "akka-remote" % akkaVersion,
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion))
lazy val sprayDeps = Seq(libraryDependencies ++= Seq(
"io.spray" %% "spray-can" % sprayVersion,
"io.spray" %% "spray-routing" % sprayVersion,
"io.spray" %% "spray-json" % "1.3.2"))
lazy val root = (project in file(".")).
settings(sharedSettings: _*).
settings(
scalacOptions += "-deprecation",
javacOptions ++= Seq("-source", "1.8", "-target", "1.8"),
javaOptions += "-Xmx2G").aggregate(ingestion, services, util).enablePlugins(BuildInfoPlugin).
settings(
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
buildInfoOptions += BuildInfoOption.BuildTime,
buildInfoPackage := "com.planalytics",
buildInfoObject := "BuildInfo"
)
lazy val ingestion = Project(
id = "Ingestion-Engine",
base = file("ingestion"),
settings = Project.defaultSettings ++ sharedSettings ++ sparkDeps ++ akkaDeps ++ sharedDeps).
settings(
name := "Ingestion-Engine",
libraryDependencies ++= Seq(
"com.github.seratch" %% "awscala" % "0.5.+",
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test"
)
).dependsOn(util)
lazy val services = Project(
id = "Ingestion-Services",
base = file("services"),
settings = Project.defaultSettings ++ sharedSettings ++ sprayDeps ++ sharedDeps).
settings(
name := "Ingestion-Services",
libraryDependencies ++= Seq(
"io.spray" %% "spray-testkit" % sprayVersion % "test"
)
).dependsOn(ingestion)
lazy val util = Project(
id = "Ingestion-Util",
base = file("util"),
settings = Project.defaultSettings ++ sharedSettings ++ sparkDeps ++ sharedDeps).
settings(
name := "Ingestion-Utils",
libraryDependencies ++= Seq(
"net.ceedubs" %% "ficus" % "1.1.2"
),
dependencyOverrides ++= Set(
"com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"
)
)
I'm also open to any thoughts on my build in general; I feel like I'm doing some things wrong or repetitively, so any suggestions there are appreciated as well. Thanks!
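One hedged idea (a sketch under assumptions, not a verified fix): enable sbt-buildinfo on the subproject whose dependencies you actually want recorded, so buildInfoKeys is evaluated in that project's own scope rather than the root's. The keys below are the standard ones plus libraryDependencies; whether serializing libraryDependencies this way covers your use case is an assumption.
//Hedged sketch: generate BuildInfo inside the services subproject itself
lazy val services = Project(id = "Ingestion-Services", base = file("services"))
  .enablePlugins(BuildInfoPlugin)
  .settings(sharedSettings ++ sprayDeps ++ sharedDeps: _*)
  .settings(
    name := "Ingestion-Services",
    libraryDependencies += "io.spray" %% "spray-testkit" % sprayVersion % "test",
    buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion, libraryDependencies),
    buildInfoPackage := "com.planalytics"
  )
  .dependsOn(ingestion)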
I'm using this template to develop a microservice:
http://www.typesafe.com/activator/template/activator-service-container-tutorial
My sbt file is like this:
import sbt._
import Keys._
name := "activator-service-container-tutorial"
version := "1.0.1"
scalaVersion := "2.11.6"
crossScalaVersions := Seq("2.10.5", "2.11.6")
resolvers += "Scalaz Bintray Repo" at "https://dl.bintray.com/scalaz/releases"
libraryDependencies ++= {
val containerVersion = "1.0.1"
val configVersion = "1.2.1"
val akkaVersion = "2.3.9"
val liftVersion = "2.6.2"
val sprayVersion = "1.3.3"
Seq(
"com.github.vonnagy" %% "service-container" % containerVersion,
"com.github.vonnagy" %% "service-container-metrics-reporting" % containerVersion,
"com.typesafe" % "config" % configVersion,
"com.typesafe.akka" %% "akka-actor" % akkaVersion exclude ("org.scala-lang" , "scala-library"),
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion exclude ("org.slf4j", "slf4j-api") exclude ("org.scala-lang" , "scala-library"),
"ch.qos.logback" % "logback-classic" % "1.1.3",
"io.spray" %% "spray-can" % sprayVersion,
"io.spray" %% "spray-routing" % sprayVersion,
"net.liftweb" %% "lift-json" % liftVersion,
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test",
"io.spray" %% "spray-testkit" % sprayVersion % "test",
"junit" % "junit" % "4.12" % "test",
"org.scalaz.stream" %% "scalaz-stream" % "0.7a" % "test",
"org.specs2" %% "specs2-core" % "3.5" % "test",
"org.specs2" %% "specs2-mock" % "3.5" % "test",
"com.twitter" %% "finagle-http" % "6.25.0",
"com.twitter" %% "bijection-util" % "0.7.2"
)
}
scalacOptions ++= Seq(
"-unchecked",
"-deprecation",
"-Xlint",
"-Ywarn-dead-code",
"-language:_",
"-target:jvm-1.7",
"-encoding", "UTF-8"
)
crossPaths := false
parallelExecution in Test := false
assemblyJarName in assembly := "santo.jar"
mainClass in assembly := Some("Service")
The project compiles fine!
But when I run assembly, the terminal shows me this:
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /path/.ivy2/cache/io.dropwizard.metrics/metrics-core/bundles/metrics-core-3.1.1.jar:com/codahale/metrics/ConsoleReporter$1.class
[error] /path/.ivy2/cache/com.codahale.metrics/metrics-core/bundles/metrics-core-3.0.1.jar:com/codahale/metrics/ConsoleReporter$1.class
What options do I have to fix it?
Thanks
The issue, as it seems, is that transitive dependencies of your dependencies are bringing in two different versions of metrics-core. The best thing to do would be to use the right library dependencies so that you end up with a single version of this library. Please use https://github.com/jrudolph/sbt-dependency-graph if it is difficult to figure out the dependencies.
If it is not possible to get to a single version, then you would most likely have to go down the exclude route, as sketched below. I assume this only works if there is compatibility between all the required versions.
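For example, a hedged sketch of that exclude route, using the excludeDependencies pattern available in sbt 0.13.8 and above (which dependency actually drags in the old com.codahale.metrics artifact is an assumption here, so confirm it with the dependency graph first):
//Hedged sketch: keep only io.dropwizard.metrics' metrics-core 3.1.1 in the assembly
excludeDependencies ++= Seq(
  ExclusionRule("com.codahale.metrics", "metrics-core")
)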