I am wrestling with importing the required jar files into my Scala project after writing the build.sbt script. The resolvers fail to download any Scala libraries from either the Sonatype or the Maven 2 repositories. My Scala version is 2.13.0 and sbt is 1.6.1.
When sbt builds the project, it yields the errors stated below.
download error: Caught java.io.IOException (Server returned HTTP response code: 400 for URL: https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.0 /scala-library-2.13.0 .pom) while downloading https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.0 /scala-library-2.13.0 .pom
[error] download error: Caught java.net.URISyntaxException (Illegal character in path at index 68: http://download.java.net/maven/2/org/scala-lang/scala-library/2.13.0 /scala-library-2.13.0 .pom) while downloading http://download.java.net/maven/2/org/scala-lang/scala-library/2.13.0 /scala-library-2.13.0 .pom
[error] download error: Caught java.net.ConnectException (Connection refused (Connection refused)) while downloading http://repo.typesafe.com/typesafe/releases/org/scala-lang/scala-library/2.13.0 /scala-library-2.13.0 .pom
[error] Error downloading org.json4s:json4s-native_2.13.0 :3.5.1
Here is my build.sbt script, so the error doesn't stem from a syntax error:
lazy val commonSettings = Seq(
  organization := "scala_REINFORCEMENTLEARNING",
  organizationName := "trial",
  scalaVersion := "2.13.0",
  version := "0.1.0-SNAPSHOT"
)

lazy val root = (project in file("."))
  .settings(
    commonSettings,
    name := "sarsamora",
    libraryDependencies ++= Seq(
      "org.scalatest" %% "scalatest" % "3.2.10" % "test",
      "com.typesafe" % "config" % "1.2.1",
      "commons-io" % "commons-io" % "2.4",
      "jline" % "jline" % "2.12.1",
      "org.json4s" %% "json4s-native" % "3.5.1",
      // logging
      "ch.qos.logback" % "logback-classic" % "1.1.7",
      "com.typesafe.scala-logging" %% "scala-logging" % "3.4.0",
      "org.scala-graph" %% "graph-core" % "1.11.3",
      "org.scalanlp" %% "breeze" % "0.13",
      "org.scalanlp" %% "breeze-natives" % "1.1",
      "org.scalanlp" %% "breeze-viz" % "1.1",
      "org.jfree" % "jfreechart" % "1.0.19"
    )
  )

lazy val compiler = (project in file(".")).dependsOn(root)
  .settings(commonSettings: _*)

resolvers ++= Seq(
  ("Typesafe" at "http://repo.typesafe.com/typesafe/releases/").withAllowInsecureProtocol(true),
  ("Java.net Maven2 Repository" at "http://download.java.net/maven/2/").withAllowInsecureProtocol(true)
)
Looks like you've got extra whitespace in your version number:
https://repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.0/scala-library-2.13.0 .pom
Fix the Scala version in your build.sbt and that should be it.
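For clarity, the culprit is a trailing space inside the version string, which gets interpolated into every artifact URL. A minimal sketch of the fix, assuming the stray space crept into scalaVersion:

// Before: the trailing space ends up in every dependency path
// scalaVersion := "2.13.0 "

// After: no surrounding whitespace in the version string
scalaVersion := "2.13.0"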
I am working on a Scala Spark project.
I am using the dependencies below:
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0",
  "org.apache.spark" %% "spark-hive" % "2.2.0"
)
with scalaVersion set to:
ThisBuild / scalaVersion := "2.11.8"
and I am getting the error below:
[error] sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.logging.log4j#log4j-api;2.11.1: Resolution failed several times for dependency: org.apache.logging.log4j#log4j-api;2.11.1 {compile=[compile(*), master(*)], runtime=[runtime(*)]}::
[error] typesafe-ivy-releases: unable to get resource for org.apache.logging.log4j#log4j-api;2.11.1: res=https://repo.typesafe.com/typesafe/ivy-releases/org.apache.logging.log4j/log4j-api/2.11.1/ivys/ivy.xml: java.io.IOException: Unexpected response code for CONNECT: 403
[error] sbt-plugin-releases: unable to get resource for org.apache.logging.log4j#log4j-api;2.11.1: res=https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.apache.logging.log4j/log4j-api/2.11.1/ivys/ivy.xml: java.io.IOException: Unexpected response code for CONNECT: 403
Our security team has asked us to delete the vulnerable log4j-core jar, after which the projects that use it as a transitive dependency are failing.
Is there a way to upgrade just the log4j version without upgrading the Scala or Spark versions?
There should be a way to force the build to not fetch the older, vulnerable log4j-core jar and to use version 2.17.2, which is not vulnerable, in its place.
I have tried :
dependencyOverrides += "org.apache.logging.log4j" % "log4j-core" % "2.17.2"
I have also tried the excludeAll option in sbt on the Spark dependencies, but neither solution worked for me.
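For reference, an excludeAll attempt on a Spark dependency would look roughly like this (a sketch, not the asker's verbatim code; the exact exclusion rule is an assumption):

libraryDependencies ++= Seq(
  // Drop log4j 2.x artifacts pulled in transitively by Spark
  ("org.apache.spark" %% "spark-core" % "2.2.0").excludeAll(
    ExclusionRule(organization = "org.apache.logging.log4j")
  )
)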
I just made a few updates.
I added the settings below to my sbt project, updating project/build.properties and project/assembly.sbt respectively to newer versions:
sbt.version=1.6.2
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "1.1.0")
I added the log4j dependencies at the top so that any transitive dependency now resolves to the newer version.
Given below is a sample snippet from one of my projects:
name := "project name"
version := "0.1"
scalaVersion := "2.11.8"
assemblyJarName in assembly := s"${name.value}-${version.value}.jar"
assemblyShadeRules in assembly := Seq(
ShadeRule.rename("com.google.**" -> "shaded.#1").inAll
)
lazy val root = (project in file(".")).settings(
test in assembly := {}
)
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.17.2"
libraryDependencies += "org.apache.logging.log4j" % "log4j-api" % "2.17.2"
libraryDependencies += "org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.17.2"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.2.0" % "provided"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"
libraryDependencies += "com.typesafe" % "config" % "1.3.1"
libraryDependencies += "org.scalaj" %% "scalaj-http" % "2.4.0"
The merge strategy below should be added only if there are conflicts between dependencies:
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case PathList("org", "slf4j", xs @ _*) => MergeStrategy.first
  case x => MergeStrategy.first
}
Sorry, I am fairly new to Scala and sbt. Here is my build.sbt file:
name := "test_stream"
version := "0.1"
scalaVersion := "2.12.10"
resolvers in ThisBuild += Resolver.bintrayRepo("streetcontxt", "maven")
mainClass in Compile := Some("basepackage.Main")
enablePlugins(JavaAppPackaging)
enablePlugins(DockerPlugin)
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-stream" % "2.6.1",
"com.amazonaws" % "aws-java-sdk-s3" % "1.11.693",
"com.streetcontxt" %% "kcl-akka-stream" % "2.0.3",
"me.maciejb.snappyflows" %% "snappy-flows" % "0.2.0",
"org.xerial.snappy" % "snappy-java" % "1.1.7.3",
"org.apache.hadoop" % "hadoop-common" % "2.10.0",
"org.apache.hadoop" % "hadoop-core" % "1.2.1"
)
And I get the following error:
sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-common;2.10.0: Resolution failed several times for dependency: org.apache.hadoop#hadoop-common;2.10.0 {compile=[default(compile)]}::
Try removing ~/.sbt and ~/.ivy2 and running again.
Runtime classpath according to 'show runtime:fullClasspath' contains only target/scala-2.11/classes and ~/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar.
compile:fullClasspath contains all libraryDependencies jar locations under ~/.ivy2/cache. Why is this? I am getting java.lang.NoClassDefFoundError on sbt run.
build.sbt:
name := "my-server"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= List(
"com.typesafe.slick" %% "slick" % "3.1.0" % "provided",
"com.twitter.finatra" %% "finatra-http" % "2.1.0" % "provided",
"com.roundeights" %% "hasher" % "1.2.0" % "provided",
"com.twitter" %% "util-logging" % "6.29.0" % "provided"
)
resolvers +=
"Twitter" at "http://maven.twttr.com"
resolvers ++= Seq("RoundEights" at "http://maven.spikemark.net/roundeights")
sbt run results:
Exception in thread "main" java.lang.NoClassDefFoundError: com/twitter/logging/Logger
sbt version 0.13.8
Removing "provided" was the fix here - I was using it incorrectly to resolve ambiguous subversions of dependencies (credit to pfn from freenode #scala)
I have written the following sbt file
name := "Test"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-client" % "2.7.1",
"org.apache.spark" % "spark-core_2.10" % "1.3.0",
"org.apache.avro" % "avro" % "1.7.7",
"org.apache.avro" % "avro-mapred" % "1.7.7"
)
mainClass := Some("com.test.Foo")
I also have the following assembly.sbt file in the project folder:
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")
When I do sbt assembly, I get a huge list of errors:
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /Users/abhishek.srivastava/.ivy2/cache/com.esotericsoftware.kryo/kryo/bundles/kryo-2.21.jar:com/esotericsoftware/minlog/Log$Logger.class
[error] /Users/abhishek.srivastava/.ivy2/cache/com.esotericsoftware.minlog/minlog/jars/minlog-1.2.jar:com/esotericsoftware/minlog/Log$Logger.class
[error] deduplicate: different file contents found in the following:
[error] /Users/abhishek.srivastava/.ivy2/cache/com.esotericsoftware.kryo/kryo/bundles/kryo-2.21.jar:com/esotericsoftware/minlog/Log.class
[error] /Users/abhishek.srivastava/.ivy2/cache/com.esotericsoftware.minlog/minlog/jars/minlog-1.2.jar:com/esotericsoftware/minlog/Log.class
[error] deduplicate: different file contents found in the following:
I was able to resolve the problem. Actually, there is no need to build a fat jar, because the spark-submit tool will have everything on the classpath anyway.
Thus the right way to build the jar file is:
name := "Test"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-client" % "2.7.1" % "provided",
"org.apache.spark" % "spark-core_2.10" % "1.3.0" % "provided",
"org.apache.avro" % "avro" % "1.7.7" % "provided",
"org.apache.avro" % "avro-mapred" % "1.7.7" % "provided"
)
mainClass := Some("com.test.Foo")
1. Use a MergeStrategy; see the sbt-assembly documentation and the sketch after this list.
2. Exclude the duplicated jars, for example:
lazy val hbaseLibSeq = Seq(
  ("org.apache.hbase" % "hbase" % hbaseVersion).excludeAll(
    ExclusionRule(organization = "org.slf4j"),
    ExclusionRule(organization = "org.mortbay.jetty"),
    ExclusionRule(organization = "javax.servlet")
  ),
  ("net.java.dev.jets3t" % "jets3t" % "0.6.1" % "provided").excludeAll(
    ExclusionRule(organization = "javax.servlet")
  )
)
3. Use the provided scope.
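For option 1, a minimal sketch of a merge strategy targeting the minlog/kryo duplicates from the error above (the matched path is an assumption; adjust it to the conflicts you actually see):

assemblyMergeStrategy in assembly := {
  // kryo bundles the same com/esotericsoftware/minlog classes as minlog; keep one copy
  case PathList("com", "esotericsoftware", "minlog", xs @ _*) => MergeStrategy.first
  case x =>
    // Fall back to the plugin's default strategy for everything else
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}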
To show the dependency tree:
➜ cat ~/.sbt/0.13/plugins/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.5")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
➜ cat ~/.sbt/0.13/global.sbt
net.virtualvoid.sbt.graph.Plugin.graphSettings
➜ sbt dependency-graph
I would like to use the phantom Cassandra wrapper in my Scala project, but when I try to update my sbt build I get a dependency error.
My build.sbt:
version := "1.0"
scalaVersion := "2.11.2"
seq(lsSettings :_*)
libraryDependencies ++= Seq(
"org.clapper" %% "grizzled-scala" % "1.2",
"commons-io" % "commons-io" % "2.4",
"org.rauschig" % "jarchivelib" % "0.6.0",
"com.google.code.findbugs" % "jsr305" % "3.0.0",
"org.scalatest" % "scalatest_2.11" % "2.2.0" % "test",
"com.github.nscala-time" %% "nscala-time" % "1.2.0",
"org.json4s" %% "json4s-native" % "3.2.10",
"org.scala-lang" % "scala-library" % "2.11.2",
"com.websudos" % "phantom-dsl_2.10" % "1.2.0"
)
resolvers += "grizzled-scala-resolver-0" at "https://oss.sonatype.org/content/repositories/releases"
resolvers += "Typesafe repository releases" at "http://repo.typesafe.com/typesafe/releases/"
I get the following error:
[warn] Note: Some unresolved dependencies have extra attributes. Check that these dependencies exist with the requested attributes.
[warn] com.typesafe.sbt:sbt-pgp:0.8.1 (sbtVersion=0.13, scalaVersion=2.10)
I don't know what I have to do...
Edit:
Answer from https://github.com/websudosuk/phantom/issues/119:
The error is on the POM side; a new version, 1.2.1, is coming soon...
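Once 1.2.1 is published, bumping the phantom dependency should be all that is needed (a sketch; the coordinates mirror the question, and the exact release is not yet confirmed):

"com.websudos" % "phantom-dsl_2.10" % "1.2.1"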