I'm working on a Spark application built with sbt and Scalatra. When I compile the project, I get the following error:
$ my-spark-app git:(master) ✗ sbt
[info] Loading project definition from /home/limitless/Documents/projects/test/my-spark-app/project
[info] Updating {file:/home/limitless/Documents/projects/test/my-spark-app/project/}my-spark-app-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/limitless/Documents/projects/test/my-spark-app/project/target/scala-2.10/sbt-0.13/classes...
[info] Set current project to My Spark App Server (in build file:/home/limitless/Documents/projects/test/my-spark-app/)
> ~;jetty:stop;jetty:stop
[success] Total time: 0 s, completed Aug 31, 2017 4:43:47 PM
[success] Total time: 0 s, completed Aug 31, 2017 4:43:47 PM
1. Waiting for source changes... (press enter to interrupt)
> ~;jetty:stop;jetty:start
[success] Total time: 0 s, completed Aug 31, 2017 4:43:53 PM
[info] Updating {file:/home/limitless/Documents/projects/test/my-spark-app/}my-spark-app-server...
[info] Generating /home/limitless/Documents/projects/test/my-spark-app/target/scala-2.11/resource_managed/main/rebel.xml.
[info] Resolving org.scala-lang#scala-reflect;2.11.0 ...
[info] Done updating.
[info] Compiling Templates in Template Directory: /home/limitless/Documents/projects/test/my-spark-app/src/main/webapp/WEB-INF/templates
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[info] Compiling 5 Scala sources to /home/limitless/Documents/projects/test/my-spark-app/target/scala-2.11/classes...
[info] Packaging /home/limitless/Documents/projects/test/my-spark-app/target/scala-2.11/my-spark-app-server_2.11-0.1.0-SNAPSHOT.jar ...
[info] Done packaging.
[info] starting server ...
[success] Total time: 29 s, completed Aug 31, 2017 4:44:22 PM
1. Waiting for source changes... (press enter to interrupt)
2017-08-31 16:44:22.686:INFO::main: Logging initialized #106ms
2017-08-31 16:44:22.691:INFO:oejr.Runner:main: Runner
2017-08-31 16:44:22.766:INFO:oejs.Server:main: jetty-9.2.1.v20140609
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/limitless/Documents/projects/test/my-spark-app/target/webapp/WEB-INF/lib/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/limitless/Documents/projects/test/my-spark-app/target/webapp/WEB-INF/lib/logback-classic-1.1.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2017-08-31 16:44:31.657:WARN:oejuc.AbstractLifeCycle:main: FAILED org.eclipse.jetty.annotations.ServletContainerInitializersStarter#59309333: java.lang.NoClassDefFoundError: com/sun/jersey/spi/inject/InjectableProvider
java.lang.NoClassDefFoundError: com/sun/jersey/spi/inject/InjectableProvider
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
Here is my build.scala:
import sbt._
import Keys._
import org.scalatra.sbt._
import org.scalatra.sbt.PluginKeys._
import com.earldouglas.xwp.JettyPlugin
import com.mojolly.scalate.ScalatePlugin._
import ScalateKeys._

object MySparkAppServerBuild extends Build {
  val Organization = "com.learning"
  val Name = "My Spark App Server"
  val Version = "0.1.0-SNAPSHOT"
  val ScalaVersion = "2.11.8"
  val ScalatraVersion = "2.5.1"
  val SparkVersion = "2.2.0"

  lazy val project = Project(
    "my-spark-app-server",
    file("."),
    settings = ScalatraPlugin.scalatraWithJRebel ++ scalateSettings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      resolvers += Classpaths.typesafeReleases,
      resolvers += "Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
      resolvers += Resolver.mavenLocal,
      libraryDependencies ++= Seq(
        "junit" % "junit" % "4.12" % "test",
        "org.scalatra" %% "scalatra" % ScalatraVersion,
        "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
        "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
        "ch.qos.logback" % "logback-classic" % "1.1.5" % "runtime",
        "org.eclipse.jetty" % "jetty-webapp" % "9.2.15.v20160210" % "container",
        "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided",
        "org.apache.spark" %% "spark-core" % SparkVersion
      ),
      scalateTemplateConfig in Compile <<= (sourceDirectory in Compile) { base =>
        Seq(
          TemplateConfig(
            base / "webapp" / "WEB-INF" / "templates",
            Seq.empty, /* default imports should be added here */
            Seq(
              Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
            ), /* add extra bindings here */
            Some("templates")
          )
        )
      }
    )
  ).enablePlugins(JettyPlugin)
}
Adding the "com.sun.jersey" % "jersey-bundle" % "1.19.2" library to my build.scala resolved the problem:
libraryDependencies ++= Seq(
  "junit" % "junit" % "4.12" % "test",
  "org.scalatra" %% "scalatra" % ScalatraVersion,
  "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
  "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
  "ch.qos.logback" % "logback-classic" % "1.1.5" % "runtime",
  "org.eclipse.jetty" % "jetty-webapp" % "9.2.15.v20160210" % "container",
  "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided",
  "org.apache.spark" %% "spark-core" % SparkVersion,
  "com.sun.jersey" % "jersey-bundle" % "1.19.2"
)
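If you hit a similar NoClassDefFoundError, it helps to inspect what actually lands on the classpath before adding jars. A minimal diagnostic sketch, assuming the third-party sbt-dependency-graph plugin (an addition for illustration, not part of the original build):

// project/plugins.sbt (hypothetical addition, sbt 0.13.x)
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Running sbt dependencyTree then prints the resolved dependency tree, which makes it easy to confirm that nothing on the classpath was providing the com.sun.jersey classes before the fix.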
Related
I have a Scala Spark project that fails because of dependency hell. Here is my build.sbt:
scalaVersion := "2.13.3"
val SPARK_VERSION = "3.2.0"
libraryDependencies ++= Seq(
"com.typesafe" % "config" % "1.3.1",
"com.github.pathikrit" %% "better-files" % "3.9.1",
"org.apache.commons" % "commons-compress" % "1.14",
"commons-io" % "commons-io" % "2.6",
"com.typesafe.scala-logging" %% "scala-logging" % "3.9.4",
"ch.qos.logback" % "logback-classic" % "1.2.3" exclude ("org.slf4j", "*"),
"org.plotly-scala" %% "plotly-render" % "0.8.1",
"org.apache.spark" %% "spark-sql" % SPARK_VERSION,
"org.apache.spark" %% "spark-mllib" % SPARK_VERSION,
// Test dependencies
"org.scalatest" %% "scalatest" % "3.2.10" % Test,
"com.amazon.deequ" % "deequ" % "2.0.0-spark-3.1" % Test,
"org.awaitility" % "awaitility" % "3.0.0" % Test,
"org.apache.spark" %% "spark-core" % SPARK_VERSION % Test,
"org.apache.spark" %% "spark-sql" % SPARK_VERSION % Test
Here is the build failure:
[error] stack trace is suppressed; run 'last update' for the full output
[error] stack trace is suppressed; run 'last ssExtractDependencies' for the full output
[error] (update) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/org/apache/avro/avro-mapred/1.10.2/avro-mapred-1.10.2-hadoop2.jar: not found: https://repo1.maven.org/maven2/org/apache/avro/avro-mapred/1.10.2/avro-mapred-1.10.2-hadoop2.jar
[error] (ssExtractDependencies) lmcoursier.internal.shaded.coursier.error.FetchError$DownloadingArtifacts: Error fetching artifacts:
[error] https://repo1.maven.org/maven2/org/apache/avro/avro-mapred/1.10.2/avro-mapred-1.10.2-hadoop2.jar: not found: https://repo1.maven.org/maven2/org/apache/avro/avro-mapred/1.10.2/avro-mapred-1.10.2-hadoop2.jar
[error] Total time: 5 s, completed Dec 19, 2021, 5:14:33 PM
[info] shutting down sbt server
Is this caused by the fact that I'm using Scala 2.13?
I had to do the inevitable and add this to my build.sbt:

ThisBuild / useCoursier := false

This makes sbt fall back to Ivy for dependency resolution. The root cause appears to be that avro-mapred stopped publishing the hadoop2 classifier after Avro 1.8.x, so the avro-mapred-1.10.2-hadoop2.jar that Coursier insists on fetching simply does not exist; Ivy resolves the module without failing on it.
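If you would rather keep Coursier, a possible alternative (a sketch, untested against this exact build) is to stop requesting the classified artifact altogether and depend on the plain one:

// Hypothetical workaround: globally exclude the transitive avro-mapred
// (requested with the nonexistent hadoop2 classifier) and re-add the
// unclassified artifact, so Coursier never asks for ...-hadoop2.jar.
excludeDependencies += ExclusionRule("org.apache.avro", "avro-mapred")
libraryDependencies += "org.apache.avro" % "avro-mapred" % "1.10.2"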
I am having a build problem. Here is my sbt file:
name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"
scalacOptions ++= Seq("-feature")
Here is the full error message I am seeing:
[info] Set current project to SparkPi (in build file:/Users/xxx/prog/yyy/)
[info] Updating {file:/Users/xxx/prog/yyy/}yyy...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 2 Scala sources to /Users/xxx/prog/yyy/target/scala-2.11/classes...
[error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:6: object profile is not a member of package com.amazonaws.auth
[error] import com.amazonaws.auth.profile._
[error] ^
[error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:87: not found: type ProfileCredentialsProvider
[error] val creds = new ProfileCredentialsProvider(profile).getCredentials()
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 14 s, completed Nov 3, 2016 1:43:34 PM
And here are the imports I am trying to use:
import com.amazonaws.services.s3._
import com.amazonaws.auth.profile._
How do I import com.amazonaws.auth.profile.ProfileCredentialsProvider in Scala?
EDIT
I changed the sbt file so the spark-core version corresponds to the Scala version. New contents:
name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"
scalacOptions ++= Seq("-feature")
You are using scalaVersion := "2.11.8", but your Spark dependency pins the _2.10 artifact (spark-core_2.10), so the Scala binary versions don't match:

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
                                                        ^

Change 2.10 to 2.11:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
I upgraded build.sbt to use the latest play-slick (2.0.0), and after everything was downloaded and the application was set to run, I got this exception:
java.lang.NoSuchMethodError: play.api.Logger$.init(Ljava/io/File;Lscala/Enumeration$Value;)V
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:88)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:61)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:60)
at play.core.server.DevServerStart$.mainDevHttpMode(DevServerStart.scala:50)
at play.core.server.DevServerStart.mainDevHttpMode(DevServerStart.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at play.runsupport.Reloader$.startDevMode(Reloader.scala:207)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.devModeServer$lzycompute$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.play$sbt$run$PlayRun$$anonfun$$anonfun$$anonfun$$devModeServer$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:99)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:52)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) java.lang.reflect.InvocationTargetException
[error] Total time: 0 s, completed 31/ago/2016 23:23:25
This is my build.sbt
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.22"
libraryDependencies ++= Seq(
  //jdbc,
  cache,
  ws,
  specs2 % Test
)

libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play-slick" % "2.0.0",
  "com.typesafe.play" %% "play-slick-evolutions" % "2.0.0"
)
//libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.21"
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.21"
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
libraryDependencies += "com.sksamuel.scrimage" %% "scrimage-core" % "2.1.0"
libraryDependencies += "com.sksamuel.scrimage" %% "scrimage-io-extra" % "2.1.0"
libraryDependencies += "com.sksamuel.scrimage" %% "scrimage-filters" % "2.1.0"
and my plugins.sbt
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0")
// web plugins
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.3")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.7")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.1.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.1.0")
//offline := true
I already tried disabling dependencies but that did not solve the issue.
Any ideas?
Your Play version (2.4, per the sbt-plugin in plugins.sbt) is incompatible with the upgraded play-slick version: play-slick 2.0.0 targets Play 2.5. Try this instead in your plugins.sbt:

// upgrade to a Play 2.5.x release of the plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.0")
My sbt project takes more than 15 minutes when I run:

sbt clean compile

I am on a beefy machine on AWS, and I am fairly certain it's not a resource issue with CPU or internet bandwidth. Also, I have run this command a few times, so the Ivy cache is populated.
Here is all my build related files
/build.sbt
name := "ProjectX"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += ("org.apache.spark" %% "spark-streaming" % "1.4.1")
.exclude("org.slf4j", "slf4j-log4j12")
.exclude("log4j", "log4j")
.exclude("commons-logging", "commons-logging")
.%("provided")
libraryDependencies += ("org.apache.spark" %% "spark-streaming-kinesis-asl" % "1.4.1")
.exclude("org.slf4j", "slf4j-log4j12")
.exclude("log4j", "log4j")
.exclude("commons-logging", "commons-logging")
libraryDependencies += "org.mongodb" %% "casbah" % "2.8.1"
//test
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
//logging
libraryDependencies ++= Seq(
//facade
"org.slf4j" % "slf4j-api" % "1.7.12",
"org.clapper" %% "grizzled-slf4j" % "1.0.2",
//jcl (used by aws sdks)
"org.slf4j" % "jcl-over-slf4j" % "1.7.12",
//log4j1 (spark)
"org.slf4j" % "log4j-over-slf4j" % "1.7.12",
//log4j2
"org.apache.logging.log4j" % "log4j-api" % "2.3",
"org.apache.logging.log4j" % "log4j-core" % "2.3",
"org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.3"
//alternative to log4j2
//"org.slf4j" % "slf4j-simple" % "1.7.5"
)
/project/build.properties
sbt.version = 0.13.8
/project/plugins.sbt
logLevel := Level.Warn
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.7.0")
resolvers += "sonatype-releases" at "https://oss.sonatype.org/content/repositories/releases/"
/project/assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
In the log, do you see entries like this?

[info] [SUCCESSFUL ] org.apache.spark#spark-streaming-kinesis-asl_2.10;1.4.1!spark-streaming-kinesis-asl_2.10.jar (239ms)

That's a sign that you're downloading these artifacts; in other words, the AMI you're launching doesn't have the Ivy cache populated.

Using sbt 0.13.12 on my laptop with an SSD, I get about 5 s for clean and then update:
so-31956971> update
[info] Updating {file:/xxx/so-31956971/}app...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[success] Total time: 5 s, completed Aug 25, 2016 4:00:00 AM
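If rebaking the AMI with a warm cache isn't practical, one option (a sketch; /mnt/ivy-cache is an illustrative path, not from the original setup) is to point sbt's Ivy home at a volume that persists across instances:

// build.sbt: keep the Ivy cache on persistent storage so fresh instances
// don't re-download every artifact on the first build
ivyPaths := new IvyPaths(baseDirectory.value, Some(file("/mnt/ivy-cache")))

The same effect can be had without touching the build by starting sbt with -Dsbt.ivy.home=/mnt/ivy-cache.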
I've just upgraded to sbt 0.11.1, which doesn't seem to be fetching a certain dependency. Things worked fine before the upgrade.
I have this dependency:
"org.scalatra" %% "scalatra" % "2.1.0-SNAPSHOT",
And when I compile:
> update
[success] Total time: 0 s, completed Nov 18, 2011 5:44:16 PM
> compile
[info] Compiling 29 Scala sources and 1 Java source to
/home/yang/pod/sales/scala/target/scala-2.9.1/classes...
[error] /home/yang/pod/sales/scala/src/main/scala/com/pod/Web.scala:125:
not found: type ScalatraServlet
[error] class PodWeb extends ScalatraServlet with ScalateSupport with
FileUploadSupport {
[error] ^
[error] class file needed by ScalateSupport is missing.
[error] reference type ScalatraKernel of package org.scalatra refers
to nonexisting symbol.
[error] two errors found
[error] {file:/home/yang/pod/sales/scala/}pod/compile:compile:
Compilation failed
[error] Total time: 10 s, completed Nov 18, 2011 5:44:45 PM
The file seems to be missing:
$ ls /home/yang/.ivy2/cache/org.scalatra/scalatra_2.9.1/jars/
scalatra_2.9.1-2.1.0-SNAPSHOT-sources.jar
The file exists in the repo, though:
https://oss.sonatype.org/content/repositories/snapshots/org/scalatra/scalatra_2.9.1/2.1.0-SNAPSHOT/
This is still happening even if I blow away ~/.ivy2/. Any hints as to what's happening?
Complete build.sbt below:
name := "pod"
version := "1.0"
scalaVersion := "2.9.1"
seq(coffeeSettings: _*)
seq(webSettings :_*)
seq(sbtprotobuf.ProtobufPlugin.protobufSettings: _*)
libraryDependencies ++= Seq(
  "org.scalaquery" % "scalaquery_2.9.0" % "0.9.4",
  "postgresql" % "postgresql" % "9.0-801.jdbc4", // % "runtime",
  "com.jolbox" % "bonecp" % "0.7.1.RELEASE",
  "ru.circumflex" % "circumflex-orm" % "2.1-SNAPSHOT",
  "ru.circumflex" % "circumflex-core" % "2.1-SNAPSHOT",
  "net.sf.ehcache" % "ehcache-core" % "2.4.3",
  // snapshots needed for scala 2.9.0 support
  "org.scalatra" %% "scalatra" % "2.1.0-SNAPSHOT",
  "org.scalatra" %% "scalatra-scalate" % "2.1.0-SNAPSHOT",
  "org.scalatra" %% "scalatra-fileupload" % "2.1.0-SNAPSHOT",
  "org.fusesource.scalate" % "scalate-jruby" % "1.5.0",
  "org.fusesource.scalamd" % "scalamd" % "1.5", // % runtime,
  "org.mortbay.jetty" % "jetty" % "6.1.22",
  "net.debasishg" % "sjson_2.9.0" % "0.12",
  "com.lambdaworks" % "scrypt" % "1.2.0",
  "org.mortbay.jetty" % "jetty" % "6.1.22" % "container",
  // "org.bowlerframework" %% "core" % "0.4.1",
  "net.sf.opencsv" % "opencsv" % "2.1",
  "org.apache.commons" % "commons-math" % "2.2",
  "org.apache.commons" % "commons-lang3" % "3.0",
  "com.google.protobuf" % "protobuf-java" % "2.4.1",
  "ch.qos.logback" % "logback-classic" % "0.9.29",
  "org.scalatest" % "scalatest_2.9.0" % "1.6.1",
  "com.h2database" % "h2" % "1.3.158",
  "pentaho.weka" % "pdm-3.7-ce" % "SNAPSHOT",
  // this line doesn't work due to sbt bug:
  // https://github.com/harrah/xsbt/issues/263
  // work around by manually downloading this into the lib/ directory
  // "org.rosuda" % "jri" % "0.9-1" from "https://dev.partyondata.com/deps/jri-0.9-1.jar",
  "net.java.dev.jna" % "jna" % "3.3.0",
  "org.scalala" % "scalala_2.9.0" % "1.0.0.RC2-SNAPSHOT",
  "rhino" % "js" % "1.7R2",
  "junit" % "junit" % "4.9",
  "org.apache.commons" % "commons-email" % "1.2",
  "commons-validator" % "commons-validator" % "1.3.1",
  "oro" % "oro" % "2.0.8", // validator depends on this
  "javax.servlet" % "servlet-api" % "2.5" % "provided->default"
)
fork in run := true
javaOptions in run ++= Seq(
  "-Xmx3G",
  "-Djava.library.path=" + System.getenv("HOME") +
    "/R/x86_64-pc-linux-gnu-library/2.13/rJava/jri:" +
    "/usr/lib/R/site-library/rJava/jri"
)
//javaOptions in run ++= Seq(
// "-Dcom.sun.management.jmxremote",
// "-Dcom.sun.management.jmxremote.port=3000",
// "-Dcom.sun.management.jmxremote.authenticate=false",
// "-Dcom.sun.management.jmxremote.ssl=false"
//)
scalacOptions ++= Seq("-g:vars", "-deprecation", "-unchecked")
// needed for the scalatra snapshots
resolvers ++= Seq(
  "POD" at "https://dev.partyondata.com/deps/",
  "Scala-Tools Snapshots" at "http://scala-tools.org/repo-snapshots/",
  "Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
  "Sonatype OSS releases" at "http://oss.sonatype.org/content/repositories/releases",
  "ScalaNLP" at "http://repo.scalanlp.org/repo",
  "Pentaho" at "http://repo.pentaho.org/artifactory/pentaho/",
  "FuseSource snapshots" at "http://repo.fusesource.com/nexus/content/repositories/snapshots",
  "JBoss" at "https://repository.jboss.org/nexus/content/repositories/thirdparty-releases"
)
initialCommands in consoleQuick := """
import scalala.scalar._;
import scalala.tensor.::;
import scalala.tensor.mutable._;
import scalala.tensor.dense._;
import scalala.tensor.sparse._;
import scalala.library.Library._;
import scalala.library.LinearAlgebra._;
import scalala.library.Statistics._;
import scalala.library.Plotting._;
import scalala.operators.Implicits._;
//
import scala.collection.{mutable => mut}
import scala.collection.JavaConversions._
import ru.circumflex.orm._
import ru.circumflex.core._
"""
//
// sxr
//
// addCompilerPlugin("org.scala-tools.sxr" %% "sxr" % "0.2.7")
//
// scalacOptions <+= (scalaSource in Compile) { "-P:sxr:base-directory:" + _.getAbsolutePath }
After blowing away not just ~/.ivy2 but ~/.m2 and ~/.sbt as well, everything worked again.
Sometimes Ivy cache entries get corrupted; simply remove ~/.ivy2/cache/org.scalatra/scalatra_2.9.1/jars/ and let sbt re-fetch the dependency from the remote repo. If that doesn't work, try removing the entire cache directory (~/.ivy2/cache).
I have had occasions where Ivy has got confused. I can't tell you why, unfortunately, but I have found that things work fine after deleting the entire ~/.ivy2 directory hierarchy. Clearly you'll have to download all your dependencies again, though :-(