My sbt project takes more than 15 minutes when I do
sbt clean compile
I am on a beefy AWS machine, so I am fairly certain it's not a resource issue (CPU or network bandwidth). Also, I have run this command a few times, so the Ivy cache is populated.
Here are all my build-related files:
/build.sbt
name := "ProjectX"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += ("org.apache.spark" %% "spark-streaming" % "1.4.1")
.exclude("org.slf4j", "slf4j-log4j12")
.exclude("log4j", "log4j")
.exclude("commons-logging", "commons-logging")
.%("provided")
libraryDependencies += ("org.apache.spark" %% "spark-streaming-kinesis-asl" % "1.4.1")
.exclude("org.slf4j", "slf4j-log4j12")
.exclude("log4j", "log4j")
.exclude("commons-logging", "commons-logging")
libraryDependencies += "org.mongodb" %% "casbah" % "2.8.1"
//test
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
//logging
libraryDependencies ++= Seq(
//facade
"org.slf4j" % "slf4j-api" % "1.7.12",
"org.clapper" %% "grizzled-slf4j" % "1.0.2",
//jcl (used by aws sdks)
"org.slf4j" % "jcl-over-slf4j" % "1.7.12",
//log4j1 (spark)
"org.slf4j" % "log4j-over-slf4j" % "1.7.12",
//log4j2
"org.apache.logging.log4j" % "log4j-api" % "2.3",
"org.apache.logging.log4j" % "log4j-core" % "2.3",
"org.apache.logging.log4j" % "log4j-slf4j-impl" % "2.3"
//alternative to log4j2
//"org.slf4j" % "slf4j-simple" % "1.7.5"
)
/project/build.properties
sbt.version = 0.13.8
/project/plugins.sbt
logLevel := Level.Warn
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.7.0")
resolvers += "sonatype-releases" at "https://oss.sonatype.org/content/repositories/releases/"
/project/assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
In the log, do you see entries like the following?
[info] [SUCCESSFUL ] org.apache.spark#spark-streaming-kinesis-asl_2.10;1.4.1!spark-streaming-kinesis-asl_2.10.jar (239ms)
That's a sign that you're downloading these artifacts. In other words, the AMI you're launching doesn't have the Ivy cache populated.
Using sbt 0.13.12 on my laptop with an SSD, clean followed by update takes about 5 seconds:
so-31956971> update
[info] Updating {file:/xxx/so-31956971/}app...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[success] Total time: 5 s, completed Aug 25, 2016 4:00:00 AM
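If the machine really does have a warm cache and resolution itself is still the slow step, one extra suggestion (an assumption on my part, not something your log confirms) is sbt's cached resolution, available since 0.13.7:
// Possible addition to build.sbt: cache the resolved dependency graph
// so that repeated update runs skip a full re-resolution.
updateOptions := updateOptions.value.withCachedResolution(true)
Otherwise, baking a pre-populated ~/.ivy2 cache into the AMI (for example by running sbt update once while building the image) avoids re-downloading every artifact on each fresh instance.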
Integration test configuration is new to me.
I cannot get my ScalaTest integration tests to run (in sbt or IntelliJ).
My unit tests in src/test/scala run fine.
My integration tests are in src/it/scala.
If I run sbt it:test, the error is "No such setting/task".
If I run them in IntelliJ (i.e., with the Run button), I get:
Unable to load a Suite class. This could be due to an error in your runpath. Missing class: xxx.tools.es_ingester.EsIntegrationSpec
java.lang.ClassNotFoundException: xxx.tools.es_ingester.ConfluenceEsIntegrationSpec
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
The class, however, is clearly in /src/it/scala/xxx/tools/es_ingester.
Update: here is my build.sbt:
name := "xxx.tools.data_extractor"
version := "0.1"
organization := "xxx.tools"
scalaVersion := "2.11.12"
sbtVersion := "1.2.7"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.5"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
libraryDependencies += "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.9.5"
libraryDependencies += "ch.qos.logback" % "logback-core" % "1.2.3"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.25"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
libraryDependencies += "org.apache.logging.log4j" % "log4j-core" % "2.11.2"
libraryDependencies += "commons-io" % "commons-io" % "2.6"
libraryDependencies += "org.bouncycastle" % "bcprov-jdk15on" % "1.61"
libraryDependencies += "org.mockito" % "mockito-core" % "2.24.0" % Test
libraryDependencies += "com.typesafe" % "config" % "1.3.3"
libraryDependencies += "com.typesafe.play" %% "play" % "2.7.0"
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-client" % "6.6.0"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "6.6.0"
libraryDependencies += "org.elasticsearch.client" % "elasticsearch-rest-high-level-client" % "6.6.0"
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.3"
You have not added the integration test configuration to your build; you need to enable the IntegrationTest config with its default settings and scope ScalaTest to it, e.g.:
"org.scalatest" %% "scalatest" % "3.0.5" % "it, test"
For the full set of integration test settings, refer to the sbt documentation on testing.
I hope it helps.
I have upgraded my build.sbt to use the latest play-slick (2.0.0), and after everything was downloaded and the application was set to run, I got this exception:
java.lang.NoSuchMethodError: play.api.Logger$.init(Ljava/io/File;Lscala/Enumeration$Value;)V
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:88)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:61)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:60)
at play.core.server.DevServerStart$.mainDevHttpMode(DevServerStart.scala:50)
at play.core.server.DevServerStart.mainDevHttpMode(DevServerStart.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at play.runsupport.Reloader$.startDevMode(Reloader.scala:207)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.devModeServer$lzycompute$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.play$sbt$run$PlayRun$$anonfun$$anonfun$$anonfun$$devModeServer$1(PlayRun.scala:73)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:99)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:52)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) java.lang.reflect.InvocationTargetException
[error] Total time: 0 s, completed 31/ago/2016 23:23:25
This is my build.sbt:
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.22"
libraryDependencies ++= Seq(
//jdbc,
cache,
ws,
specs2 % Test
)
libraryDependencies ++= Seq(
"com.typesafe.play" %% "play-slick" % "2.0.0",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0"
)
//libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0"
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.21"
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.21"
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
libraryDependencies += "com.sksamuel.scrimage" %% "scrimage-core" % "2.1.0"
libraryDependencies += "com.sksamuel.scrimage" %% "scrimage-io-extra" % "2.1.0"
libraryDependencies += "com.sksamuel.scrimage" %% "scrimage-filters" % "2.1.0"
And here is my plugins.sbt:
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.0")
// web plugins
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.3")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.7")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.1.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.1.0")
//offline := true
I already tried disabling dependencies, but that did not solve the issue.
Any ideas?
Your Play version (2.4, set via sbt-plugin in plugins.sbt) is incompatible with the upgraded play-slick version; play-slick 2.0.0 requires Play 2.5.
// try this instead in your plugins.sbt (any 2.5.x release of the Play sbt plugin)
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.0")
I'm trying to instrument my server with Kamon, which requires the AspectJ weaver. I'm using sbt 0.13.8.
However, the options aren't being passed to the forked process.
I've looked here:
https://github.com/eigengo/activator-akka-aspectj/blob/master/build.sbt
and here:
http://www.scala-sbt.org/0.13/docs/Forking.html
And this is my build.sbt:
import sbt.Keys._
name := """myApp"""
version := "0.0.1"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
//jdbc, do not enable this when using slick
cache,
ws,
specs2 % Test,
"com.typesafe.akka" %% "akka-contrib" % "2.4.+",
"org.scalatest" % "scalatest_2.11" % "2.2.4" % "test",
"org.scalatestplus" %% "play" % "1.4.0-M3" % "test",
"com.github.seratch" %% "awscala" % "0.5.+",
"com.typesafe.play" %% "play-slick" % "1.1.1",
"com.typesafe.play" %% "play-slick-evolutions" % "1.1.1",
"mysql" % "mysql-connector-java" % "5.1.+",
"commons-net" % "commons-net" % "3.3",
"net.sourceforge.htmlcleaner" % "htmlcleaner" % "2.15",
"io.strongtyped" %% "active-slick" % "0.3.3",
"org.aspectj" % "aspectjweaver" % "1.8.8",
"org.aspectj" % "aspectjrt" % "1.8.8",
"io.kamon" %% "kamon-core" % "0.5.+",
// "io.kamon" %% "kkamon-system-metrics" % "0.5.+",
"io.kamon" %% "kamon-scala" % "0.5.+",
// "io.kamon" %% "kamon-akka" % "0.5.+",
"io.kamon" %% "kamon-datadog" % "0.5.+"
)
resolvers ++= Seq(
"scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
)
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
javaOptions in run ++= Seq(
  // each JVM option must be its own element when forking; note -Xmx2G, not -Xmx:2G
  "-javaagent:" + System.getProperty("user.home") + "/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.8.jar",
  "-Xmx2G"
)
fork in run := true
connectInput in run := true
I've tried running the app using ./activator start as well as ./activator stage and then running the script.
What am I doing wrong?
Thanks!
The production application should be configurable at deployment time. Here is an example start script:
PARAMETERS="-Dconfig.file=conf/production.conf -Dlogger.file=conf/prod-logger.xml"
PARAMETERS="$PARAMETERS -Dhttp.port=9000"
PARAMETERS="$PARAMETERS -J-Xmx8g -J-Xms8g -J-server -J-verbose:gc -J-Xloggc:../logs/portal/gc.log -J-XX:+PrintGCDateStamps"
nohup bin/myApp $PARAMETERS &
For more details, see the Play documentation on Production Configuration.
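If you want the agent baked into the packaged start script instead of passed by hand, a sketch (assuming sbt-native-packager's JavaAppPackaging, which Play's stage and dist tasks build on) would be to add the options in build.sbt under the Universal scope; the -J prefix marks them as JVM options for the generated launcher, and the weaver path is just the Ivy cache location from your question:
// Sketch for build.sbt, assuming sbt-native-packager / Play's stage and dist.
// These options end up in the generated launcher configuration (conf/application.ini).
javaOptions in Universal ++= Seq(
  "-J-Xmx2G",
  "-J-javaagent:" + System.getProperty("user.home") +
    "/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.8.jar"
)
Also note that javaOptions in run only applies to sbt run itself, which is why activator start and activator stage never see those options.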
I am using IntelliJ IDEA 13.1.2 with the Scala plugin version 0.36.431 on Windows 7 with sbt 0.13.1.
The following project definition build.sbt has no references to any Scala version other than 2.9.3.
import sbt._
import Keys._
import AssemblyKeys._
import NativePackagerKeys._
name := "simplews"
version := "0.1.0-SNAPSHOT"
val sparkVersion = "0.8.1-incubating"
scalaVersion := "2.9.3"
val akkaVersion = "2.0.5"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.spark" % "spark-examples_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.spark" % "spark-tools_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.scalatest" % "scalatest_2.9.3" % "1.9.2" % "test" withSources(),
"org.apache.spark" % "spark-repl_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.kafka" % "kafka" % "0.7.2-spark",
"com.thenewmotion.akka" % "akka-rabbitmq_2.9.2" % "0.0.2" % "compile->default" withSources(),
"com.typesafe.akka" % "akka-actor" % akkaVersion % "compile->default" withSources(),
"com.typesafe.akka" % "akka-testkit" % akkaVersion % "compile->default" withSources(),
"com.rabbitmq" % "amqp-client" % "3.0.1" % "compile->default" withSources(),
"org.specs2" % "specs2_2.9.3" % "1.12.4.1" % "compile->default" withSources(),
"com.nebhale.jsonpath" % "jsonpath" % "1.2" % "compile->default" withSources(),
"org.mockito" % "mockito-all" % "1.8.5",
"junit" % "junit" % "4.11"
)
packagerSettings
packageArchetype.java_application
resolvers ++= Seq(
"Apache repo" at "https://repository.apache.org/content/repositories/releases",
"Cloudera repo" at "https://repository.cloudera.com/artifactory/repo/org/apache/kafka/kafka/0.7.2-spark/",
"akka rabbitmq" at "http://nexus.thenewmotion.com/content/repositories/releases-public",
"Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
Resolver.mavenLocal
)
However, as seen in the screenshot, the debugger has jumped to Scala 2.10.2. Note: the debugger correctly goes to 2.9.3 for some other debugging.
Here is project/plugins.sbt:
resolvers += "sbt-plugins" at "http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-RC2")
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
EDIT: In order to reproduce, it is necessary to do a local mvn install of one or two libraries that are not available in any public repo.
mvn org.apache.maven.plugins:maven-install-plugin:2.5.1:install-file -Dfile=c:\shared\kafka-0.7.2-spark.jar -DgroupId=org.apache.kafka -DartifactId=kafka -Dversion=0.7.2-spark -Dpackaging=jar
In any case, I had not considered that someone (om-nom-nom!) would attempt an exact reproduction, so I had also omitted otherwise extraneous items like mergeStrategy and assemblyKeys.
A fully independent reproducible setup may be a while in coming; I have been under rather heavy demands here.
I've just upgraded to sbt 0.11.1, which doesn't seem to be fetching a certain dependency. Things worked fine before the upgrade.
I have this dependency:
"org.scalatra" %% "scalatra" % "2.1.0-SNAPSHOT",
And when I compile:
> update
[success] Total time: 0 s, completed Nov 18, 2011 5:44:16 PM
> compile
[info] Compiling 29 Scala sources and 1 Java source to
/home/yang/pod/sales/scala/target/scala-2.9.1/classes...
[error] /home/yang/pod/sales/scala/src/main/scala/com/pod/Web.scala:125:
not found: type ScalatraServlet
[error] class PodWeb extends ScalatraServlet with ScalateSupport with
FileUploadSupport {
[error] ^
[error] class file needed by ScalateSupport is missing.
[error] reference type ScalatraKernel of package org.scalatra refers
to nonexisting symbol.
[error] two errors found
[error] {file:/home/yang/pod/sales/scala/}pod/compile:compile:
Compilation failed
[error] Total time: 10 s, completed Nov 18, 2011 5:44:45 PM
The file seems to be missing:
$ ls /home/yang/.ivy2/cache/org.scalatra/scalatra_2.9.1/jars/
scalatra_2.9.1-2.1.0-SNAPSHOT-sources.jar
The file exists in the repo, though:
https://oss.sonatype.org/content/repositories/snapshots/org/scalatra/scalatra_2.9.1/2.1.0-SNAPSHOT/
This still happens even if I blow away ~/.ivy2/. Any hints as to what's going on?
Complete build.sbt below:
name := "pod"
version := "1.0"
scalaVersion := "2.9.1"
seq(coffeeSettings: _*)
seq(webSettings :_*)
seq(sbtprotobuf.ProtobufPlugin.protobufSettings: _*)
libraryDependencies ++= Seq(
"org.scalaquery" % "scalaquery_2.9.0" % "0.9.4",
"postgresql" % "postgresql" % "9.0-801.jdbc4", // % "runtime",
"com.jolbox" % "bonecp" % "0.7.1.RELEASE",
"ru.circumflex" % "circumflex-orm" % "2.1-SNAPSHOT",
"ru.circumflex" % "circumflex-core" % "2.1-SNAPSHOT",
"net.sf.ehcache" % "ehcache-core" % "2.4.3",
// snapshots needed for scala 2.9.0 support
"org.scalatra" %% "scalatra" % "2.1.0-SNAPSHOT",
"org.scalatra" %% "scalatra-scalate" % "2.1.0-SNAPSHOT",
"org.scalatra" %% "scalatra-fileupload" % "2.1.0-SNAPSHOT",
"org.fusesource.scalate" % "scalate-jruby" % "1.5.0",
"org.fusesource.scalamd" % "scalamd" % "1.5", // % runtime,
"org.mortbay.jetty" % "jetty" % "6.1.22",
"net.debasishg" % "sjson_2.9.0" % "0.12",
"com.lambdaworks" % "scrypt" % "1.2.0",
"org.mortbay.jetty" % "jetty" % "6.1.22" % "container",
// "org.bowlerframework" %% "core" % "0.4.1",
"net.sf.opencsv" % "opencsv" % "2.1",
"org.apache.commons" % "commons-math" % "2.2",
"org.apache.commons" % "commons-lang3" % "3.0",
"com.google.protobuf" % "protobuf-java" % "2.4.1",
"ch.qos.logback" % "logback-classic" % "0.9.29",
"org.scalatest" % "scalatest_2.9.0" % "1.6.1",
"com.h2database" % "h2" % "1.3.158",
"pentaho.weka" % "pdm-3.7-ce" % "SNAPSHOT",
// this line doesn't work due to sbt bug:
// https://github.com/harrah/xsbt/issues/263
// work around by manually downloading this into the lib/ directory
// "org.rosuda" % "jri" % "0.9-1" from "https://dev.partyondata.com/deps/jri-0.9-1.jar",
"net.java.dev.jna" % "jna" % "3.3.0",
"org.scalala" % "scalala_2.9.0" % "1.0.0.RC2-SNAPSHOT",
"rhino" % "js" % "1.7R2",
"junit" % "junit" % "4.9",
"org.apache.commons" % "commons-email" % "1.2",
"commons-validator" % "commons-validator" % "1.3.1",
"oro" % "oro" % "2.0.8", // validator depends on this
"javax.servlet" % "servlet-api" % "2.5" % "provided->default"
)
fork in run := true
javaOptions in run ++= Seq(
"-Xmx3G",
"-Djava.library.path=" + System.getenv("HOME") +
"/R/x86_64-pc-linux-gnu-library/2.13/rJava/jri:" +
"/usr/lib/R/site-library/rJava/jri"
)
//javaOptions in run ++= Seq(
// "-Dcom.sun.management.jmxremote",
// "-Dcom.sun.management.jmxremote.port=3000",
// "-Dcom.sun.management.jmxremote.authenticate=false",
// "-Dcom.sun.management.jmxremote.ssl=false"
//)
scalacOptions ++= Seq("-g:vars", "-deprecation", "-unchecked")
// needed for the scalatra snapshots
resolvers ++= Seq(
"POD" at "https://dev.partyondata.com/deps/",
"Scala-Tools Snapshots" at "http://scala-tools.org/repo-snapshots/",
"Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/",
"Sonatype OSS releases" at "http://oss.sonatype.org/content/repositories/releases",
"ScalaNLP" at "http://repo.scalanlp.org/repo",
"Pentaho" at "http://repo.pentaho.org/artifactory/pentaho/",
"FuseSource snapshots" at "http://repo.fusesource.com/nexus/content/repositories/snapshots",
"JBoss" at "https://repository.jboss.org/nexus/content/repositories/thirdparty-releases"
)
initialCommands in consoleQuick := """
import scalala.scalar._;
import scalala.tensor.::;
import scalala.tensor.mutable._;
import scalala.tensor.dense._;
import scalala.tensor.sparse._;
import scalala.library.Library._;
import scalala.library.LinearAlgebra._;
import scalala.library.Statistics._;
import scalala.library.Plotting._;
import scalala.operators.Implicits._;
//
import scala.collection.{mutable => mut}
import scala.collection.JavaConversions._
import ru.circumflex.orm._
import ru.circumflex.core._
"""
//
// sxr
//
// addCompilerPlugin("org.scala-tools.sxr" %% "sxr" % "0.2.7")
//
// scalacOptions <+= (scalaSource in Compile) { "-P:sxr:base-directory:" + _.getAbsolutePath }
After blowing away not just ~/.ivy2 but ~/.m2 and ~/.sbt as well, everything worked again.
Sometimes Ivy cache entries get corrupted. Simply remove ~/.ivy2/cache/org.scalatra/scalatra_2.9.1/jars/ and let sbt re-fetch the dependency from the remote repo. If that doesn't work, try removing the entire cache directory (~/.ivy2/cache).
I have had occasions where Ivy has got confused. I can't tell you why, unfortunately, but I have found that things work fine after deleting the entire ~/.ivy2 directory hierarchy. Clearly you'll have to download all your dependencies again, though :-(