I am trying to run the sample TestNG test in a Scala sbt project. I am using the dependencies below:
"org.apache.spark" %% "spark-core" % sparkVer % Provided,
"org.apache.hadoop" % "hadoop-common" % sparkVer % Provided,
"org.apache.spark" % "spark-sql_2.11" % sparkVer % Provided,
"org.apache.spark" % "spark-hive_2.11" % sparkVer,
"org.scalactic" %% "scalactic" % scalatestVer,
"org.scalatest" %% "scalatest" % scalatestVer % Test,
"info.cukes" % "cucumber-scala_2.11" % cucumberVer,
"info.cukes" % "cucumber-junit" % cucumberVer,
"junit" % "junit" % "4.12",
"org.scalatest" % "scalatest_2.11" % "2.0" % "test",
"org.scalactic" %% "scalactic" % "3.0.1",
"org.scalatest" %% "scalatest" % "3.0.1" % "test",
[Screenshot: the ExampleSuite class, showing that I am not able to use it correctly.]
Here is the link I followed for this case: http://www.scalatest.org/getting_started_with_testng_in_scala
I am also using sbt.version = 0.13.16.
Any help is really appreciated.
Adding the dependency "org.testng" % "testng" % "6.14.3" % Test and installing the "Create TestNG XML" plugin from the marketplace resolved the issue.
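For reference, here is a minimal sketch of a TestNG-style suite that should run once the testng dependency above is in place (the class and test method are made up for illustration; org.scalatest.testng.TestNGSuite is the ScalaTest 3.0.x TestNG integration described on the linked getting-started page):
import org.scalatest.testng.TestNGSuite
import org.testng.annotations.Test

// Hypothetical example suite: one TestNG-annotated method using ScalaTest assertions.
class ExampleSuite extends TestNGSuite {
  @Test
  def verifySum(): Unit = {
    assert(1 + 1 === 2)
  }
}
Running sbt test should then pick the suite up through ScalaTest's runner.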
I am in a situation where I need to specify a custom resolver for my sbt project, but only for one or two dependencies. I want all the other dependencies to be fetched from the central Maven repository.
Here is my build.sbt file:
...Project definition...
resolvers := Seq(
"Maven" at "https://repo1.maven.org/"
)
//Akka dependencies
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % akkaActorsVersion,
"com.typesafe.akka" %% "akka-testkit" % akkaActorsVersion % Test,
"com.typesafe.akka" %% "akka-stream" % akkaStreamsVersion,
"com.typesafe.akka" %% "akka-stream-testkit" % akkaStreamsVersion % Test,
"com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
"com.typesafe.akka" %% "akka-http-testkit" % akkaHttpVersion % Test,
"com.datastax.cassandra" % "cassandra-driver-core" % "3.3.0",
"com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpVersion,
"io.spray" %% "spray-json" % "1.3.5",
"de.heikoseeberger" %% "akka-http-circe" % "1.23.0",
"io.circe" %% "circe-generic" % "0.10.0",
"com.pauldijou" %% "jwt-core" % "0.13.0",
"com.pauldijou" %% "jwt-circe" % "0.13.0",
"org.slf4j" % "slf4j-simple" % "1.6.4",
"com.microsoft.azure" % "azure-storage" % "8.4.0",
"com.datastax.cassandra" % "cassandra-driver-extras" % "3.1.4",
"io.jvm.uuid" %% "scala-uuid" % "0.3.0",
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
"org.cassandraunit" % "cassandra-unit" % "3.1.1.0" % "test",
"io.monix" %% "monix" % "3.0.0-8084549",
"org.bouncycastle" % "bcpkix-jdk15on" % "1.48"
)
resolvers := Seq("Artifactory" at "http://10.3.1.6:8081/artifactory/libs-release-local/")
credentials += Credentials("Artifactory Realm", "10.3.1.6", ARTIFACTORY_USER, ARTIFACTORY_PASSWORD)
libraryDependencies ++=
Seq(
"com.org" % "common-layer_2.11" % "0.3",
)
However, the build fails with errors saying that sbt is trying to fetch the libraries from Artifactory instead of from Maven. For example, for the Cassandra driver dependency:
unresolved dependency: com.datastax.cassandra#cassandra-driver-extras;3.1.4: Artifactory: unable to get resource for com/datastax/cassandra#cassandra-driver-extras;3.1.4: res=http://10.3.1.6:8081/artifactory/libs-release-local/com/datastax/cassandra/cassandra-driver-extras/3.1.4/cassandra-driver-extras-3.1.4.pom
I have searched the internet and the documentation and I don't see a clear way to handle this, which surprises me because it seems like a common problem.
Any ideas on how I could enforce the priority/ordering of resolvers in sbt?
Please note that when you write
resolvers := Seq("resolver" at "https://path")
you are overriding the existing user-defined resolvers. Therefore, if you write:
resolvers := Seq("resolver1" at "https://path1")
resolvers := Seq("resolver2" at "https://path2")
you end up with only resolver2.
In order to have both resolvers, you need to do something like:
resolvers ++= Seq(
"resolver1" at "https://path1",
"resolver2" at "https://path2"
)
sbt searches for dependencies in the order of the given resolvers. In the example above, it will search resolver1 first, and only if the dependency is not found there will it move on to resolver2.
Another thing you need to know is that sbt has predefined resolvers (Maven Central among them).
You can read more about sbt resolvers at: https://www.scala-sbt.org/1.x/docs/Resolvers.html
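Applied to the build above, a minimal sketch of the fix would look like this (reusing the asker's Artifactory URL and credential placeholders; note that Maven Central's actual root is under /maven2/):
resolvers ++= Seq(
  "Maven" at "https://repo1.maven.org/maven2/",                             // searched first
  "Artifactory" at "http://10.3.1.6:8081/artifactory/libs-release-local/"  // fallback for the in-house artifacts
)
credentials += Credentials("Artifactory Realm", "10.3.1.6", ARTIFACTORY_USER, ARTIFACTORY_PASSWORD)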
Similar to:
Why is UNRESOLVED DEPENDENCIES error with com.typesafe.slick#slick_2.11;2.0.2: not found?
I get the following error message:
events/*:update) sbt.ResolveException: unresolved dependency: com.typesafe.slick#slick-extensions_2.11;3.1.0: not found
My Scala build.sbt has:
lazy val events = (project in file("modules/events")).settings(commonSettings).
settings(Seq(libraryDependencies ++= Seq(
cache,
ws,
evolutions,
specs2,
"com.softwaremill.macwire" %% "macros" % "2.2.5" % "provided",
"com.softwaremill.macwire" %% "util" % "2.2.0",
"ch.qos.logback" % "logback-classic" % "1.1.8",
"de.svenkubiak" % "jBCrypt" % "0.4.1",
"org.scalatestplus.play" %% "scalatestplus-play" % "1.5.0" % "test",
"org.mockito" % "mockito-core" % "2.0.45-beta" % "test",
"mysql" % "mysql-connector-java" % "5.1.34",
"org.postgresql" % "postgresql" % "9.4.1207.jre7",
"com.vividsolutions" % "jts" % "1.13",
"com.typesafe.play" % "play-slick_2.11" % "2.0.2",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0",
"com.github.tminglei" %% "slick-pg" % "0.12.1",
"com.github.tminglei" %% "slick-pg_date2" % "0.12.1",
"com.github.tminglei" %% "slick-pg_play-json" % "0.12.1",
"com.typesafe.slick" %% "slick-extensions" % "3.1.0",
"org.scalikejdbc" %% "scalikejdbc" % "2.4.2",
"org.scalikejdbc" %% "scalikejdbc-config" % "2.4.2",
"joda-time" % "joda-time" % "2.9.4",
"com.typesafe.play" %% "play-json" % "2.5.9",
"io.circe" %% "circe-core" % circeVersion,
"io.circe" %% "circe-generic" % circeVersion,
"io.circe" %% "circe-parser" % circeVersion,
"io.circe" %% "circe-jawn" % circeVersion,
"com.github.julien-truffaut" %% "monocle-core" % monocleVersion,
"com.github.julien-truffaut" %% "monocle-macro" % monocleVersion,
"com.github.julien-truffaut" %% "monocle-law" % monocleVersion % "test",
"com.microsoft.sqlserver" % "mssql-jdbc" % "7.4.1.jre8"
)))
I am using Scala 2.11.9. I also tried adding
resolvers += "typesafe" at "http://repo.typesafe.com/typesafe/releases/"
but no luck. Any suggestions, please?
Actually, slick-extensions is not located in http://repo.typesafe.com/typesafe/releases/. If you look there, you will see that com/typesafe/slick/slick-extensions_2.11/ is empty.
However, it can be found at https://typesafe.bintray.com/commercial-maven-releases/com/typesafe/slick/slick-extensions_2.11/3.1.0/
And here is some information about slick-extensions: https://index.scala-lang.org/slick/slick/slick-extensions/3.1.0.
The recommended configuration there is:
libraryDependencies += "com.typesafe.slick" %% "slick-extensions" % "3.1.0"
resolvers += Resolver.bintrayRepo("typesafe", "commercial-maven-releases")
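If you prefer an explicit URL over the Resolver.bintrayRepo helper, the resolver line above should be equivalent to the following, assuming Bintray's standard repository layout:
resolvers += "typesafe-commercial-releases" at "https://dl.bintray.com/typesafe/commercial-maven-releases"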
I have an RDD[some case class] and I want to convert it to a CSV file. I am using Spark 1.6 and Scala 2.10.5.
stationDetails.toDF.coalesce(1).write.format("com.databricks.spark.csv").save("data/myData.csv")
gives the error:
Exception in thread "main" java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.csv. Please find packages at http://spark-packages.org
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:219)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:148)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:139)
I am not able to work out the right dependency for "com.databricks.spark.csv" for my build.sbt file.
The dependencies I added in my build.sbt file are:
libraryDependencies ++= Seq(
"org.apache.commons" % "commons-csv" % "1.1",
"com.univocity" % "univocity-parsers" % "1.5.1",
"org.slf4j" % "slf4j-api" % "1.7.5" % "provided",
"org.scalatest" %% "scalatest" % "2.2.1" % "test",
"com.novocode" % "junit-interface" % "0.9" % "test"
)
I also tried this:
stationDetails.toDF.coalesce(1).write.csv("data/myData.csv")
but it gives the error: csv cannot be resolved.
Please change your build.sbt to the following:
libraryDependencies ++= Seq(
"org.apache.commons" % "commons-csv" % "1.1",
"com.databricks" %% "spark-csv" % "1.4.0",
"com.univocity" % "univocity-parsers" % "1.5.1",
"org.slf4j" % "slf4j-api" % "1.7.5" % "provided",
"org.scalatest" %% "scalatest" % "2.2.1" % "test",
"com.novocode" % "junit-interface" % "0.9" % "test"
)
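With spark-csv on the classpath, a minimal usage sketch for Spark 1.6 would look like this (stationDetails and sc stand for the asker's existing RDD and SparkContext; the header option is optional). As a side note, the .write.csv(...) shorthand only exists from Spark 2.0 onward, which is why it cannot be resolved on 1.6:
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)  // sc is the existing SparkContext
import sqlContext.implicits._        // enables .toDF on RDDs of case classes

stationDetails.toDF()
  .coalesce(1)                             // produce a single part file
  .write
  .format("com.databricks.spark.csv")
  .option("header", "true")                // write a header row
  .save("data/myData.csv")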
I am using sbt to set up Akka Persistence, but it is failing with this error:
Detected java.lang.NoSuchMethodError error, which MAY be caused by incompatible Akka versions on the classpath. Please note that a given Akka version MUST be the same across all modules of Akka that you are using, e.g. if you use akka-actor [2.5.6 (resolved from current classpath)] all other core Akka modules MUST be of the same version. External projects like Alpakka, Persistence plugins or Akka HTTP etc. have their own version numbers - please make sure you're using a compatible set of libraries.
Uncaught error from thread [TopsActivitiesSystem-akka.actor.default-dispatcher-16]: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for for ActorSystem[TopsActivitiesSystem]
Detected java.lang.NoSuchMethodError error, which MAY be caused by incompatible Akka versions on the classpath. Please note that a given Akka version MUST be the same across all modules of Akka that you are using, e.g. if you use akka-actor [2.5.6 (resolved from current classpath)] all other core Akka modules MUST be of the same version. External projects like Alpakka, Persistence plugins or Akka HTTP etc. have their own version numbers - please make sure you're using a compatible set of libraries.
Uncaught error from thread [TopsActivitiesSystem-akka.actor.default-dispatcher-3]: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for for ActorSystem[TopsActivitiesSystem]
java.lang.NoSuchMethodError: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V
My sbt build configuration is:
scalaVersion := "2.12.4"
lazy val root = (project in file("."))
.configs(IntegrationTest)
def akkaVersion = "2.5.6"
def akkaHttpVersion = "10.0.10"
def logbackVersion = "1.2.3"
def ItAndTest = "it, test"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-stream" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion,
"com.typesafe.akka" % "akka-cluster-metrics_2.12" % akkaVersion,
"com.redmart.fc" %% "fc-capacity-model" % "1.7.0",
"com.redmart.akka" %% "akka-downing" % "1.1.0",
"com.jsuereth" %% "scala-arm" % "2.0",
"com.typesafe" % "config" % "1.3.1",
"com.typesafe.scala-logging" %% "scala-logging" % "3.7.2",
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"ch.qos.logback" % "logback-classic" % logbackVersion,
"ch.qos.logback" % "logback-access" % logbackVersion,
"net.logstash.logback" % "logstash-logback-encoder" % "4.11",
"joda-time" % "joda-time" % "2.9.3",
"com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
"com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpVersion,
"com.typesafe.akka" %% "akka-persistence-cassandra" % "0.58",
// "com.typesafe.akka" %% "akka-cluster" % akkaVersion,
// "com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion,
"com.lightbend.akka" %% "akka-management-cluster-http" % "0.4",
// "org.iq80.leveldb" % "leveldb" % "0.9",
// "org.fusesource.leveldbjni" % "leveldbjni-all" % "1.8",
"io.spray" %% "spray-json" % "1.3.3",
"com.enragedginger" %% "akka-quartz-scheduler" % "1.6.1-akka-2.5.x",
// "com.softwaremill.macwire" %% "macros" % "2.2.2" % "provided",
// "com.softwaremill.macwire" %% "util" % "2.2.2",
"com.esotericsoftware" % "kryo" % "4.0.1",
"com.github.romix.akka" %% "akka-kryo-serialization" % "0.5.2",
"com.typesafe.akka" %% "akka-http-testkit" % akkaHttpVersion % ItAndTest,
"org.scalatest" %% "scalatest" % "3.0.4" % ItAndTest,
"org.mockito" % "mockito-core" % "2.10.0" % ItAndTest,
"com.github.dnvriend" %% "akka-persistence-inmemory" % "2.5.1.1" % ItAndTest
)
dependencyOverrides += "com.typesafe.akka" %% "akka-actor" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-stream" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-cluster" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-persistence" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-slf4j" % akkaVersion
dependencyOverrides += "com.typesafe.akka" % "akka-cluster-metrics_2.12" % akkaVersion
Can anyone explain why it's failing? I am sure I am overriding everything to the latest version, 2.5.6, using dependencyOverrides. I am using sbt 0.13.
Use the current version (0.5) of akka-management-cluster-http:
"com.lightbend.akka" %% "akka-management-cluster-http" % "0.5"
Version 0.4 depends on an older version of Akka.
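To verify which Akka versions actually end up on the classpath, sbt's built-in evicted task (available since 0.13.8) is a good first stop; the sbt-dependency-graph plugin is an optional extra suggested here, not something already in the asker's build:
// project/plugins.sbt (optional):
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

// Then, in the sbt shell:
//   > evicted          // built-in: lists modules whose versions were evicted
//   > dependencyTree   // from the plugin: the fully resolved dependency tree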
I am using IntelliJ IDEA 13.1.2 with the Scala plugin version 0.36.431 on Windows 7 with sbt 0.13.1.
The following project definition build.sbt has no references to any Scala version other than 2.9.3.
import sbt._
import Keys._
import AssemblyKeys._
import NativePackagerKeys._
name := "simplews"
version := "0.1.0-SNAPSHOT"
val sparkVersion = "0.8.1-incubating"
scalaVersion := "2.9.3"
val akkaVersion = "2.0.5"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.spark" % "spark-examples_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.spark" % "spark-tools_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.scalatest" % "scalatest_2.9.3" % "1.9.2" % "test" withSources(),
"org.apache.spark" % "spark-repl_2.9.3" % sparkVersion % "compile->default" withSources(),
"org.apache.kafka" % "kafka" % "0.7.2-spark",
"com.thenewmotion.akka" % "akka-rabbitmq_2.9.2" % "0.0.2" % "compile->default" withSources(),
"com.typesafe.akka" % "akka-actor" % akkaVersion % "compile->default" withSources(),
"com.typesafe.akka" % "akka-testkit" % akkaVersion % "compile->default" withSources(),
"com.rabbitmq" % "amqp-client" % "3.0.1" % "compile->default" withSources(),
"org.specs2" % "specs2_2.9.3" % "1.12.4.1" % "compile->default" withSources(),
"com.nebhale.jsonpath" % "jsonpath" % "1.2" % "compile->default" withSources(),
"org.mockito" % "mockito-all" % "1.8.5",
"junit" % "junit" % "4.11"
)
packagerSettings
packageArchetype.java_application
resolvers ++= Seq(
"Apache repo" at "https://repository.apache.org/content/repositories/releases",
"Cloudera repo" at "https://repository.cloudera.com/artifactory/repo/org/apache/kafka/kafka/0.7.2-spark/",
"akka rabbitmq" at "http://nexus.thenewmotion.com/content/repositories/releases-public",
"Local Repo" at Path.userHome.asFile.toURI.toURL + "/.m2/repository",
Resolver.mavenLocal
)
However, as seen in the screenshot, the debugger has jumped to Scala 2.10.2. Note: the debugger correctly goes to 2.9.3 for some other debugging sessions.
Here is project/plugins.sbt:
resolvers += "sbt-plugins" at "http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-RC2")
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
EDIT: In order to reproduce, it is necessary to do a local mvn install of one or two libraries that are not available in any public repo.
mvn org.apache.maven.plugins:maven-install-plugin:2.5.1:install-file -Dfile=c:\shared\kafka-0.7.2-spark.jar -DgroupId=org.apache.kafka -DartifactId=kafka -Dversion=0.7.2-spark -Dpackaging=jar
In any case, I had not considered that someone (om-nom-nom!) would attempt an exact reproduction, so I had also omitted otherwise extraneous items like mergeStrategy and assemblyKeys.
A fully independent reproducible setup may take a while; I have been under rather heavy demands here.