I am trying to access files under the resources folder while running streamlets.
Below is my sbt structure.
I want to access files under streamlets -> src -> main -> resources.
I use
getClass.getClassLoader.getResource("file.txt")
Running locally it works, but when running in the cluster I get an error: no such file or directory.
Please assist.
Adding build.sbt:
lazy val root = project.in(file(".")).aggregate(streamlets, app)

lazy val streamlets = project.in(file("streamlets"))
  .enablePlugins(CloudflowAkkaPlugin, ScalaxbPlugin)
  .settings(
    name := s"$baseName-streamlets",
    libraryDependencies ++= Seq(
      "ch.qos.logback" % "logback-classic" % logbackVersion,
      "org.scalactic" %% "scalactic" % "3.1.1" % "test",
      "org.scalatest" %% "scalatest" % "3.1.1" % "test",
      "com.lightbend.akka" %% "akka-stream-alpakka-s3" % alpakkaVersion,
      "com.typesafe.akka" %% "akka-stream" % akkaVersion,
      "com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
      "com.typesafe.akka" %% "akka-http-xml" % akkaHttpVersion,
      "org.glassfish.jaxb" % "jaxb-runtime" % jaxbVersion // for JDK 11
    )
  )

lazy val app = project.in(file("app"))
  .enablePlugins(CloudflowApplicationPlugin)
  .settings(
    name := s"$baseName-dev",
    runLocalConfigFile := Some("app/src/main/resources/local.conf")
  )
  .dependsOn(streamlets)
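For reference, a likely cause (an assumption, since the full stack trace isn't shown): in the cluster the streamlet runs from the packaged jar, so the resource is an entry inside the archive, and the URL returned by getResource cannot be opened as a java.io.File, which produces exactly this kind of "no such file or directory" error. A minimal sketch that works in both environments is to read the resource as a stream instead:

import scala.io.Source

// getResourceAsStream works whether the resource is a file on disk (local
// run) or an entry inside the streamlets jar (cluster run); it returns
// null when the resource is missing from the classpath.
val in = getClass.getClassLoader.getResourceAsStream("file.txt")
require(in != null, "file.txt not found on the classpath")
val text =
  try Source.fromInputStream(in).mkString
  finally in.close()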
I have a new sbt application that I built using the Akka HTTP g8 template.
I am trying to add ReactiveMongo 1.0 to my build, and I am getting this error:
not found: https://repo1.maven.org/maven2/org/reactivemongo/reactivemongo_2.13/1.0/reactivemongo_2.13-1.0.pom
The documentation says this library is in Maven Central.
How can I determine which resolvers my project is currently using by default in sbt?
Is it possible that this library is not built for Scala 2.13.3 or 2.13.1?
How can I debug this type of error?
Thanks!
build.sbt:
import Dependencies._

lazy val akkaHttpVersion = "10.2.1"
lazy val akkaVersion = "2.6.10"

lazy val root = (project in file("."))
  .settings(
    inThisBuild(
      List(
        organization := "com.example",
        scalaVersion := "2.13.3"
      )
    ),
    name := "akka-http",
    libraryDependencies ++= Seq(
      "com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
      "com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpVersion,
      "com.typesafe.akka" %% "akka-actor-typed" % akkaVersion,
      "com.typesafe.akka" %% "akka-stream" % akkaVersion,
      "ch.qos.logback" % "logback-classic" % "1.2.3",
      "com.softwaremill.macwire" %% "macros" % "2.3.3" % "provided",
      "com.softwaremill.macwire" %% "util" % "2.3.3" % "provided",
      "com.github.blemale" %% "scaffeine" % "3.1.0" % "compile",
      "org.typelevel" %% "cats-core" % "2.1.1",
      "com.lihaoyi" %% "scalatags" % "0.8.2",
      "com.github.pureconfig" %% "pureconfig" % "0.13.0",
      "org.reactivemongo" %% "reactivemongo" % "1.0",
      "com.typesafe.akka" %% "akka-http-testkit" % akkaHttpVersion % Test,
      "com.typesafe.akka" %% "akka-actor-testkit-typed" % akkaVersion % Test,
      "org.scalatest" %% "scalatest" % "3.0.8" % Test
    )
  )
  .enablePlugins(JavaAppPackaging)
Can you try replacing "org.reactivemongo" %% "reactivemongo" % "1.0" with "org.reactivemongo" %% "reactivemongo" % "1.0.0" % "provided"?
I copied it from the Maven Repository: https://mvnrepository.com/artifact/org.reactivemongo/reactivemongo_2.13/1.0.0
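The failing URL in the question shows why: sbt is asking Maven Central for version "1.0", but the artifact was published under the three-part version "1.0.0", so the .pom genuinely does not exist at that path. A minimal sketch of the fix (the % "provided" scope suggested above is only needed if something else supplies the jar at runtime; plain compile scope is shown here):

// Use the full three-part version that actually exists on Maven Central.
libraryDependencies += "org.reactivemongo" %% "reactivemongo" % "1.0.0"

As for the resolver question: from the sbt shell, show fullResolvers prints the complete resolver chain the project uses (Maven Central is included by default).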
I am in a situation where I need to specify a custom resolver for my sbt project, but only to download one or two dependencies. I want all other dependencies to be fetched from the Maven repository.
Here is my build.sbt file:
...Project definition...
resolvers := Seq(
  "Maven" at "https://repo1.maven.org/"
)

// Akka dependencies
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % akkaActorsVersion,
  "com.typesafe.akka" %% "akka-testkit" % akkaActorsVersion % Test,
  "com.typesafe.akka" %% "akka-stream" % akkaStreamsVersion,
  "com.typesafe.akka" %% "akka-stream-testkit" % akkaStreamsVersion % Test,
  "com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
  "com.typesafe.akka" %% "akka-http-testkit" % akkaHttpVersion % Test,
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.3.0",
  "com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpVersion,
  "io.spray" %% "spray-json" % "1.3.5",
  "de.heikoseeberger" %% "akka-http-circe" % "1.23.0",
  "io.circe" %% "circe-generic" % "0.10.0",
  "com.pauldijou" %% "jwt-core" % "0.13.0",
  "com.pauldijou" %% "jwt-circe" % "0.13.0",
  "org.slf4j" % "slf4j-simple" % "1.6.4",
  "com.microsoft.azure" % "azure-storage" % "8.4.0",
  "com.datastax.cassandra" % "cassandra-driver-extras" % "3.1.4",
  "io.jvm.uuid" %% "scala-uuid" % "0.3.0",
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  "org.cassandraunit" % "cassandra-unit" % "3.1.1.0" % "test",
  "io.monix" %% "monix" % "3.0.0-8084549",
  "org.bouncycastle" % "bcpkix-jdk15on" % "1.48"
)

resolvers := Seq("Artifactory" at "http://10.3.1.6:8081/artifactory/libs-release-local/")
credentials += Credentials("Artifactory Realm", "10.3.1.6", ARTIFACTORY_USER, ARTIFACTORY_PASSWORD)

libraryDependencies ++= Seq(
  "com.org" % "common-layer_2.11" % "0.3"
)
However, the build fails with errors saying that sbt is trying to fetch libraries from Artifactory instead of from Maven.
For example, the Cassandra driver dependency:
unresolved dependency: com.datastax.cassandra#cassandra-driver-extras;3.1.4: Artifactory: unable to get resource for com/datastax/cassandra#cassandra-driver-extras;3.1.4: res=http://10.3.1.6:8081/artifactory/libs-release-local/com/datastax/cassandra/cassandra-driver-extras/3.1.4/cassandra-driver-extras-3.1.4.pom
I have searched the internet and the documentation, and I don't see a clear way to handle this, which surprises me because it seems like a common problem.
Any ideas about how I could enforce the priority/ordering of resolvers in sbt?
Please note that when you do
resolvers := Seq("resolver" at "https://path")
you are overriding the existing user-defined additional resolvers. Therefore, if you do:
resolvers := Seq("resolver1" at "https://path1")
resolvers := Seq("resolver2" at "https://path2")
you end up with only resolver2.
In order to have both resolvers, you need to do something like:
resolvers ++= Seq(
  "resolver1" at "https://path1",
  "resolver2" at "https://path2"
)
sbt searches for dependencies in the order of the given resolvers. In the example above, it will look in resolver1 first, and only if the artifact is not found there will it move on to resolver2.
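Applied to the build in the question, that means adding both repositories once, with Maven Central first (a sketch; note that Maven Central's root is actually https://repo1.maven.org/maven2/, not https://repo1.maven.org/):

resolvers ++= Seq(
  // Searched first: Maven Central, for the bulk of the dependencies.
  "Maven" at "https://repo1.maven.org/maven2/",
  // Fallback: the internal repository, for the one or two private artifacts.
  "Artifactory" at "http://10.3.1.6:8081/artifactory/libs-release-local/"
)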
Another thing you need to know is that sbt has predefined resolvers.
You can read more about sbt resolvers at: https://www.scala-sbt.org/1.x/docs/Resolvers.html
I want to create a project with Akka and Spark. I added the dependencies below, plus some others. Will these dependencies cause any problems when using Spark?
I have the sbt file below:
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.7"

lazy val commonSettings = Seq(
  organization := "com.bitool.analytics",
  scalaVersion := "2.11.12",
  libraryDependencies ++= Seq(
    "org.scala-lang.modules" %% "scala-async" % "0.9.6",
    "com.softwaremill.macwire" %% "macros" % "2.3.0",
    "com.softwaremill.macwire" %% "macrosakka" % "2.3.0",
    "com.typesafe.akka" %% "akka-http" % "10.0.6",
    "io.swagger" % "swagger-jaxrs" % "1.5.19",
    "com.github.swagger-akka-http" %% "swagger-akka-http" % "0.9.1",
    "io.circe" %% "circe-generic" % "0.8.0",
    "io.circe" %% "circe-literal" % "0.8.0",
    "io.circe" %% "circe-parser" % "0.8.0",
    "io.circe" %% "circe-optics" % "0.8.0",
    "org.scalafx" %% "scalafx" % "8.0.144-R12",
    "org.scalafx" %% "scalafxml-core-sfx8" % "0.4",
    "org.apache.spark" %% "spark-core" % "2.3.0",
    "org.apache.spark" %% "spark-sql" % "2.3.0",
    "org.apache.spark" %% "spark-hive" % "2.3.0",
    "org.scala-lang" % "scala-xml" % "2.11.0-M4",
    "mysql" % "mysql-connector-java" % "6.0.5"
  )
)

lazy val root = (project in file("."))
  .settings(commonSettings: _*)
  .settings(
    name := "BITOOL-1.0"
  )

ivyScala := ivyScala.value map {
  _.copy(overrideScalaVersion = true)
}

fork in run := true
and below is my Spark code:

import java.io.File
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

private val warehouseLocation = new File("spark-warehouse").getAbsolutePath

val conf = new SparkConf()
conf.setMaster("local[4]")
conf.setAppName("Bitool")
conf.set("spark.sql.warehouse.dir", warehouseLocation)

val SPARK = SparkSession
  .builder().config(conf).enableHiveSupport()
  .getOrCreate()
val SPARK_CONTEXT = SPARK.sparkContext
When I try to execute this, it creates the metastore_db folder, but the spark-warehouse folder is not created.
This directory is not created by getOrCreate. You can check this in the Spark source code: getOrCreate delegates its actions to SparkSession.getOrCreate, which is just a setter and does not touch the file system. All the internal tests and CliSuite prematurely initialize the directory with a snippet like val warehousePath = Utils.createTempDir().
In actual user code, by contrast, you have to perform at least one data modification operation to materialize your warehouse directory. Try running something like the following right after your code, then check the warehouse directory on the hard drive again:
import SPARK.implicits._
import SPARK.sql
sql("DROP TABLE IF EXISTS test")
sql("CREATE TABLE IF NOT EXISTS test (key INT, value STRING) USING hive")
I'm trying to instrument my server with Kamon, which requires the AspectJ weaver. I'm using sbt 0.13.8.
However, the options aren't being passed to the forked process.
I've looked here:
https://github.com/eigengo/activator-akka-aspectj/blob/master/build.sbt
and here:
http://www.scala-sbt.org/0.13/docs/Forking.html
And this is my build.sbt:
import sbt.Keys._
name := """myApp"""
version := "0.0.1"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  // jdbc, do not enable this when using slick
  cache,
  ws,
  specs2 % Test,
  "com.typesafe.akka" %% "akka-contrib" % "2.4.+",
  "org.scalatest" % "scalatest_2.11" % "2.2.4" % "test",
  "org.scalatestplus" %% "play" % "1.4.0-M3" % "test",
  "com.github.seratch" %% "awscala" % "0.5.+",
  "com.typesafe.play" %% "play-slick" % "1.1.1",
  "com.typesafe.play" %% "play-slick-evolutions" % "1.1.1",
  "mysql" % "mysql-connector-java" % "5.1.+",
  "commons-net" % "commons-net" % "3.3",
  "net.sourceforge.htmlcleaner" % "htmlcleaner" % "2.15",
  "io.strongtyped" %% "active-slick" % "0.3.3",
  "org.aspectj" % "aspectjweaver" % "1.8.8",
  "org.aspectj" % "aspectjrt" % "1.8.8",
  "io.kamon" %% "kamon-core" % "0.5.+",
  // "io.kamon" %% "kamon-system-metrics" % "0.5.+",
  "io.kamon" %% "kamon-scala" % "0.5.+",
  // "io.kamon" %% "kamon-akka" % "0.5.+",
  "io.kamon" %% "kamon-datadog" % "0.5.+"
)

resolvers ++= Seq(
  "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
)
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
javaOptions in run += "-javaagent:" + System.getProperty("user.home") + "/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.8.jar -Xmx:2G"
fork in run := true
connectInput in run := true
I've tried running the app using ./activator start as well as ./activator stage and then running the script.
What am I doing wrong?
Thanks!
The production application should be configurable at deployment time. This is my example of a start script:
PARAMETERS="-Dconfig.file=conf/production.conf -Dlogger.file=conf/prod-logger.xml"
PARAMETERS="$PARAMETERS -Dhttp.port=9000"
PARAMETERS="$PARAMETERS -J-Xmx8g -J-Xms8g -J-server -J-verbose:gc -J-Xloggc:../logs/portal/gc.log -J-XX:+PrintGCDateStamps"
nohup bin/myApp $PARAMETERS &
For more details see Production Configuration
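As for why the agent never reaches the forked run JVM: sbt passes each element of javaOptions to the JVM as a single argument, so packing the -javaagent flag and the heap setting into one string cannot work, and -Xmx:2G is not valid HotSpot syntax anyway (it is -Xmx2G). A minimal corrected sketch for fork in run:

javaOptions in run ++= Seq(
  // Each JVM option must be its own element of the Seq.
  "-javaagent:" + System.getProperty("user.home") + "/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.8.jar",
  "-Xmx2G"
)

For activator start or the staged script, javaOptions in run is not consulted at all; pass JVM flags to the generated launcher with the -J prefix instead, exactly as the start script above does with -J-Xmx8g.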
I'm using this template to develop a microservice:
http://www.typesafe.com/activator/template/activator-service-container-tutorial
My sbt file is like this:
import sbt._
import Keys._
name := "activator-service-container-tutorial"
version := "1.0.1"
scalaVersion := "2.11.6"
crossScalaVersions := Seq("2.10.5", "2.11.6")
resolvers += "Scalaz Bintray Repo" at "https://dl.bintray.com/scalaz/releases"
libraryDependencies ++= {
  val containerVersion = "1.0.1"
  val configVersion = "1.2.1"
  val akkaVersion = "2.3.9"
  val liftVersion = "2.6.2"
  val sprayVersion = "1.3.3"
  Seq(
    "com.github.vonnagy" %% "service-container" % containerVersion,
    "com.github.vonnagy" %% "service-container-metrics-reporting" % containerVersion,
    "com.typesafe" % "config" % configVersion,
    "com.typesafe.akka" %% "akka-actor" % akkaVersion exclude ("org.scala-lang", "scala-library"),
    "com.typesafe.akka" %% "akka-slf4j" % akkaVersion exclude ("org.slf4j", "slf4j-api") exclude ("org.scala-lang", "scala-library"),
    "ch.qos.logback" % "logback-classic" % "1.1.3",
    "io.spray" %% "spray-can" % sprayVersion,
    "io.spray" %% "spray-routing" % sprayVersion,
    "net.liftweb" %% "lift-json" % liftVersion,
    "com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test",
    "io.spray" %% "spray-testkit" % sprayVersion % "test",
    "junit" % "junit" % "4.12" % "test",
    "org.scalaz.stream" %% "scalaz-stream" % "0.7a" % "test",
    "org.specs2" %% "specs2-core" % "3.5" % "test",
    "org.specs2" %% "specs2-mock" % "3.5" % "test",
    "com.twitter" %% "finagle-http" % "6.25.0",
    "com.twitter" %% "bijection-util" % "0.7.2"
  )
}
scalacOptions ++= Seq(
  "-unchecked",
  "-deprecation",
  "-Xlint",
  "-Ywarn-dead-code",
  "-language:_",
  "-target:jvm-1.7",
  "-encoding", "UTF-8"
)
crossPaths := false
parallelExecution in Test := false
assemblyJarName in assembly := "santo.jar"
mainClass in assembly := Some("Service")
The project compiles fine!
But when I run assembly, the terminal shows me this:
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] /path/.ivy2/cache/io.dropwizard.metrics/metrics-core/bundles/metrics-core-3.1.1.jar:com/codahale/metrics/ConsoleReporter$1.class
[error] /path/.ivy2/cache/com.codahale.metrics/metrics-core/bundles/metrics-core-3.0.1.jar:com/codahale/metrics/ConsoleReporter$1.class
What options do I have to fix it?
Thanks
The issue, as it seems, is that a transitive dependency of one of your dependencies is pulling in two different versions of metrics-core. The best thing to do would be to use the right library dependencies so that you end up with a single version of this library. If it is difficult to figure out where the duplicates come from, use https://github.com/jrudolph/sbt-dependency-graph.
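For example, the plugin can be added to project/plugins.sbt (the version below is one published release; check the plugin's README for the right one for your sbt version):

addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

Then run dependencyTree from the sbt shell and look for the paths that pull in com.codahale.metrics and io.dropwizard.metrics respectively.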
If it is not possible to get to a single version, then you would most likely have to go down the exclude route. I assume this only works if all the required versions are compatible with each other.
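A sketch of the exclude route for the error above, assuming you keep the newer io.dropwizard.metrics artifact and drop the old com.codahale.metrics one (which of your dependencies drags it in is a guess until dependencyTree confirms it):

libraryDependencies ~= { deps =>
  // Strip the legacy metrics-core coordinates from every dependency's
  // transitive graph; the io.dropwizard.metrics jar still provides the
  // com.codahale.metrics classes the code was compiled against.
  deps.map(_.exclude("com.codahale.metrics", "metrics-core"))
}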