Use sbt to generate a webapp with ICEfaces

I'm new to sbt and want to generate a web application with JSF 2.0 (Mojarra) and ICEfaces, but I don't know how to write the build.sbt. I tried things like this:
libraryDependencies += "org.icefaces" % "icefaces" % "2.0.2"
libraryDependencies += "net.java" % "jsf-api" % "2.1.2"
libraryDependencies += "net.java" % "jsf-impl" % "2.1.2"
Maybe this is horribly wrong, because sbt can't find the module:
module not found: com.sun.faces#jsf-impl:2.1.1-b04/ivys/ivy.xml

resolvers += "java.net maven 2 repo" at "http://download.java.net/maven/2"
libraryDependencies += "org.icefaces" % "icefaces" % "2.0.2"
libraryDependencies += "com.sun.faces" % "jsf-api" % "2.1.2"
libraryDependencies += "com.sun.faces" % "jsf-impl" % "2.1.2"
This will only work with sbt 0.10+. Make sure you keep the blank lines between expressions.
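For clarity, here is a minimal complete build.sbt in that style (a sketch; the name and version are placeholder assumptions), with the blank line that old sbt requires between every pair of expressions:

name := "icefaces-webapp"

version := "1.0"

resolvers += "java.net maven 2 repo" at "http://download.java.net/maven/2"

libraryDependencies += "org.icefaces" % "icefaces" % "2.0.2"

libraryDependencies += "com.sun.faces" % "jsf-api" % "2.1.2"

libraryDependencies += "com.sun.faces" % "jsf-impl" % "2.1.2"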


Cannot pass arguments using sbt to gatling simulation

Regarding the "Gatling SBT execute a specific simulation" topic: is there any way to pass arguments to a simulation?
I've been trying to pass them from the CLI like:
sbt -Dx=1 -Dy=2 -Dz=3 "gatling:testOnly fooSimulation"
and:
sbt "-Dx=1 -Dy=2 -Dz=3 gatling:testOnly fooSimulation"
and all similar variations, but the result is just a null value.
I tried the same thing in the sbt shell, which I use as well, but with no success at all. Maybe my specific build.sbt configuration is the reason it doesn't work. Either way, I don't want to pass the arguments in a config file; they should be dynamic.
build.sbt
name := "Gatling"
version := "0.1"
scalaVersion := "2.12.11"
enablePlugins(GatlingPlugin)
fork := true
scalacOptions := Seq(
  "-encoding", "UTF-8", "-target:jvm-1.8", "-deprecation",
  "-feature", "-unchecked", "-language:implicitConversions", "-language:postfixOps")
libraryDependencies += "io.gatling.highcharts" % "gatling-charts-highcharts" % "3.3.1" % Test
libraryDependencies += "io.gatling" % "gatling-test-framework" % "3.3.1" % Test
libraryDependencies += "org.json4s" % "json4s-native_2.12" % "3.6.7" % Test
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % Test
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "7.2.2.jre8" % Test
libraryDependencies += "org.springframework.boot" % "spring-boot-starter" % "2.3.5.RELEASE" % Test
libraryDependencies += "com.typesafe" % "config" % "1.4.1" % Test
Test / javaOptions += "-DpropertiesFile=./src/test/resources/application.properties"
plugins.sbt
addSbtPlugin("io.gatling" % "gatling-sbt" % "3.2.0")
Example code:
class FooSimulation extends Simulation {
  before {
    println(s"x=${System.getProperty("x")}")
    println(s"y=${System.getProperty("y")}")
    println(s"z=${System.getProperty("z")}")
  }

  setUp(
    scenario("Foo")
      .exec( foo chain builder )
      .inject( foo injection )
  ).protocols( foo protocol )
}
Additionally, my sbt shell runs with the prefix sbt:gatling; maybe that is the reason?
sbt -Dx=1 -Dy=2 -Dz=3 "gatling:testOnly fooSimulation" is correct.
But the modern sbt syntax is sbt -Dx=1 -Dy=2 -Dz=3 "Gatling/testOnly fooSimulation".
If it doesn't work, you probably have a typo somewhere, or possibly your versions of sbt and Gatling are way too old and you should upgrade.
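One more thing worth checking (my own assumption, not part of the answer above): the build sets fork := true, and a forked test JVM does not automatically inherit -D flags given to the sbt launcher. A minimal sketch that forwards the question's x, y and z properties into the forked JVM, added to build.sbt:

// Forward selected system properties from the sbt JVM to the forked test JVM.
// The keys x, y, z are the ones used in the question; with the Gatling plugin,
// the Gatling / javaOptions scope may be needed instead of Test / javaOptions.
Test / javaOptions ++= Seq("x", "y", "z")
  .flatMap(key => sys.props.get(key).map(value => s"-D$key=$value"))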

sbt meta-build dependencies

I have a project/Generate.scala that generates some Scala code destined for sourceManaged. Generate.scala has its own dependencies. From the sbt documentation, it seems that those dependencies should go into project/build.sbt. When I tried that, sbt stopped resolving the plugins declared in project/plugins.sbt.
What's the right way to declare these dependencies? And how should I think about the meta-build conceptually? It looks like I misunderstand "sbt is recursive."
project/build.sbt:
scalaVersion in ThisBuild := "2.12.2"
resolvers += Resolver.sonatypeRepo("releases")
resolvers += Resolver.bintrayRepo("scalameta", "maven")
libraryDependencies += "org.scalameta" %% "scalameta" % "1.7.0"
project/plugins.sbt:
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "3.0.1")
addSbtPlugin("io.get-coursier" % "sbt-coursier" % "1.0.0-M15")
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.16")
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.6" exclude ("com.trueaccord.scalapb", "protoc-bridge_2.10"))
libraryDependencies += "com.trueaccord.scalapb" %% "compilerplugin-shaded" % "0.6.0-pre3"

Error: java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;

I get the error in the title. I have already researched it and found some similar questions, but their solutions don't work for me:
NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor conflicts on Elastic Search jar
Java elasticsearch client always null
https://github.com/elastic/elasticsearch/pull/7593
java.lang.NoSuchMethodError during Elastic search start
https://discuss.elastic.co/t/transportclient-in-2-1-x/38818/6
I'm using Scala as the programming language to create an API, and Elasticsearch as the database.
Here is my build.sbt:
name := "LearningByDoing"
version := "1.0"
scalaVersion := "2.10.5"
resolvers += "spray repo" at "http://repo.spray.io"
resolvers += "spray nightlies repo" at "http://nightlies.spray.io"
libraryDependencies += "io.spray" % "spray-json_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-client_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-testkit_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-routing_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-http_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-httpx_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-util_2.10" % "1.3.2"
libraryDependencies += "io.spray" % "spray-can_2.10" % "1.3.2"
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.12"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "2.3.1"
libraryDependencies += "com.sksamuel.elastic4s" % "elastic4s-streams_2.10" % "2.3.1"
libraryDependencies += "org.elasticsearch" % "elasticsearch-mapper-attachments" % "2.3.1"
libraryDependencies += "com.typesafe" % "config" % "1.2.1"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.1"
Here is my plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0-M4")
addSbtPlugin("com.typesafe.sbt" % "sbt-multi-jvm" % "0.3.9")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.8.0")
At the terminal I ran sbt clean compile test update package and everything works normally, but when I hit the API, the error above always appears.
It seems you have a conflicting Guava version, just like in the first link you mentioned. With a dependency-graph sbt plugin you can print the dependency tree and figure out which dependencies are clashing.
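A sketch of that diagnosis (the plugin named here is an assumption; the answer's original link is not preserved): MoreExecutors.directExecutor() first appeared in Guava 18.0, which is the version Elasticsearch 2.x expects, so an older Guava pulled in by another dependency would explain the NoSuchMethodError.

// project/plugins.sbt -- one dependency-inspection option
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

Then run sbt dependencyTree and look for an old com.google.guava:guava on the classpath; if one turns up, forcing the expected version with dependencyOverrides += "com.google.guava" % "guava" % "18.0" in build.sbt is one possible fix.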
The issue is that the Elasticsearch TCP (transport) client since 5.0 uses Netty 4.1, which is incompatible with Spray, which uses Netty 4. There is no workaround other than waiting for Spray to upgrade or switching to an Elasticsearch HTTP client.

Error adding libraryDependencies "io.prediction"

Good afternoon,
I would like to ask about an error I got when I tried to add the "io.prediction" libraryDependencies. Here's my code:
name := "SBTMaret2016"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += toGroupID("org.scalatest") % "scalatest_2.10" % "2.0" % "test" // (a)
libraryDependencies += toGroupID("io.prediction") %% "core" % "0.9.5" % "provided" // (b)
libraryDependencies += toGroupID("org.apache.spark") %% "spark-core" % "1.3.0" % "provided" // (c)
libraryDependencies += toGroupID("org.apache.spark") %% "spark-mllib" % "1.3.0" % "provided" // (d)
The first time, I added (a), (c) and (d), and it worked!
When I then re-add (b), it doesn't work. Any ideas? Thanks!
If you look at http://mvnrepository.com/artifact/io.prediction, there are no artifacts published for Scala 2.11. You can switch to Scala 2.10.6 or try to get the source and compile it for 2.11. The scalatest dependency is also incorrect; it should be toGroupID("org.scalatest") %% "scalatest". (Plus, you can remove all the toGroupID calls; the usual way to write a dependency is libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided".)
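Putting the answer's suggestions together, a sketch of the corrected build.sbt (assuming you switch to Scala 2.10.6 so the io.prediction artifacts resolve):

name := "SBTMaret2016"

version := "1.0"

// io.prediction 0.9.5 is only published for Scala 2.10
scalaVersion := "2.10.6"

libraryDependencies += "org.scalatest" %% "scalatest" % "2.0" % "test"
libraryDependencies += "io.prediction" %% "core" % "0.9.5" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.0" % "provided"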

Using sbt to download a specified list of jars

I have a list of JARs and I want sbt to download them into a specified destination directory. Is there a way/command to do this?
What I am trying to do is assemble the classpath jars for an external system like Spark. By default Spark adds some jars to the classpath, and my app depends on some additional jars beyond those. I don't want to build a fat jar; instead I need to package the dependent jars along with my own jar in a tarball.
My build.sbt
name := "app-jar"
scalaVersion := "2.10.5"
dependencyOverrides += "org.scala-lang" % "scala-library" % scalaVersion.value
scalacOptions ++= Seq("-unchecked", "-deprecation")
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.4.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.4.1"
// I want these jars from here
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0-M3"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.4.0-M3"
libraryDependencies += "com.google.protobuf" % "protobuf-java" % "2.6.1"
...
// To here in my tar ball
So far I have achieved this using a shell script. I want to know if there is a way to do the same with sbt.
Add sbt-pack to your project/plugins.sbt (or create it):
addSbtPlugin("org.xerial.sbt" % "sbt-pack" % "0.7.9")
Add packAutoSettings to your build.sbt and then run:
sbt pack
In target/pack/lib you will find all jars (with dependencies).
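For completeness, a minimal build.sbt wiring for the 0.7.x line of the plugin named above (a sketch; in that version, packAutoSettings pulls in the plugin's settings and auto-detects the main class):

packAutoSettings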
Update
Add a new task to sbt:
val libraries = Seq(
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.4.0-M3",
  "com.datastax.spark" %% "spark-cassandra-connector-java" % "1.4.0-M3",
  "com.google.protobuf" % "protobuf-java" % "2.6.1"
)

libraryDependencies ++= libraries

lazy val removeNotNeeded = taskKey[Unit]("Remove jars that are not needed")

removeNotNeeded := {
  // Expected file names of the jars to keep, e.g. "protobuf-java-2.6.1.jar"
  val fileSet = libraries.map(l => s"${l.name}-${l.revision}.jar").toSet
  println(s"$fileSet")
  // Binary Scala version suffix, e.g. "2.10", stripped from cross-built jar names
  val ver = scalaVersion.value.split("\\.").take(2).mkString(".")
  println(s"$ver")
  file("target/pack/lib").listFiles.foreach { jarFile =>
    val without = jarFile.getName.replace(s"_$ver", "")
    println(s"$without")
    if (!fileSet.contains(without)) {
      println(s"${jarFile.getName} removed")
      sbt.IO.delete(jarFile)
    }
  }
}
After calling sbt pack, call sbt removeNotNeeded (or chain them in one invocation: sbt pack removeNotNeeded). You will be left with only the needed jar files.