I am building an application in Play Framework (2.4.0) / Scala and trying to add play.api.libs.streams so I can use the Streams object in my application.
Here is my working build.sbt:
libraryDependencies ++= Seq(
  specs2 % Test,
  cache,
  ws,
  "com.softwaremill.macwire" %% "macros" % "2.2.2",
  "com.softwaremill.macwire" %% "runtime" % "1.0.7",
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.10",
  "com.eclipsesource" %% "play-json-schema-validator" % "0.6.5",
  "org.scalatest" %% "scalatest" % "2.2.5" % Test,
  "org.scalacheck" %% "scalacheck" % "1.12.2" % Test,
  "org.scalatestplus" %% "play" % "1.4.0-M4" % Test,
  "com.typesafe.akka" %% "akka-stream" % "2.4.4"
)
Now when I try to add the following line:
streams,
or when I just add
libraryDependencies += streams
I get the error:
error: No implicit for Append.Value[Seq[sbt.ModuleID], sbt.TaskKey[sbt.Keys.TaskStreams]] found,
so sbt.TaskKey[sbt.Keys.TaskStreams] cannot be appended to Seq[sbt.ModuleID]
libraryDependencies += streams
And I am unable to launch my project.
I found this question, but tweaking the line by adding '%' or '%%' did not solve the issue, and I was not sure how to apply its solutions, as I am just trying to add a play.api.libs dependency and not an external one.
I am kind of stuck here. I don't understand why streams is an sbt.TaskKey[sbt.Keys.TaskStreams] while ws, or any other key added to the sequence, is an sbt.ModuleID.
In this case the cache, ws, etc. lines refer not to packages in play.api.libs, but to build artefacts that the Play sbt plugin pre-defines as components in the play.sbt.PlayImport object, for example here.
In this context, ws is exactly equivalent to:
"com.typesafe.play" %% "play-ws" % "2.5.4"
The reason you see an error for streams is that Play defines no such component, so sbt instead resolves the identifier to its built-in streams task key, an sbt.TaskKey[sbt.Keys.TaskStreams].
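To make that concrete, here is a simplified sketch of what those aliases amount to inside PlayImport (paraphrased, not the exact source; the real definitions derive the version from the installed Play plugin rather than hard-coding it):
// Each alias is just an sbt.ModuleID pointing at a Play artifact.
val ws: ModuleID = "com.typesafe.play" %% "play-ws" % "2.5.4"
val cache: ModuleID = "com.typesafe.play" %% "play-cache" % "2.5.4"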
The play.api.libs.streams.Streams object should be available without adding anything extra to your build if you have a PlayScala project on Play 2.5.x or above.
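As a quick check, once you are on 2.5.x the following import (the same object named in the question) should compile with no build changes:
import play.api.libs.streams.Streams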
Related
I upgraded the test dependencies for my Play project and now I get this problem:
object scalacheck is not a member of package org.scalatestplus
[error] import org.scalatestplus.scalacheck.ScalaCheckDrivenPropertyChecks
These are my test dependencies:
"org.scalatest" %% "scalatest" % "3.2.3" % Test,
"org.scalamock" %% "scalamock" % "4.2.0" % "test",
"com.github.alexarchambault" %% "scalacheck-shapeless_1.14" % "1.2.1" % "test",
"org.scalatestplus.play" %% "scalatestplus-play" % "3.1.2" % "test, it",
Do I have some incompatible versioning going on here?
You don't have the right dependency:
"org.scalatestplus" %% "scalacheck-1-15" % "3.2.3.0"
Since you are using ScalaTest 3.2.3, you should use the scalacheck-1-15 artifact at version 3.2.3.x for compatibility.
Here is its Maven location, and this is its GitHub repo.
Scala 2.13 Code run at Scastie.
Scala 2.12 Code run at Scastie.
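Putting it together, the relevant test dependencies would look something like this (versions as above; the other entries can stay as they are):
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.2.3" % Test,
  "org.scalatestplus" %% "scalacheck-1-15" % "3.2.3.0" % Test
)
With that on the classpath, the failing import org.scalatestplus.scalacheck.ScalaCheckDrivenPropertyChecks resolves.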
Scala/JVM noob here who wants to understand more about logging, specifically when using Apache Spark.
I have written a library in Scala that depends upon a bunch of Spark libraries, here are my dependencies:
import sbt._

object Dependencies {
  object Version {
    val spark = "2.2.0"
    val scalaTest = "3.0.0"
  }

  val deps = Seq(
    "org.apache.spark" %% "spark-core" % Version.spark,
    "org.scalatest" %% "scalatest" % Version.scalaTest,
    "org.apache.spark" %% "spark-hive" % Version.spark,
    "org.apache.spark" %% "spark-sql" % Version.spark,
    "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
    "ch.qos.logback" % "logback-core" % "1.2.3",
    "ch.qos.logback" % "logback-classic" % "1.2.3",
    "com.typesafe.scala-logging" %% "scala-logging" % "3.8.0",
    "com.typesafe" % "config" % "1.3.2"
  )

  val exc = Seq(
    ExclusionRule("org.slf4j", "slf4j-log4j12")
  )
}
(admittedly I copied a lot of this from elsewhere).
I am able to package my code as a JAR using sbt package which I can then call from Spark by placing the JAR into ${SPARK_HOME}/jars. This is working great.
I now want to implement logging from my code so I do this:
import com.typesafe.scalalogging.Logger

/*
 * stuff stuff stuff
 */
val logger: Logger = Logger("name")
logger.info("stuff")
However, when I try to call my library (which I'm doing from Python, not that I think that's relevant here) I get an error:
py4j.protocol.Py4JJavaError: An error occurred while calling z:com.company.package.class.function.
E : java.lang.NoClassDefFoundError: com/typesafe/scalalogging/Logger$
Clearly this is because the com.typesafe.scala-logging library is not in my JAR. I know I could solve this by packaging with sbt assembly, but I don't want to do that because it would pull in all the other dependencies and make my JAR enormous.
Is there a way to selectively include libraries (com.typesafe.scala-logging in this case) in my JAR? Alternatively, should I be attempting to log using another method, perhaps using a logger that is included with Spark?
Thanks to pasha701 in the comments, I attempted packaging my dependencies by using sbt assembly rather than sbt package:
import sbt._

object Dependencies {
  object Version {
    val spark = "2.2.0"
    val scalaTest = "3.0.0"
  }

  val deps = Seq(
    "org.apache.spark" %% "spark-core" % Version.spark % Provided,
    "org.scalatest" %% "scalatest" % Version.scalaTest,
    "org.apache.spark" %% "spark-hive" % Version.spark % Provided,
    "org.apache.spark" %% "spark-sql" % Version.spark % Provided,
    "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
    "ch.qos.logback" % "logback-core" % "1.2.3",
    "ch.qos.logback" % "logback-classic" % "1.2.3",
    "com.typesafe.scala-logging" %% "scala-logging" % "3.8.0",
    "com.typesafe" % "config" % "1.3.2"
  )

  val exc = Seq(
    ExclusionRule("org.slf4j", "slf4j-log4j12")
  )
}
Unfortunately, even with the Spark dependencies specified as Provided, my JAR went from 324K to 12M, hence I opted to use println() instead. Here is my commit message:
log using println
I went with the println option because it keeps the size of the JAR small.
I trialled use of com.typesafe.scalalogging.Logger but my tests failed with error:
java.lang.NoClassDefFoundError: com/typesafe/scalalogging/Logger
because that isn't provided with Spark. I attempted to use sbt assembly
instead of sbt package but this caused the size of the JAR to go from
324K to 12M, even with spark dependencies set to Provided. A 12M JAR
isn't worth the trade-off just to use scalaLogging, hence using println
instead.
I note that pasha701 suggested using log4j instead, as that is provided with Spark, so I shall try that next. Any advice on using log4j from Scala when writing a Spark library would be much appreciated.
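For reference, a minimal sketch of that approach using the log4j 1.x API that Spark 2.x already ships on its classpath (the trait and class names here are made up for illustration):
import org.apache.log4j.Logger

trait Log4jLogging {
  // @transient + lazy so the logger is recreated on executors
  // instead of being serialized with the enclosing class.
  @transient lazy val logger: Logger = Logger.getLogger(getClass.getName)
}

class MyJob extends Log4jLogging with Serializable {
  def run(): Unit = logger.info("stuff")
}
Because log4j is already on Spark's classpath, nothing extra needs to be bundled into the JAR.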
As you said, sbt assembly will include all the dependencies in your jar.
If you only want certain ones, there are two options, illustrated below:
Download logback-core and logback-classic and add them via the --jars option of the spark2-submit command
Specify the above dependencies via the --packages option of spark2-submit
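For example (the application jar and main class here are placeholders):
spark2-submit --class com.company.Main \
  --jars logback-core-1.2.3.jar,logback-classic-1.2.3.jar \
  my-library-app.jar

spark2-submit --class com.company.Main \
  --packages ch.qos.logback:logback-core:1.2.3,ch.qos.logback:logback-classic:1.2.3 \
  my-library-app.jar
With --jars you ship the files yourself; with --packages Spark resolves them from a Maven repository at submit time.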
Currently I have a simple Scala project.
Running sbt compile from the command line succeeds.
In IntelliJ, doing Build > Rebuild Project succeeds as well.
However, there are several issues with syntax highlighting.
Receive types are not being recognized, context.become(<some Receive def>) is flagged as failing (because Receive is not recognized), and there are lots of other seemingly random issues.
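For example, an actor along these lines (a made-up class, just to illustrate the pattern) compiles under sbt but gets flagged in the editor:
import akka.actor.Actor

class Worker extends Actor {
  // IntelliJ marks Receive as unresolved even though sbt compiles this fine.
  def receive: Receive = {
    case "start" => context.become(active)
  }

  def active: Receive = {
    case msg => println(msg)
  }
}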
Current build.sbt:
scalaVersion := "2.12.1"

resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"

libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.5.1",
  "com.typesafe.akka" %% "akka-http-core" % "10.0.5",
  "com.typesafe.akka" %% "akka-http" % "10.0.5",
  "com.typesafe.akka" %% "akka-http-spray-json" % "10.0.6",
  "com.typesafe.akka" %% "akka-slf4j" % "2.5.1",
  "com.typesafe.akka" %% "akka-stream" % "2.5.1",
  "ch.qos.logback" % "logback-classic" % "1.1.7",
  "com.typesafe.akka" %% "akka-testkit" % "2.5.1" % "test"
)
I've tried invalidating my cache and restarting; that didn't work.
I've updated IntelliJ, and I've tried wiping my .idea files and re-importing the project.
Any thoughts on how to resolve this?
I'm trying to import an akka dependency to my project using sbt.
The akka modules I need are akka-actor and akka-remote. The curious thing is that akka-actor has no problems importing, but the remote module appears as an unknown artifact.
I'm using IntelliJ and Scala 2.12.1. Has anyone had this problem, or can anyone help me in any way?
Try leaving a blank line between the two libraryDependencies lines:
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.0"

libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.5.0"
Or keep them in a Seq:
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.5.0",
  "com.typesafe.akka" %% "akka-remote" % "2.5.0"
)
Here is a piece of the build.sbt for a Play! application project:
resolvers += Resolver.jcenterRepo,
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases",
libraryDependencies ++= Seq(
  "com.kyleu" %% "jdub-async" % "1.0",
  "com.vmunier" %% "play-scalajs-scripts" % "0.3.0",
  "org.webjars" % "jquery" % "1.11.1",
  "org.json4s" %% "json4s-jackson" % "3.2.11",
  evolutions,
  "com.github.benhutchison" %% "prickle" % "1.1.7",
  specs2 % Test
)
IDEA tells me that it cannot resolve the symbol jdub when I import the jdub-async library. Meanwhile, the project compiles successfully.
How can I fix this?
This is just an issue with IntelliJ IDEA's own Scala analysis; it chokes on many things. If the project compiles fine with sbt itself, you don't have any bug to resolve (unless you want to report it to IntelliJ).