PlaySpec not found in IntelliJ - scala

Below is a Scala websocket test:
import java.util.function.Consumer

import play.shaded.ahc.org.asynchttpclient.AsyncHttpClient
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.test.{Helpers, TestServer, WsTestClient}

import scala.compat.java8.FutureConverters
import scala.concurrent.Await
import scala.concurrent.duration._

import org.scalatestplus.play._

class SocketTest extends PlaySpec with ScalaFutures {

  "HomeController" should {

    "reject a websocket flow if the origin is set incorrectly" in WsTestClient.withClient { client =>
      // Pick a non standard port that will fail the (somewhat contrived) origin check...
      lazy val port: Int = 31337
      val app = new GuiceApplicationBuilder().build()
      Helpers.running(TestServer(port, app)) {
        val myPublicAddress = s"localhost:$port"
        val serverURL = s"ws://$myPublicAddress/ws"

        val asyncHttpClient: AsyncHttpClient = client.underlying[AsyncHttpClient]
        val webSocketClient = new WebSocketClient(asyncHttpClient)
        try {
          val origin = "ws://example.com/ws"
          val consumer: Consumer[String] = new Consumer[String] {
            override def accept(message: String): Unit = println(message)
          }
          val listener = new WebSocketClient.LoggingListener(consumer)
          val completionStage = webSocketClient.call(serverURL, origin, listener)
          val f = FutureConverters.toScala(completionStage)
          Await.result(f, atMost = 1000.millis)
          listener.getThrowable mustBe a[IllegalStateException]
        } catch {
          case e: IllegalStateException =>
            e mustBe an[IllegalStateException]
          case e: java.util.concurrent.ExecutionException =>
            val foo = e.getCause
            foo mustBe an[IllegalStateException]
        }
      }
    }
  }
}
But compilation fails on the line import org.scalatestplus.play._ with the error:
Cannot resolve symbol scalatestplus
Following https://www.playframework.com/documentation/2.8.x/ScalaTestingWithScalaTest I have added scalatest and play to the build:
build.sbt:
name := "testproject"
version := "1.0"
lazy val `testproject` = (project in file(".")).enablePlugins(PlayScala)
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
resolvers += "Akka Snapshot Repository" at "https://repo.akka.io/snapshots/"
scalaVersion := "2.12.2"
libraryDependencies ++= Seq( jdbc , ehcache , ws , guice , specs2 % Test)
// https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
libraryDependencies ++= Seq(
"org.scalatestplus.play" %% "scalatestplus-play" % "3.0.0" % "test"
)
unmanagedResourceDirectories in Test <+= baseDirectory ( _ /"target/web/public/test" )
I've tried rebuilding the project and module via IntelliJ's Build option, and via the Build option that appears when I right-click build.sbt, but the import is still not found.

Running sbt dist from IntelliJ's sbt shell, then File -> Invalidate Caches and a restart of IntelliJ, seems to fix the issue.

Related

Scala Flink get java.lang.NoClassDefFoundError: scala/Product$class after using case class for customized DeserializationSchema

It works fine when using a regular class, but I get a java.lang.NoClassDefFoundError: scala/Product$class error after changing the class to a case class.
I'm not sure whether it's an sbt packaging problem or a code problem.
My setup:
sbt
scala: 2.11.12
java: 8
sbt assembly to package
package example

import java.util.Properties
import java.nio.charset.StandardCharsets

import org.apache.flink.api.scala._
import org.apache.flink.streaming.util.serialization.{DeserializationSchema, SerializationSchema}
import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer, FlinkKafkaProducer}
import org.apache.flink.streaming.api.watermark.Watermark
import org.apache.flink.streaming.api.functions.AssignerWithPunctuatedWatermarks
import org.apache.flink.api.common.typeinfo.TypeInformation

import Config._

case class Record(
  id: String,
  startTime: Long
)

class RecordDeSerializer extends DeserializationSchema[Record] with SerializationSchema[Record] {
  override def serialize(record: Record): Array[Byte] = {
    "123".getBytes(StandardCharsets.UTF_8)
  }
  override def deserialize(b: Array[Byte]): Record = {
    Record("1", 123)
  }
  override def isEndOfStream(record: Record): Boolean = false
  override def getProducedType: TypeInformation[Record] = {
    createTypeInformation[Record]
  }
}

object RecordConsumer {
  def main(args: Array[String]): Unit = {
    val config: Properties = {
      val p = new Properties()
      p.setProperty("zookeeper.connect", Config.KafkaZookeeperServers)
      p.setProperty("bootstrap.servers", Config.KafkaBootstrapServers)
      p.setProperty("group.id", Config.KafkaGroupID)
      p
    }

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.enableCheckpointing(1000)

    val consumer = new FlinkKafkaConsumer[Record](
      Config.KafkaTopic,
      new RecordDeSerializer(),
      config
    )
    consumer.setStartFromEarliest()

    val stream = env.addSource(consumer).print
    env.execute("record consumer")
  }
}
Error
2020-08-05 04:07:33,963 INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Discarding checkpoint 1670 of job 4de8831901fa72790d0a9a973cc17dde.
java.lang.NoClassDefFoundError: scala/Product$class
...
build.sbt
My first idea was that a version might be wrong, but everything works fine with a regular class.
Here is build.sbt:
ThisBuild / resolvers ++= Seq(
  "Apache Development Snapshot Repository" at "https://repository.apache.org/content/repositories/snapshots/",
  Resolver.mavenLocal
)

name := "deedee"
version := "0.1-SNAPSHOT"
organization := "dexterlab"

ThisBuild / scalaVersion := "2.11.8"

val flinkVersion = "1.8.2"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-java" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka" % flinkVersion,
)

val thirdPartyDependencies = Seq(
  "com.github.nscala-time" %% "nscala-time" % "2.24.0",
  "com.typesafe.play" %% "play-json" % "2.6.14",
)

lazy val root = (project in file(".")).
  settings(
    libraryDependencies ++= flinkDependencies,
    libraryDependencies ++= thirdPartyDependencies,
    libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value,
  )

assembly / mainClass := Some("dexterlab.TelecoDataConsumer")

// make the run command include the provided dependencies
Compile / run := Defaults.runTask(
  Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated

// stay inside the sbt console when we press "ctrl-c" while a Flink programme executes with "run" or "runMain"
Compile / run / fork := true
Global / cancelable := true

// exclude the Scala library from assembly
assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = false)

autoCompilerPlugins := true
I finally succeeded after adding this line to build.sbt:
assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = true)
It includes the Scala library in the fat jar when running sbt assembly.
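For context (my reading of the error, not part of the original post): every case class extends scala.Product, and the Scala 2.11 compiler emits the concrete Product methods into a synthetic scala.Product$class inside scala-library. A fat jar built with includeScala = false therefore works for plain classes provided by the cluster's runtime, but fails on the first case class unless a matching Scala 2.11 library is on the classpath:

// illustration only: under Scala 2.11's trait encoding, this single definition
// already needs scala.Product$class on the runtime classpath
case class Record(id: String, startTime: Long)
// compiles roughly to: class Record extends Object with Product with Serializable,
// where Product's default methods are static calls into scala.Product$class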

how to start server/client grpc using scalapb on spark?

I have a problem running a gRPC server/client built with ScalaPB on Spark.
It works fine when I run the code with sbt run. I want to run it on Spark because I will later load my Spark model to predict labels. But when I submit my jar to Spark, I get the following error:
Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException:
No functional server found. Try adding a dependency on the grpc-netty artifact
This is my build.sbt:
scalaVersion := "2.11.7"
PB.targets in Compile := Seq(
scalapb.gen() -> (sourceManaged in Compile).value
)
val scalapbVersion =
scalapb.compiler.Version.scalapbVersion
val grpcJavaVersion =
scalapb.compiler.Version.grpcJavaVersion
libraryDependencies ++= Seq(
// protobuf
"com.thesamet.scalapb" %% "scalapb-runtime" % scalapbVersion % "protobuf",
//for grpc
"io.grpc" % "grpc-netty" % grpcJavaVersion ,
"com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapbVersion
)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs # _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
Shading also didn't work:
assemblyShadeRules in assembly := Seq(ShadeRule.rename("com.google.**" -> "shadegoogle.@1").inAll)
And this is my main:
import java.util.logging.Logger

import io.grpc.{Server, ServerBuilder}
import org.apache.spark.ml.tuning.CrossValidatorModel
import org.apache.spark.sql.SparkSession
import testproto.test.{Email, EmailLabel, RouteGuideGrpc}

import scala.concurrent.{ExecutionContext, Future}

object HelloWorldServer {
  private val logger = Logger.getLogger(classOf[HelloWorldServer].getName)

  def main(args: Array[String]): Unit = {
    val server = new HelloWorldServer(ExecutionContext.global)
    server.start()
    server.blockUntilShutdown()
  }

  private val port = 50051
}

class HelloWorldServer(executionContext: ExecutionContext) { self =>
  private[this] var server: Server = null

  private def start(): Unit = {
    server = ServerBuilder
      .forPort(HelloWorldServer.port)
      .addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext))
      .build
      .start
    HelloWorldServer.logger.info("Server started, listening on " + HelloWorldServer.port)
    sys.addShutdownHook {
      System.err.println("*** shutting down gRPC server since JVM is shutting down")
      self.stop()
      System.err.println("*** server shut down")
    }
  }

  private def stop(): Unit = {
    if (server != null) {
      server.shutdown()
    }
  }

  private def blockUntilShutdown(): Unit = {
    if (server != null) {
      server.awaitTermination()
    }
  }

  private class RouteGuideImpl extends RouteGuideGrpc.RouteGuide {
    override def getLabel(request: Email): Future[EmailLabel] = {
      val replay = EmailLabel(emailId = request.emailId, label = "aaaaa")
      Future.successful(replay)
    }
  }
}
thanks
It looks like grpc-netty is not found when the uber jar is made. Instead of using ServerBuilder, change your code to use io.grpc.netty.NettyServerBuilder explicitly.
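A minimal sketch of that change (only the builder line in the start() method above differs; everything else stays as posted). The likely reason it helps: ServerBuilder.forPort locates a transport through ServiceLoader entries under META-INF/services, which the MergeStrategy.discard rule for META-INF in the build above throws away, while NettyServerBuilder references the Netty transport directly:

import io.grpc.netty.NettyServerBuilder

// bind the Netty transport explicitly instead of relying on provider discovery
server = NettyServerBuilder
  .forPort(HelloWorldServer.port)
  .addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext))
  .build
  .start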

Exception while trying to run Spark app with JSON parsing

I have a simple Spark application with Scala and SBT. First I tried to do the following:
run sbt clean package
run spark-submit --class Main ./target/scala-2.11/sparktest_2.11-1.0.jar
but it fails with the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: com/fasterxml/jackson/module/scala/DefaultScalaModule$
Then I tried the assembly plugin for SBT, but I got the following exception instead:
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.introspect.POJOPropertyBuilder.addField(Lcom/fasterxml/jackson/databind/introspect/AnnotatedField;Lcom/fasterxml/jackson/databind/PropertyName;ZZZ)V
As far as I can see, everything looks related to the Jackson library and its Scala support. Maybe it's an issue with the versions of the libraries?
My build.sbt looks like this:
name := "SparkTest"
version := "1.0"
scalaVersion := "2.11.4"
scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8", "-feature")
libraryDependencies ++= {
  Seq(
    "org.apache.spark" %% "spark-core" % "1.2.1" % "provided",
    "com.fasterxml.jackson.core" % "jackson-core" % "2.4.1",
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.1",
    "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.1"
  )
}
And my application code is simply this:
import com.fasterxml.jackson.databind.{DeserializationFeature, ObjectMapper}
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import org.apache.spark.{SparkConf, SparkContext}

trait JsonUtil {
  val mapper = new ObjectMapper()
  mapper.registerModule(DefaultScalaModule)
  mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
}

case class Person(name: String)

object Main extends JsonUtil {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Spark Test App")
    val sc = new SparkContext(conf)
    val inputFile = "/home/user/data/person.json"
    val input = sc.textFile(inputFile)
    val persons = input.flatMap { line ⇒
      try {
        println(s" [DEBUG] trying to parse '$line'")
        Some(mapper.readValue(line, classOf[Person]))
      } catch {
        case e: Exception ⇒
          println(s" [EXCEPTION] ${e.getMessage}")
          None
      }
    }
    println("PERSON LIST:")
    for (p ← persons) {
      println(s" $p")
    }
    println("END")
  }
}
EDIT: the problem seems to be related to the Spark application itself. If I run a simple application that only tests JSON unmarshalling, everything goes OK. But when I try the same from the Spark application, the problem appears as described above. Any ideas?
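One direction worth checking (my suggestion, not from the original post): spark-core brings its own, older Jackson onto the runtime classpath, and both NoClassDefFoundError and NoSuchMethodError here are classic symptoms of mixed Jackson versions. A hedged sketch of forcing every Jackson artifact to a single version in build.sbt; the 2.4.4 below is an assumption and should be verified against the spark-core 1.2.1 POM or a dependency-graph report:

// assumption: 2.4.4 matches what the Spark 1.2.x runtime ships; verify first
dependencyOverrides ++= Set(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.4.4",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.4.4"
)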

jacoco:cover from play console gives NoClassDefFoundError: Could not initialize class com.sun.xml.internal.ws.api.BindingID

Running jacoco:cover from the Play console using jacoco4sbt (2.1.4) results in many tests failing and messages like:
[debug] Running TaskDef(com.ourCompany.ourProject.identity.LoginControllerSpec, org.scalatest.tools.Framework$$anon$1@5b98f69a, false, [SuiteSelector])
java.lang.NoClassDefFoundError: Could not initialize class com.sun.xml.internal.ws.api.BindingID
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parseBinding(RuntimeWSDLParser.java:445)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parseWSDL(RuntimeWSDLParser.java:342)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:157)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:120)
at com.sun.xml.internal.ws.client.WSServiceDelegate.parseWSDL(WSServiceDelegate.java:257)
at com.sun.xml.internal.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:220)
at com.sun.xml.internal.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:168)
at com.sun.xml.internal.ws.spi.ProviderImpl.createServiceDelegate(ProviderImpl.java:96)
at javax.xml.ws.Service.<init>(Service.java:77)
at com.bsl.Services.<init>(Services.java:46)
at com.ourCompany.ourProject.identity.UserRepositoryComponent$OdsUserRepository.<init>(UserRepositoryComponent.scala:92)
at com.ourCompany.ourProject.identity.ComponentRegistry$class.$init$(ComponentRegistry.scala:7)
at com.ourCompany.ourProject.identity.LoginControllerSpec$TestLoginController$.<init>(LoginControllerSpec.scala:20)
at com.ourCompany.ourProject.identity.LoginControllerSpec.TestLoginController$lzycompute(LoginControllerSpec.scala:20)
at com.ourCompany.ourProject.identity.LoginControllerSpec.TestLoginController(LoginControllerSpec.scala:20)
at com.ourCompany.ourProject.identity.LoginControllerSpec$$anonfun$1.apply$mcV$sp(LoginControllerSpec.scala:32)
at com.ourCompany.ourProject.identity.LoginControllerSpec$$anonfun$1.apply(LoginControllerSpec.scala:32)
at com.ourCompany.ourProject.identity.LoginControllerSpec$$anonfun$1.apply(LoginControllerSpec.scala:32)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1636)
at org.scalatest.Suite$class.withFixture(Suite.scala:1121)
at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1633)
at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1645)
at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1703)
at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
at org.scalatest.Suite$class.run(Suite.scala:1423)
at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1749)
at com.ourCompany.ourProject.identity.LoginControllerSpec.org$scalatest$BeforeAndAfterAll$$super$run(LoginControllerSpec.scala:11)
at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
at com.ourCompany.ourProject.identity.LoginControllerSpec.run(LoginControllerSpec.scala:11)
at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:444)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:651)
at sbt.TestRunner.runTest$1(TestFramework.scala:84)
at sbt.TestRunner.run(TestFramework.scala:94)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:224)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:224)
at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:212)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:224)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:224)
at sbt.TestFunction.apply(TestFramework.scala:229)
at sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:211)
at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:217)
at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:217)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$4.work(System.scala:64)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:244)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
[error] Could not run test com.ourCompany.ourProject.identity.LoginControllerSpec:
java.lang.NoClassDefFoundError: Could not initialize class
com.sun.xml.internal.ws.api.BindingID
LoginControllerSpec
package com.ourCompany.ourProject.identity

import org.scalatest.{ Matchers, BeforeAndAfterAll, FlatSpec }
import play.api.test.{ FakeApplication, FakeRequest }
import com.ourCompany.ourProject.tags.UnitTest
import play.api.Play
import org.specs2.mock.Mockito
import play.api.test.Helpers._

@UnitTest
class LoginControllerSpec extends FlatSpec with BeforeAndAfterAll with Mockito with Matchers {

  override def beforeAll() {
    Play.start(FakeApplication())
  }

  override def afterAll() {
    Play.stop()
  }

  object TestLoginController extends LoginController with Secured with ComponentRegistryMock

  val Home = "/"
  val username = "XXX@ourCompanyprofessional.com"
  val password = "XXX1"
  val isRememberMe = false
  val expirationPolicy = SessionExpirationPolicy
  val odsUser = OdsUser(Identity(username, ""), Some(username))

  behavior of "LoginController"

  it should "send 200 for renderLoginForm with not authenticated user" in {
    status(TestLoginController.renderLoginForm(FakeRequest())) should be(OK)
  }

  it should "send 303 for renderLoginForm with authenticated user" in {
    val cookie = TestLoginController.authenticatorService.create(odsUser.identity, expirationPolicy) match {
      case Right(a: Authenticator) => a.toCookie
      case _ => fail()
    }
    TestLoginController.userRepository.find(odsUser.identity.userId) returns Some(odsUser)
    val result = TestLoginController.renderLoginForm(FakeRequest().withCookies(cookie))
    redirectLocation(result) should be(Some(Home))
    status(result) should be(SEE_OTHER)
  }

  it should "send 303 for login with valid credentials" in {
    val fakeRequest = FakeRequest().withFormUrlEncodedBody(
      UsernameField -> username, PasswordField -> password, RememberMeField -> isRememberMe.toString)
    TestLoginController.userRepository.login(username, password) returns Right(odsUser)
    val result = TestLoginController.login(fakeRequest)
    cookies(result).get(Authenticator.cookieName) match {
      case Some(c) => assert(c.name == Authenticator.cookieName && !c.value.isEmpty)
      case _ => fail()
    }
    redirectLocation(result) should be(Some(Home))
    status(result) should be(SEE_OTHER)
  }

  it should "send 400 for login with invalid credentials" in {
    val fakeRequest = FakeRequest().withFormUrlEncodedBody(
      UsernameField -> username, PasswordField -> password, RememberMeField -> isRememberMe.toString)
    TestLoginController.userRepository.login(username, password) returns Left(new Error)
    val result = TestLoginController.login(fakeRequest)
    status(result) should be(BAD_REQUEST)
  }

  it should "send 400 for login with invalid form input" in {
    val fakeRequest = FakeRequest().withFormUrlEncodedBody(
      UsernameField -> "", PasswordField -> "", RememberMeField -> "")
    val result = TestLoginController.login(fakeRequest)
    status(result) should be(BAD_REQUEST)
  }
}
build.sbt
gitHeadCommitSha in ThisBuild := Process("git rev-parse HEAD").lines.head

(testOptions in Test) += Tests.Argument(TestFrameworks.ScalaTest, "-h", "target/report")

org.scalastyle.sbt.ScalastylePlugin.Settings

scalacOptions := Seq("-feature")
build.scala
import com.typesafe.sbt.SbtNativePackager._
import com.typesafe.sbt.SbtScalariform._
import play.Project._
import sbt.Keys._
import sbt._
import sbtbuildinfo.Plugin._
import scala._
import scala.util.Try
import scala.Some
import de.johoop.jacoco4sbt.JacocoPlugin._

object BuildSettings {
  val buildOrganization = "com.ourCompany.ourProject"
  val buildVersion = "0.1-SNAPSHOT"
  val buildScalaVersion = "2.10.2"
  val envConfig = "-Djava.awt.headless=true -Dsbt.log.format=false -Dconfig.file=" +
    Option(System.getProperty("env.config")).getOrElse("local.application")
  val maxErrors = 20

  // disable running browserstack tests by default. Possible options: true | false [default]
  val browserstack: Boolean = Try(System.getProperty("test.browserstack").toBoolean).getOrElse(false)

  val buildSettings = Defaults.defaultSettings ++ Seq(
    organization := buildOrganization,
    version := buildVersion,
    scalaVersion := buildScalaVersion,
    scalacOptions ++= Seq("-unchecked", "-optimise", "-deprecation",
      "-Xcheckinit", "-encoding", "utf8", "-feature", "-Yinline-warnings",
      "-Xfatal-warnings"),
    javaOptions ++= Seq("-Xms512M", "-Xmx1536M", "-Xss1M", "-XX:ReservedCodeCacheSize=192M",
      "-XX:+CMSClassUnloadingEnabled", "-XX:MaxPermSize=512M"),
    javaOptions += envConfig,
    publishMavenStyle := false
  )
}

object Resolvers {
  val remoteRepoUrl = "ourCompany Nexus Snapshots" at "http://nexus.ci.bln.ourCompany-xxx.com/content/repositories/snapshots/"
  val publishRepoUrl = "ourCompany Nexus Snapshots" at "http://nexus.ci.bln.ourCompany-xxx.com/content/repositories/snapshots/"
  val releaseRepoUrl = "ourCompany Nexus Releases" at "http://nexus.ci.bln.ourCompany-xxx.com/content/repositories/releases/"
}

object Dependencies {
  val ods = "de.bsmo.ourCompany-professional" % "sprprof-ws" % "2.2.1-SNAPSHOT"
  val scalatest = "org.scalatest" %% "scalatest" % "2.0" % "test->*" withSources()
  val mockito = "org.mockito" % "mockito-all" % "1.9.5" % "test"
  val ghostDriver = "com.github.detro.ghostdriver" % "phantomjsdriver" % "1.0.3" % "test"
  val cmsClient = "com.ourCompany.cms.ws.clients" % "ourCompany-cms-java-api" % "1.1.6"
  val solrjClient = "org.apache.solr" % "solr-solrj" % "4.3.1" % "compile"
}

object ApplicationBuild extends Build {

  import BuildSettings._
  import Dependencies._
  import Resolvers._

  // Sub-project specific dependencies
  val commonDeps = Seq(
    ods,
    scalatest,
    mockito,
    ghostDriver,
    cmsClient,
    jdbc,
    anorm,
    filters,
    solrjClient,
    cache
  )

  //val bN = settingKey[Int]("current build Number")
  val gitHeadCommitSha = settingKey[String]("current git commit SHA")
  val release = settingKey[Boolean]("Release")

  lazy val ourProject = play.Project(
    "ourProject",
    path = file("."),
    settings = Defaults.defaultSettings ++ buildSettings ++
      Seq(libraryDependencies ++= commonDeps) ++
      Seq(scalariformSettings: _*) ++
      Seq(playScalaSettings: _*) ++
      Seq(publishArtifact := false) ++
      buildInfoSettings ++
      jacoco.settings ++
      Seq(
        sourceGenerators in Compile <+= buildInfo,
        buildInfoKeys ++= Seq[BuildInfoKey](
          resolvers,
          libraryDependencies in Test,
          buildInfoBuildNumber,
          BuildInfoKey.map(name) { case (k, v) => "project" + k.capitalize -> v.capitalize },
          "envConfig" -> envConfig, // computed at project load time
          BuildInfoKey.action("buildTime") {
            System.currentTimeMillis
          } // re-computed each time at compile
        ),
        buildInfoPackage := "com.ourCompany.ourProject"
      ) ++
      Seq(resolvers += remoteRepoUrl) ++
      Seq(resolvers += releaseRepoUrl) ++
      Seq(mappings in Universal ++= Seq(
        file("ops/rpm/start-server.sh") -> "start-server.sh",
        file("ops/rpm/stop-server.sh") -> "stop-server.sh"
      )) ++
      Seq(testOptions in Test += Tests.Argument(if (browserstack) "-n" else "-l", "com.ourCompany.ourProject.tags.BrowserStackTest"))
  ).settings(version <<= version in ThisBuild)
    .settings(parallelExecution in jacoco.Config := false)
    .settings({
      if (browserstack) {
        javaOptions in Test += "-Dconfig.file=conf/browserstack.application.conf"
      } else {
        javaOptions in Test += "-Dtest.none=true"
      }
    })

  lazy val ourProjectPackaging = Project(
    "packaging",
    file("ourProjectPackaging"),
    settings =
      Defaults.defaultSettings ++
        Seq(Packaging.settings: _*) ++
        Seq(resolvers += publishRepoUrl) ++
        buildSettings ++
        publishSetting ++
        Seq(publishArtifact := false) ++
        credentialsSetting
  ).settings(Packaging.rpmDistSettings: _*).settings(version <<= version in Rpm)

  lazy val credentialsSetting = credentials += {
    Seq("NEXUS_USER", "NEXUS_PASSWORD").map(k => Option(System.getenv(k))) match {
      case Seq(Some(user), Some(pass)) =>
        Credentials("Sonatype Nexus Repository Manager",
          "nexus.ci.bln.ourCompany-xxx.com", user, pass)
      case _ =>
        Credentials(Path.userHome / ".ivy2" / ".credentials")
    }
  }

  lazy val publishSetting = publishTo <<= version.apply { v =>
    val nexus = "http://nexus.ci.bln.ourCompany-xxx.com/"
    if (v.trim.endsWith("SNAPSHOT"))
      Some("snapshots" at nexus + "content/repositories/snapshots")
    else
      Some("releases" at nexus + "content/repositories/snapshots")
  }
}
Sequence of commands
Start the Play console (play)
clean
jacoco:cover
Details
Version of Play: play 2.2.1 built with Scala 2.10.2 (running Java 1.7.0_45)
Play about:
[info] This is sbt 0.13.0
[info] The current project is {file:/home/schl14/work/ProjektName/}ProjektName 0.1-b012d0e6a2c5b4a746490c0d34856af5e7d09bb9
[info] The current project is built against Scala 2.10.2
[info] Available Plugins: play.Project, com.typesafe.sbteclipse.plugin.EclipsePlugin, org.sbtidea.SbtIdeaPlugin, com.typesafe.sbt.SbtNativePackager, com.typesafe.sbt.SbtScalariform, sbtbuildinfo.Plugin, org.scalastyle.sbt.ScalastylePlugin, de.johoop.jacoco4sbt.JacocoPlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.2
Question
It worked previously (many, many commits ago) and I have failed to pinpoint which changes to the source code made jacoco:cover fail. What could be the cause of this behaviour?
Once solved, I will cut down on the amount of unnecessary info posted here! Currently it's hard to determine what needs to be provided.

Test cleanup hook not executed in scala play application

Below is my Build.scala file. There is no error in the tests, but the cleanup hook is not executed after the tests.
What is the issue?
import play.Project._
import sbt._
import sbt.Keys._

object AppBuild extends Build {

  val appName = "test"
  val appVersion = "1.0"

  val dependencies = Seq(
    "org.scalatest" % "scalatest_2.10" % "2.0.RC1"
  )

  val main = play.Project(
    appName, appVersion,
    dependencies,
    settings = Defaults.defaultSettings
  )
    .settings(
      scalaVersion := "2.10.1",
      testOptions in Test += Tests.Cleanup(
        () => println("Cleanup")
      )
    )
}
testOptions in Test += Tests.Cleanup
does not work with forked test runs, as mentioned in another Stack Overflow answer. But there are workarounds:
Set fork to false
This is simple but may slow down your tests because they won't be executed in parallel.
sbt.Keys.fork in Test := false
Use the test framework
For example http://doc.scalatest.org/1.9.2/index.html#org.scalatest.BeforeAndAfterAll with the protected method afterAll(), as sketched below.
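A minimal sketch of that approach (CleanupSpec is a hypothetical suite name; FlatSpec matches the ScalaTest version used in the question):

import org.scalatest.{BeforeAndAfterAll, FlatSpec}

// afterAll runs inside the (possibly forked) test JVM itself,
// so the hook fires regardless of sbt's fork setting
class CleanupSpec extends FlatSpec with BeforeAndAfterAll {

  "A suite with BeforeAndAfterAll" should "run its tests normally" in {
    assert(1 + 1 == 2)
  }

  override protected def afterAll(): Unit = {
    println("Cleanup") // executed once after all tests in this suite
  }
}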
Override the test task
My favorite.
test in Test ~= { testTask =>
  val result = testTask
  println("Cleanup")
  result
}