ScalaMock: MockFactory expects fails with 'Unexpected call' - scala

I'm trying to follow this doc - ScalaTest with ScalaMock - to mock a function and check whether it has been called:
class AggregateSpec extends FlatSpec with Matchers with MockFactory {
  val data = Seq("This", "is", "something", "I", "would", "like", "to", "know")

  "combop function" should "BE called for par collection" in {
    val mockCombop = mockFunction[Int, Int, Int]
    val parData = data.par

    val result: Int = parData.aggregate(0)(
      seqop = (acc, next) => acc + next.length,
      combop = mockCombop
    )

    result should === (31)
    mockCombop.expects(*, *).atLeastOnce()
  }
}
As a result I get:
[info] - should BE called for non-par collection *** FAILED ***
[info]   Unexpected call: MockFunction2-1(4, 2)
[info]
[info]   Expected:
[info]   inAnyOrder {
[info]   }
[info]
[info]   Actual:
[info]     MockFunction2-1(9, 1)
[info]     MockFunction2-1(2, 4)
[info]     MockFunction2-1(4, 2)
[info]     MockFunction2-1(5, 4)
[info]   (Option.scala:121)
Why? How do I make it pass with ScalaTest + ScalaMock?
--
As dependencies I use:
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1",
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test",
libraryDependencies += "org.scalamock" %% "scalamock-scalatest-support" % "3.5.0"

You need to call mockCombop.expects before mockCombop is actually invoked, not after. ScalaMock checks each call against the expectations that have already been recorded, so a call that arrives before any expectation has been set fails immediately with "Unexpected call":
"combop function" should "BE called for par collection" in {
val mockCombop = mockFunction[Int, Int, Int]
val parData = data.par
mockCombop.expects(*, *).atLeastOnce()
val result: Int = parData.aggregate(0)(
seqop = (acc, next) => acc + next.length,
combop = mockCombop
)
result should === (31)
}
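If you would rather keep the call-then-verify order of the original test, ScalaMock's stub style is an alternative (a sketch, assuming the same ScalaMock 3.5 setup): a stubFunction records calls and is verified after the fact, but you then need to tell the stub what to return so that the aggregate result still adds up:
"combop function" should "BE called for par collection (stub style)" in {
  val stubCombop = stubFunction[Int, Int, Int]
  // Stubs return default values (0 for Int) unless told otherwise, so make it really combine:
  stubCombop.when(*, *).onCall { (a: Int, b: Int) => a + b }

  val result: Int = data.par.aggregate(0)(
    seqop = (acc, next) => acc + next.length,
    combop = stubCombop
  )

  result should === (31)
  // Verification happens after the calls, mirroring the original test's ordering.
  stubCombop.verify(*, *).atLeastOnce()
}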

Related

Apache flink (1.9.1) runtime exception when using case classes in scala (2.12.8)

I am using a case class in a Scala (2.12.8) Apache Flink (1.9.1) application. I get the following exception when I run the code below: Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V.
NOTE: I have added a default constructor as per the suggestion in (java.lang.NoSuchMethodException for init method in Scala case class), but that does not work in my case.
Here is the complete code:
package com.zignallabs

import org.apache.flink.api.scala._

/**
 * Implements the program that reads from a Element list, Transforms it into tuple and outputs to TaskManager
 */
case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1) // No help when added default constructor as per https://stackoverflow.com/questions/51129809/java-lang-nosuchmethodexception-for-init-method-in-scala-case-class
}

object WordCount {
  def main(args: Array[String]): Unit = {
    // set up the execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment

    // get input data
    val input = env.fromElements(" one", "two", "three", "four", "five", "end of test")

    // ***** Line 31 throws the exception
    // Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    //   at com.zignallabs.AddCount.<init>(WordCount.scala:7)
    //   at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
    //   at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
    //   at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
    //   at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
    //   at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
    //   at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
    //   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
    //   at java.lang.Thread.run(Thread.java:748)
    val transform = input.map { w => AddCount(w, 1) } // <- Throwing exception

    // execute and print result
    println(transform)
    transform.print()
    transform.printOnTaskManager(" Word")
    env.execute()
  }
}
The runtime exception is:
at com.zignallabs.AddCount.<init>(WordCount.scala:7)
at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
I am building and running Flink locally, using a local Flink cluster with Flink version 1.9.1.
Here is the build.sbt file:
name := "flink191KafkaScala"
version := "0.1-SNAPSHOT"
organization := "com.zignallabs"
scalaVersion := "2.12.8"
val flinkVersion = "1.9.1"
//javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
val http4sVersion = "0.16.6"
resolvers ++= Seq(
"Local Ivy" at "file:///"+Path.userHome+"/.ivy2/local",
"Local Ivy Cache" at "file:///"+Path.userHome+"/.ivy2/cache",
"Local Maven Repository" at "file:///"+Path.userHome+"/.m2/repository",
"Artifactory Cache" at "https://zignal.artifactoryonline.com/zignal/zignal-repos"
)
val excludeCommonsLogging = ExclusionRule(organization = "commons-logging")
libraryDependencies ++= Seq(
"org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
"org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
"org.apache.flink" %% "flink-clients" % "1.9.1",
// Upgrade to flink-connector-kafka_2.11
"org.apache.flink" %% "flink-connector-kafka-0.11" % "1.9.1",
//"org.scalaj" %% "scalaj-http" % "2.4.2",
"com.squareup.okhttp3" % "okhttp" % "4.2.2"
)
publishTo := Some("Artifactory Realm" at "https://zignal.artifactoryonline.com/zignal/zignal")
credentials += Credentials("Artifactory Realm", "zignal.artifactoryonline.com", "buildserver", "buildserver")
//mainClass in Compile := Some("com.zignallabs.StoryCounterTopology")
mainClass in Compile := Some("com.zignallabs.WordCount")
scalacOptions ++= Seq(
"-feature",
"-unchecked",
"-deprecation",
"-language:implicitConversions",
"-Yresolve-term-conflict:package",
"-language:postfixOps",
"-target:jvm-1.8")
lazy val root = project.in(file(".")).configs(IntegrationTest)
If you're using default args for the constructors of a case class, it's much more idiomatic Scala to define them like this:
case class AddCount ( firstP: String = "default", count: Int = 1)
This is syntactic sugar that, at the call site, roughly gives you the following for free (under the hood the compiler generates default-argument methods rather than extra constructors):
case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1)
  def this(firstP: String) = this(firstP, 1)
  def this(count: Int) = this("default", count)
}
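For illustration, a few hypothetical call sites with the default-argument version; you omit parameters positionally or by name rather than through a separate Int-only constructor:
val a = AddCount()          // AddCount("default", 1)
val b = AddCount("one")     // AddCount("one", 1)
val c = AddCount(count = 3) // AddCount("default", 3)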
I am now able to run this application with Scala 2.12. The issue was in the environment: I needed to ensure that no conflicting binaries were on the classpath, in particular a mix of Scala 2.11 and Scala 2.12 artifacts.
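A minimal sketch of the relevant build.sbt lines, assuming the artifacts from the build above: relying on %% everywhere (and never mixing in hard-coded _2.11 suffixes) is the simplest way to ensure that only _2.12 binaries end up on the classpath:
scalaVersion := "2.12.8"

val flinkVersion = "1.9.1"

// %% appends the Scala binary version (here "_2.12"), so every Flink module agrees with scalaVersion.
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"           % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-clients"         % flinkVersion
)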

ScalaTest: no tests are run for GeneratorDrivenPropertyChecks

I am trying to set up property-based testing with ScalaTest and ScalaCheck ... and based on the output it seems that I am succeeding, but it finishes too fast, and from what I understand ScalaCheck should normally tell you how many tests were run; in my case this information is absent:
[IJ]sbt:algorithms2_1> testOnly *MedianOf3PartitioningProps
[info] Compiling 1 Scala source to /Users/vasile.gorcinschi/gitPerso/Algorithms/Chapter 2 Sorting/algorithms2_1/target/scala-2.12/test-classes ...
[warn] there was one deprecation warning; re-run with -deprecation for details
[warn] one warning found
[info] Done compiling.
[info] MedianOf3PartitioningProps:
[info] sort
[info] - should sort array of ints from 0 to 100
[info] +
[info] ScalaTest
[info] Run completed in 412 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 1, Failed 0, Errors 0, Passed 1
Here is the test class:
class MedianOf3PartitioningProps extends FlatSpec with Matchers with GeneratorDrivenPropertyChecks with Gens {
  private val medianOf3Partitioning = new MedianOf3Partitioning[Int]

  implicit override val generatorDrivenConfig: PropertyCheckConfiguration =
    PropertyCheckConfig(minSuccessful = 1, maxDiscarded = 500, workers = 1)

  behavior of "sort"

  it should "sort array of ints from 0 to 100" in {
    forAll(arraysGen) { a: Array[Int] =>
      info(s"${a.mkString(",")}")
      medianOf3Partitioning.sort(a) shouldEqual a.sorted
    }
  }
}
The Gens trait is mine - it only contains the definition of the Gen[Array[Int]]:
trait Gens {
val arraysGen: Gen[Array[Int]] = containerOf[Array, Int](
chooseNum(Int.MinValue, Int.MaxValue) suchThat { _ < 100 }
).suchThat(_.length < 50)
}
I used this source for the test set-up. Just in case, I'm providing versions of scalacheck and scalatest (from Dependencies.scala and build.sbt):
lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.5"
...
libraryDependencies ++= {
val scalaTestVersion = "3.0.5"
val scalaCheckVersion = "1.14.0"
Seq(scalaTest % Test,
"org.scalatest" %% "scalatest" % scalaTestVersion % "test",
"org.scalacheck" %% "scalacheck" % scalaCheckVersion % "test",
"com.storm-enroute" %% "scalameter" % "0.9"
)
}
Based on the small example from M. Odersky's "Programming in Scala", I switched from GeneratorDrivenPropertyChecks to the more general PropertyChecks. I also discovered issues with my Gen[Array[Int]], so I had to tweak that too. Posting the solution that worked (it now finds failing cases) in case it helps anyone else:
Gens trait:
trait Gens {
val minIntArraysGen: Gen[Array[Int]] = containerOf[Array, Int](Gen.chooseNum(0, 100))
}
The property based test:
import ca.vgorcinschi.Gens
import org.scalatest.MustMatchers._
import org.scalatest.WordSpec
import org.scalatest.prop.PropertyChecks

class MedianOf3PartitioningProps extends WordSpec with PropertyChecks with Gens {
  "sort method" must {
    "sort any Int array" in {
      forAll(minIntArraysGen) { (a: Array[Int]) =>
        whenever(a.nonEmpty) {
          val maybeSorted = new MedianOf3Partitioning[Int].sort(a)
          maybeSorted must equal(a.sorted)
        }
      }
    }
  }
}
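One more detail worth noting: the ScalaTest summary ("Total number of tests run: 1") counts test cases, not individual ScalaCheck evaluations, and the original configuration asked for minSuccessful = 1, so a single successful evaluation was enough for the property to pass. A sketch of raising that threshold with the non-deprecated configuration type (assuming ScalaTest 3.0.x):
// Inside the spec: require 100 successful evaluations before the property is considered to pass.
implicit override val generatorDrivenConfig: PropertyCheckConfiguration =
  PropertyCheckConfiguration(minSuccessful = 100, workers = 1)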

Trying to create Unit tests for Slick database for Play web application in Scala

I am struggling to configure unit tests for my Play web application using Scala. I am using Play 2.6 and Scala 2.11.8. When I use the database configuration and execute sbt test, I get the error No implementation for play.api.db.slick.DatabaseConfigProvider was bound on my console. So I am going to show how I set up my web app, and maybe someone can point out what is wrong. For what it's worth, the web application itself works fine; it is just the unit test for the database that I cannot get working.
build.sbt
import play.sbt.PlayImport._
name := """crypto-miners-demo"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.8"
libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += ws
libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test
libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
application.conf:
play.application.loader = di.ApplicationLoader
play.filters.csrf.header.bypassHeaders {
X-Requested-With = "*"
Csrf-Token = "nocheck"
}
play.filters.csrf.bypassCorsTrustedOrigins = false
play.filters.disabled += play.filters.csrf.CSRFFilter
slick.dbs.default.profile = "slick.jdbc.SQLiteProfile$"
slick.dbs.default.db.driver = "org.sqlite.JDBC"
slick.dbs.default.db.url = "jdbc:sqlite:development.db"
slick.dbs.default.db.username = ""
slick.dbs.default.db.password = ""
db.default {
driver = org.sqlite.JDBC
url = "jdbc:sqlite:development.db"
username = ""
password = ""
}
play.modules.disabled += "play.api.db.DBModule"
RackRepositorySpec.scala:
import org.scalatestplus.play.PlaySpec
import org.scalatestplus.play.guice.GuiceOneAppPerTest
import play.api.db.evolutions._
import play.api.db.slick.DatabaseConfigProvider
import play.api.db.{Database, Databases}
import play.api.inject.bind
import play.api.inject.guice.GuiceInjectorBuilder
import play.api.test.Injecting
class RackRepositorySpec extends PlaySpec with GuiceOneAppPerTest with Injecting {

  val database = Databases(
    driver = "org.sqlite.JDBC",
    url = "jdbc:sqlite:development.db",
    name = "default",
    config = Map(
      "username" -> "",
      "password" -> ""
    )
  )

  val guice = new GuiceInjectorBuilder()
    .overrides(bind[Database].toInstance(database))
    .injector()

  val defaultDbProvider = guice.instanceOf[DatabaseConfigProvider]

  def beforeAll() = Evolutions.applyEvolutions(database)

  def afterAll() = {
    // Evolutions.cleanupEvolutions(database)
    database.shutdown()
  }

  Evolution(
    1,
    "create table test (id bigint not null, name varchar(255));",
    "drop table test;"
  )
}
and I get this error when I execute sbt test:
[info] models.RackRepositorySpec *** ABORTED ***
[info] com.google.inject.ConfigurationException: Guice configuration errors:
[info]
[info] 1) No implementation for play.api.db.slick.DatabaseConfigProvider was bound.
[info] while locating play.api.db.slick.DatabaseConfigProvider
[info]
[info] 1 error
[info] at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1045)
[info] at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1004)
[info] at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1054)
[info] at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:409)
[info] at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:404)
[info] at play.api.inject.ContextClassLoaderInjector$$anonfun$instanceOf$2.apply(Injector.scala:117)
[info] at play.api.inject.ContextClassLoaderInjector.withContext(Injector.scala:126)
[info] at play.api.inject.ContextClassLoaderInjector.instanceOf(Injector.scala:117)
[info] at models.RackRepositorySpec.<init>(RackRepositorySpec.scala:26)
[info] at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
I solved it by adding libraryDependencies += specs2 % Test to build.sbt and writing the test with specs2. I hope this is a good practice for Slick:
import org.specs2.mutable.Specification
import play.api.Application
import play.api.test.WithApplicationLoader

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.DurationInt

class RackRepositorySpec extends Specification {

  "RackRepository" should {
    "work as expected" in new WithApplicationLoader {
      val app2dao = Application.instanceCache[RackRepository]
      val rackRepository: RackRepository = app2dao(app)

      Await.result(rackRepository.delete("r-1"), 3 seconds)
      Await.result(rackRepository.delete("r-2"), 3 seconds)
      Await.result(rackRepository.delete("r-3"), 3 seconds)

      val testRacks = Set(
        RackRow("r-1", 0.2F, System.currentTimeMillis()),
        RackRow("r-2", 0.5F, System.currentTimeMillis()),
        RackRow("r-3", 0.8F, System.currentTimeMillis())
      )

      Await.result(Future.sequence(testRacks.map(rackRepository.insert)), 3 seconds)
      val storedRacks = Await.result(rackRepository.list(), 3 seconds)
      storedRacks.toSet must contain(testRacks)
    }
  }
}
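If you prefer to stay with ScalaTest / scalatestplus-play, another option is to let Guice build a full Application so that play-slick binds DatabaseConfigProvider for you, and then pull the repository out of the injector. A sketch, assuming RackRepository takes DatabaseConfigProvider via @Inject and using a hypothetical test database URL:
import org.scalatestplus.play.PlaySpec
import play.api.inject.guice.GuiceApplicationBuilder

class RackRepositoryGuiceSpec extends PlaySpec {
  // Building a real Application makes play-slick bind DatabaseConfigProvider from this configuration.
  private val app = new GuiceApplicationBuilder()
    .configure(
      "slick.dbs.default.profile"   -> "slick.jdbc.SQLiteProfile$",
      "slick.dbs.default.db.driver" -> "org.sqlite.JDBC",
      "slick.dbs.default.db.url"    -> "jdbc:sqlite:test.db" // hypothetical test database
    )
    .build()

  private val rackRepository = app.injector.instanceOf[RackRepository]

  "RackRepository" must {
    "be injectable from a Guice-built Application" in {
      rackRepository must not be null
    }
  }
}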

Why can't sbt resolve akka-remote?

When I build and run this program with sbt, I get the following errors:
[info] Updating {file:/opt/ifkaar/akkaprojects/calculation/}calculation...
[info] Resolving com.typesafe.akka#akka-remote;2.3.4 ...
[warn] module not found: com.typesafe.akka#akka-remote;2.3.4
[warn] ==== local: tried
[warn] /home/sarawaheed/.ivy2/local/com.typesafe.akka/akka-remote/2.3.4/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/com/typesafe/akka/akka-remote/2.3.4/akka-remote-2.3.4.pom
[warn] ==== Typesafe Repository: tried
[warn] http://repo.typesafe.com/typesafe/releases/com/typesafe/akka/akka-remote/2.3.4/akka-remote-2.3.4.pom
[info] Resolving jline#jline;2.11 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.typesafe.akka#akka-remote;2.3.4: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: com.typesafe.akka#akka-remote;2.3.4: not found
[error] Total time: 2 s, completed Jul 17, 2014 3:38:42 PM
Here is my build.sbt file:
name := "calculation"
version := "1.0"
scalaVersion := "2.11.1"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.4,"com.typesafe.akka" %% "akka-remote" % "2.3.4""
Here is my PI.scala code:
import akka.actor._
import akka.routing.RoundRobinRouter
import akka.util.Duration
import akka.util.duration._
object Pi extends App {
calculate(nrOfWorkers = 4, nrOfElements = 10000, nrOfMessages = 10000)
sealed trait PiMessage
case object Calculate extends PiMessage
case class Work(start: Int, nrOfElements: Int) extends PiMessage
case class Result(value: Double) extends PiMessage
case class PiApproximation(pi: Double, duration: Duration)
class Worker extends Actor {
def calculatePiFor(start: Int, nrOfElements: Int): Double = {
var acc = 0.0
for (i ← start until (start + nrOfElements))
acc += 4.0 * (1 - (i % 2) * 2) / (2 * i + 1)
acc
}
def receive = {
case Work(start, nrOfElements) ⇒
sender ! Result(calculatePiFor(start, nrOfElements)) // perform the work
}
}
class Master(nrOfWorkers: Int, nrOfMessages: Int, nrOfElements: Int, listener: ActorRef)
extends Actor {
var pi: Double = _
var nrOfResults: Int = _
val start: Long = System.currentTimeMillis
val workerRouter = context.actorOf(
Props[Worker].withRouter(RoundRobinRouter(nrOfWorkers)), name = "workerRouter")
def receive = {
case Calculate ⇒
for (i ← 0 until nrOfMessages) workerRouter ! Work(i * nrOfElements, nrOfElements)
case Result(value) ⇒
pi += value
nrOfResults += 1
if (nrOfResults == nrOfMessages) {
// Send the result to the listener
listener ! PiApproximation(pi, duration = (System.currentTimeMillis - start).millis)
// Stops this actor and all its supervised children
context.stop(self)
}
}
}
class Listener extends Actor {
def receive = {
case PiApproximation(pi, duration) ⇒
println("\n\tPi approximation: \t\t%s\n\tCalculation time: \t%s"
.format(pi, duration))
context.system.shutdown()
}
}
def calculate(nrOfWorkers: Int, nrOfElements: Int, nrOfMessages: Int) {
// Create an Akka system
val system = ActorSystem("PiSystem")
// create the result listener, which will print the result and shutdown the system
val listener = system.actorOf(Props[Listener], name = "listener")
// create the master
val master = system.actorOf(Props(new Master(
nrOfWorkers, nrOfMessages, nrOfElements, listener)),
name = "master")
// start the calculation
master ! Calculate
}
}
Note: I'm following this tutorial.
The libraryDependencies entry in your build.sbt file should look like this:
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.3.4",
"com.typesafe.akka" %% "akka-remote" % "2.3.4"
)
The ++= operator is a kind of list-concatenation operator: it tells sbt to append this collection of dependencies to the existing ones, whatever they may be.
The += operator appends a single item, as opposed to concatenating a collection. For example:
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.4"
libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.3.4"
The %% operator is shorthand for explicitly specifying the Scala version that the required libraries should be compiled against; there is a different artifact name for each Scala binary version. For example, instead of writing:
"com.typesafe.akka" % "akka-actor_2.11" % "2.3.4"
You write:
"com.typesafe.akka" %% "akka-actor" % "2.3.4"
The declared scalaVersion setting is used to "expand" the %% operator. Note that only the X.Y numbers from the full X.Y.Z version are used, because Scala releases are binary compatible when only Z changes, but not when X or Y change.
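For instance, with the scalaVersion declared in this build, the following two lines resolve to the same artifact (a small sketch):
scalaVersion := "2.11.1"

// %% derives the "_2.11" suffix from scalaVersion:
libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.3.4"
// ...which is equivalent to spelling the suffix out by hand:
libraryDependencies += "com.typesafe.akka" % "akka-remote_2.11" % "2.3.4"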
You may also eventually end up with the following to avoid duplication:
libraryDependencies ++= Seq("actor", "remote").map("akka-" + _).map("com.typesafe.akka" %% _ % "2.3.4")
And since akka-remote depends on akka-actor, the following alone suffices:
libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.3.4"

jacoco:cover from play console gives NoClassDefFoundError: Could not initialize class com.sun.xml.internal.ws.api.BindingID

Running jacoco:cover from the Play console using jacoco4sbt (2.1.4) results in many tests failing and messages like:
[debug] Running TaskDef(com.ourCompany.ourProject.identity.LoginControllerSpec, org.scalatest.tools.Framework$$anon$1#5b98f69a, false, [SuiteSelector])
java.lang.NoClassDefFoundError: Could not initialize class com.sun.xml.internal.ws.api.BindingID
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parseBinding(RuntimeWSDLParser.java:445)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parseWSDL(RuntimeWSDLParser.java:342)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:157)
at com.sun.xml.internal.ws.wsdl.parser.RuntimeWSDLParser.parse(RuntimeWSDLParser.java:120)
at com.sun.xml.internal.ws.client.WSServiceDelegate.parseWSDL(WSServiceDelegate.java:257)
at com.sun.xml.internal.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:220)
at com.sun.xml.internal.ws.client.WSServiceDelegate.<init>(WSServiceDelegate.java:168)
at com.sun.xml.internal.ws.spi.ProviderImpl.createServiceDelegate(ProviderImpl.java:96)
at javax.xml.ws.Service.<init>(Service.java:77)
at com.bsl.Services.<init>(Services.java:46)
at com.ourCompany.ourProject.identity.UserRepositoryComponent$OdsUserRepository.<init>(UserRepositoryComponent.scala:92)
at com.ourCompany.ourProject.identity.ComponentRegistry$class.$init$(ComponentRegistry.scala:7)
at com.ourCompany.ourProject.identity.LoginControllerSpec$TestLoginController$.<init>(LoginControllerSpec.scala:20)
at com.ourCompany.ourProject.identity.LoginControllerSpec.TestLoginController$lzycompute(LoginControllerSpec.scala:20)
at com.ourCompany.ourProject.identity.LoginControllerSpec.TestLoginController(LoginControllerSpec.scala:20)
at com.ourCompany.ourProject.identity.LoginControllerSpec$$anonfun$1.apply$mcV$sp(LoginControllerSpec.scala:32)
at com.ourCompany.ourProject.identity.LoginControllerSpec$$anonfun$1.apply(LoginControllerSpec.scala:32)
at com.ourCompany.ourProject.identity.LoginControllerSpec$$anonfun$1.apply(LoginControllerSpec.scala:32)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1636)
at org.scalatest.Suite$class.withFixture(Suite.scala:1121)
at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1633)
at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1645)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1645)
at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1703)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1703)
at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
at org.scalatest.Suite$class.run(Suite.scala:1423)
at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1749)
at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1749)
at com.ourCompany.ourProject.identity.LoginControllerSpec.org$scalatest$BeforeAndAfterAll$$super$run(LoginControllerSpec.scala:11)
at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
at com.ourCompany.ourProject.identity.LoginControllerSpec.run(LoginControllerSpec.scala:11)
at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:444)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:651)
at sbt.TestRunner.runTest$1(TestFramework.scala:84)
at sbt.TestRunner.run(TestFramework.scala:94)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:224)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:224)
at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:212)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:224)
at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:224)
at sbt.TestFunction.apply(TestFramework.scala:229)
at sbt.Tests$.sbt$Tests$$processRunnable$1(Tests.scala:211)
at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:217)
at sbt.Tests$$anonfun$makeSerial$1.apply(Tests.scala:217)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:45)
at sbt.std.Transform$$anon$4.work(System.scala:64)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:244)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
[error] Could not run test com.ourCompany.ourProject.identity.LoginControllerSpec:
java.lang.NoClassDefFoundError: Could not initialize class com.sun.xml.internal.ws.api.BindingID
LoginControllerSpec:
package com.ourCompany.ourProject.identity
import org.scalatest.{ Matchers, BeforeAndAfterAll, FlatSpec }
import play.api.test.{ FakeApplication, FakeRequest }
import com.ourCompany.ourProject.tags.UnitTest
import play.api.Play
import org.specs2.mock.Mockito
import play.api.test.Helpers._
@UnitTest
class LoginControllerSpec extends FlatSpec with BeforeAndAfterAll with Mockito with Matchers {
override def beforeAll() {
Play.start(FakeApplication())
}
override def afterAll() {
Play.stop()
}
object TestLoginController extends LoginController with Secured with ComponentRegistryMock
val Home = "/"
val username = "XXX#ourCompanyprofessional.com"
val password = "XXX1"
val isRememberMe = false
val expirationPolicy = SessionExpirationPolicy
val odsUser = OdsUser(Identity(username, ""), Some(username))
behavior of "LoginController"
it should "send 200 for renderLoginForm with not authenticated user" in {
status(TestLoginController.renderLoginForm(FakeRequest())) should be(OK)
}
it should "send 303 for renderLoginForm with authenticated user" in {
val cookie = TestLoginController.authenticatorService.create(odsUser.identity, expirationPolicy) match {
case Right(a: Authenticator) => a.toCookie
case _ => fail()
}
TestLoginController.userRepository.find(odsUser.identity.userId) returns Some(odsUser)
val result = TestLoginController.renderLoginForm(FakeRequest().withCookies(cookie))
redirectLocation(result) should be(Some(Home))
status(result) should be(SEE_OTHER)
}
it should "send 303 for login with valid credentials" in {
val fakeRequest = FakeRequest().withFormUrlEncodedBody(
UsernameField -> username, PasswordField -> password, RememberMeField -> isRememberMe.toString)
TestLoginController.userRepository.login(username, password) returns Right(odsUser)
val result = TestLoginController.login(fakeRequest)
cookies(result).get(Authenticator.cookieName) match {
case Some(c) => assert(c.name == Authenticator.cookieName && !c.value.isEmpty)
case _ => fail()
}
redirectLocation(result) should be(Some(Home))
status(result) should be(SEE_OTHER)
}
it should "send 400 for login with invalid credentials" in {
val fakeRequest = FakeRequest().withFormUrlEncodedBody(
UsernameField -> username, PasswordField -> password, RememberMeField -> isRememberMe.toString)
TestLoginController.userRepository.login(username, password) returns Left(new Error)
val result = TestLoginController.login(fakeRequest)
status(result) should be(BAD_REQUEST)
}
it should "send 400 for login with invalid form input" in {
val fakeRequest = FakeRequest().withFormUrlEncodedBody(
UsernameField -> "", PasswordField -> "", RememberMeField -> "")
val result = TestLoginController.login(fakeRequest)
status(result) should be(BAD_REQUEST)
}
}
build.sbt
gitHeadCommitSha in ThisBuild := Process("git rev-parse HEAD").lines.head
(testOptions in Test) += Tests.Argument(TestFrameworks.ScalaTest, "-h", "target/report")
org.scalastyle.sbt.ScalastylePlugin.Settings
scalacOptions := Seq("-feature")
build.scala
import com.typesafe.sbt.SbtNativePackager._
import com.typesafe.sbt.SbtScalariform._
import play.Project._
import sbt.Keys._
import sbt._
import sbtbuildinfo.Plugin._
import scala._
import scala.util.Try
import scala.Some
import de.johoop.jacoco4sbt.JacocoPlugin._
object BuildSettings {
val buildOrganization = "com.ourCompany.ourProject"
val buildVersion = "0.1-SNAPSHOT"
val buildScalaVersion = "2.10.2"
val envConfig = "-Djava.awt.headless=true -Dsbt.log.format=false -Dconfig.file=" +
Option(System.getProperty("env.config")).getOrElse("local.application")
val maxErrors = 20
// disable running browserstack tests by default. Possible options: true | false [default]
val browserstack: Boolean = Try(System.getProperty("test.browserstack").toBoolean).getOrElse(false)
val buildSettings = Defaults.defaultSettings ++ Seq (
organization := buildOrganization,
version := buildVersion,
scalaVersion := buildScalaVersion,
scalacOptions ++= Seq("-unchecked", "-optimise", "-deprecation",
"-Xcheckinit", "-encoding", "utf8", "-feature", "-Yinline-warnings",
"-Xfatal-warnings"),
javaOptions ++= Seq("-Xms512M","-Xmx1536M", "-Xss1M", "-XX:ReservedCodeCacheSize=192M",
"-XX:+CMSClassUnloadingEnabled", "-XX:MaxPermSize=512M"),
javaOptions += envConfig,
publishMavenStyle := false
)
}
object Resolvers {
val remoteRepoUrl = "ourCompany Nexus Snapshots" at "http://nexus.ci.bln.ourCompany-xxx.com/content/repositories/snapshots/"
val publishRepoUrl = "ourCompany Nexus Snapshots" at "http://nexus.ci.bln.ourCompany-xxx.com/content/repositories/snapshots/"
val releaseRepoUrl = "ourCompany Nexus Releases" at "http://nexus.ci.bln.ourCompany-xxx.com/content/repositories/releases/"
}
object Dependencies {
val ods = "de.bsmo.ourCompany-professional" % "sprprof-ws" % "2.2.1-SNAPSHOT"
val scalatest = "org.scalatest" %% "scalatest" % "2.0" % "test->*" withSources()
val mockito = "org.mockito" % "mockito-all" % "1.9.5" % "test"
val ghostDriver = "com.github.detro.ghostdriver" % "phantomjsdriver" % "1.0.3" % "test"
val cmsClient = "com.ourCompany.cms.ws.clients" % "ourCompany-cms-java-api" % "1.1.6"
val solrjClient = "org.apache.solr" % "solr-solrj" % "4.3.1" % "compile"
}
object ApplicationBuild extends Build {
import BuildSettings._
import Dependencies._
import Resolvers._
// Sub-project specific dependencies
val commonDeps = Seq(
ods,
scalatest,
mockito,
ghostDriver,
cmsClient,
jdbc,
anorm,
filters,
solrjClient,
cache
)
//val bN = settingKey[Int]("current build Number")
val gitHeadCommitSha = settingKey[String]("current git commit SHA")
val release = settingKey[Boolean]("Release")
lazy val ourProject = play.Project(
"ourProject",
path = file("."),
settings = Defaults.defaultSettings ++ buildSettings ++
Seq(libraryDependencies ++= commonDeps) ++
Seq(scalariformSettings: _*) ++
Seq(playScalaSettings: _*) ++
Seq(publishArtifact := false) ++
buildInfoSettings ++
jacoco.settings ++
Seq(
sourceGenerators in Compile <+= buildInfo,
buildInfoKeys ++= Seq[BuildInfoKey](
resolvers,
libraryDependencies in Test,
buildInfoBuildNumber,
BuildInfoKey.map(name) { case (k, v) => "project" + k.capitalize -> v.capitalize },
"envConfig" -> envConfig, // computed at project load time
BuildInfoKey.action("buildTime") {
System.currentTimeMillis
} // re-computed each time at compile
),
buildInfoPackage := "com.ourCompany.ourProject"
) ++
Seq(resolvers += remoteRepoUrl) ++
Seq(resolvers += releaseRepoUrl) ++
Seq(mappings in Universal ++= Seq(
file("ops/rpm/start-server.sh") -> "start-server.sh",
file("ops/rpm/stop-server.sh") -> "stop-server.sh"
)) ++
Seq(testOptions in Test += Tests.Argument(if(browserstack) "-n" else "-l", "com.ourCompany.ourProject.tags.BrowserStackTest"))
).settings(version <<= version in ThisBuild)
.settings(parallelExecution in jacoco.Config := false)
.settings({
if(browserstack) {
javaOptions in Test += "-Dconfig.file=conf/browserstack.application.conf"
} else {
javaOptions in Test += "-Dtest.none=true"
}
})
lazy val ourProjectPackaging = Project(
"packaging",
file("ourProjectPackaging"), settings=
Defaults.defaultSettings ++
Seq(Packaging.settings:_*) ++
Seq(resolvers += publishRepoUrl) ++
buildSettings ++
publishSetting ++
Seq(publishArtifact := false) ++
credentialsSetting
).settings(Packaging.rpmDistSettings: _*).settings(version <<= version in Rpm )
lazy val credentialsSetting = credentials += {
Seq("NEXUS_USER", "NEXUS_PASSWORD").map(k => Option(System.getenv(k))) match {
case Seq(Some(user), Some(pass)) =>
Credentials("Sonatype Nexus Repository Manager",
"nexus.ci.bln.ourCompany-xxx.com", user, pass)
case _ =>
Credentials(Path.userHome / ".ivy2" / ".credentials")
}
}
lazy val publishSetting = publishTo <<= version.apply{
v =>
val nexus = "http://nexus.ci.bln.ourCompany-xxx.com/"
if (v.trim.endsWith("SNAPSHOT"))
Some("snapshots" at nexus + "content/repositories/snapshots")
else
Some("releases" at nexus + "content/repositories/snapshots")
}
}
Sequence of commands:
Starting Play Console (play)
clean
jacoco:cover
Details
Version of Play: play 2.2.1 built with Scala 2.10.2 (running Java 1.7.0_45)
Play about:
[info] This is sbt 0.13.0
[info] The current project is {file:/home/schl14/work/ProjektName/}ProjektName 0.1-b012d0e6a2c5b4a746490c0d34856af5e7d09bb9
[info] The current project is built against Scala 2.10.2
[info] Available Plugins: play.Project, com.typesafe.sbteclipse.plugin.EclipsePlugin, org.sbtidea.SbtIdeaPlugin, com.typesafe.sbt.SbtNativePackager, com.typesafe.sbt.SbtScalariform, sbtbuildinfo.Plugin, org.scalastyle.sbt.ScalastylePlugin, de.johoop.jacoco4sbt.JacocoPlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.2
Question
It worked previously (as in, many commits ago) and I have failed to pinpoint which changes to the source code made jacoco:cover fail. What could be the cause of this behaviour?
Once this is solved I will cut down on the amount of unnecessary info posted here! Currently it's hard to determine what needs to be provided.