Request was not handled with spray-testkit - scalacheck

My service route:
get(
  path("add" / IntNumber / IntNumber) { (a, b) =>
    complete((a + b).toString)
  }
) ~
post(
  path("add")(
    formFields('a.as[Int], 'b.as[Int]) { (a, b) =>
      complete((a + b).toString)
    }
  )
)
My spec:
import spray.http.FormData

class RouteDefinitionSpec
    extends org.specs2.mutable.Specification
    with org.specs2.ScalaCheck
    with spray.testkit.Specs2RouteTest
    with RouteDefinition {

  def actorRefFactory = system

  "the route" should {
    "add with get requests" in {
      prop { (a: Int, b: Int) =>
        Get(s"/add/$a/$b") ~> route ~> check {
          responseAs[String] === s"${a + b}"
        }
      }
    }
    "add with post form data request" in {
      prop { (a: Int, b: Int) =>
        Post("/add", FormData(Seq("a" -> a.toString, "b" -> b.toString))) ~> route ~> check {
          responseAs[String] === s"${a + b}"
        }
      }
    }
  }
}
Both the GET and the POST route work properly when tested in a browser, and the POST route also works in the test.
What is wrong with my GET route? Why can it not be tested? What causes such an error, and how do I avoid it?
[info] RouteDefinitionSpec
[info]
[info] the route should
[error] x add with get requests
[error] Falsified after 0 passed tests.
[error] > ARG_0: 2147483647
[error] > ARG_1: -2147483648
[error] > Request was not handled (RouteDefinitionSpec.scala:5)
[info]
[info] + add with post form data request
[info]
[info]
[info] Total for specification RouteDefinitionSpec
[info] Finished in 393 ms
[info] 2 examples, 102 expectations, 1 failure, 0 error
UPDATE:
it seems to have something to do with ScalaCheck, because the following non-property-based test is also "green":
"add test without scalacheck" in {
  Get("/add/30/58") ~> route ~> check {
    responseAs[String] === "88"
  }
}

Related

ThreadPoolExecution error with scala, slick, specs2

I am trying to write a DB unit test for my module, using Slick and specs2 in Scala. I have written two tests so far, but the second one fails with a ThreadPoolExecutor rejection error.
I have tried many approaches but still can't manage to solve the problem.
Any help would be greatly appreciated.
Here is the test code:
class UserClientSpecs extends Specification {
  "App should " should {
    "add value to database" in new WithApplication {
      println("before first test")
      val recordEntry = new UserClient(None, "Lohs_atkal", 2)
      val newRecord = UserService.addUser(recordEntry)
      newRecord.onComplete {
        case Success(value) => println(s"Got the callback, meaning = $value")
        case Failure(e) => println(e.getMessage)
      }
      newRecord should not beNull
      val count = UserService.listAllUsers.map { v =>
        println("the number of database entries are " + v.length)
      }
    }
    "delete a record" in new WithApplication {
      println("before second test")
      val recordEntry = new UserClient(Some(0), "Lielaks Lohs", 5)
      val newRecord = UserService.addUser(recordEntry)
      newRecord.map { v =>
        println("newRecord value", v)
      }
      newRecord.onComplete {
        case Success(value) => println(s"Got the callback, meaning = $value")
        case Failure(e) => println(e.getMessage)
      }
      val recordEntry2 = new UserClient(Some(1), "Lielaks Lohs2", 50)
      val newRecord2 = UserService.addUser(recordEntry2)
      val countOne = UserService.listAllUsers.map { res =>
        println(res.length)
      }
      val deleteUser = UserService.deleteUser(1)
      val countTwo = UserService.listAllUsers.map { res =>
        res should_== (1)
        res should !==(2)
      }
    }
  }
}
And here is the error I get when running the test through sbt testOnly:
[play-scala] $ testOnly models.UserClientSpecs
[info] UserClientSpecs
[info]
before first test
Got the callback, meaning = Some(UserClient(None,Lohs_atkal,2))
the number of database entries are 1
[info] App should should
[info] + add value to database
before second test
Task slick.backend.DatabaseComponent$DatabaseDef$$anon$2@6d01394 rejected from java.util.concurrent.ThreadPoolExecutor@7fa5be9b[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 2]
[info] + delete a record
[info]
[info] Total for specification UserClientSpecs
[info] Finished in 2 seconds, 739 ms
[info] 2 examples, 0 failure, 0 error
[info]
[info] Passed: Total 2, Failed 0, Errors 0, Passed 2
[success] Total time: 8 s, completed Mar 10, 2016 1:40:10 PM

Race condition in Slick Code

I have written this slick DAO and its unit test in specs2.
My code has race conditions: when I run the same tests, I get different outputs.
The race conditions exist even though both functions block with Await.result(future, Duration.Inf).
DAO
package com.example

import slick.driver.H2Driver.api._

import scala.concurrent._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

case class Person(id: Int, firstname: String, lastname: String)

class People(tag: Tag) extends Table[Person](tag, "PEOPLE") {
  def id = column[Int]("PERSON_ID", O.PrimaryKey)
  def firstname = column[String]("PERSON_FIRST_NAME")
  def lastname = column[String]("PERSON_LAST_NAME")
  def * = (id, firstname, lastname) <> (Person.tupled, Person.unapply _)
}

object PersonDAO {
  private def createList(numRows: Int): List[Person] = {
    def recFunc(counter: Int, result: List[Person]): List[Person] =
      counter match {
        case x if x <= numRows =>
          recFunc(counter + 1, Person(counter, "test" + counter, "user" + counter) :: result)
        case _ => result
      }
    recFunc(1, List[Person]())
  }

  val db = Database.forConfig("test1")
  val people = TableQuery[People]

  def createAndPopulate(numRows: Int) = {
    val action1 = people.schema.create
    val action2 = people ++= Seq(createList(numRows): _*)
    val combined = db.run(action1 andThen action2)
    val future1 = combined.map { result =>
      result map { x =>
        println(s"number of rows inserted $x")
        x
      }
    }
    Await.result(future1, Duration.Inf).getOrElse(0)
  }

  def printAll() = {
    val a = people.result
    val b = db.run(a)
    val y = b map { result =>
      result map { x => x }
    }
    val z = Await.result(y, Duration.Inf)
    println(z)
    println(z.length)
    z
  }
}
Unit Test
import org.specs2.mutable._
import com.example._

class HelloSpec extends Specification {
  "This usecase " should {
    "should insert rows " in {
      val x = PersonDAO.createAndPopulate(100)
      x === 100
    }
  }
  "This usecase " should {
    "return 100 rows" in {
      val x = PersonDAO.printAll()
      val y = PersonDAO.printAll()
      y.length === 100
    }
  }
}
When I run this same code with activator test, I see two different kinds of output on different runs.
Sometimes the code gets an exception:
number of rows inserted 100
[info] HelloSpec
[info]
[info] This usecase should
[info] + should insert rows
[info]
[info] This usecase should
[info] ! return 100 rows
[error] JdbcSQLException: : Table PEOPLE not found; SQL statement:
[error] select x2."PERSON_ID", x2."PERSON_FIRST_NAME", x2."PERSON_LAST_NAME" from "PEOPLE" x2 [42S02-60] (Message.java:84)
[error] org.h2.message.Message.getSQLException(Message.java:84)
[error] org.h2.message.Message.getSQLException(Message.java:88)
[error] org.h2.message.Message.getSQLException(Message.java:66)
Sometimes the first function call returns 0 rows and the second call returns 100 values:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
number of rows inserted 100
Vector()
0
Vector(Person(100,test100,user100), Person(99,test99,user99), Person(98,test98,user98), Person(97,test97,user97), Person(96,test96,user96), Person(95,test95,user95), Person(94,test94,user94), Person(93,test93,user93), Person(92,test92,user92), Person(91,test91,user91), Person(90,test90,user90), Person(89,test89,user89), Person(88,test88,user88), Person(87,test87,user87), Person
I don't understand why my code has these race conditions, since I block on the future in each method.
Your assumption that the two test cases run serially, one after the other, is not right. The test cases run in parallel. Just add sequential to the specification to verify that that is the case.
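The effect of sequential can be sketched with plain Scala and shared mutable state (a hypothetical stand-in for the database; nothing below is specs2 or Slick API):

```scala
import java.util.concurrent.atomic.AtomicInteger

object SequentialDemo {
  // Shared mutable state standing in for the H2 database the two tests touch.
  val table = new AtomicInteger(0)

  def insertRows(n: Int): Int = table.addAndGet(n)
  def countRows(): Int = table.get()

  // Running the two "tests" one after the other -- which is what adding
  // `sequential` to the specification enforces -- guarantees the second
  // step observes the first step's writes.
  def runSequentially(): Int = {
    table.set(0)
    insertRows(100)
    countRows()
  }
}
```

Under the default parallel execution, the count step may run before (or during) the insert, which is exactly why the same tests print different results on different runs.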

Some(null) spray json deserialization error

I'm getting an HTTP response with the following JSON object:
{
  "transaction_hash": "fbb36255453bf8ff465d9ca5c427bd0e36cc799fda090cbcd62113f1f3e97cb4",
  "output_index": 0,
  "value": 2000000,
  "asset_id": null,
  "asset_quantity": null,
  "addresses": [
    "1C4kYhyLftmkn48YarSoLupxHfYFo8kp64"
  ],
  "script_hex": "76a914795efb808598d6a24d1734b929fce1d4b713215188ac",
  "spent": false,
  "confirmations": 72935
}
and this is how I read the JSON object into native Scala objects:
override def read(value: JsValue): UnspentTXO = {
  val jsObject = value.asJsObject
  // get only non-optional values here
  val Seq(transaction_hash, output_index, locked_satoshies, addresses, script_hex, spent) =
    jsObject.getFields("transaction_hash", "output_index", "value", "addresses",
      "script_hex", "spent")
  println("Asset Id: " + jsObject.fields.get("asset_id"))
  val assetId = jsObject.fields.get("asset_id") match {
    case Some(JsString(s)) => println("S : " + s); Some(s)
    case None => None
  }
  val assetQuantity = jsObject.fields.get("asset_quantity") match {
    case Some(JsNumber(n)) => Some(n.toLong)
    case None => None
  }
  // convert JsArray to List[BitcoinAddress]
  val addressList = addresses match {
    case ja: JsArray =>
      ja.elements.toList.map(e => BitcoinAddress(e.convertTo[String]))
  }
  UnspentTXO(transaction_hash.convertTo[String], output_index.convertTo[Int],
    locked_satoshies.convertTo[Long], assetId, assetQuantity,
    addressList, script_hex.convertTo[String], spent.convertTo[Boolean])
}
Finally, here is the error message:
[info] - must create an unsigned nlocktime for all of the bitcoin in an address *** FAILED ***
[info] spray.httpx.PipelineException: Some(null) (of class scala.Some)
[info] at spray.httpx.ResponseTransformation$$anonfun$unmarshal$1.apply(ResponseTransformation.scala:36)
[info] at spray.httpx.ResponseTransformation$$anonfun$unmarshal$1.apply(ResponseTransformation.scala:31)
[info] at scala.util.Success$$anonfun$map$1.apply(Try.scala:236)
[info] at scala.util.Try$.apply(Try.scala:191)
[info] at scala.util.Success.map(Try.scala:236)
[info] at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
[info] at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
[info] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
[info] at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
[info] at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
[info] ...
[info] Cause: scala.MatchError: Some(null) (of class scala.Some)
[info] at com.coinprism.blockchain.UnspentTXOProtocol$UnspentTXOProtocolFormat$.read(UnspentTXOProtocol.scala:30)
[info] at com.coinprism.blockchain.UnspentTXOProtocol$UnspentTXOProtocolFormat$.read(UnspentTXOProtocol.scala:21)
[info] at spray.json.JsValue.convertTo(JsValue.scala:31)
[info] at spray.json.CollectionFormats$$anon$1$$anonfun$read$1.apply(CollectionFormats.scala:28)
[info] at spray.json.CollectionFormats$$anon$1$$anonfun$read$1.apply(CollectionFormats.scala:28)
[info] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
[info] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
[info] at scala.collection.Iterator$class.foreach(Iterator.scala:743)
[info] at scala.collection.AbstractIterator.foreach(Iterator.scala:1195)
[info] at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
But I am getting an error saying that I am trying to read asset_id as Some(null). I thought that with spray-json, null values were deserialized to the native Scala None? Where am I going wrong here?
Why don't you use DefaultJsonProtocol for the parsing, instead of doing everything manually? Even though your protocol has some quirks, such as the special formatting for integer values, spray-json can handle them for you.
You can define a case class describing your model and use spray's automatic JSON mapping:
case class SomeReply(
  transaction_hash: String,
  output_index: Long,
  value: Long,
  asset_id: Option[Long],
  asset_quantity: Option[Long],
  addresses: List[String],
  script_hex: String,
  spent: Boolean,
  confirmations: Long
)

object SomeReplyProtocol extends DefaultJsonProtocol {
  // custom format for (string|int) => number conversion
  implicit object StringNumFormat extends RootJsonFormat[Long] {
    val numberPattern = "([0-9]+)".r
    def write(value: Long) = JsNumber(value)
    def read(value: JsValue) = value match {
      case JsString(numberPattern(number)) => number.toLong
      case JsNumber(number) => number.toLong
      case _ => deserializationError("Cannot deserialize StringNumber")
    }
  }
  implicit val replyFmt = jsonFormat9(SomeReply)
}
Then you can use this code like so:
object Main {
  val jsonString = """{
    "transaction_hash": "fbb36255453bf8ff465d9ca5c427bd0e36cc799fda090cbcd62113f1f3e97cb4",
    "output_index": 0,
    "value": 2000000,
    "asset_id": null,
    "asset_quantity": null,
    "addresses": [
      "1C4kYhyLftmkn48YarSoLupxHfYFo8kp64"
    ],
    "script_hex": "76a914795efb808598d6a24d1734b929fce1d4b713215188ac",
    "spent": false,
    "confirmations": 72935
  }""" // value is a number

  val jsonStringQuirk = """{
    "transaction_hash": "fbb36255453bf8ff465d9ca5c427bd0e36cc799fda090cbcd62113f1f3e97cb4",
    "output_index": 0,
    "value": "2000000",
    "asset_id": null,
    "asset_quantity": null,
    "addresses": [
      "1C4kYhyLftmkn48YarSoLupxHfYFo8kp64"
    ],
    "script_hex": "76a914795efb808598d6a24d1734b929fce1d4b713215188ac",
    "spent": false,
    "confirmations": 72935
  }""" // value is a String here!!

  def main(args: Array[String]): Unit = {
    import SomeReplyProtocol._
    val json = jsonString.parseJson.convertTo[SomeReply]
    val jsonQ = jsonStringQuirk.parseJson.convertTo[SomeReply]
    println(json)  // null mapped to None, yay!
    println(jsonQ) // implicit string -> int conversion here
  }
}
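As an aside on why the original code blew up: spray-json parses a JSON null to the JsNull value, so jsObject.fields.get("asset_id") yields Some(JsNull), which neither of the two cases in the question's match covers. The failure mode can be sketched without spray at all, using a tiny stand-in ADT (the Js* names below merely mirror spray-json's; they are not the real types):

```scala
object MatchErrorDemo {
  // Tiny stand-in for spray-json's JsValue hierarchy.
  sealed trait Js
  case object JsNull extends Js
  final case class JsString(value: String) extends Js

  // Mirrors the question's match: there is no case for a present-but-null
  // field, so Some(JsNull) throws scala.MatchError at runtime.
  def readAssetIdBroken(field: Option[Js]): Option[String] = field match {
    case Some(JsString(s)) => Some(s)
    case None              => None
  }

  // Covering the null case (or using a wildcard) fixes it.
  def readAssetId(field: Option[Js]): Option[String] = field match {
    case Some(JsString(s)) => Some(s)
    case Some(JsNull)      => None
    case None              => None
  }
}
```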

Not all akka stream Sinks receive the emitted data

When running the following akka-streams FlowGraph, not all of the emitted Chars are received by all of the Sinks.
package sample.stream

import akka.actor.ActorSystem
import akka.stream.ActorFlowMaterializer
import akka.stream.scaladsl.{ Broadcast, FlowGraph, Sink, Source }
import scala.util.{ Failure, Success }

object Sample {
  def main(args: Array[String]): Unit = {
    println("start")
    implicit val system = ActorSystem("Sys")
    import system.dispatcher
    implicit val materializer = ActorFlowMaterializer()

    var counter = -1
    val countSource: Source[Char, Unit] =
      Source(() => Iterator.continually { counter += 1; (counter + 'A').toChar }.take(11))

    var counter1 = 0
    val consoleSink1 = Sink.foreach[Char] { c =>
      println("sink1:" + counter1 + ":" + c)
      counter1 += 1
      Thread.sleep(100)
      //Thread.sleep(300)
    }

    var counter2 = 0
    val consoleSink2 = Sink.foreach[Char] { c =>
      println("sink2:" + counter2 + ":" + c)
      counter2 += 1
      Thread.sleep(200)
    }

    val materialized = FlowGraph.closed(consoleSink1, consoleSink2)((x1, x2) => x1) { implicit builder =>
      (console1, console2) =>
        import FlowGraph.Implicits._
        val broadcast = builder.add(Broadcast[Char](2))
        countSource ~> broadcast ~> console1
        broadcast ~> console2
    }.run()

    // shut the system down upon completion
    materialized.onComplete {
      case Success(_) =>
        system.shutdown()
      case Failure(e) =>
        println(s"Failure: ${e.getMessage}")
        system.shutdown()
    }

    println("waiting the remaining ones")
    //scala.concurrent.Await.ready(materialized, scala.concurrent.duration.DurationInt(100).seconds)
    //system.shutdown()
    println("end")
  }
}
After running it, the following output is generated:
[info] Running sample.stream.Sample
[info] start
[info] waiting the remaining ones
[info] end
[info] sink2:0:A
[info] sink1:0:A
[info] sink1:1:B
[info] sink1:2:C
[info] sink2:1:B
[info] sink1:3:D
[info] sink2:2:C
[info] sink1:4:E
[info] sink1:5:F
[info] sink2:3:D
[info] sink1:6:G
[info] sink1:7:H
[info] sink2:4:E
[info] sink2:5:F
[info] sink1:8:I
[info] sink1:9:J
[info] sink2:6:G
[info] sink2:7:H
[info] sink1:10:K
The second sink doesn't receive the 8th, 9th and 10th values (I, J, K), yet the entire flow still ends.
What should I do to wait for both Sinks to consume all of the data?
I discovered that if I change (x1, x2) => x1 to (x1, x2) => x2, it does wait. The same happens if I sleep 300 ms in the first sink.
The function that you pass in the second parameter list of FlowGraph.closed determines which materialized value is returned when you run the flow. So when you pass in (x1, x2) => x1, you get back the future that completes when the first sink has received all of its elements, and the callback on that future then shuts down the actor system without the second sink having had a chance to receive all of its elements.
Instead, you should get both futures out and shut the system down only when both of them have completed.
You can see this approach used in some of the akka-stream tests.
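A sketch of that pattern with plain Futures (the names below are placeholders; with the FlowGraph API from the question you would keep both materialized values via (x1, x2) => (x1, x2) and combine them the same way):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object BothSinksDemo {
  // done1 and done2 stand in for the two materialized Future[Unit] values of
  // the two sinks. `zip` only completes once BOTH have completed, so a
  // shutdown hooked onto it can never cut off the slower sink.
  def whenBothDone(done1: Future[Unit], done2: Future[Unit]): Future[Unit] =
    done1.zip(done2).map(_ => ())

  // Blocking helper for the sketch; in real code you would use onComplete
  // on the combined future to trigger system.shutdown().
  def awaitBoth(done1: Future[Unit], done2: Future[Unit]): Boolean = {
    Await.result(whenBothDone(done1, done2), 5.seconds)
    true
  }
}
```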

How to start Play application before tests and then shut it down in specs2?

My goal is to get an application running and to execute multiple tests against the same instance of the app.
I have tried this solution without much luck. Here is my test:
class ApplicationSpec extends Specification {
  sequential

  object AppWithTestDb2 extends FakeApplication(additionalConfiguration =
    Map(
      "db.default.driver" -> "org.h2.Driver",
      "db.default.url" -> (
        // "jdbc:h2:mem:play-test-" + scala.util.Random.nextInt + // in-memory database
        "jdbc:h2:/tmp/play-test-" + scala.util.Random.nextInt + // file-based db that can be accessed using h2-browse in activator
          ";MODE=PostgreSQL;DATABASE_TO_UPPER=false;DB_CLOSE_DELAY=-1")
    ))

  running(AppWithTestDb2) {
    "Application" should {
      "send 404 on a bad request" in {
        route(FakeRequest(GET, "/boum")) must beNone
      }
      "post signup request" in {
        val res = route(FakeRequest(POST, "/user", FakeHeaders(), TestData.newUser)).get
        status(res) must equalTo(OK)
        contentType(res) must beSome("application/json")
        contentAsJson(res) mustEqual TestData.newUser
      }
      "fail signup request for existing user" in {
        val res1 = route(FakeRequest(POST, "/user", FakeHeaders(), TestData.newUser)).get
        Await.result(res1, Duration.Inf)
        val res = route(FakeRequest(POST, "/user", FakeHeaders(), TestData.newUser)).get
        Await.result(res, Duration.Inf)
        status(res) must equalTo(CONFLICT)
        contentType(res) must beSome("application/json")
        contentAsJson(res) mustEqual TestData.newUser
      }
    }
  }
}
The problem here is that the application starts and stops immediately, so the tests are executed without a running application:
[debug] c.j.b.BoneCPDataSource - JDBC URL = jdbc:h2:/tmp/play-test--437407884;MODE=PostgreSQL;DATABASE_TO_UPPER=false;DB_CLOSE_DELAY=-1, Username = zalando, partitions = 1, max (per partition) = 30, min (per partition) = 5, idle max age = 10 min, idle test period = 1 min, strategy = DEFAULT
[info] application - Application has started
[debug] application - Binding to Slick DAO implementations.
[info] application - Application shutdown...
[debug] c.j.b.BoneCPDataSource - Connection pool has been shut down
[info] ApplicationSpec
[info]
[info] Application should
[info] ! send 404 on a bad request
[error] RuntimeException: : There is no started application (Play.scala:71)
[error] play.api.Play$$anonfun$current$1.apply(Play.scala:71)
[error] play.api.Play$$anonfun$current$1.apply(Play.scala:71)
[error] play.api.Play$.current(Play.scala:71)
[error] play.api.test.RouteInvokers$class.route(Helpers.scala:305)
[error] play.api.test.Helpers$.route(Helpers.scala:403)
[error] ApplicationSpec$$anonfun$1$$anonfun$apply$7$$anonfun$apply$8$$anonfun$apply$9.apply(ApplicationSpec.scala:76)
[error] ApplicationSpec$$anonfun$1$$anonfun$apply$7$$anonfun$apply$8$$anonfun$apply$9.apply(ApplicationSpec.scala:76)
[error] ApplicationSpec$$anonfun$1$$anonfun$apply$7$$anonfun$apply$8.apply(ApplicationSpec.scala:76)
[error] ApplicationSpec$$anonfun$1$$anonfun$apply$7$$anonfun$apply$8.apply(ApplicationSpec.scala:76)
Here is my working solution:
object AppWithTestDb2 extends FakeApplication(additionalConfiguration =
  Map(
    "db.default.driver" -> "org.h2.Driver",
    "db.default.url" -> (
      // "jdbc:h2:mem:play-test-" + scala.util.Random.nextInt + // in-memory database
      "jdbc:h2:/tmp/play-test-" + scala.util.Random.nextInt + // file-based db that can be accessed using h2-browse in activator
        ";MODE=PostgreSQL;DATABASE_TO_UPPER=false;DB_CLOSE_DELAY=-1")
  ))

class SignupLoginSpec extends Specification {
  sequential

  step(Play.start(AppWithTestDb2))

  "application" should {
    "post signup request" in {
      val res = route(FakeRequest(POST, "/user", FakeHeaders(), TestData.newUser)).get
      status(res) must equalTo(OK)
      contentType(res) must beSome("application/json")
      contentAsJson(res) mustEqual TestData.newUser
    }
    "fail signup request for existing user" in {
      val res1 = route(FakeRequest(POST, "/user", FakeHeaders(), TestData.newUser)).get
      Await.result(res1, Duration.Inf)
      val res = route(FakeRequest(POST, "/user", FakeHeaders(), TestData.newUser)).get
      Await.result(res, Duration.Inf)
      status(res) must equalTo(CONFLICT)
      contentType(res) must beSome("application/json")
      contentAsJson(res) mustEqual TestData.newUser
    }
  }

  step(Play.stop())
}
In specs2 there is a distinction between test declaration and test execution. When you write "application" should ..., you are just declaring tests; the executable parts are what is enclosed in the ... in { ... } blocks.
So when you write running(AppWithTestDb2) { ... }, you merely declare some tests inside the context of an AppWithTestDb2 application; the running block itself has already finished by the time those tests execute.
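The distinction can be sketched with a hypothetical mini-model (none of this is real specs2 API): declared example bodies are stored as thunks and run later, while everything else in the class body, including a running(...) { ... } wrapper, executes immediately during declaration.

```scala
object DeclVsExec {
  def run(): List[String] = {
    val log = scala.collection.mutable.ListBuffer.empty[String]
    val examples = scala.collection.mutable.ListBuffer.empty[() => Unit]

    // Declaring an example just registers its body for later execution.
    def in(body: => Unit): Unit = examples += (() => body)

    // Code outside `in` -- like starting/stopping an app around the
    // declarations -- runs now, during declaration.
    log += "running(app) executes during declaration"
    in { log += "example body executes later" }
    log += "running(app) block already finished"

    // The runner executes the registered examples afterwards.
    examples.foreach(_.apply())
    log.toList
  }
}
```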
The general solution for what you want to achieve in specs2 is to use steps, like this:
class ApplicationSpec extends Specification {
  sequential

  step("start application")

  "Application" should {
    "send 404 on a bad request" in { ... }
    "post signup request" in { ... }
  }

  step("stop application")
}
Then, given the way the specs2 execution model works, your fake application will be started before any of the tests start and terminated once all of them have finished (whether you use sequential or not).
I am not a Play user, but I suspect that you should be able to reuse the WithApplication class, or something similar, to create your start/stop steps. Otherwise, there is a blog post exploring a solution to the same problem.