Generating ScalaCheck tests with Cucumber JVM

To avoid X-Y problems, a little background:
I'm trying to set up a web project where I'm going to be duplicating business logic on the server and the client side, the client obviously in JavaScript and the server in Scala. I plan to write the business logic in Cucumber so I can make sure the tests and the functionality line up on both sides. Finally, I'd like to have a crack at bringing ScalaCheck and JSCheck into this, so the input data is generated rather than specified.
Basically, the statements would work like this:
Given statements select and add generators.
When statements specify functions to act upon those values in sequence.
Then statements take the input data and the final result data and run a property.
The objective is to make this sort of thing composable so you could specify several generators, a set of actions to run on each of them, and then a set of properties that would each get run on the inputs and result.
I've done this already in JavaScript (technically CoffeeScript), and of course with a dynamic language it's straightforward to do. Basically, what I want to be able to do in my Scala step definitions is this (excuse the arbitrary test data):
class CucumberSteps extends ScalaDsl with EN
    with ShouldMatchers with QuickCheckCucumberSteps {

  Given("""^an list of integer between 0 and 100$""") {
    addGenerator(Gen.containerOf[List, Int](Gen.choose(0, 100)))
  }

  Given("""^an list of random string int 500 and 700$""") {
    addGenerator(Gen.containerOf[List, Int](Gen.choose(500, 700)))
  }

  When("""^we concatenate the two lists$""") {
    addAction { (l1: List[Int], l2: List[Int]) => l1 ::: l2 }
  }

  Then("""^then the size of the result should equal the sum of the input sizes$""") {
    runProperty { (inputs: (List[Int], List[Int]), result: (List[Int])) =>
      inputs._1.size + inputs._2.size == result._1.size
    }
  }
}
So the key thing I want to do is create a trait QuickCheckCucumberSteps that will be the API, implementing addGenerator, addAction and runProperty.
Here's what I've roughed out so far, and where I get stuck:
trait QuickCheckCucumberSteps extends ShouldMatchers {
  private var generators = ArrayBuffer[Gen[Any]]()
  private var actions = ArrayBuffer["AnyFunction"]()

  def addGenerator(newGen: Gen[Any]): Unit =
    generators += newGen

  def addAction(newFun: => "AnyFunction"): Unit =
    actions += newFun

  def buildPartialProp = {
    val li = generators
    generators.length match {
      case 1 => forAll(li(0))_
      case 2 => forAll(li(0), li(1))_
      case 3 => forAll(li(0), li(1), li(2))_
      case 4 => forAll(li(0), li(1), li(2), li(3))_
      case _ => forAll(li(0), li(1), li(2), li(3), li(4))_
    }
  }

  def runProperty(propertyFunc: => Any): Prop = {
    val partial = buildPartialProp
    val property = partial {
      ??? // Need a function that takes x number of generator inputs,
          // applies each action in sequence
          // and then applies the `propertyFunc` to the
          // inputs and results.
    }
    val result = Test.check(new Test.Parameters.Default {}, property)
    result.status match {
      case Passed => println("passed all tests")
      case Failed(a, l) => fail(format(pretty(result), "", "", 75))
      case _ => println("other cases")
    }
  }
}
My key issue is this: I want the commented block to become a function that takes all the added actions, applies them in order, and then runs and returns the result of the property function. Is this possible to express with Scala's type system, and if so, how do I get started? I'm happy to do the reading and earn this one, but I need at least a way forward, as I don't know how to express it at this point. Happy to drop in my JavaScript code if what I'm trying to make here isn't clear.
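For illustration only (this sketch is mine, not part of the original question): if the added actions are stored untyped as Any => Any, "apply each action in sequence" is just a fold; the open question above is how to keep this type-safe at the forAll boundary.
import scala.collection.mutable.ArrayBuffer

// Untyped sketch: the first action receives the tuple of generated inputs;
// each subsequent action receives the previous action's result.
object UntypedActionChain {
  private val actions = ArrayBuffer[Any => Any]()

  def addAction(f: Any => Any): Unit = actions += f

  def runActions(generatedInputs: Any): Any =
    actions.foldLeft(generatedInputs)((acc, action) => action(acc))
}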

If I were you, I wouldn't put ScalaCheck generator code within your Cucumber Given/When/Then statements :). The ScalaCheck api calls are part of the "test rig" - so not under test. Try this (not compiled/tested):
class CucumberSteps extends ScalaDsl with EN with ShouldMatchers {
  forAll(Gen.containerOf[List, Int](Gen.choose(0, 100)),
         Gen.containerOf[List, Int](Gen.choose(500, 700))) {
    (l1: List[Int], l2: List[Int]) => {
      var result: List[Int] = Nil
      Given(s"""^a list of integers between 0 and 100: $l1$$""") { }
      Given(s"""^a list of integers between 500 and 700: $l2$$""") { }
      When("""^we concatenate the two lists$""") { result = l1 ::: l2 }
      Then("""^the size of the result should equal the sum of the input sizes$""") {
        l1.size + l2.size == result.size
      }
    }
  }
}

Related

Scala: for-comprehension for chain of operations

I have a task to transform the following code block:
val instance = instanceFactory.create
val result = instance.ackForResult
into a for-comprehension expression.
As a for-comprehension leans on enumerating elements, I tried to get around it with a wrapper class:
case class InstanceFactoryWrapper(value: InstanceFactory) {
  def map(f: InstanceFactory => Instance): Instance =
    value.create()
}
where the map method must handle only one element and return a single result: Instance
I tested this approach with this expression:
for {
mediationApi <- InstanceFactoryWrapper(instanceFactoryWrapper)
}
But it doesn't work: IntelliJ IDEA recommends using foreach here. But foreach doesn't return anything, as opposed to map.
What am I doing wrong?
Simply put, when working with List/Option/Either or other standard library types, for-comprehensions are useful for transforming nested map/flatMap/withFilter calls into readable sequences.
Use custom classes in for-comprehension
But what about your own classes or other 3rd party ones?
You need to implement monadic operations in order to use them in for-comprehensions.
The bare minimum: map and flatMap.
Take the following example with a custom Config class:
case class Config[T](content: T) {
  def flatMap[S](f: T => Config[S]): Config[S] =
    f(content)
  def map[S](f: T => S): Config[S] =
    this.copy(content = f(content))
}

for {
  first <- Config("..")
  _ = println("Going through a test")
  second <- Config(first + "..")
  third <- Config(second + "..")
} yield third
This is how you enable for-comprehensions for your own types.
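Applied back to the question, a rough sketch (with my own stub types standing in for InstanceFactory and Instance): a wrapper that defines both map and flatMap lets the two original lines become a for-comprehension instead of triggering the foreach suggestion.
object ForComprehensionSketch extends App {
  // Stubs standing in for the question's types.
  class Instance { def ackForResult: String = "ok" }
  class InstanceFactory { def create: Instance = new Instance }

  // Generic wrapper with the minimum needed for a for-comprehension with yield.
  case class Op[T](value: T) {
    def map[S](f: T => S): Op[S] = Op(f(value))
    def flatMap[S](f: T => Op[S]): Op[S] = f(value)
  }

  val instanceFactory = new InstanceFactory

  // The question's two lines, rewritten:
  val result: Op[String] = for {
    instance <- Op(instanceFactory.create)
    ack      <- Op(instance.ackForResult)
  } yield ack

  println(result.value) // prints "ok"
}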

Why does Scala type inference fail in one case but not the other?

Background: I'm using net.liftweb.record with MongoDB to access a database. At some point, I was in need of drawing a table of a collection of documents from the database (and render them as an ASCII table). I ran into very obscure type inference issues which are very easy to solve but nevertheless made me want to understand why they were happening.
Reproduction: For simplicity, I've reduced the code to (what I think is) an absolute minimum, so that it only depends on net.liftweb.record and none of the Mongo specific types. I've kept the real-life body of the function under question to make the example more realistic.
makeTable takes some apples, and some functions that map apples to columns. Columns can either be mapped to a real field on the apples, or a dynamically computed value (with a name). To be able to mix the two (real fields and dynamic values) in a single Seq, I defined a structural type Col.
To see how the code (below) behaves, try the following variants of the cols parameter to makeTable:
// OK:
cols = Seq(_.isDone)
cols = Seq(job => dynCol1)
cols = Seq(job => dynCol1, job => dynCol2)
// ERROR: found: Seq[Job => Object], required: Seq[Job => Test.Col]
cols = Seq(_.isDone, job => dynCol1)
cols = Seq(_.isDone, job => dynCol2)
cols = Seq(_.isDone, job => dynCol1, job => dynCol2)
...so whenever _.isDone (i.e. the column that maps to a physical field) is mixed with any other "flavor" of column, the error occurs (CASE 1). Alone it behaves well; other flavors of column also behave well when alone or mixed (CASE 2).
Intuitive workaround: marking cols as Seq[Job => Col] ALWAYS fixes the error.
Counter-intuitive workaround: explicitly marking any of the return values of the functions in the Seq as Col, or any of the functions as Job => Col, solves the issue.
The code:
import net.liftweb.record.{ Record, MetaRecord }
import net.liftweb.record.field.IntField
import scala.language.reflectiveCalls

class Job extends Record[Job] {
  def meta = Job
  object isDone extends IntField(this)
}
object Job extends Job with MetaRecord[Job]

object Test extends App {
  type Col = { def name: String; def get: Any }

  def makeTable[T](xs: Seq[T])(cols: Seq[T => Col]) = {
    assert(xs.size >= 1)
    val rows = xs map { x => cols map { _(x).get } }
    val header = cols map { _(xs.head).name }
    (header +: rows)
  }

  val dynCol1 = new { def name = "dyncol1"; def get = "dyn1" }
  val dynCol2 = new { def name = "dyncol2"; def get = "dyn2" }
  val jobs = Seq(Job.createRecord, Job.createRecord)

  makeTable(jobs)(Seq(
    _.isDone,
    job => dynCol1,
    job => dynCol2
  ))
}
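To make the two workarounds described above concrete (my own illustration, not part of the original post), either of the following variants of the failing call compiles:
// Intuitive workaround: annotate the whole Seq as Seq[Job => Col].
makeTable(jobs)(Seq[Job => Col](
  _.isDone,
  job => dynCol1,
  job => dynCol2
))

// Counter-intuitive workaround: ascribing a single return value as Col
// (or a single function as Job => Col) is enough to guide inference.
makeTable(jobs)(Seq(
  (job: Job) => (job.isDone: Col),
  job => dynCol1,
  job => dynCol2
))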
P.S. I'm not adding a lift or lift-record tag because I think this is not related to Lift and is simply a Scala question triggered by what happens to be a Lift-specific situation. Feel free to correct me if I'm wrong.

Is it possible to avoid mutable state using Cucumber-jvm Scala?

With Cucumber tests, a feature expressed as Given, When and Then is usually implemented as three separate methods. These methods often need to share values, and thus it seems that mutable variables are the way to do it.
Take the following simple example:
A feature:
Given the digit 2
When it is multiplied by 3
Then the result is 6
And the Cucumber methods:
class CucumberRunner extends ScalaDsl with EN with ShouldMatchers {
  var digitUnderTest: Int = -1

  Given("""^the digit (\d)$""") { digit: Int =>
    digitUnderTest = digit
  }

  When("""^it is multiplied by 3$""") {
    digitUnderTest = digitUnderTest * 3
  }

  Then("""^the result is (\d)$""") { result: Int =>
    digitUnderTest should equal (result)
  }
}
Is there any way, presumably built into ScalaTest or Cucumber-JVM for Scala, that allows me not to express digitUnderTest as a mutable variable?
Looking at the cucumber-jvm examples in Java and Scala, I doubt it provides a way of passing data from step to step without storing it in a variable temporarily.
Since you cannot reassign a val in Scala, the closest thing I can think of to get rid of the mutable var is a global map that holds temporary test data.
class CucumberRunner extends ScalaDsl with EN with ShouldMatchers {
  Given("""^the digit (\d)$""") { digit: Int =>
    GlobalTestData.save("my_unique_key_1", digit)
  }

  When("""^it is multiplied by 3$""") {
    GlobalTestData.load("my_unique_key_1") match {
      case Some(obj) => {
        val result = obj.asInstanceOf[Int] * 3
        GlobalTestData.save("my_unique_key_2", result)
      }
      case None => // throw exception or fail test
    }
  }

  Then("""^the result is (\d)$""") { result: Int =>
    GlobalTestData.load("my_unique_key_2") match {
      case Some(obj) => obj.asInstanceOf[Int] should equal (result)
      case None => // throw exception or fail test
    }
  }
}
And then the GlobalTestData:
object GlobalTestData {
  val map = scala.collection.mutable.Map.empty[String, Any]

  def save(key: String, value: Any) {
    map.put(key, value)
  }

  def load(key: String): Option[Any] = map.get(key)
}
In this case, you need to carefully generate keys so they are the same across steps. Of course, you can use some vals to hold the values of these keys.
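For example, a couple of hypothetical key constants (the names are mine) that the step definitions can share:
object TestKeys {
  // Hypothetical key names; anything unique per scenario value works.
  val DigitUnderTest = "my_unique_key_1"
  val MultiplicationResult = "my_unique_key_2"
}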
Also in this particular feature, why not combine Given and When steps:
When the digit 2 is multiplied by 3
Then the result is 6
This way you can save one slot in GlobalTestData.
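A step definition for that combined form might look like this (untested sketch; the regex is mine):
When("""^the digit (\d) is multiplied by (\d)$""") { (digit: Int, factor: Int) =>
  // only one key is needed now
  GlobalTestData.save("my_unique_key_2", digit * factor)
}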
Although in the ScalaDsl for Cucumber-JVM a step is a function f: List[Any] => Any, the current implementation discards the result of each step execution, meaning that you have no way to use the result of a previous step in the next one.
Currently, the only way to share the result of one step is through some shared state, either in the step definition class or in a more global context, like zihaoyu suggested.
(BTW, we use the shared mutable map method as well in a large project)

Scala actor kills itself inconsistently

I am a newbie to Scala and I am writing Scala code to implement the Pastry protocol. The protocol itself does not matter. There are nodes, and each node has a routing table which I want to populate.
Here is the part of the code:
def act() {
  def getMatchingNode(initialMatch: String): Int = {
    val len = initialMatch.length
    for (i <- 0 to noOfNodes - 1) {
      var flag: Int = 1
      for (j <- 0 to len - 1) {
        if (list(i).key.charAt(j) == initialMatch(j)) {
          continue
        }
        else {
          flag = 0
        }
      }
      if (flag == 1) {
        return i
      }
    }
    return -1
  }

  // iterate over rows
  for (ii <- 0 to rows - 1) {
    for (jj <- 0 to 15) {
      var initialMatch = ""
      for (k <- 0 to ii - 1) {
        initialMatch = initialMatch + key.charAt(k)
      }
      initialMatch += jj
      println("initialMatch", initialMatch)
      if (getMatchingNode(initialMatch) != -1) {
        Routing(0)(jj) = list(getMatchingNode(initialMatch)).key
      }
      else {
        Routing(0)(jj) = "NULL"
      }
    }
  }
} // act
The problem is that when the call to getMatchingNode takes place, the actor suddenly dies by itself. 'list' is the list of all nodes (a list of node objects).
Also, this behaviour is not consistent. The call to getMatchingNode should take place 15 times for each actor (for 10 nodes),
but while debugging, the actor kills itself in the getMatchingNode call after one call, or sometimes after 3-4 calls.
The Scala library code which gets executed is this:
def run() {
  try {
    beginExecution()
    try {
      if (fun eq null)
        handler(msg)
      else
        fun()
    } catch {
      case _: KillActorControl =>
        // do nothing
      case e: Exception if reactor.exceptionHandler.isDefinedAt(e) =>
        reactor.exceptionHandler(e)
    }
    reactor.kill()
  }
Eclipse shows that this code has been called from the for loop in the getMatchingNode function
def getMatchingNode (initialMatch :String) : Int = {
val len = initialMatch.length
for (i <- 0 to noOfNodes-1)
The strange thing is that sometimes the loop behaves normally and sometimes it goes into the Scala library code which kills the actor.
Any input on what's wrong with the code?
Any help would be appreciated.
Got the error: the 'continue' clause in the for loop caused the trouble.
I thought we could use continue in Scala as we do in C++/Java, but it does not seem so.
Removing the continue solved the issue.
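For illustration (my own sketch, not from the posts), the whole flag/continue dance can be replaced with a prefix check, assuming the list of nodes from the actor above:
// Sketch: prefix matching without continue; indexWhere returns -1 when nothing matches,
// which is the same contract as the original getMatchingNode.
def getMatchingNode(initialMatch: String): Int =
  list.indexWhere(node => node.key.startsWith(initialMatch))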
From the book "Programming in Scala", 2nd edition, by M. Odersky:
You may have noticed that there has been no mention of break or continue.
Scala leaves out these commands because they do not mesh well with function
literals, a feature described in the next chapter. It is clear what continue
means inside a while loop, but what would it mean inside a function literal?
While Scala supports both imperative and functional styles of programming,
in this case it leans slightly towards functional programming in exchange
for simplifying the language. Do not worry, though. There are many ways to
program without break and continue, and if you take advantage of function
literals, those alternatives can often be shorter than the original code.
I really suggest reading the book if you want to learn Scala.
Your code is based on tons of nested for loops, which can more often than not be rewritten using the higher-order functions available on the most appropriate collection.
You can rewrite your function like the following (I'm trying to make it approachable for newcomers):
// works if "list" contains "nodes" with an attribute "node.key: String"
def getMatchingNode(initialMatch: String): Int = {
  // a new list with the corresponding keys
  val nodeKeys = list.map(node => node.key)
  // zips each key (creates a pair) with its index in the list and then finds a possible match
  // (a key starting with initialMatch, as in the original loop)
  val matchOption: Option[(String, Int)] =
    nodeKeys.zipWithIndex find { case (key, index) => key.startsWith(initialMatch) }
  // we convert an eventual result contained in the Option, keeping the right projection of the pair (the index)
  val idxOption = matchOption map { case (key, index) => index } // now we have an Option[Int] with a possible index
  // returns the content of the Option if it's full (Some) or a default value of -1 if there was no match (None). See Option[T] for more details
  idxOption.getOrElse(-1)
}
The ability to easily transform or operate on a collection's elements is what makes continue, and for loops in general, less used in Scala.
You can convert the row iteration in a similar way, but I would suggest that if you need to work a lot with the collection's indices, you use an IndexedSeq or one of its implementations, like ArrayBuffer.
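A rough sketch of that conversion (mine, assuming key, rows, Routing and list from the question's actor):
for (ii <- 0 until rows; jj <- 0 to 15) {
  val initialMatch = key.take(ii) + jj
  val matchIdx = getMatchingNode(initialMatch)
  // Routing(0)(jj) is kept as in the question, though Routing(ii)(jj) may have been intended.
  Routing(0)(jj) = if (matchIdx != -1) list(matchIdx).key else "NULL"
}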

Scala - How to group a list of tuples without pattern matching?

Consider the following structure (in reality the structure is a bit more complex):
case class A(id: String, name: String) {
  override def equals(obj: Any): Boolean = {
    if (obj == null || !obj.isInstanceOf[A]) return false
    val a = obj.asInstanceOf[A]
    name == a.name
  }
  override def hashCode() = {
    31 + name.hashCode
  }
}

val a1 = A("1", "a")
val a2 = A("2", "a")
val a3 = A("3", "b")
val list = List((a1, a2), (a1, a3), (a2, a3))
Now let's say I want to group all tuples with equal A's. I could implement it like this
list.groupBy {
case (x,y) => (x,y)
}
But I don't like using pattern matching here, because it's not adding anything. I want something simple, like this:
list.groupBy(_)
Unfortunately, this doesn't compile. Not even when I do:
list.groupBy[(A,A)](_)
Any suggestions how to simplify my code?
list.groupBy { case (x,y) => (x,y) }
Here you are deconstructing the tuple into its two constituent parts, just to immediately reassemble them exactly like they were before. In other words: you aren't actually doing anything useful. The input and output are identical. This is just the same as
list.groupBy { t => t }
which is of course just the identity function, which Scala helpfully provides for us:
list groupBy identity
If you want to group the elements of a list according to their own equals method, you only need to pass the identity function to groupBy:
list.groupBy(x=>x)
It's not enough to write list.groupBy(_) because of the scope of _: it would be desugared to x => list.groupBy(x), which is of course not what you want.
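To make the effect of the overridden equals/hashCode concrete (my own illustration, using the question's a1, a2, a3 and list):
val grouped = list.groupBy(identity)
// a1 == a2 (both have name "a"), so the keys (a1, a3) and (a2, a3) are equal and share a group:
// grouped(a1 -> a3) == List((a1, a3), (a2, a3))
// grouped(a1 -> a2) == List((a1, a2))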