Assert object instance in Scala FlatSpec

I am trying to create a FlatSpec test that feels more like 'Scala' than 'Java'. Specifically, I am interested in how to assert on an instance of the class File.
Here is what I have:
File.scala
package org.demo.entries
class File(
  val parentPath: String,
  val name: String,
  val contents: String)
AssertEntries.scala
package org.demo.entries
import org.scalatest.{FlatSpec, Matchers}
object AssertEntries extends FlatSpec with Matchers {
  def assertFileEntry(expectedParentPath: String,
                      expectedName: String,
                      expectedContent: String,
                      actual: File) = {
    actual should have (
      'name (expectedName),
      'parentPath (expectedParentPath),
      'contents (expectedContent)
    )
  }
}
FileTest.scala
package org.demo.entries
import org.scalatest.{FlatSpec, Matchers}
import org.demo.entries.AssertEntries._
class FileTest extends FlatSpec with Matchers {
  val PATH1: String = "unrelated"
  val NAME1: String = "somename"
  val CONTENT1: String = "somecontent"

  "A file" should "be created" in {
    val actual: File = new File(PATH1, NAME1, CONTENT1)
    assertFileEntry(PATH1, NAME1, CONTENT1, actual) // Is there some better approach?
  }
}
As you can see, I am using my own assertFileEntry method to assert on the File instance. This approach feels more like Java; given that Scala has a different idiom, I was wondering whether there is an approach that looks more like Scala.
Copying the contents of assertFileEntry every time I want to check a File instance does not seem convenient.
Updated background:
For clarity, I have omitted the rest of the code. The File class extends a DirEntry class, which has two subtypes: File and Directory. Directory has an additional list that contains DirEntries.
I want to test whether a given Directory contains the created files (and nested directories) by iterating through the parent Directory's list and calling assertFileEntry (or assertDirectoryEntry) to assert each entry against expected values.
That is why I created the assertFileEntry method: to do all the assertions for a given File instance in one place. But, at least to me, my solution seems more like a Java solution than a Scala one.
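For reference, here is a minimal sketch of the hierarchy described above; since the real code is omitted, the exact constructor signatures are my assumption:
package org.demo.entries

// Hypothetical sketch of the omitted hierarchy described in the question
abstract class DirEntry(val parentPath: String, val name: String)

class File(parentPath: String, name: String, val contents: String)
  extends DirEntry(parentPath, name)

class Directory(parentPath: String, name: String, val entries: List[DirEntry])
  extends DirEntry(parentPath, name)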

Your approach seems fine enough if it works. I would personally create assertFileEntry as a private method inside FileTest if it's used nowhere else, and construct the tests like this:
package org.demo.entries

import org.scalatest.{FlatSpec, Matchers}

class FileSpec extends FlatSpec with Matchers {
  val PATH1: String = "unrelated"
  val NAME1: String = "somename"
  val CONTENT1: String = "somecontent"

  private def assertFileEntry(expectedParentPath: String,
                              expectedName: String,
                              expectedContent: String,
                              actual: File) = {
    actual.name shouldBe expectedName
    actual.contents shouldBe expectedContent
    actual.parentPath shouldBe expectedParentPath
  }

  "A file" should "be created" in {
    val actual: File = new File(PATH1, NAME1, CONTENT1)
    assertFileEntry(PATH1, NAME1, CONTENT1, actual)
  }
}
There are countless ways to do what you want in Scala, so just pick a method that works for you. If your tests get more complicated than this, it might be better to assert individual things in separate tests (as sketched below), so it's easier to locate which part of your code is failing.
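For example, a minimal sketch of splitting the assertions into separate test cases, so a failure points directly at the offending field:
import org.scalatest.{FlatSpec, Matchers}

class FileFieldsSpec extends FlatSpec with Matchers {
  val actual = new File("unrelated", "somename", "somecontent")

  "A file" should "keep its name" in {
    actual.name shouldBe "somename"
  }

  it should "keep its parent path" in {
    actual.parentPath shouldBe "unrelated"
  }

  it should "keep its contents" in {
    actual.contents shouldBe "somecontent"
  }
}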
Also, in Scala it's more conventional to name the test files like {class name}Spec rather than {class name}Test, for example FileSpec or ControllerSpec over FileTest or ControllerTest.

Related

How to test my scala json after creating the implicit classes

I am using specs2 and want to test the JSON Reads that I created.
I have my case classes and implicits created like:
object ComputerImplicits {
  implicit val partReads = Json.reads[Part]
  implicit val computerReads = Json.reads[Computer]
}
I have a sample json file in my test/resources/computer.json folder.
I am loading the JSON file as a string like this:
val jsonString = Source.fromURL(getClass.getResource("/computer.json")).mkString
I brought the implicits in scope:
import ComputerImplicits._
Now how do I use the JSON string with my case classes: attempt to parse it and match on the result to test that it works correctly?
I am using Play's JSON macros: https://www.playframework.com/documentation/2.8.x/ScalaJsonAutomated
Assuming you use Play JSON:
import play.api.libs.json.{JsSuccess, Json}

final class FooSpec extends org.specs2.mutable.Specification {
  "Json" should {
    "be ok" in {
      Json.parse(jsonString).validate[YourType] must_=== JsSuccess(expectedVal)
    }
  }
}
Also, implicits related to a type are usually declared in its companion object (rather than in a shared object).
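For example, here is a minimal sketch of that convention, assuming Part and Computer look roughly like this:
import play.api.libs.json.{Json, Reads}

case class Part(name: String)

object Part {
  implicit val partReads: Reads[Part] = Json.reads[Part]
}

case class Computer(model: String, parts: List[Part])

object Computer {
  implicit val computerReads: Reads[Computer] = Json.reads[Computer]
}
With the instances in the companion objects, implicit resolution finds them automatically and no explicit import is needed at the call site.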

meta-programming to parse json in scala

I need some hints for writing a Scala program that could read a JSON file and create a case class at run time. As an example, if we have a JSON definition like:
Employ {
  name: {datatype: String, null: false}
  age: {datatype: Int, null: true}
  Address: {city: {datatype: String, null: true}, zip: {datatype: String, null: false}}
}
and this should create classes like
case class Employ(name: String, age: Option[Int], address: Address)
case class Address(city: Option[String], zip: String)
Would it be possible to do this in Scala?
Yes, you can easily achieve this using TreeHugger. I did a similar thing for one of my work projects.
Below is a toy example which produces a Scala Akka Actor class. It needs to be cleaned up but, hopefully, you get the idea:
import argonaut.Argonaut._
import argonaut._
import org.scalatest.FunSuite
import treehugger.forest._
import definitions._
import treehuggerDSL._

class ConvertJSONToScalaSpec extends FunSuite {

  test("read json") {
    val input =
      """
        |{
        | "rulename" : "Rule 1",
        | "condition" : [
        |   {
        |     "attribute" : "country",
        |     "operator" : "eq",
        |     "value" : "DE"
        |   }
        | ],
        | "consequence" : "route 1"
        |}
      """.stripMargin

    val updatedJson: Option[Json] = input.parseOption

    // `sym` is an object (not shown here) holding the Symbols referenced below
    val tree =
      BLOCK(
        IMPORT(sym.actorImports),
        CLASSDEF(sym.c).withParents(sym.d, sym.e) :=
          BLOCK(
            IMPORT(sym.consignorImport, "_"),
            DEFINFER(sym.methodName) withFlags (Flags.OVERRIDE) := BLOCK(
              CASE(sym.f DOT sym.methodCall APPLY (REF(sym.mc))) ==>
                BLOCK(
                  sym.log DOT sym.logmethod APPLY (LIT(sym.logmessage)),
                  (IF (sym.declaration DOT sym.header DOT sym.consignor DOT sym.consignoreTID ANY_== LIT(1))
                    THEN (sym.sender APPLY () INFIX ("!", LIT(sym.okcm)))
                   ELSE
                    (sym.sender APPLY () INFIX ("!", LIT(sym.badcm)))
                  )
                )
            )
          )
      ) inPackage (sym.packageName)
  }
}
Essentially all you need to do is work out how to use the TreeHugger macros; each macro represents a specific keyword in Scala. It gives you a type-safe way to do your meta-programming.
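To connect this back to the question, here is a small, self-contained sketch that generates one of the asked-for case classes with TreeHugger; the CASECLASSDEF, PARAM, and TYPE_OPTION builders are used from memory, so double-check them against the TreeHugger docs:
import treehugger.forest._
import definitions._
import treehuggerDSL._

object CaseClassGen extends App {
  // Build the tree for: case class Address(city: Option[String], zip: String)
  val addressTree =
    CASECLASSDEF("Address") withParams (
      PARAM("city", TYPE_OPTION(StringClass)),
      PARAM("zip", StringClass)
    )

  // Print the generated Scala source
  println(treeToString(addressTree))
}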
There's also Scala Meta but I haven't used that.
Well... let's say you used some library like TreeHugger or Scala Meta (or something else) to generate the code string for the case class. There are multiple approaches you can take from there; to start with one of them, you can do the following.
// import the current runtime mirror as cm
import scala.reflect.runtime.{currentMirror => cm}
// mkToolBox is provided by scala-compiler's ToolBox
import scala.tools.reflect.ToolBox

// your case class code string
val codeString = """
  case class Address(city: Option[String], zip: String)
  Address(Some("CityName"), "zipcode")
"""

// get the toolbox from the mirror
val tb = cm.mkToolBox()
// use the toolbox to parse the string into a Tree
val codeTree = tb.parse(codeString)
// evaluate the tree
val address = tb.eval(codeTree)
The problem is that val address will have type Any. Also, the universe still does not know about the type Address, so you will not be able to do address.asInstanceOf[Address].
You can work around this by exploring ClassSymbol and ClassLoader, and with enough luck you may be able to solve many more of the issues you will face by understanding how reflection works in Scala and Java. But that is a high-effort path with no guarantee of success.
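As a small illustration of that reflection route, one hedged workaround: the case class accessors compile to ordinary methods, so you can at least read fields off the Any value via plain Java reflection.
// Sketch: `address` is statically typed as Any, but its accessors are still
// ordinary methods that Java reflection can invoke
val zip = address.getClass.getMethod("zip").invoke(address)
println(zip) // prints "zipcode"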

Can you dynamically generate Test names for ScalaTest from input data?

I have a number of test data sets that run through the same ScalaTest unit tests. I'd love it if each test data set were its own set of named tests, so if one data set fails one of the tests, I know exactly which one it was, rather than going to a single test and working out which file it failed on. I just can't seem to find a way for the test name to be generated at runtime. I've looked at property- and table-based testing, and I'm currently using should behave like to share fixtures, but none of these seem to do what I want.
Have I not uncovered the right testing approach in ScalaTest or is this not possible?
You can write dynamic test cases with ScalaTest like Jonathan Chow wrote in his blog here: http://blog.echo.sh/2013/05/12/dynamically-creating-tests-with-scalatest.html
However, I always prefer the WordSpec testing definitions and this also works with dynamic test cases just like Jonathan mentions.
class MyTest extends WordSpec with Matchers {
  "My test" should {
    Seq(1, 2, 3) foreach { count =>
      s"run test $count" in {
        count should be(count)
      }
    }
  }
}
Running this test produces 3 test cases:
TestResults
  MyTest
    My test
      run test 1
      run test 2
      run test 3
P.S. You can even create multiple test cases in the same foreach function using the same count variable.
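For instance, a minimal sketch of that idea (the two test bodies here are made-up placeholders):
import org.scalatest.{Matchers, WordSpec}

class MyMultiTest extends WordSpec with Matchers {
  "My test" should {
    Seq(1, 2, 3) foreach { count =>
      // two named test cases are generated per element
      s"accept count $count" in {
        count should be > 0
      }
      s"double count $count" in {
        (count * 2) should be(count + count)
      }
    }
  }
}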
You could write a base test class, and extend it for each data set. Something like this:
case class Person(name: String, age: Int)

abstract class MyTestBase extends WordSpec with Matchers {
  def name: String
  def dataSet: List[Person]

  s"Data set $name" should {
    "have no zero-length names" in {
      dataSet.foreach { s => s.name should not be empty }
    }
  }
}

class TheTest extends MyTestBase {
  override lazy val name = "Family" // note lazy, otherwise initialization fails
  override val dataSet = List(Person("Mom", 53), Person("Dad", 50))
}
Which produces output like this:
TheTests:
Data set Family
- should have no zero-length names
You can use Scala string interpolation in your test names. Using behavior functions, something like this would work:
case class Person(name: String, age: Int)

trait PersonBehaviors { this: FlatSpec =>
  // or add the data set name as a parameter to this function
  def personBehavior(person: => Person): Unit = {
    behavior of person.name

    it should s"have non-negative age: ${person.age}" in {
      assert(person.age >= 0)
    }
  }
}

class TheTest extends FlatSpec with PersonBehaviors {
  val person = Person("John", 32)
  personBehavior(person)
}
This produces output like this:
TheTest:
John
- should have non-negative age: 32
What about using ScalaTest's clue mechanism, so that any test failure reports as a clue which data set was being used? You can use the withClue construct provided by Assertions, which is extended by every style trait in ScalaTest, to add extra information to reports of failed or canceled tests. See also the documentation on AppendedClues.
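For example, a minimal sketch, reusing the Person case class from the earlier answer:
import org.scalatest.{FlatSpec, Matchers}

class DataSetSpec extends FlatSpec with Matchers {
  val dataSet = List(Person("Mom", 53), Person("Dad", 50))

  "A data set" should "have no zero-length names" in {
    dataSet.foreach { p =>
      // the clue is prepended to any failure message, identifying the entry
      withClue(s"data set entry $p: ") {
        p.name should not be empty
      }
    }
  }
}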

Non-scala source positions with scala macros

I have a Scala macro which depends on an arbitrary XML file that is specified through a static string containing its location.
def myMacro(path: String) = macro myMacroImpl

def myMacroImpl(c: Context)(path: c.Expr[String]): c.Expr[Any] = {
  // load the file specified by path and generate some code
  ...
}
This means that if the XML file is malformed, the macro will not be able to expand. At the moment I am providing an error message that contains a textual representation of the location of the error in the XML file. This, however, is obviously not the nicest solution.
Is it possible to provide source locations in different (possibly non-Scala) files for my generated code, so that errors will point to the XML file instead of the Scala file where the XML file is included? I don't see how I can create positions myself rather than only altering existing ones.
This use case is definitely very interesting, and it looks like something that should be supported in the reflection API. Unfortunately, at the moment, there's no public API to achieve this, even though the internal, albeit quite low-level, machinery is in place.
import scala.language.experimental.macros
import scala.reflect.macros.blackbox.Context
import scala.reflect.io.AbstractFile
import scala.reflect.internal.util.BatchSourceFile
import scala.reflect.internal.util.OffsetPosition

class Impl(val c: Context) {
  def impl: c.Tree = {
    val filePath = "foo.txt"
    val af = AbstractFile.getFile(filePath)
    val content = scala.io.Source.fromFile(filePath).mkString
    val sf = new BatchSourceFile(af, content)
    val pos = new OffsetPosition(sf, 3).asInstanceOf[c.universe.Position]
    c.abort(pos, "it works")
  }
}

object Macros {
  def foo: Any = macro Impl.impl
}

object Test extends App {
  Macros.foo
}
Running this code on a simple text file produces the following result:
20:56 ~/Projects/Master/sandbox (master)$ cat foo.txt
hello
world
20:56 ~/Projects/Master/sandbox (master)$ scalac Test.scala
foo.txt:1: error: it works
hello
^
one error found
Please note that this solution involves scala.reflect.internal and a cast (both of which invalidate all compatibility guarantees that we provide for scala-reflect.jar), so it's not something that I would recommend for production code.

Scala pickling: how?

I'm trying to use "pickling" serialization in Scala, and I keep seeing the same example demonstrating it:
import scala.pickling._
import json._
val pckl = List(1, 2, 3, 4).pickle
Unpickling is just as easy as pickling:
val lst = pckl.unpickle[List[Int]]
This example raises some questions. First of all, it skips converting the object to a string; apparently you need to call pckl.value to get the JSON string representation.
Unpickling is even more confusing. Deserialization is the act of turning a string (or bytes) into an object. How can this "example" demonstrate deserialization if there is no string/binary representation of the object?
So, how do I deserialize a simple object with the pickling library?
Use the type system and case classes to achieve your goals. You can unpickle to some supertype in your hierarchy (up to and including AnyRef). Here is an example:
trait Zero
case class One(a: Int) extends Zero
case class Two(s: String) extends Zero

object Test extends App {
  import scala.pickling._
  import json._

  // String that can be sent down a wire
  val wire: String = Two("abc").pickle.value

  // On the other side, just use a case class
  wire.unpickle[Zero] match {
    case One(a)  => println(a)
    case Two(s)  => println(s)
    case unknown => println(unknown.getClass.getCanonicalName)
  }
}
Ok, I think I understood it.
import scala.pickling._
import json._

var str = Array(1, 2, 3).pickle.value // this is the JSON string
println(str)
val x = str.unpickle[Array[Int]] // unpickle from the string
This will produce the JSON string:
{
  "tpe": "scala.Array[scala.Int]",
  "value": [
    1,
    2,
    3
  ]
}
So, the same way we pickle any type, we can unpickle a string. The serialization format is governed by the implicit format brought in by import json._, which can be replaced by import binary._.
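For instance, a minimal sketch of switching to the binary format (same API, different import; this follows the old scala-pickling API used above, so treat it as untested):
import scala.pickling._
import binary._

// The same pickle/unpickle calls now produce and consume bytes
val bytes: Array[Byte] = Array(1, 2, 3).pickle.value
val restored = bytes.unpickle[Array[Int]]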
It does look like you will be starting with a pickle to unpickle to a case class. But the JSON string can be fed to the JSONPickle class to get the starting pickle.
Here's an example based on their array-json test
package so

import scala.pickling._
import json._

case class C(arr: Array[Int]) {
  override def toString = s"""C(${arr.mkString("[", ",", "]")})"""
}

object PickleTester extends App {
  val json = """{"arr":[ 1, 2, 3 ]}"""
  val cPickle = JSONPickle(json)
  val unpickledC: C = cPickle.unpickle[C]
  println(s"$unpickledC, arr.sum = ${unpickledC.arr.sum}")
}
The output printed is:
C([1,2,3]), arr.sum = 6
I was able to drop the "tpe" from the test, as well as the .stripMargin.trim on the input JSON. It works all in one line, but I thought it might be clearer split up. It's unclear to me whether that "tpe" from the test is supposed to provide a measure of type safety for the incoming JSON.
Looks like the only other class they support for pickling is a BinaryPickle unless you want to roll your own. The latest scala-pickling snapshot jar requires quasiquotes to compile the code in this answer.
I tried something more complicated this morning and discovered that the "tpe" is required for non-primitives in the incoming JSON, which shows that the serialized string really must be compatible with the pickler (which I mixed into the code above):
case class J(a: Option[Boolean], b: Option[String], c: Option[Int]) {
  override def toString = s"J($a, $b, $c)"
}
...
val jJson = """{"a": {"tpe": "scala.None.type"},
               | "b":{"tpe": "scala.Some[java.lang.String]","x":"donut"},
               | "c":{"tpe": "scala.Some[scala.Int]","x":47}}"""
val jPickle = JSONPickle(jJson.stripMargin.trim)
val unpickledJ: J = jPickle.unpickle[J]
println(s"$unpickledJ")
...
where, naturally, I had to use .value on a J(None, Some("donut"), Some(47)) to figure out how to create the jJson input value so that the unpickling would not throw an exception.
The output for J is like:
J(None, Some(donut), Some(47))
Looking at this test, it appears that the JSONPickle magic works if the incoming JSON is all primitives or case classes (or combinations thereof), but some other classes like Option require extra "tpe" type information to unpickle correctly.