Good practices for writing and unit testing utility methods in Scala

In certain situations you need utility methods that are shared across different classes. To handle this, you create a Util object in which you place all of these methods:
object AggregatorUtil {
  def aggregateValues(list: List[BigDecimal]) = ??? // some logic...
}
// Import everything in the Utilities object
import AggregatorUtil._
and then import whichever members of the util are required in your class. However, the downside is that, because all your methods are inside a singleton object, it becomes tricky to mock the object and unit test the methods of classes that use these utility methods.
To solve this, the only approach that came to mind was extracting the functionality into a trait and then mocking the trait.
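A rough sketch of what I have in mind (ReportService and the test wiring are just illustrative):

trait AggregatorUtil {
  def aggregateValues(list: List[BigDecimal]): BigDecimal = list.sum // some logic...
}

// Default instance for production code.
object AggregatorUtil extends AggregatorUtil

// Classes depend on the trait, so tests can substitute a mock.
class ReportService(aggregator: AggregatorUtil = AggregatorUtil) {
  def total(values: List[BigDecimal]): BigDecimal = aggregator.aggregateValues(values)
}

// In a ScalaTest suite with MockitoSugar:
// val aggregatorMock = mock[AggregatorUtil]
// when(aggregatorMock.aggregateValues(any[List[BigDecimal]])).thenReturn(BigDecimal(42))
// val service = new ReportService(aggregatorMock)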
Please let me know if there is any other approach for handling and testing util methods, and which approach is the cleaner one.
Thanks in advance !!!
Note: I am using ScalaTest and Mockito in my project.

If you need to mock, putting this all in a mocked-out trait is the way forward. If mocking is unnecessary though, avoid it. Mocking unnecessarily is... unnecessary. You'll just be wasting time and effort for something which provides no additional value.
Mocking is best used when you have complex functionality, or functionality in other files, which you want to treat as a black box and just assume it works as expected (you'd then typically unit test that stuff separately). If you can avoid it and use the functions' actual functionality though, you'll get a much more realistic view of what your application does and will spot new bugs/breaking changes more quickly (if you have mocked out functionality and forget to update your mocks, you might not spot new bugs you introduce).
A good example of when mocking is necessary is when you're mocking calls to a database in a MVC application (e.g. a Scala Play microservice). You obviously don't want to have to run an actual database when testing your code, so you'd typically mock out your connector layer and return dummy/mocked data from your connector functions.
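For illustration, a connector-layer stub might look something like this (UserConnector and User are invented names, just to show the shape):

trait UserConnector {
  def findUser(id: Long): Future[Option[User]]
}

case class User(id: Long, name: String)

// In the controller's unit test, no real database is involved:
val connectorMock = mock[UserConnector]
when(connectorMock.findUser(1L)).thenReturn(Future.successful(Some(User(1L, "test-user"))))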
An example of something you wouldn't mock is something like:
trait MyTrait {
  def toInt(str: String): Int
}

val mockedTrait = mock[MyTrait]
when(mockedTrait.toInt(eq("3"))).thenReturn(3)
It's a bit of a silly example, but I think it explains the point clearly - doing something like this would be ridiculous. Mocking isn't always the answer.

I mostly mock with a test implementation, which I find more readable, and you don't have to learn a mocking framework.
Here an example:
The Interface:
trait DataRepo {
  def persist(data: DataObject): Future[DataObject]
  def idents(): Future[List[String]]
  def insertData(dataCont: DataObject): Future[Int]
  // ...
}
The Mocked Interface:
object DataRepoMock extends DataRepo {
  def persist(data: DataObject): Future[DataObject] = ??? // only implement when needed
  def idents(): Future[List[String]] =
    Future.successful((0 to 10).map(_ => Random.nextInt(100).toString).toList)
  def insertData(dataCont: DataObject): Future[Int] = Future.successful(Random.nextInt(100))
  // ...
}
You can also use all the Scala goodies, like Pattern Matching, to make your Mock react differently to input.
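For example (a sketch that assumes DataObject has a name field; adjust the pattern to your actual case class):

object PatternMatchingDataRepoMock extends DataRepo {
  def insertData(dataCont: DataObject): Future[Int] = dataCont.name match {
    case "broken" => Future.failed(new IllegalStateException("simulated failure"))
    case _        => Future.successful(1)
  }
  def persist(data: DataObject): Future[DataObject] = Future.successful(data)
  def idents(): Future[List[String]] = Future.successful(List("id-1", "id-2"))
}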
Here is an example showing that this is not just used by myself ;):
EPFLx: scala-reactiveX, see Lecture 2.5, Testing Actor Systems:
def fakeGetter(url: String, depth: Int): Props =
  Props(new Getter(url, depth) {
    override def webClient: WebClient = FakeWebClient
  })

Related

Unit testing Akka actors

I am building a web application with Scala and Akka actors and I'm having some trouble with the tests.
In my case I need to test an actor that talks to the database. For the unit tests I would like to use a fake database, but I can't replace the new with my desired fake object.
Let's see some code:
class MyActor extends Actor {
  val database = new Database()
  def receive = { ... }
}
And in the tests I would like to inject a FakeDatabase object instead of Database. I've been looking on the Internet but the best that I found is:
Add a parameter to the constructor.
Convert the val database to a var so that in the test I can access the attribute through the underlying actor and replace it.
Both solutions solve the problem but are very dirty.
Isn't there a better way to solve this problem?
Thanks!
The two primary options for this scenario are:
Dependency Injection: Use a DI framework to inject a real or mock service as needed. In Akka: http://letitcrash.com/post/55958814293/akka-dependency-injection
Cake Pattern: This is a Scala-specific way of achieving something akin to dependency injection without actually relying on injection. See: Akka and cake pattern
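A bare-bones sketch of the cake-pattern option for this scenario (the component names are just illustrative):

import akka.actor.Actor

trait DatabaseComponent {
  def database: Database // abstract dependency, supplied by the concrete registry
}

trait MyActorComponent { this: DatabaseComponent =>
  class MyActor extends Actor {
    def receive = {
      case msg => // talk to `database` here
    }
  }
}

// Production wiring:
object ProductionRegistry extends MyActorComponent with DatabaseComponent {
  val database = new Database()
}

// A test registry would mix in the same traits but provide a fake database instead.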
Echoing the advice here, I wouldn't call injecting the database in the constructor dirty. It might have plenty of benefits, including decoupling actor behaviour from the particular database instance.
However, if you know there is only ONE database you will always be using in your production code, then consider defining a package-private constructor and a companion object that returns a Props object without parameters by default.
Example below:
object MyActor {
  def props(): Props = Props(new MyActor(new Database()))
}

class MyActor private[mypackage] (database: IDatabase) extends Actor { // substitute your actual package name for "mypackage"
  def receive = { ... }
}
In this case you will still be able to inject the test database in your test cases (given the same package structure), but you prevent users of your code from instantiating MyActor with an unexpected database instance.
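For completeness, rough test-side usage, assuming FakeDatabase implements IDatabase and the test lives in the same package as MyActor:

import akka.actor.{ActorSystem, Props}
import akka.testkit.TestActorRef

implicit val system: ActorSystem = ActorSystem("test")

val fakeDb: IDatabase = new FakeDatabase() // your test double
val actorRef = TestActorRef[MyActor](Props(new MyActor(fakeDb))) // package-private constructor is visible here

// Production code keeps using the parameterless factory:
val prodRef = system.actorOf(MyActor.props())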

Dependency injection with Scala

I was searching for a way of doing dependency injection in Scala, kind of like Spring or Unity in C#, and I found nothing really interesting.
MacWire: I don't understand the benefit, as we have to give the class in wire[CASS]. So what's the point of giving the implementation when you call wire? I could do new CASS and it would be the same.
Cake pattern with self type: Seems to not answer what I'm searching for.
So I decided to write my own implementation and ask what you think, because it surprises me that nothing like this has been done before. Maybe my implementation has lots of issues in real life too.
So here is an example:
trait Messenger {
  def send: Unit
}

class SkypeMessenger extends Messenger {
  def send: Unit = println("Skype")
}

class ViberMessenger extends Messenger {
  def send: Unit = println("Viber")
}
I want here to inject everywhere in my app the implementation configured in only one place:
object App {
  val messenger = Inject[Messenger]

  def main(args: Array[String]) {
    messenger.send
  }
}
Note the Inject[Messenger] that I define like below with the config I want (prod or dev):
object Inject extends Injector with DevConfig

trait ProdConfig {
  this: Injector =>
  register[Messenger](new SkypeMessenger)
  register[Messenger](new ViberMessenger, "viber")
}

trait DevConfig {
  this: Injector =>
  register[Messenger](new ViberMessenger)
  register[Messenger](new ViberMessenger, "viber")
}
And finally here is the Injector which contains all methods apply and register:
import scala.reflect.{ClassTag, classTag}

class Injector {
  var map = Map[String, Any]()

  def apply[T: ClassTag]: T =
    map(classTag[T].toString).asInstanceOf[T]

  def apply[T: ClassTag](id: String): T =
    map(classTag[T].toString + id).asInstanceOf[T]

  def register[T: ClassTag](instance: T, id: String = ""): T = {
    map += (classTag[T].toString + id -> instance)
    instance
  }
}
To summarize:
I have a class Injector which is a Map between interfaces/traits (optionally with an id) and an instance of the implementation.
We define a trait for each config (dev, prod, ...) which contains the registrations. It also has a self-type reference to Injector.
And we create an instance of the Injector with the config we want.
The usage is to call the apply method, giving the interface type (and optionally an id), and it returns the implementation's instance.
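For example, with DevConfig active, the intended usage is:

val messenger = Inject[Messenger]        // default registration
val viber = Inject[Messenger]("viber")   // registration looked up by id
messenger.send
viber.send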
What do you think?
Your code looks a lot like the dependency injection in the Lift web framework. You can consult the Lift source code to see how it's implemented, or just use the framework. You don't have to run a Lift app to use its libraries. Here is a small intro doc. Basically you should be looking at this code in Lift:
package net.liftweb.http
/**
 * A base trait for a Factory. A Factory is both an Injector and
 * a collection of FactoryMaker instances. The FactoryMaker instances auto-register
 * with the Injector. This provides both concrete Maker/Vendor functionality as
 * well as Injector functionality.
 */
trait Factory extends SimpleInjector
You can also check this related question: Scala - write unit tests for objects/singletons that extends a trait/class with DB connection where I show how Lift injector is used.
Thanks guys,
So I'm marking my own answer, but the one from Aleksey was very good.
I understand the Cake pattern better with this sample:
https://github.com/freekh/play-slick/tree/master/samples/play-slick-cake-sample
Also take a look at the other implementations without DI and compare:
https://github.com/freekh/play-slick/tree/master/samples/
And so the Cake pattern doesn't have a centralized config like we can have with the Lift-style DI shown above. I will use the Cake pattern anyway, as it fits well with Slick.
What I didn't like about Subcut is the implicits everywhere. I know there is a way to avoid them, but that looks like a workaround to me.
Thanks
To comment on MacWire, you are right that you could just use new - and that's the whole point :). MacWire is there only to let you remove some boilerplate from your code, by not having to enumerate all the dependencies again (which is already done in the constructor).
The main idea is that you do the wiring at "the end of the world", where you assemble your application (or you could divide that into trait-modules, but that's optional). Otherwise you just use constructors to express dependencies. No magic, no frameworks.
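A small sketch of that wiring (MessageService and NotificationJob are made-up classes for the example):

import com.softwaremill.macwire._

class MessageService(messenger: Messenger)     // dependencies expressed through constructors
class NotificationJob(service: MessageService)

// The single place where the object graph is assembled:
object AppModule {
  lazy val messenger: Messenger = new ViberMessenger
  lazy val service: MessageService = wire[MessageService]   // constructor args located automatically
  lazy val job: NotificationJob = wire[NotificationJob]
}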

How do I give global access to an object in Scala without making it a singleton or passing it to everything?

I have a Logger class that logs events in my application. While I only need one instance of the logger in this application, I want this class to be reusable, so I don't want to make it a singleton and couple it with my specific needs for this application.
I want to be able to access this Logger instance from anywhere in the application without having to create a new one every time or pass it around to every class that might need to log something. What I currently do is have an ApplicationUtils singleton that I use as the point of access for the application's Logger:
object ApplicationUtils {
  lazy val log: Logger = new Logger()
}
Then I have a Loggable trait that I add to classes that need the Logger:
trait Loggable {
  protected[this] lazy val log = ApplicationUtils.log
}
Is this a valid approach for what I am trying to accomplish? It feels a little hack-y. Is there a better approach I could be using? I'm pretty new to Scala.
Be careful when putting functionality in objects. That functionality is easily testable, but if you need to test clients of that code to make sure they interact with it correctly (via mocks and spies), you're stuck 'cause objects compile to final classes and thus cannot be mocked.
Instead, use this pattern:
trait T { /* code goes here */ }
object T extends T /* pass this to client code from main sources */
Now you can create Mockito mocks / spies for trait T in your test code, pass that in and confirm that the interactions of the code under test with the trait T code are what they should be.
If you have code that's a client of T and whose interactions with it don't require testing, you can directly reference object T.
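Applied to the logger case, a sketch of the pattern might look like this (AuditService is just an invented client for illustration):

trait Logger {
  def info(msg: String): Unit = println(s"INFO: $msg")
}

object Logger extends Logger // the instance production code uses

class AuditService(log: Logger) {
  def record(event: String): Unit = log.info(s"audit: $event")
}

// In a test, mock the trait and verify the interaction:
// val logMock = mock[Logger]
// new AuditService(logMock).record("login")
// verify(logMock).info("audit: login")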
To address what you're trying to do (rather than what you're asking), take a look at Typesafe's scalalogging package. It provides a Logging trait that you can use like so:
class MyClass extends Logging {
  logger.debug("This is very convenient ;-)")
}
It's a macro-based wrapper for SLF4J, so something like logger.debug(...) gets compiled as if (logger.isDebugEnabled) logger.debug(...).

Scala Compiler Plugin Deconstruction

I've been trying to write a Scala (2.10.0) compiler plugin that analyzes some parts of a traversed code.
This is what I originally had:
class MyPlugin(val global: Global) extends Plugin {
  import global._

  val name = "myPlugin"
  val components = List[PluginComponent](MyComponent)

  private object MyComponent extends PluginComponent {
    val global: MyPlugin.this.global.type = MyPlugin.this.global
    val runsAfter = List("refchecks")
    val phaseName = "codeAnalysis"

    def newPhase(_prev: Phase) = new AnalysisPhase(_prev)

    class AnalysisPhase(prev: Phase) extends StdPhase(prev) {
      override def name = phaseName

      def apply(unit: CompilationUnit) {
        codeTraverser traverse unit.body
        printLinesToFile(counters.map { case (k, v) => k + "\t" + v }, out)
      }

      def codeTraverser = new ForeachTreeTraverser(tree => /* Analyze tree */)
    }
  }
}
This code works as expected; however, I don't like it because I cannot decouple the code traverser from this object. I would like to write a separate CodeTraverser class that will perform the analysis on a given Tree. This, among other things, may help me test this code better.
The main problem is that unit.body is of an internal Tree type inside scala.reflect.internal.Trees. If I could work with scala.reflect.api.Trees#Tree instead of the internal version I could decouple the traverser functionality and even test it quite easily.
I've tried to find a way to convert between the two, but to no avail. Is it even possible? From looking at their source code, many things looks too similar for this to be impossible.
You're probably struggling with the cake pattern that the compiler is implemented with and all the path dependency that comes with it. I went through this some time ago when I was writing a really beefy macro and wanted to refactor a bunch of functions out of the macro implementation into a separate utility class. I found it to be quite an annoying issue.
Here's how I would implement your Traverser in a separate class:
class MyPluginUtils[G <: Global with Singleton](val global: G) {
  import global._

  class AnalyzingTraverser extends ForeachTreeTraverser(tree => /* analyze */)
}
Now, inside your plugin you have to use it like this:
val utils = new MyPluginUtils[global.type](global)
import utils.{global => _, _}
val traverser = new AnalyzingTraverser
As you can see, it's not the most intuitive thing in the world (i.e. this is confusing as hell), but this is the best that I could come up with that actually worked, and I tried a lot of things before finally settling on this one. I would be really happy to see some nicer way to do this.
AFAIK, such extensibility is one of the general problems with the cake pattern (as used in scalac implementation). I've seen other people also complain about this.
The component (the SubComponent or PluginComponent) must be created with the global member initialized early (that is, as an early definition).
Don't forget to review the one-question faq. I may go set google calendar to remind me to do that every Monday morning.
For an example, see the continuations plugin.
The component is defined with a utility class mixed in.
The utility class follows the usual cake recipe. (Leave it as an abstract dependency and let the compiler ensure that everything was mixed correctly.)
Here is a recent edit showing more early definitions, as a demonstration that this usage is not anomalous.
val anfPhase = new {
val global = SelectiveCPSPlugin.this.global
val cpsEnabled = pluginEnabled
override val enabled = cpsEnabled
} with SelectiveANFTransform {
val runsAfter = List("pickler")
}
(In future, they plan to deprecate early definitions in favor of parameterized traits when they are available in the language.)
More generally, global, i.e., "the compiler", is routinely instantiated for testing the compiler itself. I haven't seen it mocked, but computeInternalPhases is the template method for picking what phases are assembled modulo plugins.
There is a current effort to reduce internal dependencies, for the purpose of testing, as a window onto the difficulties involved.

Not able to use Mockito ArgumentMatchers in Scala

I am using ScalaMock and Mockito
I have this simple code
class MyLibrary {
  def doFoo(id: Long, request: Request) = {
    println("came inside real implementation")
    Response(id, request.name)
  }
}

case class Request(name: String)
case class Response(id: Long, name: String)
I can easily mock it using this code
val lib = new MyLibrary()
val mock = spy(lib)
when(mock.doFoo(1, Request("bar"))).thenReturn(Response(10, "mock"))
val response = mock.doFoo(1, Request("bar"))
response.name should equal("mock")
But if I change my code to
val lib = new MyLibrary()
val mock = spy(lib)
when(mock.doFoo(anyLong(), any[Request])).thenReturn(Response(10, "mock"))
val response = mock.doFoo(1, Request("bar"))
response.name should equal("mock")
I see that it goes inside the real implementation and gets a null pointer exception.
I am pretty sure it goes into the real implementation without matchers too; the difference is that it just doesn't crash in that case (with the matchers, any ends up passing null into the call).
When you write when(mock.doFoo(...)), the compiler has to call mock.doFoo to compute the parameter that is passed to when.
Doing this with mock works, because all implementations are stubbed out, but spy wraps around the actual object, so, the implementations are all real too.
Spies are frowned upon in the Mockito world, and are considered a code smell.
If you find yourself having to mock out some functionality of your class while keeping the rest of it, it is almost surely a case where you should just split it into two separate classes. Then you'd be able to mock the whole "underlying" object entirely, and have no need to spy on things.
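For illustration, a sketch of such a split (the ResponseBuilder name is invented for the example):

// The behaviour you wanted to stub lives in its own class...
class ResponseBuilder {
  def build(id: Long, request: Request): Response = Response(id, request.name)
}

// ...and MyLibrary takes it as a constructor dependency.
class MyLibrary(builder: ResponseBuilder) {
  def doFoo(id: Long, request: Request): Response = builder.build(id, request)
}

// In the test, mock the dependency outright; no spy needed, and matchers work as usual:
val builderMock = mock[ResponseBuilder]
when(builderMock.build(anyLong(), any[Request])).thenReturn(Response(10, "mock"))
val lib = new MyLibrary(builderMock)
lib.doFoo(1, Request("bar")).name should equal("mock")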
If you are still set on using spies for some reason, doReturn would be the workaround, as the other answer suggests. You should not pass null as the vararg parameter though, it changes the semantics of the call. Something like this should work:
doReturn(Response(10, "mock"), Array.empty:_*).when(mock).doFoo(any(), any())
But, I'll stress it once again: this is just a work around. The correct solution is to use mock instead of spy to begin with.
Try this:
doReturn(Response(10, "mock"), null.asInstanceOf[Array[Object]]: _*).when(mock).doFoo(anyLong(), any[Request])