How to enforce scala macro annotation constraints on annotated classes? - scala

I am trying to implement a "simple" util library that puts annotated scalatest suites into a Suites instance to give them a shared running context.
trait MyContext extends BeforeAndAfterAll {
//declarations of values
override def beforeAll() = {/* run various init procedures */}
override def afterAll() = {/* tear everything down */}
}
That part works and I can use it if I code the Suites instance myself.
What I would like to write is a Scala macro annotation that takes all annotated org.scalatest.Suite subtypes and generates the Suites class, as in:
class testThisInContext extends StaticAnnotation{ /* ... */}
@testThisInContext
class TestOne extends WordSpec {}
@testThisInContext
class TestTwo extends FlatSpec {}
which would then generate:
class InContextSuites extends Suites(new TestOne, new TestTwo) with MyContext {}
and also modify the annotated classes by adding the @org.scalatest.DoNotDiscover annotation to them (to avoid them being executed outside the context).
I need a way to abort the expansion of the macro (and report an error) when the annotated class is not a subclass of Suite, since that would make the generated class fail to compile.
I also have not figured out how to type check annotations in the modifiers instance of a ClassDef (in order to add an annotation if needed).
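Not a full solution, but here is a rough sketch of the shape such a macro annotation could take with Scala 2 and the macro paradise plugin. The constraint check and the modifier manipulation are the relevant parts; typechecking the untyped parent trees is an assumption that may need adjusting, and scalatest must be on the macro's classpath.
import scala.annotation.StaticAnnotation
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

class testThisInContext extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro TestThisInContextImpl.impl
}

object TestThisInContextImpl {
  def impl(c: whitebox.Context)(annottees: c.Expr[Any]*): c.Expr[Any] = {
    import c.universe._
    annottees.map(_.tree).toList match {
      case (cls @ ClassDef(mods, name, tparams, Template(parents, self, body))) :: rest =>
        // Parents are untyped trees inside a macro annotation; try to typecheck them
        // to find out whether the class really extends org.scalatest.Suite.
        val extendsSuite = parents.exists { p =>
          val tped = c.typecheck(p.duplicate, mode = c.TYPEmode, silent = true)
          tped.tpe != null && tped.tpe <:< typeOf[org.scalatest.Suite]
        }
        if (!extendsSuite)
          c.abort(cls.pos, s"@testThisInContext can only annotate org.scalatest.Suite subclasses, but $name is not one")
        // Add @DoNotDiscover by prepending it to the existing modifier annotations.
        val doNotDiscover = q"new _root_.org.scalatest.DoNotDiscover()"
        val newMods = mods.mapAnnotations(doNotDiscover :: _)
        val annotated = ClassDef(newMods, name, tparams, Template(parents, self, body))
        c.Expr[Any](Block(annotated :: rest, Literal(Constant(()))))
      case _ =>
        c.abort(c.enclosingPosition, "@testThisInContext must annotate a class")
    }
  }
}
Note that generating the aggregated InContextSuites class is a separate problem: each macro expansion only sees its own annottee, so collecting all annotated suites needs some other mechanism.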

Related

Is there a per-test non-specific mock reset pattern using Scala+PlaySpec+Mockito?

I'm writing a unit test suite for a Scala Play application and I'm wondering if there's anything analogous to Java's
@Mock
private Foo foo;
@Autowired/@InjectMocks
private Bar fixture;
@BeforeMethod
public void setUp() {
MockitoAnnotations.initMocks(this);
}
for automatically creating the annotated mocks and resetting them after every test.
Currently I'm settling for:
class TestClass extends PlaySpec with BeforeAndAfterEach {
  private val foo = mock[Foo]

  override def fakeApplication(): Application =
    new GuiceApplicationBuilder().overrides(bind[Foo].toInstance(foo)).build

  override protected def beforeEach(): Unit = {
    reset(foo)
  }
}
A cursory attempt at using the Java annotations in the Scala test was not successful. My current approach works; I just want to make sure there isn't a nicer one.
mockito-scala solves this problem from version 0.1.1 onwards, as it provides a trait (org.mockito.integrations.scalatest.ResetMocksAfterEachTest) that automatically resets any existing mocks after each test is run.
The trait has to be mixed in after org.mockito.MockitoSugar in order to work; otherwise your test will not compile.
So your code would look like this:
class TestClass extends PlaySpec with MockitoSugar with ResetMocksAfterEachTest {
  private val foo = mock[Foo]

  override def fakeApplication(): Application =
    new GuiceApplicationBuilder().overrides(bind[Foo].toInstance(foo)).build
}
The main advantage is not having to remember to manually reset each one of the mocks...
If for some reason you want a mock that is not reset automatically while using this trait, it should be
created via the companion object of org.mockito.MockitoSugar so it is not tracked by this mechanism.
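For example (a small sketch, using the Foo and Bar types from the question), only the mock created through the trait is tracked and reset:
// created through the MockitoSugar trait mixed into the suite: reset after each test
private val foo = mock[Foo]
// created through the MockitoSugar companion object: not tracked, so never reset automatically
private val bar = org.mockito.MockitoSugar.mock[Bar]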
Disclaimer: I'm a developer of that library

Easymock new object and handle its function call (No PowerMock)

I have a class which needs to be unit tested. The structure is as follows:
public class classToBeTested{
public returnType function1(){
//creation of variable V
return new classA(V).map();
}
}
The class classA is as follows:
public class classA{
//Constructor
public returnType map(){
//Work
}
}
I am creating unit tests in Scala using FunSuite, GivenWhenThen and EasyMock.
My test structure is as follows:
class classToBeTestedSuite extends FunSuite with GivenWhenThen with Matchers with EasyMockSugar {
test(""){
Given("")
val c = new classToBeTested()
expecting{
}
When("")
val returnedResponse = c.function1()
Then("")
//checking the returned object
}
}
What do I need to write in expectation?
How can I handle the above scenario?
Note: PowerMock can't be used.
Answer:
Thanks @Henri. After a lot of searching, and given the answer provided by @Henri, refactoring the code is the best way to handle this situation. Reasoning below:
A unit test cannot mock objects that are created with a new call inside the method under test (without PowerMock). To test the code as it stands, the unit tests for classToBeTested would have to be written around the internals of the class it uses (here classA), so we would need to know the structure and behaviour of classA and shape the test cases accordingly.
Since the test cases would then depend on the structure of classA's methods, classToBeTested and classA are tightly coupled, and following a TDD approach the code should be refactored.
In the example above, rather than using
classA object = new classA(V);
it's better to provide the object to the method (e.g. autowiring the classA object in Spring MVC).
Open to suggestions. Also, if somebody can give a better explanation, please do.
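For illustration, here is a rough Scala sketch of that refactoring, using the placeholder types classA, VClass and returnType from the question: the collaborator is provided from outside instead of being constructed with new, so a test can hand in a factory that returns a mock.
// Hypothetical refactoring sketch: classToBeTested no longer calls new classA(v) itself.
class classToBeTested(classAFactory: VClass => classA) {
  def function1(): returnType = {
    val v: VClass = ??? // creation of variable V, as in the original code
    classAFactory(v).map()
  }
}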
You can't. The instantiation of what you want to mock is inside the class you want to test, so without PowerMock you need to refactor to make it testable.
The minimum would be to extract the object creation into a separate method:
public class classToBeTested {
    public returnType function1() {
        //creation of variable V
        return getClassA(V).map();
    }

    protected classA getClassA(VClass v) {
        return new classA(v);
    }
}
Then you can do a partial mock. I don't know how to write it in Scala, so the code below is probably wrong, but I hope you get the idea.
class classToBeTestedSuite extends FunSuite with GivenWhenThen with Matchers with EasyMockSugar {
  test("") {
    Given("")
    // partial mock: only getClassA is mocked, function1 keeps its real implementation
    val c = EasyMock.partialMockBuilder(classOf[classToBeTested])
      .addMockedMethod("getClassA")
      .createMock()
    val a = mock[classA]
    expecting {
      EasyMock.expect(c.getClassA(EasyMock.anyObject(classOf[VClass]))).andReturn(a)
      EasyMock.expect(a.map()).andReturn(someReturnType)
    }
    whenExecuting(c, a) {
      When("")
      val returnedResponse = c.function1()
      Then("")
      // whatever you need to check on the returned object
    }
  }
}

Dependency Injection to Play Framework 2.5 modules

I have a module class with the following signature:
class SilhouetteModule extends AbstractModule with ScalaModule {
I would like to inject configuration:
class SilhouetteModule #Inject() (configuration: Configuration) extends AbstractModule with ScalaModule {
But it fails with the following error.
No valid constructors
Module [modules.SilhouetteModule] cannot be instantiated.
The Play documentation mentions that
In most cases, if you need to access Configuration when you create a component, you should inject the Configuration object into the component itself or...
, but I can't figure out how to do it successfully. So the question is, how do I inject a dependency into a module class in Play 2.5?
There are two ways to solve your problem.
First one (and the more straightforward one):
Do not extend com.google.inject.AbstractModule. Instead, use play.api.inject.Module. Extending it forces you to override def bindings(environment: Environment, configuration: Configuration): Seq[Binding[_]]. Within that method you can declare all your bindings, and the configuration is handed to you as a method parameter.
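A minimal sketch of that first option might look like this (the binding in the comment is only a placeholder):
import play.api.{Configuration, Environment}
import play.api.inject.{Binding, Module}

// Sketch of option 1: the configuration arrives as a method parameter,
// so the module itself needs no constructor injection.
class SilhouetteModule extends Module {
  override def bindings(environment: Environment, configuration: Configuration): Seq[Binding[_]] = {
    // read the configuration here, then return the bindings, e.g.
    // Seq(bind[SomeService].to[SomeServiceImpl])
    Seq.empty
  }
}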
Second one (and the more flexible one):
Depending on the components you want to inject, you can define a provider for the component you want to bind. In that provider you can inject whatever you want, e.g.:
import com.google.inject.Provider
class MyComponentProvider #Inject()(configuration:Configuration) extends Provider[MyComponent] {
override def get(): MyComponent = {
//do what ever you like to do with the configuration
// return an instance of MyComponent
}
}
Then you could bind your component within your module:
class SilhouetteModule extends AbstractModule {
override def configure(): Unit = {
bind(classOf[MyComponent]).toProvider(classOf[MyComponentProvider])
}
}
The advantage of the second version is that you can inject whatever you like. In the first version you get "just" the configuration.
Change your constructor signature from:
class SilhouetteModule #Inject() (configuration: Configuration) extends AbstractModule with ScalaModule
to:
class SilhouetteModule(env: Environment, configuration: Configuration) extends AbstractModule with ScalaModule
see here for more info:
https://github.com/playframework/playframework/issues/8474

Dependency injection with abstract class and object in Play Framework 2.5

I'm trying to migrate from Play 2.4 to 2.5 avoiding deprecated stuff.
I had an abstract class Microservice from which I created some objects. Some functions of the Microservice class used play.api.libs.ws.WS to make HTTP requests and also play.Play.application.configuration to read the configuration.
Previously, all I needed was some imports like:
import play.api.libs.ws._
import play.api.Play.current
import play.api.libs.concurrent.Execution.Implicits.defaultContext
But now you should use dependency injection to use WS and also to access the current Play application.
I have something like this (shortened):
abstract class Microservice(serviceName: String) {
// ...
protected lazy val serviceURL: String = play.Play.application.configuration.getString(s"microservice.$serviceName.url")
// ...and functions using WS.url()...
}
An object looks something like this (shortened):
object HelloWorldService extends Microservice("helloWorld") {
// ...
}
Unfortunately I don't understand how I get all the stuff (WS, configuration, ExecutionContext) into the abstract class to make it work.
I tried to change it to:
abstract class Microservice #Inject() (serviceName: String, ws: WSClient, configuration: play.api.Configuration)(implicit context: scala.concurrent.ExecutionContext) {
// ...
}
But this doesn't solve the problem, because now I have to change the object too, and I can't figure out how.
I tried to turn the object into a @Singleton class, like:
@Singleton
class HelloWorldService @Inject() (implicit ec: scala.concurrent.ExecutionContext) extends Microservice ("helloWorld", ws: WSClient, configuration: play.api.Configuration) { /* ... */ }
I tried all sorts of combinations, but I'm not getting anywhere and I feel I'm not really on the right track here.
Any ideas how I can use things like WS the proper way (not using deprecated methods) without making things so complicated?
This is more related to how Guice handles inheritance, and you have to do exactly what you would do without Guice: declare the parameters at the subclass and call the super constructor. Guice even suggests this in its docs:
Wherever possible, use constructor injection to create immutable objects. Immutable objects are simple, shareable, and can be composed.
Constructor injection has some limitations:
Subclasses must call super() with all dependencies. This makes constructor injection cumbersome, especially as the injected base class changes.
In pure Java, that means doing something like this:
public abstract class Base {
private final Dependency dep;
public Base(Dependency dep) {
this.dep = dep;
}
}
public class Child extends Base {
private final AnotherDependency anotherDep;
public Child(Dependency dep, AnotherDependency anotherDep) {
super(dep); // guaranteeing that fields at superclass will be properly configured
this.anotherDep = anotherDep;
}
}
Dependency injection won't change that; you just have to add the annotations that tell Guice how to inject the dependencies. In this case, since the Base class is abstract and no instances of Base can be created directly, we may skip it and only annotate the Child class:
public abstract class Base {
private final Dependency dep;
public Base(Dependency dep) {
this.dep = dep;
}
}
public class Child extends Base {
private final AnotherDependency anotherDep;
@Inject
public Child(Dependency dep, AnotherDependency anotherDep) {
super(dep); // guaranteeing that fields at superclass will be properly configured
this.anotherDep = anotherDep;
}
}
Translating to Scala, we will have something like this:
abstract class Base(dep: Dependency) {
// something else
}
class Child @Inject() (anotherDep: AnotherDependency, dep: Dependency) extends Base(dep) {
// something else
}
Now, we can rewrite your code to use this knowledge and avoid deprecated APIs:
abstract class Microservice(serviceName: String, configuration: Configuration, ws: WSClient) {
  // play.api.Configuration.getString returns an Option, so provide a fallback (or fail fast)
  protected lazy val serviceURL: String =
    configuration.getString(s"microservice.$serviceName.url")
      .getOrElse(sys.error(s"Missing configuration key microservice.$serviceName.url"))
  // ...and functions using the injected WSClient...
}

// a class instead of an object
// annotated as a Singleton
@Singleton
class HelloWorldService @Inject() (configuration: Configuration, ws: WSClient)
  extends Microservice("helloWorld", configuration, ws) {
  // ...
}
The last point is the implicit ExecutionContext and here we have two options:
Use the default execution context, which will be play.api.libs.concurrent.Execution.Implicits.defaultContext
Use other thread pools
This depends on you, but you can easily inject an ActorSystem and look up the dispatcher there. If you decide to go with a custom thread pool, you can do something like this:
abstract class Microservice(serviceName: String, configuration: Configuration, ws: WSClient, actorSystem: ActorSystem) {
  // this will be available here and at the subclass too
  implicit val executionContext = actorSystem.dispatchers.lookup("my-context")

  // play.api.Configuration.getString returns an Option, so provide a fallback (or fail fast)
  protected lazy val serviceURL: String =
    configuration.getString(s"microservice.$serviceName.url")
      .getOrElse(sys.error(s"Missing configuration key microservice.$serviceName.url"))
  // ...and functions using the injected WSClient...
}

// a class instead of an object
// annotated as a Singleton
@Singleton
class HelloWorldService @Inject() (configuration: Configuration, ws: WSClient, actorSystem: ActorSystem)
  extends Microservice("helloWorld", configuration, ws, actorSystem) {
  // ...
}
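The "my-context" dispatcher looked up above is assumed to be defined in application.conf; following the Play thread-pool documentation, it would look roughly like:
my-context {
  fork-join-executor {
    parallelism-factor = 20.0
    parallelism-max = 200
  }
}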
How to use HelloWorldService?
Now, there are two things you need to understand in order to properly inject an instance of HelloWorldService where you need it.
Where does HelloWorldService get its dependencies from?
The Guice docs have a good explanation of this:
Dependency Injection
Like the factory, dependency injection is just a design pattern. The core principle is to separate behaviour from dependency resolution.
The dependency injection pattern leads to code that's modular and testable, and Guice makes it easy to write. To use Guice, we first need to tell it how to map our interfaces to their implementations. This configuration is done in a Guice module, which is any Java class that implements the Module interface.
Play itself declares modules for WSClient and for Configuration. These modules give Guice enough information about how to build those dependencies, and further modules describe how to build the dependencies that WSClient and Configuration need in turn. Again, the Guice docs have a good explanation:
With dependency injection, objects accept dependencies in their constructors. To construct an object, you first build its dependencies. But to build each dependency, you need its dependencies, and so on. So when you build an object, you really need to build an object graph.
In our case, for HelloWorldService, we are using constructor injection to enable Guice to set/create our object graph.
How is HelloWorldService injected?
Just like WSClient has a module describing how an implementation is bound to an interface/trait, we can do the same for HelloWorldService. The Play docs have a clear explanation of how to create and configure modules, so I won't repeat it here.
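A minimal module for this could look like the sketch below (the module and binding are illustrative; it binds the abstract Microservice type to HelloWorldService so the controller shown next can simply ask for a Microservice):
import com.google.inject.AbstractModule

// Hypothetical module: tells Guice which concrete class to use when a Microservice is requested.
class MicroserviceModule extends AbstractModule {
  override def configure(): Unit = {
    bind(classOf[Microservice]).to(classOf[HelloWorldService])
  }
}
The module would then be enabled in application.conf with play.modules.enabled += "MicroserviceModule".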
After creating and enabling such a module, to inject a HelloWorldService into your controller you just declare it as a dependency:
class MyController #Inject() (service: Microservice) extends Controller {
def index = Action {
// access "service" here and do whatever you want
}
}
In Scala, if you do not want to explicitly forward all the injected parameters to the base constructor, you can do it like this:
abstract class Base {
val depOne: DependencyOne
val depTwo: DependencyTwo
// ...
}
case class Child @Inject() (param1: Int,
depOne: DependencyOne,
depTwo: DependencyTwo) extends Base {
// ...
}

Spark Unit Testing: How to initialize sc only once for all the Suites using FunSuite

I want to write Spark unit test cases and I am using FunSuite for it.
But I want my sparkContext to be initialized only once, used by all the Suites, and then killed when all the Suites complete.
abstract class baseClass extends FunSuite with BeforeAndAfter {
  before {
    println("initialize spark context")
  }
  after {
    println("kill spark context")
  }
}

@RunWith(classOf[JUnitRunner])
class A extends baseClass {
  test("for class A") {
    //assert
  }
}

@RunWith(classOf[JUnitRunner])
class B extends baseClass {
  test("for class B") {
    //assert
  }
}
But when I run sbt test, I can see the println statements from baseClass being executed for both tests. Obviously, when an instance is created for each of the classes A and B, the abstract baseClass's before and after blocks run for each suite.
So how can I achieve my goal, i.e. have the Spark context initialized only once while all the test cases run?
Option 1: Use the excellent https://github.com/holdenk/spark-testing-base library, which does exactly that (and provides many other nice treats). After following the readme, it's as simple as mixing in SharedSparkContext instead of your baseClass, and you'll have an sc: SparkContext value ready to use in your test.
Option 2: To do it yourself, you'd want to mix in BeforeAndAfterAll rather than BeforeAndAfter and implement beforeAll and afterAll, which is exactly what the above-mentioned SharedSparkContext does.
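For reference, a hand-rolled version of option 2 might look roughly like this (the trait name is made up; it creates one context per suite, which is what beforeAll/afterAll give you):
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, Suite}

// Rough sketch (not the spark-testing-base implementation): one SparkContext
// per suite, created before the first test and stopped after the last one.
trait SharedSparkContextLike extends BeforeAndAfterAll { self: Suite =>
  @transient protected var sc: SparkContext = _

  override protected def beforeAll(): Unit = {
    super.beforeAll()
    sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("unit-tests"))
  }

  override protected def afterAll(): Unit = {
    try if (sc != null) sc.stop()
    finally super.afterAll()
  }
}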
I strongly recommend using the spark-testing-base library to manage the lifecycle of a SparkContext or SparkSession during your tests.
You won't have to pollute your tests by overriding the beforeAll and afterAll methods and managing the lifecycle of the SparkSession/SparkContext yourself.
You can share one SparkSession/SparkContext across all the tests by overriding the following method:
def reuseContextIfPossible: Boolean = true
For more details: https://github.com/holdenk/spark-testing-base/wiki/SharedSparkContext
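For instance, a suite using it could look roughly like this (a sketch; the override mirrors the wiki snippet above and the exact modifiers may vary between versions):
import com.holdenkarau.spark.testing.SharedSparkContext
import org.scalatest.FunSuite

// Sketch of a suite using spark-testing-base: sc comes from SharedSparkContext,
// and the override asks the library to reuse the context across suites.
class WordCountSuite extends FunSuite with SharedSparkContext {
  override implicit def reuseContextIfPossible: Boolean = true

  test("counting values") {
    val counts = sc.parallelize(Seq("a", "b", "a")).countByValue()
    assert(counts("a") == 2L)
  }
}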
I hope it helps!
If you really want to share the context between suites, you'll have to make it static. Then you can use a lazy val so it is created on first use. As for shutting it down, you can rely on the automatic shutdown hook that is created whenever a context is created.
It would look something like this:
abstract class SparkSuiteBase extends FunSuite {
lazy val sparkContext = SparkSuiteBase.sparkContext
}
// putting the Spark Context inside an object allows reusing it between tests
object SparkSuiteBase {
private lazy val sparkContext = ??? // create the context here
}