Play framework - testing the data access layer without a started application - Scala

Is there a way to write tests for data access objects (DAOs) in Play Framework 2.x without starting an application?
Tests that run inside a fake application are relatively slow, even when the database is an in-memory H2 as the docs suggest.

After experiencing similar issues with the execution time of tests using FakeApplication, I switched to a different approach. Instead of creating one fake application per test, I start a real instance of the application and run all my tests against it. With a large test suite this is a big win in total execution time.
http://yefremov.net/blog/fast-functional-tests-play/
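For illustration, a minimal sketch of that idea using the scalatestplus-play library (OneServerPerSuite, called GuiceOneServerPerSuite in newer versions, boots the application once per suite); UserDao and User are hypothetical stand-ins for your own classes:

import org.scalatestplus.play.{OneServerPerSuite, PlaySpec}

class UserDaoIntegrationSpec extends PlaySpec with OneServerPerSuite {

  // `app` is provided by OneServerPerSuite: the application starts once
  // and is shared by every test in this suite
  lazy val userDao = app.injector.instanceOf[UserDao]

  "UserDao" must {
    "round-trip a record" in {
      val id = userDao.insert(User(name = "test"))
      userDao.find(id).map(_.name) mustBe Some("test")
    }
  }
}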

For unit testing, a good solution is mocking. If you are using Play 2.4 or above, Mockito is already built in, so you do not have to add it as a separate dependency.
For integration testing, you cannot avoid a fake application entirely, since your DAOs sometimes need application context information, for example values defined in application.conf. In that case you must set up a FakeApplication with a test configuration so that the DAOs have that information.
This sample repo, https://github.com/luongbalinh/play-mongo/tree/master/test, contains tests at the service and controller layers, including both unit tests with Mockito and integration tests. Integration tests for DAOs would look very similar to the service tests, so hopefully it gives you a hint of how to use Mockito to write DAO tests.
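As a minimal sketch of the Mockito approach (UserDao, User and UserService are hypothetical names; MockitoSugar lives in org.scalatest.mockito or org.scalatestplus.mockito depending on your ScalaTest version):

import org.mockito.Mockito._
import org.scalatest.mockito.MockitoSugar
import org.scalatestplus.play.PlaySpec

class UserServiceSpec extends PlaySpec with MockitoSugar {

  "UserService" must {
    "return the name supplied by the DAO" in {
      val dao = mock[UserDao]                                  // no database involved
      when(dao.find(123)).thenReturn(Some(User(123, "Alice")))

      val service = new UserService(dao)                       // constructor injection
      service.displayName(123) mustBe Some("Alice")

      verify(dao).find(123)
    }
  }
}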

It turns out the Database object can be constructed directly via the Databases factory, so I ended up with a trait like this one:
import org.scalatest.{BeforeAndAfterAll, Suite, SuiteMixin}
import play.api.db.Databases

trait DbTests extends BeforeAndAfterAll with SuiteMixin { this: Suite =>

  val dbUrl = sys.env.getOrElse("DATABASE_URL",
    "jdbc:postgresql://localhost:5432/test?user=user&password=pass")

  val database = Databases("org.postgresql.Driver", dbUrl, "tests")

  override def afterAll() = {
    database.shutdown()
  }
}
then use it the following way:
class SampleDaoTest extends PlaySpec with DbTests { // any WordSpec-style suite with must-matchers works here

  val myDao = new MyDao(database) // construct the DAO; the database is built by the trait and simply passed in

  "read from db" in {
    myDao.read(id = 123) mustEqual MyClass(123)
  }
}

Related

Cannot mock the RedisClient class - method pipeline overrides nothing

I have a CacheService class that uses an instance of the scala-redis client:
class CacheService(redisClient: RedisClient) extends HealthCheck {

  private val client = redisClient

  override def health: Future[ServiceHealth] = {
    client.info
    ...
  }
}
In my unit test, I'm mocking the client instance and testing the service
class CacheServiceSpec extends AsyncFlatSpec with AsyncMockFactory {

  val clientMock = mock[RedisClient]
  val service = new CacheService(clientMock)

  "A cache service" must "return a successful future when healthy" in {
    (clientMock.info _).expects().returns(Option("blah"))

    service.health map { health =>
      assert(health.status == ServiceStatus.Running)
    }
  }
}
yet I'm getting this compilation error
Error:(10, 24) method pipeline overrides nothing.
Note: the super classes of <$anon: com.redis.RedisClient> contain the following, non final members named pipeline:
def pipeline(f: PipelineClient => Any): Option[List[Any]]
val clientMock = mock[RedisClient]
My research so far indicates ScalaMock 4 is NOT capable of mocking companion objects. The author suggests refactoring the code with Dependency Injection.
Am I doing DI correctly (I chose constructor args injection since our codebase is still relatively small and straightforward)? Seems like the author is suggesting putting a wrapper over the client instance. If so, I'm looking for an idiomatic approach.
Should I bother with swapping out for another redis library? The libraries being actively maintained, per redis.io's suggestion, use companion objects as well. I personally think this is not a problem with these libraries.
I'd appreciate any further recommendations. My goal here is to create a health check for our external services (redis, postgres database, emailing and more) that is at least testable. Criticism is welcomed since I'm still new to the Scala ecosystem.
Am I doing DI correctly (I chose constructor args injection since our codebase is still relatively small and straightforward)? Seems like the author is suggesting putting a wrapper over the client instance. If so, I'm looking for an idiomatic approach.
Yes, you are right, and this seems to be a known issue (link1). Ideally there needs to be a wrapper around the client instance. One approach could be to create a trait with, say, a connect method, extend it in a RedisCacheDao, and implement connect to hand you the client instance whenever you need it. Then all you have to do is mock this connection interface and you will be able to test.
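A rough sketch of that wrapper idea (RedisConnection and LiveRedisConnection are hypothetical names; HealthCheck, ServiceHealth and ServiceStatus are the types from your own codebase, so their exact shapes are assumed here):

import scala.concurrent.{ExecutionContext, Future}
import com.redis.RedisClient

// Small trait the test can mock instead of RedisClient itself
trait RedisConnection {
  def client: RedisClient
}

class LiveRedisConnection(host: String, port: Int) extends RedisConnection {
  override lazy val client: RedisClient = new RedisClient(host, port)
}

// CacheService now depends on the trait rather than on the concrete client
class CacheService(connection: RedisConnection)(implicit ec: ExecutionContext) extends HealthCheck {
  override def health: Future[ServiceHealth] =
    Future(connection.client.info).map(_ => ServiceHealth(ServiceStatus.Running))
}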
Another approach could be to use an embedded Redis for unit testing, though usually it is used for integration testing. You can start a simple Redis server from code while the tests are running and shut it down once testing is done.
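For example, a quick sketch assuming the commonly used embedded-redis artifact (redis.embedded.RedisServer); the port is arbitrary:

import org.scalatest.{BeforeAndAfterAll, Suite}
import redis.embedded.RedisServer

trait EmbeddedRedis extends BeforeAndAfterAll { this: Suite =>

  private val redisServer = new RedisServer(6380) // any free port

  override def beforeAll(): Unit = redisServer.start()
  override def afterAll(): Unit = redisServer.stop()
}

// Tests mix in EmbeddedRedis and point a real RedisClient at localhost:6380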
Should I bother with swapping out for another redis library? The libraries being actively maintained, per redis.io's suggestion, use companion objects as well. I personally think this is not a problem of these libraries.
You can certainly do that. I would prefer Jedis as it is easy to use and its performance is better than scala-redis (e.g. when performing mget).
Let me know if it helps!!

Mocking DynamoDB in Spark with Mockito

I want to mock the Utilities function dynamoDBStatusWrite so that when my Spark program runs, it will not hit DynamoDB.
Below is my mocking and test case stuff
class FileConversion1Test extends FlatSpec with MockitoSugar with Matchers with ArgumentMatchersSugar with SparkSessionTestWrapper {

  "File Conversion" should "convert the file to" in {
    val utility = mock[Utilities1]
    val client1 = mock[AmazonDynamoDB]
    val dynamoDB1 = mock[DynamoDB]
    val dynamoDBFunc = mock[Utilities1].dynamoDBStatusWrite("test", "test", "test", "test")

    val objUtilities1 = new Utilities1
    FieldSetter.setField(objUtilities1, objUtilities1.getClass.getDeclaredField("client"), client1)
    FieldSetter.setField(objUtilities1, objUtilities1.getClass.getDeclaredField("dynamoDB"), dynamoDB1)
    FieldSetter.setField(objUtilities1, objUtilities1.getClass.getField("dynamoDBStatusWrite"), dynamoDBFunc)

    when(utility.dynamoDBStatusWrite("test", "test", "test", "test")).thenReturn("pass")

    assert(FileConversion1.fileConversionFunc(spark, "src/test/inputfiles/userdata1.csv", "parquet", "src/test/output", "exec1234567", "service123") === "passed")
  }
}
My Spark program should not try to connect to DynamoDB, but it is still trying to connect.
You have two problems there. For starters, the fact that you mock something doesn't automatically replace it in your system; you need to build your software so that components are injected, and in the test you then provide a mock version of them. That is, fileConversionFunc should receive another parameter carrying the connector to DynamoDB.
That said, it's considered bad practice to mock library/third-party classes. What you should do is create your own component that encapsulates the interaction with DynamoDB, and then mock your component, since it's an API you control.
You can find a detailed explanation of why here
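As an illustration only (StatusWriter, DynamoStatusWriter and the extra parameter on fileConversionFunc are hypothetical names, not your actual signatures):

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB
import org.apache.spark.sql.SparkSession

// Your own seam around DynamoDB: this trait is what the test mocks
trait StatusWriter {
  def writeStatus(table: String, execId: String, service: String, status: String): String
}

class DynamoStatusWriter(client: AmazonDynamoDB) extends StatusWriter {
  override def writeStatus(table: String, execId: String, service: String, status: String): String = {
    // the real putItem call against DynamoDB lives here
    "pass"
  }
}

object FileConversion1 {
  // The writer is injected, so a test can pass a mock and never touch DynamoDB
  def fileConversionFunc(spark: SparkSession, input: String, format: String, output: String,
                         execId: String, service: String, statusWriter: StatusWriter): String = {
    // ... conversion logic ...
    statusWriter.writeStatus("statusTable", execId, service, "pass")
    "passed"
  }
}

// In the test: stub writeStatus on mock[StatusWriter] to return "pass" and pass it as the last argument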

Not able to mock the getTimestamp method on mocked Row

I am writing an application that interacts with Cassandra using Scala. For unit testing I am using Mockito, mocking the ResultSet and Row:
val mockedResultSet = mock[ResultSet]
val mockedRow = mock[Row]
Mocking most methods of mockedRow, such as
doReturn("mocked").when(mockedRow).getString("ColumnName")
works fine. However, I am not able to mock the getTimestamp method of mockedRow. I have tried two approaches, but neither was successful.
First approach
val testDate = "2018-08-23 15:51:12+0530"
val formatter = new SimpleDateFormat("yyyy-mm-dd HH:mm:ssZ")
val date: Date = formatter.parse(testDate)
doReturn(date).when(mockedRow).getTimestamp("ColumnName")
and second approach
when(mockedRow.getTimestamp("column")).thenReturn(Timestamp.valueOf("2018-08-23 15:51:12+0530"))
Both of them return null, i.e. the mocked value of getTimestamp is not returned. I am using the cassandra-driver-core 3.0 dependency in my project.
Any help would be highly appreciated. Thanks in advance!
Mocking objects you don't own is usually considered a bad practice. That said, what you can do to see what's going on is verify the interactions with the mock, i.e.
verify(mockedRow).getTimestamp("column")
Given you are getting null out of the mock, that statement should fail, but the failure will show all the actual calls received by the mock (and their parameters), which should help you debug.
A way to minimize this kind of problem is to use a Mockito session. In standard Mockito, sessions can only be used through a JUnit runner, but with mockito-scala you can use them manually like this:
MockitoScalaSession().run {
  val mockedRow = mock[Row]
  when(mockedRow.getTimestamp("column")).thenReturn(Timestamp.valueOf("2018-08-23 15:51:12+0530"))
  // Execute your test
}
That code will check that the mock is not called with anything that hasn't been stubbed; it will also tell you if you provided stubs that weren't actually used, and a few more things.
If you like that behaviour (and you are using ScalaTest) you can apply it automatically to every test by using MockitoFixture
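For instance, a minimal sketch (the exact package of MockitoFixture may vary across mockito-scala versions):

import org.mockito.scalatest.MockitoFixture
import org.scalatest.FlatSpec

class RowMappingSpec extends FlatSpec with MockitoFixture {
  // Every test in this suite now runs inside its own MockitoScalaSession,
  // so unexpected calls and unused stubbings are reported automatically.
}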
I'm a developer of mockito-scala btw

How to unit test a playframework 2 application [duplicate]

Possible Duplicate:
Mock Objects in Play[2.0]
I am learning Scala and the Play Framework while developing a simple application. One thing frustrates me: I have a strong C# background and am used to unit testing in the classical sense, mocking the underlying services and testing only the code in the given class.
So the question is: how do I unit test a Play Framework application written in Scala? The way of testing proposed by the Play manual is integration testing, which is good, but not what I need. In particular, how do I mock the data access layer?
Creating mock objects is usually needed when you cannot isolate your tests because too many dependencies have to be loaded in your application before testing. You don't have that limitation when you test your data access layer in Play 2.x. Therefore, all you need to do is use a Specs2 Specification and load the in-memory database with FakeApplication(additionalConfiguration = inMemoryDatabase()).
A complete test could then be written like this:
class ProjectSpec extends Specification {

  "Project model" should {
    "be created with id and name" in {
      running(FakeApplication(additionalConfiguration = inMemoryDatabase())) {
        val beforeCount = Project.count
        val project = Project.create(Project("Test name", "Test description"))
        project.id must beSome
        project.name must equalTo("Test name")
        Project.count must equalTo(beforeCount + 1L)
      }
    }
  }
}

Grails (2.0.3) with rest plugin works! But unit test fails. Why?

Pardon if this is a n00b question, I'm a n00b to grails...
I have installed the "rest" plugin using grails install-plugin rest.
My service class has this code (redacted):
def index() {
    def data
    withRest(uri: 'http://localhost:8090/some/valid/url/running/here/') {
        auth.basic 'admin', 'admin'
        def response = get(path: 'something', query: [format: 'json'])
        data = response.data
    }
    return data
}
If I run grails console, instantiate my service class and call service.index(), I get my expected JSON result. This code works as expected. It works through a controller. It even works through a controller via an integration test.
Here is my unit test:
void testIndex() {
    def response = service.index()
    assertEquals(response.total, 2)
    assertEquals(response.receipts.size, 2)
    assertEquals(response.receipts.collectEntries { [it.id, [id: it.id]] }, [1: [id: 1], 2: [id: 2]])
}
This fails with an error:
groovy.lang.MissingMethodException: No signature of method: torch.ReceiptService.withRest() is applicable for argument types: (java.util.LinkedHashMap, torch.ReceiptService$_index_closure1) values: [[uri:http://localhost:8090/some/valid/url/running/here/], ...]
So it seems that when the test is running, the plugin is not active. I have not done any additional configuration regarding the plugin. I don't really understand why the test environment should have this class compiled differently.
My intention was to mock the network interface once I got this going, since it has an external dependency. But I'm taking it a step at a time.
Do I need to inject a mock withRest() to even run the test? Or is something else amiss?
Thanks!
This is how unit tests work. No plugins are active, there's no Spring, Hibernate, etc. You're just running a class and so everything has to be mocked. Seems like REST is a poor candidate for mocking though, since then you'd just be testing the mocks.
I'd probably test it with a functional test. This is unfortunately less convenient than unit tests, but bad tests aren't very useful :) You could make the URL come from config so that a different URL can be used in the test environment, avoiding the external dependency, and then set up a real REST service for testing that returns known values.