Mocking database with Slick in ScalaTest + Mockito and testing UPDATE - scala

The documentation for unit testing a Scala application https://www.playframework.com/documentation/2.4.x/ScalaTestingWithScalaTest talks about mocking database access using Mockito. While this approach works very well for testing methods that read information from the database, I'm not seeing a clear solution for how to test methods that insert, update or delete data.
This is what I have setup so far:
trait UserRepository { self: HasDatabaseConfig[JdbcProfile] =>
import driver.api._
class UserTable(tag: Tag) extends Table[userModel](tag, "users") {
def id = column[Int]("id", O.PrimaryKey, O.AutoInc )
def email = column[String]("email")
def * = (id.?, email) <> (userModel.tupled, userModel.unapply _)
}
def allUsers() : Future[Seq[userModel]]
def update(user: userModel) : Future[Int]
}
class SlickUserRepository extends UserRepository with HasDatabaseConfig[JdbcProfile] {
import driver.api._
protected val dbConfig = DatabaseConfigProvider.get[JdbcProfile](Play.current)
private val users = TableQuery[UserTable]
override def allUsers(): Future[Seq[userModel]] = {
db.run(users.result)
}
override def update(user: userModel): Future[Int] = {
db.run(users.filter(_.id === user.id).update(user))
}
}
class UserService(userRepository: UserRepository) {
def getUserById(id: Int): Future[Option[userModel]] = {
userRepository.allUsers().map { users =>
users.find(_.id.get == id)
}
}
// TODO, test this...
def updateUser(user: userModel): Future[Int] = {
userRepository.update(user)
}
}
And then my tests:
class UserSpec extends PlaySpec with MockitoSugar with ScalaFutures {
"UserService" should {
val userRepository = mock[UserRepository]
val user1 = userModel(Option(1), "user1@test.com")
val user2 = userModel(Option(2), "user2@test.com")
// mock the access and return our own results
when(userRepository.allUsers) thenReturn Future {Seq(user1, user2)}
val userService = new UserService(userRepository)
"should find users correctly by id" in {
val future = userService.getUserById(1)
whenReady(future) { user =>
user.get mustBe user1
}
}
"should update user correctly" in {
// TODO test this
}
}
}
I suppose I need to mock out the 'update' method and create a stub that takes the argument and updates the mocked data. However, my skills in Scala are limited and I can't wrap my head around it. Is there perhaps a better way?
Thanks!

I'd recommend two unit test classes here: one for testing the logic in the UserService class, and another for testing the logic in the UserRepository implementation (for that one, use a dummy test class that extends the trait). Since the SlickUserRepository class has its own test coverage, the UserService test class can use a mock[UserRepository] in its tests without degrading coverage, and its tests only focus on that class's own logic.
Doing this really simplifies the UserService tests so I won't dwell on those.
For the SlickUserRepository tests, I would recommend restructuring the logic in the SlickUserRepository class.
I'd recommend separating out the logic that lives inside db.run into a separate method that constructs an action. This lets you write direct tests against the logic that would otherwise be buried inside db.run {}.
You will find that having db.run embedded inside your update method, as it is now, impairs your ability to construct transactions that span multiple table calls. DBIO actions need to be chained together and run in a single db.run(myDbAction.transactionally) call to be transactional. That's why I personally keep my db.run calls in a business logic layer rather than directly inside the persistence layer as in your example.
Wherever you do have a db.run call, it's nice to wrap it in a separate method so you can easily spy around the call:
def run[M](action: DBIO[M]): Future[M] = {
db.run(action)
}
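Applied to the repository from the question, the restructuring might look roughly like this. This is only a sketch: updateAction is a name I'm introducing here, and the imports and dbConfig wiring are elided as in the original.
class SlickUserRepository extends UserRepository with HasDatabaseConfig[JdbcProfile] {
  import driver.api._
  private val users = TableQuery[UserTable]

  // pure action construction: unit-testable without a database, and composable
  def updateAction(user: userModel): DBIO[Int] =
    users.filter(_.id === user.id).update(user)

  // single execution point that is easy to spy on in tests
  def run[M](action: DBIO[M]): Future[M] = db.run(action)

  override def update(user: userModel): Future[Int] = run(updateAction(user))

  // later, several actions can be combined into one transaction, e.g.
  // run(DBIO.seq(updateAction(u1), updateAction(u2)).transactionally)
}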
Futures don't need to be mocked. Simply define them as the result you want:
Future.failed(yourInstantiatedException)
Future.successful(yourInstantiatedSuccessValue)


Mock new object creation in Scala

I want to write a unit test for the Scala class below.
In this implementation, QueryConfig is a final case class.
class RampGenerator {
def createProfile(queryConfig: QueryConfig): String = {
new BaseQuery(queryConfig).pushToService().getId
}
}
The unit test I have written is this
@RunWith(classOf[JUnitRunner])
class RampGeneratorTest extends FlatSpec with Matchers {
"createProfile" must "succeed" in {
val rampGenerator = new RampGenerator()
val queryConfig = QueryConfig("name", "account", "role")
val baseQuery = mock(classOf[BaseQuery])
val profile = mock(classOf[Profile])
when(new BaseQuery(queryConfig)).thenReturn(baseQuery)
when(baseQuery.pushToService()).thenReturn(profile)
when(profile.getId).thenReturn("1234")
val id = rampGenerator.createProfile(queryConfig)
assert(id.equals("1234"))
}
}
Currently it gives the exception below, which is expected, since the argument to when() is not a call on a mocked class. How do I mock the new instance creation?
org.mockito.exceptions.misusing.MissingMethodInvocationException:
when() requires an argument which has to be 'a method call on a mock'.
For example:
when(mock.getArticles()).thenReturn(articles);
There are two options:
Use PowerMockito to mock the constructor (see this question for details)
Externalize object creation
A bit more on the second option: this is actually a testing technique that helps in a variety of situations (a couple of examples: yours, and creating Akka actors while asserting on hierarchies), so it might be useful to just have it in the "toolbox".
In your case it'll look something like this:
class RampGenerator(queryFactory: QueryFactory) {
def createProfile(queryConfig: QueryConfig) = queryFactory.buildQuery(queryConfig).pushToService().getId()
}
class QueryFactory() {
def buildQuery(queryConfig: QueryConfig): BaseQuery = ...
}
@RunWith(classOf[JUnitRunner])
class RampGeneratorTest extends FlatSpec with Matchers {
"createProfile" must "succeed" in {
val queryConfig = QueryConfig("name", "account", "role")
val queryFactory = mock(classOf[QueryFactory])
val rampGenerator = new RampGenerator(queryFactory)
val profile = mock(classOf[Profile])
val baseQuery = mock(classOf[BaseQuery])
when(queryFactory.buildQuery(queryConfig)).thenReturn(baseQuery)
when(baseQuery.pushToService()).thenReturn(profile)
when(profile.getId).thenReturn("1234")
val id = rampGenerator.createProfile(queryConfig)
assert(id.equals("1234"))
}
}
Please note the query factory does not have to be a separate factory class/hierarchy of classes (and certainly does not require something as heavyweight as the abstract factory pattern, although you can use it). In particular, my initial version just used a queryFactory: QueryConfig => BaseQuery function, but Mockito cannot mock functions...
If you prefer to inject the factory method directly (as a function), ScalaMock has support for mocking functions.
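If you go the function route, a rough sketch with ScalaMock (assuming the test mixes in org.scalamock.scalatest.MockFactory; the class mocks are still created through Mockito, qualified here to avoid clashing with ScalaMock's own mock method):
import org.mockito.Mockito
import org.scalamock.scalatest.MockFactory
import org.scalatest.{FlatSpec, Matchers}

// variant that injects the factory as a plain function
class RampGenerator(buildQuery: QueryConfig => BaseQuery) {
  def createProfile(queryConfig: QueryConfig): String =
    buildQuery(queryConfig).pushToService().getId
}

class RampGeneratorFunctionTest extends FlatSpec with Matchers with MockFactory {
  "createProfile" must "succeed with an injected factory function" in {
    val queryConfig = QueryConfig("name", "account", "role")
    val baseQuery = Mockito.mock(classOf[BaseQuery])
    val profile = Mockito.mock(classOf[Profile])
    Mockito.when(baseQuery.pushToService()).thenReturn(profile)
    Mockito.when(profile.getId).thenReturn("1234")

    // ScalaMock can mock plain Scala functions, which Mockito cannot
    val buildQuery = mockFunction[QueryConfig, BaseQuery]
    buildQuery.expects(queryConfig).returning(baseQuery)

    new RampGenerator(buildQuery).createProfile(queryConfig) shouldBe "1234"
  }
}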

Play-Slick: Is it possible to improve this design (pattern) ... and how to call it?

I'm using Play-Slick versions 2.5.x and 3.1.x respectively. I use Slick's code generator to produce the Slick model from an existing database. Actually, I'm shy to admit that I'm DB-design driven rather than class-design driven.
This is the initial setup:
Generated Slick model under generated.Tables._
Generic Slick dao implementation
Service layer that builds on top of the Generic Slick dao
These are the forces behind the pattern, which I've temporarily called "Pluggable Service" because it allows plugging the service layer functionality into the model:
Play's controllers and views must only see the Service layer (and not the Dao's) e.g. UserService
The generated model, e.g. UserRow, is expected to comply with business layer interfaces, e.g. Deadbolt-2's Subject, but not to implement them directly. To implement one of them you need "too much", e.g. the UserRow model type, the UserDao and potentially some business context.
Some of the UserService methods naturally apply to the model UserRow instance e.g. loggedUser.roles or loggedUser.changePassword
Therefore I have:
generated.Tables.scala Slick model classes:
case class UserRow(id: Long, username: String, firstName: String,
lastName : String, ...) extends EntityAutoInc[Long, UserRow]
dao.UserDao.scala Dao extensions and customizations specific to the User model:
@Singleton
class UserDao @Inject()(protected val dbConfigProvider: DatabaseConfigProvider)
extends GenericDaoAutoIncImpl[User, UserRow, Long] (dbConfigProvider, User) {
//------------------------------------------------------------------------
def roles(user: UserRow) : Future[Seq[Role]] = {
val action = (for {
role <- SecurityRole
userRole <- UserSecurityRole if role.id === userRole.securityRoleId
user <- User if userRole.userId === user.id
} yield role).result
db.run(action)
}
}
services.UserService.scala service that acts as a facade for all user operations for the rest of the Play application:
@Singleton
class UserService @Inject()(auth : PlayAuthenticate, userDao: UserDao) {
// implicitly executes a DBIO and waits indefinitely for
// the Future to complete
import utils.DbExecutionUtils._
//------------------------------------------------------------------------
// Deadbolt-2 Subject implementation expects a List[Role] type
def roles(user: UserRow) : List[Role] = {
val roles = userDao.roles(user)
roles.toList
}
}
services.PluggableUserService.scala finally the actual "Pluggable" pattern that dynamically attaches service implementations to the model type:
trait PluggableUserService extends be.objectify.deadbolt.scala.models.Subject {
override def roles: List[Role]
}
object PluggableUserService {
implicit class toPluggable(user: UserRow)(implicit userService: UserService)
extends PluggableUserService {
//------------------------------------------------------------------------
override def roles: List[Role] = {
userService.roles(user)
}
}
}
Finally, in the controllers one can do:
@Singleton
class Application @Inject() (implicit
val messagesApi: MessagesApi,
session: Session,
deadbolt: DeadboltActions,
userService: UserService) extends Controller with I18nSupport {
import services.PluggableUserService._
def index = deadbolt.WithAuthRequest()() { implicit request =>
Future {
val user: UserRow = userService.findUserInSession(session)
// auto-magically plugs the service to the model
val roles = user.roles
// ...
Ok(views.html.index)
}
}
}
Is there any Scala way that could help avoid writing the boilerplate code in the PluggableUserService object? And does the "Pluggable Service" name make sense?
One common variant may be a parent trait for your controllers that has something along these lines:
def MyAction[A](bodyParser: BodyParser[A] = parse.anyContent)
(block: (UserWithRoles) => (AuthenticatedRequest[A]) => Future[Result]): Action[A] = {
deadbolt.WithAuthRequest()(bodyParser) { request =>
val user: UserRow = userService.findUserInSession(session)
// this may be as you had it originally
// but I don't see a reason not to
// simply pull it explicitly from db or
// to have it in the session together with roles in the first place (as below UserWithRoles class)
val roles = user.roles
block(UserWithRoles(user, roles))(request)
}
}
The elephant in the room here is how you get the userService instance. Well, you would need to explicitly require it in your controller constructor (in the same way you do with DeadboltActions). Alternatively, you may bundle DeadboltActions, UserService and whatever else into one class (e.g. ControllerContext?) and inject this single instance as one constructor parameter (but that's probably another discussion...).
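A minimal sketch of that bundling idea follows. ControllerContext is just an illustrative name (not an existing Play type), and the answer above also assumes something like a UserWithRoles case class, so both are spelled out here:
// assumed by the MyAction sketch above
case class UserWithRoles(user: UserRow, roles: List[Role])

// bundles the cross-cutting controller dependencies into one injectable value
case class ControllerContext @Inject()(
    deadbolt: DeadboltActions,
    userService: UserService,
    messagesApi: MessagesApi)

@Singleton
class Application @Inject()(implicit ctx: ControllerContext, session: Session)
    extends Controller with I18nSupport {

  // I18nSupport still needs a MessagesApi; pull it from the bundle
  def messagesApi: MessagesApi = ctx.messagesApi

  // the parent trait's MyAction can likewise reach deadbolt and userService via ctx
}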
After that your controller code would be like this:
def index = MyAction() { implicit user => implicit request =>
Future {
// ...
Ok(views.html.index)
}
}
Both user and request are implicit, which helps to pass them into the inner parts of your application (which is often the case: you bring the user object along to perform some business logic).
It doesn't get rid of your PluggableUserService per se (the logic is still there), but it may help you reuse the same logic more easily across your controllers (in my experience, you need both the User and its Roles more often than not in any real application).
EDIT: I have a feeling I didn't quite get your question. Do you want to avoid the boilerplate in PluggableUserService, or do you want to avoid scattering this conversion (the use of PluggableUserService) across every controller (IMHO the second is something to be avoided)?

Chain functions in different way

Scala functions have the following methods for chaining:
fn1.andThen(fn2)
fn1.compose(fn2)
But how can this case be written:
I have a function cleanUp() which always has to be called as the last step.
And I have a bunch of other functions, like this:
class Helper {
private[this] val umsHelper = new UmsHelper()
private[this] val user = umsHelper.createUser()
def cleanUp = ... // delete user and other entities
def prepareModel(model: TestModel) = {
// create model on behalf of the user
}
def commitModel() = {
// commit model on behalf of the user
}
}
And some external code can use it something like this:
val help = new Helper()
help.prepareModel()
help.commitModel()
// as the last step, cleanUp should be called implicitly
How can this be written in a functional way, so that the chaining will always
call the cleanUp function implicitly as the last step?
Note: I see it as an analogue of a destructor in C++. Some chain (it doesn't matter how the chain is built), fn1 andLater fn2 andLater fn3, has to call cleanUp as the last step (fn1 andLater fn2 andLater fn3 andLater cleanUp). The problem with calling the cleanUp method directly is that there is a big chance someone will miss this step and the user will be leaked (will stay in the database).
This is a more advanced alternative:
When you hear "context" and "steps", there's a functional pattern that directly comes to mind: monads. Rolling your own monad instance can simplify the user side of putting valid steps together, while providing guarantees that the context will be cleaned up after them.
Here, we are going to develop a "CleanableContext" construction that follows that pattern.
We base our construct on the simplest monad, one whose only function is to hold a value. We're going to call it Context:
trait Context[A] { self =>
def flatMap[B](f:A => Context[B]): Context[B] = f(value)
def map[B](f:A => B): Context[B] = flatMap(f andThen ((b:B) => Context(b)))
def value: A
}
object Context {
def apply[T](x:T): Context[T] = new Context[T] { val value = x }
}
Then we have a CleanableContext, which is capable of "cleaning up after itself" provided some 'cleanup' function:
trait CleanableContext[A] extends Context[A] {
override def flatMap[B](f:A => Context[B]): Context[B] = {
val res = super.flatMap(f)
cleanup
res
}
def cleanup: Unit
}
And now, we have an object that's able to produce a cleanable UserContext that will take care of managing the creation and destruction of users.
object UserContext {
def apply(x:UserManager): CleanableContext[User] = new CleanableContext[User] {
val value = x.createUser
def cleanup = x.deleteUser(value)
}
}
Let's say that we have also our model and business functions already defined:
trait Model
trait TestModel extends Model
trait ValidatedModel extends Model
trait OpResult
object Ops {
def prepareModel(user: User, model: TestModel): Model = new Model {}
def validateModel(model: Model): ValidatedModel = new ValidatedModel {}
def commitModel(user: User, vmodel: ValidatedModel): OpResult = new OpResult {}
}
Usage
With that reusable machinery in place, our users can express our process in a succinct way:
import Ops._
val ctxResult = for {
user <- UserContext(new UserManager{})
validatedModel <- Context(Ops.prepareModel(user, testModel)).map(Ops.validateModel)
commitResult <- Context(commitModel(user, validatedModel))
} yield commitResult
The result of the process is still encapsulated, and can be taken "out" from the Context with the value method:
val result = ctxResult.value
Notice that we need to encapsulate the business operations in a Context to use them in this monadic composition. Note as well that we don't need to manually create or clean up the user used for the operations; that's taken care of for us.
Furthermore, if we needed more than one kind of managed resource, this method could be used to take care of managing additional resources by composing different contexts together.
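For instance, a second managed resource could get its own cleanable context and be composed in the same for comprehension. TempFileContext below is purely illustrative and not part of the original machinery:
object TempFileContext {
  def apply(): CleanableContext[java.io.File] = new CleanableContext[java.io.File] {
    val value = java.io.File.createTempFile("work", ".tmp")
    def cleanup: Unit = value.delete()   // removed once the composition is done
  }
}

val combined = for {
  user <- UserContext(new UserManager {})
  tmp  <- TempFileContext()              // both resources are cleaned up after use
  res  <- Context(Ops.commitModel(user, Ops.validateModel(Ops.prepareModel(user, new TestModel {}))))
} yield res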
With this, I just want to provide another angle to the problem. The plumbing is more complex, but it creates a solid ground for users to create safe processes through composition.
I think the core of the question is "how to keep a resource within a managed context", i.e. provide users with a way to use the resource while preventing it from 'leaking' outside its context.
One possible approach is to provide functional access to the managed resource, where the API takes functions that operate on the resource in question. Let me illustrate this with an example:
First, we define the domain of our model (I've added some subtypes of Model to make the example clearer):
trait User
trait Model
trait TestModel extends Model
trait ValidatedModel extends Model
trait OpResult
// Some external resource provider
trait Ums {
def createUser: User
def deleteUser(user: User)
}
Then we create a class to hold our specific context.
class Context {
private val ums = new Ums{
def createUser = new User{}
def deleteUser(user: User) = ???
}
def withUserDo[T](ops: User => T):T = {
val user = ums.createUser
val result = ops(user)
ums.deleteUser(user)
result
}
}
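One caveat with the class above: if ops throws, deleteUser is never reached. If that matters, the same method can be written with try/finally (a small variation, not from the original answer):
// variant of withUserDo that guarantees cleanup even when ops throws
def withUserDo[T](ops: User => T): T = {
  val user = ums.createUser
  try ops(user)
  finally ums.deleteUser(user)
}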
The companion object provides (some) operations on the managed resource. Users can provide their own functions as well.
object Context {
def prepareModel(model: TestModel): User => Model = ???
val validateModel: Model => ValidatedModel = ???
val commitModel: ValidatedModel => OpResult = ???
}
We can instantiate our context and declare operations on it, using a classic declaration, like:
import Context._ // brings prepareModel, validateModel and commitModel into scope
val ctx = new Context
val testModel = new TestModel{}
val result = ctx.withUserDo{ user =>
val preparedModel = prepareModel(testModel)(user)
val validatedModel = validateModel(preparedModel)
commitModel(validatedModel)
}
Or, given the desire in the question to use functional composition, we could rewrite this as:
val result = ctx.withUserDo{
prepareModel(testModel) andThen validateModel andThen commitModel
}
Use autoClean; it will automatically call cleanUp at the end.
Create a HelperStuff trait which contains all the necessary functions.
Inside the Helper object, create a private implementation of HelperStuff and then have a method called autoClean which does the work, keeping the Helper instance private and safely away from rogue users.
Helper.autoClean { helperStuff =>
// write all your code here; cleanup will happen automatically
helperStuff.foo()
helperStuff.commitModel
}
Here is the autoClean function for you
trait HelperStuff {
def foo(): Unit
def commitModel: Unit
def cleanUp(): Unit
}
object Helper {
private class Helper extends HelperStuff {
def foo(): Unit = println("foo")
def commitModel: Unit = println("model committed")
def cleanUp(): Unit = println("cleaning done")
}
private val helper = new Helper()
def autoClean[T](code: HelperStuff => T): T = {
val result = code(helper)
helper.cleanUp()
result
}
}

How to setup my mock Dao classes when using Guice?

A typical service looks like this:
trait BaseService extends LazyLogging {
def getDb() = {
DatabaseHelper.getDb // database for the scala slick library
}
}
abstract class UserService extends BaseService {
def getById(userId: Int): Option[User]
}
class UserServiceImpl @Inject()(val userDao: UserDao) extends UserService {
def getById(userId: Int): Option[User] = {
getDb().withSession { implicit session =>
return userDao.getById(userId)
}
}
}
Using Guice I wire up my objects like:
class ServiceModule extends ScalaModule {
def configure() {
bind[UserDao].to[UserDaoImpl]
bind[UserService].to[UserServiceImpl]
}
}
Now when I am unit testing using ScalaTest, I am a bit confused about how I can decouple the database access, since I want to mock the database responses.
My spec looks like:
class UserServiceSpec extends UnitSpec with MockitoSugar {
val userService = injector.getInstance(classOf[UserService])
describe("UserServiceSpec") {
it("should do someting") {
val abc = userService.doSomething();
abc.name should be("abc")
}
}
}
My UnitSpec class wires up the Guice injector.
I am confused: where should I create the mock objects (using Mockito), and how should I wire them using Guice? In the ServiceModule, or somewhere else?
My design seems wrong, since my BaseService has a connection to the database; I need to refactor that out somehow.
I'm looking for a way to get out of this bad design I currently seem to have. Ideas?
You can move the db connection to the DAO layer. Your application should have three layers: controller -> service -> DAO. All the service layer needs to know is the functionality provided by the DAO, i.e. CRUD operations; it should not know anything about the db connection, since that is the DAO's responsibility.
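Once the session handling lives in the DAO, the service no longer needs BaseService or a real database, and the unit test doesn't need Guice either: you can construct the service directly with a Mockito mock. A rough sketch of the refactored shape, assuming the refactored UserDao exposes getById(userId: Int): Option[User] and manages its own session:
// after the refactor the service only delegates; the DAO owns the db session
class UserServiceImpl @Inject()(val userDao: UserDao) extends UserService {
  def getById(userId: Int): Option[User] = userDao.getById(userId)
}

class UserServiceSpec extends UnitSpec with MockitoSugar {
  describe("UserServiceImpl") {
    it("returns the user found by the dao") {
      val userDao = mock[UserDao]
      val user = mock[User]                           // or build a real User fixture
      when(userDao.getById(1)).thenReturn(Some(user))

      val userService = new UserServiceImpl(userDao)  // no injector involved
      userService.getById(1) should be(Some(user))
    }
  }
}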
I am not really sure about the Slick framework, but Play with Guice allows you to disable the "real" bindings (injected dependencies) that you expect when the application is running and enable bindings that are only for testing, like this:
implicit override lazy val app = new GuiceApplicationBuilder()
.configure(appConfig)
.disable(classOf[ReactiveMongoModule], classOf[CommonModule])
.bindings(bind(classOf[ReactiveMongoApi]).toInstance(api))
.bindings(TestDaoModule())
.build()
This is a complete integration test (controller layer), hope that it helps: https://github.com/luongbalinh/play-mongo/blob/master/test/controllers/UserControllerTest.scala
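If you still want some tests to exercise the Guice wiring in a Play application, GuiceApplicationBuilder can also override individual bindings with Mockito mocks. A sketch using the names from your question:
import org.mockito.Mockito.mock
import play.api.inject.bind
import play.api.inject.guice.GuiceApplicationBuilder

val mockUserDao = mock(classOf[UserDao])
// stub mockUserDao with Mockito's when(...) as needed

val app = new GuiceApplicationBuilder()
  .overrides(bind[UserDao].toInstance(mockUserDao))  // replaces the ServiceModule binding
  .build()

// the real UserServiceImpl is resolved by Guice, but talks to the mocked dao
val userService = app.injector.instanceOf[UserService]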

Cake pattern: how to get all objects of type UserService provided by components

This question may help you understand my needs.
Cake pattern: one component per implementation, or one component per trait?
I have a Scala application using multiple UserService implementations which will be provided by component(s?).
I wonder if there is a way, in another component, to "scan" the application so that I can retrieve the set of all components providing an object which implements the trait UserService.
So that I can iterate over all the UserService interfaces provided by my cake built application?
I guess I could have a component which builds a list of UserService instances according to its dependencies, but is it possible to have this component build the list without any hardcoded dependencies?
You can simply keep a list of UserService instances right inside UserServiceComponent, and have the base UserService trait register itself in this list.
trait UserServiceComponent {
private val _userServices = collection.mutable.Buffer[UserService]()
def userServices: Seq[UserService] = _userServices.synchronized {
_userServices.toList // defensive copy
}
private def registerUserService( service: UserService ) = _userServices.synchronized {
_userServices += service
}
trait UserService {
registerUserService( this )
def getPublicProfile(id: String): Either[Error, User]
}
val mainUserService: UserService
}
trait DefaultUserServiceComponent extends UserServiceComponent { self: UserRepositoryComponent =>
protected class DefaultUserService extends UserService {
// NOTE: no need to register the service, this is handled by the base class
def getPublicProfile(id: String): Either[Error, User] = userRepository.getPublicProfile(id)
}
val mainUserService: UserService = new DefaultUserService
}
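A consuming component can then iterate over whatever the assembled cake registered, without hardcoding any concrete implementation. A minimal sketch assuming the registry above:
trait UserServiceAuditComponent { self: UserServiceComponent =>
  class UserServiceAudit {
    // sees every UserService instance registered by the assembled cake
    def publicProfiles(id: String): Seq[Either[Error, User]] =
      userServices.map(_.getPublicProfile(id))
  }
}

// when assembling the cake, mix the audit component in alongside the
// UserService-providing components; every UserService created inside the
// cake registers itself, so there is no hardcoded list of implementations:
// object App extends DefaultUserServiceComponent
//   with OtherUserServiceComponent
//   with UserServiceAuditComponent
//   with SomeUserRepositoryComponent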