Testing Play controller with query parameter - scala

Given a simple Play 2.6.24 controller that handles a query parameter, i.e.
class MyController @Inject()(val controllerComponents: ControllerComponents) extends BaseController {
def foo(name: Option[String] = None) = { ... }
}
how should I test that foo works properly and gets the correct query parameter, using ScalaTest, and without instantiating a FakeApplication?
The closest I've come is the following:
class MyControllerSpec extends FlatSpec with StubControllerComponentsFactory with Results with Matchers {
private val controller = new MyController(stubControllerComponents())
it should "read `name` query parameter" in {
val result = controller.foo().apply(FakeRequest("GET", "/?name=test"))
contentAsString(result) shouldBe ...
}
}
but, as you know, this won't work: the FakeRequest content gets ignored, and I must explicitly call the controller as controller.foo(Some("test")).apply(...), which looks neither good nor logical (why do I still have to provide a FakeRequest that won't be used?).
I have a feeling that, since this is done at the routing level, I will have to use a FakeApplication... but maybe I'm wrong, and somebody knows a way to achieve what I'm looking for.
Thanks!

As far as I know, you can't test the query parameters provided in the FakeRequest unless you call the router, which requires the actual app.
As you explained, controller.foo(Some("test")).apply(...) would work because query parameters are extracted into method arguments, but it doesn't actually add much value to your tests.
Instantiating the app and testing through the router lets you verify that clients can call your action the way it is meant to be called, including the path, request method, headers, etc. It also makes it easier to detect incompatible changes if you ever update how the controller renders the result.
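For reference, here is a minimal sketch of such an app-backed test, assuming a conf/routes entry like GET / controllers.MyController.foo(name: Option[String]) and the play-test/ScalaTest dependencies already used above; the final assertion depends on what foo actually renders:
import org.scalatest.{FlatSpec, Matchers}
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.test.FakeRequest
import play.api.test.Helpers._

class MyControllerRoutingSpec extends FlatSpec with Matchers {
  // build a real (but minimal) application so the router and query-string binder run
  private val app = new GuiceApplicationBuilder().build()

  "MyController.foo" should "receive the `name` query parameter when called through the router" in {
    val result = route(app, FakeRequest(GET, "/?name=test")).get
    status(result) shouldBe OK
    contentAsString(result) should include("test") // adjust to whatever foo renders
  }
}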

Related

Scala: how to avoid passing the same object instance everywhere in the code

I have a complex project which reads configurations from a DB through the object ConfigAccessor which implements two basic APIs: getConfig(name: String) and storeConfig(c: Config).
Due to how the project is currently designed, almost every component needs to use the ConfigAccessor to talk to the DB. Since this component is an object, it is easy to just import it and call its static methods.
Now I am trying to build some unit tests for the project, in which the configurations are stored in an in-memory HashMap. So, first of all, I decoupled the config accessor logic from its storage (using the cake pattern). This way I can define my own ConfigDbComponent while testing:
class ConfigAccessor {
this: ConfigDbComponent =>
...
The "problem" is that now ConfigAccessor is a class, which means I have to instantiate it at the beginning of my application and pass it everywhere to whoever needs it. The first way I can think of for passing this instance around would be through other components constructors. This would become quite verbose (adding a parameter to every constructor in the project).
What do you suggest me to do? Is there a way to use some design pattern to overcome this verbosity or some external mocking library would be more suitable for this?
Yes, the "right" way is passing it in constructors. You can reduce verbosity by providing a default argument:
class Foo(config: ConfigAccessor = ConfigAccessor) { ... }
There are some "dependency injection" frameworks, like Guice or Spring, built around this, but I won't go there, because I am not a fan.
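As a rough illustration of how little the call sites change with the default argument (InMemoryConfigAccessor is a hypothetical test double; the member signatures, and Config carrying a name, are assumptions about your ConfigAccessor):
// hypothetical test double that keeps configurations in memory instead of hitting the DB
class InMemoryConfigAccessor extends ConfigAccessor {
  private val configs = scala.collection.mutable.Map.empty[String, Config]
  override def getConfig(name: String): Option[Config] = configs.get(name)
  override def storeConfig(c: Config): Unit = configs += (c.name -> c)
}

val production = new Foo()                                    // uses the default accessor
val underTest  = new Foo(config = new InMemoryConfigAccessor) // test wiring, nothing else changes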
You could also continue utilizing the cake pattern:
trait Configuration {
def config: ConfigAccessor
}
trait Foo { self: Configuration => ... }
class FooProd extends Foo with ProConfig
class FooTest extends Foo with TestConfig
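The two config mixins could be as small as this (ProductionDbComponent and InMemoryConfigDbComponent are illustrative names for ConfigDbComponent implementations from the question's cake refactor):
trait ProConfig extends Configuration {
  // real, DB-backed accessor
  lazy val config: ConfigAccessor = new ConfigAccessor with ProductionDbComponent
}

trait TestConfig extends Configuration {
  // hypothetical in-memory storage for unit tests
  lazy val config: ConfigAccessor = new ConfigAccessor with InMemoryConfigDbComponent
}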
Alternatively, use the "static setter". It minimizes changes to the existing code, but requires mutable state, which is really frowned upon in Scala:
object Config extends ConfigAccessor {
@volatile private var accessor: ConfigAccessor = _
def configurate(cfg: ConfigAccessor) = synchronized {
val old = accessor
accessor = cfg
old
}
def getConfig(c: String) = Option(accessor).fold(
throw new IllegalStateException("Not configurated!")
)(_.getConfig(c))
}
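The only other change is a single call at startup (and one in test setup); everything else keeps calling Config.getConfig as before. InMemoryConfigAccessor is the same kind of hypothetical test double as above:
object Main extends App {
  // wire the global accessor exactly once, before any component touches Config
  Config.configurate(new ConfigAccessor with ProductionDbComponent)
  // ... start the rest of the application ...
}

// in a test suite's setup, swap in the in-memory double instead:
//   Config.configurate(new InMemoryConfigAccessor)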
You can retain a global ConfigAccessor and allow selectable accessors like this:
object ConfigAccessor {
private lazy val accessor = GetConfigAccessor()
def getConfig(name: String) = accessor.getConfig(name)
...
}
For production builds you can put logic in GetConfigAccessor to select the appropriate accessor based on some global config such as typesafe config.
For unit testing you can have a different version of GetConfigAccessor for different test builds which return the appropriate test implementation.
Making this value lazy allows you to control the order of initialisation and if necessary do some non-functional mutable stuff in the initialisation code before creating the components.
Update following comments
The production code would have an implementation of GetConfigAccessor something like this:
object GetConfigAccessor {
private val useAws = System.getProperties.getProperty("accessor.aws") == "true"
def apply(): ConfigAccessor =
if (useAws) {
new AwsConfigAccessor
} else {
new PostgresConfigAccessor
}
}
Both AwsConfigAccessor and PostgresConfigAccessor would have their own unit tests to prove that they conform to the correct behaviour. The appropriate accessor can be selected at runtime by setting the appropriate system property.
For unit testing there would be a simpler implementation of GetConfigAccessor, something like this:
def GetConfigAccessor() = new MockConfigAccessor
Unit testing is done within a unit testing framework which contains a number of libraries and mock objects that are not part of the production code. These are built separately and are not compiled into the final product. So this version of GetConfigAccessor would be part of that unit testing code and would not be part of the final product.
Having said all that, I would only use this model for reading static configuration data because that keeps the code functional. The ConfigAccessor is just a convenient way to access global constants without having them passed down in the constructor.
If you are also writing data then this is more like a real DB than a configuration. In that case I would create custom accessors for each component that give access to different parts of the DB. That way it is clear which parts of the data are updated by each component. These accessors would be passed down to the component and can then be unit tested with the appropriate mock implementation as normal.
You may need to partition your data into static config and dynamic config and handle them separately.
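A rough sketch of that shape, with all names purely illustrative:
import java.time.Instant

// narrow, component-specific view of the DB: only what this component reads and writes
case class Schedule(jobName: String, cron: String)

trait SchedulerStore {
  def getSchedule(jobName: String): Option[Schedule]
  def saveLastRun(jobName: String, at: Instant): Unit
}

// the component receives its own store, so a unit test can pass a stub or a Mockito mock
class Scheduler(store: SchedulerStore) {
  def run(jobName: String): Unit =
    store.getSchedule(jobName).foreach { _ =>
      // ... do the actual work, then record it ...
      store.saveLastRun(jobName, Instant.now())
    }
}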

How do I give global access to an object in Scala without making it a singleton or passing it to everything?

I have a Logger class that logs events in my application. While I only need one instance of the logger in this application, I want this class to be reusable, so I don't want to make it a singleton and couple it with my specific needs for this application.
I want to be able to access this Logger instance from anywhere in the application without having to create a new one every time or pass it around to every class that might need to log something. What I currently do is have an ApplicationUtils singleton that I use as the point of access for the application's Logger:
object ApplicationUtils {
lazy val log : Logger = new Logger()
}
Then I have a Loggable trait that I add to classes that need the Logger:
trait Loggable {
protected[this] lazy val log = ApplicationUtils.log
}
Is this a valid approach for what I am trying to accomplish? It feels a little hack-y. Is there a better approach I could be using? I'm pretty new to Scala.
Be careful when putting functionality in objects. That functionality is easily testable, but if you need to test clients of that code to make sure they interact with it correctly (via mocks and spies), you're stuck, because objects compile to final classes and thus cannot be mocked.
Instead, use this pattern:
trait T { /* code goes here */ }
object T extends T /* pass this to client code from main sources */
Now you can create Mockito mocks / spies for trait T in your test code, pass that in and confirm that the interactions of the code under test with the trait T code are what they should be.
If you have code that's a client of T and whose interactions with it don't require testing, you can directly reference object T.
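Applied to the logger example, that pattern might look like this; mock and verify come from org.mockito.Mockito, and the trait's members are illustrative:
import org.mockito.Mockito.{mock, verify}

trait Logger {
  def info(msg: String): Unit = println(s"INFO $msg")
}
object Logger extends Logger // production code passes this instance around

// client code depends on the trait, not on the object
class Worker(log: Logger) {
  def doWork(): Unit = log.info("work done")
}

// in a test: the trait can be mocked and the interaction verified
val logMock = mock(classOf[Logger])
new Worker(logMock).doWork()
verify(logMock).info("work done")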
To address what you're trying to do (rather than what you're asking), take a look at Typesafe's scalalogging package. It provides a Logging trait that you can use like so:
class MyClass extends Logging {
logger.debug("This is very convenient ;-)")
}
It's a macro-based wrapper for SLF4J, so something like logger.debug(...) gets compiled as if (logger.isDebugEnabled) logger.debug(...).

Scala/Play: access current url in a model

I have a simple Play application in which I need to check the url being called and use a different database accordingly. I know that it's easy to access the current url in the controller, but for this to work I need to access it in the model.
Passing the url from the controller to each call of a model method would be too big of an inconvenience. Is there any other way to solve this problem?
Play Framework 2.2.1 / Scala 2.10.3
UPDATE: This is my basic example
Controller (Application.scala):
package controllers
import play.api._
import play.api.mvc._
import models.Data
object Application extends Controller {
def index = Action {
//Call to model method - model should somehow get the URL without it being passed as a param
val smth: String = Data.getSmth.getOrElse("")
Ok(smth)
}
}
Model (Data.scala):
package models
object Data {
def getSmth: Option[String] = DB.withSession { implicit session =>
val db = ??? //this is where I need the url to decide which database to use
sql"""SELECT #$db.smth FROM smthTable""".as[String].firstOption
}
}
So, this is by design in the Play Scala API - there is no magic context; if you want data, you will have to pass it along to whatever piece of your code needs it.
You will have to take the url as a parameter of some kind, you could do it like this:
case class MyModel(someData: String, requestUrl: String)
object MyModel {
def apply(someData: String, request: RequestHeader) =
new MyModel(someData, request.uri)
}
This clearly expresses the dependency, but in your particular app you might call this from every request and want to avoid repeating that parameter. In that case you can use Scala implicits, which make the compiler look for a matching implicit instance of the required type in the current scope (you can read more about this here: http://www.scala-lang.org/old/node/114):
object MyModel {
def apply(someData: String)(implicit request: RequestHeader) =
new MyModel(someData, request.uri)
}
which could then be called from a controller action like this
def myAction = Action { implicit request =>
val model = MyModel("blablabla")
...
}
Of course it may be a bad idea to tightly couple your model to the Play Request API, so you should probably introduce your own class to represent this 'context'. You could then implicitly convert from the request to YourContext in your controllers and have the model implicitly use YourContext instead.
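For example, a minimal app-owned context might look like this (RequestContext and the conversion are illustrative; MyModel is the case class from above):
import play.api.mvc.RequestHeader

// app-owned context: the model never sees Play's request type
case class RequestContext(url: String)

object RequestContext {
  // derives an implicit RequestContext wherever an implicit request is in scope
  implicit def fromRequest(implicit request: RequestHeader): RequestContext =
    RequestContext(request.uri)
}

object MyModel {
  def apply(someData: String)(implicit ctx: RequestContext) =
    new MyModel(someData, ctx.url)
}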
If all this sounds like gibberish to you, you should probably start with actually learning Scala before trying to build a web app in Scala. There are lots of good books nowadays ('Scala for the Impatient', for example) as well as a multitude of good online resources (the Neophyte's Guide to Scala is a good one).
Good luck!

How to re-boot liftweb?

I have my Boot.scala with a boot method in it where I do my setup.
At the end, I make a call to LiftRules.statelessDispatchTable and append a new instance of my class that extends RestHelper, which has the serve block.
At some point, I get a signal and need to change this class, so I need to make another call into the statelessDispatchTable to remove the original one and add a new one.
What's a good way to do this?
Thanks!
EDIT: I AM GOING TO UPDATE THE QUESTION WITH THE ANSWER I GOT FROM DAVID POLLAK:
You can't. Once your app is started, there's no way to change LiftRules.
However, the stuff you're adding to statelessDispatchTable is a PartialFunction[Req, Box[LiftResponse]] so you can write a PartialFunction that looks like:
object RestThing1 extends RestHelper { .... }
object RestThing2 extends RestHelper {....}
object MyDynamicRestThing extends PartialFunction[Req, Box[LiftResponse]] {
def isDefinedAt(in: Req): Boolean = if (testCondition) RestThing1.isDefinedAt(in) else RestThing2.isDefinedAt(in)
def apply(in: Req): Box[LiftResponse] = if (testCondition) RestThing1.apply(in) else RestThing2.apply(in)
}
LiftRules.statelessDispatchTable.append(MyDynamicRestThing)
You could create a second-level dispatch, e.g. an object that receives the requests and then, according to some other logic, proxies them on to the real handler. Then you don't have to mess with the top-level dispatch table at all.
It would really make sense to do this if what you need is to toggle it based on a signal (e.g. it will revert back at some point), or if there is additional logic that would benefit from living in a proper abstraction.
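A sketch of such a proxy, reusing the types from the snippet above (RestThing1 and RestThing2 are the handlers already defined there; registration stays exactly as shown, and swapping just replaces the delegate):
import net.liftweb.common.Box
import net.liftweb.http.{LiftResponse, Req}

object SwappableDispatch extends PartialFunction[Req, Box[LiftResponse]] {
  // the handler behind the registered PartialFunction; replaced when the signal arrives
  @volatile private var current: PartialFunction[Req, Box[LiftResponse]] = RestThing1

  def use(handler: PartialFunction[Req, Box[LiftResponse]]): Unit = current = handler

  def isDefinedAt(in: Req): Boolean = current.isDefinedAt(in)
  def apply(in: Req): Box[LiftResponse] = current(in)
}

// Boot registers it once:
//   LiftRules.statelessDispatchTable.append(SwappableDispatch)
// later, on the signal:
//   SwappableDispatch.use(RestThing2)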

Not able to use Mockito ArgumentMatchers in Scala

I am using ScalaMock and Mockito
I have this simple code
class MyLibrary {
def doFoo(id: Long, request: Request) = {
println("came inside real implementation")
Response(id, request.name)
}
}
case class Request(name: String)
case class Response(id: Long, name: String)
I can easily mock it using this code
val lib = new MyLibrary()
val mock = spy(lib)
when(mock.doFoo(1, Request("bar"))).thenReturn(Response(10, "mock"))
val response = mock.doFoo(1, Request("bar"))
response.name should equal("mock")
But If I change my code to
val lib = new MyLibrary()
val mock = spy(lib)
when(mock.doFoo(anyLong(), any[Request])).thenReturn(Response(10, "mock"))
val response = mock.doFoo(1, Request("bar"))
response.name should equal("mock")
I see that it goes inside the real implementation and gets a null pointer exception.
I am pretty sure it goes inside the real implementation without matchers too; the difference is that it just doesn't crash in that case (any ends up passing null into the call).
When you write when(mock.doFoo(...)), the expression mock.doFoo(...) has to be evaluated first to compute the argument that is passed to when.
Doing this with mock works, because all implementations are stubbed out, but spy wraps around the actual object, so the implementations are all real too.
Spies are frowned upon in the Mockito world and are considered a code smell.
If you find yourself having to mock out some functionality of your class while keeping the rest of it, it is almost surely the case when you should just split it into two separate classes. Then you'd be able to just mock the whole "underlying" object entirely, and have no need to spy on things.
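For instance, if the stubbed behaviour were pulled out into its own collaborator (FooClient and MyService below are hypothetical extractions), a plain mock works fine with matchers, because no real code runs during stubbing:
import org.mockito.ArgumentMatchers.{any, anyLong}
import org.mockito.Mockito.{mock, when}

// hypothetical extraction of the behaviour that was being stubbed on the spy
class FooClient {
  def doFoo(id: Long, request: Request): Response = Response(id, request.name)
}

// the rest of the logic depends on the collaborator instead of implementing doFoo itself
class MyService(client: FooClient) {
  def handle(id: Long, request: Request): Response = client.doFoo(id, request)
}

// in the test: mock the whole collaborator, and matchers behave as expected
val client = mock(classOf[FooClient])
when(client.doFoo(anyLong(), any[Request])).thenReturn(Response(10, "mock"))

val service = new MyService(client)
service.handle(1, Request("bar")).name should equal("mock")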
If you are still set on using spies for some reason, doReturn would be the workaround, as the other answer suggests. You should not pass null as the vararg parameter though, as it changes the semantics of the call. Something like this should work:
doReturn(Response(10, "mock"), Array.empty:_*).when(mock).doFoo(any(), any())
But, I'll stress it once again: this is just a work around. The correct solution is to use mock instead of spy to begin with.
Try this
doReturn(Response(10, "mock"), null.asInstanceOf[Array[Object]]: _*).when(mock).doFoo(anyLong(), any[Request])