I have a problem similar to the one described in this question: How to mock a function within Scala object using Mockito?
I have an object FooInner with a method stringData, which takes an Int and returns a String:
object FooInner {
  def stringData(data: Int): String = {
    data match {
      case 1 => "one"
      case _ => "else"
    }
  }
}
Another object, FooOuter, calls FooInner to get the string data and performs some operation on it:
object FooOuter {
  def dummyCall(data: Int): String = {
    FooInner.stringData(data) + "some operation"
  }
}
My aim is to test FooOuter.dummyCall with string data that is NOT the real value returned by FooInner.stringData.
To do so, I followed the post mentioned above and created a trait:
trait TFooInner {
  def stringData(data: Int): String
}
I changed the declaration of FooInner to
object FooInner extends TFooInner { .. }
and created a test class, FooTests:
class FooTests extends FlatSpec with Matchers with MockitoSugar {
  "dummyCall" should "mocked data" in {
    val service = mock[TFooInner]
    when(service.stringData(1)).thenReturn("1") // mocked service data -> "1"
    assert(FooOuter.dummyCall(1) === "1-some operation") // TestFailedException: "[one]-some operation" did not equal "[1]-some operation"
  }
}
However, I am still not able to get the mocked service data "1".
I have the following questions:
How can I make FooOuter testable with data that is not returned from FooInner?
Is this the right functional style for Scala? It feels like FooOuter is now tightly coupled to/dependent on FooInner.
Scala: 2.11
Mockito: 1.9.5
You have to alter both the Inner and the Outer object to make them testable.
In general, you should never invoke object methods directly if you might want to stub them for testing. Such methods should be implemented and accessed via instance calls:
class FooInner {
  def stringData(data: Int): String = { ... }
}
object FooInner extends FooInner

class FooOuter(inner: FooInner) {
  def dummyCall(data: Int): String = inner.stringData(data) + "bar"
}
object FooOuter extends FooOuter(FooInner)
Now, in your test, you can do
val inner = mock[FooInner]
val testMe = new FooOuter(inner)
when(inner.stringData(any)).thenReturn("foo")
testMe.dummyCall(1) shouldBe "foobar"
verify(inner).stringData(1)
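For reference, a complete test class with the imports this snippet relies on might look like the sketch below (assuming the ScalaTest FlatSpec/MockitoSugar stack and Mockito 1.x from the question; the spec name is made up):
import org.mockito.Matchers.any
import org.mockito.Mockito.{verify, when}
import org.scalatest.mock.MockitoSugar // org.scalatest.mockito.MockitoSugar in newer ScalaTest versions
import org.scalatest.{FlatSpec, Matchers}

class FooOuterSpec extends FlatSpec with Matchers with MockitoSugar {

  "dummyCall" should "use the stubbed FooInner" in {
    val inner  = mock[FooInner]
    val testMe = new FooOuter(inner)

    // stub the collaborator instance instead of the singleton object
    when(inner.stringData(any[Int])).thenReturn("foo")

    testMe.dummyCall(1) shouldBe "foobar"
    verify(inner).stringData(1)
  }
}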
Related
I have a Scala application where pretty much every object extends a specific trait, which holds all the main functions and variables used by pretty much the entire system.
I want to add a --testing flag to my app's command-line arguments, which will change the results of some of the functions in the trait.
Put simply, I'd like the variable accepted in main to have an effect that alters something in the trait before it is extended by the objects, without passing it explicitly to all objects.
Any ideas how that can be done?
I doubt you really want to dynamically modify a trait, and I am not sure whether it is even possible for all classes inheriting that trait to be affected; I don't know enough about the compiler and bytecode.
A way to accomplish something similar would be to have your trait take a parameter and act conditionally on it:
trait Foo {
  val testing: Boolean

  def fn1(): Unit = {
    if (testing) {
      println("testing")
    } else {
      println("production")
    }
  }
}

class Bar(val testing: Boolean) extends Foo {
  def fn2(): Unit = {
    fn1()
  }
}

new Bar(true).fn2()
new Bar(false).fn2()
Your question is broad, and this is just my two cents.
Update
trait Foo {
  def fn1(): Unit = {
    if (Foo.testing) {
      println("testing")
    } else {
      println("production")
    }
  }
}

object Foo {
  var testing: Boolean = false
}

class Bar extends Foo {
  def fn2(): Unit = {
    fn1()
  }
}

object SOApp extends App {
  new Bar().fn2()
  Foo.testing = true
  new Bar().fn2()
}
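To connect this Update back to the --testing flag from the question, the switch can be derived from the command line instead of being hard-coded; a minimal sketch (the flag parsing is just an assumption):
object SOApp extends App {
  // flip the global switch before any trait logic runs
  Foo.testing = args.contains("--testing")

  new Bar().fn2() // prints "testing" when launched with --testing, "production" otherwise
}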
Consider passing the 'testing' flag to the trait's initializer block like this:
trait MyTrait {
  var testMode: Boolean = _

  def doSomething(): Unit = {
    if (testMode)
      println("In Test Mode")
    else
      println("In Standard Mode")
  }
}
// IMPORTANT: Your best bet would be to create some Config object
// that is loaded and initialized in a main method.

// Define a test-specific Config class:
case class Config(testMode: Boolean) {
  def isTestMode: Boolean = this.testMode
}

// Instantiate it in the main method:
val config = Config(true)

// Later, extend the trait:
class MyObj extends MyTrait { testMode = config.isTestMode }

// Then just invoke:
new MyObj().doSomething()
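To tie this to the --testing flag from the question, the Config could be built from the command-line arguments before anything extends the trait; a rough sketch (MyApp and the flag parsing are assumptions):
object MyApp extends App {
  // build the Config from the command line once, up front
  val config = Config(testMode = args.contains("--testing"))

  // the class from above picks the flag up at construction time
  class MyObj extends MyTrait { testMode = config.isTestMode }

  new MyObj().doSomething() // "In Test Mode" when started with --testing
}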
I've built a microservice using Scala and Play, and now I need to create a new version of the service that returns the same data as the previous version but in a different JSON format. The service currently uses implicit Writes converters to do this. My controller looks something like this, where MyJsonWrites contains the implicit definitions:
class MyController extends Controller with MyJsonWrites {
  def myAction(query: String) = Action.async {
    getData(query).map { results =>
      Ok(Json.toJson(results))
    }
  }
}

trait MyJsonWrites {
  implicit val writes1: Writes[SomeDataType]
  implicit val writes2: Writes[SomeOtherDataType]
  ...
}
Now I need a new version of myAction where the JSON is formatted differently. My first attempt was to make MyController a base class and have subclasses extend it with their own trait containing the implicit values. Something like this:
class MyNewController extends MyController with MyNewJsonWrites
This doesn't work, though, because the implicit values defined in MyNewJsonWrites are not available in the methods of the superclass.
It would be ideal if I could just create a new action on the controller that somehow used the converters defined in MyNewJsonWrites. Sure, I could change the trait to an object and import the implicit values in each method, but then I'd have to duplicate the body of myAction so that the implicits are in scope when I call Json.toJson. I don't want to pass them as implicit parameters to a base method because there are too many of them. I guess I could pass a function to the base method that does the imports and the Json.toJson call, something like this, but maybe there's a better way:
def myBaseAction(query: String, toJson: Seq[MyResultType] => JsValue) = Action.async {
  getData(query).map { results =>
    Ok(toJson(results))
  }
}

def myActionV1(query: String) = {
  def toJson(results: Seq[MyResultType]) = {
    import MyJsonWritesV1._
    Json.toJson(results)
  }
  myBaseAction(query, toJson)
}
Instead of relying on Scala implicit resolution, you can call your Writes directly:
def myBaseAction(query: String, writes: Writes[MyResultType]) = Action.async {
  getData(query).map { results =>
    val seqWrites: Writes[Seq[MyResultType]] = Writes.seq(writes)
    Ok(seqWrites.writes(results))
  }
}
def myActionV1(query: String) = myBaseAction(query, MyJsonWritesV1)
def myActionV2(query: String) = myBaseAction(query, MyJsonWritesV2)
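This assumes MyJsonWritesV1 and MyJsonWritesV2 are plain Writes[MyResultType] instances rather than traits full of implicits; a rough sketch of what one version could look like (the "id" and "name" fields are made up for illustration):
import play.api.libs.json.{JsValue, Json, Writes}

object MyJsonWritesV1 extends Writes[MyResultType] {
  // v1 wire format; the field names here are only illustrative
  def writes(result: MyResultType): JsValue = Json.obj(
    "id"   -> result.id,
    "name" -> result.name
  )
}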
Imagine I have a service:
class ServiceA(serviceB: ServiceB) {
  import Extractor._

  def send(typeA: A) = serviceB.send(typeA.extract)
}

object Extractor {
  implicit class Extractor(typeA: A) {
    def extract = ???
  }
}
I want the extract method to be implicitly defined, because it doesn't directly relate to the A type/domain and is a solution-specific ad hoc extension.
Now I would like to write a very simple unit test that confirms that serviceB.send is called.
For that, I mock ServiceB and pass a mocked A to send. Then I could just assert that serviceB.send was called with the mocked A.
As seen in the example, the send method also transforms the typeA parameter, so I would need to mock the extract method to return my specified value. However, A doesn't have an extract method; it comes from the implicit class.
So the question is: how do I mock out the implicit class in the example above, given that imports are not first-class citizens?
If you want to specify a bunch of customised extract methods, you can do something like this:
sealed trait Extractor[T] {
  // or whatever signature you want
  def extract(obj: T): String
}

object Extractor {
  implicit case object IntExtractor extends Extractor[Int] {
    def extract(obj: Int): String = s"I am an int and my value is $obj"
  }

  implicit case object StringExtractor extends Extractor[String] {
    def extract(obj: String): String = s"I am a string and my value is $obj"
  }

  def apply[A: Extractor](obj: A): String = {
    implicitly[Extractor[A]].extract(obj)
  }
}
So you basically have a sealed type family, pre-materialised through case objects, which are arguably only useful in a match. But that would let you decouple everything.
If you don't want to mix this into Extractor itself, call the instances something else and follow the same approach; you can then wire it all together with a context bound.
Then you can use this with:
println(Extractor(5))
For testing, simply override the available implicits if you need to. It's a bit of work, but not impossible: you control the scope, and you can spy on whatever method calls you want.
E.g. instead of import Extractor._, have some other object with test-only logic where you can use mocks or an alternative implementation.
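A concrete way to do that with the Mockito/ScalaTest stack from the first question: put a stub in local implicit scope, which takes precedence over the IntExtractor in Extractor's companion object. Because the trait above is sealed you cannot hand-roll a subclass in a separate test file, but a runtime mock is not blocked by that; a sketch (the spec name is made up):
import org.mockito.Mockito.when
import org.scalatest.mock.MockitoSugar
import org.scalatest.{FlatSpec, Matchers}

class ExtractorSpec extends FlatSpec with Matchers with MockitoSugar {

  "Extractor.apply" should "use whatever instance is in implicit scope" in {
    val stubbed = mock[Extractor[Int]]
    when(stubbed.extract(5)).thenReturn("stubbed")

    // a local implicit takes precedence over IntExtractor from the companion object
    implicit val testExtractor: Extractor[Int] = stubbed

    Extractor(5) shouldBe "stubbed"
  }
}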
Is it possible to add a member variable to a class from outside the class? (Or mimic this behavior?)
Here's an example of what I'm trying to do. I already use an implicit conversion to add additional functions to RDD, so I added a variable to ExtendedRDDFunctions. I'm guessing this doesn't work because the variable is lost after the conversion in a rdd.setMember(string) call.
Is there any way to get this kind of functionality? Is this the wrong approach?
implicit def toExtendedRDDFunctions(rdd: RDD[Map[String, String]]): ExtendedRDDFunctions = {
  new ExtendedRDDFunctions(rdd)
}

class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) extends Logging with Serializable {
  var member: Option[String] = None

  def getMember(): String = {
    if (member.isDefined) {
      member.get
    } else {
      ""
    }
  }

  def setMember(field: String): Unit = {
    member = Some(field)
  }

  def queryForResult(query: String): String = {
    // Uses member here
  }
}
EDIT:
I am using these functions as follows: I first call rdd.setMember("state"), then rdd.queryForResult(expression).
Because the implicit conversion is applied each time you invoke a method defined in ExtendedRDDFunctions, a new instance of ExtendedRDDFunctions is created for every call to setMember and queryForResult. Those instances do not share any member variables.
You basically have two options:
1. Maintain a Map[RDD, String] in ExtendedRDDFunctions's companion object, which setMember uses to associate the member value with an RDD. This is the evil option: you introduce global state and open the door to a whole range of errors.
2. Create a wrapper class that contains your member value and is returned by the setMember method:
case class RDDWithMember(rdd: RDD[Map[String, String]], member: String) extends RDD[Map[String, String]] {

  def queryForResult(query: String): String = {
    // Uses member here
  }

  // methods of the RDD interface, just delegate to rdd
}

implicit class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) {
  def setMember(field: String): RDDWithMember = {
    RDDWithMember(rdd, field)
  }
}
Besides avoiding the global state, this approach is also more type safe because you cannot call queryForResult on instances that do not have a member. The only downsides are that you have to delegate all members of RDD and that queryForResult is not defined on RDD itself.
The first issue can probably be addressed with some macro magic (search for "delegate" or "proxy" and "macro").
The latter issue can be resolved by defining an additional extension method in ExtendedRDDFunctions that checks whether the RDD is an RDDWithMember:
implicit class ExtendedRDDFunctions(rdd: RDD[Map[String, String]]) {
  def setMember(field: String): RDDWithMember = // ...

  def queryForResult(query: String): Option[String] = rdd match {
    case wm: RDDWithMember => Some(wm.queryForResult(query))
    case _ => None
  }
}
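With that in place, usage at the call site looks roughly like this (assuming the implicit class is in scope; the query string is a placeholder):
val withMember: RDDWithMember = rdd.setMember("state")
val result: String = withMember.queryForResult("some expression")

// on a plain RDD the extension method degrades gracefully to None
val maybeResult: Option[String] = rdd.queryForResult("some expression")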
import ExtendedRDDFunctions._
will import all attributes and functions from the companion object so they can be used in the body of your class.
For your usage, look into the delegate pattern.
I'm new to the Play framework and Scala, and I'm trying to inject a dependency into a companion object.
I have a simple case class, like:
case class Bar(foo: Int) {}
With a companion object like:
object Bar {
  val myDependency =
    if (isTest) {
      // Mock
    } else {
      // Actual implementation
    }

  val form = Form(mapping(
    "foo" -> number(0, 100).verifying(foo => myDependency.validate(foo))
  )(Bar.apply)(Bar.unapply))
}
This works fine, but it's not really a clean way to do it. I'd like to be able to inject the dependency at build time so that I can inject different mock objects when testing and different real implementations in development and production.
What's the best way to achieve this?
Any help really appreciated. Thanks!
Along the lines of the Cake pattern, we can change your example to:
trait Validator {
  def validate(foo: Int): Boolean
}

trait TestValidation {
  val validator = new Validator {
    def validate(foo: Int): Boolean = ...
  }
}

trait ImplValidation {
  val validator = new Validator {
    def validate(foo: Int): Boolean = ...
  }
}

trait BarBehavior {
  def validator: Validator

  val form = Form(mapping(...))(Bar.apply)(Bar.unapply)
}

// use this in your tests
object TestBar extends BarBehavior with TestValidation

// use this in production
object ImplBar extends BarBehavior with ImplValidation
You should additionally verify that this approach fits well within the Play Framework, too.
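As a usage sketch, production code and tests then simply refer to different objects (whether the bind succeeds here depends on what the respective validator does; the form data is made up):
// production code binds against the real validator
val prodForm = ImplBar.form.bind(Map("foo" -> "42"))

// tests bind against the test validator
val testForm = TestBar.form.bind(Map("foo" -> "42"))
testForm.hasErrors // depends on TestValidation's validate implementation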