Unit testing Scala using mockito - scala

I'm very much new to doing unit testing in Scala using Mockito. I'm getting an error in the thenReturn statement.
it should "read null when readFromPostgresTarget is called with some
random driver" in {
Given("a null query is sent as query")
val query = ""
val pgObject = mock[PersistenceObject]
val postgresPersistenceObject =
mock[PostgressPersistenceServiceTrait]
val mockDF = mock[DataFrame]
When("it is passed to readFromPostgresTarget")
when(postgresPersistenceObject.readFromPostgresTarget(any[String],mock[Spark
Session], pgObject)).thenReturn(mockDF)
assert(postgresPersistenceObject.readFromPostgresTarget(query,
sparkSession, pgObject) === any[DataFrame])
Then("a null value should be returned")
verify(postgresPersistenceObject, times(1))
}
I'm getting the error:
overloaded method value thenReturn with alternatives:
(x$1: Unit,x$2: Unit*)org.mockito.stubbing.OngoingStubbing[Unit] <and>
(x$1: Unit)org.mockito.stubbing.OngoingStubbing[Unit]
cannot be applied to (org.apache.spark.sql.DataFrame)
.thenReturn(mockDF)
I tried changing the mockDF in thenReturn(mockDF) to thenReturn(any[DataFrame]), but that doesn't fix the issue.
I tried passing a real SparkSession instead of the mock, and that doesn't work either.
I can't figure out what mistake I'm making.

To avoid those problems (related to Scala/Java interop) you should use mockito-scala.
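For reference, here is a minimal sketch (not the asker's exact code) of how the same stubbing could look with mockito-scala's idiomatic syntax. PersistenceObject, PostgressPersistenceServiceTrait and readFromPostgresTarget come from the question; the method is assumed here to return a DataFrame:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.mockito.ArgumentMatchersSugar
import org.mockito.scalatest.IdiomaticMockito
import org.scalatest.flatspec.AnyFlatSpec

class PostgresPersistenceSpec extends AnyFlatSpec with IdiomaticMockito with ArgumentMatchersSugar {

  "readFromPostgresTarget" should "return the stubbed DataFrame" in {
    val pgObject = mock[PersistenceObject]
    val sparkSession = mock[SparkSession]
    val service = mock[PostgressPersistenceServiceTrait]
    val mockDF = mock[DataFrame]

    // Use matchers for every argument; mixing matchers with real values in one call
    // is a common source of Mockito stubbing errors.
    service.readFromPostgresTarget(any[String], any[SparkSession], any[PersistenceObject]) returns mockDF

    val result = service.readFromPostgresTarget("", sparkSession, pgObject)

    assert(result eq mockDF)
    service.readFromPostgresTarget(any[String], any[SparkSession], any[PersistenceObject]) wasCalled once
  }
}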

Related

Scala Mockito: Discarded non-unit value for Unit declaration

Can someone please explain what this means?
Error:(32, 28) discarded non-Unit value
dataFrameReader.load() wasCalled once
I've looked at some online articles and I don't quite understand it.
This is my code snippet from a ScalaTest with Scala Mockito
...
val dataFrameReader = mock[DataFrameReader]
dataFrameReader.format(anyString) shouldReturn dataFrameReader
dataFrameReader.option(anyString, anyString) shouldReturn dataFrameReader
dataFrameReader.load() wasCalled once
If I take out the wasCalled once then it works fine.
I don't understand what this means, though, as I am invoking wasCalled on what load() returns, and wasCalled once resolves to a Unit.
What am I missing here?
Assuming you are mocking DataFrameReader.load from Apache Spark, then its return type is actually DataFrame and not Unit:
def load(): DataFrame
On the other hand, return type of wasCalled is indeed Unit:
def wasCalled(t: Times)(implicit order: VerifyOrder): Unit
Thus we have a situation similar to
def f(): Unit = {
g() // g returns DataFrame which gets discarded by f
}
def g(): DataFrame
which gets flagged by the compiler if scalacOptions += "-Ywarn-value-discard" is set.
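For illustration, a self-contained repro of that warning (String stands in for DataFrame so the snippet compiles without Spark):

// compile with: scalac -Ywarn-value-discard ValueDiscard.scala
object ValueDiscard {
  def g(): String = "some result"

  def f(): Unit = {
    g() // warning: discarded non-Unit value
  }
}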
The issue has been resolved since mockito-scala 1.2.2.

Spark-Cassandra-connector issue: value write is not a member of Unit in BOresultDf.write [duplicate]

The following code works fine until I add show after agg. Why is show not possible?
val tempTableB = tableB.groupBy("idB")
.agg(first("numB").as("numB")) //when I add a .show here, it doesn't work
tableA.join(tempTableB, $"idA" === $"idB", "inner")
.drop("idA", "numA").show
The error says:
error: overloaded method value join with alternatives:
(right: org.apache.spark.sql.Dataset[_],joinExprs: org.apache.spark.sql.Column,joinType: String)org.apache.spark.sql.DataFrame <and>
(right: org.apache.spark.sql.Dataset[_],usingColumns: Seq[String],joinType: String)org.apache.spark.sql.DataFrame
cannot be applied to (Unit, org.apache.spark.sql.Column, String)
tableA.join(tempTableB, $"idA" === $"idB", "inner")
^
Why is this behaving this way?
.show() is a function with what we call in Scala a side effect: it prints to stdout and returns the Unit value (), just like println.
Example:
val a = Array(1,2,3).foreach(println)
a: Unit = ()
In Scala you can assume that everything is an expression and returns something. In your case, the Unit value () is being returned, and that's what's getting stored in tempTableB.
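Concretely, that is why the compiler reports that join cannot be applied to (Unit, ...): with .show at the end of the chain, tempTableB itself ends up with type Unit. A sketch of what the spark-shell would report:

val tempTableB = tableB.groupBy("idB")
  .agg(first("numB").as("numB"))
  .show() // show() prints the rows and returns Unit
// tempTableB: Unit = ()
// The later join then receives Unit where a Dataset/DataFrame is expected.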
As @philantrovert has already given a detailed explanation, I shall not repeat it.
If you want to see what's in tempTableB, you can do so after it has been assigned, as below.
val tempTableB = tableB.groupBy("idB")
.agg(first("numB").as("numB"))
tempTableB.show
tableA.join(tempTableB, $"idA" === $"idB", "inner")
.drop("idA", "numA").show
It should work then.

Using scalamock: Could not find implicit value for evidence parameter of type error

I am writing unit tests for my Spark/Scala application. I am using ScalaMock as well to mock objects, specifically Session / SessionFactory.
In one of my test classes, I try to mock the Session. Ex:
val mockedSession = mock[Session]
However, I get this error:
could not find implicit value for evidence parameter of type
org.scalamock.util.Defaultable[org.hibernate.SimpleNaturalIdLoadAccess]
I am getting similar errors no matter which object I mock. The format looks correct.
From the section "Advanced Topics / Raw Types" in the documentation:
"mocking a java method with raw type" should "work" in {
implicit val d = new Defaultable[java.util.Enumeration[_]] {
override val default = null
}
implicit val d2 = new Defaultable[java.util.Map[_, _]] {
override val default = null
}
val mockedRaw = mock[RawTypeInterface]
}
In my case simply importing the offending type resolved the error.
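A minimal sketch of that fix, assuming a Hibernate Session is being mocked and the type named in the error is org.hibernate.SimpleNaturalIdLoadAccess (adjust the import to whatever type your own error message reports):

import org.hibernate.Session
import org.hibernate.SimpleNaturalIdLoadAccess // the type named in the implicit-not-found error
import org.scalamock.scalatest.MockFactory
import org.scalatest.flatspec.AnyFlatSpec

class SessionSpec extends AnyFlatSpec with MockFactory {
  "mocking a Hibernate Session" should "compile once the offending type is in scope" in {
    val mockedSession = mock[Session]
    // ... stub expectations and exercise the code under test as usual
  }
}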

issue with Blueprints API, addVertex

I've just got an issue with the Blueprints API.
I've executed the following Scala commands:
val dbBasename = "C:\\Users\\taatoal1\\tmp\\orientdb\\databases\\"
val dbpath = "test_ingest"
val (uname, pwd) = ("admin", "admin")
val graph = new OrientGraph(s"plocal:$dbpath", uname, pwd)
graph.addVertex("class:Employee")
and I got the following error
<console>:14: error: ambiguous reference to overloaded definition,
both method addVertex in class OrientBaseGraph of type (x$1: Any, x$2: <repeated...>[Object])com.tinkerpop.blueprints.impls.orient.OrientVertex
and method addVertex in class OrientBaseGraph of type (x$1: Any)com.tinkerpop.blueprints.impls.orient.OrientVertex
match argument types (String)
graph.addVertex("class:Employee")
^
Do you have any idea what I did wrong?
Thanks in advance
In the end I found out that there is another version of addVertex that takes two strings as parameters: the class name and the cluster name.
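A sketch of that call, assuming the two-String (class name, cluster name) overload described above; the cluster name here is illustrative, and the exact argument forms are worth checking against the OrientDB Javadoc:

// addVertex(className, clusterName): the two-String overload mentioned above
val employee = graph.addVertex("Employee", "Employee_cluster")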

are there any tricks to working with overloaded methods in specs2?

I've been getting beat up attempting to match on an overloaded method.
I'm new to Scala and specs2, so that is likely one factor ;)
So I have a mock of this SchedulerDriver class
and I'm trying to verify the content of the arguments that are being
passed to this overload of the launchTasks method:
http://mesos.apache.org/api/latest/java/org/apache/mesos/SchedulerDriver.html#launchTasks(java.util.Collection,%20java.util.Collection)
I have tried the answers style like so:
val mockSchedulerDriver = mock[SchedulerDriver]
mockSchedulerDriver.launchTasks(haveInterface[Collection[OfferID]], haveInterface[Collection[TaskInfo]]) answers { i => System.out.println(s"i=$i") }
and get
ambiguous reference to overloaded definition, both method launchTasks in trait SchedulerDriver of type (x$1: org.apache.mesos.Protos.OfferID, x$2: java.util.Collection[org.apache.mesos.Protos.TaskInfo])org.apache.mesos.Protos.Status and method launchTasks in trait SchedulerDriver of type (x$1: java.util.Collection[org.apache.mesos.Protos.OfferID], x$2: java.util.Collection[org.apache.mesos.Protos.TaskInfo])org.apache.mesos.Protos.Status match argument types (org.specs2.matcher.Matcher[Any],org.specs2.matcher.Matcher[Any])
and I have tried the capture style like so:
val mockSchedulerDriver = mock[SchedulerDriver]
val offerIdCollectionCaptor = capture[Collection[OfferID]]
val taskInfoCollectionCaptor = capture[Collection[TaskInfo]]
there was one(mockSchedulerDriver).launchTasks(offerIdCollectionCaptor, taskInfoCollectionCaptor)
and get:
overloaded method value launchTasks with alternatives: (x$1: org.apache.mesos.Protos.OfferID,x$2: java.util.Collection[org.apache.mesos.Protos.TaskInfo])org.apache.mesos.Protos.Status <and> (x$1: java.util.Collection[org.apache.mesos.Protos.OfferID],x$2: java.util.Collection[org.apache.mesos.Protos.TaskInfo])org.apache.mesos.Protos.Status cannot be applied to (org.specs2.mock.mockito.ArgumentCapture[java.util.Collection[mesosphere.mesos.protos.OfferID]], org.specs2.mock.mockito.ArgumentCapture[java.util.Collection[org.apache.mesos.Protos.TaskInfo]])
Any guidance or suggestions on how to approach this appreciated!
best,
tony.
You can use the any matcher in that case:
val mockSchedulerDriver = mock[SchedulerDriver]
mockSchedulerDriver.launchTasks(
  any[Collection[OfferID]],
  any[Collection[TaskInfo]]) answers { i => System.out.println(s"i=$i") }
The difference is that any[T] is a Matcher[T] and the overloading resolution works in that case (whereas haveInterface is a Matcher[AnyRef] so it can't direct the overloading resolution).
I don't understand why the first alternative didn't work, but the second alternative isn't working because Scala doesn't consider implicit conversions when resolving which overloaded method to call, and the magic that lets you use a capture as though it were the thing you captured depends on an implicit conversion.
So what if you make it explicit?
val mockSchedulerDriver = mock[SchedulerDriver]
val offerIdCollectionCaptor = capture[Collection[OfferID]]
val taskInfoCollectionCaptor = capture[Collection[TaskInfo]]
there was one(mockSchedulerDriver).launchTasks(
offerIdCollectionCaptor.capture, taskInfoCollectionCaptor.capture)