ReactiveMongo ConnectionNotInitialized In Test After Migrating to Play 2.5 - scala

After migrating my Play (Scala) app to 2.5.3, some tests of my ReactiveMongo code that used to pass now fail during setup.
Here is my code using ScalaTest:
def fixture(testMethod: (...) => Any) {
  implicit val injector = new ScaldiApplicationBuilder()
    .prependModule(new ReactiveMongoModule)
    .prependModule(new TestModule)
    .buildInj()

  def reactiveMongoApi = inject[ReactiveMongoApi]
  def collection: BSONCollection = reactiveMongoApi.db.collection[BSONCollection](testCollection)
  lazy val id = BSONObjectID.generate

  //Error occurs at next line
  Await.result(collection.insert(Person(id = id, slug = "test-slug", firstName = "Mickey", lastName = "Mouse")), 10.seconds)
  ...
}
At the insert line, I get this:
reactivemongo.core.errors.ConnectionNotInitialized: MongoError['Connection is missing metadata (like protocol version, etc.) The connection pool is probably being initialized.']
I have tried a bunch of things, like initializing collection with a lazy val instead of a def, but nothing has worked.
Any insight into how to get my tests passing again is appreciated.

With thanks to cchantep, the test runs as expected after replacing this code above:
def collection: BSONCollection = reactiveMongoApi.db.collection[BSONCollection](testCollection)
with this code
def collection: BSONCollection = Await.result(reactiveMongoApi.database.map(_.collection[BSONCollection](testCollection)), 10.seconds)
In other words, reactiveMongoApi.database (together with the changes needed to handle the Future it returns) is the way to go.
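If you want to avoid awaiting twice, a variant of the same fix is to chain the insert onto the database Future and await only the end result. A minimal sketch using the names from the question; it assumes an implicit ExecutionContext and a BSONDocumentWriter[Person] are in scope, as in the original fixture:
// Resolve the collection and run the insert inside the same Future chain,
// then await once at the end.
val insertFuture = reactiveMongoApi.database
  .map(_.collection[BSONCollection](testCollection))
  .flatMap(_.insert(Person(id = id, slug = "test-slug", firstName = "Mickey", lastName = "Mouse")))
Await.result(insertFuture, 10.seconds)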

Related

How to use Mockito to create a mock api in scala

I'm using another team's API (let's call it otherTeamAPI) to fetch data, so in my function my code looks like this:
def getData(host: String, port: Int, date: String): Map[String, String] = {
  val data = new otherTeamAPI(host, port)
  val latestData = data.getLatestData(date)
}
Could someone show me how to use Mockito to do the same thing and get data in a unit test? I'm not sure whether to use something like the following to create an API:
val otherTeamAPI = Mock[otherTeamAPI]
otherTeamAPI.getLatestData(date)
How do I get data every time I call my function getData? Do I need to create a new mock otherTeamAPI somehow?
Your code, written as it is, is not testable. You have to be able to pass your method an instance of OtherTeamAPI, so that your production code uses a real instance but your test code can use a fake one (a "mock").
How you pass this instance depends on the structure of the rest of your code: either as a parameter of this method getData or as an attribute of the class that contains it.
The first one would look like this:
def getData(api: OtherTeamApi, date: String): Map[String, String] = {
  val latestData = api.getLatestData(date)
  // ...
}
And then in your test, you can do something like:
val fakeApi = mock[OtherTeamAPI]
when(fakeApi.getLatestData(anyString())).thenReturn(...)
val result = getData(fakeApi, ...)
// Then assert on result
This is a high-level answer; you'll need to learn more about Mockito to work out the details.
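For completeness, here is a minimal, self-contained sketch of such a test. It assumes a hypothetical OtherTeamApi with a getLatestData(date: String): Map[String, String] method, plain Mockito on the classpath, and a getData that simply returns what the API gives back. With ScalaTest's MockitoSugar you would write mock[OtherTeamApi] instead of mock(classOf[OtherTeamApi]):
import org.mockito.ArgumentMatchers.anyString
import org.mockito.Mockito.{mock, when}

// Create the fake API and stub the call the production code will make.
val fakeApi = mock(classOf[OtherTeamApi])
when(fakeApi.getLatestData(anyString())).thenReturn(Map("k" -> "v"))

// Exercise the code under test with the fake, then assert on the result.
val result = getData(fakeApi, "2020-01-01")
assert(result == Map("k" -> "v"))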

Syntax error when trying to access a MySql database via Slick

I'm trying to get Slick set up to access a MySQL database, and having a bit of trouble. I've put together the following code:
case class FieldValue(fieldName: String, fieldValue: String)

class FieldValues(tag: Tag) extends Table[FieldValue](tag, "FieldValues") {
  def fieldName = column[String]("fieldName")
  def fieldValue = column[String]("fieldValueID", O.PrimaryKey)
  def * = (fieldName, fieldValue) <> (FieldValue.tupled, FieldValue.unapply)
}

object SlickTest extends App {
  val db = Database.forConfig("validation_db")

  val simplesql = sql"select fieldValueID from FieldValues".as[(String)]
  val simplequeryf = db.run(simplesql)
  val simplequeryout = Await.result(simplequeryf, 1 second)
  simplequeryout.foreach(println)

  lazy val slickquery = TableQuery[FieldValues]
  val slickqueryf = db.run(slickquery.result)
  val slickqueryout = Await.result(slickqueryf, 1 second)
  slickqueryout.foreach(println)
}
As you'll see, it runs two queries - the first uses simple SQL (and works fine), whereas the second uses the Slick method. The second, unfortunately, throws up the following error:
java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"FieldValues"' at line 1
The fact that the table name appears in double quotes (inside the single quotes) makes me wonder whether it's related to MySQL's ANSI_QUOTES option, but I'm not sure. For reference, here's my application.conf:
validation_db = {
  driver = "com.mysql.cj.jdbc.Driver",
  url = "jdbc:mysql://127.0.0.1:3306/validation?serverTimezone=UTC",
  user = "root",
  password = "password",
  connectionPool = disabled,
  useSSL = false
}
Any help greatly appreciated!
The way Slick composes SQL, and how fields are quoted or not, is controlled by importing a "profile". There's a different profile for each relational database.
You may see this error if you have imported the wrong Slick profile.
For MySQL, you should have an import in your code equivalent to:
import slick.jdbc.MySQLProfile.api._
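As a quick sanity check, you can print the SQL that Slick generates for the table query and look at how the identifiers are quoted. A minimal sketch, assuming Slick 3.x and reusing the table definition from the question (QuotingCheck is just a name for this example); with the MySQL profile imported, identifiers are quoted with backticks rather than the ANSI double quotes that trigger the syntax error above:
import slick.jdbc.MySQLProfile.api._

case class FieldValue(fieldName: String, fieldValue: String)

class FieldValues(tag: Tag) extends Table[FieldValue](tag, "FieldValues") {
  def fieldName = column[String]("fieldName")
  def fieldValue = column[String]("fieldValueID", O.PrimaryKey)
  def * = (fieldName, fieldValue) <> (FieldValue.tupled, FieldValue.unapply)
}

object QuotingCheck extends App {
  val fieldValues = TableQuery[FieldValues]
  // With MySQLProfile the generated SQL quotes identifiers with backticks,
  // e.g. select `fieldName`, `fieldValueID` from `FieldValues`
  println(fieldValues.result.statements.mkString("\n"))
}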

Scala Unit testing for ProcessAllWindowFunction

After Reading the official flink testing documentation (https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/stream/testing.html)
I was able to develop tests for a ProcessFunction, using a Test Harness, something like this:
pendingPartitionBuilder = new PendingPartitionBuilder(":::some_name", "")

testHarness =
  new OneInputStreamOperatorTestHarness[StaticAdequacyTilePublishedData, PendingPartition](
    new ProcessOperator[StaticAdequacyTilePublishedData, PendingPartition](pendingPartitionBuilder)
  )

testHarness.open()
Now I'm trying to do the same for a ProcessAllWindowFunction that looks like this:
class MapVersionValidationDistributor(batchSize: Int) extends
    ProcessAllWindowFunction[MapVersionValidation, Seq[StaticAdequacyTilePublishedData], TimeWindow] {

  lazy val state: ValueState[Long] = getRuntimeContext
    .getState(new ValueStateDescriptor[Long]("latestMapVersion", classOf[Long]))

  (...)
First, I realized I can't use a test harness for a ProcessAllWindowFunction, because it doesn't have a processElement method. In that case, what unit-testing strategy should I follow?
EDIT: At the moment my test code looks like this:
val collector = mock[Collector[Seq[StaticAdequacyTilePublishedData]]]
val mvv = new MapVersionValidationDistributor(1)
val input3 = Iterable(new MapVersionValidation("123",Seq(TileValidation(1,true,Seq(1,3,4)))))
val ctx = mock[mvv.Context]
val streamContext = mock[RuntimeContext]
mvv.setRuntimeContext(streamContext)
mvv.open(mock[Configuration])
mvv.process(ctx,input3,collector)
and I'm getting this error:
Unexpected call: <mock-3> RuntimeContext.getState[T](ValueStateDescriptor{name=latestMapVersion, defaultValue=null, serializer=null}) Expected: inAnyOrder { }
You don't really need a test harness to unit test the process method of a ProcessAllWindowFunction. The process method takes three arguments: Context, Iterable[IN], and Collector[OUT]. You can use a mocking library (depending on the language used) to mock the Context, and you can easily implement or mock the Collector, depending on your preferences. The Iterable[IN] is just an Iterable containing the elements of your window that would be passed to the function once the window is triggered.
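To address the getState error specifically, the mocked RuntimeContext needs to return a stubbed ValueState when getState is called. Below is a minimal sketch using Mockito (rather than the ScalaMock-style mocks in the question) together with the types from the question; names like fakeState and emitted are mine:
import org.apache.flink.api.common.functions.RuntimeContext
import org.apache.flink.api.common.state.{ValueState, ValueStateDescriptor}
import org.apache.flink.configuration.Configuration
import org.apache.flink.util.Collector
import org.mockito.ArgumentMatchers.any
import org.mockito.Mockito.{mock, when}
import scala.collection.mutable.ListBuffer

// Stub the state handle and the runtime context so open()/process() can call getState.
val fakeState = mock(classOf[ValueState[Long]])
when(fakeState.value()).thenReturn(0L)

val fakeRuntimeContext = mock(classOf[RuntimeContext])
when(fakeRuntimeContext.getState(any[ValueStateDescriptor[Long]]())).thenReturn(fakeState)

// A hand-written Collector that simply buffers whatever the function emits.
val emitted = ListBuffer.empty[Seq[StaticAdequacyTilePublishedData]]
val collector = new Collector[Seq[StaticAdequacyTilePublishedData]] {
  override def collect(record: Seq[StaticAdequacyTilePublishedData]): Unit = emitted += record
  override def close(): Unit = ()
}

val mvv = new MapVersionValidationDistributor(1)
mvv.setRuntimeContext(fakeRuntimeContext)
mvv.open(new Configuration())

val input = Iterable(new MapVersionValidation("123", Seq(TileValidation(1, true, Seq(1, 3, 4)))))
// The window Context is an abstract inner class; mocking it directly is usually enough here.
mvv.process(mock(classOf[mvv.Context]), input, collector)
// ...then assert on `emitted`.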

Compiled query doesn't recognize 'exists' method

I am facing a lot of trouble while updating my application from Play 2.3.x to Play 2.4.11.
I started by updating play-slick from version 0.8.1 to 1.1.1, which implies updating Slick from 2.1.0 to 3.1.0.
I have a generic class which aggregates basic methods like findById.
The problem I am facing at the moment is this. I had this method, which worked fine:
def existsById(id: Long)(implicit s: Session): DBIO[Boolean] =
  tableReference.filter(_.id === id).exists.result
I decided to use compiled queries, so I did as following:
private val queryById = Compiled((id: Rep[Option[Long]]) => tableReference.filter(_.id === id))

def existsById(id: Option[Long])(implicit s: Session): DBIO[Boolean] =
  queryById(id).exists.result
and now, I am getting an error saying that
Cannot resolve symbol exists
Am I doing it wrong? or is it a bug?
After you've "lifted" a Query into a Compiled you have to use map to transform it into a different Query. For example:
val existsById = queryById.map(q => (id: Rep[Long]) => q(id).exists)
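An alternative sketch that avoids mapping the compiled value is to compile the exists query itself, reusing the tableReference and id column from the question (with DBIO-based Slick 3 the implicit Session is no longer needed):
// Compile the whole "exists by id" query up front.
private val existsByIdCompiled =
  Compiled((id: Rep[Option[Long]]) => tableReference.filter(_.id === id).exists)

def existsById(id: Option[Long]): DBIO[Boolean] =
  existsByIdCompiled(id).result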

Implementing a java interface in Scala

I have the following code for building a cache using google collections:
val cache = new MapMaker().softValues().expiration(30, TimeUnit.DAYS).makeComputingMap(
  new com.google.common.base.Function[String, Int] {
    def apply(key: String): Int = {
      1
    }
  })
And I am getting the following error message:
error: type mismatch;
found : java.lang.Object with
com.google.common.base.Function[java.lang.String,Int]{ ... }
required: com.google.common.base.Function[?, ?]
new com.google.common.base.Function[String,Int] {
^
I am wondering why the types don't match?
The actual code is:
import com.google.common.collect.MapMaker

trait DataCache[V] {
  private val cache = new MapMaker().softValues().makeComputingMap(
    new com.google.common.base.Function[String, V] {
      def apply(key: String): V = null.asInstanceOf[V]
    })

  def get(key: String): V = cache.get(key)
}
Kind regards,
Ali
PS - I am using google-collections v1
You need to supply type parameters to the final method call. You are going through the raw type interface, and Scala cannot reconstruct the type information.
val cache = new MapMaker().softValues().expiration(30, TimeUnit.DAYS).makeComputingMap[String, Int](
  new com.google.common.base.Function[String, Int] {
    def apply(key: String): Int = {
      1
    }
  })
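Applied to the DataCache trait from the question, that would look something like the sketch below (it assumes the google-collections v1 MapMaker API mentioned in the question, where makeComputingMap is the call that needs the explicit type parameters):
import com.google.common.collect.MapMaker

trait DataCache[V] {
  // Supplying the type parameters on makeComputingMap lets scalac line up the
  // Function[String, V] argument with the expected type.
  private val cache = new MapMaker().softValues().makeComputingMap[String, V](
    new com.google.common.base.Function[String, V] {
      def apply(key: String): V = null.asInstanceOf[V]
    })

  def get(key: String): V = cache.get(key)
}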
Does the following work?
new com.google.common.base.Function[_,_] {
If that doesn't work, you may wish to keep the declaration as it is right now, and then add a : com.google.common.base.Function[_, _] after it, like this:
val cache = new MapMaker().softValues().expiration(30, TimeUnit.DAYS).makeComputingMap(
  new com.google.common.base.Function[String, Int] {
    def apply(key: String): Int = {
      1
    }
  }: com.google.common.base.Function[_, _])
I have heard that some Google stuff uses raw types, which are rather hard to integrate well with Scala (and which, in fact, should be banished back to hell, where they came from, but that's just my opinion).
Also, if you could compile that with -explaintypes, we may get a better notion of what is failing.