I am trying to start writing tests for MongoDB in my Play application.
It seems that I cannot get it working, because the connection is shut down before the save is executed.
Here are the results:
[info] UserDaoMongoSpec:
[info] UserDao
[info] - application - ReactiveMongoApi starting...
[info] - application - ReactiveMongoApi successfully started with DB 'test'! Servers:
[localhost:27017]
[info] - should save users and find them by userId
[info] - application - ReactiveMongoApi stopping...
[info] - application - ReactiveMongoApi connections stopped. [Success(Closed)]
[INFO] [12/24/2015 15:36:43.961] [reactivemongo-akka.actor.default-dispatcher-4] [akka://reactivemongo/user/Monitor-3] Message [reactivemongo.core.actors.Close$] from Actor[akka://reactivemongo/deadLetters] to Actor[akka://reactivemongo/user/Monitor-3#1192398481] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
[info] ScalaTest
[info] Run completed in 3 seconds, 989 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 1, Failed 0, Errors 0, Passed 1
This is the code of the test. It is not actually testing anything yet; I am first just trying to write to the database.
package test.daos

import scala.concurrent.ExecutionContext.Implicits.global

import daos.impl.UserDaoMongo
import org.scalatest._
import play.api.test._
import play.api.test.Helpers._
import org.scalatestplus.play._

class UserDaoMongoSpec extends DaosApplicationSpecOneAppPerTest {
  "UserDao" should {
    "save users and find them by userId" in {
      val userDao = new UserDaoMongo
      val future = for {
        _ <- userDao.save(credentialsTestUser)
        maybeUser <- userDao.find(credentialsTestUser.id)
      } yield maybeUser.map(_ == credentialsTestUser)
    }
  }
}
And the DAO implementation:
package daos.impl

import java.util.UUID
import scala.concurrent.Future
import play.api.Play.current
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.api.libs.json.Json
import play.modules.reactivemongo.ReactiveMongoApi
import play.modules.reactivemongo.json._
import play.modules.reactivemongo.json.collection.JSONCollection
import com.mohiva.play.silhouette.api.LoginInfo
import models.{User, Profile}
import models.User._
import daos.api.UserDao
import play.api.Logger

class UserDaoMongo extends UserDao {

  lazy val reactiveMongoApi = current.injector.instanceOf[ReactiveMongoApi]
  val users = reactiveMongoApi.db.collection[JSONCollection]("users")

  def find(loginInfo: LoginInfo): Future[Option[User]] =
    users.find(Json.obj("profiles.loginInfo" -> loginInfo)).one[User]

  def find(userId: UUID): Future[Option[User]] = {
    Logger.debug("AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA")
    users.find(Json.obj("id" -> userId)).one[User]
  }

  def save(user: User): Future[User] =
    users.insert(user).map(_ => user)

  def confirm(loginInfo: LoginInfo): Future[User] = for {
    _ <- users.update(Json.obj(
      "profiles.loginInfo" -> loginInfo
    ), Json.obj("$set" -> Json.obj("profiles.$.confirmed" -> true)))
    user <- find(loginInfo)
  } yield user.get

  def link(user: User, profile: Profile) = for {
    _ <- users.update(Json.obj(
      "id" -> user.id
    ), Json.obj("$push" -> Json.obj("profiles" -> profile)))
    user <- find(user.id)
  } yield user.get

  def update(profile: Profile) = for {
    _ <- users.update(Json.obj(
      "profiles.loginInfo" -> profile.loginInfo
    ), Json.obj("$set" -> Json.obj("profiles.$" -> profile)))
    user <- find(profile.loginInfo)
  } yield user.get
}
What could I be doing wrong?
Thank you
As mentioned in the comments, it turns out that the mistake was that I was not waiting for the future to complete. I decided to use a ScalaTest-specific function (whenReady) to make the tests wait for the futures to complete.
Here is the code:
"UserDao" should {
"save users and find them by userId" in withUserDao { userDao =>
val future = for {
user <- userDao.save(credentialsTestUser)
maybeUser <- userDao.find(credentialsTestUser.id)
} yield {
maybeUser.map(_ == credentialsTestUser)
}
whenReady (future) { result =>
result.get must be (true)
}
}
}
And just in case, I overrode the default behaviour, because the default timeout was not enough:
implicit override val patienceConfig =
  PatienceConfig(timeout = Span(2, Seconds), interval = Span(5, Millis))
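For reference, whenReady and PatienceConfig come from ScalaTest's ScalaFutures trait. A minimal sketch of where the override lives (DaosApplicationSpecOneAppPerTest and withUserDao are my project's own helpers, shown only as placeholders):

import org.scalatest.concurrent.ScalaFutures
import org.scalatest.time.{Millis, Seconds, Span}

// Sketch: mixing in ScalaFutures is what provides whenReady and the
// overridable patienceConfig used above.
class UserDaoMongoSpec extends DaosApplicationSpecOneAppPerTest with ScalaFutures {
  implicit override val patienceConfig =
    PatienceConfig(timeout = Span(2, Seconds), interval = Span(5, Millis))

  // ... the "save users and find them by userId" test from above
}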
Thank you
I am trying to learn concurrency in Scala and am using Scala futures to generate a dataset of random strings. I want to create an application that generates a file with any number of records, and it should be scalable.
Code:
import java.io.FileWriter
import java.util.concurrent.{ExecutorService, Executors}
import scala.concurrent.{ExecutionContext, Future}
import scala.util.{Failure, Random, Success}
import scala.concurrent.duration._

object datacreator {

  implicit val ec: ExecutionContext = new ExecutionContext {
    val threadPool: ExecutorService = Executors.newFixedThreadPool(4)

    def execute(runnable: Runnable) {
      threadPool.submit(runnable)
    }

    def reportFailure(t: Throwable) {}
  }

  def getRecord: String = {
    "Random string"
  }

  def main(args: Array[String]): Unit = {
    val filename = args(0)
    val number_of_records = args(1)
    val file_Object = new FileWriter(filename, true)

    val data: Future[Iterable[String]] = Future {
      for (i <- 1 to number_of_records.toInt)
        yield getRecord
    }

    val result = data.map {
      result => result.foreach(record => file_Object.write(record))
    }

    result.onComplete {
      case Success(value) => {
        println("Success")
        file_Object.close()
      }
      case Failure(e) => e.printStackTrace()
    }
  }
}
I am facing the following issues:
When I run the program using sbt, it writes the results to the file but does not terminate; it keeps running indefinitely.
[info] Loading project definition from /Users/cw0155/PersonalProjects/datagen/project
[info] Loading settings for project datagen from build.sbt ...
[info] Set current project to datagenerator (in build file:/Users/cw0155/PersonalProjects/datagen/)
[info] running com.generator.DataGenerator xyz.csv 100
Success
| => datagen / Compile / runMain 255s
When I run the program from the jar as:
scala -cp target/scala-2.13/datagenerator_2.13-0.1.jar com.generator.DataGenerator "pqr.csv" "1000"
it waits indefinitely and does not write to the file.
Any help is much appreciated :)
Try this version:
bar.scala
import scala.concurrent.{Await, Future, ExecutionContext}
import scala.concurrent.duration._
import scala.util.{Success, Failure}
import ExecutionContext.Implicits.global
import java.io.FileWriter

object bar {
  def getRecord: String = "Random string\n"

  def main(args: Array[String]): Unit = {
    val filename = args(0)
    val number_of_records = args(1)

    val data: Future[Iterable[String]] = Future {
      for (i <- 1 to number_of_records.toInt)
        yield getRecord
    }

    val file_Object = new FileWriter(filename, true)
    val result = data.map(r => r.foreach(record => file_Object.write(record)))

    result.onComplete {
      case Success(value) =>
        println("Success")
        file_Object.close()
      case Failure(e) =>
        e.printStackTrace()
    }

    Await.result(result, 10.second)
  }
}
Your original version gave me the expected output when I ran it like so:
bash-3.2$ scala bar.scala /dev/fd/1 10
Success
Random string
Random string
Random string
Random string
Random string
Random string
Random string
Random string
Random string
Random string
However, without the Await.result, your program can exit before the future finishes.
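A side note on the non-termination under sbt: the version above uses the global execution context, whose threads are daemons, so the JVM can exit. Your custom fixed thread pool creates non-daemon threads that keep the JVM alive until the pool is shut down. A minimal, self-contained sketch of that variant, assuming you want to keep your own pool:

import java.util.concurrent.{ExecutorService, Executors}
import scala.concurrent.{ExecutionContext, Future}
import scala.util.{Failure, Success}

object PoolShutdownSketch {
  def main(args: Array[String]): Unit = {
    // Non-daemon worker threads: the JVM stays alive until shutdown() is called.
    val threadPool: ExecutorService = Executors.newFixedThreadPool(4)
    implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(threadPool)

    val result = Future { println("doing the work") }

    result.onComplete {
      case Success(_) =>
        println("Success")
        threadPool.shutdown() // releases the pool so the JVM can terminate
      case Failure(e) =>
        e.printStackTrace()
        threadPool.shutdown()
    }
  }
}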
I have written this Slick DAO and its unit test in specs2.
My code has race conditions. When I run the same tests, I get different outputs.
The race conditions exist even though in both functions I do Await.result(future, Duration.Inf).
DAO
package com.example

import slick.backend.DatabasePublisher
import slick.driver.H2Driver.api._
import slick.jdbc.meta._
import scala.concurrent._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

case class Person(id: Int, firstname: String, lastname: String)

class People(tag: Tag) extends Table[Person](tag, "PEOPLE") {
  def id = column[Int]("PERSON_ID", O.PrimaryKey)
  def firstname = column[String]("PERSON_FIRST_NAME")
  def lastname = column[String]("PERSON_LAST_NAME")
  def * = (id, firstname, lastname) <> (Person.tupled, Person.unapply _)
}

object PersonDAO {

  private def createList(numRows: Int): List[Person] = {
    def recFunc(counter: Int, result: List[Person]): List[Person] = {
      counter match {
        case x if x <= numRows => recFunc(counter + 1, Person(counter, "test" + counter, "user" + counter) :: result)
        case _ => result
      }
    }
    recFunc(1, List[Person]())
  }

  val db = Database.forConfig("test1")
  val people = TableQuery[People]

  def createAndPopulate(numRows: Int) = {
    val action1 = people.schema.create
    val action2 = people ++= Seq(createList(numRows): _*)
    val combined = db.run(action1 andThen action2)
    val future1 = combined.map { result =>
      result map { x =>
        println(s"number of rows inserted $x")
        x
      }
    }
    Await.result(future1, Duration.Inf).getOrElse(0)
  }

  def printAll() = {
    val a = people.result
    val b = db.run(a)
    val y = b map { result =>
      result map { x => x }
    }
    val z = Await.result(y, Duration.Inf)
    println(z)
    println(z.length)
    z
  }
}
Unit Test
import org.specs2.mutable._
import com.example._

class HelloSpec extends Specification {

  "This usecase " should {
    "should insert rows " in {
      val x = PersonDAO.createAndPopulate(100)
      x === 100
    }
  }

  "This usecase " should {
    "return 100 rows" in {
      val x = PersonDAO.printAll()
      val y = PersonDAO.printAll()
      y.length === 100
    }
  }
}
When I run this same code using activator test, I see two different kinds of output on different runs.
Sometimes the code gets an exception:
number of rows inserted 100
[info] HelloSpec
[info]
[info] This usecase should
[info] + should insert rows
[info]
[info] This usecase should
[info] ! return 100 rows
[error] JdbcSQLException: : Table PEOPLE not found; SQL statement:
[error] select x2."PERSON_ID", x2."PERSON_FIRST_NAME", x2."PERSON_LAST_NAME" from "PEOPLE" x2 [42S02-60] (Message.java:84)
[error] org.h2.message.Message.getSQLException(Message.java:84)
[error] org.h2.message.Message.getSQLException(Message.java:88)
[error] org.h2.message.Message.getSQLException(Message.java:66)
Sometimes the first function call returns 0 rows and the second function call returns 100 values:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
number of rows inserted 100
Vector()
0
Vector(Person(100,test100,user100), Person(99,test99,user99), Person(98,test98,user98), Person(97,test97,user97), Person(96,test96,user96), Person(95,test95,user95), Person(94,test94,user94), Person(93,test93,user93), Person(92,test92,user92), Person(91,test91,user91), Person(90,test90,user90), Person(89,test89,user89), Person(88,test88,user88), Person(87,test87,user87), Person
I don't understand why my code has these race conditions, because I block on the future in each method.
Your assumption that the two test cases should run serially, one after the other, is not right. The test cases are running in parallel. Just use sequential to verify that that's the case.
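A minimal sketch against the spec from the question; sequential is a standard specs2 argument that forces the examples to run one at a time, in declaration order:

import org.specs2.mutable._
import com.example._

class HelloSpec extends Specification {
  sequential // run the examples one after another, in declaration order

  "This usecase " should {
    "should insert rows " in {
      PersonDAO.createAndPopulate(100) === 100
    }
  }

  "This usecase " should {
    "return 100 rows" in {
      PersonDAO.printAll().length === 100
    }
  }
}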
When running the following Akka Streams FlowGraph, not all of the emitted Chars are received by all Sinks.
package sample.stream

import java.io.{FileOutputStream, PrintWriter}
import akka.actor.ActorSystem
import akka.stream.ActorFlowMaterializer
import akka.stream.scaladsl.{Broadcast, FlowGraph, Sink, Source}
import scala.concurrent.forkjoin.ThreadLocalRandom
import scala.util.{Failure, Success, Try}

object Sample {

  def main(args: Array[String]): Unit = {
    println("start")

    implicit val system = ActorSystem("Sys")
    import system.dispatcher
    implicit val materializer = ActorFlowMaterializer()

    var counter = -1
    val countSource: Source[Char, Unit] = Source(() => Iterator.continually { counter += 1; (counter + 'A').toChar }.take(11))

    var counter1 = 0
    val consoleSink1 = Sink.foreach[Char] { counter =>
      println("sink1:" + counter1 + ":" + counter)
      counter1 += 1
      Thread.sleep(100)
      //Thread.sleep(300)
    }

    var counter2 = 0
    val consoleSink2 = Sink.foreach[Char] { counter =>
      println("sink2:" + counter2 + ":" + counter)
      counter2 += 1
      Thread.sleep(200)
    }

    val materialized = FlowGraph.closed(consoleSink1, consoleSink2)((x1, x2) => x1) { implicit builder =>
      (console1, console2) =>
        import FlowGraph.Implicits._
        val broadcast = builder.add(Broadcast[Char](2))
        countSource ~> broadcast ~> console1
                       broadcast ~> console2
    }.run()

    // ensure the output file is closed and the system shutdown upon completion
    materialized.onComplete {
      case Success(_) =>
        system.shutdown()
      case Failure(e) =>
        println(s"Failure: ${e.getMessage}")
        system.shutdown()
    }

    println("waiting the remaining ones")
    //scala.concurrent.Await.ready(materialized, scala.concurrent.duration.DurationInt(100).seconds)
    //system.shutdown()
    println("end")
  }
}
After running, the following output is generated:
[info] Running sample.stream.Sample
[info] start
[info] waiting the remaining ones
[info] end
[info] sink2:0:A
[info] sink1:0:A
[info] sink1:1:B
[info] sink1:2:C
[info] sink2:1:B
[info] sink1:3:D
[info] sink2:2:C
[info] sink1:4:E
[info] sink1:5:F
[info] sink2:3:D
[info] sink1:6:G
[info] sink1:7:H
[info] sink2:4:E
[info] sink2:5:F
[info] sink1:8:I
[info] sink1:9:J
[info] sink2:6:G
[info] sink2:7:H
[info] sink1:10:K
The second sink doesn't receive the 8th, 9th, and 10th values (I, J, K), but the entire flow still ends.
What should I do to wait for both Sinks to consume all the data?
I discovered that if I change the (x1,x2)=>x1 to (x1,x2)=>x2, the program waits. The same happens when sleeping 300 ms in the first sink.
The function that you pass to the second parameter list of FlowGraph.closed determines which materialized value is returned when you run the flow. So when you pass in (x1,x2)=>x1, you get back the future that is completed when the first sink has received all of its elements, and the callback on that future shuts down the actor system without the second sink having had a chance to receive all of the elements.
Instead, you should get both futures out and shut down the system only when both futures have completed.
You can actually see how this approach is used in some of the akka-stream tests here.
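For illustration, a sketch of that change against the same pre-1.0 FlowGraph API used in the question, reusing countSource, consoleSink1, consoleSink2 and system.dispatcher from above; only the combine function and the completion handling differ:

import scala.concurrent.Future

// Keep both materialized futures instead of only the first one.
val (future1, future2) = FlowGraph.closed(consoleSink1, consoleSink2)((x1, x2) => (x1, x2)) { implicit builder =>
  (console1, console2) =>
    import FlowGraph.Implicits._
    val broadcast = builder.add(Broadcast[Char](2))
    countSource ~> broadcast ~> console1
                   broadcast ~> console2
}.run()

// Shut the system down only after both sinks have seen all elements.
Future.sequence(Seq(future1, future2)).onComplete { _ =>
  system.shutdown()
}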
Pretty new to Scala and Play; I have been assigned a task to test someone else's app, which is running fine, by the way. Please check whether my tests are right, and what the error is.
This is the employeeEntry.scala file in models:
package models

import models.database.Employee
import play.api.db.slick.Config.driver.simple._
import play.api.db.slick._
import play.api.Play.current

case class EmployeeEntry(eid: Int, ename: String, eadd: String, emob: String)

object Employee {

  val DBemp = TableQuery[Employee]

  def savedat(value: EmployeeEntry): Long = {
    DB.withSession { implicit session =>
      DBemp += EmployeeEntry(eid = value.eid, ename = value.ename, eadd = value.eadd, emob = value.emob)
    }
  }

  /*val query = for (c <- Employee) yield c.ename
  val result = DB.withSession {
    implicit session =>
      query.list // <- takes session implicitly
  }*/
  //val query = for (c <- Employee) yield c.ename

  def getPersonList: List[EmployeeEntry] = DB.withSession { implicit session => DBemp.list }

  def Update: Int = DB.withSession { implicit session =>
    (DBemp filter (_.eid === 1) map (s => (s.ename, s.eadd))) update ("test", "khair")
  }

  def delet: Int = DB.withSession { implicit session =>
    (DBemp filter (_.eid === 1)).delete
  }
}
And this is the Employee.scala file in models/database:
package models.database

import models._
import models.EmployeeEntry
import play.api.db.slick.Config.driver.simple._
import scala.slick.lifted._

class Employee(tag: Tag) extends Table[EmployeeEntry](tag, "employee") {
  //val a = "hello"
  def eid = column[Int]("eid", O.PrimaryKey)
  def ename = column[String]("name", O.DBType("VARCHAR(50)"))
  def emob = column[String]("emob", O.DBType("VARCHAR(10)"))
  def eadd = column[String]("eadd", O.DBType("VARCHAR(100)"))
  def * = (eid, ename, emob, eadd) <> (EmployeeEntry.tupled, EmployeeEntry.unapply)
}
Finally, this is the test I am running, which is failing:
import play.api.db.slick.Config.driver.simple._
import play.api.db.slick._
import play.api.Play.current
import org.scalatest.FunSpec
import org.scalatest.matchers.ShouldMatchers
import models.database.Employee
import scala.slick.lifted._
import models._
import models.EmployeeEntry
//import scala.slick.driver.H2Driver.simple._

class databasetest extends FunSpec with ShouldMatchers {

  describe("this is to check database layer") {
    it("can save a row") {
      val a = EmployeeEntry(1006, "udit", "schd", "90909090")
      Employee.savedat(a) should be (1)
    }
    it("getpersonlist") {
      Employee.getPersonList.size should be (1)
    }
  }
}
The test is failing and the error is:
java.lang.RuntimeException: There is no started application
at scala.sys.package$.error(package.scala:27)
at play.api.Play$$anonfun$current$1.apply(Play.scala:71)
at play.api.Play$$anonfun$current$1.apply(Play.scala:71)
at scala.Option.getOrElse(Option.scala:120)
at play.api.Play$.current(Play.scala:71)
at models.Employee$.getPersonList(EmployeeEntry.scala:27)
at databasetest$$anonfun$1$$anonfun$apply$mcV$sp$2.apply$mcV$sp(databasetest.scala:39)
at databasetest$$anonfun$1$$anonfun$apply$mcV$sp$2.apply(databasetest.scala:39)
at databasetest$$anonfun$1$$anonfun$apply$mcV$sp$2.apply(databasetest.scala:39)
at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
By default, Play provides the specs2 testing framework, so there is no need to add ScalaTest for unit testing. Testing the database access layer requires a connection to the database, so start a fake application. (This is not running code, just an idea of how to write the unit test.)
For more detail, take a look at the Play docs: https://www.playframework.com/documentation/2.3.x/ScalaFunctionalTestingWithSpecs2
import org.specs2.mutable.Specification
import models.database.Employee
import models._
import models.EmployeeEntry
import play.api.test.FakeApplication
import play.api.test.Helpers.running
import play.api.Play.current

class databasetest extends Specification {

  "database layer" should {

    "save a row" in {
      running(FakeApplication()) {
        val a = EmployeeEntry(1006, "udit", "schd", "90909090")
        Employee.savedat(a) must be equalTo (1)
      }
    }

    "get list" in {
      running(FakeApplication()) {
        Employee.getPersonList.size must be equalTo (1)
      }
    }
  }
}
I have created a simple iteratee to download a file using WS, as explained in this link.
Consider the following snippet:
import java.nio.ByteBuffer
import java.nio.channels.FileChannel
import org.specs2.mutable.Specification
import org.specs2.time.NoTimeConversions
import play.api.libs.Files.TemporaryFile
import play.api.libs.iteratee.{Done, Input, Cont, Iteratee}
import play.api.libs.ws.WS
import scala.concurrent.Await
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

class DummySpec extends Specification with NoTimeConversions {

  def fileChannelIteratee(channel: FileChannel): Iteratee[Array[Byte], Unit] = Cont {
    case Input.EOF =>
      println("Input.EOF")
      channel.close()
      Done(Unit, Input.EOF)
    case Input.El(bytes) =>
      println("Input.El")
      val buf = ByteBuffer.wrap(bytes)
      channel.write(buf)
      fileChannelIteratee(channel)
    case Input.Empty =>
      println("Input.Empty")
      fileChannelIteratee(channel)
  }

  "fileChannelIteratee" should {
    "work" in {
      val file = TemporaryFile.apply("test").file
      val channel = FileChannel.open(file.toPath)
      val future = WS.url("http://www.example.com").get(_ => fileChannelIteratee(channel)).map(_.run)
      Await.result(future, 10.seconds)
      file.length !== 0
    }
  }
}
Calling .map(_.run) after WS.get seems to have no effect here, as the iteratee does not seem to receive Input.EOF. This prevents me from closing the channel. This is the output I get:
Input.El
[info] DummySpec
[info]
[info] fileChannelIteratee should
[info] x work (941 ms)
[error] '0' is equal to '0' (DummySpec.scala:37)
[info]
[info]
[info] Total for specification DummySpec
[info] Finished in 948 ms
[info] 1 example, 1 failure, 0 error
What am I doing wrong?
I am using Play Framework 2.2.2.
Thanks in advance.
I was opening the FileChannel in the wrong way. According to this link, it seems to default to read mode when no open options are given.
The exception thrown from channel.write was being swallowed by the map operation, because the return type of the whole operation is Future[Future[Unit]]. The outer Future is in a successful state even if the inner one fails in this case. flatMap should be used instead.
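A minimal sketch of the two fixes against the snippet above, reusing file and fileChannelIteratee from the question (StandardOpenOption.WRITE opens the channel for writing, and flatMap flattens the Future[Future[Unit]] so a failure in the inner future surfaces):

import java.nio.file.StandardOpenOption

// Open the channel explicitly for writing instead of the default read mode.
val channel = FileChannel.open(file.toPath, StandardOpenOption.WRITE)

// flatMap instead of map, so Await.result also waits for (and reports)
// the inner future produced by running the iteratee.
val future = WS.url("http://www.example.com")
  .get(_ => fileChannelIteratee(channel))
  .flatMap(_.run)
Await.result(future, 10.seconds)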