I am fairly new to Scala and, as the title says, I am trying to mock a class.
DateServiceTest.scala
@RunWith(classOf[JUnitRunner])
class DateServiceTest extends FunSuite with MockitoSugar {
  val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
  val sc = new SparkContext(conf)
  implicit val sqlc = new SQLContext(sc)

  val m = mock[ConfigManager]
  when(m.getParameter("dates.traitement")).thenReturn("10")

  test("mocking test") {
    val instance = new DateService
    val date = instance.loadDates
    assert(date === new DateTime())
  }
}
DateService.scala
class DateService extends Serializable with Logging {
  private val configManager = new ConfigManager
  private lazy val datesTraitement = configManager.getParameter("dates.traitement").toInt

  def loadDates() {
    val date = selectFromDatabase(datesTraitement)
  }
}
Unfortunately, when I run the test, datesTraitement returns null instead of 10, even though m.getParameter("dates.traitement") does return 10.
Maybe I am using some kind of anti-pattern somewhere, but I don't know where. Please keep in mind that I am new to all of this and I didn't find any proper example specific to my case on the internet.
Thanks for any help.
I think the issue is that your mock is not injected: you create the ConfigManager inline in the DateService class.
Instead of
class DateService extends Serializable with Logging {
  private val configManager = new ConfigManager
}
try
class DateService(private val configManager: ConfigManager) extends Serializable with Logging
and in your test case inject the mocked ConfigManager when you construct DateService
class DateServiceTest extends FunSuite with MockitoSugar {
  val m = mock[ConfigManager]
  val instance = new DateService(m)
}
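For completeness, here is a minimal sketch of how the two files could fit together with constructor injection. The default argument is my assumption, so that production code can keep calling new DateService without arguments; ConfigManager, selectFromDatabase and the mock setup are taken as-is from the question.
class DateService(private val configManager: ConfigManager = new ConfigManager)
  extends Serializable with Logging {
  // the parameter is now read from whatever ConfigManager was passed in,
  // so a Mockito mock injected by the test is picked up here
  private lazy val datesTraitement = configManager.getParameter("dates.traitement").toInt

  def loadDates(): Unit = {
    val date = selectFromDatabase(datesTraitement)
  }
}
class DateServiceTest extends FunSuite with MockitoSugar {
  val m = mock[ConfigManager]
  when(m.getParameter("dates.traitement")).thenReturn("10")

  test("mocking test") {
    // the mock is passed to the constructor, so datesTraitement resolves to 10
    val instance = new DateService(m)
    instance.loadDates()
  }
}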
My service runs fine until I hit an API endpoint, at which point the request fails with this error:
Cannot load play.application.loader[play.application.loader [class java.lang.Class}] does not implement interface play.api.ApplicationLoader or interface play.ApplicationLoader.]
My service loader:
class LagomPersistentEntityLoader extends LagomApplicationLoader {
  override def load(context: LagomApplicationContext): LagomApplication =
    new LagomPersistentEntityApplication(context) with AkkaDiscoveryComponents

  override def loadDevMode(context: LagomApplicationContext): LagomApplication =
    new LagomPersistentEntityApplication(context) with LagomDevModeComponents

  override def describeService: Option[Descriptor] = Some(readDescriptor[LagomTestingEntity])
}
trait UserComponents
  extends LagomServerComponents
    with SlickPersistenceComponents
    with HikariCPComponents
    with AhcWSComponents {

  override lazy val jsonSerializerRegistry: JsonSerializerRegistry = UserSerializerRegistry
  lazy val userRepo: UserRepository = wire[UserRepository]
  readSide.register(wire[UserEventsProcessor])

  clusterSharding.init(
    Entity(UserState.typeKey) { entityContext =>
      UserBehaviour(entityContext)
    }
  )
}
abstract class LagomPersistentEntityApplication(context: LagomApplicationContext)
  extends LagomApplication(context)
    with UserComponents {

  implicit lazy val actorSystemImpl: ActorSystem = actorSystem
  implicit lazy val ec: ExecutionContext = executionContext

  override lazy val lagomServer: LagomServer = serverFor[LagomTestingEntity](wire[LagomTestingEntityImpl])

  lazy val bodyParserDefault: Default = wire[Default]
}
application.conf:
play.application.loader = org.organization.service.LagomTestingEntityImpl
lagom-persistent-entity.cassandra.keyspace = lagom-persistent-entity
cassandra-journal.keyspace = ${lagom-persistent-entity.cassandra.keyspace}
cassandra-snapshot-store.keyspace = ${lagom-persistent-entity.cassandra.keyspace}
lagom.persistent.read-side.cassandra.keyspace = ${lagom-persistent-entity.cassandra.keyspace}
This service has both read-side and write-side support. If anyone needs more info on the code, please ask, because I really want to understand where the application is failing.
You should set play.application.loader to the name of your loader class, rather than your persistent entity class:
play.application.loader = org.organization.service.LagomPersistentEntityLoader
I'm assuming that LagomPersistentEntityLoader is in the same package as LagomTestingEntityImpl. If not, then adjust the fully-qualified class name as needed.
I've created a class containing a function that processes a Spark DataFrame.
class IsbnEncoder(df: DataFrame) extends Serializable {
  def explodeIsbn(): DataFrame = {
    val name = df.first().get(0).toString
    val year = df.first().get(1).toString
    val isbn = df.first().get(2).toString
    val isbn_ean = "ISBN-EAN: " + isbn.substring(6, 9)
    val isbn_group = "ISBN-GROUP: " + isbn.substring(10, 12)
    val isbn_publisher = "ISBN-PUBLISHER: " + isbn.substring(12, 16)
    val isbn_title = "ISBN-TITLE: " + isbn.substring(16, 19)
    val data = Seq((name, year, isbn_ean),
      (name, year, isbn_group),
      (name, year, isbn_publisher),
      (name, year, isbn_title))
    df.union(spark.createDataFrame(data))
  }
}
The problem is that I don't know how to create a DataFrame within the class without creating a new Spark session via SparkSession.builder().appName("isbnencoder").master("local").getOrCreate(). The session is defined in another class, in a separate file, which uses this class (the one I've included). Obviously, my code fails to compile because the compiler doesn't know what spark is.
You can create a trait that extends Serializable and defines the Spark session as a lazy val. Then, throughout your project, every object you create can extend that trait, and it will give you the SparkSession instance.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.DataFrame

trait SparkSessionWrapper extends Serializable {
  lazy val spark: SparkSession = {
    SparkSession.builder().appName("TestApp").getOrCreate()
  }
}

// object with the main method; it extends SparkSessionWrapper
object App extends SparkSessionWrapper {
  def main(args: Array[String]): Unit = {
    val readdf = ReadFileProcessor.ReadFile("testpath")
    readdf.createOrReplaceTempView("TestTable")
    val viewdf = spark.sql("Select * from TestTable")
  }
}

object ReadFileProcessor extends SparkSessionWrapper {
  def ReadFile(path: String): DataFrame = {
    val df = spark.read.format("csv").load(path)
    df
  }
}
Since both of the objects you created extend SparkSessionWrapper, the Spark session is initialized the first time the spark variable is touched, and from then on you can refer to it from any object that extends the trait without passing it as a method parameter. It gives you an experience similar to working in a notebook.
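Applied to the IsbnEncoder class from the question, a sketch could look like the following (same columns and substring logic as in the question; only the trait mix-in is new, and the repeated row-building lines are elided):
class IsbnEncoder(df: DataFrame) extends SparkSessionWrapper {
  def explodeIsbn(): DataFrame = {
    val name = df.first().get(0).toString
    val year = df.first().get(1).toString
    val isbn = df.first().get(2).toString
    val isbn_ean = "ISBN-EAN: " + isbn.substring(6, 9)
    // ... build isbn_group, isbn_publisher and isbn_title as in the question ...
    val data = Seq((name, year, isbn_ean))
    // spark comes from SparkSessionWrapper, so no session has to be passed in
    df.union(spark.createDataFrame(data))
  }
}
Note that SparkSessionWrapper already extends Serializable, so the class stays serializable.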
I need some education on classes and objects with respect to serialization.
Say I have a Spark main job which maps a DataFrame to another DataFrame:
def main(args: Array[String]): Unit = {
  val ss = SparkSession.builder
    .appName("test")
    .getOrCreate()

  val mydf = ss.read("myfile")

  // if calling from an object
  val newdf = mydf.map(x => Myobj.myfunc(x))

  // if calling from a class
  val myclass = new Myclass()
  val newdf = mydf.map(x => myclass.myfunc(x))
}

object Myobj {
  def myfunc(x: Int): Int = {
    x + 1
  }
}

class Myclass {
  def myfunc(x: Int): Int = {
    x + 1
  }
}
My questions are:
Which should I define myfunc in, an object or a class? What is the difference in terms of performance?
Should I extend Serializable for the object or the class? Why?
I want to print/log some messages from the object/class; what should I do?
Thanks
I am using Play 2.6 and Slick 3.2 with a MySQL database, but I am not able to write test cases using an H2 database. The problem is that, with Guice, I am not able to inject DatabaseConfigProvider into my repo. Following is my repo class implementation:
class CompanyRepoImpl @Inject() (dbConfigProvider: DatabaseConfigProvider)
                                (implicit ec: ExecutionContext) extends CompanyRepo {

  val logger: Logger = LoggerFactory.getLogger(this.getClass())

  private val dbConfig = dbConfigProvider.get[JdbcProfile]

  import dbConfig._
  import profile.api._

  private val company = TableQuery[Companies]

  ------
}
My test case class:
class CompanyRepoSpec extends AsyncWordSpec with Matchers with BeforeAndAfterAll {

  private lazy val injector: Injector = new GuiceApplicationBuilder()
    .overrides(bind(classOf[CompanyRepo]).to(classOf[CompanyRepoImpl]))
    .in(Mode.Test)
    .injector()

  private lazy val repo: CompanyRepo = injector.instanceOf[CompanyRepo]
  private lazy val dbApi = injector.instanceOf[DBApi]

  override protected def beforeAll(): Unit = {
    Evolutions.applyEvolutions(database = dbApi.database("test"))
  }

  override protected def afterAll(): Unit = {
    Evolutions.cleanupEvolutions(database = dbApi.database("test"))
  }

  -----------------------
}
Test configuration, application.test.conf:
play.evolutions.db.test.enabled=true
play.evolutions.autoApply=true
slick.dbs.test.profile="slick.jdbc.H2Profile$"
slick.dbs.test.db.driver="org.h2.Driver"
slick.dbs.test.db.url="jdbc:h2:mem:test;MODE=MySQL"
Maybe my configuration is not valid for testing, so how can we write test cases using Scala, Slick and Play?
I just found a bug in a class serialization in Spark.
=> Now I want to write a unit test for it, but I don't see how.
Notes:
the failure happens in a (de)serialized object that has been broadcast.
I want to test exactly what Spark will do, to assert that it will work once deployed.
the class to serialize is a standard class (not a case class) which extends Serializer.
Looking into the Spark broadcast code, I found a way. It uses private Spark code, so it may become invalid if Spark changes internally, but it works.
Add a test class in a package starting with org.apache.spark, such as:
package org.apache.spark.my_company_tests

// [imports]

/**
 * Tests that data that needs to be broadcast in Spark (using Kryo) survives (de)serialization.
 */
class BroadcastSerializationTests extends FlatSpec with Matchers {

  "MyClass" should "serialize a transient val, which should be lazy" in {
    val data = new MyClass(42) // data to test
    val conf = new SparkConf()

    // Serialization
    // code found in TorrentBroadcast.(un)blockifyObject, which is used by TorrentBroadcastFactory
    val blockSize = 4 * 1024 * 1024 // 4 MB
    val out = new ChunkedByteBufferOutputStream(blockSize, ByteBuffer.allocate)
    val ser = new KryoSerializer(conf).newInstance() // Here I test using KryoSerializer; you can use JavaSerializer too
    val serOut = ser.serializeStream(out)
    Utils.tryWithSafeFinally { serOut.writeObject(data) } { serOut.close() }

    // Deserialization
    val blocks = out.toChunkedByteBuffer.getChunks()
    val in = new SequenceInputStream(blocks.iterator.map(new ByteBufferInputStream(_)).asJavaEnumeration)
    val serIn = ser.deserializeStream(in)
    val data2 = Utils.tryWithSafeFinally { serIn.readObject[MyClass]() } { serIn.close() }

    // run the test on data2
    data2.yo shouldBe data.yo
  }
}
class MyClass(i: Int) extends Serializable {
  @transient val yo = 1 to i // add lazy to make the test pass: non-lazy transient vals are not recomputed after deserialization
}