I want to start testcontainers from a docker-compose file (a Postgres and a Kafka instance) before the Play application (with Slick) starts up, so that I can write an end-to-end test. I cannot figure out how to do this with Play.
import java.io.File
import com.dimafeng.testcontainers.DockerComposeContainer.ComposeFile
import com.dimafeng.testcontainers.{DockerComposeContainer, ForAllTestContainer}
import com.typesafe.config.ConfigFactory
import org.scalatest.{BeforeAndAfterAll, FunSpec}
import org.scalatestplus.play.guice.GuiceFakeApplicationFactory
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.{Application, Configuration, Environment, Mode}
trait TestFunSpec extends FunSpec with BeforeAndAfterAll with GuiceFakeApplicationFactory {
override def fakeApplication(): Application = new GuiceApplicationBuilder()
.in(Environment(new File("."), getClass.getClassLoader, Mode.Test))
.loadConfig(_ => Configuration(ConfigFactory.load("test.conf")))
.build
}
class TestIntegrationSpec extends TestFunSpec with ForAllTestContainer {
override val container = DockerComposeContainer(ComposeFile(Left(new File("docker-compose.yml"))))
it("should test something") {
assert(true)
}
}
Scala version 2.12.10
Testcontainer version 0.35.0
Play slick version 5.0.0
When I execute the test without "TestFunSpec", docker-compose spins up my services correctly. When I bring the Play application in via "TestFunSpec", the application tries to start up and, in doing so, tries to connect to Postgres, which does not exist yet (the testcontainers are only started afterwards).
Thanks in advance.
Update: see answer section for an elaborate answer.
After a deep dive into Play's test suite mechanism, I came up with a workable setup.
Step 1, define your test container suite with an AppProvider:
import com.dimafeng.testcontainers.{Container, ForAllTestContainer}
import org.scalatest.Suite
import org.scalatestplus.play.AppProvider
trait PlayTestContainer extends Suite with AppProvider with ForAllTestContainer {
override val container: Container
}
Step 2, create an abstract PlayTestContainerIntegrationSpec which extends the above trait:
import java.io.File
import com.typesafe.config.ConfigFactory
import org.scalatest.concurrent.{IntegrationPatience, ScalaFutures}
import org.scalatest.{BeforeAndAfterAll, BeforeAndAfterEach, TestData}
import org.scalatestplus.play.PlaySpec
import org.scalatestplus.play.guice.GuiceOneAppPerTest
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.{Application, Configuration, Environment, Mode}
abstract class PlayTestContainerIntegrationSpec
extends PlaySpec
with PlayTestContainer
with GuiceOneAppPerTest
with ScalaFutures
with IntegrationPatience
with BeforeAndAfterEach
with BeforeAndAfterAll {
override def newAppForTest(testData: TestData): Application = application()
def application(): Application =
new GuiceApplicationBuilder()
.in(Environment(new File("."), getClass.getClassLoader, Mode.Test))
.loadConfig(_ => Configuration(ConfigFactory.load("test.conf")))
.build
}
As you can see, we mix in the "PlayTestContainer" trait and override the "newAppForTest" function to construct the Play application.
Step 3, create a specific integration test, extend the abstract PlayTestContainerIntegrationSpec above, and override the container to fit your specific needs:
import java.io.File
import com.dimafeng.testcontainers.DockerComposeContainer
import com.dimafeng.testcontainers.DockerComposeContainer.ComposeFile
class TestIntegrationSpec extends PlayTestContainerIntegrationSpec {
override val container = DockerComposeContainer(ComposeFile(Left(new File("docker-compose.yml"))))
"should test something" in {
assert(true === true)
}
}
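For the application to reach the services once they are up, test.conf has to point Slick at the endpoints exposed by docker-compose. A minimal sketch, assuming the default Play-Slick configuration layout and fixed host ports in docker-compose.yml (the database name, credentials, and ports here are placeholders to adjust):

```hocon
# test.conf (sketch): point Slick at the compose-exposed Postgres.
slick.dbs.default.profile = "slick.jdbc.PostgresProfile$"
slick.dbs.default.db.driver = "org.postgresql.Driver"
slick.dbs.default.db.url = "jdbc:postgresql://localhost:5432/test"
slick.dbs.default.db.user = "test"
slick.dbs.default.db.password = "test"
```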
Hope this helps.
I am trying to upgrade a Scala/Play project to Play 2.7, Scala 2.12.11, and Specs2 4.5.1.
The project contains the following Specs2 test, whose specification structure I cannot understand (it could be that the Specs2 API has changed a lot since the test was written).
When I looked at the structure of specifications in the current API, I could not see any example of is method combined together with should.
What was it supposed to mean?
How can I rewrite such a specification in the current Specs2 API?
I also noticed that the test code used import org.specs2.mutable.Specification instead of import org.specs2.Specification which is supposed to be used when using the is method.
And it uses def is(implicit ee: ExecutionEnv), instead of def is.
Here is the old test:
package services
import org.specs2.concurrent.ExecutionEnv
import org.specs2.mutable.Specification
import org.specs2.specification.mutable.ExecutionEnvironment
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.test.WithApplication
import play.modules.reactivemongo._
import scala.concurrent.duration._
class StatisticsServiceSpec() extends Specification with ExecutionEnvironment {
def is(implicit ee: ExecutionEnv) = {
"The StatisticsService" should {
"compute and publish statistics" in new WithApplication() {
val repository = new MongoStatisticsRepository(configuredAppBuilder.injector.instanceOf[ReactiveMongoApi])
val wsTwitterService = new WSTwitterService
val service = new DefaultStatisticsService(repository, wsTwitterService)
val f = service.createUserStatistics("elmanu")
f must beEqualTo(()).await(retries = 0, timeout = 5.seconds)
}
}
def configuredAppBuilder = {
import scala.collection.JavaConversions.iterableAsScalaIterable
val env = play.api.Environment.simple(mode = play.api.Mode.Test)
val config = play.api.Configuration.load(env)
val modules = config.getStringList("play.modules.enabled").fold(
List.empty[String])(l => iterableAsScalaIterable(l).toList)
new GuiceApplicationBuilder().
configure("play.modules.enabled" -> (modules :+
"play.modules.reactivemongo.ReactiveMongoModule")).build
}
}
}
To simplify the code down to the actual Specs2 API, I think it could be reduced to something like this:
package services
import org.specs2.concurrent.ExecutionEnv
import org.specs2.mutable.Specification
import scala.concurrent.Future
import play.api.test.WithApplication
import scala.concurrent.duration._
class StatisticsServiceSpec(implicit ee: ExecutionEnv) extends Specification /* with ExecutionEnvironment */ {
def is(implicit ee: ExecutionEnv) = {
"The StatisticsService" should {
"compute and publish statistics" in new WithApplication() {
val f = Future(1)
f must beEqualTo(1).await(retries = 0, timeout = 5.seconds)
}
}
}
}
Note that I removed the ExecutionEnvironment trait, since it seems to have been removed from the library.
Now the code finally compiles, but when I try to run the test there are no errors, yet no test is actually run: the output is "Empty test suite".
The new specification should be
package services
import org.specs2.concurrent.ExecutionEnv
import org.specs2.mutable.Specification
import scala.concurrent.Future
import play.api.test.WithApplication
import scala.concurrent.duration._
class StatisticsServiceSpec(implicit ee: ExecutionEnv) extends Specification {
"The StatisticsService" should {
"compute and publish statistics" in new WithApplication() {
val f = Future(1)
f must beEqualTo(1).await(retries = 0, timeout = 5.seconds)
}
}
}
The ExecutionEnv is now supposed to be retrieved directly as a specification member (with an implicit to make it available to the await method).
is is not necessary in a "mutable" specification. is is the function in a Specification where you declare all the "Fragments" of your specification (a Fragment is a Description + an Execution). In a mutable specification this function is automatically populated from the fact that you trigger method calls directly in the body of the class when the specification is instantiated. The fragments created by should and in are collected in a mutable variable, hence the name "mutable specification".
If you define def is(implicit ee: ExecutionEnv), you are defining another, perfectly valid, method that specs2 knows nothing about, while the def is: Fragments method is never populated. That's why you end up with an empty specification.
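The collection mechanism can be illustrated with a framework-free toy (this is not specs2 code, just a sketch of the idea): calls made in the class body append to a mutable buffer while the class is being instantiated, which is exactly why a mutable specification needs no explicit is method.

```scala
import scala.collection.mutable.ListBuffer

// Toy stand-in for a mutable specification base class: subclasses register
// "fragments" simply by calling fragment(...) in their class body.
class ToyMutableSpec {
  private val fragments = ListBuffer.empty[String]
  protected def fragment(text: String): Unit = fragments += text
  def collected: List[String] = fragments.toList
}

class MySpec extends ToyMutableSpec {
  // These calls run during construction, like `should` / `in` blocks do.
  fragment("The StatisticsService")
  fragment("compute and publish statistics")
}
```

Instantiating MySpec populates the buffer as a side effect of construction; nothing needs to return the fragments explicitly.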
I am following the DB prescription provided here. However, the class DatabaseExecutionContext is nowhere to be found in the Play API, and hence I am unable to import it.
What am I missing?
See the example code here: https://github.com/playframework/play-samples/blob/2.8.x/play-scala-anorm-example/app/models/DatabaseExecutionContext.scala
import javax.inject._
import akka.actor.ActorSystem
import play.api.libs.concurrent.CustomExecutionContext
@Singleton
class DatabaseExecutionContext @Inject()(system: ActorSystem) extends CustomExecutionContext(system, "database.dispatcher")
You need to configure this context as shown by the documentation: https://www.playframework.com/documentation/2.8.x/AccessingAnSQLDatabase#Using-a-CustomExecutionContext
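The dispatcher name passed to CustomExecutionContext must exist in configuration. The Play documentation linked above shows a configuration along these lines (the pool size is illustrative):

```hocon
# application.conf: dedicated thread pool for blocking JDBC calls.
database.dispatcher {
  executor = "thread-pool-executor"
  throughput = 1
  thread-pool-executor {
    fixed-pool-size = 9
  }
}
```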
While you already got an answer, I suggest using a base trait instead, like:
import javax.inject.{Inject, Singleton}
import akka.actor.ActorSystem
import play.api.libs.concurrent.CustomExecutionContext
import scala.concurrent.ExecutionContext
trait DatabaseExecutionContext extends ExecutionContext
@Singleton
class DatabaseAkkaExecutionContext @Inject()(system: ActorSystem)
extends CustomExecutionContext(system, "database.dispatcher")
with DatabaseExecutionContext
The reason is that, without the trait, you will need to bring in Akka when testing operations that require this execution context; with the trait, you can write a simple executor for your tests, like:
implicit val globalEC: ExecutionContext = scala.concurrent.ExecutionContext.global
implicit val databaseEC: DatabaseExecutionContext = new DatabaseExecutionContext {
override def execute(runnable: Runnable): Unit = globalEC.execute(runnable)
override def reportFailure(cause: Throwable): Unit = globalEC.reportFailure(cause)
}
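As a self-contained sketch of why this helps (the Repo object and countRows method are hypothetical names, and no Play or Akka is on the classpath): production code depends only on the trait, so a test can back it with the global execution context.

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

trait DatabaseExecutionContext extends ExecutionContext

// Trivial test implementation delegating to the global execution context.
object TestContexts {
  implicit val databaseEC: DatabaseExecutionContext = new DatabaseExecutionContext {
    private val globalEC = ExecutionContext.global
    override def execute(runnable: Runnable): Unit = globalEC.execute(runnable)
    override def reportFailure(cause: Throwable): Unit = globalEC.reportFailure(cause)
  }
}

// Hypothetical repository method that requires the database context.
object Repo {
  def countRows()(implicit ec: DatabaseExecutionContext): Future[Int] = Future(42)
}
```

A test can then call `Await.result(Repo.countRows()(TestContexts.databaseEC), 5.seconds)` without starting an Akka system.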
EDIT: I have created a detailed post explaining this.
I wrote some traits to use as a base for my functional tests.
This file creates an in-memory DB (H2 + Evolutions):
BlogApiDBTest.scala
package functional.common
import play.api.db.Databases
import play.api.db.evolutions.Evolutions
trait BlogApiDBTest {
implicit val testDatabase = Databases.inMemory(
name = "blog_db",
urlOptions = Map(
"MODE" -> "MYSQL"
),
config = Map(
"logStatements" -> true
)
)
org.h2.engine.Mode.getInstance("MYSQL").convertInsertNullToZero = false
Evolutions.applyEvolutions(testDatabase)
}
Here I am overriding some injected components for testing purposes
BlogApiComponentsTest.scala
package functional.common
import common.BlogApiComponents
import org.scalatestplus.play.components.WithApplicationComponents
import play.api.{BuiltInComponents, Configuration}
trait BlogApiComponentsTest extends WithApplicationComponents with BlogApiDBTest {
override def components: BuiltInComponents = new BlogApiComponents(context) {
override lazy val configuration: Configuration = context.initialConfiguration
override lazy val blogDatabase = testDatabase
}
}
This is the base class for my functional tests
BlogApiOneServerPerTestWithComponents.scala
package functional.common
import org.scalatestplus.play.PlaySpec
import org.scalatestplus.play.components.OneServerPerTestWithComponents
trait BlogApiOneServerPerTestWithComponents extends PlaySpec with OneServerPerTestWithComponents with BlogApiComponentsTest {
}
Finally the test I am trying to execute
PostControllerSpec.scala
package functional.controllers
import functional.common.BlogApiOneServerPerTestWithComponents
import org.scalatest.concurrent.{IntegrationPatience, ScalaFutures}
import play.api.mvc.Results
import play.api.test.{FakeRequest, Helpers}
import play.api.test.Helpers.{GET, route}
class PostControllerSpec extends BlogApiOneServerPerTestWithComponents
with Results
with ScalaFutures
with IntegrationPatience {
"Server query should" should {
"provide an Application" in {
val Some(result) = route(app, FakeRequest(GET, "/posts"))
Helpers.contentAsString(result) must be("success!")
}
}
}
Then I get
blog-api/test/functional/controllers/PostControllerSpec.scala:18:31: Cannot write an instance of play.api.mvc.AnyContentAsEmpty.type to HTTP response. Try to define a Writeable[play.api.mvc.AnyContentAsEmpty.type]
Here is the code
Adding the following import should make it work:
import play.api.test.Helpers._
Looking at the signature of route
def route[T](app: Application, req: Request[T])(implicit w: Writeable[T]): Option[Future[Result]]
we see that it expects an implicit w: Writeable[T]. The import above provides one via Writeables.
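The mechanism can be mimicked with a toy model (these Writeable, Writeables, and route definitions are illustrative stand-ins, not Play's actual classes): the call only compiles when an instance for the body type is in implicit scope, which is what the Helpers import supplies in Play.

```scala
// Toy stand-ins for Play's Writeable / Writeables / route.
trait Writeable[T] { def write(body: T): Array[Byte] }

object Writeables {
  implicit val stringWriteable: Writeable[String] =
    new Writeable[String] { def write(body: String): Array[Byte] = body.getBytes("UTF-8") }
}

object Router {
  // Analogous to route[T](app, req)(implicit w: Writeable[T]).
  def route[T](body: T)(implicit w: Writeable[T]): Int = w.write(body).length
}
```

With `import Writeables._` in scope, `Router.route("hello")` compiles; without it, the compiler reports a missing implicit, much like the "Cannot write an instance ... Try to define a Writeable" error above.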
I have an application using the Play framework with Scala, and I want a service that executes a method on startup of my application. I am doing everything described in How do I perform an action on server startup in the Scala Play Framework?. My version of Play is 2.6, and I am not using GlobalSettings for this.
package bootstrap
import com.google.inject.AbstractModule
class EagerLoaderModule extends AbstractModule {
override def configure() = {
println("EagerLoaderModule.configure")
bind(classOf[InitSparkContext]).to(classOf[InitSparkContextImpl]).asEagerSingleton()
}
}
On the application.conf I included the line play.modules.enabled += "bootstrap.EagerLoaderModule". Below is my Service that I want to start Spark context.
package bootstrap
import javax.inject.{Inject, Singleton}
import play.api.inject.ApplicationLifecycle
import scala.concurrent.Future
trait InitSparkContext {
def init(): Unit
def stop(): Unit
}
@Singleton
class InitSparkContextImpl @Inject()(appLifecycle: ApplicationLifecycle) extends InitSparkContext {
override def init(): Unit = println("InitSparkContext.start")
override def stop(): Unit = println("InitSparkContext.stop")
appLifecycle.addStopHook { () =>
stop()
Future.successful(())
}
init()
}
Nothing is printed to the console; even println("EagerLoaderModule.configure") is not printed.
You can't rely on println in a Play application; you'll need to set up a Logger. So:
val log = play.api.Logger(getClass)
log.info("This happened")
Then you can use the logback file to configure your log files:
Here are some details on how to configure it:
https://www.playframework.com/documentation/2.6.x/SettingsLogger
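A minimal conf/logback.xml along the lines of that page could look like this (a sketch; the documented file configures more appenders and per-logger levels):

```xml
<!-- conf/logback.xml: log everything at INFO and above to stdout. -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date %level %logger - %message%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
```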
I've recently started working in Scala and Play Framework and just upgraded a service I've been working on to Play 2.4.3. My end goal is to create a nightly process that starts up a service method in my play application for the purpose of scheduling events via a method call, which I'm currently calling with an Actor.
I had the basic idea of this working through a Global.scala file with an override onStart, but then I saw the play documentation about moving away from the use of GlobalSettings (https://www.playframework.com/documentation/2.4.x/GlobalSettings) and have been trying to move it to an injected dependency approach.
Here's what I've pieced together so far:
Module Code:
import javax.inject._
import com.myOrganization.myPackage.Actors.ScheduleActor
import play.api.libs.concurrent.AkkaGuiceSupport
import play.libs.Akka
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import akka.actor.{ActorRef, ActorSystem}
import scala.concurrent.duration._
import play.Application
import com.google.inject.AbstractModule
@Singleton
class NightlyEvalSchedulerStartup @Inject()(system: ActorSystem, @Named("ScheduleActor") scheduleActor: ActorRef) {
Akka.system.scheduler.schedule(10.seconds, 20.seconds, scheduleActor, "ScheduleActor")
}
class ScheduleModule extends AbstractModule with AkkaGuiceSupport {
def configure() = {
bindActor[ScheduleActor]("ScheduleActor")
bind(classOf[NightlyEvalSchedulerStartup]).asEagerSingleton
}
}
Actor Class:
import akka.actor.{Actor, Props}
import com.myOrganization.myPackage.services.MySchedulingService
object ScheduleActor {
def props = Props[ScheduleActor]
class updateSchedules
}
class ScheduleActor extends Actor {
val MySchedulingService: MySchedulingService = new MySchedulingService
def receive = {
case "runScheduler" => MySchedulingService.nightlyScheduledUpdate()
}
}
Application.conf
play.modules.enabled += "com.myOrganization.myPackage.modules.ScheduleModule"
The service is calling down to a method that is primarily based on scala logic code and database interactions via Anorm.
Every time I try to start the service with activator start (or run, once an HTTP request is received), I get the following error:
Oops, cannot start the server.
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Error injecting constructor, java.lang.RuntimeException: There is no started application
I've tried running the same code with the Akka.system.scheduler... piece replaced by a simple println(), and everything worked fine: the service started up and I saw my message on the console. So I'm guessing there's some dependency I'm missing for the Akka scheduler that is causing it to blow up. Any suggestions would be great; I've been banging my head against this all day.
EDIT (Solved Code Per Request):
Module code, with some added logic to get a rough estimate of 3 a.m. the next night. This might change down the line, but it works for now:
package com.myOrganization.performanceManagement.modules
import com.myOrganization.performanceManagement.Actors.ScheduleActor
import com.myOrganization.performanceManagement.Actors.ScheduleActor.nightlySchedule
import org.joda.time.{Seconds, LocalDate, LocalTime, LocalDateTime}
import play.api.libs.concurrent.AkkaGuiceSupport
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import akka.actor.{ActorRef, ActorSystem}
import scala.concurrent.duration.{FiniteDuration, SECONDS, HOURS }
import org.joda.time._
import com.google.inject.{Inject, Singleton, AbstractModule}
import com.google.inject.name.Named
class ScheduleModule extends AbstractModule with AkkaGuiceSupport {
override def configure() = {
bindActor[ScheduleActor]("ScheduleActor")
bind(classOf[NightlyEvalSchedulerStartup]).asEagerSingleton()
}
}
@Singleton
class NightlyEvalSchedulerStartup @Inject()(system: ActorSystem, @Named("ScheduleActor") scheduleActor: ActorRef) {
//Calculate initial delay to 3am the next day.
val currentTime: DateTime = DateTime.now
val targetDateTime = currentTime.plusDays(1).withTimeAtStartOfDay()
//Account for Daylight savings to an extent, not mandatory that it starts at 3am, just after midnight.
val initialDelaySeconds = targetDateTime.getHourOfDay match {
case 0 => new Duration(currentTime, targetDateTime.plusHours(3)).getStandardSeconds
case 1 => new Duration(currentTime, targetDateTime.plusHours(2)).getStandardSeconds
}
//Schedule first actor firing to occur at calculated delay and then every 24 hours.
system.scheduler.schedule(FiniteDuration(initialDelaySeconds, SECONDS), FiniteDuration(24, HOURS), scheduleActor, nightlySchedule)
}
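The same initial-delay calculation can be sketched with java.time instead of joda-time (a hypothetical variant, not the code above; it targets 03:00 on the following day in the caller's zone):

```scala
import java.time.{LocalTime, ZonedDateTime, ZoneOffset}
import java.time.temporal.ChronoUnit

object DelayCalc {
  // Seconds from `now` until 03:00 on the following day, in now's zone.
  def secondsUntilNext3am(now: ZonedDateTime): Long = {
    val target = now.toLocalDate.plusDays(1).atTime(LocalTime.of(3, 0)).atZone(now.getZone)
    ChronoUnit.SECONDS.between(now, target)
  }
}
```

For example, starting at midnight UTC on January 1st, the delay comes out to 27 hours (97200 seconds).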
Actor:
package com.myOrganization.performanceManagement.Actors
import akka.actor.{ActorSystem, Actor}
import com.google.inject.Inject
import com.myOrganization.performanceManagement.services.PMEvalSchedulingService
object ScheduleActor {
case object nightlySchedule
}
class ScheduleActor #Inject() (actorSystem: ActorSystem) extends Actor {
val pMEvalSchedulingService: PMEvalSchedulingService = new PMEvalSchedulingService
override def receive: Receive = {
case nightlySchedule =>
println("Called the scheduler")
pMEvalSchedulingService.nightlyScheduledEvaluationsUpdate()
}
}
Well, in my case the biggest issue ended up being that, when scheduling the actor call in NightlyEvalSchedulerStartup, I was calling Akka.system..., which instantiated a new Akka system before the app existed. By removing the Akka reference, system came to refer to the injected dependency, which was ready to go. Hope this helps someone in the future!