Using Specs2 in a Typesafe Activator Play application - scala

I have used specs2 successfully many times in vanilla SBT projects. Now I am starting to learn the Typesafe Activator platform.
I did the following steps
activator new Shop just-play-scala
This is my build.sbt file:
name := """Shop"""
version := "1.0-SNAPSHOT"
// Read here for optional jars and dependencies
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "3.6.1" % "test")
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
scalacOptions in Test ++= Seq("-Yrangepos")
lazy val root = project.in(file(".")).enablePlugins(PlayScala)
I created a file Shop/app/test/models/ShopSpec.scala
import org.specs2.mutable.Specification

class ShopSpec extends Specification {

  def foo = s2"""
    | This is a specification to check the 'Hello world' string
    | The 'Hello world' string should
    | contain 11 characters $e1
    | start with 'Hello' $e2
    | end with 'world' $e3
    | """.stripMargin

  def e1 = "Hello world" must haveSize(11)
  def e2 = "Hello world" must startWith("Hello")
  def e3 = "Hello world" must endWith("world")
}
When I run activator test I get an error
[success] Total time: 0 s, completed Jun 24, 2015 12:21:32 AM
Mohitas-MBP:Shop abhi$ activator test
[info] Loading project definition from /Users/abhi/ScalaProjects/Shop/project
[info] Set current project to Shop (in build file:/Users/abhi/ScalaProjects/Shop/)
cannot create a JUnit XML printer. Please check that specs2-junit.jar is on the classpath
org.specs2.reporter.JUnitXmlPrinter$
java.net.URLClassLoader.findClass(URLClassLoader.java:381)
java.lang.ClassLoader.loadClass(ClassLoader.java:424)
sun.misc.Launcher$AppClassLoader.loadClass(Launcher.jav
I have previously written specs2 test cases successfully in plain SBT projects; it is only when I use Typesafe Activator that I get this issue with test cases.
I even changed the code of my test to something as simple as
import org.specs2.mutable.Specification

class ShopSpec extends Specification {
  "A shop" should {
    "create item" in {
      failure
    }
  }
}
But still the same problem.

Wait... I think I resolved it.
The Activator Play platform already includes specs2, so there is no need for me to tweak the build.sbt file for specs2.
So I removed everything I had added to build.sbt file and left the file as
name := """Shop"""
version := "1.0-SNAPSHOT"
lazy val root = project.in(file(".")).enablePlugins(PlayScala)
Now it works fine. So basically, I don't need to add anything to an Activator Play project for specs2.
I could have deleted the question... but I am leaving it here so that it can be of help to someone.

What worked for me was adding the following to build.sbt:
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "3.6.2" % "test",
"org.specs2" %% "specs2-junit" % "3.6.2" % "test")

Related

Pre-test SBT task: Unable to instantiate JDBC driver

I'm having trouble getting an SBT task to run migrations with Flyway; I get an exception when I run the task. Any ideas how I could fix it?
org.flywaydb.core.api.FlywayException: Unable to instantiate JDBC driver: org.postgresql.Driver => Check whether the jar file is present
The following code works when I run it in BeforeAll in my tests (ScalaTest), but it does not work when I move it into an SBT task.
val flyway = Flyway
  .configure()
  .locations("filesystem:./**/resources/db/migrations/")
  .dataSource("jdbc:postgresql://localhost:5432/my_database", "my_user", "secret")
  .load()
flyway.clean()
flyway.migrate()
My /build.sbt file looks like this:
import org.flywaydb.core.Flyway

lazy val migrate = taskKey[Unit]("Migrate database")

lazy val migrateTask = Def.task {
  println("Migrate")
  val flyway = Flyway
    .configure()
    .locations("filesystem:./**/resources/db/migrations/")
    .dataSource("jdbc:postgresql://localhost:5432/my_database", "my_user", "secret")
    .load()
  flyway.clean()
  flyway.migrate()
}

val IntegrationTest = config("integration") extend Test

lazy val integrationTestSettings = inConfig(IntegrationTest)(Defaults.testSettings) ++ List(
  IntegrationTest / fork := false,
  IntegrationTest / parallelExecution := false,
  IntegrationTest / sourceDirectory := baseDirectory.value / "src/test/integration",
  IntegrationTest / test := {
    (IntegrationTest / test) dependsOn migrateTask
  }.value
)

lazy val root = Project(id = "hello", base = file("."))
  .configs(Configs.all: _*)
  .settings(
    integrationTestSettings,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.1.4",
  )
And my /project/build.sbt looks like this:
libraryDependencies ++= List(
  "org.flywaydb" % "flyway-core" % "7.6.0",
  "org.postgresql" % "postgresql" % "42.2.19",
)
The versions I'm using are:
SBT: 1.4.5
Scala: 2.13.4
Flyway: 7.6.0
Does anyone have any ideas why I'm getting that error, and how I can fix it?
Any help would be greatly appreciated. Thanks :)
Searching the Flyway repo, the error message comes from here - https://github.com/flyway/flyway/blob/9033185ab8bfa56b0dae9136c04763cdccc50081/flyway-core/src/main/java/org/flywaydb/core/internal/jdbc/DriverDataSource.java#L165-L182 - where it tries to load the database driver from a ClassLoader. Such ClassLoader techniques sometimes clash with the layered ClassLoader that sbt sets up to run itself. That's my speculation on what's happening.
How do we work around this?
You said that running it as part of the tests worked, so maybe you could create a subproject for this purpose?
ThisBuild / scalaVersion := "2.13.4"

lazy val migrate = taskKey[Unit]("Migrate database")

lazy val root = (project in file("."))
  .settings(
    name := "hello",
    migrate := (migrateProj / run).toTask("").value
  )

// utility project to run database migration
lazy val migrateProj = (project in file("migrate"))
  .settings(
    libraryDependencies ++= List(
      "org.flywaydb" % "flyway-core" % "7.6.0",
      "org.postgresql" % "postgresql" % "42.2.19",
    ),
    Compile / run / fork := true,
    publish / skip := true,
  )
migrate/Migrate.scala
object Migrate extends App {
  println("migrate")
  // rest of the code here...
}
Now you can run
sbt:flyway> migrate
[info] running (fork) Migrate
[info] migrate
[success] Total time: 4 s, completed Mar 6, 2021 9:03:07 PM
Details about layered ClassLoader
ClassLoader techniques sometimes clash with the layered ClassLoader that sbt sets up to run itself. sbt-the-Bash-script allows users to choose the sbt version using project/build.properties, and the Scala version using build.sbt. Both of these make the sbt build declarative and repeatable, which is generally a good thing. But how can an sbt launcher written in Scala 2.10 launch sbt 1.4.x written in Scala 2.12, which then launches your Scala 2.13 application? Each of these boundary crossings is done by creating a layered ClassLoader, like the movie Inception.
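If you want to see this layering for yourself, here is a small sketch you could drop into build.sbt (the task name showLoaders is made up); it walks up the chain of ClassLoaders visible from inside an sbt task:
lazy val showLoaders = taskKey[Unit]("Print the ClassLoader chain visible inside an sbt task")

showLoaders := {
  // start from the ClassLoader that loaded the build definition and walk up its parents
  var cl: ClassLoader = getClass.getClassLoader
  while (cl != null) {
    println(cl)
    cl = cl.getParent
  }
}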

Is it necessary to add my custom Scala library's dependencies in a new Scala project?

I am new to Scala and I am trying to develop a small project which uses a custom library. I have created a MySQL connection pool inside the library. Here's my build.sbt for the library:
organization := "com.learn"
name := "liblearn-scala"
version := "0.1"
scalaVersion := "2.12.6"
libraryDependencies += "mysql" % "mysql-connector-java" % "6.0.6"
libraryDependencies += "org.apache.tomcat" % "tomcat-dbcp" % "8.5.0"
I have published it to the local Ivy repo using sbt publishLocal.
Now I have a project which makes use of the above library, with the following build.sbt:
name := "SBT1"
version := "0.1"
scalaVersion := "2.12.6"
libraryDependencies += "com.learn" % "liblearn-scala_2.12" % "0.1"
I am able to compile the new project but when I run it I get
java.lang.ClassNotFoundException: org.apache.tomcat.dbcp.dbcp2.BasicDataSource
But if I add
libraryDependencies += "mysql" % "mysql-connector-java" % "6.0.6"
libraryDependencies += "org.apache.tomcat" % "tomcat-dbcp" % "8.5.0"
in the project's build.sbt it works without any issues.
Is this the normal way of doing things with Scala and sbt? I.e., do I also have to list the custom library's dependencies inside the project that uses it?
Here is my library code (I have just 1 file)
package com.learn.scala.db

import java.sql.Connection
import org.apache.tomcat.dbcp.dbcp2._

object MyMySQL {
  private val dbUrl = s"jdbc:mysql://localhost:3306/school?autoReconnect=true"

  private val connectionPool = new BasicDataSource()
  connectionPool.setUsername("root")
  connectionPool.setPassword("xyz")
  connectionPool.setDriverClassName("com.mysql.cj.jdbc.Driver")
  connectionPool.setUrl(dbUrl)
  connectionPool.setInitialSize(3)

  def getConnection: Connection = connectionPool.getConnection
}
This is my project code:
try {
  val conn = MyMySQL.getConnection
  val ps = conn.prepareStatement("select * from school")
  val rs = ps.executeQuery()
  while (rs.next()) {
    print(rs.getString("name"))
    print(rs.getString("rank"))
    println("----------------------------------")
  }
  rs.close()
  ps.close()
  conn.close()
} catch {
  case ex: Exception => {
    println(ex.printStackTrace())
  }
}
By default, SBT fetches all project dependencies transitively. This means it should only be necessary to declare liblearn-scala explicitly, and not also its transitive dependencies mysql-connector-java and tomcat-dbcp. Transitivity can be disabled and transitive dependencies can be excluded, but unless this has been done explicitly it should not be the cause of the problem.
Without seeing your whole build.sbt, I believe you are doing the right thing. If sbt clean publishLocal is not solving the problem, you could try the nuclear option and clear the whole Ivy cache (note this will force all projects to re-fetch their dependencies).
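For reference, these are the kinds of settings that would cut off transitive dependencies; the coordinates below just reuse the ones from the question, and you would only add something like this on purpose:
// fetch liblearn-scala but none of its dependencies
libraryDependencies += ("com.learn" % "liblearn-scala_2.12" % "0.1").intransitive()

// or fetch it but exclude one specific transitive dependency
libraryDependencies += ("com.learn" % "liblearn-scala_2.12" % "0.1")
  .exclude("org.apache.tomcat", "tomcat-dbcp")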

Specs2 test within Play gives me "could not find implicit value for evidence parameter of type org.specs2.main.CommandLineAsResult"

I'm trying to write a test case for a simple REST API in Play2/Scala that sends/receives JSON. My test looks like the following:
import org.junit.runner.RunWith
import org.specs2.matcher.JsonMatchers
import org.specs2.mutable._
import org.specs2.runner.JUnitRunner
import play.api.libs.json.{Json, JsArray, JsValue}
import play.api.test.Helpers._
import play.api.test._
import play.test.WithApplication
/**
* Add your spec here.
* You can mock out a whole application including requests, plugins etc.
* For more information, consult the wiki.
*/
@RunWith(classOf[JUnitRunner])
class APIv1Spec extends Specification with JsonMatchers {
val registrationJson = Json.parse("""{"device":"576b9cdc-d3c3-4a3d-9689-8cd2a3e84442", |
"firstName":"", "lastName":"Johnny", "email":"justjohnny#test.com", |
"pass":"myPassword", "acceptTermsOfService":true}
""")
def dropJsonElement(json: JsValue, element: String) = (json \ element).get match {
  case JsArray(items) => util.dropAt(items, 1)
}

def invalidRegistrationData(remove: String) = {
  dropJsonElement(registrationJson, remove)
}

"API" should {
  "Return Error on missing first name" in new WithApplication {
    val result = route(
      FakeRequest(
        POST,
        "/api/v1/security/register",
        FakeHeaders(Seq(("Content-Type", "application/json"))),
        invalidRegistrationData("firstName").toString()
      )
    ).get
    status(result) must equalTo(BAD_REQUEST)
    contentType(result) must beSome("application/json")
  }
...
However when I attempt to run sbt test, I get the following error:
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384M; support was removed in 8.0
[info] Loading project definition from /home/cassius/brentspace/esalestracker/project
[info] Set current project to eSalesTracker (in build file:/home/cassius/brentspace/esalestracker/)
[info] Compiling 3 Scala sources to /home/cassius/brentspace/esalestracker/target/scala-2.11/test-classes...
[error] /home/cassius/brentspace/esalestracker/test/APIv1Spec.scala:34: could not find implicit value for evidence parameter of type org.specs2.main.CommandLineAsResult[play.test.WithApplication{val result: scala.concurrent.Future[play.api.mvc.Result]}]
[error] "Return Error on missing first name" in new WithApplication {
[error] ^
[error] one error found
[error] (test:compileIncremental) Compilation failed
[error] Total time: 3 s, completed 18/01/2016 9:30:42 PM
I have similar tests in other applications, but it looks like the new version of specs2 adds a lot of support for Futures and other things that invalidate previous tutorials. I'm on Scala 2.11.6, Activator 1.3.6, and my build.sbt looks like the following:
name := """eSalesTracker"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  jdbc,
  cache,
  ws,
  "com.typesafe.slick" %% "slick" % "3.1.0",
  "org.postgresql" % "postgresql" % "9.4-1206-jdbc42",
  "org.slf4j" % "slf4j-api" % "1.7.13",
  "ch.qos.logback" % "logback-classic" % "1.1.3",
  "ch.qos.logback" % "logback-core" % "1.1.3",
  evolutions,
  specs2 % Test,
  "org.specs2" %% "specs2-matcher-extra" % "3.7" % Test
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
resolvers += Resolver.url("Typesafe Ivy releases", url("https://repo.typesafe.com/typesafe/ivy-releases"))(Resolver.ivyStylePatterns)
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
I think you are using the wrong WithApplication import.
Use this one:
import play.api.test.WithApplication
The last line of the test case should be the assertion/evaluation statement.
For example, just before the last } of the failing test case, put the statement false must beEqualTo(true) and run it again.
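Putting the two suggestions together, here is a minimal sketch of the fixed test based on the snippet in the question (only the import changes; the block already ends in a matcher expression):
import play.api.test.WithApplication // the Scala API, not play.test.WithApplication

"API" should {
  "Return Error on missing first name" in new WithApplication {
    val result = route(
      FakeRequest(
        POST,
        "/api/v1/security/register",
        FakeHeaders(Seq(("Content-Type", "application/json"))),
        invalidRegistrationData("firstName").toString()
      )
    ).get
    // the last expression is the result the example evaluates to
    status(result) must equalTo(BAD_REQUEST)
    contentType(result) must beSome("application/json")
  }
}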

Adding module dependency information in sbt's build.sbt file

I have a multi-module project in IntelliJ; as shown in the screen capture, the contextProcessor module depends on the contextSummary module.
IntelliJ takes care of everything once I setup the dependencies in Project Structure.
However, when I run sbt test with the following setup in build.sbt, I get an error complaining that it can't find the packages in the contextSummary module.
name := "contextProcessor"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
How do I teach sbt where to find the missing modules?
I could use the build.sbt file in the main root directory.
lazy val root = (project in file(".")).aggregate(contextSummary, contextProcessor)
lazy val contextSummary = project
lazy val contextProcessor = project.dependsOn(contextSummary)
Reference: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html
For testing only one project, I can use the project command in sbt:
> sbt
[info] Set current project to root (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> project contextProcessor
[info] Set current project to contextProcessor (in build file:/Users/smcho/Desktop/code/ContextSharingSimulation/)
> test
For batch mode as in How to pass command line args to program in SBT 0.13.1?
sbt "project contextProcessor" test
I think a simple build.sbt might not be enough for that.
You would need to create a more sophisticated project/Build.scala like this:
import sbt._
import sbt.Keys._

object Build extends Build {

  lazy val root = Project(
    id = "root",
    base = file("."),
    aggregate = Seq(module1, module2)
  )

  lazy val module1 = Project(
    id = "module1",
    base = file("module1-folder"),
    settings = Seq(
      name := "Module 1",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )

  lazy val module2 = Project(
    id = "module2",
    base = file("module2-folder"),
    dependencies = Seq(module1),
    settings = Seq(
      name := "Module 2",
      version := "1.0",
      scalaVersion := "2.11.7",
      libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test"
    )
  )
}

SBT/Play2 multi-project setup does not include dependant projects in classpath in run/test

I have the following SBT/Play2 multi-project setup:
import sbt._
import Keys._
import PlayProject._

object ApplicationBuild extends Build {

  val appName = "traveltime-api"
  val appVersion = "1.0"

  val appDependencies = Seq(
    // Google geocoding library
    "com.google.code.geocoder-java" % "geocoder-java" % "0.9",
    // Emailer
    "org.apache.commons" % "commons-email" % "1.2",
    // CSV generator
    "net.sf.opencsv" % "opencsv" % "2.0",
    "org.scalatest" %% "scalatest" % "1.7.2" % "test",
    "org.scalacheck" %% "scalacheck" % "1.10.0" % "test",
    "org.mockito" % "mockito-core" % "1.9.0" % "test"
  )

  val lib = RootProject(file("../lib"))
  val chiShape = RootProject(file("../chi-shape"))

  lazy val main = PlayProject(
    appName, appVersion, appDependencies, mainLang = SCALA
  ).settings(
    // Add your own project settings here
    resolvers ++= Seq(
      "Sonatype Snapshots" at
        "http://oss.sonatype.org/content/repositories/snapshots",
      "Sonatype Releases" at
        "http://oss.sonatype.org/content/repositories/releases"
    ),
    // Scalatest compatibility
    testOptions in Test := Nil
  ).aggregate(lib, chiShape).dependsOn(lib, chiShape)
}
As you can see, this project depends on two independent subprojects: lib and chiShape.
Now compile works fine - all sources are compiled correctly. However, when I run run or test, neither task has the subprojects' classes on its runtime classpath, and things go haywire with ClassNotFoundException errors.
For example, my application has to load serialized data from a file, and it goes like this: the test starts a FakeApplication, it tries to load the data, and boom:
[info] CsvGeneratorsTest:
[info] #markerFilterCsv
[info] - should fail on bad json *** FAILED ***
[info] java.lang.ClassNotFoundException: com.library.Node
[info] at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
[info] at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
[info] at java.security.AccessController.doPrivileged(Native Method)
[info] at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
[info] at java.lang.Class.forName0(Native Method)
[info] at java.lang.Class.forName(Class.java:264)
[info] at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:622)
[info] at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1593)
[info] ...
Strangely enough, stage creates a directory structure with chi-shapes_2.9.1-1.0.jar and lib_2.9.1-1.0.jar in staged/.
How can I get my runtime/test configurations to include the subprojects on the classpath?
Update:
I've added the following code to Global#onStart:
override def onStart(app: Application) {
  println(app)
  ClassLoader.getSystemClassLoader.asInstanceOf[URLClassLoader].getURLs.
    foreach(println)
  throw new RuntimeException("foo!")
}
When I launch the tests, the classpath is very sparsely populated, to say the least :)
FakeApplication(.,sbt.classpath.ClasspathUtilities$$anon$1#182253a,List(),List(),Map(application.load-data -> test, mailer.smtp.test-mode -> true))
file:/home/arturas/Software/sdks/play-2.0.3/framework/sbt/sbt-launch.jar
[info] CsvGeneratorsTest:
When launching the staged app, there's a lot of stuff, as it's supposed to be :)
$ target/start
Play server process ID is 29045
play.api.Application#1c2862b
file:/home/arturas/work/traveltime-api/api/target/staged/jul-to-slf4j.jar
That's strange, because I would expect at least the testing jars to be on the classpath?
It seems I've solved it.
The culprit was that ObjectInputStream ignores thread-local class loaders by default and only uses the system class loader.
So I changed this:
def unserialize[T](file: File): T = {
  val in = new ObjectInputStream(new FileInputStream(file))
  try {
    in.readObject().asInstanceOf[T]
  } finally {
    in.close()
  }
}
To:
/**
 * Object input stream which respects the thread-local class loader.
 *
 * The TL class loader is used by SBT to avoid polluting the system class loader when
 * running different tasks.
 */
class TLObjectInputStream(in: InputStream) extends ObjectInputStream(in) {
  override protected def resolveClass(desc: ObjectStreamClass): Class[_] = {
    Option(Thread.currentThread().getContextClassLoader).map { cl =>
      try { return cl.loadClass(desc.getName) }
      catch { case (e: java.lang.ClassNotFoundException) => () }
    }
    super.resolveClass(desc)
  }
}

def unserialize[T](file: File): T = {
  val in = new TLObjectInputStream(new FileInputStream(file))
  try {
    in.readObject().asInstanceOf[T]
  } finally {
    in.close()
  }
}
And my class-not-found problems went away!
Thanks to How to put custom ClassLoader to use? and http://tech-tauk.blogspot.com/2010/05/thread-context-classlaoder-in.html for useful insight into deserialization and thread-local class loaders.
This sounds similar to this bug: https://play.lighthouseapp.com/projects/82401/tickets/659-play-dist-broken-with-sub-projects, though that bug is about dist and not test. I think the fix has not made it into the latest stable release, so try building Play from source (and don't forget to use aggregate and dependsOn as demonstrated in that link).
Alternatively, as a workaround, inside sbt you can navigate to the sub-project with project lib and then type test. It's a bit manual, but you can script that if you'd like.
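For example, a small sketch of scripting it with a command alias in the main project's build (this assumes a reasonably recent sbt and that the sub-project ids really are lib, chi-shape and traveltime-api; adjust them to the actual ids):
// run the sub-projects' tests and then the main project's tests in one go
addCommandAlias(
  "testAll",
  ";project lib ;test ;project chi-shape ;test ;project traveltime-api ;test"
)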