Play+Scala testing Slick database - scala

I am trying to create a test class to my project that uses Play 2.6 and Scala 2.12. I've imported the scalatest lib:
libraryDependencies += guice
libraryDependencies += evolutions
libraryDependencies += jdbc
libraryDependencies += filters
libraryDependencies += "com.h2database" % "h2" % "1.4.194"
libraryDependencies += "com.typesafe.play" %% "anorm" % "2.5.3"
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test
libraryDependencies += "org.scala-lang" % "scala-actors" % "2.10.0-M7" % "test"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.4"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.4" % "test"
libraryDependencies += "com.typesafe.play" %% "play-slick" % "3.0.0"
libraryDependencies += "com.typesafe.play" %% "play-slick-evolutions" % "3.0.0"
libraryDependencies += "org.xerial" % "sqlite-jdbc" % "3.19.3"
My compiler says that ShouldMatchers does not exist in the ScalaTest library. Here is my test class:
class RackRepositorySpec extends PlaySpec with GuiceOneAppPerTest with Injecting {

  val database = Databases(
    driver = "org.sqlite.JDBC",
    url = "jdbc:sqlite:development.db",
    name = "default",
    config = Map(
      "username" -> "",
      "password" -> ""
    )
  )

  val guice = new GuiceInjectorBuilder()
    .overrides(bind[Database].toInstance(database))
    .injector()

  val defaultDbProvider = guice.instanceOf[DatabaseConfigProvider]

  def beforeAll() = Evolutions.applyEvolutions(database)

  def afterAll() = {
    // Evolutions.cleanupEvolutions(database)
    database.shutdown()
  }

  Evolution(
    1,
    "create table test (id bigint not null, name varchar(255));",
    "drop table test;"
  )
}
I am on the latest version of ScalaTest, but it seems this class no longer exists. I am following this example: https://dzone.com/articles/getting-started-play-21-scala
Does anyone have another example of building a Scala test for Slick with an in-memory database?
[info] RackRepositorySpec:
[info] models.RackRepositorySpec *** ABORTED ***
[info] com.google.inject.ConfigurationException: Guice configuration errors:
[info]
[info] 1) No implementation for play.api.db.slick.DatabaseConfigProvider was bound.
[info] while locating play.api.db.slick.DatabaseConfigProvider
[info]
[info] 1 error
[info] at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1045)
[info] at com.google.inject.internal.InjectorImpl.getProvider(InjectorImpl.java:1004)
[info] at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1054)
[info] at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:409)
[info] at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:404)
[info] at play.api.inject.ContextClassLoaderInjector.$anonfun$instanceOf$2(Injector.scala:117)
[info] at play.api.inject.ContextClassLoaderInjector.withContext(Injector.scala:126)
[info] at play.api.inject.ContextClassLoaderInjector.instanceOf(Injector.scala:117)
[info] at models.RackRepositorySpec.<init>(RackRepositorySpec.scala:26)
[info] at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
Kind Regards,
Felipe

I used Slick in one of my Scala projects with Postgres as the database. For unit-testing purposes, I created an in-memory H2 test database that is populated during the unit tests and torn down after the tests are done.
You can have a look at the sample here:
https://github.com/joesan/plant-simulator/blob/master/test/com/inland24/plantsim/services/database/PowerPlantDBServiceSpec.scala
What I also wanted was some sort of automation, wherein I populate the H2 data files and keep them for the next cycles of unit testing.
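For the original question's Play 2.6 setup, a minimal sketch of such an in-memory H2 test could look like the following. This assumes play-slick 3.x configuration keys and scalatestplus-play 3.x; the class name comes from the question, but the test body is illustrative. Rather than hand-binding a `Database`, the whole Play application is built with a test configuration, so `DatabaseConfigProvider` gets bound by Play itself:

```scala
import org.scalatestplus.play.PlaySpec
import org.scalatestplus.play.guice.GuiceOneAppPerSuite
import play.api.Application
import play.api.db.slick.DatabaseConfigProvider
import play.api.inject.guice.GuiceApplicationBuilder

class RackRepositorySpec extends PlaySpec with GuiceOneAppPerSuite {

  // Point the default Slick database at in-memory H2 for the test run.
  // These keys assume play-slick 3.x; adjust to your play-slick version.
  override def fakeApplication(): Application =
    new GuiceApplicationBuilder()
      .configure(
        "slick.dbs.default.profile" -> "slick.jdbc.H2Profile$",
        "slick.dbs.default.db.driver" -> "org.h2.Driver",
        "slick.dbs.default.db.url" -> "jdbc:h2:mem:test;DB_CLOSE_DELAY=-1"
      )
      .build()

  // DatabaseConfigProvider is now available, because the full app is started.
  lazy val dbConfigProvider = app.injector.instanceOf[DatabaseConfigProvider]

  "RackRepository" should {
    "resolve the injected DatabaseConfigProvider" in {
      dbConfigProvider must not be null
    }
  }
}
```

Because the application itself provides the binding, this avoids the "No implementation for play.api.db.slick.DatabaseConfigProvider was bound" error from building a bare Guice injector.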

ShouldMatchers was deprecated in ScalaTest 2.x and removed in 3.x. Use Matchers (for should-style assertions) or MustMatchers instead.
See the release notes.
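A minimal sketch of the replacement, assuming ScalaTest 3.0.x (the spec name and assertion are illustrative):

```scala
import org.scalatest.{FlatSpec, Matchers}

// In ScalaTest 3.x, mix in Matchers for should-style assertions
// (or MustMatchers for must-style); ShouldMatchers no longer exists.
class ExampleSpec extends FlatSpec with Matchers {
  "a list" should "report its size" in {
    List(1, 2, 3).size should be(3)
  }
}
```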

Related

Scala: object profile is not a member of package com.amazonaws.auth

I am having a build problem. Here is my sbt file:
name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"
scalacOptions ++= Seq("-feature")
Here is the full error message I am seeing:
[info] Set current project to SparkPi (in build file:/Users/xxx/prog/yyy/)
[info] Updating {file:/Users/xxx/prog/yyy/}yyy...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 2 Scala sources to /Users/xxx/prog/yyy/target/scala-2.11/classes...
[error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:6: object profile is not a member of package com.amazonaws.auth
[error] import com.amazonaws.auth.profile._
[error] ^
[error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:87: not found: type ProfileCredentialsProvider
[error] val creds = new ProfileCredentialsProvider(profile).getCredentials()
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 14 s, completed Nov 3, 2016 1:43:34 PM
And here are the imports I am trying to use:
import com.amazonaws.services.s3._
import com.amazonaws.auth.profile._
How do I import com.amazonaws.auth.profile.ProfileCredentialsProvider in Scala?
EDIT
Changed sbt file so spark core version corresponds to Scala version, new contents:
name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"
scalacOptions ++= Seq("-feature")
You are using scalaVersion := "2.11.8", but your library dependency hard-codes the Scala 2.10 binary suffix (spark-core_2.10), which causes the mismatch.
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
^
Change 2.10 to 2.11:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
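Equivalently, you can let sbt pick the suffix from scalaVersion by using %%; a sketch of the same dependency:

```scala
// build.sbt: %% appends the project's Scala binary suffix (_2.11 here)
// automatically, so the artifact always matches scalaVersion
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
```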

ProvisionException: Unable to provision

I got this error while trying to integrate postgresql into my play app:
ProvisionException: Unable to provision, see the following errors:
1) No implementation for play.api.db.slick.DatabaseConfigProvider was bound.
while locating play.api.db.slick.DatabaseConfigProvider
for parameter 0 at models.EntryRepo.<init>(EntryRepo.scala:10)
while locating models.EntryRepo
for parameter 0 at controllers.Entries.<init>(Entries.scala:17)
while locating controllers.Entries for parameter 4 at router.Routes.<init>(Routes.scala:39)
while locating router.Routes
while locating play.api.inject.RoutesProvider while locating play.api.routing.Router
My SBT file:
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  cache,
  ws,
  specs2 % Test
)
libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play-slick" % "1.1.1",
  "com.typesafe.play" %% "play-slick-evolutions" % "1.1.1",
  "org.postgresql" % "postgresql" % "9.4-1201-jdbc4",
  "com.typesafe.slick" %% "slick" % "3.1.1",
  "com.typesafe.slick" %% "slick-hikaricp" % "3.1.1",
  "org.slf4j" % "slf4j-nop" % "1.6.4"
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
routesGenerator := InjectedRoutesGenerator
My application.conf
slicks.dbs.default.driver="slick.driver.PostgresDriver$"
slicks.dbs.default.dataSourceClass="slick.jdbc.DatabaseUrlDataSource"
slicks.dbs.default.db.default.driver = org.postgresql.Driver
slicks.dbs.default.db.default.url = "jdbc:postgresql://localhost:5000/aleksander"
slicks.dbs.default.db.default.user = postgres
slicks.dbs.default.db.default.password = "lelelel"
The parts with the injection
class EntryRepo #Inject()(protected val dbConfigProvider: DatabaseConfigProvider) {...}
class Entries #Inject()(entryRepo: EntryRepo) extends Controller {...}
I've been following the play-slick3 template from activator. I tried to follow the template as close as possible but the error still persists
I had a similar issue while using Play 2.5.x.
In the new application.conf, the configuration convention changed from simple key-value pairs to a more JSON-like format (called HOCON):
db {
  foo
}
And you may be tricked into thinking that you should put
slicks.dbs.default.driver="slick.driver.PostgresDriver$"
slicks.dbs.default.dataSourceClass="slick.jdbc.DatabaseUrlDataSource"
slicks.dbs.default.db.default.driver = org.postgresql.Driver
slicks.dbs.default.db.default.url ="jdbc:postgresql://localhost:5000/aleksander"
slicks.dbs.default.db.default.user = postgres
slicks.dbs.default.db.default.password = "lelelel"
inside those db {} braces. However, after pulling my hair out for over 6 hours, I found that putting the db configuration OUTSIDE those braces solved my problem. You may have fallen into a similar trap.
Of course, you can still use the nested HOCON style, like this:
slicks.dbs.default.db { /* your slick configuration */}
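For example, both of these spellings are equivalent HOCON. (Note also that play-slick reads the `slick.dbs` prefix, not `slicks.dbs` as written in the question, which is worth double-checking; the values below are copied from the question's config.)

```hocon
# Flat form, outside any braces
slick.dbs.default.driver = "slick.driver.PostgresDriver$"
slick.dbs.default.db.driver = "org.postgresql.Driver"
slick.dbs.default.db.url = "jdbc:postgresql://localhost:5000/aleksander"

# Nested form, same meaning
slick.dbs.default {
  driver = "slick.driver.PostgresDriver$"
  db {
    driver = "org.postgresql.Driver"
    url = "jdbc:postgresql://localhost:5000/aleksander"
  }
}
```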

Play Framework 2.4 WithApplication call not found in play.api.test

I'm trying to simple test a route in the play framework 2.4 and I follow the guide here: https://www.playframework.com/documentation/2.4.x/ScalaFunctionalTestingWithSpecs2 (testing the router)
here the code
package routesAndController
import org.specs2.mutable._
import org.specs2.runner._
import org.junit.runner._
import play.api.test._
import play.api.test.Helpers._
/**
* Created by root on 3/11/16.
*/
@RunWith(classOf[JUnitRunner])
class AnalysisEntryPointTest extends Specification {
  "the AnalysisEntryPoint" should {
    "where the route must be /DomoticRoomServer/Analysis with 200" in new WithApplication {
      val result = route(FakeRequest(GET, "/domoticRoom/analysis")).get
      status(result) must equalTo(OK)
      contentType(result) must beSome.which(_ == "text/html")
    }
  }
}
All pretty straightforward. The problem is that the class WithApplication is not found in the play.api.test package, only in play.test.
I tried to use the one in play.api.test, but specs2 gives me this error:
[error] /home/benkio/projects/DomoticRoom/Server/test/routesAndController/AnalysisEntryPointTest.scala:19: could not find implicit value for evidence parameter of type org.specs2.execute.AsResult[play.test.WithApplication{val result: scala.concurrent.Future[play.api.mvc.Result]}]
[error] "where the route must be /DomoticRoomServer/Analysis with 200" in new WithApplication() {
[error] ^
[error] one error found
[error] (test:compileIncremental) Compilation failed
any suggestions?
here the build.sbt:
import play.routes.compiler.InjectedRoutesGenerator
import play.sbt.PlayScala
name := """Domotic Room Server"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
resolvers ++= Seq(
  "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases",
  "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots",
  "Millhouse Bintray" at "http://dl.bintray.com/themillhousegroup/maven"
)
libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play-cache" % "2.4.6",
  "org.specs2" %% "specs2-core" % "3.6" % "test",
  "org.specs2" %% "specs2-junit" % "3.6" % "test",
  "org.specs2" %% "specs2-scalacheck" % "3.6" % "test",
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
  "com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26"
)
// Play provides two styles of routers, one expects its actions to be injected, the
// other, legacy style, accesses its actions statically.
routesGenerator := InjectedRoutesGenerator
scalacOptions in Test ++= Seq("-Yrangepos")
fork in run := true
And here my project/plugin.sbt:
// The Typesafe repository
resolvers += "Typesafe repository" at "https://repo.typesafe.com/typesafe/releases/"
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.4.6")
Play has a shortcut to declare the test dependencies, including its own packages. The correct way to add specs2 and Play test classes is:
libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play-cache" % "2.4.6",
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
  "com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26",
  specs2 % Test
)
This is documented here. There is also a shortcut to use cache, as also documented here. So your dependencies should be declared like this:
libraryDependencies ++= Seq(
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
  "com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26",
  cache,
  specs2 % Test
)
The advantage here is that you don't need to track which dependency versions are compatible with Play. You also don't need to repeat the Play version across your dependencies, just in the project/plugins.sbt file.
Of course, you can still override and add any other dependencies you like. You were adding scalacheck, for instance:
libraryDependencies ++= Seq(
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.9",
  "com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.26",
  cache,
  specs2 % Test,
  "org.specs2" %% "specs2-scalacheck" % "3.6" % Test
)
Edit after discussion:
Welcome to Dependency Hell. It looks like play2-reactivemongo and play2-reactivemongo-mocks pull in a very old specs2 dependency. You can see this by using sbt-dependency-graph and running sbt dependencyTree. Here is the relevant section of the output:
[info] +-com.themillhousegroup:play2-reactivemongo-mocks_2.11:0.11.9_0.4.27 [S]
[info] | +-org.reactivemongo:play2-reactivemongo_2.11:0.11.10 [S]
[info] | +-org.specs2:specs2_2.11:2.3.13 [S]
You can also see this in the code for play2-reactivemongo-mocks, play2-reactivemongo, and Play Framework 2.4.6. These are incompatible versions of specs2, and sbt cannot evict the old versions because the projects each add different specs2 artifacts (note how Play adds specific modules, in contrast with play2-reactivemongo-mocks).
In other words, it looks like the test support offered by play2-reactivemongo-mocks is not compatible with the test support offered by Play. You can open an issue or submit a pull request to solve this, but a new version of play2-reactivemongo-mocks is necessary.
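To reproduce a dependency tree like the one above, the plugin can be added to project/plugins.sbt; a sketch assuming sbt 0.13 and sbt-dependency-graph 0.8.x:

```scala
// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")

// Then, from the sbt shell, inspect the resolved dependency tree:
//   dependencyTree
```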
A possible solution
Exclude specs2 from play2-reactive dependencies:
libraryDependencies ++= Seq(
  "org.reactivemongo" %% "play2-reactivemongo" % "0.11.10" exclude("org.specs2", "*"),
  "com.themillhousegroup" %% "play2-reactivemongo-mocks" % "0.11.9_0.4.27" exclude("org.specs2", "*"),
  cache,
  specs2 % Test,
  "org.specs2" %% "specs2-scalacheck" % "3.6" % Test
)

Why does an sbt project with scalaVersion set to 2.11 load 2.10 as well?

Following is the core of the project/build.sbt for a scalatra/spark project :
val ScalaVersion = "2.11.6"
val ScalatraVersion = "2.4.0-RC2-2"
// ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true)}
lazy val project = Project(
  "keywordsservlet",
  file("."),
  settings = ScalatraPlugin.scalatraSettings ++ scalateSettings ++ Seq(
    organization := Organization,
    name := Name,
    version := Version,
    scalaVersion := ScalaVersion,
    resolvers += Classpaths.typesafeReleases,
    resolvers += "Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
    libraryDependencies ++= Seq(
      // "org.scala-lang" % "scala-reflect" % ScalaVersion,
      "org.apache.spark" % "spark-core_2.11" % "1.4.1",
      "org.scalatra" %% "scalatra" % ScalatraVersion,
      "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
      "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
      "ch.qos.logback" % "logback-classic" % "1.1.2" % "runtime",
      "org.eclipse.jetty" % "jetty-webapp" % "9.2.10.v20150310" % "container",
      "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided"
    ),
Here is the sbt output; notice it is loading a 2.10 target!
$ sbt
[info] Loading project definition from /shared/keywords/project
[info] Updating {file:/shared/keywords/project/}keywords-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /shared/keywords/project/target/scala-2.10/sbt-0.13/classes...
[info] Set current project to KeywordsServlet (in build file:/shared/keywords/)
So what is happening here?
There is a difference between the version of Scala that you are using for your project and the version of Scala that sbt itself uses.
sbt 0.13 can compile 2.9, 2.10 and 2.11 (and 2.12). However, when it compiles your build.sbt or Build.scala files, sbt 0.13 uses Scala 2.10.
Similarly, all of the plugins that sbt uses are compiled with 2.10.
On the other hand, sbt 0.12 used Scala 2.9.
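So the log line is expected; a short annotated sketch of the relevant build.sbt setting:

```scala
// scalaVersion only controls the Scala used to compile YOUR sources,
// producing target/scala-2.11/... artifacts:
scalaVersion := "2.11.6"

// The build definition itself (build.sbt and project/*.scala) is compiled
// by sbt 0.13 with its own Scala 2.10, which is why the log shows
// project/target/scala-2.10/sbt-0.13/classes. This is harmless.
```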

Errors while compiling project migrated to SBT - error while loading package and Assertions

I'm migrating a Scala application, which compiles and runs fine when I manually include jars on the classpath, to an sbt build configuration.
My build.sbt is as follows:
name := "hello"
version := "1.0"
scalaVersion := "2.9.2"
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.6.4"
libraryDependencies += "junit" % "junit" % "4.11"
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "1.9.2"
libraryDependencies += "org.hamcrest" % "hamcrest-all" % "1.3"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.0.13"
libraryDependencies += "com.github.scct" % "scct_2.10" % "0.2.1"
libraryDependencies += "org.scala-lang" % "scala-swing" % "2.9.2"
When I compile it I get the following errors:
Loading /usr/share/sbt/bin/sbt-launch-lib.bash
[info] Set current project to hello (in build file:/home/kevin/gitrepos/go-game-msc/)
> compile
[info] Updating {file:/home/kevin/gitrepos/go-game-msc/}go-game-msc...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 25 Scala sources to /home/kevin/gitrepos/go-game-msc/target/scala-2.9.2/classes...
[error] error while loading package, class file needed by package is missing.
[error] reference value <init>$default$2 of object deprecated refers to nonexisting symbol.
[error] error while loading Assertions, class file needed by Assertions is missing.
[error] reference value <init>$default$2 of object deprecated refers to nonexisting symbol.
[error] two errors found
[error] (compile:compile) Compilation failed
[error] Total time: 21 s, completed 09-Mar-2014 12:07:14
I've tried matching up the dependencies with the jar files I am using:
hamcrest-all-1.3.jar
logback-classic-1.0.13.jar
scalaedit-assembly-0.3.7(1).jar
scalatest_2.9.0-1.9.1.jar
slf4j-simple-1.6.4.jar
hamcrest-core-1.3.jar
logback-core-1.0.13.jar
scalaedit-assembly-0.3.7.jar
scct_2.9.2-0.2-SNAPSHOT.jar
junit-4.11.jar
miglayout-4.0.jar
scalariform.jar
slf4j-api-1.7.5.jar
Please advise.
Never mix Scala binary versions.
Always use %% (instead of % with an explicit _2.x.x suffix):
libraryDependencies +="org.scalatest" %% "scalatest" % "1.9.2"
libraryDependencies +="com.github.scct" %% "scct" % "0.2-SNAPSHOT"