In a Play 2.6.7 Scala project with ReactiveMongo, I am having an issue.
When I run a test case through IntelliJ IDEA (CE)'s test runner, it runs without problems, but when I run the same test case from the command line using sbt, it fails. This means I have to run tests manually, which is a burden.
When the tests are run with sbt, the following errors are reported:
[info] controllers.SomeControllerSpec *** ABORTED ***
[info] com.google.inject.ProvisionException: Unable to provision, see the following errors:
[info]
[info] 1) No implementation for play.modules.reactivemongo.ReactiveMongoApi annotated with @play.modules.reactivemongo.NamedDatabase(value=somedatabase) was bound.
[info] while locating play.modules.reactivemongo.ReactiveMongoApi annotated with @play.modules.reactivemongo.NamedDatabase(value=somedatabase)
and the same message is repeated 72 times. Now the questions are:
i. Why does it work in IDEA?
ii. What can I do to make it work through sbt?
(I have already tried the named-dependency approach, but it did not help. In any case, the test already works in IDEA!)
Thanks.
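For context, named ReactiveMongo bindings are driven purely by configuration; a sketch of the relevant conf/application.conf entries (the URIs here are placeholder assumptions) looks like this. If the sbt test run resolves a different configuration file than IDEA does (for example via javaOptions in Test or -Dconfig.resource), the named binding would be missing only on the sbt side:

```hocon
# Enable the ReactiveMongo module (it may already be enabled in your setup)
play.modules.enabled += "play.modules.reactivemongo.ReactiveMongoModule"

# Default connection pool
mongodb.uri = "mongodb://localhost:27017/default"

# Named connection pool, matched by @NamedDatabase("somedatabase")
mongodb.somedatabase.uri = "mongodb://localhost:27017/somedatabase"
```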
UPDATE
Here is what the test case looks like:
package controllers
import models.Some.SomeRequest
import org.scalatestplus.play._
import org.scalatestplus.play.guice.GuiceOneAppPerSuite
import play.api.libs.json.{JsObject, JsValue, Json}
import play.api.libs.ws.WSResponse
import play.api.mvc._
import play.api.test.FakeRequest
import play.api.test.Helpers.contentAsJson
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
class SomeControllerTest extends PlaySpec with GuiceOneAppPerSuite {

  implicit val timeout: akka.util.Timeout = 5.seconds

  val controller = app.injector.instanceOf(classOf[SomeController])

  "Some test" should {
    "be successful for valid request" in {
      val someValidReq: JsValue = Json.toJson(
        SomeRequest(param1 = "value1", param2 = "value2")
      ).as[JsValue]
      val result: Future[Result] = controller.someMethod.apply(
        FakeRequest("POST", "/endpoint").withJsonBody(someValidReq)
      )
      println("test req: " + someValidReq)
      val respJson = contentAsJson(result)
      respJson.as[JsObject].value("msg").as[String] mustBe "Successfully completed"
    }
  }
}
I can run the above test through the Run menu of IntelliJ IDEA.
However, I can't run the test by using sbt, as the following fails:
sbt testOnly controllers.SomeControllerTest
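As an aside (this may not be the cause of the provisioning error itself): when testOnly takes arguments, the whole sbt command has to be quoted so the shell passes it to sbt as one command; unquoted, sbt runs a bare testOnly and then tries to interpret the class name as a separate command:

```shell
sbt "testOnly controllers.SomeControllerTest"
```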
Related
Say I have a ScalaTest class in src/main/scala, like:
import org.scalatest.FunSuite
class q3 extends FunSuite {
  test("6 5 4 3 2 1") {
    val digits = Array(6, 5, 4, 3, 2, 1)
    assert(digits.sorted === Array(1, 2, 3, 4, 5, 6))
  }
}
How do I run it with sbt?
I've tried sbt test, sbt testOnly, and sbt "testOnly *q3", and they all produced output like:
[info] Run completed in 44 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] No tests to run for Test / testOnly
A similar question from a few years back said they successfully used testOnly, but I can't get it to work.
The metals extension on VSCode shows a "test" link when the file is open which successfully runs the test, but doesn't show how it does that. I want to know how to do it through sbt.
Put ScalaTest on the Compile classpath in build.sbt like so:
libraryDependencies += "org.scalatest" %% "scalatest" % "3.1.0"
and then call the org.scalatest.run runner explicitly from within an App, for example:
object MainTests extends App {
  org.scalatest.run(new ExampleSpec)
}
Putting it together, we have in src/main/scala/example/MainTests.scala:
package example

import org.scalatest.matchers.should.Matchers
import org.scalatest.flatspec.AnyFlatSpec
import scala.collection.mutable.Stack

class ExampleSpec extends AnyFlatSpec with Matchers {
  "A Stack" should "pop values in last-in-first-out order" in {
    val stack = new Stack[Int]
    stack.push(1)
    stack.push(2)
    stack.pop() should be (2)
    stack.pop() should be (1)
  }
}

object MainTests extends App {
  org.scalatest.run(new ExampleSpec)
}
and run it with runMain example.MainTests. Furthermore, we can gather tests in Suites and execute them all like so:
import org.scalatest.Suites
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class ExampleASpec extends AnyFlatSpec with Matchers {
  "ExampleA" should "run" in { succeed }
}

class ExampleBSpec extends AnyFlatSpec with Matchers {
  "ExampleB" should "run" in { succeed }
}

class ExampleSuite extends Suites(
  new ExampleASpec,
  new ExampleBSpec
)

object MainTests extends App {
  (new ExampleSuite).execute()
}
I'm completely new to Cassandra. I tried to initialize a database in Cassandra using phantom-dsl and received this error message:
*** RUN ABORTED ***
java.lang.AssertionError: assertion failed: no symbol could be loaded from class com.datastax.driver.core.Cluster in package core with name Cluster and classloader sun.misc.Launcher$AppClassLoader@279f2327
at scala.reflect.runtime.JavaMirrors$JavaMirror.scala$reflect$runtime$JavaMirrors$JavaMirror$$classToScala1(JavaMirrors.scala:1021)
at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToScala$1.apply(JavaMirrors.scala:980)
at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToScala$1.apply(JavaMirrors.scala:980)
at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$toScala$1.apply(JavaMirrors.scala:97)
at scala.reflect.runtime.TwoWayCaches$TwoWayCache$$anonfun$toScala$1.apply(TwoWayCaches.scala:39)
at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
at scala.reflect.runtime.TwoWayCaches$TwoWayCache.toScala(TwoWayCaches.scala:34)
at scala.reflect.runtime.JavaMirrors$JavaMirror.toScala(JavaMirrors.scala:95)
at scala.reflect.runtime.JavaMirrors$JavaMirror.classToScala(JavaMirrors.scala:980)
I am not really sure whether it is an issue with the Connector in phantom-dsl or the ClusterBuilder in the Datastax driver.
Connector.scala
package com.neruti.db
import com.neruti.db.models._
import com.websudos.phantom.database.Database
import com.websudos.phantom.connectors.ContactPoints
import com.websudos.phantom.dsl.KeySpaceDef
object Connector {
  val host = Seq("localhost")
  val port = 9160
  val keySpace: String = "nrt_entities"
  // val inet = InetAddress.getByName

  lazy val connector = ContactPoints(host, port).withClusterBuilder(
    _.withCredentials("cassandra", "cassandra")
  ).keySpace(keySpace)
}
CassandraSpec.scala
package com.neruti.db
import com.neruti.User
import com.neruti.db.models._
import com.neruti.db.databases._
import com.neruti.db.services._
import com.neruti.db.Connector._
import java.util.UUID
import com.datastax.driver.core.ResultSet
import org.scalatest._
import org.scalatest.{BeforeAndAfterAll,FlatSpec,Matchers,ShouldMatchers}
import org.scalatest.concurrent.ScalaFutures
import org.scalamock.scalatest.MockFactory
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
abstract class BaseCassandraSpec extends FlatSpec
  with BeforeAndAfterAll
  with Inspectors
  with Matchers
  with OptionValues
  with ScalaFutures

class CassandraTest extends BaseCassandraSpec
  with ProductionDatabase
  with UserService
  with Connector.connector.Connector {

  val user = User(
    Some("foobar"),
    Some("foo@foobar.com"),
    Some(UUID.randomUUID())
  )

  override protected def beforeAll(): Unit = {
    Await.result(database.userModel.create(user), 10.seconds)
  }
}
It looks like there are multiple issues here:
The latest version of phantom is 2.1.3; I'd strongly recommend using that, especially if you are just starting out. The migration guide is here in case you need it.
The entire reflection mechanism has been replaced in the latest version, so that error should magically go away. With respect to testing and generating objects, I would also look at including com.outworkers.util.testing, which is freely available on Maven Central:
libraryDependencies ++= Seq(
  // ...,
  "com.outworkers" %% "phantom-dsl" % "2.1.3",
  "com.outworkers" %% "util-testing" % "0.30.1" % Test
)
This will offer you automated case class generation:
import com.outworkers.util.testing._
val sample = gen[User]
I want to verify the order of a sequence of calls, but it didn't work as I expected.
import akka.actor.ActorSystem
import akka.testkit.TestKit
import org.scalatest._
import org.specs2.mock.Mockito
class Test extends TestKit(ActorSystem("testSystem"))
  with WordSpecLike
  with BeforeAndAfterAll
  with PrivateMethodTester
  with Mockito {

  val m = mock[java.util.List[String]]
  m.get(0) returns "one"
  there was two(m).get(2) // does not throw any error, why???
}
I'm using Scala 2.11.7, specs2-core 3.6.6, specs2-mock 3.6.6, and ScalaTest 2.2.4.
Thanks.
I don't think you can mix specs2 and ScalaTest.
You should remove import org.scalatest._ and use import org.specs2.mutable.SpecificationLike instead.
import akka.testkit.TestKit
import akka.actor.ActorSystem
import org.specs2.mock.Mockito
import org.specs2.mutable.SpecificationLike
class Test extends TestKit(ActorSystem("testSystem"))
  with Mockito
  with SpecificationLike {

  "it" should {
    "raise error" in {
      val m = mock[java.util.List[String]]
      m.get(0) returns "one"
      there was two(m).get(2)
    }
  }
}
Now you can see that sbt test returns something like:
[error] The mock was not called as expected:
[error] Wanted but not invoked:
[error] list.get(2);
[error] -> at Test$$anonfun$1$$anonfun$apply$1$$anonfun$apply$3.apply(Test.scala:14)
[error] Actually, there were zero interactions with this mock. (Test.scala:14)
I am attempting to execute a Specification with multiple tests that all run within the same Play application, rather than a separate application for each test.
As such, I have the following code, which should print:
Play app started
[info] PlayRunningImmutableSpec
[info]
[info] + 200 status expected
[info]
[info] + 404 status expected
Play app stopped
but instead prints:
Play app started
Play app stopped
[info] PlayRunningImmutableSpec
[info]
[info]
[info] ! 200 status expected
[error] ConnectException: : Connection refused: /127.0.0.1:19001 to http://127.0.0.1:19001/
I am using Typesafe Activator 1.2.10 which includes Play 2.3.3 and Specs2 2.3.12
What is wrong with the following code, and what would work instead?
import org.specs2.Specification
import org.specs2.execute.Result
import org.specs2.specification.Step
import org.specs2.time.NoTimeConversions
import play.api.Play
import play.api.Play.current
import play.api.http.{HeaderNames, HttpProtocol, Status}
import play.api.libs.ws.WS
import play.api.test._
class PlayRunningImmutableSpec extends Specification with NoTimeConversions with PlayRunners
  with HeaderNames with Status with HttpProtocol with DefaultAwaitTimeout with ResultExtractors
  with Writeables with RouteInvokers with FutureAwaits {

  override def is = s2"""
    ${Step(beforeAll)}
    200 status expected $e1
    404 status expected $e2
    ${Step(afterAll)}
  """

  def e1: Result =
    await(WS.url(s"http://127.0.0.1:${Helpers.testServerPort}").get()).status === 200

  def e2: Result =
    await(WS.url(s"http://127.0.0.1:${Helpers.testServerPort}/missing").get()).status === 404

  lazy val app = FakeApplication()

  private def beforeAll = {
    Play.start(app)
    println("Play app started")
  }

  private def afterAll = {
    Play.stop()
    println("Play app stopped")
  }
}
EDIT:
I realised my error was in the use of the play.api.Play.start method; I now have a simple trait to handle a single startup and shutdown:
trait PlayServerRunning extends SpecificationLike {
  override def map(fs: => Fragments): Fragments = Step(beforeAll) ^ fs ^ Step(afterAll)

  private lazy val server = TestServer(Helpers.testServerPort)

  private def beforeAll = server.start()

  private def afterAll = server.stop()
}
That's on purpose. Tests are executed in parallel (the implementation details depend on the execution context).
If your tests need to be sequential, you must declare it, e.g.:
"X" should {
sequential
"exp1" in { ... }
"exp2" in { ... }
}
I'm new to Scala and Dispatch, and I can't seem to get a basic POST request working.
I'm building an sbt plugin that uploads files to a third-party service.
Here is my build.sbt file:
sbtPlugin := true
name := "sbt-sweet-plugin"
organization := "com.mattwalters"
version := "0.0.1-SNAPSHOT"
libraryDependencies += "net.databinder.dispatch" %% "dispatch-core" % "0.11.0"
And here is the plugin's SweetPlugin.scala:
package com.mattwalters
import sbt._
import Keys._
import dispatch._
object SweetPlugin extends Plugin {
  import SweetKeys._

  object SweetKeys {
    lazy val sweetApiToken =
      SettingKey[String]("Required. Find yours at https://example.com/account/#api")
    lazy val someOtherToken =
      SettingKey[String]("Required. Find yours at https://example.com/some/other/token/")
    lazy val sweetFile =
      SettingKey[String]("Required. File data")
    lazy val sweetNotes =
      SettingKey[String]("Required. Release notes")
    lazy val sweetUpload =
      TaskKey[Unit]("sweetUpload", "A task to upload the specified sweet file.")
  }

  override lazy val settings = Seq(
    sweetNotes := "some default notes",
    // define the upload task
    sweetUpload <<= (
      sweetApiToken,
      someOtherToken,
      sweetFile,
      sweetNotes
    ) map { (sweetApiToken, someOtherToken, sweetFile, sweetNotes) =>
      // define http stuff here; Req is immutable, so chain the addParameter calls
      val request = :/("www.example.com") / "some" / "random" / "endpoint"
      val post = request.POST
        .addParameter("api_token", sweetApiToken)
        .addParameter("some_other_token", someOtherToken)
        .addParameter("file", io.Source.fromFile(sweetFile).mkString)
        .addParameter("notes", sweetNotes)
      val responseFuture = Http(post OK as.String)
      val response = responseFuture()
      println(response) // see if we can get something at all....
    }
  )
}
The dispatch documentation shows:
import dispatch._, Defaults._
but I get
reference to Defaults is ambiguous;
[error] it is imported twice in the same scope by
removing , Defaults._ makes this error go away.
I also tried the recommendation from this post:
import dispatch._
import dispatch.Default._
But alas I get:
object Default is not a member of package dispatch
[error] import dispatch.Default._
Also tried the advice from
Passing implicit ExecutionContext to contained objects/called methods:
import concurrent._
import concurrent.duration._
But I still get
Cannot find an implicit ExecutionContext, either require one yourself or import ExecutionContext.Implicits.global
Back to square one...
New to scala so any advice at all on the code above is appreciated.
Since the recommended sbt console run works fine, it looks like one of your other imports also has a Defaults module. Collecting implicit values in a module to be used as function parameters is a standard approach in Scala (another naming convention is Implicits).
The other problem is that there is a typo/out-of-date problem in the import statement you got from Google Groups: it's Defaults, plural.
In summary, the best solution is to explicitly let Scala know which module you want to use:
import dispatch._
import dispatch.Defaults._
In the general case (only if the library's docs don't say otherwise): the last error message is pretty common in concurrent programming in Scala. To quote the relevant part:
either require one yourself or import ExecutionContext.Implicits.global
So, unless you want to roll your own ExecutionContext, just import scala.concurrent.ExecutionContext.Implicits.global.
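As a minimal, self-contained sketch of that last option (plain Scala, no Dispatch involved), the global execution context satisfies the implicit parameter that Future needs:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
// This single import provides the implicit ExecutionContext that
// Future.apply (and, in the same way, Dispatch's Http calls) require.
import scala.concurrent.ExecutionContext.Implicits.global

object EcExample extends App {
  val answer: Future[Int] = Future(21 * 2) // runs on the global context
  println(Await.result(answer, 5.seconds)) // prints 42
}
```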