In my sbt project, I have two packages.
I want the tests in package.one to run first and then the tests in package.two. This is needed because the tests in package.two depend on a Map created and populated by the tests from the first package. Is there any way to achieve this?
Let's assume you have the following package structure:
p1/
  Test1.scala
p2/
  Test2.scala
Then you can run the following commands:
sbt> testOnly -- -m "p1"
sbt> testOnly -- -m "p2"
This will execute the tests from p1 first and then the tests from p2.
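Note that sbt may still run suites in parallel within a single testOnly invocation, so if the two packages share mutable state you may also want to make execution sequential. A minimal build.sbt sketch (assuming the default Test configuration, sbt 0.13 syntax):

```scala
// build.sbt -- run test suites one at a time instead of in parallel,
// so state populated by earlier suites is visible to later ones
parallelExecution in Test := false
```

With this set, the two testOnly invocations above run their suites in order.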
I have the package structure as:
src -> test -> scala -> notification
Inside notification I have two packages, unit and integration.
The unit package has the unit tests and the integration package has the integration tests. I want to execute only the unit test cases. Is there a way I can do it with sbt testOnly?
For one class, I know it can be done like this:
sbt "test:testOnly *LoginServiceSpec"
I have tried it for one class but do not know how to do it for a package.
testOnly allows you to use wildcards with *. So if all your tests are in the package namespace x, you can call
> testOnly x.*
This wildcard can be put anywhere, so if you have a subpackage in x but all your tests end with e.g. Spec, you could run
> testOnly x.*Spec
However, if you are using integration tests, I recommend creating a separate configuration for them, so that you would run:
> test
for unit tests and
> it:test
for integration tests. Then you would put them in the src/it/scala directory.
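The it configuration is not enabled by default; the sbt documentation shows how to add it. A sketch of the build.sbt wiring (the ScalaTest version here is illustrative, not prescribed by the answer):

```scala
// build.sbt -- enable the predefined IntegrationTest configuration
// so that tests in src/it/scala run under it:test
lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(
    Defaults.itSettings,
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "it,test"
  )
```

After reloading, test runs only src/test/scala and it:test runs only src/it/scala.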
Try this
sbt "test:testOnly notification.unit.*"
I want to build a subproject in Spark with sbt. I found this example and it works:
$ ./build/sbt -Phive -Phive-thriftserver (build)
sbt (spark)> project hive (switch to subproject)
sbt (hive)> testOnly *.HiveQuerySuite -- -t foo ( run test case)
However, when I tried the following, it did not build but quit:
./build/sbt -mllib
I do not know how the author figured out -Phive -Phive-thriftserver. I cannot find these in the Spark source code.
I just want to do the exact same thing as the example but with a different subproject.
To be clear, I am not asking how to use the projects command to print out all available projects.
Specify the project scope:
./build/sbt mllib/compile
Refer to: http://www.scala-sbt.org/0.13/docs/Scopes.html
Say I have a ScalaTest file in the main directory. Is there an sbt command to run the test, such as testOnly or runMain? In IntelliJ, you are given the option to run the test.
You should be able to use test-only. From the ScalaTest user guide:
test-only org.acme.RedSuite org.acme.BlueSuite
I have a multi module project with moduleA, moduleB, and moduleC. I want to run my class com.helpme.run.MyTest from moduleB.
My guess is the sbt command should look like:
sbt "project moduleA" --mainClass com.helpme.run.MyTest test
But no luck. Please help!! Thanks!
First of all, you can run a test by using testOnly:
$ sbt "testOnly *MyTest"
But if your project is a multi-project sbt build and you have a test class with the same name in more than one project, you can navigate between projects with the project command and then run the test:
$ sbt
> project moduleA
> testOnly *MyTest
Note that you have to run sbt first and then run the rest of the commands from the sbt shell.
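Equivalently, you can scope the task to a module directly from your OS shell without entering the sbt prompt; for example, using the module and class names from the question:

```shell
$ sbt "moduleB/testOnly com.helpme.run.MyTest"
```

The moduleB/ prefix scopes testOnly to that subproject, so sbt does not run tests in moduleA or moduleC.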
Depending on your project configuration, testOnly might not work.
You can try this command too:
sbt "project myProject" "testOnly com.this.is.my.Test"
I have not found any documentation on how to do this. For JUnit the equivalent would be:
mvn -Dtest=org.apache.spark.streaming.InputStreamSuite test
tl;dr mvn test -Dsuites="some.package.SpecsClass"
I found an answer here and it works: https://groups.google.com/forum/#!topic/scalatest-users/Rr0gy61dg-0
Run the test 'a pending test' in HelloSuite, and all tests in HelloWordSpec:
mvn test -Dsuites='org.example.HelloSuite #a pending test, org.example.HelloWordSpec'
Run all tests in HelloSuite containing 'hello':
mvn test -Dsuites='org.example.HelloSuite hello'
For more details: http://scalatest.org/user_guide/using_the_scalatest_maven_plugin
Found the answer: it is
-DwildcardSuites
So here is the example command line:
mvn -pl streaming -DwildcardSuites=org.apache.spark.streaming.InputStreamSuite test
Update: newer versions of ScalaTest use
-Dsuites
So the syntax would be:
mvn -pl streaming -Dsuites=org.apache.spark.streaming.InputStreamSuite test
Note that if you have some Java tests in the same module, as much of Spark does, you need to turn them off, which you can do by telling Surefire to run a test that isn't there.
Here is the test that I've just been running:
mvn test -Dtest=moo -DwildcardSuites=org.apache.spark.deploy.yarn.ClientSuite
That skips the Java tests and runs only the Scala one.
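If you always want this split, it can also be wired into the module's pom.xml instead of repeated on the command line. A sketch based on the scalatest-maven-plugin user guide (plugin versions omitted; adapt to your build):

```xml
<!-- disable Surefire so Java tests are skipped by default -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <skipTests>true</skipTests>
  </configuration>
</plugin>
<!-- run ScalaTest suites during the test phase instead -->
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>test</id>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With that in place, mvn test -Dsuites=... runs only the ScalaTest suites without needing the -Dtest=moo trick.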
One thing ScalaTest doesn't seem to do is let you run a single test within a suite, the way Maven Surefire does. That's not ideal if you have one failing test in a big suite.
[Correction 2016-08-22: looks like you can ask for a specific suite by name; look at the other answers below. Happy to be wrong].