I am trying to run a program developed with Scala and Cucumber from the command line. The job executes successfully in IntelliJ when I run the Runner class, but when I run it from the command line it shows "No tests run":
[info] Loading global plugins from /Users/user/.sbt/0.13/plugins
[info] Loading project definition from /Users/user/Documents/Spark-Scala/Spark_Scala_Cucumber/project
[info] Set current project to Spark_Scala_Cucumber (in build file:/Users/user/Documents/Spark-Scala/Spark_Scala_Cucumber/)
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for test:testOnly
[success] Total time: 2 s, completed Aug 29, 2019 11:15:02 PM
I tried the following commands:
sbt test
and
sbt "testOnly /Users/user/Documents/Spark-Scala/Spark_Scala_Cucumber/src/test/scala/features/steps/testRunner.scala"
@RunWith(classOf[Cucumber])
@CucumberOptions(
  features = Array("classpath:features"),
  glue = Array("classpath:features.steps"),
  tags = Array("@my-tag"),
  monochrome = true,
  plugin = Array("pretty",
    "html:target/cucumber",
    "json:target/cucumber/test-report.json",
    "junit:target/cucumber/test-report.xml")
)
class testRunner {}
Path for this file:
/Users/user/Documents/Spark-Scala/Spark_Scala_Cucumber/src/test/scala/features/steps/testRunner.scala
I expect this to be executed from the command line and the Cucumber report to be generated there as well.
I had missed adding a dependency in build.sbt.
I found the solution in this earlier thread:
How do you run cucumber with Scala 2.11 and sbt 0.13?
The dependency to add to build.sbt:
"com.novocode" % "junit-interface" % "0.11" % Test
With this dependency the code can be executed from the command line:
sbt test
This executes all the scenarios/feature files with the specified tags from the src/test/scala/features folder.
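For reference, a minimal build.sbt sketch of the test dependencies; the Cucumber artifacts and versions below are assumptions for illustration, and only the junit-interface line comes from the answer above:
libraryDependencies ++= Seq(
  "io.cucumber" %% "cucumber-scala" % "4.7.1" % Test,  // assumed Cucumber Scala binding
  "io.cucumber" % "cucumber-junit" % "4.7.1" % Test,   // assumed JUnit runner for Cucumber
  "com.novocode" % "junit-interface" % "0.11" % Test   // lets sbt discover and run the JUnit-based runner
)
Note that testOnly expects a fully qualified class name rather than a file path, e.g. sbt "testOnly features.steps.testRunner".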
I am attempting to create the PlayFramework Scala seed project.
So far I've used the sbt new playframework/play-scala-seed.g8 command, and it has created the necessary files within my root directory movie-app.
From this point, PlayFramework says to run sbt run, so I tried that, but I get the following error:
[info] Updated file *omitting personal directories*/Movie-App/project/build.properties: set sbt.version to 1.4.7
[info] welcome to sbt 1.4.7 (Ubuntu Java 11.0.10)
[info] loading project definition from *omitting personal directories*/Movie-App/project
[info] set current project to movie-app (in build file:*omitting personal directories*/Movie-App/)
[error] java.lang.RuntimeException: No main class detected.
[error] at scala.sys.package$.error(package.scala:30)
[error] stack trace is suppressed; run last Compile / bgRun for the full output
[error] (Compile / bgRun) No main class detected.
[error] Total time: 0 s, completed Jul 2, 2021, 11:27:33 PM
I haven't found anything helpful online yet.
Do I need to set the current project to "movie-app" like the error says? If so, what do I need to write in the build.properties file?
If not, can anyone please explain the issue?
Thanks
The template generates the project in a new movie-app subdirectory, so sbt run has to be executed from inside that directory rather than from the parent folder where sbt new was run:
$> sbt new playframework/play-scala-seed.g8
This template generates a Play Scala project.
Give it a name when asked. Skip rest by pressing enter.
name [play-scala-seed]: movie-app
$> cd movie-app
$> sbt run
When running Scala in sbt via runMain, I have the issue that some output written via println is cut off. If I run the following code with sbt "runMain aw.OutputTry", the output starts to get cut off at some point.
package aw
object OutputTry {
  def main(args: Array[String]): Unit = {
    for (i <- 1 to 5000) {
      println(f"${i}")
    }
  }
}
Example output (I snipped the output at the '...'):
uhu01#DESKTOP-4LSJM58:~/git/spinal$ sbt "runMain aw.OutputTry"
[info] Loading settings for project spinal-build from metals.sbt,plugins.sbt ...
[info] Loading project definition from /home/uhu01/git/spinal/project
[info] Loading settings for project spinal from build.sbt ...
[info] Set current project to aw (in build file:/home/uhu01/git/spinal/)
[info] sbt server started at local:///home/uhu01/.sbt/1.0/server/771a115d7899feb4b3f3/sock
sbt:aw> runMain aw.OutputTry
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Running (fork) aw.OutputTry
[info] 1
[info] 2
[info] 3
...
[info] 963
[info] 964
[info] 965
[success] Total time: 4 s, completed Apr 13, 2020 11:15:55 PM
I assume some buffering is going on in sbt, e.g. to prepend the output with the [info] tag? To me the behavior looks a bit like a buffer in sbt is not flushed after the program exits.
Things I tried:
Flushing the output in the Scala code (by calling Console.flush() in the loop, after println) - does not help
The sbt documentation mentions the setting logBuffered; I checked it and it is already set to false
Piping the output of the sbt call to a file - then all lines are visible as expected
Calling Thread.sleep(1000) before exiting only moves the problem, and would not be a solution anyway
I first suspected my environment (shell, etc.), but running the code directly in a Scala REPL works as expected
Is there some sbt setting that I overlooked during my search? Any tips on how to get the full output shown?
Environment: I'm using sbt 1.2.7, Scala 2.11.12 and openjdk 1.8.0_424 on Ubuntu 18.04 in WSL
sbt 1.2.7 is a pretty old version. Try 1.3.9 (latest as of April 2020).
(And remember that in order to keep builds reproducible, your sbt version is determined by your project/build.properties file, not by what you have installed.)
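Concretely, that means bumping the version in project/build.properties, roughly like this:
# project/build.properties
sbt.version=1.3.9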
I have an interesting problem where I basically need to create a .jar (plus all of the classpath dependencies) that contains all of the tests of an SBT project (plus any of its subprojects). The idea is that I can just run the jar using java -jar and all of the tests will execute.
I heard that this is possible with sbt-assembly, but you would have to manually run assembly for each sbt subproject that you have (each with its own .jar), whereas ideally I would want to run one command that generates a giant .jar covering every test in every sbt root and subproject (in the same way that running test in an sbt project with subprojects runs the tests for everything).
The current testing framework that we are using is specs2 although I am not sure if this makes a difference.
Does anyone know if this is possible?
Exporting test runner is not supported
sbt 1.3.x does not have this feature. Defined tests are executed in tandem with the runner provided by test frameworks (like Specs2) and sbt's build, which also reflectively discovers your defined tests (e.g. which class extends Specs2's test traits?). In theory, we already have a good chunk of what you'd need, because Test / fork := true creates a program called ForkMain and runs your tests in another JVM. What's missing from that is the dispatching of your defined tests.
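For context, forking is enabled in build.sbt like this (a minimal sketch; it only changes where tests run, it does not export a standalone runner):
// build.sbt
Test / fork := true  // run tests in a separate JVM via sbt's ForkMain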
Using specs2.run runner
Thankfully, Specs2 provides a runner out of the box called specs2.run (see "In the shell" in the Specs2 docs):
scala -cp ... specs2.run com.company.SpecName [argument1 argument2 ...]
So basically all you need to know is:
your classpath
list of fully qualified name for your defined tests
Here's how to get them using sbt:
> print Test/fullClasspath
* Attributed(/private/tmp/specs-runner/target/scala-2.13/test-classes)
* Attributed(/private/tmp/specs-runner/target/scala-2.13/classes)
* Attributed(/Users/eed3si9n/.coursier/cache/v1/https/repo1.maven.org/maven2/org/scala-lang/modules/scala-xml_2.13/1.2.0/scala-xml_2.13-1.2.0.jar)
...
> print Test/definedTests
* Test foo.HelloWorldSpec : subclass(false, org.specs2.specification.core.SpecificationStructure)
We can exercise specs2.run runner from sbt shell as follows:
> Test/runMain specs2.run foo.HelloWorldSpec
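Outside of sbt, the same invocation works with plain java, as long as the classpath entries from Test/fullClasspath above are supplied (paths here are illustrative):
java -cp target/scala-2.13/test-classes:target/scala-2.13/classes:<scala-library and specs2 jars> specs2.run foo.HelloWorldSpec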
Aggregating across subprojects
Aggregating tests across subprojects requires some thinking. Instead of creating a giant ball of assembly, I would recommend the following: create a dummy subproject testAgg, and then collect all the Test/externalDependencyClasspath and Test/packageBin into its target/dist. You can then grab all the JARs and run java -jar ... as you wanted.
How would one go about that programmatically? See Getting values from multiple scopes.
lazy val collectJars = taskKey[Seq[File]]("")
lazy val collectDefinedTests = taskKey[Seq[String]]("")
// look at the Test configuration in every project of the build
lazy val testFilter = ScopeFilter(inAnyProject, inConfigurations(Test))

lazy val testAgg = (project in file("testAgg"))
  .settings(
    name := "testAgg",
    publish / skip := true,
    // external dependency JARs and the packaged test classes from every subproject
    collectJars := {
      val cps = externalDependencyClasspath.all(testFilter).value.flatten.distinct
      val pkgs = packageBin.all(testFilter).value
      cps.map(_.data) ++ pkgs
    },
    // fully qualified names of all discovered test classes
    collectDefinedTests := {
      val dts = definedTests.all(testFilter).value.flatten
      dts.map(_.name)
    },
    // shell out to the specs2.run runner with the collected classpath and test names
    Test / test := {
      val jars = collectJars.value
      val tests = collectDefinedTests.value
      sys.process.Process(s"""java -cp ${jars.mkString(":")} specs2.run ${tests.mkString(" ")}""").!
    }
  )
This runs like this:
> testAgg/test
[info] HelloWorldSpec
[info]
[info] The 'Hello world' string should
[info] + contain 11 characters
[info] + start with 'Hello'
[info] + end with 'world'
[info]
[info]
[info] Total for specification HelloWorldSpec
[info] Finished in 124 ms
3 examples, 0 failure, 0 error
[info] testAgg / Test / test 1s
If you really want to, you could probably generate source from collectDefinedTests, make testAgg depend on the Test configurations of all subprojects, and try to make a giant ball of assembly, but I'll leave that as an exercise for the reader :)
I currently have a problem with recompile on code change with sbt.
I was following the sbt reference 'sbt by example'.
I installed sbt 1.2.8 and followed the instructions:
Create a minimum sbt build
$ mkdir foo-build
$ cd foo-build
$ touch build.sbt
Start sbt shell
$ sbt
[info] Loading global plugins from C:\Users\hce\.sbt\1.0\plugins
[info] Loading project definition from E:\learn\Scala\demo\foo-build\project
[info] Loading settings for project foo-build from build.sbt ...
[info] Set current project to foo-build (in build file:/E:/learn/Scala/demo/foo-build/)
[info] sbt server started at local:sbt-server-57c501e502d72a00d890
Recompile on code change (Note the ~ prefix before the compile command)
sbt:foo-build> ~compile
[success] Total time: 0 s, completed Jul 6, 2019 12:01:24 PM
1. Waiting for source changes in project foo-build... (press enter to interrupt)
Create a source file
Leave the previous command running. From a different shell or in your file manager create in the project directory the following nested directories: src/main/scala/example. Then, create Hello.scala in the example directory using your favorite editor as follows:
package example
object Hello extends App {
println("Hello")
}
This new file should be picked up by the running command. But it is not working on my system.
Expected Behaviour:
[info] Compiling 1 Scala source to /tmp/foo-build/target/scala-2.12/classes ...
[info] Done compiling.
[success] Total time: 2 s, completed May 6, 2018 3:53:42 PM
2. Waiting for source changes... (press enter to interrupt)
Here is some information about my environment:
$ java -version
java version "1.8.0_211"
Java(TM) SE Runtime Environment (build 1.8.0_211-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.211-b12, mixed mode)
$ sbt sbtVersion
[info] Loading global plugins from C:\Users\hce\.sbt\1.0\plugins
[info] Loading project definition from E:\learn\Scala\demo\foo-build\project
[info] Loading settings for project foo-build from build.sbt ...
[info] Set current project to foo-build (in build file:/E:/learn/Scala/demo/foo-build/)
[info] 1.2.8
$ systeminfo.exe | grep '^OS'
OS Name: Microsoft Windows 10 Enterprise LTSC
OS Version: 10.0.17763 N/A Build 17763
OS Manufacturer: Microsoft Corporation
OS Configuration: Standalone Workstation
OS Build Type: Multiprocessor Free
What I already tried:
reinstall sbt
try it with the Windows command line
try it with mingw64 bash
What am I missing to run the sbt ~compile command correctly?
I found the answer.
The instruction is misleading. It says to create the directories "in the project directory":
From a different shell or in your file manager create in the project directory the following nested directories: src/main/scala/example.
I read "the project directory" as foo-build/project/, but that leads to the problem described: foo-build/project/ is for build definition code.
If I put the src/main/scala/example directory directly under foo-build instead, it works.
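A minimal sketch of the fix, mirroring the commands from the tutorial (run from inside foo-build, not from foo-build/project):
$ cd foo-build
$ mkdir -p src/main/scala/example
$ touch src/main/scala/example/Hello.scala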
I should have executed the run command beforehand, which gives the 'No main class detected' error. That in turn would have helped me discover that the directory structure was incorrect, via the following Stack Overflow question: how to set main class in sbt project.
My mistake. Sorry for bothering you.
I was trying to use JaCoCo to integrate a test coverage report into my sbt project. https://github.com/sbt/jacoco4sbt
I added jacoco.settings into build.sbt
I also added addSbtPlugin("de.johoop" % "jacoco4sbt" % "2.1.6") into plugins.sbt
When I run sbt jacoco:check, it works fine. However, when I want to see which tasks jacoco provides, sbt tasks doesn't show anything related to jacoco.
I have to go to the source code to look them up.
https://github.com/sbt/jacoco4sbt/blob/master/src/main/scala/de/johoop/jacoco4sbt/Keys.scala
May I know why jacoco is not shown by the sbt tasks command, and what is the preferable way to look at all the available tasks of a plugin?
Edit:
I suspect that the statement lazy val Config = config("jacoco") extend(Test) hide means that the jacoco configuration extends Test, so it won't show up in sbt tasks, but I am not sure.
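For reference, the setup described above amounts to roughly this (a sketch; the exact wiring of jacoco.settings may vary with the plugin version):
// project/plugins.sbt
addSbtPlugin("de.johoop" % "jacoco4sbt" % "2.1.6")

// build.sbt
jacoco.settings  // adds the jacoco:* tasks (check, cover, ...) to the build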
By running
> tasks -v
(Edit: if that doesn't work, consider adding more "v"s, such as tasks -vvv, or even tasks -V, to see all the tasks.)
I see, for instance, cover:
This is a list of tasks defined for the current project.
It does not list the scopes the tasks are defined in; use the 'inspect' command for that.
Tasks produce values. Use the 'show' command to run the task and print the resulting value.
check Executes the tests and saves the execution data in 'jacoco.exec'.
classesToCover compiled classes (filtered by includes and excludes) that will be covered
clean Cleaning JaCoCo's output-directory.
compile Compiles sources.
console Starts the Scala interpreter with the project classes on the classpath.
consoleProject Starts the Scala interpreter with the sbt and the build definition on the classpath and useful imports.
consoleQuick Starts the Scala interpreter with the project dependencies on the classpath.
copyResources Copies resources to the output directory.
cover Executes the tests and creates a JaCoCo coverage report.
coveredSources Covered Sources.
Note also what it says at the beginning (wrapped for clarity):
It does not list the scopes the tasks are defined in;
use the 'inspect' command for that.
which leads to
> inspect cover
[info] No entry for key.
[info] Description:
[info] Executes the tests and creates a JaCoCo coverage report.
[info] Delegates:
[info] *:cover
[info] {.}/*:cover
[info] */*:cover
[info] Related:
[info] jacoco:cover
So you know to run jacoco:cover.