I execute the command sbt -jvm-debug 9999 in the terminal and start a remote debug configuration with default values in IntelliJ 15.0.4-1. Next I execute the sbt run task, and breakpoints work as expected. When I execute the test task instead, debugging no longer works, even though the same code gets executed.
I am using the play-scala Activator seed with Play Framework 2.4. The tests are written in specs2.
Does anyone have an idea what I might be doing wrong?
Here is my code:
Class DebugTest.scala
object DebugTest {
  def helloWorld(): Unit = {
    println("Oh my")
  }
}
Class ApplicationSpec.scala
import org.specs2.mutable._
import play.api.test._
import play.api.test.Helpers._
class ApplicationSpec extends Specification {
  "Application" should {
    "just print oh my in console" in new WithApplication {
      DebugTest.helloWorld()
    }
  }
}
File build.sbt
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  jdbc,
  ws,
  specs2 % Test,
  "org.webjars.bower" % "adminlte" % "2.3.3",
  "org.pac4j" % "play-pac4j" % "2.2.0-SNAPSHOT",
  "org.pac4j" % "pac4j-http" % "1.9.0-SNAPSHOT",
  "com.typesafe.play" % "play-cache_2.11" % "2.4.6"
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
resolvers += "Sonatype snapshots repository" at "https://oss.sonatype.org/content/repositories/snapshots/"
routesGenerator := InjectedRoutesGenerator
fork in run := true
fork in test := false
File test.sbt
fork in test := false
The Play sbt plugin defines the following setting:
fork in Test := true
So, when you launch your tests, a different JVM is started (without remote debugging).
You just have to add in your build.sbt:
fork in Test := false
You could even create a test.sbt file containing only that line and exclude it from your source control.
This should only be used while debugging. Afterwards, revert to the default behavior; otherwise you can get unexpected results when launching tests multiple times in the same sbt session.
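Alternatively, if you want to keep forking enabled, a possible workaround (a sketch, not part of the original answer) is to pass the JDWP agent options to the forked test JVM and attach the debugger to that port instead:
// keep forking, but let the forked test JVM listen for a debugger
// (port 5005 is an arbitrary choice)
fork in Test := true
javaOptions in Test += "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"
With suspend=y the forked JVM waits for a remote debugger (e.g. an IntelliJ remote configuration pointed at port 5005) to attach before the tests run.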
Related
When I try to execute the jar generated by my Scala project from the command line, I get the following problem:
Exception in thread "main" java.lang.NoClassDefFoundError: org/rogach/scallop/ScallopConf
In the dependencies file, however, I declared the scallop dependency as follows:
import sbt._
object Dependencies {
  lazy val betterFiles = "com.github.pathikrit" %% "better-files" % "3.7.0"
  lazy val scalaz = "org.scalaz" %% "scalaz-core" % "7.2.27"
  lazy val scallop = "org.rogach" %% "scallop" % "3.1.5"
  // -- Logging
  lazy val scalaLogging = "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
  lazy val slf4jBackend = "org.slf4j" % "slf4j-simple" % "1.7.26"
  // -- Testing
  lazy val scalaTest = "org.scalatest" %% "scalatest" % "3.0.5"
}
My build.sbt file is the following:
import Dependencies._
ThisBuild / scalaVersion := "2.12.5"
ThisBuild / sbtVersion := "1.2.6"
ThisBuild / version := "0.1.0-SNAPSHOT"
lazy val root = (project in file("."))
  .settings(
    name := "phenix-challenge",
    libraryDependencies ++= Seq(
      betterFiles,
      scalaz,
      scallop,
      scalaLogging,
      slf4jBackend % Runtime,
      scalaTest % Test
    )
  )
If you have an idea that could resolve my issue, please help!
Many thanks in advance
To execute the jar generated by your Scala project from the command line, you can use an sbt plugin to assemble a fat jar that includes your libraries/dependencies. With such a jar you can run your app via java -jar ...
There are several sbt plugins for building a fat jar. Perhaps the easiest one is sbt-assembly.
Add this plugin to the file project/plugins.sbt (create it if needed):
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")
Now use sbt to build a fat-jar:
sbt assembly
Then run the resulting jar via java -jar <path-to-assembled-jar>.
Another option is to run the main class with plain sbt using the command sbt run; then you don't really need to build a fat jar.
We have a standard SBT project with the standard directory structure.
The test folder contains all the JUnit tests.
We introduced a new folder, othertests, which is a peer of the test folder.
The othertests folder contains some special test cases. The test class names in othertests can easily be distinguished from the normal JUnit tests.
Below is my working build.sbt configuration.
I have added the othertests folder to the test sources using the javaSource in Test setting.
We are using activator to run tests and do other stuff.
When I run activator> test I want only the tests from the test folder to run, and I want to run the tests from the othertests folder separately.
Questions.
How do I override the behavior of the test task to filter out tests from the othertests folder?
Should I create separate modules to run the normal JUnit tests and the other JUnit tests separately?
Below is my build.sbt configuration
import java.io.PrintStream
import play.sbt.PlayJava
import play.twirl.sbt.Import._
name := "Service"
version := "5.1.0"
scalaVersion := "2.11.8"
routesGenerator := InjectedRoutesGenerator
lazy val ContractTest = config("contract") extend(Test)
def contractTestFilter(name: String): Boolean = name endsWith "ContractTest"
def ignoreContractTest(name: String): Boolean = !contractTestFilter(name)
lazy val root = (project in file("."))
  .enablePlugins(PlayJava)
  .configs(ContractTest)
  .settings(
    inConfig(ContractTest)(Defaults.testTasks),
    javaSource in Test := { (baseDirectory in Test)(_ / "contracttest") }.value,
    testOptions in ContractTest := Seq(Tests.Filter(contractTestFilter), Tests.Argument(TestFrameworks.JUnit, "-q", "-v", "-s")),
    testOptions in Test := Seq(Tests.Filter(ignoreContractTest), Tests.Argument(TestFrameworks.JUnit, "-q", "-v", "-s"))
  )
lazy val xyz = taskKey[Unit]("custom task to create loglayout jar file under lib folder")
xyz := {
  LogLayoutJar.build(scalaBinaryVersion.value, streams.value.log)
}
run := (run in Runtime).dependsOn(xyz).evaluated
javaOptions in Test ++= Seq("-Dconfig.resource=applic.tt.cnf")
libraryDependencies ++= Seq(
  json,
  javaWs,
  "org.mockito" % "mockito-all" % "1.10.19" % Test,
  "org.scalatestplus.play" %% "scalatestplus-play" % "1.5.1" % Test,
  "org.easytesting" % "fest-assert" % "1.4" % Test,
  "org.scalactic" %% "scalactic" % "2.2.0",
  "org.jmockit" % "jmockit" % "1.9" % Test,
  "com.portingle" % "slf4jtesting" % "1.0.0" % Test,
  "org.scalacheck" %% "scalacheck" % "1.12.6" % Test
)
resolvers ++= Seq(
  "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
)
parallelExecution in Test := runningTestInParallel
testGrouping in Test := groupByModule((externalDependencyClasspath in Test).value,
  (definedTests in Test).value, streams.value.log)
javacOptions ++= Seq(
  "-Xlint:unchecked",
  "-Xlint:deprecation"
)
scalacOptions ++= Seq(
  "-feature",
  "-language:implicitConversions",
  "-deprecation"
)
// Custom tasks //
val silenceSystemErr = taskKey[Unit]("Replaces System.err with a PrintStream to nowhere.")
silenceSystemErr := {
  System.setErr(new PrintStream(new DevNull))
  println("Messages to System.err will not be printed.")
}
val restoreSystemErr = taskKey[Unit]("Restores the original System.err")
restoreSystemErr := {
  System.setErr(systemErr)
  println("Messages to System.err will be printed.")
}
From Jenkins we run the tests using the command below:
bin/activator -Dsbt.log.noformat=true -Denvironment_name=test -DsuppressLogging=true clean silenceSystemErr jacoco:cover
Thank You
Rakesh
1.
You're contradicting yourself. Why add the folder to javaSource in Test if you don't want those tests to run when you run the test task?
What you should do is create an additional test configuration (see http://www.scala-sbt.org/0.13/docs/Testing.html#Additional+test+configurations) that extends Test and runs only the tests inside your othertests folder.
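A rough sketch of that approach (sbt 0.13 style, assuming othertests sits at the project root like Play's test folder; names and paths are illustrative, adjust them to your build):
// a configuration that extends Test but compiles and runs only the special tests
lazy val OtherTests = config("othertests") extend Test

lazy val root = (project in file("."))
  .enablePlugins(PlayJava)
  .configs(OtherTests)
  .settings(
    // give the new configuration its own compile/test settings
    inConfig(OtherTests)(Defaults.testSettings),
    // compile the special tests from the othertests folder
    javaSource in OtherTests := baseDirectory.value / "othertests"
  )
Then activator> test keeps running only the tests under test (once othertests is no longer added to javaSource in Test), while activator> othertests:test runs the special ones.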
2.
You can create another module. I personally don't like this idea, because then I have to name the module according to what it tests, and I end up with two separate modules that should really be just one.
A separate test module might be a good idea if your tests have dependencies that slow down the overall build time of the module. For example, imagine a Java project with Gatling performance tests. If the performance tests are in the same module, then whenever I rebuild it, the Gatling tests are rebuilt too, and they require the Scala compiler, which is slower.
Some people can live with this; I'm one of them. I prefer to have the tests in the same project and possibly pay a time penalty when rebuilding the module, which rarely happens. I create different test configurations when needed.
Another reason to choose separate modules for tests is when your tests depend on multiple modules and you don't want that module dependency at compile time for the production code.
Some IDEs and/or languages might encourage the use of a separate module for tests, as I understand is the case for C# and Visual Studio (but I might be wrong here, so don't take my word for it).
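If you do go the separate-module route, a minimal sketch (module and folder names are illustrative, not taken from your build) could look like this:
// the main module with the production code and the normal JUnit tests
lazy val core = (project in file("core"))

// a peer module holding only the special tests; it depends on core,
// but core never depends on it, so core's build stays unaffected
lazy val otherTests = (project in file("other-tests"))
  .dependsOn(core)
  .settings(
    libraryDependencies += "junit" % "junit" % "4.12" % Test
  )
Then core/test runs only the normal tests and otherTests/test runs only the special ones.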
For testing, I'm using an in-memory NIO FileSystem implementation (memoryfs). I've taken advantage of it before, and it seems to run fine through e.g. Maven.
However, now, in an SBT project, it's impossible to initialize a new FileSystem.
Here's a minimal SBT configuration to reproduce the problem:
import sbt._
import Keys._
name := "testfs"
organization := "com.example
version := "0.1-SNAPSHOT"
scalaVersion := "2.11.6"
libraryDependencies ++= {
  val scalaTestVersion = "2.2.5"
  Seq(
    "org.scalatest" %% "scalatest" % scalaTestVersion % "test",
    "org.mockito" % "mockito-core" % "1.10.19" % "test",
    "de.pfabulist.lindwurm" % "memoryfs" % "0.28.3" % "test"
  )
}
And here's a test:
import de.pfabulist.lindwurm.memory.MemoryFSBuilder
import org.scalatest.{FlatSpec, MustMatchers}
class FsDummySpec extends FlatSpec with MustMatchers {
  it must "init the FS" in {
    new MemoryFSBuilder().watchService(true).name("testFs").build() // init here
  }
}
Running sbt test will result in:
[info] FsDummySpec:
[info] - must init the FS *** FAILED ***
[info] java.nio.file.ProviderNotFoundException: Provider "memoryfs" not found
[info] at java.nio.file.FileSystems.getFileSystem(FileSystems.java:224)
[info] at de.pfabulist.kleinod.paths.Pathss.getOrCreate(Pathss.java:76)
Here's the thing: this should run without any problems. My question is: why doesn't it, and how do I fix it?
Glancing over the custom FS provider docs, it looks like SBT borks the classpath somehow, but it's hard to say why.
Note: interestingly enough, IntelliJ IDEA's test runner seems to work without a hitch; the problem only occurs on the command line (in "SBT proper").
The comment by openCage hinted at the solution.
It turns out custom file systems do require an additional element, i.e. a service provider definition file located in META-INF/services.
If you use a custom NIO FileSystem, you need to make that provider definition file available in the test classpath.
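For reference, such a provider definition file is just a plain-text file on the classpath, named after the SPI interface and listing the implementing class, one fully qualified name per line. The class name below is a placeholder, not the actual one shipped by memoryfs:
# META-INF/services/java.nio.file.spi.FileSystemProvider
com.example.memoryfs.MemoryFSProvider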
The simplest way is probably just to fork the test VM, i.e. add the following to your build.sbt:
fork in Test := true
I'm trying to use JavaAppPackaging from sbt-native-packager. My understanding is that when I run:
sbt stage
I should get a directory target/universal/stage/bin with some startup scripts. Right now I only get lib, which contains my jar and its dependencies.
Here's the relevant parts of my build.sbt:
val scalatra = "org.scalatra" %% "scalatra" % "2.3.1"
enablePlugins(JavaAppPackaging)
lazy val root = (project in file(".")).
  settings(
    name := "myapp",
    version := "0.2",
    scalaVersion := "2.11.6",
    libraryDependencies += scalatra
  )
Also, my plugins.sbt has this:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
I'm using sbt 0.13.8.
So why don't I get the startup scripts, what am I missing?
You need to make sure sbt finds a main class for the script.
This means making sure you have a main entry point: an object that extends App, or one that defines def main(args: Array[String]): Unit.
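For illustration, either form works (the object names here are just placeholders):
// option 1: extend App
object Main extends App {
  println("starting the service")
}

// option 2: define an explicit main method
object AltMain {
  def main(args: Array[String]): Unit = {
    println("starting the service")
  }
}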
Otherwise try setting mainClass, like so:
mainClass in Compile := Some("JettyLauncher")
Try setting the main class without any scopes: mainClass := Some("full.path.to.MainApp")
I am building a simple Scala project with SBT 0.11.
All the code files are in ~/MyProject/src/main/scala/
~/MyProject/build.sbt is the following
name := "MyProject"
version := "1.0"
scalaVersion := "2.9.1"
libraryDependencies ++= Seq(
  "mysql" % "mysql-connector-java" % "5.1.+",
  "c3p0" % "c3p0" % "0.9.1.2",
  "org.apache.commons" % "commons-lang3" % "3.0.1",
  "commons-lang" % "commons-lang" % "2.6",
  "javassist" % "javassist" % "3.12.1.GA"
)
~/MyProject/project/Build.scala is the following
import sbt._
object MyProjectBuild extends Build {
  lazy val MyProject = Project("MyProject", file("."))
}
This seems to work almost fine. The project compiles and runs. The project name is set correctly (if I don't use Build.scala, the name appears as something like "default-????", despite it being specified in build.sbt).
But the problem is that the dependencies do not seem to work: the update command doesn't download anything. How can I fix this? Do I need to specify my dependencies in Build.scala rather than in build.sbt in this case?
Is it possible that you've already retrieved the project dependencies, but don't realize it because they are stored in the Ivy cache? You can view the managed classpath from the SBT console with the command
show managed-classpath
Recent versions of SBT do not store the managed dependencies in the project directory, unless the project is configured to do so. If you want, you can add the following to your build.sbt file:
retrieveManaged := true
This will create a ~/MyProject/lib_managed/ directory and populate it with your dependencies.
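You don't have to move the dependencies into Build.scala: settings in build.sbt are appended to the project defined in Build.scala for the same base directory. If you prefer to keep everything in one place, though, a sketch in the sbt 0.11 full-configuration style would be (only part of the dependency list is repeated here):
import sbt._
import Keys._

object MyProjectBuild extends Build {
  lazy val MyProject = Project(
    "MyProject",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(
      name := "MyProject",
      version := "1.0",
      scalaVersion := "2.9.1",
      libraryDependencies ++= Seq(
        "mysql" % "mysql-connector-java" % "5.1.+",
        "commons-lang" % "commons-lang" % "2.6"
      )
    )
  )
}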