How to check if sbt in test context? - scala

How can an application tell if it's running under an 'sbt test' context? Is there a system property that can be checked?

There are probably different ways. I found that the following works (in build.sbt):
testOptions += Tests.Setup(_ => sys.props("testing") = "true")
Then you can simply check sys.props.get("testing") in your class.
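For example, a check in application code might look roughly like this (a minimal sketch; the RuntimeContext name is purely illustrative, and only the "testing" property comes from the setup above):

object RuntimeContext {
  // true when running under `sbt test`, relying on the "testing"
  // system property set by the Tests.Setup hook above
  def isUnderSbtTest: Boolean = sys.props.get("testing").contains("true")
}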

Related

How to run Test Suites sequentially in ScalaTest / SBT?
For example, if I have the test suites A, B and C, I want to make sure that the tests of A are run first, then the ones of B, then the ones of C.
Is there any configuration I can set in ScalaTest or SBT to do this?
Thank you.
Try using parallelExecution in Test := false.
According to the documentation (http://doc.scalatest.org/1.7/org/scalatest/Suite.html), you need to create your own test suite, like the following:
FirstTest.scala
import org.scalatest.{DoNotDiscover, FunSuite}

@DoNotDiscover
class FirstTest extends FunSuite {
  test("first test") {
    assert(1 == 1)
  }
}
SecondTest.scala
import org.scalatest.{DoNotDiscover, FunSuite}

@DoNotDiscover
class SecondTest extends FunSuite {
  test("second test") {
    assert(2 == 2)
  }
}
MainTest.scala
import org.scalatest.Suites

class MainTest extends Suites(new FirstTest, new SecondTest)
Now, if you run sbt test, it works properly.
Note: the @DoNotDiscover annotation is mandatory. It prevents unexpected behavior such as FirstTest and SecondTest being discovered and run again on their own, in addition to being run as part of MainTest.
I hope it was helpful
As raj mehra said, the solution is to configure the tests not to run in parallel:
Test / parallelExecution := false
The former syntax, parallelExecution in Test := false, is deprecated.
Here is the documentation that explains it: SBT Parallel Execution.
From it:
As before, parallelExecution in Test controls whether tests are mapped to separate tasks.

How to disable "Slow" tagged Scalatests by default, allow execution with option?

I want to disable certain automated tests tagged as "Slow" by default but allow the user to enable their execution with a simple command line. I imagine this is a very common use case.
Given this test suite:
import org.scalatest.FunSuite
import org.scalatest.tagobjects.Slow

class DemoTestSuite extends FunSuite {

  test("demo test tagged as slow", Slow) {
    assert(1 + 1 === 2)
  }

  test("demo untagged test") {
    assert(1 + 1 === 2)
  }
}
By default, sbt test will run both tagged and untagged tests.
If I add the following to my build.sbt:
testOptions in Test += Tests.Argument("-l", "org.scalatest.tags.Slow")
Then I get my desired default behavior where untagged tests run and the Slow tagged test will not run.
However, I can't figure out a command line option that will run the slow tests when I want to run them. I've done several searches and tried several examples. I'm somewhat surprised as this seems like a very common scenario.
I had a similar issue: I wanted to have tests that are disabled by default, but run in the release process. I solved it by creating a custom test configuration and setting testOptions in different scopes. So adapting this solution to your case, it should be something along these lines (in your build.sbt):
lazy val Slow = config("slow").extend(Test)
configs(Slow)
inConfig(Slow)(Defaults.testTasks)
Now by default exclude slow tests:
testOptions in Test += Tests.Argument("-l", "org.scalatest.tags.Slow")
But in the Slow scope don't exclude them and run only them:
testOptions in Slow -= Tests.Argument("-l", "org.scalatest.tags.Slow")
testOptions in Slow += Tests.Argument("-n", "org.scalatest.tags.Slow")
Now when you run test in sbt, it will run everything except the slow tests, and when you run slow:test it will run only the slow tests.
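Putting the pieces together, a build.sbt for a recent sbt (1.x) might look roughly like this; the slash syntax is just the newer spelling of in, and the whole block is a sketch rather than a verified drop-in:

lazy val Slow = config("slow").extend(Test)

lazy val root = (project in file("."))
  .configs(Slow)
  .settings(
    inConfig(Slow)(Defaults.testTasks),
    // by default, exclude Slow-tagged tests
    Test / testOptions += Tests.Argument("-l", "org.scalatest.tags.Slow"),
    // in the Slow scope, drop the exclusion and run only the Slow-tagged tests
    Slow / testOptions -= Tests.Argument("-l", "org.scalatest.tags.Slow"),
    Slow / testOptions += Tests.Argument("-n", "org.scalatest.tags.Slow")
  )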

How to pass scalacOptions (Xelide-below) to sbt via command line

I am trying to call sbt assembly from the command line, passing it a scalac compiler flag for eliding code (-Xelide-below 1).
I have managed to get the flag working by adding this line to build.sbt:
scalacOptions ++= Seq("-Xelide-below", "1")
It also works fine when I start sbt and run the following:
$> sbt
$> set scalacOptions in ThisBuild ++= Seq("-Xelide-below", "0")
But I would like to know how to pass this in when starting sbt, so that my CI jobs can use it while building different assembly targets (i.e. dev/test/prod).
One way to pass the elide level as a command line option is to use system properties
scalacOptions ++= Seq("-Xelide-below", sys.props.getOrElse("elide.below", "0"))
and run sbt -Delide.below=20 assembly. Quick, dirty and easy.
Another more verbose way to accomplish the same thing is to define different commands for producing test/prod artifacts.
lazy val elideLevel = settingKey[Int]("elide code below this level.")

elideLevel in Global := 0

scalacOptions ++= Seq("-Xelide-below", elideLevel.value.toString)

def assemblyCommand(name: String, level: Int) =
  Command.command(s"${name}Assembly") { s =>
    s"set elideLevel in Global := $level" ::
      "assembly" ::
      s"set elideLevel in Global := 0" ::
      s
  }

commands += assemblyCommand("test", 10)
commands += assemblyCommand("prod", 1000)
and you can run sbt testAssembly prodAssembly. This buys you a cleaner command name in combination with the fact that you don't have to exit an active sbt-shell session to call for example testAssembly. My sbt-shell sessions tend to live for a long time so I personally prefer the second option.
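For context, -Xelide-below removes call sites of methods annotated with scala.annotation.elidable whose level is below the given threshold, which is what makes per-target elide levels useful in the first place. A small illustrative sketch (the Log object is made up; FINE is the standard level 500):

import scala.annotation.elidable
import scala.annotation.elidable._

object Log {
  // with e.g. -Xelide-below 501 (anything above FINE = 500),
  // calls to this method are removed at compile time
  @elidable(FINE) def debug(msg: String): Unit = println(msg)
}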

Custom run task for subproject with arguments from build.sbt?

I have a subproject named oppenheimer in my project. It's very simple to run this project from the sbt console.
[myproject] $ oppenheimer/run
I can also pass in a command line argument as such:
[myproject] $ oppenheimer/run migrate
[myproject] $ oppenheimer/run clean
How can I do this from build.sbt? Is it possible to define a task that does this? It would suffice to have something like this:
val customMigrate = ...
val customClean = ...
And this is so that I can use it elsewhere in the project, like so:
(test in Test) <<= (test in Test).dependsOn(customMigrate)
The answer is given in the sbt FAQ section "How can I create a custom run task, in addition to run?". Basically:
lazy val customMigrate = taskKey[Unit]("custom run task")
fullRunTask(customMigrate, Test, "foo.Main", "migrate")
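To also get the clean variant and hook the migration into your tests (as asked), something along these lines should work; foo.Main stands in for oppenheimer's actual main class, and the := / .value form replaces the now-deprecated <<= syntax. Treat it as a sketch:

lazy val customClean = taskKey[Unit]("custom clean-run task")
fullRunTask(customClean, Test, "foo.Main", "clean")

// run the migration before the tests
test in Test := (test in Test).dependsOn(customMigrate).value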

How can I get automatic dependency resolution in my scala scripts?

I'm just learning scala coming out of the groovy/java world. My first script requires a 3rd party library TagSoup for XML/HTML parsing, and I'm loath to have to add it the old school way: that is, downloading TagSoup from its developer website, and then adding it to the class path.
Is there a way to resolve third party libraries in my scala scripts? I'm thinking Ivy, I'm thinking Grape.
Ideas?
The answer that worked best for me was to install conscript:
curl https://raw.github.com/n8han/conscript/master/setup.sh | sh
cs harrah/xsbt --branch v0.11.0
Then I could import TagSoup fairly easily in example.scala:
/***
libraryDependencies ++= Seq(
  "org.ccil.cowan.tagsoup" % "tagsoup" % "1.2.1"
)
*/

def getLocation(address: String) = {
  ...
}
And run using scalas:
scalas example.scala
Thanks for the help!
While the answer is sbt, it could have been more helpful where scripts are concerned. sbt has special support for scripts, as described here. Once you get scalas installed, either by installing conscript and then running cs harrah/xsbt --branch v0.11.0, or simply by writing it yourself more or less like this:
#!/bin/sh
java -Dsbt.main.class=sbt.ScriptMain \
  -Dsbt.boot.directory=/home/user/.sbt/boot \
  -jar sbt-launch.jar "$@"
Then you can write your script like this:
#!/usr/bin/env scalas
!#
/***
scalaVersion := "2.9.1"

libraryDependencies ++= Seq(
  "net.databinder" %% "dispatch-twitter" % "0.8.3",
  "net.databinder" %% "dispatch-http" % "0.8.3"
)
*/

import dispatch.{ json, Http, Request }
import dispatch.twitter.Search
import json.{ Js, JsObject }

def process(param: JsObject) = {
  val Search.text(txt) = param
  val Search.from_user(usr) = param
  val Search.created_at(time) = param
  "(" + time + ")" + usr + ": " + txt
}

Http.x((Search("#scala") lang "en") ~> (_ map process foreach println))
You may also be interested in paulp's xsbtscript, which creates an xsbtscript shell that does the same thing as scalas (I guess the latter was based on the former), with the advantage that, without either conscript or sbt installed, you can get it ready with this:
curl https://raw.github.com/paulp/xsbtscript/master/setup.sh | sh
Note that it installs sbt and conscript.
And there's also paulp's sbt-extras, which is an alternative "sbt" command line, with more options. Note that it's still sbt, just the shell script that starts it is more intelligent.
SBT (Simple Build Tool) seems to be the build tool of choice in the Scala world. It supports a number of different dependency resolution mechanisms: https://github.com/harrah/xsbt/wiki/Library-Management
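For a regular (non-script) sbt project, declaring a third-party dependency is a one-liner in build.sbt; a minimal sketch using the TagSoup coordinates from the answer above (the Scala version is only an example):

// build.sbt
scalaVersion := "2.11.0"

libraryDependencies += "org.ccil.cowan.tagsoup" % "tagsoup" % "1.2.1"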
Posted as an answer because it doesn't fit in the comment length constraint.
In addition to @Chris's answer, I would like to recommend some companion tools for sbt (which I personally think is absolutely superb). Although sbt denotes Simple Build Tool, it is sometimes not so easy for first-timers to set up a project with sbt (all these things with layouts, configs, and so on).
Use giter8 (g8) to create a new project from a predefined template (which g8 fetches from github.com). There are templates for Android apps, Unfiltered and more. Sometimes they include some of the dependencies by default.
To create the layout, just type:
g8 gseitz/android-sbt-project
(an example for an Android app)
Alternatively, use the np plugin for sbt, which provides an interactive, type-through way to create a new project and basic layout.
A corrected and simplified version of the current main answer: use scalas.
You have to compose your script of three parts. One is sbt itself, another is a very simple wrapper around sbt called scalas, and the last is your custom script. Note that the first two can be installed either globally (/usr/bin/, ~/bin/) or locally (in the same directory).
The first part is sbt. If you already have it installed, good. If not, you can either install it or use a very cool script from paulp: https://github.com/paulp/sbt-extras/blob/master/sbt BTW, that script is a charming way to use sbt on Unix, although it is not available on Windows. Anyway...
The second part is scalas. It's just an entry point to sbt.
#!/bin/sh
exec /path/to/sbt -Dsbt.main.class=sbt.ScriptMain -sbt-create \
  -Dsbt.boot.directory=$HOME/.sbt/boot \
  "$@"
the last part is your custom script. Example:
#!/usr/bin/env scalas
/***
scalaVersion := "2.11.0"

libraryDependencies ++= Seq(
  "org.joda" % "joda-convert" % "1.5",
  "joda-time" % "joda-time" % "2.3"
)
*/

import org.joda.time._

println(DateTime.now())
// println(DateTime.now().minusHours(12).dayOfMonth())
What Daniel said. Although it's worth mentioning that the sbt docs carefully label this functionality "experimental".
Indeed, if you try to run the embedded script with scalaVersion := "2.10.3", you'll get not found: value !#
Luckily, the !# script header-closer is unnecessary here, so you can leave it out.
Under scalaVersion := "2.10.3", the script will need to have the file extension ".scala"; using the bash shell script file extension, ".sh", won't work.
Also, it isn't clear to me that the latest version of Dispatch (0.11.0) supports dispatch-twitter, which is used in the example.
For more about header-closers in this context, see Alvin Alexander's blog post on Scala scripting, or section 14.10 of his Scala Cookbook.
I have a build.gradle file with the following task:
task classpath(dependsOn: jar) << {
  println "CLASSPATH=${tasks.jar.archivePath}:${configurations.runtime.asPath}"
}
Then, in my Scala script:
#!
script_dir=$(cd $(dirname "$0") >/dev/null; pwd -P)
classpath=$(cd ${script_dir} && ./gradlew classpath | grep '^CLASSPATH=' | sed -e 's|^CLASSPATH=||')
PATH=${SCALA_HOME}/bin:${PATH}
JAVA_OPTS="-Xmx4g -XX:MaxPermSize=1g" exec scala -classpath ${classpath} "$0" "$0" "$@"
!#
Note that we don't need a separate scalas executable in our PATH, since we can use the self-executing shell script trick.
Here's an example script, which reads its own content (via the $0 variable), chops off everything before an arbitrary marker (__BEGIN_SCRIPT__) and runs sbt on the result. We use process substitution to pretend this calculated content is a real file. One problem with this approach is that sbt will seek within the given file, i.e. it doesn't read it sequentially. That stops it working with the <(foo) form of process substitution, as found in bash; however zsh has a =(foo) form which is seekable.
#!/usr/bin/env zsh
set -e
# Find the line # in this file ($0) after the line beginning __BEGIN_SCRIPT__
LINENUM=$(awk '/^__BEGIN_SCRIPT__/ {print NR + 1; exit 0; }' "$0")
sbtRun() {
  # Run the sbt command, such that it will fetch dependencies and execute a
  # script
  sbt -Dsbt.main.class=sbt.ScriptMain \
      -sbt-create \
      -Dsbt.boot.directory="$HOME/.sbt/boot" \
      "$@"
}
# Run SBT on the contents of this file, starting at LINENUM
sbtRun =(tail -n+"$LINENUM" "$0")
exit 0
__BEGIN_SCRIPT__
/***
scalaVersion := "2.11.0"

libraryDependencies ++= Seq(
  "org.joda" % "joda-convert" % "1.5",
  "joda-time" % "joda-time" % "2.3"
)
*/
import org.joda.time._
println(DateTime.now())