I have the following problem with sbt. From the sbt console I can get the value of the baseDirectory setting:
> baseDirectory
[info] /home/georginaumov/Documents/hello
>
I added a task in build.sbt:
lazy val printBaseDirectory: TaskKey[Unit] = TaskKey[Unit]("printBaseDirectory", "Print baseDirectory for the project", KeyRanks.ATask)
printBaseDirectory <<= streams map Tasks.printBaseDirectory
This is the code for the Tasks singleton object:
import sbt.Keys.TaskStreams
import sbt._
object Tasks {
def printBaseDirectory(streams: TaskStreams): Unit = {
streams.log.info("Here I want to print value of baseDirectory")
}
}
But I cannot get the value. I have tried many things; the core problem is that I cannot get a java.io.File out of an sbt.SettingKey[java.io.File].
How can I solve this?
I tried
printBaseDirectory <<= streams map Tasks.printBaseDirectory(baseDirectory)
and
def printBaseDirectory(baseDir: sbt.File)(streams: TaskStreams): Unit = {
streams.log.info("Here I want to print value of baseDirectory")
}
in the singleton object, but I get the following error:
error: type mismatch;
[error] Type error in expression
found : sbt.SettingKey[java.io.File]
required: sbt.File
(which expands to) java.io.File
Edit:
Many thanks to Martin. I wrote an article on my blog for people who run into a similar problem in the future.
I think what you want is to define Tasks as:
import sbt.Keys.TaskStreams
import sbt._
object Tasks {
def printBaseDirectory(streams: TaskStreams, dir: File): Unit = {
streams.log.info(dir.getAbsolutePath)
}
}
and in build.sbt:
lazy val printBaseDirectory: TaskKey[Unit] = TaskKey[Unit]("printBaseDirectory", "Print baseDirectory for the project", KeyRanks.ATask)
printBaseDirectory <<= (streams, baseDirectory) map Tasks.printBaseDirectory
Or:
lazy val printBaseDirectory: TaskKey[Unit] = TaskKey[Unit]("printBaseDirectory", "Print baseDirectory for the project", KeyRanks.ATask)
printBaseDirectory := {
Tasks.printBaseDirectory(streams.value, baseDirectory.value)
}
Using the .value macro is the preferred way.
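For completeness, here is a minimal sketch of the same task written entirely in build.sbt with the .value macro and no separate Tasks object (this assumes sbt 0.13 or later, where taskKey and .value are available):
lazy val printBaseDirectory = taskKey[Unit]("Print baseDirectory for the project")

printBaseDirectory := {
  // streams and baseDirectory are resolved with the .value macro
  streams.value.log.info(baseDirectory.value.getAbsolutePath)
}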
Related
If I define an SBT task key outside of my build.sbt file, as a Scala class in the project folder, how can I import that task?
So in ./project/MyTask.scala I have:
import sbt.Keys._
import sbt._
object MyTask {
lazy val uname = settingKey[String]("Your name")
lazy val printHi = taskKey[Unit]("print Hi")
printHi := { println(s"hi ${name.value}") }
}
Then in ./build.sbt I have:
import MyTask._
uname := "Joe"
Then when I run sbt printHi I get an error saying the task cannot be found. Running show uname does work, though. When I define printHi directly in build.sbt, without the object import, everything works as expected.
I need to somehow add this task to the build.sbt file. How can I do this?
The issue is that your expression printHi := { println(s"hi ${name.value}") } isn't associated with anything.
First off, everything in sbt is a transformation; in this case := overrides any previous definition of printHi with the body you give (println(s"hi ${name.value}")). But because that expression (which is a Setting[Task[Unit]]) is not associated with anything (for instance with a project, or assigned to a value that later gets attached to a project), it simply gets evaluated during the construction of the MyTask object and is then thrown away.
One way to fix this is to put that setting, printHi := println(s"hi ${name.value}"), in a Seq[Setting[_]] that you then pull into build.sbt:
project/MyTask.scala
import sbt._, Keys._
object MyTask {
val printHi = taskKey[Unit]("prints Hi")
val myTaskSettings = Seq[Setting[_]](
printHi := println(s"hi ${name.value}")
)
}
build.sbt
import MyTask._
myTaskSettings
Another way is to define MyTask as a mini plugin that lives in project/. You can see an example of this in PgpCommonSettings.
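As a rough sketch of that mini-plugin approach (assuming sbt 0.13.5+ or 1.x, where the AutoPlugin API is available; the file and object names here are illustrative), project/MyTaskPlugin.scala could look like this:
import sbt._, Keys._

object MyTaskPlugin extends AutoPlugin {
  // trigger on every project so the settings are contributed automatically
  override def trigger = allRequirements

  object autoImport {
    val printHi = taskKey[Unit]("prints Hi")
  }
  import autoImport._

  // settings added to each project the plugin is enabled on
  override def projectSettings: Seq[Setting[_]] = Seq(
    printHi := println(s"hi ${name.value}")
  )
}
With this in place, printHi is usable from build.sbt without any explicit import.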
I am trying to make my own custom CSV reader. I am using IntelliJ IDEA 14 with sbt and the specs2 test framework.
The class I declared in src/main is as follows:
import java.io.FileInputStream
import scala.io.Source
class CSVStream(filePath:String) {
val csvStream = Source.fromInputStream(new FileInputStream(filePath)).getLines()
val headers = csvStream.next().split("\\,", -1)
}
The content of the test file in src/test is as follows:
import org.specs2.mutable._
object CSVStreamSpec {
val csvSourcePath = getClass.getResource("/csv_source.csv").getPath
}
class CSVStreamSpec extends Specification {
import CSVStreamLib.CSVStreamSpec._
"The CSV Stream reader" should {
"Extract the header" in {
val csvSource = CSVStream(csvSourcePath)
}
}
}
The build.sbt file contains the following:
name := "csvStreamLib"
version := "1.0"
scalaVersion := "2.11.4"
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "2.4.15" % "test")
parallelExecution in Test := false
The error I am getting when I type test is as follows:
[error] /Users/raiyan/IdeaProjects/csvStreamLib/src/test/scala/csvStreamSpec.scala:18: not found: value CSVStream
[error] val csvSource = CSVStream(csvSourcePath)
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
[error] Total time: 23 s, completed 30-Dec-2014 07:44:46
How do I make the CSVStream class accessible to the CSVStreamSpec class in the test file?
Update:
I tried it with sbt on the command line. The result is the same.
You forgot the new keyword. Without it, the compiler looks for a companion object named CSVStream, not the class. Since there is none, it complains. Add new and it will work.
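For illustration, here is the one-line fix, plus an optional companion object (a sketch, not part of the original code) if you prefer to keep the CSVStream(...) call syntax:
// minimal fix: call the constructor explicitly
val csvSource = new CSVStream(csvSourcePath)

// alternative: add a companion object with an apply factory method
object CSVStream {
  def apply(filePath: String): CSVStream = new CSVStream(filePath)
}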
I am new to SBT and I have been trying to add a custom task to this build.
I have a simple build project:
import sbt._
import Keys._
object JsonBuild extends Build{
lazy val barTask = taskKey[Unit]("some simple task")
val afterTestTask1 = barTask := { println("tests ran!") }
val afterTestTask2 = barTask <<= barTask.dependsOn(test in Test)
lazy val myBarTask = taskKey[Unit]("some simple task")
//val afterMyBarTask1 = myBarTask := { println("tests ran!") }
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
//settings ++ Seq(afterMyBarTask2)
override lazy val settings = super.settings ++ Seq(afterMyBarTask2)
}
I keep getting the error:
References to undefined settings:
{.}/*:myBarTask from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
Did you mean test:test ?
I have googled around and I cannot find a solution.
Can you explain why it is not working?
lazy val myBarTask = taskKey[Unit]("some simple task")
override lazy val settings = super.settings ++ Seq(myBarTask := { (test in Test).value; println("tests ran!") } )
myBarTask is undefined at the point where you call dependsOn; you should define it before using it in dependsOn. Also, calling .value on a key (task or setting) is now the preferred way to depend on other keys. You can still use your version, but define myBarTask first.
This has been bothering me.
I did a bit more reading.
I think I know why the above code does not work.
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
When I write (myBarTask).dependsOn(test in Test), the project scope for test is chosen by SBT as ThisBuild.
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
ThisBuild project scope does not have the setting test in configuration Test.
Only projects have the setting test.
I think that setting is added to each project's settings by one of the default SBT plugins.
You can check in which scopes a setting exists by using the inspect command.
If you type the following in the SBT REPL:
inspect {.}/test:test
the output is:
[info] No entry for key.
SBT correctly suggests test:test, which is:
{file:/C:/Users/haques/Documents/workspace/SBT/jsonParser/}jsonparser/test:test
If the project is not specified in the project scope axis, SBT chooses the current project by default.
Every SBT project has its own project settings, and that is where test is defined.
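To illustrate (a sketch assuming sbt 0.13.x and the same Build.scala layout), attaching the task to a concrete project makes test:test resolve in that project's scope instead of in ThisBuild:
import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val myBarTask = taskKey[Unit]("some simple task")

  // the task lives in the project's settings, where test:test exists
  lazy val jsonParser = project.in(file(".")).settings(
    myBarTask := {
      (test in Test).value
      println("tests ran!")
    }
  )
}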
I'm writing an SBT Plugin that adds a Command and would like users to be able to configure this Command by setting variables in their build.sbt. What is the simplest way to achieve this?
Here is a simplified example of what the plugin looks like:
import sbt.Keys._
import sbt._
object MyPlugin extends Plugin {
override lazy val settings = Seq(commands += Command.args("mycommand", "myarg")(myCommand))
def myCommand = (state: State, args: Seq[String]) => {
//Logic for command...
state
}
}
I would like someone to be able to add the following to their build.sbt file:
newSetting := "light"
How do I make this available as a String variable from inside the myCommand Command above?
Take a look at the example here: http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
In this example, a task and setting are defined:
val newTask = TaskKey[Unit]("new-task")
val newSetting = SettingKey[String]("new-setting")
val newSettings = Seq(
newSetting := "test",
newTask <<= newSetting map { str => println(str) }
)
A user of your plugin could then provide their own value for the newSetting setting in their build.sbt:
newSetting := "light"
EDIT
Here's another example, closer to what you're going for:
Build.scala:
import sbt._
import Keys._
object HelloBuild extends Build {
val newSetting = SettingKey[String]("new-setting", "a new setting!")
val myTask = TaskKey[State]("my-task")
val mySettings = Seq(
newSetting := "default",
myTask <<= (state, newSetting) map { (state, newSetting) =>
println("newSetting: " + newSetting)
state
}
)
lazy val root =
Project(id = "hello",
base = file("."),
settings = Project.defaultSettings ++ mySettings)
}
With this configuration, you can run my-task at the sbt prompt, and you'll see newSetting: default printed to the console.
You can override this setting in build.sbt:
newSetting := "modified"
Now, when you run my-task at the sbt prompt, you'll see newSetting: modified printed to the console.
EDIT 2
Here's a stand-alone version of the example above: https://earldouglas.com/ext/stackoverflow.com/questions/17038663/
I've accepted #James's answer as it really helped me out. I moved away from using a Command in favour of a Task (see this mailing list thread). In the end my plugin looked something like this:
package packge.to.my.plugin
import sbt.Keys._
import sbt._
object MyPlugin extends Plugin {
import MyKeys._
object MyKeys {
val myTask = TaskKey[Unit]("runme", "This means you can run 'runme' in the SBT console")
val newSetting = SettingKey[String]("newSetting")
}
override lazy val settings = Seq (
newSetting := "light",
myTask <<= (state, newSetting) map myCommand
)
def myCommand(state: State, newSetting: String) {
//This code runs when the user types the "runme" command in the SBT console
//newSetting is "light" here unless the user overrides in their build.sbt (see below)
state.log.info(newSetting)
}
}
To override the newSetting in the build.sbt of a project that uses this plugin:
import packge.to.my.plugin.MyKeys._
newSetting := "Something else"
The missing import statement had me stuck for a while!
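For anyone who does want to stay with a Command rather than a Task, here is a hedged sketch (not from the original answer) of reading the setting from the State via Project.extract; the key names mirror the plugin above:
import sbt._
import sbt.Keys._

object MyCommandPlugin extends Plugin {
  val newSetting = SettingKey[String]("newSetting")

  override lazy val settings = Seq(
    newSetting := "light",
    commands += Command.args("mycommand", "myarg") { (state, args) =>
      // Project.extract gives access to the fully resolved settings
      val value = Project.extract(state).get(newSetting)
      state.log.info(s"newSetting is $value")
      state
    }
  )
}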
I would like to make my ScalaCheck property tests in my specs2 test suite deterministic, temporarily, to ease debugging. Right now, different values could be generated each time I re-run the test suite, which makes debugging frustrating, because you don't know if a change in observed behaviour is caused by your code changes, or just by different data being generated.
How can I do this? Is there an official way to set the random seed used by ScalaCheck?
I'm using sbt to run the test suite.
Bonus question: Is there an official way to print out the random seed used by ScalaCheck, so that you can reproduce even a non-deterministic test run?
If you're using pure ScalaCheck properties, you should be able to use the Test.Parameters class to change the java.util.Random instance which is used, and provide your own that always returns the same sequence of values:
def check(params: Test.Parameters, p: Prop): Test.Result
[updated]
I just published a new specs2-1.12.2-SNAPSHOT where you can use the following syntax to specify your random generator:
case class MyRandomGenerator() extends java.util.Random {
// implement a deterministic generator
}
"this is a specific property" ! prop { (a: Int, b: Int) =>
(a + b) must_== (b + a)
}.set(MyRandomGenerator(), minTestsOk -> 200, workers -> 3)
As a general rule, when testing on non-deterministic inputs you should try to echo or save those inputs somewhere when there's a failure.
If the data is small, you can include it in the label or error message that gets shown to the user; for example, in an xUnit-style test (forgive the pseudocode, I'm new to Scala syntax):
testLength(String x) {
assert(x.length > 10, "Length OK for '" + x + "'");
}
If the data is large, for example an auto-generated DB, you might either store it in a non-volatile location (e.g. /tmp with a timestamped name) or show the seed used to generate it.
The next step is important: take that value, or seed, or whatever, and add it to your deterministic regression tests, so that it gets checked every time from now on.
You say you want to make ScalaCheck deterministic "temporarily" to reproduce this issue; I say you've found a buggy edge-case which is well-suited to becoming a unit test (perhaps after some manual simplification).
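For instance, a minimal specs2 sketch (the input value and the property are illustrative) of pinning a previously falsified input as a plain, deterministic example:
import org.specs2.mutable.Specification

class ReverseRegressionSpec extends Specification {
  "reversing twice returns the original string for a previously falsified input" >> {
    val s = "\u0000"               // the value reported by ScalaCheck
    s.reverse.reverse must_== s    // now a fixed, repeatable check
  }
}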
Bonus question: Is there an official way to print out the random seed used by ScalaCheck, so that you can reproduce even a non-deterministic test run?
From specs2-scalacheck version 4.6.0 this is now the default behaviour:
Given the test file HelloSpec:
package example
import org.specs2.mutable.Specification
import org.specs2.ScalaCheck
class HelloSpec extends Specification with ScalaCheck {
s2"""
a simple property $ex1
"""
def ex1 = prop((s: String) => s.reverse.reverse must_== "")
}
build.sbt config:
import Dependencies._
ThisBuild / scalaVersion := "2.13.0"
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / organization := "com.example"
ThisBuild / organizationName := "example"
lazy val root = (project in file("."))
.settings(
name := "specs2-scalacheck",
libraryDependencies ++= Seq(
specs2Core,
specs2MatcherExtra,
specs2Scalacheck
).map(_ % "test")
)
project/Dependencies.scala:
import sbt._
object Dependencies {
lazy val specs2Core = "org.specs2" %% "specs2-core" % "4.6.0"
lazy val specs2MatcherExtra = "org.specs2" %% "specs2-matcher-extra" % specs2Core.revision
lazy val specs2Scalacheck = "org.specs2" %% "specs2-scalacheck" % specs2Core.revision
}
When you run the test from the sbt console:
sbt:specs2-scalacheck> testOnly example.HelloSpec
You get the following output:
[info] HelloSpec
[error] x a simple property
[error] Falsified after 2 passed tests.
[error] > ARG_0: "\u0000"
[error] > ARG_0_ORIGINAL: "猹"
[error] The seed is X5CS2sVlnffezQs-bN84NFokhAfmWS4kAg8_gJ6VFIP=
[error]
[error] > '' != '' (HelloSpec.scala:11)
[info] Total for specification HelloSpec
To reproduce that specific run (i.e. with the same seed), you can take the seed from the output and pass it on the command line via scalacheck.seed:
sbt:specs2-scalacheck> testOnly example.HelloSpec -- scalacheck.seed X5CS2sVlnffezQs-bN84NFokhAfmWS4kAg8_gJ6VFIP=
And this produces the same output as before.
You can also set the seed programmatically using setSeed:
def ex1 = prop((s: String) => s.reverse.reverse must_== "").setSeed("X5CS2sVlnffezQs-bN84NFokhAfmWS4kAg8_gJ6VFIP=")
Yet another way to provide the Seed is to pass an implicit Parameters instance where the seed is set:
package example
import org.specs2.mutable.Specification
import org.specs2.ScalaCheck
import org.scalacheck.rng.Seed
import org.specs2.scalacheck.Parameters
class HelloSpec extends Specification with ScalaCheck {
s2"""
a simple property $ex1
"""
implicit val params = Parameters(minTestsOk = 1000, seed = Seed.fromBase64("X5CS2sVlnffezQs-bN84NFokhAfmWS4kAg8_gJ6VFIP=").toOption)
def ex1 = prop((s: String) => s.reverse.reverse must_== "")
}
Here is the documentation about all those various ways.
This blog post also discusses it.
For scalacheck-1.12 this configuration worked:
new Test.Parameters {
override val rng = new scala.util.Random(seed)
}
For scalacheck-1.13 it doesn't work anymore since the rng method is removed. Any thoughts?
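One possible direction, offered as an assumption to verify rather than a confirmed fix: newer ScalaCheck versions (1.14+, as far as I know) let you fix the seed through Test.Parameters rather than a custom rng, roughly like this:
import org.scalacheck.{Prop, Test}
import org.scalacheck.rng.Seed

// withInitialSeed is assumed here; check that it exists in your ScalaCheck version
val params = Test.Parameters.default
  .withInitialSeed(Seed.fromBase64("X5CS2sVlnffezQs-bN84NFokhAfmWS4kAg8_gJ6VFIP=").get)

val prop = Prop.forAll { (a: Int, b: Int) => a + b == b + a }
Test.check(params, prop)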