I want to build a subproject of Spark with sbt. I found this example, and it works:
$ ./build/sbt -Phive -Phive-thriftserver (build)
sbt (spark)> project hive (switch to subproject)
sbt (hive)> testOnly *.HiveQuerySuite -- -t foo (run test case)
However, when I try the following, it does not build; it just quits:
./build/sbt -mllib
I do not know how the author figured out -Phive -Phive-thriftserver. I cannot find these flags anywhere in the Spark source code.
I just want to do the exact same thing as the example but with a different subproject.
To be clear, this is not a question about using the projects command to print out all available projects.
Specify the project scope:
./build/sbt mllib/compile
Refer to http://www.scala-sbt.org/0.13/docs/Scopes.html for details on scopes.
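Note that -Phive and -Phive-thriftserver are Maven profiles defined in Spark's pom.xml; Spark's build/sbt script passes -P flags through to enable those profiles, which is why ./build/sbt -mllib quits: mllib is an sbt subproject, not a profile. To mirror the original hive workflow with mllib, something like this should work (the suite name is a hypothetical placeholder):

$ ./build/sbt (build)
sbt (spark)> project mllib (switch to subproject)
sbt (mllib)> testOnly *.KMeansSuite -- -t foo (run test case)

Or, in one shot from the command line:

$ ./build/sbt "mllib/testOnly *.KMeansSuite"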
Related
Is there a way to build/compile all configurations at once? I have a project with a Dev configuration in addition to the default Compile and Test configurations, and I am looking for a command or a setting in my build.sbt that would allow me to compile/package all three configurations at once.
Basically, I am looking for a way to avoid having to run these three commands to build the entire source tree:
sbt compile
sbt dev:compile
sbt test:compile
When I build the project from IntelliJ, sbt does this automatically, but I am looking for a way to do it from the command line.
First, you can run multiple tasks with a single sbt invocation:
sbt compile dev:compile test:compile
Second, you could define an alias in your build which does what you want:
addCommandAlias("compileAll", "; compile; dev:compile; test:compile")
Then, just run sbt compileAll.
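For completeness, here is a minimal build.sbt sketch of how such a Dev configuration and the alias could be wired together (the configuration name and wiring are illustrative assumptions, not taken from the asker's build):

lazy val Dev = config("dev") extend Compile

lazy val root = (project in file("."))
  .configs(Dev)
  .settings(inConfig(Dev)(Defaults.compileSettings): _*)

addCommandAlias("compileAll", "; compile; dev:compile; test:compile")

With this in place, sbt compileAll compiles all three configurations in a single invocation.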
I am trying to merge two modules into a single module. Both modules build and run successfully on their own. After merging them, I am trying to run the test cases.
I am compiling the source and test cases using the following sbt commands:
sbt
clean
compile
project module-read
test:compile
it:test
Everything works fine up to test:compile, but it:test shows a lot of compilation errors.
What is the best way of compiling?
The test:compile task will only compile tests within the src/test/scala folder as per the default sbt test configuration.
In order to compile your integration tests (in src/it/scala), you will have to run it:compile.
See http://www.scala-sbt.org/0.13.5/docs/Detailed-Topics/Testing.html#integration-tests for more info.
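If the merged module has not wired up the IntegrationTest configuration at all, that would also explain the compilation errors. A minimal sketch of the standard sbt 0.13 setup (adjust to your build):

lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)

With that in place, it:compile picks up sources under src/it/scala, and it:test runs them.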
I have a multi-module project with moduleA, moduleB, and moduleC. I want to run my class com.helpme.run.MyTest from moduleB.
My guess is that the sbt command should look like this:
sbt "project moduleA" --mainClass com.helpme.run.MyTest test
But no luck. Please help!! Thanks!
First of all, you can run a single test class with testOnly:
$ sbt "testOnly *MyTest"
(The quotes are needed so that sbt treats testOnly and its argument as one command; the wildcard matches the class regardless of its package.)
But if your build is a multi-project sbt build and a test class with the same name exists in more than one project, you can navigate between projects with the project command and then run the test:
$ sbt
> project moduleA
> testOnly *MyTest
Note that you have to first run sbt and then run the rest of the commands from the sbt shell.
Depending on your project configuration, testOnly with an unqualified class name may not work.
You can try this command too:
sbt "project myProject" "testOnly com.this.is.my.Test"
I have a multi-project build.sbt file. I would like to assemble the jar for just one of the projects. Currently, I do the following:
$ sbt
project analysis
assembly
...
exit
I would like to save a few steps and assemble the jar for the project "analysis" from the command line. Is there a way to do this?
Thanks.
You can use sbt without entering its interactive shell:
$ sbt analysis/assembly
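Since sbt accepts several commands in one invocation, you can also chain tasks, e.g. a clean rebuild of just that project:

$ sbt analysis/clean analysis/assembly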
For our Scala development we currently use ivy + ant, but we are also trying out sbt for our development workflow. This would mainly be for continuous incremental compilation when not using an IDE.
sbt uses ivy under the hood, so in theory this should work. But when using an external ivy file, the tests won't compile.
To reproduce this you can even use the generated ivy.xml file from any sbt project.
Here are the steps to reproduce the error on an sbt project with tests:
1. From the sbt console, run deliverLocal (deliver-local in previous versions of sbt).
2. Copy the generated ivy file into your project home and rename it to 'ivy.xml'. From my understanding, using this file should be equivalent to declaring the dependencies in build.sbt.
3. Edit build.sbt: add externalIvyFile() on one line, and comment out all dependency declarations.
4. In the console, run reload, then test.
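For reference, after step 3 the relevant part of build.sbt would look roughly like this (the commented-out dependency is a hypothetical placeholder):

externalIvyFile()
// libraryDependencies += "org.example" %% "example-lib" % "1.0"  // commented out per step 3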
compile will run just fine, but test will fail at compile time. None of the dependencies will be honoured, not even the production code of the current project.
What am I missing?
In my case it worked with the following build.sbt:
externalIvyFile()
classpathConfiguration in Compile := Compile
classpathConfiguration in Test := Test
classpathConfiguration in Runtime := Runtime
You just need the three extra lines at the end. Here is a link with more info: http://www.scala-sbt.org/release/docs/Detailed-Topics/Library-Management.html#ivy-file-dependency-configuration
Look for the Full Ivy Example. I hope it helps!
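To check that the fix took effect, you can inspect the resolved test classpath from the sbt console:

> show test:fullClasspath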
EDIT: Just to be complete - here is what pointed me to the above link: https://github.com/sbt/sbt/issues/849.