SBT Broken Pipe - Scala

I've been avoiding SBT, since IntelliJ's support for Maven has always been far superior and I don't see much advantage in SBT; but I figure, why fight the masses.
So I've converted one of my open source projects over to SBT. Now when I run the tests (approx. 1000 test cases), I get OOMs. OK, so I've tried:
fork in Test := true
javaOptions in Test ++= Seq("-Xmx2048m", "-XX:MaxPermSize=512m")
OK, so my OOMs go away, but now I get:
sbt.ForkMain$Run$RunAborted: java.net.SocketException: Broken pipe
at sbt.ForkMain$Run.write(ForkMain.java:114)
at sbt.ForkMain$Run$1.info(ForkMain.java:132)
It seems to fail in a different place each time.
These tests all pass when I build with Maven (using the ScalaTest Maven plugin).
Help me, Obi-Wan (or SBT lovers).
Edit: adding environment details:
sbt 0.12.4
Java 7u25
Scala 2.10.2

Related

Real SBT Classpath at Runtime

I have some test cases that need to look at the classpath to extract the paths of some files/directories in there. This works fine in the IDE.
The problem is that, when running sbt test, Properties.javaClassPath gives me /usr/share/sbt-launcher-packaging/bin/sbt-launch.jar.
The classpath is fine when I run show test:dependency-classpath. Is there a way to obtain that information from inside the running Scala/Java program? Or is there a way to toss it into a system property or environment variable?
By default the tests run inside the sbt process, so the classpath looks like it did when you started sbt (I guess sbt does some trickery to dynamically load the classes for the tests, but I'm not sure). One way to do what you want is to run your tests in a forked JVM; that way sbt starts a new JVM to run the test suite, and that should have the expected classpath:
fork in Test := true
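Once the tests run in their own JVM, something like the following should see the real classpath from inside a test. This is only a minimal sketch; java.class.path is the standard JVM property, nothing sbt-specific:

// runs inside the forked test JVM, so it sees the real test classpath
import java.io.File
import scala.util.Properties

val classpathEntries: Seq[String] =
  Properties.javaClassPath.split(File.pathSeparator).toSeq

classpathEntries.foreach(println)   // jars and class directories on the test classpath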
I have been working on understanding how EmbeddedCassandra works in the spark-cassandra-connector project, which uses the classpath to start up and control a Cassandra instance. Here is a line from their configuration that gets the correct classpath:
(compile in IntegrationTest) <<= (compile in Test, compile in IntegrationTest) map { (_, c) => c }
The entire source can be found here: https://github.com/datastax/spark-cassandra-connector/blob/master/project/Settings.scala
Information on the <<= operator can be found here: http://www.scala-sbt.org/0.12.2/docs/Getting-Started/More-About-Settings.html#computing-a-value-based-on-other-keys-values. I'm aware that this is not the current version of sbt, but the definition still holds.
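For what it's worth, on more recent sbt versions the same wiring can be expressed without <<=; this is my own rough translation, not something taken from the connector's build:

// rough modern equivalent: compiling IntegrationTest also triggers (and waits for) the Test compilation
compile in IntegrationTest := ((compile in IntegrationTest) dependsOn (compile in Test)).value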

Using sbt 0.13.1, tests won't compile using the generated externalIvyFile

For our Scala development we currently use Ivy + Ant, but we are also trying to use sbt for our development workflow, specifically for continuous incremental compilation when not using an IDE.
sbt uses Ivy, so in theory this should work. But when using an external Ivy file, the tests won't compile.
To reproduce this you can even use the generated ivy.xml file from any sbt project.
Here are the steps to reproduce the error on an sbt project with tests:
from the sbt console, run deliverLocal (deliver-local in previous versions of sbt)
copy the generated Ivy file into your project home and rename it to 'ivy.xml'. From my understanding, using this file should be equivalent to declaring the dependencies in build.sbt.
edit build.sbt: add externalIvyFile() on one line and comment out all dependency declarations (sketched just after these steps)
in the console, run reload, then test
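After the build.sbt edit, the file essentially looks like this (a sketch; the commented line stands in for whatever dependency declarations were there before):

// build.sbt after the edit
externalIvyFile()

// libraryDependencies ++= Seq(...)   // former inline dependency declarations, now commented out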
compile will run just fine, but test will fail at compile time. None of the dependencies will be honoured, not even the production code of the current project.
What am I missing?
In my case it worked with the following build.sbt:
externalIvyFile()
classpathConfiguration in Compile := Compile
classpathConfiguration in Test := Test
classpathConfiguration in Runtime := Runtime
You just need the extra three lines at the end. Here is a link with more info: http://www.scala-sbt.org/release/docs/Detailed-Topics/Library-Management.html#ivy-file-dependency-configuration
Look for the Full Ivy Example. I hope it helps!
EDIT: Just to be complete - here is what pointed me to the above link: https://github.com/sbt/sbt/issues/849.

jacoco4sbt is not "detecting" my tests. Any idea why?

I have a typical sbt (0.13) build and have added the jacoco4sbt plugin to my build.
addSbtPlugin("de.johoop" % "jacoco4sbt" % "2.1.1")
I use specs2 (2.2.2) to run my tests.
If I run
~>sbt
>test
all my tests get run (120 of them). However, if I do
>jacoco:test
it runs 0 tests, as if the jacoco configuration cannot find them.
A quick search reveals that there is an issue with jacoco4sbt and Play, because Play sets parallelExecution to false. However, I am not using Play, and parallelExecution is set to true for both configurations. I have also tried setting them both to false, to no avail.
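For reference, this is roughly what I tried in build.sbt; the jacoco.Config scope name is my reading of the jacoco4sbt 2.x docs, so treat it as an assumption rather than verified syntax:

// enable the plugin's settings (jacoco4sbt 2.x style)
jacoco.settings

// the plugin runs the tests under its own configuration, so the flag is set there as well as in Test
parallelExecution in Test := false
parallelExecution in jacoco.Config := false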
Any idea what might be going wrong?
n.b. The project I am working on is open source, so I created a branch where I put my attempt at adding jacoco4sbt. Feel free to clone it and see what is happening for yourself.
https://github.com/jedesah/scala-codesheet-api/tree/jacoco
I had this issue too; after upgrading to specs2 2.2.3, jacoco4sbt started producing output.
For what it's worth, I had the same problem when using specs2. When I switched to ScalaTest, jacoco4sbt started detecting my tests.
I have a very basic configuration too, so I don't know whether we're missing something or if there's something wrong in the current jacoco4sbt version. I did try version 2.1.0 of jacoco4sbt, but with the same results.

How to compile tests with SBT without running them

Is there a way to build tests with SBT without running them?
My own use case is to run static analysis on the test code by using a scalac plugin. Another possible use case is to run some or all of the test code using a separate runner than the one built into SBT.
Ideally there would be a solution to this problem that applies to any SBT project. For example, Maven has a test-compile command that can be used just to compile the tests without running them. It would be great if SBT had the same thing.
Less ideal, but still very helpful, would be solutions that involve modifying the project's build files.
Just use the Test / compile command.
Test/compile works for compiling your unit tests.
To compile integration tests you can use IntegrationTest/compile.
Another hint: to continuously compile on every file change, use ~Test/compile
We have a build.sbt file that is used for multiple projects. Doing sbt test:compile compiled the tests for every single project and took over 30 minutes.
I found out I can compile only the tests for a specific project named xyz by doing:
sbt xyz/test:compile
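For context, that project-scoped command assumes a multi-project layout roughly like this (the subproject names here are just placeholders):

// build.sbt (sketch with hypothetical subprojects)
lazy val xyz  = project in file("xyz")    // the only project whose tests we want to compile
lazy val abc  = project in file("abc")
lazy val root = (project in file("."))
  .aggregate(xyz, abc)                    // a bare "sbt test:compile" runs against every aggregated project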
With sbt 1.5.0 and higher, test:compile returns a deprecation warning.
Use Test / compile.
(docs)
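For completeness, the project-scoped variant from the previous answer also has a slash-syntax form that avoids the warning:

sbt xyz/Test/compile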

SBT doesn't recognize JUnit test cases written in Java files

I made a mixed Scala/Java project with SBT 0.11.2. My config for JUnit testing is:
resolvers += "twitter.com" at "http://maven.twttr.com/"
seq(com.github.retronym.SbtOneJar.oneJarSettings: _*)
libraryDependencies += "com.novocode" % "junit-interface" % "0.10-M2" % "test"
When I write JUnit test cases in Scala with @Test, everything goes well. But when I write a Java JUnit test case and then run test in sbt, the Java JUnit test is not recognized. Only test cases written in Scala are executed.
How can I make sbt recognize my Java and Scala test cases at the same time?
Probably late for the original question, but...
I've just been looking at this. The JUnit tests in my project were not running until I ran sbt clean test. Now it's all working like a charm.
There was a bug in 0.11.x in detecting Java tests that was fixed in 0.12.0, although I didn't think it affected detecting annotated tests. You might try coming up with a minimal test case and checking with the latest sbt version (0.12.1). If the problem still exists, file a bug.
You should put your test classes into src/test/java, and your class name should end with "Test" (for example, myTest.java).