Where should javah fit into an SBT build? - scala

javah is used to generate C headers from @native methods. It operates on compiled class files, so it requires a classpath as an argument. For this reason, it seems sensible to make a javah task depend on fullClasspath in Compile.
The issue I am facing is that the generated headers are needed in order to build a native library, and the native library needs to be a resource. But, because it is a resource, it will be included in fullClasspath in Compile, which leads to a circular dependency.
Does SBT have a classpath key that includes all .class files but excludes resources?

I just discovered sbt-jni, a very interesting new SBT plugin which simplifies working with JNI from SBT.
While reading its source code, I stumbled over this line, which seems to refer to the problem you are facing. If I understand it correctly, the work-around in sbt-jni is to combine dependencyClasspath in Compile, compile in Compile, and classDirectory in Compile instead of using fullClasspath in Compile. I'm not sure whether that gives exactly the classpath you want (all .class files, no resources), but something along those lines may work for you as well.
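For illustration, here is a minimal sketch of such a javah task in that spirit, assuming sbt 0.13-style keys; javahTask and com.example.MyNativeClass are hypothetical names, and the task depends on compile, classDirectory, and dependencyClasspath rather than fullClasspath:

val javahTask = taskKey[Seq[File]]("Generate JNI headers with javah")

javahTask := {
  val _       = (compile in Compile).value                    // make sure the classes exist first
  val classes = (classDirectory in Compile).value              // compiled .class files only, no resources
  val deps    = (dependencyClasspath in Compile).value.map(_.data)
  val out     = target.value / "native" / "include"
  IO.createDirectory(out)
  val cp = (classes +: deps).mkString(java.io.File.pathSeparator)
  // Hypothetical class name; list every class that declares @native methods.
  Process(Seq("javah", "-d", out.getAbsolutePath, "-cp", cp, "com.example.MyNativeClass")).!
  (out ** "*.h").get
}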

Related

How to use SBT's externalPom() command

I have a Maven POM file that the deployment engineers need to deploy the system in the enterprise. I have developers using SBT for a Scala project. They use SBT targets that just aren't supported in Maven. We'd like to use the Maven POM file to define the dependencies, slurp in those dependencies in SBT, and define SBT development targets there.
According to the SBT documentation, the externalPom() command is the way to do that. But even with the simplest POM file (two developers have tried this with two different simple POM files that defined different dependencies), the externalPom() command seems to half work. The SBT targets clearly recognize the dependency defined in the POM, but can't resolve it. This error arises:
Cannot add dependency 'commons-collections#commons-collections;3.2.2'
to configuration 'default' of module
default#maven-sbt$sources_javadoc;0.1-SNAPSHOT because this
configuration doesn't exist!
When the externalPom() command is commented out and the equivalent dependency added directly in the build.sbt file everything goes swimmingly. The dependency comes directly from Maven Central in both cases; one from copying the dependency from the Maven tab and one from copying the dependency from the SBT tab. Once again, two developers are seeing exactly the same thing, from two different dependencies. The thing that's common is that both developers have reduced the build.sbt file down to a single statement. In the "slurp from POM" case, that statement is externalPom(). In the "plain old SBT" case, that statement is the dependency copied from Maven Central. The POM file is a dependency list with a single dependency (as simple as we can make it and still test externalPom()).
We suspect that we need something else in the build.sbt to make the externalPom() command work but we don't know what it is. Any help with that would be greatly appreciated.
I did some experimentation with this, and was able to replicate your error in my experiments.
I'm still a bit of a Scala / SBT newbie, but I created a build.sbt file that looks like so:
val Default = config("default")

lazy val root = (project in file(".")).
  configs(Default).
  settings(
    externalPom()
  )
This did compile for me!
One non-obvious catch: I had to make sure to include scala-library in my POM file as a dependency.

How do I distribute a Scala macro as a project?

Suppose I have a Scala compile-time macro that I find useful and would like to share it (I do). How do I create a JAR file that when loaded into another project would execute the macro when compiling the new project?
Specifically, I've made a StaticAnnotation that rewrites the AST of the class that it wraps at compile time. This works in my Maven build (the macro is defined in the main directory and runs on test cases in the test directory) because I have
<compilerPlugins>
  <compilerPlugin>
    <groupId>org.scalamacros</groupId>
    <artifactId>paradise_2.10.5</artifactId>
    <version>2.1.0-M5</version>
  </compilerPlugin>
</compilerPlugins>
in my scala-maven-plugin. (I'm starting with a Scala 2.10 project and if it works, will provide both 2.10 and 2.11.)
But if I put the resulting JAR on a Scala console classpath, in a Scala script, or into another Maven project (without special compiler plugins), it simply ignores the macro: the AST does not get rewritten and my compile-time println statements don't execute. If I use the @compileTimeOnly annotation on my macro (new in Scala 2.11), then it complains with the @compileTimeOnly error message.
Do I really need to tell my users to add compiler plugins in their pom.xml files, as well as alternate instructions for SBT and other build tools? Other packages containing macros (MacWire, Log4s) don't come with complicated build instructions: they just say, "point to this dependency in Maven Central." I couldn't find the magic in their build process that makes this work. What am I missing?
If you're relying on a macro-paradise-only feature then yes, you do need to tell your users to add the compiler plugin. See http://docs.scala-lang.org/overviews/macros/annotations.html. The projects you mention only use the Scala compiler's built-in (non-paradise) macro features, not macro annotations.
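On the sbt side the instruction for users is similarly short; a minimal sketch for their build.sbt, with an illustrative paradise version and hypothetical coordinates for your own artifact:

// Enables macro annotations via the paradise compiler plugin (pre-2.13 Scala).
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0-M5" cross CrossVersion.full)

// Plus an ordinary dependency on the jar that ships your annotation macro (hypothetical coordinates).
libraryDependencies += "com.example" %% "my-macro-annotations" % "0.1.0"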

scala & intellij : difficulties to create a runnnable jar

I can't launch a Scala jar; when I run it I get the error "Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/immutable/List", which seems to mean the Scala library is not loaded.
Here is a screenshot showing a lot of information from the artifact window.
Here is the manifest:
Manifest-Version: 1.0
Class-Path: libs/scala-library-2.10.0.jar libs/commons-logging-1.1.1.j
ar libs/jcip-annotations-1.0.jar libs/jwnl-1.4_rc3.jar libs/laf-plugi
n-7.2.1.jar libs/laf-widget-7.2.1.jar libs/miglayout-core-4.2.jar lib
s/miglayout-swing-4.2.jar libs/scala-actors.jar libs/scala-library.ja
r libs/scala-swing.jar libs/slf4j-api-1.6.4.jar libs/slick_2.10-1.0.0
.jar libs/sqlite-jdbc-3.7.2.jar libs/substance-7.2.1.jar libs/trident
-7.2.1-swing.jar
Main-Class: Fenetre
When I run "jar xf myJar.jar" to extract the jar, the directory contains:
- .class files
- a libs folder containing the libraries, INCLUDING scala-library.jar and scala-library-2.10.0.jar (I specified only one of these two files in the manifest to avoid conflicts)
Can you help me?
I'm new to Scala and don't know what the exact problem is; however, I've been building "fat jars" which include all the required libs.
I've been using https://github.com/sbt/sbt-assembly to do this successfully.
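The setup is short; a minimal sketch, with an illustrative plugin version (check the sbt-assembly README for the one matching your sbt release):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt
mainClass in assembly := Some("Fenetre")   // matches the Main-Class from your manifest

Running sbt assembly then produces a single jar with scala-library and the other dependencies bundled in, so the Class-Path entry in the manifest is no longer needed.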
Despite what your manifest is telling you, when you run the application either the scala-library is not actually on your classpath, or something goes wrong when you attempt to import List. Scala automatically imports the immutable collection classes into your project via the root Predef implementation.
Predef provides type aliases for types which are commonly used, such as the immutable collection types scala.collection.immutable.Map, scala.collection.immutable.Set, and the scala.collection.immutable.List constructors (scala.collection.immutable.:: and scala.collection.immutable.Nil). The types Pair (a scala.Tuple2) and Triple (a scala.Tuple3), with simple constructors, are also provided.
Predef in core Scaladocs
Try printing the classpath from within your app to confirm.
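For example, something like this dropped into your main method is enough to see what the JVM actually picked up:

println(System.getProperty("java.class.path"))   // the runtime classpath as the JVM sees it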
Although not pertinent to your question, I would recommend using SBT for dependency management now that IntelliJ IDEA 13 has full SBT integration support. Your collaborators not using IntelliJ will also be happier because SBT gives them more options for build technologies, editors, and other tooling when working on the project.

how to compile single file in sbt

I'm doing some refactoring that has temporarily left compile errors in several files. I'd like to work through them one by one (starting with the common dependencies) and need some tool to check whether each modification is correct.
sbt compile is inconvenient because it reports too many errors at once and spends a lot of time compiling things that are of no use to me yet.
I'm looking for a way to compile a single file with sbt, or a way to extract the library definitions from the sbt build so I can pass them to a plain scalac invocation.
There was a similar topic, How to compile just some files with sbt?, but it turned out to be a discussion of a source code error rather than of sbt functionality.
You could add the following line to build.sbt:
sources in Compile <<= (sources in Compile).map(_ filter(_.name == "Particular.scala"))
Then fix Particular.scala, then edit build.sbt and put the name of the next source file. If you keep the sbt console open, reload will re-read the .sbt file after you modify it.
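On newer sbt versions the same filter can be written in slash syntax; a sketch, assuming sbt 1.x:

Compile / sources := (Compile / sources).value.filter(_.name == "Particular.scala")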
I just wanted to mention here that I came across sbt-compile-quick-plugin (https://github.com/etsy/sbt-compile-quick-plugin). It does what it says on the tin: add addSbtPlugin("com.etsy" % "sbt-compile-quick-plugin" % "1.3.0") to your project/plugins.sbt, then start up sbt and run compileQuick /path/to/your/file.

Compiling Java annotations with sbt

I've created Java annotations (since I need runtime retention) under $PROJECT/src/main/java, and my Scala code which uses these Java annotations is under $PROJECT/src/main/scala. The Java annotation thus created also uses a Java enum as its value.
When I compile the project, sbt doesn't seem to compile the Java annotations first and errors out on each usage of the enum in annotations. If I comment out all usages of the Java enum in the Scala code, compile, then uncomment the enum usages and compile again, everything works fine.
How do I ensure that sbt compiles my java annotations and enum (i.e. $PROJECT/src/main/java) before attempting to compile scala code when doing a clean build?
EDIT: I have a bare bones build.sbt and am using sbt 0.11.2
Some good news: This is a known issue and has been resolved.
Some bad news: It's resolved in 2.10 and the fix may not be backported to 2.9.3 (quoting Paul Phillips in the issue thread):
I've tagged this for backporting, which is not a guarantee; I don't
have time to do it right now but I expect to in the near future.
Some good news: If you're stuck on pre-2.10 and your Java sources don't depend on your Scala sources, you can just add the following to your build.sbt and all is well:
compileOrder := CompileOrder.JavaThenScala
Some bad news: If you're stuck on pre-2.10 and your Java sources do depend on your Scala sources, I'm pretty sure you're out of luck, and the comment-compile-uncomment trick is probably your best bet.
I'll bet you're facing SI-2764. This has been fixed in Scala 2.10.
In the meantime, create a separate sub-project for your Java annotations and depend on it from the project containing the Scala code. Then the Scala compiler will only process the .class files rather than the .java files.
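A minimal sketch of that layout in build.sbt, with hypothetical project names and directories:

lazy val annotations = (project in file("annotations"))   // holds only the Java annotations and enums

lazy val core = (project in file("core"))
  .dependsOn(annotations)                                  // the Scala code compiles against the already-built annotation classes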