Scala Play messages file to inline or reuse the version in build.sbt - scala

I have a Scala Play project, and I currently show the application version in my main template. The version is easy to define in the conf/messages file. However, since I have an automated build for creating releases, each release iteration updates build.sbt with the new version, e.g. version := "1.0.6-SNAPSHOT".
I could use the same mechanism during the release to update my conf/messages file as well, but I would prefer to have conf/messages pick up the version information from build.sbt, à la application.version=${sbt.application.version}.
How can I accomplish this? Is it possible at all?
UPDATE: it is worth mentioning that in Maven these build settings become Java system properties and can be easily used.

You can use sbt-buildinfo plugin to generate a Scala source based on the build.sbt.
The plugin generates a BuildInfo object, which contains information you can then use to display the application version.
Otherwise I don't think you can access sbt information from your configuration.
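A minimal sketch of how that can look (the plugin version, package name, and usage below are illustrative, not taken from the asker's project):
// project/plugins.sbt (plugin version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.7.0")

// build.sbt
lazy val root = (project in file("."))
  .enablePlugins(BuildInfoPlugin) // for a Play project, enable PlayScala here as well
  .settings(
    buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
    buildInfoPackage := "myapp.build" // illustrative package name
  )

// Anywhere in application code, e.g. passed into the main template:
// val appVersion: String = myapp.build.BuildInfo.version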

You can use the xsbt-filter plugin to achieve this. It basically works like Maven's resource filtering mechanism, and exposes the project's name, version, etc. by default. You can further configure it to expose other properties.

Related

SBT : Auto versioning of artifacts

We have a Scala project which we are building via a CI tool (TeamCity/Jenkins). We are looking for the ability to set the build version of the artifact from the build job itself rather than depending on the entry in the build.sbt file. For reference, for Java Maven builds we can use the set-version goal, where the artifact version is set irrespective of what we have in pom.xml. We are looking for something similar for an SBT build.
I'd recommend taking a look at the following sbt plugins:
https://github.com/dwijnand/sbt-dynver
https://github.com/sbt/sbt-git
Our team uses sbt-dynver to derive the version from Git, because it is easier. I'd recommend building the version on top of Git tag information rather than CI tool (TeamCity/Jenkins) information such as the build number, because, for instance, you can end up building the same version twice.
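For example, wiring up sbt-dynver is roughly this (the plugin version is illustrative, and newer releases may be published under different coordinates):
// project/plugins.sbt (version is illustrative)
addSbtPlugin("com.dwijnand" % "sbt-dynver" % "4.0.0")

// build.sbt: do not set version yourself; sbt-dynver derives it from the
// nearest Git tag (e.g. the tag v1.2.3 yields version 1.2.3) and appends
// commit/date information for untagged commits.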
Also consider using https://github.com/sbt/sbt-buildinfo in addition, to expose the build version through an API or print it to the output, so you can quickly identify the currently deployed app version.
This isn't really the right forum for this kind of recommendation, but you could start by looking at sbt-git, which will set version numbers based on Git tags.
One option you have is to add this to your build.sbt:
version := sys.env.getOrElse("ARTIFACT_VERSION", "0.0.0-SNAPSHOT")
Then set the version you want in the ARTIFACT_VERSION environment variable.
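For example, a CI job could then run something like ARTIFACT_VERSION=1.0.42 sbt clean test publish (version number purely illustrative), so the published artifact carries the version chosen by the build job rather than the one hard-coded in build.sbt.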

Apache UIMA Ruta Workbench with custom ruta-core

In our corpus we often find, and need to parse as a single token, data that is alphanumeric (for example file hashes, email addresses, etc.). We have created our own ruta-core version by re-working the JFlex definition. Is there a way we can still work with this new version of ruta-core in the Workbench?
If you use simple Ruta projects, you would need to replace the ruta.engine plugin with a different jar containing your ruta-core version. The clean way would be to build a complete update site with your version.
You could maybe also set your ruta-core jar in the classpath of your ruta launch configurations.
If you use maven-based projects, you can set the dependency to your version of ruta-core, which should then be used in the launch delegate.
For your use case, I would not use your own version of ruta-core at all. You could simply write your own version of the TokenLexer, as you probably did. Then, you can configure the utilized TokenLexer in the RutaEngine as there is a configuration parameter for setting it. Thus, there is already some functionality to customize the JFlex definition without building your own ruta-core.
DISCLAIMER: I am a developer of UIMA Ruta

How to set Scala version for sbt itself to use?

Does the Scala version that sbt itself runs on matter? Can it be changed?
P.S. I'm not asking about changing the Scala version against which your project is built.
Regarding the current version: looking at the changelogs for sbt, there are several updates that mention a new version of Scala, the most recent of which is:
Scala version is bumped to 2.10.5. [...]
As for changing it, I don't believe that's possible, unless you want to build it (and any plugins you are using) from source yourself.
Does it really matter? Not really! Your project itself will use whatever version of Scala you specify. There are only two ways I can see that it might matter: if you want to write a custom sbt plugin to use in your project that relies on a feature of a newer Scala version, or if you want to use one of those features in your build files (which are just Scala code). However, I really can't imagine what you would need to do in a build that you couldn't accomplish with the features of 2.10.x.
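To make the distinction concrete, here is a small illustration (version numbers are illustrative):
// build.sbt
scalaVersion := "2.11.8" // the Scala used to compile your own sources in src/main/scala

// The code in build.sbt itself and under project/ is compiled with the Scala
// version bundled with sbt (2.10.x for the sbt 0.13 series), which is why an
// sbt plugin or build definition cannot rely on features of a newer Scala.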

Can SBT scopes be used for custom libraryDependencies for specific code blocks?

I have a simple SBT project in which one code block reads from HDFS (and needs one version of the Hadoop library dependencies) and another code block writes the filtered result to Cassandra (and needs a different version of the Hadoop library dependencies).
Can SBT scopes be used to assign different libraryDependencies to the two code blocks?
You can do this, but you have to split your code over one of the scope axes: project, configuration, or task. The only axis that can be used for your purpose is the project axis, so you have to create a multi-project sbt build and split your code across its subprojects.
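For illustration, the split could look roughly like this (module names and Hadoop versions are hypothetical):
// build.sbt
lazy val hdfsReader = (project in file("hdfs-reader"))
  .settings(
    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0"
  )

lazy val cassandraWriter = (project in file("cassandra-writer"))
  .settings(
    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.2.1"
  )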
But this will not solve your problem, because you will not be able to run the resulting application: the Java class loader has no way to decide when to use one version of Hadoop and when to use the other. It will load one version of the classes in question and then use it in all cases.
For this you need a context-aware class loader. An example is an OSGi container, like Apache Felix. OSGi is version-aware and can load different versions of the same library in the same Java process. It will then reference the classes of the correct version of the library depending on the context in which the library is used.
To be more precise: you must convert the different versions of the Hadoop library into OSGi bundles. Then you must split your code into multiple OSGi bundles, each declaring a dependency on the correct version of the Hadoop bundle in its metadata (manifest file). When you want to start your application, you must run it in an OSGi container.
This can be done, but it is quite complex. It is better to clean up your code so that you only depend on one version of the Hadoop library.

How to support multiple Scala versions in a library

I have a fairly normal Scala project currently being built using Maven. I would like to support both Scala 2.9.x and the forthcoming 2.10, which is not binary or source compatible. I am willing to entertain converting to SBT if necessary, but I have run into some challenges.
My requirements for this project are:
Single source tree (no branching). I believe that trying to support multiple concurrent "master" branches for each Scala version will be the quickest way to miss bugfixes between the branches.
Version specific source directories. Since the Scala versions are not source compatible, I need to be able to specify an auxiliary source directory for version specific sources.
Version specific source jars. End users should be able to download the correct source jar, with the correct version specific sources, for their version of Scala for IDE integration.
Integrated deployment. I currently use the Maven release plugin to deploy new versions to the Sonatype OSS repository, and would like to have a similarly simple workflow for releases.
End-user Maven support. My end users are often Maven users, and so a functional POM that accurately reflects dependencies is critical.
Shaded jar support. I need to be able to produce a JAR that includes a subset of my dependencies and removes the shaded dependencies from the published POM.
Things I have tried:
Maven profiles. I created a set of Maven profiles to control which version of Scala is used to build, using the Maven build-helper plugin to select the version-specific source tree. This was working well until it came time to publish:
Using classifiers to qualify versions doesn't work well, because the source jars would also need custom classifiers ('source-2.9.2', etc.), and most IDE tools wouldn't know how to locate them.
I tried using a Maven property to add the SBT-style _${scala.version} suffix to the artifact name, but Maven does not like properties in the artifact name.
SBT. This works well once you can grok it (no small task despite extensive documentation). The downside is that there does not seem to be an equivalent to the Maven shade plugin. I've looked at:
Proguard. The plugin is not updated for SBT 0.12.x, and won't build from source because it depends on another SBT plugin that has changed groupIds, and doesn't have a 0.12.x version under the old name. I have not yet been able to work out how to instruct SBT to ignore/replace the plugin dependency.
OneJar. This uses custom class loading to run Main classes out of embedded jars, which is not the desired result; I want the class files of my project to be in the jar along with (possibly renamed) class files from my shaded dependencies.
SBT Assembly plugin. This can work to a degree, but the POM file appears to include the dependencies that I'm trying to shade, which doesn't help my end users.
I accept that there may not be a solution that does what I want for Scala, and/or I may need to write my own Maven or Scala plugins to accomplish the goal. But if I can I'd like to find an existing solution.
Update
I am close to accepting @Jon-Ander's excellent answer, but there is still one outstanding piece for me, which is a unified release process. The current state of my build.sbt is on GitHub. (I'll reproduce it here in an answer later for posterity.)
The sbt-release plugin does not support multi-version builds (i.e., + release does not behave as one might like), which makes a sort of sense as the process of release tagging doesn't really need to happen across versions. But I would like two parts of the process to be multi-version: testing and publishing.
What I'd like to have happen is something akin to the two-stage maven-release-plugin process. The first stage would do the administrative work of updating Git and running the tests, which in this case would mean running + test so that all versions are tested, then tagging, updating to the next snapshot version, and pushing the result upstream.
The second stage would check out the tagged version and run + publish, which would rerun the tests and push the tagged versions up to the Sonatype repository.
I suspect that I could write releaseProcess values that do each of these, but I'm not sure if I can support multiple releaseProcess values in my build.sbt. It probably can work with some additional scopes, but that part of SBT is still strange majick to me.
What I currently have done is change the releaseProcess to not publish. I then have to check out the tagged version by hand and run + publish after the fact, which is close to what I want but is a compromise, especially since the tests are only run against the current Scala version during the release process. I could live with a process that isn't two-stage like the Maven plugin but does implement multi-version test and publish.
Any additional feedback that can get me across the last mile would be appreciated.
Most of this is well supported in sbt within a single source tree.
Version-specific source directories are usually not needed. Scala programs tend to be source compatible - so often, in fact, that cross-building (http://www.scala-sbt.org/release/docs/Detailed-Topics/Cross-Build) has first-class support in sbt.
If you really need version-specific code, you can add extra source folders. Putting this in your build.sbt will add "src/main/scala-[scalaVersion]" as a source directory for each version you cross-build, in addition to the regular "src/main/scala":
unmanagedSourceDirectories in Compile <+= (sourceDirectory in Compile, scalaVersion){ (s,v) => s / ("scala-"+v) }
(There is also a plugin available for generating shims between versions, but I haven't tried it - https://github.com/sbt/sbt-scalashim.)
Version-specific source jars - see cross-building; this works out of the box.
Integrated deployment - https://github.com/sbt/sbt-release (has awesome Git integration too).
Maven end-users - http://www.scala-sbt.org/release/docs/Detailed-Topics/Publishing.html
Shaded jars - I have used https://github.com/sbt/sbt-assembly, which has worked fine for my needs.
Your problem with the assembly plugin can be solved by rewriting the generated pom.
Here is an example ripping out joda-time.
pomPostProcess := {
  import xml.transform._
  new RuleTransformer(new RewriteRule {
    override def transform(node: xml.Node) = {
      if ((node \ "groupId").text == "joda-time") xml.NodeSeq.Empty else node
    }
  })
}
Complete build.sbt for reference:
scalaVersion := "2.9.2"

crossScalaVersions := Seq("2.9.2", "2.10.0-RC5")

unmanagedSourceDirectories in Compile <+= (sourceDirectory in Compile, scalaVersion){ (s,v) => s / ("scala-"+v) }

libraryDependencies += "joda-time" % "joda-time" % "1.6.2"

libraryDependencies += "org.mindrot" % "jbcrypt" % "0.3m"

pomPostProcess := {
  import xml.transform._
  new RuleTransformer(new RewriteRule {
    override def transform(node: xml.Node) = {
      if ((node \ "groupId").text == "joda-time") xml.NodeSeq.Empty else node
    }
  })
}
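On the update about cross-version testing and publishing: newer versions of sbt-release (newer than what was available when this question was asked) expose releaseStepCommandAndRemaining, which lets a release step run a +-prefixed command across all crossScalaVersions. A rough sketch of such a releaseProcess, using step names from sbt-release's ReleaseTransformations:
import ReleaseTransformations._

releaseProcess := Seq[ReleaseStep](
  checkSnapshotDependencies,
  inquireVersions,
  runClean,
  releaseStepCommandAndRemaining("+test"),    // test every crossScalaVersions entry
  setReleaseVersion,
  commitReleaseVersion,
  tagRelease,
  releaseStepCommandAndRemaining("+publish"), // publish every cross version
  setNextVersion,
  commitNextVersion,
  pushChanges
)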
I've done something similar to this with SBT as an example:
https://github.com/seanparsons/scalaz/commit/21298eb4af80f107181bfd09eaaa51c9b56bdc28
It's made possible by SBT allowing all the settings to be determined based on other settings, which means that most other things should "just work".
As for the pom.xml aspect, I can only recommend asking the question on the SBT mailing list; I would be surprised if you couldn't do that, however.
My blog post http://www.day-to-day-stuff.blogspot.nl/2013/04/fixing-code-and-binary.html contains an example of a slightly more fine-grained solution for attaching different source directories: one per major Scala version. It also explains how to create Scala-version-specific code that can be used by version-agnostic code.
Update 2016-11-08: Sbt now supports this out of the box: http://www.scala-sbt.org/0.13/docs/sbt-0.13-Tech-Previews.html#Cross-version+support+for+Scala+sources
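A sketch of what that built-in support looks like (version numbers are illustrative):
// build.sbt
crossScalaVersions := Seq("2.11.8", "2.12.1")

// Source layout picked up automatically when cross-building:
//   src/main/scala        sources shared by all Scala versions
//   src/main/scala-2.11   compiled only when building for 2.11.x
//   src/main/scala-2.12   compiled only when building for 2.12.x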