Ideal Scala compiler and library versions when using the Play framework

I've just downloaded Play Framework 2.1-RC1.
Browsing the folders, I noticed that the bundled SBT folder includes its own Scala compiler/library, at version:
scala.2.10.0-RC1.
I have previously compiled some Scala programs with the more recent version:
scala.2.10.0-RC2
Should I stay with scala.2.10.0-RC1 provided by Play (in order to avoid potential incompatibility with Play)?
Further, is there a way (a specific Play command line?) to automatically add the Scala source jars into Play's SBT folder (scala-library-src.jar etc.)? Currently the folder contains only the binary jars:
jansi.jar
jline.jar
scala-compiler.jar
scala-library.jar
scala-reflect.jar

Scala 2.10.0 final has been out for about a week now. I'd expect the Play artifacts to be available for it already, and if they're not, they surely will be soon.
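If you want to pin the version explicitly once the final artifacts are available, that is a one-line setting in any sbt build; a minimal sketch in generic sbt syntax (not Play-specific; Play 2.1 projects typically configure their settings in project/Build.scala):

// generic sbt setting; adjust once Play publishes artifacts built for 2.10.0 final
scalaVersion := "2.10.0"

As for the source jars, I'm not aware of a Play command that populates the bundled SBT folder, but sbt's updateClassifiers task downloads the -sources jars of your dependencies into the local Ivy cache, which is usually enough for IDE navigation.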

Related

Cross-building Scala libraries

I would like to cross-build some of my Bazel targets to Scala 2.12 and 2.13. As a further point of complexity, I need to be able to express cross-target dependencies (e.g. some 2.13 target may have a Bazel dependency on a 2.12 target).
Note: this isn't a regular library dependency (e.g. with the dependency's 2.12-built JAR showing up on the class path when compiling the 2.13 JAR), as that would almost surely cause issues due to having two incompatible versions of the Scala standard library on the class path. Rather, this is just a case where I need the dependency JAR built so I can use it in some integration tests in the 2.13 target.
What I've found online so far...
This issue from rules_scala suggests it doesn't support baking the Scala version into the target; instead you have to pick the Scala version globally.
This Databricks post has a cross-building section that describes exactly what I think I want (e.g. one target generated per library per supported Scala version), but the snippets in that post don't seem to be backed by any runnable Bazel code.
This later post by Databricks also hints at a cross_scala_lib rule, but likewise has no accompanying code.

How to match multiple specific Scala versions in a source directory in sbt

The sbt documentation tells me I can suffix the Scala version in order to get sbt to pick up the sources when building with that version.
https://www.scala-sbt.org/1.x/docs/Cross-Build.html#Scala-version+specific+source+directory
It doesn't say how I can specify the suffix so that it matches multiple Scala versions in one directory. I need one directory to match both 2.12 and 2.13, and one to match 2.11.
I found a few examples of this in libraries like zio and cats. The suffixes looked like -2.12+ or -2.12-2.13. When applied to my build, though, sbt doesn't seem to find these paths, and compilation fails because some classes are not found. If I separate them into -2.12 and -2.13 directories it works, but then I'd have to duplicate code.
You can try sbt new scala/scala-seed.g8 and rename src/main/scala to src/main/scala-2.12+ to see that it doesn't work, whereas it does work when renamed to src/main/scala-2.13.
Can someone point me to what I'm missing?
I've tried sbt versions 1.3.10 and 1.2.8.
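For reference, sbt 1.2.x and 1.3.x only recognise directories such as src/main/scala-2.12 or src/main/scala-2.13 out of the box; suffixes like -2.12+ work in libraries such as cats and zio because their builds add the extra directories manually. A minimal sketch of that manual wiring, assuming a directory named scala-2.12+ of my own choosing:

// build.sbt -- share one source directory between 2.12 and 2.13,
// and fall back to a separate one for 2.11
Compile / unmanagedSourceDirectories += {
  val base = (Compile / sourceDirectory).value
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, n)) if n >= 12 => base / "scala-2.12+"
    case _                       => base / "scala-2.11"
  }
}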

Scala and Scala.js version included in artifact id

I just successfully released my first Scala & Scala.js cross-building library to Sonatype and can now use the following two artifacts in my applications:
https://search.maven.org/artifact/com.github.fbaierl/scala-tarjan_2.12/0.1.1/jar
https://search.maven.org/artifact/com.github.fbaierl/scala-tarjan_sjs0.6_2.12/0.1.1/jar
My question now is: Why is the Scala and Scala.js version included in the artifact id? I don't think I have seen such a thing before so I was wondering if I did something wrong. Here is my build.sbt: https://github.com/fbaierl/scalajs-cross-compile-tarjan/blob/03954a3e2d1442ad339298a986209c1403c9692e/build.sbt
That's the way Scala artifacts work. Pretty much all artifacts look like this; it just isn't obvious when you use them in sbt, because (IIRC) the _2.12 is implied by the %% operator. (And the _sjs0.6 is implied by the %%% operator.)
The underlying reason is that artifacts compiled by different major versions of the Scala compiler (Scala versions are epoch.major.minor) aren't binary compatible (because otherwise the language and standard library couldn't evolve). You can't mix e.g. _2.12 and _2.11 artifacts on the classpath, so the "same" version of the same library must be published separately for each Scala version, and the suffix is what distinguishes them.
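To make the expansion concrete, here is a sketch using the artifacts above (the %%% line assumes the Scala.js sbt plugin is enabled):

// %% appends the Scala binary version; on Scala 2.12 this resolves scala-tarjan_2.12
libraryDependencies += "com.github.fbaierl" %% "scala-tarjan" % "0.1.1"
// %%% additionally appends the platform; in a Scala.js 0.6 project
// this resolves scala-tarjan_sjs0.6_2.12
libraryDependencies += "com.github.fbaierl" %%% "scala-tarjan" % "0.1.1"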

Scala IDE and Apache Spark -- different scala library version found in the build path

I have some main object:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
object Main {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      new SparkConf().setMaster("local").setAppName("FakeProjectName")
    )
  }
}
...then I add spark-assembly-1.3.0-hadoop2.4.0.jar to the build path in Eclipse from
Project > Properties... > Java Build Path:
...and this warning appears in the Eclipse console:
More than one scala library found in the build path
(C:/Program Files/Eclipse/Indigo 3.7.2/configuration/org.eclipse.osgi/bundles/246/1/.cp/lib/scala-library.jar,
C:/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar).
This is not an optimal configuration, try to limit to one Scala library in the build path.
FakeProjectName Unknown Scala Classpath Problem
Then I remove Scala Library [2.10.2] from the build path, and it still works. Except now this warning appears in the Eclipse console:
The version of scala library found in the build path is different from the one provided by scala IDE:
2.10.4. Expected: 2.10.2. Make sure you know what you are doing.
FakeProjectName Unknown Scala Classpath Problem
Is this a non-issue? Either way, how do I fix it?
This is often a non-issue, especially when the version difference is small, but there are no guarantees...
The problem is (as stated in the warning) that your project has two Scala libraries on the class path. One is explicitly configured as part of the project; this is version 2.10.2 and is shipped with the Scala IDE plugins. The other copy has version 2.10.4 and is included in the Spark jar.
One way to fix the problem is to install a different version of Scala IDE that ships with 2.10.4. But this is not ideal. As noted here, Scala IDE requires every project to use the same library version:
http://scala-ide.org/docs/current-user-doc/gettingstarted/index.html#choosing-what-version-to-install
A better solution is to clean up the class path by replacing the Spark jar you are using. The one you have is an assembly jar, which means it includes every dependency used in the build that produced it. If you are using sbt or Maven, you can remove the assembly jar and simply add Spark 1.3.0 and Hadoop 2.4.0 as dependencies of your project; every other dependency will be pulled in during your build. If you're not using sbt or Maven yet, then perhaps give sbt a spin - it is really easy to set up a build.sbt file with a couple of library dependencies, and it lets you pin exactly which library versions to use.
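For instance, a minimal build.sbt along those lines might look like this (a sketch, assuming Spark 1.3.0, which was built for Scala 2.10; a Hadoop client comes in transitively, so add hadoop-client explicitly only if you need exactly 2.4.0):

scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"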
The easiest solution:
In Eclipse :
1. Project (right-click) > Properties
2. Go to Scala Compiler
3. Click Use Project Settings
4. Set Scala Installation to a compatible version, generally Fixed Scala Installation: 2.XX.X (built-in)
5. Rebuild the project.
There are two types of Spark JAR files (distinguishable just by name):
- the name includes the word "assembly" and not "core" (Scala is bundled inside);
- the name includes the word "core" and not "assembly" (no Scala inside).
You should include the "core" type in your Build Path via "Add External Jars"
(the version you need), since the Scala IDE already supplies a Scala library for you.
Alternatively, you can just take advantage of sbt and add the following
dependency (again, pay attention to the versions you need):
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
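Equivalently, assuming your project's scalaVersion is set to a 2.11.x release, you can let sbt fill in the suffix:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"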
Then you should not force any Spark JAR onto the Build Path.
Happy sparking,
Zar

Cross-building Scala using Gradle

I've got a Scala project that is built with Gradle. The Scala code is source-compatible with Scala 2.9 and 2.10, and I'd like to cross-build it for both major Scala versions. Does Gradle support this?
For example, my gradle project will have a single module:
build.gradle
src/main/scala/foo.scala
and I'd like the resulting published jars to be:
org-foo_2.9-0.1.jar (with dependency on scala-library 2.9)
org-foo_2.10-0.1.jar (with dependency on scala-library 2.10)
Gradle's Scala plugin doesn't currently support cross-building. It's possible to implement it yourself, though. In my Polyglot Gradle talk, I presented a proof-of-concept.
I am searching for a good example of this. The Gradle manual doesn't mention how to specify the Scala version, but looking at the source code of the Scala plugin, it seems to infer it from the Scala library jar that you specify.
The best example I could find is the Apache Kafka build system. It specifies the Scala version and then uses some additional logic to resolve the correct versions of the Scala libraries. It also uses some logic to attach the correct label to the jars it builds, so they correspond to the correct Scala version.
This feels like a lot of work, and something the build system should do for you, as sbt does.
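For comparison, a sketch of sbt's built-in cross-building (exact artifact suffixes follow each Scala version's binary-compatibility scheme):

// build.sbt
crossScalaVersions := Seq("2.9.3", "2.10.4")
// running `sbt +publish` builds and publishes one suffixed artifact
// per listed Scala version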