Cross-build Scala using Gradle

I've got a Scala project that is built with Gradle. The Scala code is source-compatible with Scala 2.9 and 2.10, and I'd like to cross-build it to both major Scala versions. Does Gradle support this?
For example, my gradle project will have a single module:
build.gradle
src/main/scala/foo.scala
and I'd like the resulting published jars to be:
org-foo_2.9-0.1.jar (with dependency on scala-library 2.9)
org-foo_2.10-0.1.jar (with dependency on scala-library 2.10)

Gradle's Scala plugin doesn't currently support cross-building. It's possible to implement it yourself, though. In my Polyglot Gradle talk, I presented a proof-of-concept.

I am searching for a good example of this. The Gradle manual doesn't mention how to specify the Scala version, but looking at the source code for the Scala plugin, it seems to infer it from the Scala library jar that you specify.
The best example I could find is the Apache Kafka build system. It specifies the Scala version and then uses some additional logic to resolve the correct versions of the Scala libraries. It also uses some logic to attach the correct label to the jars it builds, corresponding to the Scala version.
This feels like a lot of work, and something the build system should do for you, as sbt does.
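For contrast, here is a minimal sketch of the equivalent sbt build for the example above (the organization, name, and version values just mirror the question; the patch versions are illustrative):

// build.sbt
organization := "org"
name := "foo"
version := "0.1"
// one artifact per listed Scala version: running "+ publish" in sbt builds and
// publishes the jar against each matching scala-library, with the version suffixed
crossScalaVersions := Seq("2.9.3", "2.10.4")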

Related

Cross-building Scala libraries

I would like to cross-build some of my Bazel targets to Scala 2.12 and 2.13. As a further point of complexity, I need to be able to express cross-target dependencies (e.g. some 2.13 target may have a Bazel dependency on a 2.12 target).
Note: this isn't a regular library dependency (e.g. with the dependency's 2.12-built JAR showing up on the classpath when compiling the 2.13 JAR), as that would almost surely cause issues due to having two incompatible versions of the Scala standard library on the classpath. Rather, this is just a case where I need the dependency JAR built so I can use it in some integration tests in the 2.13 target.
What I've found online so far...
This issue from rules_scala suggests that it doesn't support baking the Scala version into the target; instead you have to pick the Scala version globally.
This Databricks post has a cross-building section that is exactly what I think I would like (e.g. one target generated per library per supported Scala version), but the snippets in that post don't seem to be backed by any runnable Bazel code.
This later post by Databricks hints at a cross_scala_lib rule, but likewise has no accompanying code.

Is there a way to use the Scala 3 compiler (Dotty) from Gradle yet?

I am new to both Gradle and Dotty (and still relatively new to Scala overall). I was able to create a Scala project with Gradle like this:
gradle init --dsl kotlin --type scala-library --package com.stackoverflow.example
and I know that Dotty can be used with SBT. But is there a way to wire up Dotty with Gradle (yet)?
There's a dotty-examples repository with demonstrations of how to build Dotty with various build tools, including Gradle:
https://github.com/michelou/dotty-examples/blob/master/examples/dotty-example-project/build.gradle
As you can see, there's nothing standard yet, but once there is, I believe the maintainers will update this repo.
In my personal project I did something similar, with some simplifications:
https://github.com/SimY4/xpath-to-xml/blob/master/xpath-to-xml-scala/build-3.gradle
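For comparison, wiring Dotty into sbt (which the question notes already works) looked roughly like this at the time; the plugin and compiler version numbers here are illustrative of the pre-release era, not pinned recommendations:

// project/plugins.sbt -- the Dotty sbt plugin
addSbtPlugin("ch.epfl.lamp" % "sbt-dotty" % "0.4.0")

// build.sbt -- select a Dotty pre-release as the compiler
scalaVersion := "0.21.0-RC1"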

Why does the Maven build work fine, but adding Spark jars as external jars gives the compile error "object apache is not a member of package org"?

In Eclipse, while setting up Spark, even after adding all the external jars from spark-2.4.3-bin-hadoop2.7/jars/ to the build path,
the compiler complains: "object apache is not a member of package org".
Yes, building the dependencies via Maven or sbt would fix it. A similar question has been asked:
scalac compile yields "object apache is not a member of package org"
But the question here is: WHY does the traditional way fail like this?
If we refer to Scala/Spark version compatibility, we can see a similar issue. The problem is that Scala is NOT binary compatible across minor versions. Hence each Spark module is compiled against a specific Scala library. But when we run from Eclipse, the Eclipse Scala environment may not be compatible with the particular Scala version against which our Spark libraries were built.
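For example, the prebuilt spark-2.4.3-bin-hadoop2.7 distribution is compiled against Scala 2.11, so a build definition has to pin a matching compiler. A minimal sketch in sbt (the version numbers assume that distribution):

// build.sbt
scalaVersion := "2.11.12" // must match the Scala binary version Spark was built with
// %% appends the Scala binary suffix (_2.11), keeping the artifact and compiler in sync
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3" % "provided"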

Scala IDE and Apache Spark -- different scala library version found in the build path

I have some main object:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
object Main {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      new SparkConf().setMaster("local").setAppName("FakeProjectName")
    )
  }
}
...then I add spark-assembly-1.3.0-hadoop2.4.0.jar to the build path in Eclipse from
Project > Properties... > Java Build Path :
...and this warning appears in the Eclipse console:
More than one scala library found in the build path
(C:/Program Files/Eclipse/Indigo 3.7.2/configuration/org.eclipse.osgi/bundles/246/1/.cp/lib/scala-library.jar,
C:/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar).
This is not an optimal configuration, try to limit to one Scala library in the build path.
FakeProjectName Unknown Scala Classpath Problem
Then I remove Scala Library [2.10.2] from the build path, and it still works. Except now this warning appears in the Eclipse console:
The version of scala library found in the build path is different from the one provided by scala IDE:
2.10.4. Expected: 2.10.2. Make sure you know what you are doing.
FakeProjectName Unknown Scala Classpath Problem
Is this a non-issue? Either way, how do I fix it?
This is often a non-issue, especially when the version difference is small, but there are no guarantees...
The problem is (as stated in the warning) that your project has two Scala libraries on the class path. One is explicitly configured as part of the project; this is version 2.10.2 and is shipped with the Scala IDE plugins. The other copy has version 2.10.4 and is included in the Spark jar.
One way to fix the problem is to install a different version of Scala IDE that ships with 2.10.4. But this is not ideal. As noted here, Scala IDE requires every project to use the same library version:
http://scala-ide.org/docs/current-user-doc/gettingstarted/index.html#choosing-what-version-to-install
A better solution is to clean up the class path by replacing the Spark jar you are using. The one you have is an assembly jar, which means it includes every dependency used in the build that produced it. If you are using sbt or Maven, then you can remove the assembly jar and simply add Spark 1.3.0 and Hadoop 2.4.0 as dependencies of your project. Every other dependency will be pulled in during your build. If you're not using sbt or Maven yet, then perhaps give sbt a spin - it is really easy to set up a build.sbt file with a couple of library dependencies, and sbt has a degree of support for specifying which library version to use.
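A minimal build.sbt along those lines might look like this (a sketch; Spark 1.3.0 was published for Scala 2.10, and the hadoop-client artifact is an assumption based on the versions above):

// build.sbt
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.hadoop" % "hadoop-client" % "2.4.0"
)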
The easiest solution:
In Eclipse :
1. Project > (right-click) Properties
2. Go to Scala Compiler
3. Click "Use Project Settings"
4. Set "Scala Installation" to a compatible version, generally "Fixed Scala Installation: 2.XX.X (built-in)"
5. Rebuild the project.
There are 2 types of Spark JAR files (just by looking at the name):
- Name includes the word "assembly" and not "core" (has Scala inside)
- Name includes the word "core" and not "assembly" (no Scala inside)
You should include the "core" type in your build path via "Add External Jars"
(the version you need), since Scala IDE already provides a Scala library for you.
Alternatively, you can just take advantage of sbt and add the following
dependency (again, pay attention to the versions you need):
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
Then you should NOT forcefully include any Spark JAR in the build path.
Happy sparking!

Is the usage of Scalariform as an embedded library considered abandoned?

I had been using Scalariform in a project I upgraded to Scala 2.11. In doing so, I discovered that Scalariform does not appear to have an artifact published for 2.11 in any of the usual places. This makes the usual sbt cross-version dependency unhappy.
As 2.11 has been out for a while already, this has me questioning whether the usage of Scalariform as an embedded library should be considered abandoned. Has the community moved on to an alternative I just don't know about?
Scalafmt is an alternative code formatter that is published for 2.11 and can be used as an embedded library. An up-to-date usage example is here: https://olafurpg.github.io/scalafmt/#Standalonelibrary
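For completeness, a minimal sketch of the embedded usage, assuming a scalafmt-core dependency such as "com.geirsson" %% "scalafmt-core" (the group id and the API below reflect the 1.x era and may have moved since):

import org.scalafmt.Scalafmt

object FormatExample extends App {
  val unformatted = "object  Foo  {  def bar(x:Int)=x+1 }"
  // Scalafmt.format returns a Formatted value; .get yields the code or throws on a parse error
  println(Scalafmt.format(unformatted).get)
}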