How to use a lower version of a dependency on conflict in a Maven project - Scala

I am working on a multi-module Maven project whose modules use different Scala versions: a new module, say B, was added that requires a higher Scala version (2.11), while the rest of the project is built against Scala 2.10.
Now I have a use case where I need to call a method from a class in B from one of the lower-version modules, which results in a dependency conflict.
Is there any way to define the version of the dependencies, i.e. to use the lower version to compile?
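(For reference, the usual Maven mechanism for pinning a single version across modules is dependencyManagement in the parent POM. Below is a minimal sketch only; the version shown is illustrative, and note that Scala 2.10 and 2.11 are binary incompatible, so forcing one scala-library version on both modules will not by itself make code compiled against the other version work.)

<!-- parent pom.xml, sketch only: pin the Scala library version used by all modules -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.6</version>   <!-- illustrative; use the version your modules need -->
    </dependency>
  </dependencies>
</dependencyManagement>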

Related

Cross-building Scala libraries

I would like to cross-build some of my Bazel targets to Scala 2.12 and 2.13. As a further point of complexity, I need to be able to express cross-target dependencies (e.g. some 2.13 target may have a Bazel dependency on a 2.12 target).
Note: this isn't a regular library dependency (e.g. with the dependency's 2.12-built JAR showing up on the classpath when compiling the 2.13 JAR), as that would almost surely result in issues due to having two incompatible versions of the Scala standard library on the classpath. Rather, this is just a case where I need the dependency JAR built so I can use it in some integration tests in the 2.13 target.
What I've found online so far...
This issue from rules_scala suggests that it doesn't support baking the Scala version into the target; instead, you have to pick the Scala version globally.
This Databricks post has a cross-building section that is exactly what I think I would like (e.g. one target generated per library per supported Scala version), but the snippets in that post don't seem to be backed by any runnable Bazel code.
This later post by Databricks also hints at a cross_scala_lib rule, but likewise has no accompanying code.

How to match multiple specific scala versions in source directory in sbt

The sbt documentation tells me I can suffix the source directory with the Scala version in order to get sbt to pick up those sources when building with that version.
https://www.scala-sbt.org/1.x/docs/Cross-Build.html#Scala-version+specific+source+directory
It doesn't say how I can specify the suffix so that one directory matches multiple Scala versions. I need one directory to match both 2.12 and 2.13, and one to match 2.11.
I found a few examples of that in libraries like zio or cats. The suffixes looked like -2.12+ or -2.12-2.13. When applied to my build, though, sbt doesn't seem to find these paths, and compilation fails because some classes are not found. If I use separate -2.12 and -2.13 directories it works, but I'd have to duplicate code.
You can try sbt new scala/scala-seed.g8 and rename src/main/scala to src/main/scala-2.12+ and see that it doesn't work, while it works when renamed to src/main/scala-2.13.
Can someone point me to what I'm missing?
I've tried sbt versions 1.3.10 and 1.2.8.
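(For context, the libraries mentioned above, like cats and zio, don't rely on sbt understanding suffixes such as -2.12+ on its own; their builds add the extra source directories explicitly. A minimal sketch of that approach in build.sbt follows; the directory names are whatever you choose.)

// build.sbt -- add version-specific source directories explicitly;
// out of the box sbt only knows about src/main/scala-<binary version>
Compile / unmanagedSourceDirectories ++= {
  val base = (Compile / sourceDirectory).value   // src/main
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, n)) if n >= 12 => Seq(base / "scala-2.12+")   // shared by 2.12 and 2.13
    case Some((2, 11))           => Seq(base / "scala-2.11")
    case _                       => Seq.empty
  }
}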

Understanding eclipse maven dependency hierarchy

I want to understand the dependencies of a multi-module Maven project, and for that I referred to the Eclipse dependency hierarchy view.
I understood it fairly well; however, there are some entries I am not able to understand at all.
Below is the screen shot.
The things which I didn't understand are:
--> managed from 1.0.2 [Compile]
--> managed from 1.0.2 (omitted for conflict with 1.0.0) [Compile]
I searched online but only found scattered bits of information. Can anyone explain what these mean in easy-to-understand terms?
Thanks.
Maven builds a flat classpath from the dependency tree, one each for compiling ([compile]), for testing, and for running.
In a flat classpath, unlike OSGi, a dependency can exist in only one version. In your cropped screenshot, the second level contains, among other things:
kafka-streams 1.0.2 and
kafka-clients 1.0.0.
kafka-streams 1.0.2 requires kafka-clients 1.0.2, which conflicts with kafka-clients 1.0.0. Therefore kafka-clients 1.0.2 is omitted for conflict with 1.0.0, even though version 1.0.2 is required here ("managed from 1.0.2").
In more detail: the classpath used to compile or run a plain Java application is flat: all required libraries are specified globally as an ordered list. It is not possible to use a library in one version for one package and the same library in a different version for another package.
In Maven, dependencies form a tree: each dependency may have its own dependencies. Maven maps this tree of dependencies onto the classpath, an ordered list of libraries. If the same library exists in different versions in the Maven dependency tree, it is not possible to create a flat classpath. This is a conflict.
Such a conflict is resolved by picking one version and omitting all the others. At the place where the picked version is used instead of the required version, (managed from <required but not picked version>) and (omitted for conflict with <picked version to use instead>) are displayed.
In addition, Maven can create different classpaths for compiling, testing, or running a Java application via so-called scopes. The [compile] scope is the default scope for using a library in all tasks: compiling, testing, and running.
Make sure that the versions specified in the pom.xml file are compatible with each other (which is not yet the case in your screenshot): you have to upgrade kafka-clients from 1.0.0 to 1.0.2 (or downgrade the other libraries).
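As a concrete (hedged) illustration of that fix, declaring the newer client version explicitly in the pom.xml makes Maven pick the version that kafka-streams 1.0.2 expects; the coordinates below are the standard Kafka ones, and the surrounding POM is assumed:

<!-- pom.xml sketch: pin kafka-clients to the version required by kafka-streams 1.0.2 -->
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>1.0.2</version>
</dependency>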

Scala IDE and Apache Spark -- different scala library version found in the build path

I have some main object:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
object Main {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      new SparkConf().setMaster("local").setAppName("FakeProjectName")
    )
  }
}
...then I add spark-assembly-1.3.0-hadoop2.4.0.jar to the build path in Eclipse from
Project > Properties... > Java Build Path :
...and this warning appears in the Eclipse console:
More than one scala library found in the build path
(C:/Program Files/Eclipse/Indigo 3.7.2/configuration/org.eclipse.osgi/bundles/246/1/.cp/lib/scala-library.jar,
C:/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar).
This is not an optimal configuration, try to limit to one Scala library in the build path.
FakeProjectName Unknown Scala Classpath Problem
Then I remove Scala Library [2.10.2] from the build path, and it still works. Except now this warning appears in the Eclipse console:
The version of scala library found in the build path is different from the one provided by scala IDE:
2.10.4. Expected: 2.10.2. Make sure you know what you are doing.
FakeProjectName Unknown Scala Classpath Problem
Is this a non-issue? Either way, how do I fix it?
This is often a non-issue, especially when the version difference is small, but there are no guarantees...
The problem is (as stated in the warning) that your project has two Scala libraries on the class path. One is explicitly configured as part of the project; this is version 2.10.2 and is shipped with the Scala IDE plugins. The other copy has version 2.10.4 and is included in the Spark jar.
One way to fix the problem is to install a different version of Scala IDE that ships with Scala 2.10.4. But this is not ideal. As noted here, Scala IDE requires every project to use the same library version:
http://scala-ide.org/docs/current-user-doc/gettingstarted/index.html#choosing-what-version-to-install
A better solution is to clean up the class path by replacing the Spark jar you are using. The one you have is an assembly jar, which means it includes every dependency used in the build that produced it. If you are using sbt or Maven, then you can remove the assembly jar and simply add Spark 1.3.0 and Hadoop 2.4.0 as dependencies of your project. Every other dependency will be pulled in during your build. If you're not using sbt or Maven yet, then perhaps give sbt a spin - it is really easy to set up a build.sbt file with a couple of library dependencies, and sbt has a degree of support for specifying which library version to use.
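For example, a minimal build.sbt along those lines might look like the following (a sketch only, with the versions taken from the question; adjust them to your setup):

// build.sbt -- pull Spark (and, if needed, Hadoop) in as managed dependencies
// instead of adding an assembly jar to the Eclipse build path
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "1.3.0",
  "org.apache.hadoop"  % "hadoop-client" % "2.4.0"
)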
The easiest solution:
In Eclipse :
1. Project > (right-click) Properties
2. Go to Scala Compiler
3. Click "Use Project Settings"
4. Set Scala Installation to a compatible version, generally "Fixed Scala Installation: 2.XX.X (built-in)"
5. Rebuild the project.
There are 2 types of Spark JAR files (just by looking at the Name):
- Name includes the word "assembly" and not "core" (has Scala inside)
- Name includes the word "core" and not "assembly" (no Scala inside).
You should include the "core" type in your Build Path via “Add External Jars”
(the version you need), since the Scala IDE already provides a Scala library for you.
Alternatively, you can just take advantage of sbt and add the following
dependency (again, pay attention to the versions you need):
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
In that case you should NOT forcefully include any Spark JAR in the Build Path.
Happy sparking:
Zar

Cross build scala using gradle

I've got a Scala project that is built with Gradle. The Scala code is source-compatible with Scala 2.9 and 2.10, and I'd like to cross-build it for both major Scala versions. Does Gradle support this?
For example, my gradle project will have a single module:
build.gradle
src/main/scala/foo.scala
and I'd like the resulting published jars to be:
org-foo_2.9-0.1.jar (with dependency on scala-library 2.9)
org-foo_2.10-0.1.jar (with dependency on scala-library 2.10)
Gradle's Scala plugin doesn't currently support cross-building. It's possible to implement it yourself, though. In my Polyglot Gradle talk, I presented a proof-of-concept.
I am searching for a good example of this. The Gradle manual doesn't mention how to specify the Scala version, but looking at the source code of the Scala plugin, it seems to be inferred from the Scala library jar that you specify.
The best example I could find is the Apache Kafka build system. It specifies the Scala version and then uses some additional logic to resolve the correct versions of the Scala libraries. It also uses some logic to attach the correct label to the jars it builds, so that they correspond to the right Scala version.
This feels like a lot of work, and something that the build system should do for you, as sbt does.
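For reference, a rough sketch of that style of Gradle build, in which the Scala version is passed in as a property and stamped onto the artifact name (the property and project names here are illustrative, not Kafka's actual build code):

// build.gradle -- sketch: parameterize the Scala version and run the build once per version,
// e.g.  gradle -PscalaVersion=2.9.3 jar   and   gradle -PscalaVersion=2.10.4 jar
apply plugin: 'scala'

def scalaVersion = project.hasProperty('scalaVersion') ? project.scalaVersion : '2.10.4'
def scalaBinary  = scalaVersion.startsWith('2.9') ? '2.9' : '2.10'

// label the jar with the Scala version it was built against, e.g. org-foo_2.10-0.1.jar
archivesBaseName = "org-foo_${scalaBinary}"

dependencies {
    compile "org.scala-lang:scala-library:${scalaVersion}"
}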