Could not find or load main class Error when changing scala compiler version - eclipse

The default version of the Scala compiler in Scala IDE for Eclipse is 2.12. It runs fine for hello world.
However, when I change the Scala compiler version to 2.11, as follows:
then it shows: Error: Could not find or load main class
When I change back to 2.12, it works again.
My question:
My Scala IDE for Eclipse is the latest version. Why does this happen, and what should I do? I have to change to Scala 2.11 to load the apache-spark 2.4 jar files; otherwise it shows errors.
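For reference, here is a minimal sketch of the kind of hello-world entry point involved (the object name and message are illustrative, not taken from the original project); it also prints the Scala library version actually on the classpath, which can help confirm which Scala installation the run configuration picks up after switching compiler versions:

object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
    // Prints e.g. "version 2.11.12" - the Scala library this class actually runs against
    println(scala.util.Properties.versionString)
  }
}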

Related

How to enable Partial Unification in Spark REPL with Scala 2.11.8?

I have Scala code written in Scala 2.11.12 using the partial-unification compiler option, which I would like to run in a Spark 2.2.2 REPL.
With a Spark version compiled against Scala 2.11.12 (i.e. 2.3+), this is possible in the Spark REPL via :settings -Ypartial-unification, and the code executes.
I want to run this on Spark 2.2.2, which is compiled against Scala 2.11.8.
To do this, I have downloaded the jar with the partial unification compiler plugin (source from: https://github.com/milessabin/si2712fix-plugin), which backports this setting.
I've experimented with a plain Scala 2.11.8 REPL (just adding the jar to the classpath, which seems too rudimentary) and haven't managed to get it working there before trying to add it to Spark. Does anyone know how to do this, or is adding a compiler setting to a REPL via a jar simply not possible?
Any other advice appreciated!
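For context on what the setting changes, here is a minimal sketch of the SI-2712 situation that -Ypartial-unification addresses (the names are illustrative): without the flag, scalac cannot unify a two-parameter type constructor such as Int => String against a type parameter of shape F[_].

// In a 2.11 REPL this call fails to infer F without -Ypartial-unification
// (or the si2712fix plugin); with partial unification F is inferred as Int => ?.
def firstArg[F[_], A](fa: F[A]): F[A] = fa

val f: Int => String = _.toString
firstArg(f)

Note that in a plain scalac invocation or Scala REPL, compiler plugins are normally loaded with the -Xplugin:&lt;path-to-plugin-jar&gt; option rather than by putting the jar on the regular classpath, which may be why adding it to the classpath alone had no effect.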

Why does a Maven build work fine, but adding the Spark jars as external jars gives the compile error "object apache is not a member of package org"?

In Eclipse, while setting up Spark, even after adding all the jars under spark-2.4.3-bin-hadoop2.7/jars/ as external jars to the build path,
the compiler complains: "object apache is not a member of package org"
Yes, building the dependencies via Maven or SBT would fix it. A related question has been asked:
scalac compile yields "object apache is not a member of package org"
But the question here is: why does the traditional way fail like this?
If we refer to Scala/Spark version compatibility, we can see a similar issue. The problem is that Scala is NOT backward compatible, so each Spark module is compiled against a specific Scala library. When we run from Eclipse, the Eclipse Scala environment may not be compatible with the particular Scala version against which our Spark libraries were built.
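As an illustration of why the build-tool route avoids the mismatch, here is a minimal sbt sketch (the versions shown are examples, not a recommendation): the %% operator appends the project's Scala binary version to the artifact name, so the resolved Spark jars always match the compiler in use.

// build.sbt (sketch)
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // %% resolves these to spark-core_2.11 / spark-sql_2.11, matching scalaVersion above
  "org.apache.spark" %% "spark-core" % "2.4.3",
  "org.apache.spark" %% "spark-sql"  % "2.4.3"
)

Adding the pre-built spark-2.4.3-bin-hadoop2.7 jars by hand performs no such check: Eclipse compiles against whatever Scala installation the project is configured with, and a mismatch only surfaces as errors like the one above.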

Cross-compiled with an incompatible version

I am using eclipse with m2eclipse-scala plugin. Currently, I get the following error message:
exampleA_2.10-2.0.1.jar of module build path is cross-compiled with an incompatible version of Scala (2.10.0). In case this report is mistaken, this check can be disabled in the compiler preference page
It looks like the versions of extracted Scala and Scala IDE match. I just wanted to make sure that this is a "false-negative" as described here and can be safely turned off.
As @The Archetypal Paul suggested, it was because I was using the wrong Scala library.
If you are using Scala 2.11 (check at About Scala IDE -> installation details), you can downgrade by following the instructions here. It's a lot easier than uninstalling and re-installing Scala IDE, as other Stack Overflow posts recommend.
I also faced the same issue.
I was trying to use the casbah jar in Scala to integrate with MongoDB.
After analyzing the problem I found that I was using casbah version 2.9.1 while my Scala version was 2.11.8.
The root cause of such an error is that the jar is compiled against Scala 2.9.0 while you are using Scala 2.11.8.
So, to resolve it, I used the jar that is compiled against Scala 2.11:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>casbah-core_2.11</artifactId>
    <version>3.1.1</version>
</dependency>
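The _2.11 suffix in the artifact id follows the standard Scala cross-version naming convention and indicates the Scala binary version the jar was compiled against. As in the sbt sketch above, the %% operator would select that suffix automatically (again just a sketch):

libraryDependencies += "org.mongodb" %% "casbah-core" % "3.1.1"  // resolves to casbah-core_2.11 for a 2.11.x project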
I was facing a similar issue in the Eclipse IDE, where I had built a Spark Scala project with Maven. The Scala version was set to 2.11.
Later, I upgraded the Scala IDE plugin in Eclipse, after which my project flagged the error below:
exampleA_2.10-2.0.1.jar of module build path is cross-compiled with an incompatible version of Scala (2.10.0). In case this report is mistaken, this check can be disabled in the compiler preference page
Right-click the project folder > Scala > Set Scala version. Here my Scala version was displayed as 2.10. I selected 2.11, which removed all the error messages.

.jar files cross-compiled with an incompatible version of Scala (2.10)

I am building my first Scala/Play application, and after I create and import an Eclipse project from the Play shell, I get 17 errors.
akka-actor_2.10.jar is cross compiled with an incompatible version of Scala (2.10)
akka-slf4j_2.10.jar is cross compiled with an incompatible version of Scala (2.10)
anorm_2.10.jar ...
play_2.10.jar ...
play_iterates_2.10.jar ...
The list goes on to include the Scala jars, scalaz jars, etc...
I am using:
Eclipse 4.2.2
Scala ide 3.0.0.nightly-2_09
Scala 2.10
Play 2.1
Has anyone experienced the same thing?
Is it possible you are using a Scala IDE build for Scala 2.9.x?
Make sure you use the one for 2.10.x:
http://download.scala-ide.org/sdk/e38/scala210/dev/site/

"Library 'scala-2.10.0-RC1' not used" warning thrown by IntelliJ when integrating a Play 2 app

I've just generated a fresh Play! application, version 2.1-RC1.
This one includes two Scala compiler/library pairs:
Scala 2.9.2
Scala 2.10.0-RC1
The whole thing compiles fine within IntelliJ IDEA 12, but a warning occurs, as the image shows:
It would seem that another compiler is used instead of 2.10.0-RC1.
However, my Scala facet is configured like this:
What might be the cause of the warning?
I should point out that I also have a Scala environment variable (used for Scala shell commands) configured to point to scala-2.10.0-RC2, but I would imagine that IntelliJ relies on the library the user specifies in the Scala facet.
You can remove that .jar from the libraries; it's not used because it's redundantly generated by the IntelliJ SBT plugin.