I am currently using SonarQube 8.9 LTS
I have tried to install sonar-scala (https://github.com/sonar-scala/sonar-scala) in the extensions/plugins folder, but I am facing an error like the one below:
java.lang.IllegalStateException: There are two languages declared with the same key 'scala' declared by the plugins 'scala' and 'sonarscala'. Please uninstall one of the conflicting plugins.
Could someone who has worked with sonar-scala confirm whether there is any way to keep both the default Scala rules and sonar-scala working in parallel? Uninstalling the existing Sonar way Scala rules is a risk for me.
There are multiple binary-incompatible Scala 2 versions; however, the documentation says installation is done either via an IDE or via sbt.
DOWNLOAD SCALA 2
Then, install Scala:...either by installing an IDE such as IntelliJ, or sbt, Scala's build tool.
Spark 3 needs Scala 2.12.
Spark 3.1.2 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).
Then how can we make sure the Scala version is 2.12 if we install sbt?
Or is the documentation inaccurate, and should it instead say "to use a specific version of Scala, you need to download that specific Scala version on your own"?
Updates
As per the answer by mario-galic, ONE-CLICK INSTALL FOR SCALA says:
Installing Scala has always been a task more challenging than necessary, with the potential to drive away beginners. Should I install Scala itself? sbt? Some other build tools? What about a better REPL like Ammonite? Oh and before all that I need to install Java?
To solve this problem, the Scala Center contracted Alexandre Archambault in January 2020 to add a one-click install of Scala through coursier. For example, on Linux, all we now need is:
$ curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
The Scala version is specified in the build.sbt file so SBT will download the appropriate version of Scala as necessary.
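For example, a minimal build.sbt pinning the Scala version might look like the sketch below (the project name and exact patch version are just illustrative placeholders):
// build.sbt — minimal sketch; name and version are illustrative
name := "my-app"
scalaVersion := "2.12.14"
// sbt fetches the matching scala-library and scala-compiler into the coursier cache as needed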
I personally use SDKMAN! to install Java and then SBT.
The key concept to understand is the difference between a system-wide installation and a project-specific version. A system-wide installation ends up somewhere on the PATH, like
/usr/local/bin/scala
and can be installed in various ways; personally, I recommend the coursier one-click install for Scala:
curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
The project-specific version is specified by the scalaVersion sbt setting, which downloads Scala to the coursier cache location. To see the Scala version and location used by a particular project, try show scalaInstance; inspect scalaInstance describes the task:
inspect scalaInstance
[info] Task: sbt.internal.inc.ScalaInstance
[info] Description:
[info] Defines the Scala instance to use for compilation, running, and testing.
Scala should be binary compatible within a minor version, so Spark 3 or any other software built against one Scala 2.12.x version should work with any other Scala 2.12.x version (versions follow major.minor.patch). Note that binary compatibility is not guaranteed for internal compiler APIs, so, for example, when publishing compiler plugins the best practice is to publish against the full, specific Scala version. Notice how the kind-projector compiler plugin is published against the full Scala version 2.13.6:
https://repo1.maven.org/maven2/org/typelevel/kind-projector_2.13.6/
whilst the application-level library cats-core is published against any Scala 2.13.x version:
https://repo1.maven.org/maven2/org/typelevel/cats-core_2.13/
Similarly, Spark is published against any Scala 2.12.x version:
https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/
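In an sbt build this distinction shows up roughly as in the sketch below, where the artifact versions are assumptions you should check against Maven Central:
// build.sbt fragment — versions are illustrative
scalaVersion := "2.13.6"
// application-level library: %% appends the binary suffix, resolving cats-core_2.13
libraryDependencies += "org.typelevel" %% "cats-core" % "2.6.1"
// compiler plugin: CrossVersion.full resolves the full-version artifact kind-projector_2.13.6
addCompilerPlugin("org.typelevel" % "kind-projector" % "0.13.2" cross CrossVersion.full)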
Regarding system-wide installation, one trick I use for quickly switching versions is to put scala-runners on the PATH; then different versions can be launched via the --scala-version argument:
scala --scala-version 2.12.14
Using coursier or scala-runners you can even switch the JDK quickly via -C--jvm, for example:
scala --scala-version 2.12.14 -C--jvm=11
For a project, there should be no need to manually download a specific version of Scala. sbt, either directly or indirectly via an IDE, will download all the dependencies behind the scenes for you, so the only thing to specify is the sbt setting scalaVersion.
Using Python as an analogy to Scala, and Pipenv as an analogy to sbt, python_version in a Pipfile is similar to scalaVersion in build.sbt. After executing pipenv shell and pipenv install you end up with a project-specific shell environment with a project-specific Python version and dependencies. sbt similarly downloads a project-specific Scala version and dependencies based on build.sbt, although it has no need for lock files or for modifying your shell environment.
In Eclipse, while setting up Spark, even after adding the external jars from spark-2.4.3-bin-hadoop2.7/jars/ (all of them) to the build path,
the compiler complains: "object apache is not a member of package org".
Yes, building the dependencies via Maven or sbt would fix it. A similar question has been asked:
scalac compile yields "object apache is not a member of package org"
But the question here is: why does the traditional way fail like this?
If we refer to Scala/Spark version compatibility, we can see a similar issue. The problem is that Scala is NOT backward compatible, hence each Spark module is compiled against a specific Scala library. But when we run from Eclipse, the Eclipse Scala environment may not be compatible with the particular Scala version against which our Spark libraries were built.
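As a hedged sketch of the build-tool route: pin scalaVersion to the Scala version your Spark distribution was built with and let sbt resolve the matching artifacts (the assumption below is that the prebuilt spark-2.4.3-bin-hadoop2.7 package uses Scala 2.11; verify this for your download):
// build.sbt — illustrative sketch; versions are assumptions
scalaVersion := "2.11.12"
// %% picks spark-core_2.11, so the compiler and the Spark jars agree on the Scala binary version
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3" % "provided"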
I have used the IntelliJ Scala plugin before, in 13 and 13.1. I upgraded to 14, and it no longer works for my sbt project.
For all Scala standard lib stuff, I see errors like "Cannot find symbol scala.Option".
The answer at "scala project, compiler error - Cannot resolve symbol List?" says I need to have the Scala facet for my module. I looked in Facets, and Scala wasn't an option.
I've removed and reinstalled IntelliJ, the Scala plugin, my settings, and the project files multiple times, but it still happens.
How do I fix this?
The new Scala plugin for IntelliJ 14 removed the facet and replaced it with a Scala SDK library; see the blog.
For an sbt project, I guess the best bet is to re-create your project:
File -> Open -> select your project's build.sbt in the popup -> delete the existing project and import
I had a similar issue when a Java module called a Scala object. The issue came from the wrong setup of Source Folders: the Scala source was in src/main/scala/..., but in the Project Structure the Source Folder was set up as src by default. When I changed it to src/main/scala, the Java module could find the Scala object correctly.
I fixed this by using the nightly builds of the Scala plugin.
The fixes have now been incorporated into the stable versions.
In my case I just had to reload the IDE...
I'm trying to develop on the Scala compiler project with the help of ScalaIDE. I followed this guide to set up the development environment. When I now try to build the mentioned projects, the reflect project won't get built. Instead, I get the following error via the console output:
uncaught exception during compilation: scala.reflect.internal.MissingRequirementError reflect Unknown Scala Problem
When I run the project's provided Ant script from the console, everything seems to work fine.
Does anyone know if I'm missing a hidden compiler flag, dependency or something like this?
Thanks!
With the IDE for Scala 2.10 you can only build the 2.10.x branch of the Scala compiler. If you want to work on master, you need to install a 2.11-based version of the IDE. We don't publicise IDE for 2.11 nightlies yet, but they are available at:
http://download.scala-ide.org/nightly/scala-ide-master-2.11.0-SNAPSHOT/
I've just generated a fresh Play! application, version 2.1-RC1.
It includes two Scala compiler/library pairs:
Scala 2.9.2
Scala 2.10.0-RC1
The whole thing compiles fine within IntelliJ IDEA 12, but a warning occurs, as the image shows:
It would seem that another compiler is used instead of 2.10.0-RC1.
However, my Scala facet is configured like this:
What might be the cause of the warning?
I should add that I also have a Scala environment variable (used for Scala shell commands) configured to point to scala-2.10.0-RC2, but I assume IntelliJ is based on the library the user indicates in the Scala facet.
You can remove that .jar from the libraries; it's not used because it's redundantly generated by the IntelliJ sbt plugin.