Can compile with 2.11.2, but not 2.11.3 - scala

build.sbt file:
name := "Bag"
version := "0.7.252"
scalaVersion := "2.11.3"
libraryDependencies ++= Seq(
"org.scalatest" % "scalatest_2.11" % "2.1.3" % "test",
"org.scala-lang.modules" %% "scala-swing" % "1.0.1",
"org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.2"
)
The project compiles properly with 2.11.2, but on 2.11.3 I get this.

Scala 2.11.3 is not yet officially released, although, as you have witnessed, the artifact was already pushed to Maven Central.
I think that version will be "pulled" because of a binary incompatibility bug introduced in collections. See SI-8899 and SI-8900. Stick to 2.11.2 until a new version (2.11.4?) is announced.
I'm not sure I understand what is going on in your case from reading the pastebin, but I suggest you open another ticket unless it clearly stems from either of these two issues.

Scala 2.11.3 is not officially released and not recommended. If you want to read the whole story, see this.
Scala 2.11.4 should be used instead.
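In practice that just means pinning the build to a released Scala version; a minimal build.sbt sketch, reusing the dependencies from the question (2.11.4 assumed to be the next released version, per the answer above):
name := "Bag"
version := "0.7.252"
scalaVersion := "2.11.4" // or stay on "2.11.2"; avoid the unreleased 2.11.3
libraryDependencies ++= Seq(
  "org.scalatest" % "scalatest_2.11" % "2.1.3" % "test",
  "org.scala-lang.modules" %% "scala-swing" % "1.0.1",
  "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.2"
)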

Related

IntelliJ: scalac bad symbolic reference

In my build.sbt file I have this in my project.
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-hive_2.10" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.10" % "1.3.1"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.3.1"
I just let it download all the libraries automatically. I'm adding graphx, spark-core, and the Scala SDK to one of my project modules, but when I try to compile I get:
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term hadoop
in package org.apache which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term io
in value org.apache.hadoop which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term compress
in value org.apache.io which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
The weird thing is that if I download graphx/mllib directly from the Maven repositories, it seems to compile. Any ideas?
Another possible source of error is an incorrect scalac version setting in the project. Right-click the project -> Open Module Settings -> Global Libraries, and change/add the scala-sdk version appropriate to your project.
Please add the hadoop dependency. Something like
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.1"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.1"
(note the single %: the Hadoop artifacts are plain Java libraries, so they carry no Scala-version suffix).
You may need to add other hadoop modules depending on your app.
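For reference, a combined build.sbt along those lines might look like the sketch below; the Hadoop version 2.7.1 is only illustrative and should match the cluster you run against:
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  // Spark modules from the question; %% appends the _2.10 suffix automatically
  "org.apache.spark" %% "spark-core" % "1.3.1",
  "org.apache.spark" %% "spark-graphx" % "1.3.1",
  "org.apache.spark" %% "spark-mllib" % "1.3.1",
  // Hadoop artifacts are plain Java libraries (single %)
  "org.apache.hadoop" % "hadoop-common" % "2.7.1",
  "org.apache.hadoop" % "hadoop-hdfs" % "2.7.1"
)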

IntelliJ Idea 14: cannot resolve symbol spark

I added a Spark dependency, which worked in my first project. But when I try to make a new project with Spark, sbt does not import the external jars of org.apache.spark. Therefore IntelliJ IDEA gives the error that it "cannot resolve symbol".
I already tried making a new project from scratch and using auto-import, but neither works. When I try to compile I get the message "object apache is not a member of package org". My build.sbt looks like this:
name := "hello"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-parent_2.10" % "1.4.1"
I have the impression that there might be something wrong with my SBT settings, although it already worked one time. And except for the external libraries everything is the same...
I also tried to import the pom.xml file of my spark dependency but that also doesn't work.
Thank you in advance!
This worked for me:
name := "ProjectName"
version := "0.1"
scalaVersion := "2.11.11"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.2.0",
"org.apache.spark" % "spark-sql_2.11" % "2.2.0",
"org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
I use
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
in my build.sbt and it works for me.
I had a similar problem. It seems the reason was that the build.sbt file specified the wrong Scala version.
If you run spark-shell, it prints the Scala version Spark was built with at startup, e.g.
Using Scala version 2.11.8
Then I edited the line in the build.sbt file to point to that version and it worked.
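That is, a one-line change in build.sbt, assuming spark-shell reported 2.11.8 as above:
scalaVersion := "2.11.8"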
Currently spark-cassandra-connector is compatible with Scala 2.10 and 2.11.
It worked for me when I updated the scala version of my project like below:
ThisBuild / scalaVersion := "2.11.12"
and I updated my dependency like:
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0",
If you use "%%", sbt will add your project’s binary Scala version to the artifact name.
From sbt run:
sbt> reload
sbt> compile
Your library dependency conflicts with the Scala version you're using; you need to use 2.11 for it to work. The correct dependency would be:
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1"
Note that you need to change spark-parent to spark-core.
name := "SparkLearning"
version := "0.1"
scalaVersion := "2.12.3"
// additional libraries
libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"

Can't install Scaladoc with SBT and Intellij

I am new to Scala and am currently trying to set up IntelliJ IDEA 13.1 with the Scala plugin. It has support for SBT. I have simply followed the basic tutorial for creating a new SBT project here: http://confluence.jetbrains.com/display/IntelliJIDEA/Getting+Started+with+SBT
Currently my build.sbt file is:
name := "scalasandpit"
version := "1.0"
scalaVersion := "2.10"
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "2.1.0" % "test"
autoAPIMappings := true
This pulls down various jar binaries, but no sources and no javadoc. I wondered if there is a way to have both sources and javadoc work with IntelliJ and SBT. I think I'm missing something.
There seem to be two issues: getting sbt to pull down sources and docs, and then getting IDEA to show them to you. To solve the former, see the sbt documentation -- about halfway down there's a section called "Download Sources" which tells you what to add to your build.sbt:
libraryDependencies +=
"org.scalatest" % "scalatest_2.10" % "2.1.0" % "test" withSources() withJavadoc()

sbt disagreeing with library about scala version

I'm trying to use scala-time with scala 2.10, and have found that it doesn't work with sbt correctly.
Given something like
scalaVersion := "2.10.2"
libraryDependencies += "org.scalaj" %% "scalaj-time" % "0.7"
sbt will happily try to resolve http://repo1.maven.org/maven2/org/scalaj/scalaj-time_2.10/0.7/scalaj-time_2.10-0.7.pom.
Unfortunately, scalaj-time is published with full Scala versions, as can be seen at http://central.maven.org/maven2/org/scalaj/.
It can be resolved with
libraryDependencies += "org.scalaj" % "scalaj-time_2.10.2" % "0.7"
but I want to know whether this is a change in sbt behaviour, a bug in scala-time's build, or whether there's a way to configure sbt to use the 3-part version instead of the 2-part one.
As Seth noted, jorgeortiz85/scala-time was likely published with an sbt version that predates the binary cross-versioning convention introduced in sbt 0.12. You could do:
libraryDependencies += "org.scalaj" % "scalaj-time_2.10.2" % "0.7"
or
libraryDependencies += "org.scalaj" % "scalaj-time" % "0.7" cross CrossVersion.full
But then you'd be stuck with using 2.10.2 while Scala 2.10.4 is already out.
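To make the difference concrete, here is roughly what each form asks sbt to resolve (a sketch; the resolved artifact names are in the comments):
// default binary cross version: %% appends _2.10, which scalaj-time never published
libraryDependencies += "org.scalaj" %% "scalaj-time" % "0.7"
// full cross version: appends the complete _2.10.2 suffix that the library actually uses
libraryDependencies += "org.scalaj" % "scalaj-time" % "0.7" cross CrossVersion.full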
There's a similar Joda-Time wrapper named nscala-time/nscala-time that seems more actively maintained. It was last updated 3 days ago and already supports Scala 2.11.0, so that could also be an option for you.
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.0.0"

Casbah Scala Runtime Error

I have a Play framework based webapp which has the following defined in its build.sbt file:
....
version := "1.0-SNAPSHOT"
resolvers += "Sonatype Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/"
resolvers += "Sonatype Releases" at "https://oss.sonatype.org/content/groups/scala-tools"
resolvers += "Novus Releases" at "http://repo.novus.com/releases/"
libraryDependencies ++= Seq(
jdbc,
anorm,
cache,
"com.mongodb.casbah" % "casbah_2.9.0" % "2.2.0-SNAPSHOT",
"com.novus" %% "salat-core" % "1.9.2",
"org.scalatest" % "scalatest_2.10" % "2.0" % "test",
"com.typesafe" % "config" % "1.0.2"
)
....
The Scala version is 2.10.3 and when I try to run a unit test, I run into the following error:
A needed class was not found. This could be due to an error in your runpath. Missing class: scala/reflect/ClassManifest
java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
at com.mongodb.casbah.Imports$.<init>(Implicits.scala:113)
at com.mongodb.casbah.Imports$.<clinit>(Implicits.scala)
at com.mongodb.casbah.MongoConnection.apply(MongoConnection.scala:177)
........
........
I'm completely clueless as to why this is happening. Which additional library am I missing?
You can't mix major Scala versions (the casbah artifact is compiled against Scala 2.9.*, whereas scalatest is for 2.10.*, and you're saying you use 2.10 in IntelliJ).
The error says that the compiler can't find a class that was removed from the Scala library after the 2.9.* days; the solution is to pick artifacts built for the proper Scala version (any 2.10.* build will fit).
Salat author here. The solution is to fix your deps. The latest stable release of Salat is 1.9.4, and it depends on Casbah 2.6.4 -- both are available for Scala 2.9 and 2.10.
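A minimal sketch of the corrected dependency block, assuming Scala 2.10.3 as in the question; note that Casbah 2.6.x is published under the org.mongodb group id, so double-check the coordinates on Maven Central:
libraryDependencies ++= Seq(
  jdbc,
  anorm,
  cache,
  "org.mongodb" %% "casbah" % "2.6.4", // %% picks the _2.10 artifact
  "com.novus" %% "salat-core" % "1.9.4",
  "org.scalatest" % "scalatest_2.10" % "2.0" % "test",
  "com.typesafe" % "config" % "1.0.2"
)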
Instead of using the casbah driver directly with Play, I have fallen back to using a plugin called play-salat (https://github.com/leon/play-salat).
The one I use at the moment is "se.radley" % "play-plugins-salat_2.10" % "1.3.0", and it works well in Play 2.2.