Wrong scalac compiler version invoked by SBT - scala

I'm using Fedora Linux and sbt. I think I've tried everything, but I can't persuade sbt to use Scala 2.12.2 to compile my project.
When I compile the same project with:
Gradle,
IntelliJ IDEA,
SBT on a different Linux distro,
everything works. I just can't get it to work on Fedora Linux and sbt.
Here's my build.sbt:
scalaVersion := "2.12.2"
scalaVersion in ThisBuild := "2.12.2"
crossScalaVersions := Seq(scalaVersion.value)
scalacOptions := Seq("-unchecked", "-deprecation", "-feature", "-language:postfixOps")
libraryDependencies ++= Seq("org.xerial" % "sqlite-jdbc" % "3.18.0",
"org.apache.httpcomponents" % "httpclient" % "4.5.3",
"commons-codec" % "commons-codec" % "1.10",
"commons-cli" % "commons-cli" % "1.4",
"org.hjson" % "hjson" % "1.0.0",
"log4j" % "log4j" % "1.2.17",
"org.zeromq" % "jeromq" % "0.4.0",
"com.nimbusds" % "nimbus-jose-jwt" % "4.23",
"ws.wamp.jawampa" % "jawampa-core" % "0.5.0",
"ws.wamp.jawampa" % "jawampa-netty" % "0.5.0",
"org.glassfish.tyrus" % "tyrus-websocket-core" % "1.2.1",
"org.glassfish.tyrus.bundles" % "tyrus-standalone-client" % "1.13.1",
"org.scalactic" %% "scalactic" % "3.0.1",
"org.scalatest" %% "scalatest" % "3.0.1" % "test",
"org.postgresql" % "postgresql" % "42.1.1")
// https://mvnrepository.com/artifact/org.apache.commons/commons-compress
libraryDependencies += "org.apache.commons" % "commons-compress" % "1.14"
// https://mvnrepository.com/artifact/org.tukaani/xz
libraryDependencies += "org.tukaani" % "xz" % "1.6"
// https://mvnrepository.com/artifact/net.sourceforge.htmlunit/htmlunit
libraryDependencies += "net.sourceforge.htmlunit" % "htmlunit" % "2.27"
Here's my project/build.properties:
sbt.version=0.13.15
scala.version=2.12.2
build.scala.version=2.12.2
def.scala.versions=2.12.2
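For reference, the scala.version-style keys date back to sbt 0.7; sbt 0.13 reads only the sbt.version key from project/build.properties and ignores the rest, so a minimal file for this build would be:

```properties
sbt.version=0.13.15
```
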
Every time I start sbt and try to compile, I get the errors below (I've changed some strings to xxx, sorry). Note the compiler version reported near the end: 2.10.4, even though the build asks for 2.12.2:
[info] Compiling 76 Scala sources to /home/antek/dev/scala/xxx/target/scala-2.12/classes...
[error] /home/antek/dev/scala/xxx/src/main/scala/api/xxx.scala:168: macros cannot be partially applied
[error] httpPost(transApi, args + ("xxx" → f"$currentTime%d"), "API-Key", "API-Hash")
[error] ^
[error] /home/antek/dev/scala/xxx/src/main/scala/api/xxx.scala:39: macros cannot be partially applied
[error] httpPost(transApi, args + ("xxx" → f"$currentTime%d"), "API-Key", "API-Hash")
[error] ^
[error] /home/antek/dev/scala/xxx/src/main/scala/api/HttpSignOperation.scala:51: macros cannot be partially applied
[error] Log.put(f"HTTP GET returned status $status%d")
[error] ^
[error] /home/antek/dev/scala/xxx/src/main/scala/api/HttpSignOperation.scala:82: macros cannot be partially applied
[error] Log.put(f"HTTP POST returned status $status%d")
[error] ^
[error] /home/antek/dev/scala/xxx/src/main/scala/api/JWSOperation.scala:38: macros cannot be partially applied
[error] Log.put(f"HTTP POST returned status $status%d")
[error] ^
[error]
[error] while compiling: /home/antek/dev/scala/xxx/src/main/scala/api/Order.scala
[error] during phase: typer
[error] library version: version 2.10.4
[error] compiler version: version 2.10.4
[error] reconstructed args: -classpath /home/antek/dev/scala/xxx/target/scala-2.12/classes:/home/antek/.ivy2/cache/org.xer[... cut ...]64/jre/classes:/home/antek/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.12.2.jar

Related

Build.sbt breaks when adding GraphFrames built with Scala 2.11

I'm trying to add GraphFrames to my Scala Spark application, and this was going fine when I added the one based on 2.10. However, as soon as I tried to build it with GraphFrames built with Scala 2.11, it breaks.
The problem seems to be that conflicting Scala versions (2.10 and 2.11) are being used. I'm getting the following error:
[error] Modules were resolved with conflicting cross-version suffixes in {file:/E:/Documents/School/LSDE/hadoopcryptoledger/examples/scala-spark-graphx-bitcointransaction/}root:
[error] org.apache.spark:spark-launcher _2.10, _2.11
[error] org.json4s:json4s-ast _2.10, _2.11
[error] org.apache.spark:spark-network-shuffle _2.10, _2.11
[error] com.twitter:chill _2.10, _2.11
[error] org.json4s:json4s-jackson _2.10, _2.11
[error] com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error] org.json4s:json4s-core _2.10, _2.11
[error] org.apache.spark:spark-unsafe _2.10, _2.11
[error] org.apache.spark:spark-core _2.10, _2.11
[error] org.apache.spark:spark-network-common _2.10, _2.11
However, I can't figure out what causes this. This is my full build.sbt:
import sbt._
import Keys._
import scala._
lazy val root = (project in file("."))
.settings(
name := "example-hcl-spark-scala-graphx-bitcointransaction",
version := "0.1"
)
.configs( IntegrationTest )
.settings( Defaults.itSettings : _*)
scalacOptions += "-target:jvm-1.7"
crossScalaVersions := Seq("2.11.8")
resolvers += Resolver.mavenLocal
fork := true
jacoco.settings
itJacoco.settings
assemblyJarName in assembly := "example-hcl-spark-scala-graphx-bitcointransaction.jar"
libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-fileformat" % "1.0.7" % "compile"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.5.0" % "provided"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0" % "provided"
libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0" % "it" classifier "" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.0" % "it" classifier "" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "2.7.0" % "it"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"
libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"
Can anyone pinpoint which dependency is based on Scala 2.10, causing the build to fail?
I found out what the problem was. Apparently, if you use:
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"
then %% resolves the _2.10 artifact, because this build.sbt never sets scalaVersion (crossScalaVersions alone doesn't set it), and sbt's default Scala version at the time was 2.10. It all worked once I changed the dependencies of spark-core and spark-graphx to:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.2.0" % "provided"
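Roughly, %% just appends to the artifact name a binary-version suffix derived from scalaVersion. An illustrative sketch of that derivation (not sbt's actual implementation, and only valid for Scala 2.10+, where the suffix is major.minor):

```scala
// Illustrative only: how sbt's %% operator derives the artifact name
// for Scala 2.10 and later (pre-2.10, the full version was used).
object CrossVersionSketch {
  def binaryVersion(scalaVersion: String): String =
    scalaVersion.split('.').take(2).mkString(".") // "2.11.8" -> "2.11"

  def crossName(artifact: String, scalaVersion: String): String =
    s"${artifact}_${binaryVersion(scalaVersion)}"

  def main(args: Array[String]): Unit =
    println(crossName("spark-core", "2.11.8")) // prints spark-core_2.11
}
```

This is why hard-coding a suffix like spark-core_2.10 while scalaVersion is 2.11.x produces the "conflicting cross-version suffixes" error above.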

Scala: object profile is not a member of package com.amazonaws.auth

I am having a build problem. Here is my sbt file:
name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"
scalacOptions ++= Seq("-feature")
Here is the full error message I am seeing:
[info] Set current project to SparkPi (in build file:/Users/xxx/prog/yyy/)
[info] Updating {file:/Users/xxx/prog/yyy/}yyy...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 2 Scala sources to /Users/xxx/prog/yyy/target/scala-2.11/classes...
[error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:6: object profile is not a member of package com.amazonaws.auth
[error] import com.amazonaws.auth.profile._
[error] ^
[error] /Users/xxx/prog/yyy/src/main/scala/PiSpark.scala:87: not found: type ProfileCredentialsProvider
[error] val creds = new ProfileCredentialsProvider(profile).getCredentials()
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 14 s, completed Nov 3, 2016 1:43:34 PM
And here are the imports I am trying to use:
import com.amazonaws.services.s3._
import com.amazonaws.auth.profile._
How do I import com.amazonaws.auth.profile.ProfileCredentialsProvider in Scala?
EDIT
I changed the sbt file so the spark-core artifact corresponds to the Scala version; new contents:
name := "SparkPi"
version := "0.2.15"
scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
// old:
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
// https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk
libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.0.002"
scalacOptions ++= Seq("-feature")
You are using scalaVersion := "2.11.8", but your library dependency hard-codes the _2.10 suffix (spark-core_2.10), which mixes Scala binary versions:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.1"
Change 2.10 to 2.11:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.1"
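Equivalently, you can avoid hard-coding the suffix at all and let sbt derive it from scalaVersion (a sketch of the equivalent dependency):

```scala
// Resolves spark-core_2.11 automatically when scalaVersion := "2.11.8":
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.1"
```
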

play-json breaks sbt build

Suddenly, as of today, my project has stopped compiling successfully. Upon further investigation I've found that the reason is the play-json library that I include in my dependencies.
Here's my build.sbt:
name := """project-name"""
version := "1.0"
scalaVersion := "2.10.2"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.2.1",
"com.typesafe.akka" %% "akka-testkit" % "2.2.1",
"org.scalatest" %% "scalatest" % "1.9.1" % "test",
"org.bouncycastle" % "bcprov-jdk16" % "1.46",
"com.sun.mail" % "javax.mail" % "1.5.1",
"com.typesafe.slick" %% "slick" % "2.0.1",
"org.postgresql" % "postgresql" % "9.3-1101-jdbc41",
"org.slf4j" % "slf4j-nop" % "1.6.4",
"com.drewnoakes" % "metadata-extractor" % "2.6.2",
"com.typesafe.play" %% "play-json" % "2.2.2"
)
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
If I try to create a new project in activator with all the lines except "com.typesafe.play" %% "play-json" % "2.2.2", then it compiles successfully. But once I add play-json I get the following error:
[error] References to undefined settings:
[error]
[error] *:playCommonClassloader from echo:run
[error]
[error] docs:managedClasspath from echo:run
[error]
[error] *:playReloaderClassloader from echo:run
[error]
[error] echo:playVersion from echo:echoTracePlayVersion
[error]
[error] *:playRunHooks from echo:playRunHooks
[error] Did you mean echo:playRunHooks ?
[error]
And I keep getting this error even if I remove the play-json line again. Why is that, and what should I do to fix it?

Errors while compiling project migrated to SBT - error while loading package and Assertions

I'm migrating a Scala application that compiles and runs fine with jars manually included in the classpath to an SBT build configuration.
My build.sbt is as follows:
name := "hello"
version := "1.0"
scalaVersion := "2.9.2"
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.6.4"
libraryDependencies += "junit" % "junit" % "4.11"
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "1.9.2"
libraryDependencies += "org.hamcrest" % "hamcrest-all" % "1.3"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.0.13"
libraryDependencies += "com.github.scct" % "scct_2.10" % "0.2.1"
libraryDependencies += "org.scala-lang" % "scala-swing" % "2.9.2"
When I compile it I get the following errors:
Loading /usr/share/sbt/bin/sbt-launch-lib.bash
[info] Set current project to hello (in build file:/home/kevin/gitrepos/go-game-msc/)
> compile
[info] Updating {file:/home/kevin/gitrepos/go-game-msc/}go-game-msc...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 25 Scala sources to /home/kevin/gitrepos/go-game-msc/target/scala-2.9.2/classes...
[error] error while loading package, class file needed by package is missing.
[error] reference value <init>$default$2 of object deprecated refers to nonexisting symbol.
[error] error while loading Assertions, class file needed by Assertions is missing.
[error] reference value <init>$default$2 of object deprecated refers to nonexisting symbol.
[error] two errors found
[error] (compile:compile) Compilation failed
[error] Total time: 21 s, completed 09-Mar-2014 12:07:14
I've tried matching up the dependencies with the jar files I am using:
hamcrest-all-1.3.jar
logback-classic-1.0.13.jar
scalaedit-assembly-0.3.7(1).jar
scalatest_2.9.0-1.9.1.jar
slf4j-simple-1.6.4.jar
hamcrest-core-1.3.jar
logback-core-1.0.13.jar
scalaedit-assembly-0.3.7.jar
scct_2.9.2-0.2-SNAPSHOT.jar
junit-4.11.jar
miglayout-4.0.jar
scalariform.jar
slf4j-api-1.7.5.jar
Please advise.
Never mix Scala binary versions: your build sets scalaVersion := "2.9.2", yet scalatest_2.10 and scct_2.10 are built for Scala 2.10.
Always use %% (instead of % with a hard-coded _2.x suffix):
libraryDependencies += "org.scalatest" %% "scalatest" % "1.9.2"
libraryDependencies += "com.github.scct" %% "scct" % "0.2-SNAPSHOT"
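Concretely, that means keeping every Scala dependency on the binary version matching scalaVersion; note that before Scala 2.10 the cross suffix was the full version (e.g. _2.9.2), which %% also handles. A sketch (artifact availability for 2.9.2 should be checked on the resolver):

```scala
scalaVersion := "2.9.2"

// %% appends the cross suffix matching scalaVersion -- the full
// "_2.9.2" here, since binary compatibility was per-release
// before Scala 2.10:
libraryDependencies += "org.scalatest" %% "scalatest" % "1.9.2"
libraryDependencies += "com.github.scct" %% "scct" % "0.2-SNAPSHOT"
```
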

How to run a Lift-2.4 web app with sbt

I'm trying to run a Lift-2.4 web app following this tutorial http://scala-ide.org/docs/tutorials/lift24scalaide20/index.html
The problem is how to run this app (on Jetty, Tomcat, or another server).
I tried the command jetty-run, but I got this error:
> jetty-run
[error] Not a valid command: jetty-run
[error] Expected '/'
[error] Expected ':'
[error] Not a valid key: jetty-run (similar: run)
[error] jetty-run
[error] ^
And when I run container:start, I also get an error:
> container:start
[error] Not a valid key: start (similar: state, target, start-year)
[error] container:start
[error] ^
My configurations are:
The file "build.sbt" contains:
name := "lift-basic"
organization := "my.company"
version := "0.1-SNAPSHOT"
scalaVersion := "2.9.1"
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource
libraryDependencies ++= {
val liftVersion = "2.4"
Seq(
"net.liftweb" %% "lift-webkit" % liftVersion % "compile",
"net.liftweb" %% "lift-mapper" % liftVersion % "compile",
"org.mortbay.jetty" % "jetty" % "6.1.26" % "test",
"junit" % "junit" % "4.7" % "test",
"ch.qos.logback" % "logback-classic" % "0.9.26",
"org.scala-tools.testing" %% "specs" % "1.6.9" % "test",
"com.h2database" % "h2" % "1.2.147"
)
}
And the file ".sbt/plugins/build.sbt" contains:
//Eclipse Plugin
resolvers += Classpaths.typesafeResolver
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.0.0")
It sounds like you need to install the xsbt-web-plugin for sbt.
Instructions are available here: https://github.com/JamesEarlDouglas/xsbt-web-plugin/wiki
That should provide you with container:start as well as jar packaging.
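The setup would look roughly like the sketch below. The plugin version and the webSettings name match the old xsbt-web-plugin releases, but exact coordinates varied across sbt versions, so treat them as assumptions and check the plugin's README for your sbt release:

```scala
// project/plugins.sbt (plugin version is an assumption):
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "0.4.2")

// build.sbt: bring container:start / container:stop into scope,
// and move Jetty into the plugin's "container" configuration so
// it is used to run the app (the question's build has it in "test"):
seq(webSettings: _*)
libraryDependencies += "org.mortbay.jetty" % "jetty" % "6.1.26" % "container"
```
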