Missing dependencies in Apache Crunch Scala build - scala

I'm trying to build the Apache Crunch source code on my CentOS 7 machine, but am getting the following error in the crunch-spark project when I execute mvn package:
[ERROR] /home/bwatson/programming/git/crunch/crunch-spark/src/it/scala/org/apache/crunch/scrunch/spark/PageRankClassTest.scala:71: error: bad symbolic reference. A signature in PTypeH.class refers to term protobuf
[ERROR] in package com.google which is not available.
[ERROR] It may be completely missing from the current classpath, or the version on
[ERROR] the classpath might be incompatible with the version used when compiling PTypeH.class.
[ERROR] .map(line => { val urls = line.split("\\t"); (urls(0), urls(1)) })
[ERROR] ^
Other SO questions about similar errors (here and here) seem to involve PATH or version issues. I've been messing around but can't seem to resolve them. For completeness:
[bwatson@ben-pc crunch]$ scala -version
Scala code runner version 2.11.5 -- Copyright 2002-2013, LAMP/EPFL
[bwatson@ben-pc crunch]$ java -version
java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)
[bwatson@ben-pc crunch]$ mvn -version
Apache Maven 3.0.5 (Red Hat 3.0.5-16)
Maven home: /usr/share/maven
Java version: 1.8.0_31, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_31/jre
Default locale: en_GB, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-123.20.1.el7.x86_64", arch: "amd64", family: "unix"
Any advice? I'm not really sure where Scala is looking for its dependencies, but I'd have thought that Maven would take care of it.

Unfortunately, different versions of Scala are binary incompatible. By default Apache Spark currently uses Scala 2.10.4, not Scala 2.11, and Apache Scrunch depends on Spark. Maven knows nothing about this, so it can't help. It is necessary to make some modifications to Scrunch to get it to compile for Scala 2.11 / JDK 1.8. I am working on this at the moment, but I don't have a solution yet. However, I only get the error message you report when I compile with Scala 2.10.4 on JDK 1.8, not with Scala 2.11, so I don't think your build is doing quite what you intend. The error seems to be coming from the Protobuf compiler or jar, but I don't know why that is.
When I solve it myself, I will report back!
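To make the binary-compatibility point concrete, here is a minimal sbt sketch (coordinates and versions are illustrative, not Crunch's actual build): the _2.10 / _2.11 suffix on a Scala artifact records the Scala binary version it was compiled against, and mixing suffixes on one classpath is one common source of "bad symbolic reference" errors like the one above.

// build.sbt -- illustrative sketch only
scalaVersion := "2.10.4"

// "%%" appends the Scala binary suffix, so this resolves to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

// Writing the suffix by hand ("%" instead of "%%") also works, but makes it
// easy to put a _2.11 jar on a 2.10 classpath:
// libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"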

It turns out the official documentation for Crunch was missing a Maven parameter. The issue was solved by building using:
mvn package -Dcrunch.platform=2

Related

Cross-compiled with an incompatible version

I am using Eclipse with the m2eclipse-scala plugin. Currently, I get the following error message:
exampleA_2.10-2.0.1.jar of module build path is cross-compiled with an incompatible version of Scala (2.10.0). In case this report is mistaken, this check can be disabled in the compiler preference page
It looks like the versions of the extracted Scala and the Scala IDE match. I just wanted to make sure that this is a "false negative" as described here and can safely be turned off.
As @The Archetypal Paul suggested, it was because I was using the wrong Scala library.
If you are using Scala 2.11 (check under About Scala IDE -> Installation Details), you can downgrade by following the instructions here. It's a lot easier than uninstalling and re-installing the Scala IDE, as other Stack Overflow posts recommend.
I also faced the same issue. I was trying to use the casbah jar in Scala to integrate with MongoDB.
After analysing the problem I found that I was using a casbah build for Scala 2.9.x while my Scala version was 2.11.8.
The root cause of this kind of error is that the jar was compiled against one Scala version (2.9.x here) while the project uses another (2.11.8 here).
So, to resolve it, I switched to the casbah artifact that is compiled against Scala 2.11:
<dependency>
    <groupId>org.mongodb</groupId>
    <artifactId>casbah-core_2.11</artifactId>
    <version>3.1.1</version>
</dependency>
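If you happen to be on sbt rather than Maven, the equivalent (a sketch, assuming the same artifact and version as above) lets %% pick the matching Scala suffix for you:

// build.sbt -- sketch assuming Scala 2.11 and the same casbah version as above
scalaVersion := "2.11.8"

// "%%" expands to casbah-core_2.11 because scalaVersion is 2.11.x
libraryDependencies += "org.mongodb" %% "casbah-core" % "3.1.1"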
I was facing a similar issue in the Eclipse IDE, where I had built a Spark Scala project with Maven. The Scala version was set to 2.11.
Later, I upgraded the Scala IDE plugin in Eclipse, after which my project was marked with the errors below:
exampleA_2.10-2.0.1.jar of module build path is cross-compiled with an incompatible version of Scala (2.10.0). In case this report is mistaken, this check can be disabled in the compiler preference page
Right-click the project folder > Scala > Set Scala Version. Here my Scala version was displayed as 2.10. I selected 2.11, and that removed all the error messages.

.jar files cross-compiled with an incompatible version of Scala (2.10)

I am building my first Scala/Play application, and after I create and import an Eclipse project from the Play shell, I get 17 errors.
akka-actor_2.10.jar is cross compiled with an incompatible version of Scala (2.10)
akka-slf4j_2.10.jar is cross compiled with an incompatible version of Scala (2.10)
anorm_2.10.jar ...
play_2.10.jar ...
play_iterates_2.10.jar ...
The list goes on to include the Scala jars, scalaz jars, etc...
I am using:
Eclipse 4.2.2
Scala ide 3.0.0.nightly-2_09
Scala 2.10
Play 2.1
Has someone experienced the same thing?
Is it possible you are using a Scala IDE build for Scala 2.9.x?
Make sure you use the one for 2.10.x:
http://download.scala-ide.org/sdk/e38/scala210/dev/site/

Using guava in griffon gives Prohibited package exception

I am using Griffon and want to add the guava libraries as a dependency in my project. However, when I do this, even without using a single class from them, I get the following exception:
Compilation error: BUG! exception in phase 'canonicalization' in source unit
'/home/wdb/myproject/griffon-app/controllers/MyController.groovy' Prohibited
package name: java.util.concurrent
Any idea what might be wrong? This is my java version (on Ubuntu 11.10):
wdb@wdb-laptop:~$ java -version
java version "1.6.0_27"
Java(TM) SE Runtime Environment (build 1.6.0_27-b07)
Java HotSpot(TM) Server VM (build 20.2-b06, mixed mode)
I found this link that talks about using the bootclasspath for a similar problem, but that seems a bit drastic.
regards,
Wim
My wild guess is that our bootclasspath copy of java.util.concurrent.ExecutorService (necessary due to an incompatible change between JDK5 and JDK6) is showing up in your classpath. I don't really know Maven, but I would think that, because we identify the dependency as "provided", this shouldn't be happening.
That's not really an answer, but I hope it's enough to get you or someone else started.
It must be that Griffon does not honor 'provided' scope. I managed to get it working by editing BuilderConfig.groovy to:
compile('com.google.guava:guava:10.0.1') {
    exclude 'guava-bootstrap'
}

Setting up sbt to use Java 7 for compilation?

I'm getting compile errors when running the compile task because the sources reference new classes in the java.nio.file package that only appeared in Java 7.
I have the following in build.sbt:
javaHome := Some(file("/opt/jdk/jdk1.7.0"))
fork := true
In sbt:
> show java-home
[info] Some(/opt/jdk/jdk1.7.0)
It compiles and runs fine in Eclipse. How can I set up sbt to use Java 7 for compilation?
The most reliable (perhaps the only) way to do this at the moment is to start sbt with the java binary from the JDK 7 folder.
Modify your sbt launcher script, or use this one, which allows you to specify the Java home (and so much more!) as a command line option.
~/code/scratch/20111009 sbt -java-home /Library/Java/JavaVirtualMachines/openjdk-1.7-x86_64/Contents/Home
Starting sbt: invoke with -help for other options
[info] Loading global plugins from /Users/jason/.sbt/plugins
[info] Set current project to default-3e990a (in build file:/Users/jason/code/scratch/20111009/)
> console
[info] Compiling 1 Scala source to /Users/jason/code/scratch/20111009/target/scala-2.9.1/classes...
[info] Starting scala interpreter...
[info]
Welcome to Scala version 2.9.1.final (OpenJDK 64-Bit Server VM, Java 1.7.0-internal).
Type in expressions to have them evaluated.
Type :help for more information.
scala> java.util.Objects.equals(null, null)
res0: Boolean = true
Simply setting javaHome := Some(file("/Library/Java/JavaVirtualMachines/openjdk-1.7-x86_64/Contents/Home")) changes the Java version used to compile and to fork processes, but it does not change the version of the Java standard library on the classpath, nor the version used to run tests, which always run in the same JVM as SBT.
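To spell that out, here is a minimal build.sbt sketch (the JDK path is illustrative); the comments restate the caveats from the answer above:

// build.sbt -- minimal sketch; the JDK path is illustrative
javaHome := Some(file("/opt/jdk/jdk1.7.0"))

// Forked runs/tests will use the javaHome JDK; without fork they stay in
// the JVM that launched sbt
fork := true
fork in Test := true

// As noted above, the Java standard library on the compile classpath still
// comes from the JVM running sbt, so starting sbt itself under JDK 7 (e.g.
// with -java-home) is the reliable fix when sources use java.nio.file.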
If you use Linux or Mac, another possibility is to look at jenv, a command line Java manager.
It allows you to choose per project which JDK to use.
I use virtualenv, which is a tool from the Python ecosystem. In a nutshell, it is a shell script which allows you to change your PATH variable easily and get back to what it was before, if you need to.
First install virtualenvwrapper (a wrapper around virtualenv):
$ apt-get install virtualenvwrapper
Now create a virtual environment for, say, Java8 with Scala-2.11.
$ mkvirtualenv j8s11
Now, adjust ~/.virtualenvs/j8s11/bin/postactivate so that you define locations for all your tools. You can see an example below which works for me:
#!/bin/bash
JAVA_VERSION=1.8.0_31
SCALA_VERSION=2.11.5
SBT_VERSION=0.13.7
ANT_VERSION=1.9.4
M2_VERSION=3.2.5
GRADLE_VERSION=1.6
PLAY_VERSION=2.3.7
ACTIVATOR_VERSION=1.2.12
IDEA_VERSION=IC-135.475
PYCHARM_VERSION=community-3.4.1
TOOLS_HOME=/opt/developer
export JAVA_HOME=${TOOLS_HOME}/jdk${JAVA_VERSION}
export SCALA_HOME=${TOOLS_HOME}/scala-${SCALA_VERSION}
export SBT_HOME=${TOOLS_HOME}/sbt-${SBT_VERSION}
export ANT_HOME=${TOOLS_HOME}/apache-ant-${ANT_VERSION}
export M2_HOME=${TOOLS_HOME}/apache-maven-${M2_VERSION}
export GRADLE_HOME=${TOOLS_HOME}/gradle-${GRADLE_VERSION}
export PLAY_HOME=${TOOLS_HOME}/play-${PLAY_VERSION}
export ACTIVATOR_HOME=${TOOLS_HOME}/activator-${ACTIVATOR_VERSION}
export IDEA_HOME=${TOOLS_HOME}/idea-${IDEA_VERSION}
export PYCHARM_HOME=${TOOLS_HOME}/pycharm-${PYCHARM_VERSION}
PATH=${PYCHARM_HOME}/bin:$PATH
PATH=${IDEA_HOME}/bin:$PATH
PATH=${ACTIVATOR_HOME}:$PATH
PATH=${PLAY_HOME}:$PATH
PATH=${GRADLE_HOME}/bin:$PATH
PATH=${M2_HOME}/bin:$PATH
PATH=${ANT_HOME}/bin:$PATH
PATH=${SBT_HOME}/bin:$PATH
PATH=${SCALA_HOME}/bin:$PATH
PATH=${JAVA_HOME}/bin:$PATH
export PATH
Now you can just use workon to switch between environments. Example:
rgomes@terra:~$ workon j8s11
(j8s11)rgomes@terra:~$ java -version
java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)
(j8s11)rgomes@terra:~$ scala -version
Scala code runner version 2.11.5 -- Copyright 2002-2013, LAMP/EPFL
(j8s11)rgomes@terra:~$ workon j7s10
(j7s10)rgomes@terra:~$ java -version
java version "1.7.0_71"
Java(TM) SE Runtime Environment (build 1.7.0_71-b14)
Java HotSpot(TM) 64-Bit Server VM (build 24.71-b01, mixed mode)
(j7s10)rgomes@terra:~$ scala -version
Scala code runner version 2.10.4 -- Copyright 2002-2013, LAMP/EPFL
I'm assuming you want to change whatever you have set in JAVA_HOME by default, which you can do when invoking sbt:
JAVA_HOME=<path-to-jdk-home> sbt
This works for me on OSX with sbt 0.13.8
Change javacOptions to target 1.7? I don't think setting javaHome is necessary.
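For reference, a hedged sketch of what that suggestion looks like in build.sbt (these are the standard javac/scalac flags; whether this alone is sufficient depends on which JDK runs sbt, as the accepted answer explains):

// build.sbt -- sketch of targeting Java 7 bytecode only
javacOptions ++= Seq("-source", "1.7", "-target", "1.7")

// scalac's equivalent flag (check that your Scala version supports it)
scalacOptions += "-target:jvm-1.7"

// Note: these control the emitted bytecode level; classes like java.nio.file
// must still exist in the JDK that actually runs the compiler.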

compiling scala program on jvm

Which version of the Scala library jar will support Java 1.5 and 1.6?
Also, how do I compile it for the JVM?
Scala's jar files are compiled with JVM 1.5, as can be seen by looking into the META-INF/MANIFEST.MF file. Here is 2.8.1: Created-By: 1.5.0_22-b03 (Sun Microsystems Inc.).
The Scala compiler generates output that is compatible with JVM 1.5 (see scalac -target). It also has some magic to enable use with JDK 1.6, in which String gained a method that Scala defines on WrappedString.
Scala is compiled with the scalac compiler and not javac. To run the Scala code you need Java runtime/sdk version 1.5 or later and the scala library jar.
Versions 2.7.7 and 2.8.1 are the common versions in use today. If you start a new project now you should aim for 2.8.1 or 2.9 (which is going to be available soon-ish). The download site is here
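As a small illustration of that workflow (the file and object names here are made up), compiling with scalac and running with the Scala library on the classpath looks like this:

// HelloJvm.scala -- illustrative example
object HelloJvm {
  def main(args: Array[String]): Unit = {
    println("Running on JVM " + System.getProperty("java.version"))
  }
}

// Compile with scalac (not javac), targeting JVM 1.5 bytecode:
//   scalac -target:jvm-1.5 HelloJvm.scala
// Run with the Scala library jar on the classpath:
//   scala HelloJvm
//   (equivalently: java -cp .:scala-library.jar HelloJvm)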
See the FAQ