I downloaded the Spark source from the Apache site and built it with Maven.
Spark version: 1.6.3
Hadoop version: 2.7.3
Scala version: 2.10.4
I used the command below to build the project:
./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Dscala-2.10 -DskipTests clean package
I have also tried with -Phadoop-2.4 and -Phadoop-2.6, but every time I get the following error while building the Hive module:
Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-hive_2.10: wrap: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found. -> [Help 1]
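For comparison, the build command documented for Spark 1.6.x omits the Scala flag, since Scala 2.10 is the default for that release. This is a hedged suggestion rather than a confirmed fix; the MissingRequirementError usually indicates a Scala classpath or version mismatch, so a fresh clean build without the extra flag is worth trying:
./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -DskipTests clean package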
Instructions for the course say to use version 0.13.x.
I installed the latest msi from the sbt site, but when I type "sbt about", I get:
Microsoft Windows [Version 10.0.15063]
(c) 2017 Microsoft Corporation. All rights reserved.
C:\Users\reall>sbt about
Error: Unable to access jarfile
Copying runtime jar.
The filename, directory name, or volume label syntax is incorrect.
Error: Unable to access jarfile
"C:\Users\reall\.sbt\preloaded\org.scala-sbt\sbt\"1.0.2"\jars\sbt.jar"
Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
[info] Loading project definition from C:\Users\reall\project
[info] Set current project to reall (in build file:/C:/Users/reall/)
[info] This is sbt 1.0.2
[info] The current project is {file:/C:/Users/reall/}reall 0.1-SNAPSHOT
[info] The current project is built against Scala 2.12.3
[info] Available Plugins: sbt.plugins.IvyPlugin, sbt.plugins.JvmPlugin, sbt.plugins.CorePlugin, sbt.plugins.JUnitXmlReportPlugin, sbt.plugins.Giter8TemplatePlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.12.3
In other words, a jar file error, and the sbt version is 1.0.2.
Any idea what I'm doing wrong?
You are not doing anything wrong; it's just that the required version 0.13.x is no longer the latest. So you can either follow Dmytro Mitin's answer and reinstall sbt, or you can keep the one you already have. What you installed is the sbt launcher: it can run different versions of sbt depending on the project, so the launcher version itself is not important (unless you're working on something very sbt-specific).
Normally, every sbt project has a project/build.properties file specifying the sbt version needed to work with it:
sbt.version=0.13.16
So you can change (or create) this file and when you run sbt in the project root folder, it will launch sbt version 0.13.16.
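For example, one way to create the file from a Windows command prompt in the project root (the parentheses avoid a trailing space in the value; skip mkdir if the project folder already exists, or simply create the file in a text editor with the single line shown above):
mkdir project
(echo sbt.version=0.13.16) > project\build.properties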
Another way to launch a specific version of sbt is to run it with the -sbt-version option:
sbt -sbt-version 0.13.16
or using the -D flag:
sbt -Dsbt.version=0.13.16
which has exactly the same effect as editing project/build.properties.
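To verify which version actually launches, run sbt from inside a project whose project/build.properties pins the version, not from your home directory (myproject below is a placeholder for your project folder):
cd C:\Users\reall\myproject
sbt about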
Don't install the latest version (which is currently 1.0.2); install 0.13.16 instead.
You can download it here: http://www.scala-sbt.org/download.html
There are msi and zip files.
See also: Installing sbt on Windows.
I'm trying to build Apache Spark 1.4.0 with Maven, but the build never ends; it freezes and no error is shown.
I have no idea why, but it hangs on this line:
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-launcher_2.10 ---
I tried updating Scala (2.11.4) and Maven (3.0.5), but the result is the same.
Are you building it the way the project suggests, i.e., by running the included mvn script (see the README)?
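For reference, a minimal invocation of the bundled script from the Spark source root looks like this (add -P profiles for YARN, Hive, or a specific Hadoop version as needed):
./build/mvn -DskipTests clean package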
I'm trying to build the Hadoop project. I followed the official documentation below on how to do that:
http://wiki.apache.org/hadoop/HowToContribute
https://wiki.apache.org/hadoop/EclipseEnvironment
I've git cloned the project and run mvn install successfully, but when I import the project, or even a sub-project like 'hadoop-yarn-api', I get the following error from Maven:
Description Resource Path Location Type
Plugin execution not covered by lifecycle configuration: org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (execution: compile-protoc, phase: generate-sources) pom.xml /hadoop-yarn-api line 73 Maven Project Build Lifecycle Mapping Problem
Then I tried to mvn clean install the project via an External Tools Configuration in Eclipse; it also failed with:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) on project hadoop-yarn-api: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
But the curious part is that when I cd into the root directory of 'hadoop-yarn-api' and invoke mvn clean install, it builds successfully.
I'm using m2eclipse in Eclipse, and I'm sure I've switched it to the same Maven installation that I use on the command line, not the embedded one.
And I've installed protocol buffers 2.5.0:
$ protoc --version
libprotoc 2.5.0
Could anyone give me some ideas? Many thanks!
P.S.
Eclipse Java EE IDE - Juno Service Release 2
m2e - 1.4.1.20140328-1905
Mac OS X 10.9.4
maven - 3.0.5
Hadoop - 2.2.0
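For what it's worth, the usual workaround for m2e's "Plugin execution not covered by lifecycle configuration" error is to tell m2e to ignore the offending goal inside Eclipse. Below is a minimal sketch for the hadoop-yarn-api pom; the filter values are copied from the error above, and this only silences m2e in the IDE (the command-line build keeps running protoc as before):
<build>
  <pluginManagement>
    <plugins>
      <!-- org.eclipse.m2e:lifecycle-mapping is a pseudo-plugin read only by m2e, never downloaded or run by Maven itself -->
      <plugin>
        <groupId>org.eclipse.m2e</groupId>
        <artifactId>lifecycle-mapping</artifactId>
        <version>1.0.0</version>
        <configuration>
          <lifecycleMappingMetadata>
            <pluginExecutions>
              <pluginExecution>
                <pluginExecutionFilter>
                  <groupId>org.apache.hadoop</groupId>
                  <artifactId>hadoop-maven-plugins</artifactId>
                  <versionRange>[0.0,)</versionRange>
                  <goals>
                    <goal>protoc</goal>
                  </goals>
                </pluginExecutionFilter>
                <!-- tell m2e to skip this goal during Eclipse builds -->
                <action>
                  <ignore />
                </action>
              </pluginExecution>
            </pluginExecutions>
          </lifecycleMappingMetadata>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>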
I am trying to build Kafka with Scala 2.10.1. I followed the steps given on GitHub. At the end it generates a jar in the target directory; however, that jar is empty and only 5 KB in size. Am I missing something here? I am totally new to sbt.
1) ./sbt update
2) ./sbt package
3) ./sbt assembly-package-dependency
To build for a particular version of Scala (either 2.8.0, 2.8.2, 2.9.1, 2.9.2 or 2.10.1), change step 2 above to: ./sbt "++2.8.0 package"
Actually, the Kafka jar is located in the core/target/scala-2.10/ directory, and the dependencies are in the Ivy cache.
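For example, you can confirm that the built jar is there (and no longer a 5 KB stub) by listing that directory:
ls -lh core/target/scala-2.10/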
Execute ./sbt release-zip to get an archive in target/RELEASE/ with all dependencies and shell scripts packaged.
To build a release for a particular Scala version, add the version parameter to the build command:
./sbt "++2.10.1 release-zip"
I am trying to configure Eclipse and Hadoop, following this site.
I successfully ran:
git clone git://git.apache.org/hadoop-common.git
Now I get a failure on:
mvn install -DskipTests
[INFO] Apache Hadoop Main ................................ SUCCESS [6.688s]
[INFO] Apache Hadoop Project POM ......................... FAILURE [1.548s]
...
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /usr/local/hadoop/hadoop-common/hadoop-project/target/antrun/build-main.xml (No such file or directory) -> [Help 1]
The reason is that I don't have a "target" folder in .../hadoop-project; my hadoop-project folder contains only a "src" folder.
So it appears that folder was never created. Any ideas? Here is the environment:
Ubuntu 12.1
Hadoop 1.0.4
Maven 3.0.4
The site mentioned in the question is for developing Hadoop itself, not for developing applications that use Hadoop. There is a difference between the two.
Here are the instructions for developing/debugging MapReduce programs in Eclipse. Note that the procedure applies to Linux only.