I'm trying to configure Eclipse and Hadoop, following this site.
I successfully ran:
git clone git://git.apache.org/hadoop-common.git
Now I get a failure on:
mvn install -DskipTests
[INFO] Apache Hadoop Main ................................ SUCCESS [6.688s]
[INFO] Apache Hadoop Project POM ......................... FAILURE [1.548s]
...
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (create-testdirs) on project hadoop-project: Error executing ant tasks: /usr/local/hadoop/hadoop-common/hadoop-project/target/antrun/build-main.xml (No such file or directory) -> [Help 1]
The reason is that I don't have a "target" folder in .../hadoop-project. My hadoop-project folder contains only a "src" folder and nothing else.
So it appears that folder was never created. Any ideas? Here is the environment:
Ubuntu 12.1
Hadoop 1.0.4
Maven 3.0.4
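One thing worth trying (this is a guess on my part: the create-testdirs antrun execution appears to fail only because the target directory it writes its build file into does not exist yet) is to pre-create that directory and re-run the build from the root of the clone:

```shell
# Hypothetical workaround: pre-create the directory the antrun task expects,
# then re-run the build from the root of the clone.
cd /usr/local/hadoop/hadoop-common
mkdir -p hadoop-project/target/antrun
mvn install -DskipTests
```

If the build then fails somewhere else, the real problem is more likely a plugin/permissions issue than the missing folder itself.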
The site mentioned in the question is for development of Hadoop itself, not for developing applications that use Hadoop. There is a difference between the two.
Here are the instructions for developing/debugging MapReduce programs in Eclipse. Note that the procedure applies to Linux only.
Related
I am running into a weird error with my IntelliJ setup for Scala. When I try to run any unit test for my project, I get the error below.
scalac: Output directory not specified for ...
From Maven, I am able to successfully run the tests. No issues there.
To ensure my Scala setup is correct, I created a simple HelloWorld Scala project, and I can successfully add a unit test and run it from IntelliJ.
I am running into this error only for my project repo that I downloaded from Git. Any suggestions on how I can identify what I am missing in the setup?
Things I tried so far:
I ensured that the Java and Maven versions in IntelliJ match what I have on my Mac.
I tried deleting the .idea folder and "Invalidate Caches / Restart". No luck with any of these.
➜ ~ mvn -version
Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
Maven home: /Users/syncmaster/.asdf/installs/maven/3.6.3
Java version: 11.0.14.1, vendor: Azul Systems, Inc., runtime: /Users/syncmaster/.asdf/installs/java/zulu-11.54.25/zulu-11.jdk/Contents/Home
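Since Maven builds the project fine, one thing that sometimes helps (a sketch, and destructive to local IDE settings, so only do this if there is nothing in .idea you need to keep) is wiping IntelliJ's project model entirely and re-importing from the pom.xml, so the compiler output directories are regenerated from the Maven model:

```shell
# Sketch: remove IntelliJ's project model files, then re-open the project
# in IntelliJ by choosing the root pom.xml ("Open as Project") so the
# compiler output paths are regenerated from Maven.
rm -rf .idea
find . -name '*.iml' -delete
```

The "Output directory not specified" message usually means the module's compiler output path is missing from the project model, which a clean re-import rebuilds.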
I am trying to build an SBT-based Scala project. The project depends on packages on my company's local Maven server. From within the company's network, I was able to compile/package the project successfully yesterday (from the CLI and IntelliJ).
Today, working from home, when I run the following from the command prompt:
sbt assembly
to build the repo it works fine.
But when I build from IntelliJ or do:
sbt compile|package
it hits the Maven server and fails. My question is: if the packages are already in the cache (~/.ivy2/), why is the Maven server contacted at all? Is there a way to avoid hitting the server?
After a little searching, I found the answer:
$ sbt "set offline := true" run
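For reference, the same flag can also be set for a whole session rather than per command (a sketch; the flag and property names are as I understand them for sbt 0.13.x):

```shell
# One-off: set the offline flag for a single invocation
sbt "set offline := true" compile

# Or as a JVM system property for the whole session
sbt -Dsbt.offline=true package
```

With the flag set, sbt resolves dependencies only from the local caches (~/.ivy2), which is why the remote server is no longer contacted.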
I'm trying to build the project of hadoop. I followed the official document as follows on how to do that.
http://wiki.apache.org/hadoop/HowToContribute
https://wiki.apache.org/hadoop/EclipseEnvironment
I've git cloned the project and run mvn install successfully, but when I import the project, or even a sub-project like 'hadoop-yarn-api', I get the following errors from Maven:
Description Resource Path Location Type
Plugin execution not covered by lifecycle configuration: org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (execution: compile-protoc, phase: generate-sources) pom.xml /hadoop-yarn-api line 73 Maven Project Build Lifecycle Mapping Problem
Then I tried to mvn clean install the project via an External Tools Configuration in Eclipse; it also failed with:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) on project hadoop-yarn-api: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]
But the curious part is that when I cd into the root directory of 'hadoop-yarn-api' and invoke mvn clean install, it builds successfully.
I'm using m2eclipse in Eclipse, and I'm sure I've switched it to the same Maven installation I use on the command line, not the embedded one.
And I've installed protocol buffers 2.5.0:
$ protoc --version
libprotoc 2.5.0
Could anyone give me some ideas? Many thanks!
P.S.
Eclipse Java EE IDE - Juno Service Release 2
m2e - 1.4.1.20140328-1905
Mac - 10.9.4
maven - 3.0.5
Hadoop - 2.2.0
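One possible explanation (an assumption, but a common one on OS X): Eclipse launched from the Finder does not inherit your shell's PATH, so the build run from inside Eclipse cannot find protoc even though your terminal can; that would also explain why the same mvn clean install works from the command line. A sketch of a workaround is to start Eclipse from the terminal so it inherits the shell environment (the path to the Eclipse binary is an assumption; adjust it for your install):

```shell
# Confirm your shell can resolve protoc (this is the environment Eclipse lacks):
which protoc

# Workaround sketch: launch Eclipse from this same terminal so it inherits PATH.
/Applications/eclipse/Eclipse.app/Contents/MacOS/eclipse &
```

If protoc then resolves inside Eclipse, the "did not return a version" error from the compile-protoc execution should go away.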
I am trying to build Kafka with Scala 2.10.1. I followed the steps given on GitHub. At the end it generates a jar in the target directory, but that jar is empty and its size is 5 KB. Am I missing something here? I am totally new to SBT.
1) ./sbt update
2) ./sbt package
3) ./sbt assembly-package-dependency
To build for a particular version of Scala (either 2.8.0, 2.8.2, 2.9.1, 2.9.2 or 2.10.1), change step 2 above to: ./sbt "++2.8.0 package"
Actually, the Kafka jar is located in the core/target/scala-2.10/ directory, and its dependencies are in the Ivy cache.
Execute ./sbt release-zip to get an archive in target/RELEASE/ with all dependencies and shell scripts packaged.
To build a release for a particular Scala version, add the version parameter to the build command:
./sbt "++2.10.1 release-zip"
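A quick way to confirm the jar under core/target actually contains classes (the path and file-name glob here are assumptions based on the answer above) is to list its entries; the ~5 KB jar at the top level is just the empty aggregate project:

```shell
# Sanity check: list the jar's entries to confirm it contains compiled
# classes, unlike the near-empty aggregate jar at the repo root.
unzip -l core/target/scala-2.10/kafka_*.jar | head -20
```

A jar is just a zip archive, so any zip tool can inspect it.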
I'm trying to generate a release with the goal release:prepare, but when I run it in Eclipse I get the error:
Failed to execute goal
org.apache.maven.plugins:maven-release-plugin:2.0-beta-7:prepare
(default-cli) on project mwframework: Can't run goal clean verify:
Error while executing process. Cannot run program "mvn" (in directory
"/home/gnng/Development/work-7-maven/mwframework"):
java.io.IOException: error=2, No such file or directory -> [Help 1]
I'm using the Eclipse embedded Maven. What am I doing wrong?
Thanks in advance.
The Maven release plugin always forks a child Maven build. So you need to use an external Maven installation (e.g. configure one in Eclipse).
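Concretely (a sketch; this assumes a standalone Maven installation is on your shell's PATH), you can sidestep the embedded runner entirely and run the release from a terminal, since the plugin forks mvn as an external process anyway:

```shell
# Verify a real mvn binary is resolvable, then run the release in batch mode
# (-B avoids interactive version prompts).
command -v mvn
mvn -B release:prepare
```

The "Cannot run program \"mvn\"" error in the question is exactly this: the forked process looks for an mvn executable, which the embedded runner does not provide.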