Why doesn't Eclipse support the new Hadoop version?

I want to integrate Hadoop with Eclipse. It seems the Eclipse plugin only supports old versions of Eclipse and Hadoop; newer versions cause an EOFException (details here). I finished my configuration with Eclipse 3.3 and Hadoop 0.20.0 and it works well. I want to know what I should do if I want to use a newer version of Eclipse and Hadoop, or whether there is another way to develop Hadoop programs.
Could anyone help me? Thanks very much~~
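
For context, the Eclipse plugin is not strictly required for developing Hadoop programs: a job can be written in an ordinary Java project with the Hadoop JARs on the build path, packaged as a JAR, and submitted with the "hadoop jar" command, independent of the Eclipse version. A minimal sketch along those lines, using the standard word-count example against the org.apache.hadoop.mapreduce API (class and job names are illustrative):

    // Minimal word-count job written against the org.apache.hadoop.mapreduce API
    // (available since Hadoop 0.20). Built in a plain Java project, packaged as a
    // JAR, and run with: hadoop jar wordcount.jar WordCount <input> <output>
    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Job job = new Job(conf, "word count");   // Job.getInstance(conf, ...) on Hadoop 2.x
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Built this way, the only Eclipse features given up are the plugin's "Run on Hadoop" action and the DFS browser; compiling, packaging, and submitting the job works the same with any Eclipse release.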

Related

What version of Eclipse with what version of the Hadoop plugin?

I tried various versions of Eclipse with plugins downloaded from the internet, but I always have problems. With the latest version of Eclipse and the plugin developed for Hadoop 2.x, I could not open the Hadoop location configuration wizard. I then reinstalled an older version of Eclipse, Ganymede, and used the plugin meant for that version; with it I am able to open the Hadoop location configuration wizard, but when I try to create the project, even though I have given the Hadoop installation directory, it still shows the error "configure Hadoop installation directory" and does not let me move to the next section of the wizard.
So can someone suggest what version of Eclipse I should use and where I can download a suitable plugin? I am using Hadoop 2.5.1.
Thank you. Please help me with this.

Building Hadoop (1.2.1) Eclipse (Kepler 4.3.1) Plugin

I tried building an Eclipse plugin for Hadoop for my Eclipse Kepler 4.3.1.
On the web I see a lot of information for Juno and some earlier versions, but copying those JARs into my Eclipse plugins directory brought me no luck. I then tried building my own Eclipse plugin, but even that is not working for me. Does anyone here have a Hadoop plugin working for Kepler?
Below are my other config details:
Mac OSX
java version "1.6.0_65"
I managed to successfully build it using this guide, with some slight modifications; I am, however, using Eclipse 4.3.2 on Arch and Java 7.
Either way, here is my plugin if it helps.

Failed to create JavaFX platform in NetBeans

I get this error when I try to create a Java Swing application using NetBeans.
I really don't know how to fix it, and the internet didn't give me a good answer.
I uninstalled NetBeans and then reinstalled it, but that didn't work.
How can I fix this error? Thank you.
I don't have a NetBeans installation at hand, but have a look at your Java installations configured in NetBeans ("Java Platform Manager" in Tools menu?). Select the Java version you are using. There should be a JavaFX tab. I think you can disable it there, if you don't need it.
Note: If you're using the latest Java version from Oracle, then JavaFX should automatically be configured correctly, since JavaFX is now shipped with the JDK.
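If it helps, a quick way to confirm whether the Java installation you launch with actually makes the JavaFX classes visible is a tiny probe like the following (a rough sketch; it only tests whether the standard javafx.application.Application class can be loaded on that runtime's classpath):

    // Probe: can this runtime load the JavaFX entry-point class?
    public class JavaFxCheck {
        public static void main(String[] args) {
            try {
                Class.forName("javafx.application.Application");
                System.out.println("JavaFX classes are visible on this runtime.");
            } catch (ClassNotFoundException e) {
                System.out.println("JavaFX classes are not visible on this runtime.");
            }
        }
    }

Run it with the same JDK that NetBeans is configured to use; note that on some Java 7 installations JavaFX ships with the JDK but is not on the default classpath, so a negative result does not necessarily mean it is absent.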

Plugin cannot be seen by RCP application

I have a plugin written in Eclipse which used to work with a given RCP application just by adding the .jar file to the "plugins" folder of the RCP application. Now I have a new machine (Windows 7 64-bit, JRE 1.6) and tried to export a new .jar after making minor changes to the project, but now my RCP application cannot "see" my plugin any more. The old machine had JRE 1.5 and was 32-bit. I would appreciate it if somebody could give me a hint; I have been trying to solve this issue for five days without much success. Thank you very much!
Probably the problem is with the launch configuration or with the JRE!
Maybe you need to update your Eclipse with the delta pack to get the 64-bit/32-bit platform-related JARs.
Example:
http://archive.eclipse.org/eclipse/downloads/drops/R-3.7.2-201202080800/#DeltaPack

Hadoop 0.20.2 Eclipse plugin not fully functioning - can't 'Run on Hadoop'

I've just finished installing Hadoop 0.20.2 under Cygwin on Windows 7 with Eclipse Helios (3.6). Hadoop is now fully started, and I'm trying to run a test application within a newly created MapReduce test project in Eclipse. I'm using the Hadoop 0.20.2 plugin from the Hadoop download.
The Map/Reduce Location perspective operates correctly, as does the DFS Locations tree in the Package Explorer. However, when I right-click the driver and select 'Run As' > 'Run on Hadoop', nothing happens and no errors appear on the Console (silent fail :( ). I believe a dialog window should appear asking for configuration before it runs, but this is not happening.
There seem to be a few others with the same problem, but I've yet to find an answer that works. I've tried the 0.20.1 plugin (total failure). The following bug report seems to describe my issue, though I'm a bit of a newbie to all this, so I could do with a hand from a voice of experience: https://issues.apache.org/jira/browse/MAPREDUCE-1280
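For what it's worth, the settings that the missing dialog would normally collect (the HDFS and JobTracker locations) can also be supplied directly in a driver and the job submitted by running it as a plain Java application. A rough sketch against the old org.apache.hadoop.mapred API; the host names, ports, and jar path are assumptions for a default pseudo-distributed setup, not values taken from the plugin:

    // Sketch: submit a job to the cluster without the plugin's "Run on Hadoop" action.
    // Host names, ports, and the jar path are placeholders and must match your setup.
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.lib.IdentityMapper;
    import org.apache.hadoop.mapred.lib.IdentityReducer;

    public class SubmitWithoutPlugin {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf();
            conf.setJobName("plugin-free-test");

            // What the plugin's run dialog would normally ask for:
            conf.set("fs.default.name", "hdfs://localhost:9000"); // NameNode
            conf.set("mapred.job.tracker", "localhost:9001");     // JobTracker

            // For your own mapper/reducer classes the job needs a jar that ships
            // them to the TaskTrackers (placeholder path).
            conf.setJar("/path/to/my-job.jar");

            // Identity map/reduce just to keep the sketch complete and runnable.
            conf.setMapperClass(IdentityMapper.class);
            conf.setReducerClass(IdentityReducer.class);
            conf.setOutputKeyClass(LongWritable.class);
            conf.setOutputValueClass(Text.class);

            FileInputFormat.setInputPaths(conf, new Path("/user/me/input"));
            FileOutputFormat.setOutputPath(conf, new Path("/user/me/output"));

            JobClient.runJob(conf);
        }
    }

Roughly speaking, this is the information the plugin's dialog would have filled in; with it in place, 'Run As' > 'Java Application' is enough to submit the job while the plugin action is broken.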
The Hadoop Eclipse plugin bundled with the Hadoop distribution is compatible with Eclipse up to version 3.3. The JIRA ticket MAPREDUCE-1280 contains a patch for running the plugin in Eclipse 3.4 and upwards.
I just compiled the patched plugin with the fixes from the JIRA ticket MAPREDUCE-1280. The file is attached to the ticket. You can find it here.
Simply remove the old plugin from your Eclipse installation and put the new version of the plugin into the dropins folder of your Eclipse installation.
After upgrading from an older version of the plugin, you will have to start Eclipse with the "-clean" command-line switch. Help on Eclipse command-line switches can be found here.
I don't know whether the plugin has been updated or not, but as far as I know, it has been unmaintained for several releases now.
One solution is to download the source code and try to recompile the plugin JAR for the latest version of Eclipse; however, I haven't tried it, so I don't know whether it works.
Maybe you can try to use Karmasphere.
askswOrder is correct that the Eclipse plugin has not seen much attention for quite a long time. The JIRA you reference does provide a fix, but it has only been applied to Hadoop 0.20.3 and above. One option would be to try to apply the patch to 0.20.2 and recompile, but that's asking quite a lot of a newbie. I'd second the suggestion to use Karmasphere; it's a great product for working with MapReduce, and those gents have taken on the work of staying current with Hadoop releases.