How to set the MapReduce location in Hadoop? - hadoop-plugins

I'm new to Apache Hadoop. I have installed the prerequisite software, configured everything, and set up the Eclipse plugin, but when I click 'New Hadoop location' nothing happens. Can anyone help me?
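For orientation, the 'New Hadoop location' dialog essentially collects the same NameNode and JobTracker host/port values that live in core-site.xml and mapred-site.xml. Below is a minimal sketch of those settings expressed in plain Java, assuming an old-style (0.20/1.x) single-node cluster; the host names and ports are placeholders, not values from the question:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HadoopLocationCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // "DFS master" in the plugin dialog; placeholder host/port from core-site.xml.
            conf.set("fs.default.name", "hdfs://localhost:9000");
            // "Map/Reduce master" in the plugin dialog; placeholder host/port from mapred-site.xml.
            conf.set("mapred.job.tracker", "localhost:9001");

            // Quick sanity check that the DFS location is reachable with these values.
            FileSystem fs = FileSystem.get(conf);
            System.out.println("HDFS root exists: " + fs.exists(new Path("/")));
        }
    }

If a check like this fails from a plain Java run, the problem is usually the host/port values rather than the plugin itself.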

Related

How to install the Hive plugin in the Kepler version of Eclipse?

I am trying to install the Hive plugin in the Kepler version of Eclipse, but I am unable to find the necessary Hive plugins to do so. What is the best way to resolve this?
I think Hive is no longer available in the Eclipse Marketplace. However, you may go through the following link to configure Hive. Note that you may have to download the necessary jar files for Hive.
Connecting Eclipse To Hive
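The linked tutorial isn't reproduced here, but as a rough sketch of what connecting an Eclipse Java project to Hive typically looks like, assuming HiveServer2 is running on the default port 10000 and the Hive JDBC jars are on the project's build path (the URL, user, and database below are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveConnectionSketch {
        public static void main(String[] args) throws Exception {
            // HiveServer2 JDBC driver; requires hive-jdbc and its dependencies on the classpath.
            Class.forName("org.apache.hive.jdbc.HiveDriver");

            // Placeholder connection URL: adjust host, port, and database for your setup.
            Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://localhost:10000/default", "", "");

            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SHOW TABLES");
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
            conn.close();
        }
    }

If you are on an older Hive with the original HiveServer, the driver class and URL prefix differ (org.apache.hadoop.hive.jdbc.HiveDriver and jdbc:hive://).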

What version of Eclipse with what version of the Hadoop plugin?

I tried various versions of Eclipse with plugins downloaded from the internet, but I always have problems. With the latest version of Eclipse and the plugin built for Hadoop 2.x, I could not open the Hadoop location configuration wizard. I reinstalled an older version of Eclipse, Ganymede, and used the plugin for that version; with it I am able to open the Hadoop location configuration wizard, but when I try to create a project, even though I have given the Hadoop installation directory, it still shows the error "configure Hadoop installation directory" and does not let me move to the next section of the wizard.
So can someone suggest what version of Eclipse I should use and where I can download a suitable plugin? I am using Hadoop 2.5.1.
Thank you. Please help me with this.

How to configure Solr 4 with Eclipse Kepler?

I am trying to configure Solr with Eclipse Kepler. What I have done so far is install the Run-Jetty-Run plugin from the 'Install New Software' menu. I started to follow "http://eclipse-jetty.sourceforge.net/update/", but I am not getting the options illustrated in those tutorials. So can anyone tell me the steps to follow to configure Eclipse with Solr?
I am new to Solr as well as Eclipse, so please bear with me. Sorry if I am missing something.
Following is a screenshot of my Project Explorer window:
This is my debug configuration window:
Try this link: http://shrutiags.wordpress.com/2012/05/24/adding-solr-to-your-web-application-part-1-installing-solr-in-tomcat/
This link gives a step-by-step tutorial on how to set up Solr in your development environment. It's a three-part tutorial that gives you everything you need to run Solr in Eclipse.
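The tutorial covers the servlet-container setup; as a small sketch of talking to that Solr instance from Java code inside Eclipse, assuming Solr 4.x with the SolrJ jars on the classpath and a core reachable at the placeholder URL below:

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrServer;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.SolrDocument;

    public class SolrQuerySketch {
        public static void main(String[] args) throws Exception {
            // Placeholder URL: point this at the core you deployed in Tomcat or Jetty.
            HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr/collection1");

            // Match-all query, just to confirm the instance is reachable from the IDE.
            QueryResponse response = server.query(new SolrQuery("*:*"));
            for (SolrDocument doc : response.getResults()) {
                System.out.println(doc);
            }
            server.shutdown();
        }
    }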

Why doesn't Eclipse support the new Hadoop version?

I want to integrate Hadoop with Eclipse. It seems the Eclipse plugin only supports old versions of Eclipse and Hadoop; newer versions cause an error about an EOFException (details here). I finished my configuration with Eclipse 3.3 and Hadoop 0.20.0 and it works well. I want to know what I should do if I want to use newer versions of Eclipse and Hadoop, or another way to develop Hadoop programs.
Could anyone help me? Thanks very much.

Hadoop 0.20.2 Eclipse plugin not fully functioning - can't 'Run on Hadoop'

I've just finished installing Hadoop 0.20.2 under Cygwin on Windows 7 with Eclipse Helios (3.6). Hadoop is now fully started, and I'm trying to run a test application within a newly created MapReduce test project in Eclipse. I'm using the Hadoop 0.20.2 plugin from the Hadoop download.
The Map/Reduce Location perspective operates correctly, as does the DFS Locations tree in the Package Explorer. However, when I right-click the driver and select 'Run As' > 'Run on Hadoop', nothing happens and no errors appear in the Console (a silent fail). I believe a dialog window should appear asking for configuration before it runs, but this is not happening.
There seem to be a few others with the same problem, but I've yet to find an answer that works. I've tried the 0.20.1 plugin (total fail). The following bug report seems to describe my issue, though I'm a bit of a newbie to all this, so I could do with a hand / voice of experience to help out: https://issues.apache.org/jira/browse/MAPREDUCE-1280
The Hadoop Eclipse plugin bundled with the Hadoop distribution is compatible with Eclipse only up to version 3.3. The JIRA ticket MAPREDUCE-1280 contains a patch for running the plugin in Eclipse 3.4 and upwards.
I just compiled the patched plugin with the fixes from the JIRA ticket MAPREDUCE-1280. The file is attached to the ticket. You can find it here.
Simply remove the old plugin from your Eclipse installation and put the new version of the plugin into the dropins folder of your Eclipse installation.
After upgrading from an older version of the plugin you will have to start Eclipse with the "-clean" command-line switch. Help on Eclipse command-line switches can be found here.
I don't know whether the plugin has been updated or not, but as far as I know it has been out of maintenance for several releases.
One solution is to download the source code and recompile the plugin jar against the latest version of Eclipse; however, I haven't tried it myself, so I can't say whether it works.
Maybe you can try to use Karmasphere.
askswOrder is correct that the Eclipse plugin has not seen much attention for quite a long time. The JIRA you reference does provide a fix, but it's only been applied to Hadoop 20.3 and above. One option would be to try to apply the patch to 20.2 and recompile, but that's asking quite a lot from a newbie. I'd second the suggestion to use Karmasphere; it's a great product for working with MapReduce and those gents have taken on the work of staying current with Hadoop releases.
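One workaround that none of the answers above mention, offered here only as a sketch: when 'Run on Hadoop' fails silently, the driver can still be launched as an ordinary Java application ('Run As' > 'Java Application') and submit the job itself through the standard Job API. The property names below match old 0.20-style configuration, and the addresses and paths are placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class DriverWithoutPlugin {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder addresses: the same values the plugin's Hadoop location dialog asks for.
            conf.set("fs.default.name", "hdfs://localhost:9000");
            conf.set("mapred.job.tracker", "localhost:9001");

            Job job = new Job(conf, "run-without-plugin");
            job.setJarByClass(DriverWithoutPlugin.class);
            // No mapper or reducer set: the identity defaults are enough to verify that
            // job submission itself works without the plugin's run dialog.
            FileInputFormat.addInputPath(job, new Path("/tmp/input"));    // placeholder input path
            FileOutputFormat.setOutputPath(job, new Path("/tmp/output")); // placeholder output path

            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }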