Running Simple Hadoop Programs On Eclipse

I am pretty new to Hadoop and Ubuntu, so please bear with me. I find it very inconvenient to compile my Hadoop .java files from the command line, so I have created an Eclipse project and imported all the Hadoop libraries so that Eclipse does not throw any reference errors. And it does not. However, when I run the files as a standalone Java application I get the following error:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
I am running on Ubuntu and I have researched this problem elsewhere on the web. I do not expect to see this error, since the only difference is that I am running it within Eclipse and not from the command line. Where am I going wrong? Is there a specific way in which I need to add Hadoop dependencies to my hello-world Hadoop projects? Will a simple build path configuration and importing of the necessary libraries not suffice? I appreciate all your responses.

You can try right-clicking the project -> Build Path -> Configure Build Path.
Go to your src folder, point to "Native Library", then edit the location to point to your Hadoop native library folder (normally ~/hadoop-x.x.x/lib/native/"folder-depending-on-your-system").

It is a warning, not an error; it tells you that there is some problem loading the native libraries which Hadoop makes use of. It should not have any negative impact on your job's output, though. Remember that Hadoop has native implementations of certain components, for performance reasons and because Java implementations are unavailable. On *nix platforms the library is named libhadoop.so. Using Eclipse doesn't make any difference to the way Hadoop works; it's just that your Eclipse is unable to load the native libraries for some reason.
One possible reason might be that there is some problem with your java.library.path. You can configure Eclipse to load the proper libraries by configuring the build path as per your environment. To learn more about Hadoop's native libraries, and how to build and use them, you can visit this link.
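If you want to see what the JVM is actually searching, a few lines of Java will show it. This is just a sketch; the install location is an assumption, to be replaced with your own (you can set it in Eclipse under Run Configurations -> Arguments -> VM arguments, e.g. -Djava.library.path=/usr/local/hadoop/lib/native):

public class NativeLibCheck {
    public static void main(String[] args) {
        // Directories the JVM searches for native libraries; libhadoop.so
        // must be in one of them for the warning to go away.
        System.out.println(System.getProperty("java.library.path"));
        try {
            // Same lookup Hadoop's NativeCodeLoader attempts internally.
            System.loadLibrary("hadoop");
            System.out.println("native-hadoop library loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("not found, builtin-java classes will be used: " + e.getMessage());
        }
    }
}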

Related

IntelliJ download the libraries once and use in multiple projects

I have IntelliJ with the Scala plugin installed on a server. The server is disconnected from the global internet and updates can only be done occasionally.
I would like to download some libraries once (e.g. Spark libraries, some Java libraries) and use them in IntelliJ in multiple projects without needing to download them again, loading them from local directories instead. It would also be great to have a 'full' bundle of libraries (e.g. all Spark libraries) and be able to use only particular classes when necessary (e.g. SparkContext only).
TK
P.S. The question is somewhat related to: Use Scala on computer without internet connection
As @MarioGalic suggested, the clue was to move the required libraries to the ~/.ivy2 directory.
Sometimes the answer is to add the libraries manually in the IntelliJ project setup, instead of using SBT or Maven to manage the dependencies.

How can I create an Upstart Zip package using sbt-native-packager?

I am unable to generate an Upstart zip package.
I have added the following to my sbt project:
...
enablePlugins(UpstartPlugin)
enablePlugins(JavaServerAppPackaging)
and then run
sbt clean universal:packageBin
A zip file is produced, but it contains the shell script in ./bin and the jars in ./lib, which looks like what is produced when I use JavaAppPackaging!
Where are the conf files etc needed for upstart?
Am I missing the secret sauce or using the wrong incantation?
Reading all the docs on this page, I am under the impression that the archetype plugins determine what goes into my package, while the format plugins determine what form the package takes.
So, for example, I could have a Java server project that is designed to run as a daemon user, using JavaServerAppPackaging (archetype) and adding the daemonUser setting, but then have this bundled up as a zip or tar.gz using the Universal (format) plugin, or as a .deb file using the Debian (format) plugin.
Well, I want a Java service with all the files necessary for it to be started by the Upstart system loader, but packaged as a zip file. So I assume I need the Upstart (system loader) plugin together with the Universal (format) plugin.
There is even a tip on the system loader docs for the upstart plugin saying
You can use systemloaders with the Java Application Archetype or the Java Server Application Archetype!
Well, that's exactly what I want, however it doesn't state how to do it!
Please can someone tell me how to get a zip bundle with an Upstart layout that starts a Java server application? And if you can point out the documentation I have clearly missed, then that would help my understanding too :-)
Cheers
Karl
Thanks for reading the docs so carefully, and sorry that the docs lack an explanation of which package types support system loaders.
The short answer is: only rpm and debian packages support system loaders (systemd, systemV and upstart).
Why is that so? A system loader is tied to the target operating system. It is nothing universal, which is the reason it's not available in the Universal format. The configuration files and/or scripts are platform dependent, and they are not part of your regular application directory.
Cheers,
Muki

ClassNotFoundException: com.itextpdf.text.Element

Situation: I have a Java file in my project that uses the features of the iTextPDF library. The project compiles properly. I use JDK 1.7, Tomcat 7.45 and Eclipse Neon.3 Release (4.6.3).
Problem: While starting the server via Eclipse, I get an error:
java.lang.ClassNotFoundException: com.itextpdf.text.Element
What I've tried so far:
Ensured that only one version of the iTextPDF 5.4 jar is available in the entire project. It is in the WEB-INF/lib folder, and it is not in any of the externally referenced libraries.
I updated my Eclipse.
Any help will be greatly appreciated.
Well, as a starting point, try expanding the JAR and see if you can search for or manually find the com.itextpdf.text.Element class (a small sketch that automates this check follows the list below).
If it's not found there, you know there's nothing wrong with your Eclipse or project settings, and nothing wrong with your jar imports.
You should then decide between three options:
Is the JAR even on the classpath? It's possible everything is present there, but the project does not even consider looking in it.
Should this class be in the JAR? Is it available in other versions of this JAR?
Is this class necessary for your application? Why is Eclipse looking for it, and where in the code is it referenced? Can you live without it? Or can you manually replace it with a class file you can find online? (This will take some debug time, and some more research on your part.)
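If you would rather not expand the jar by hand, a small sketch like the following automates the search; the jar path and version are placeholders, so point them at the actual file in your WEB-INF/lib:

import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class FindClassInJar {
    public static void main(String[] args) throws Exception {
        // Placeholder path/version: use the jar actually sitting in WEB-INF/lib.
        try (JarFile jar = new JarFile("WEB-INF/lib/itextpdf-5.4.jar")) {
            boolean found = false;
            Enumeration<JarEntry> entries = jar.entries();
            while (entries.hasMoreElements()) {
                // com.itextpdf.text.Element is stored under this entry name.
                if (entries.nextElement().getName().equals("com/itextpdf/text/Element.class")) {
                    found = true;
                    break;
                }
            }
            System.out.println(found ? "class IS in the jar" : "class is NOT in the jar");
        }
    }
}

If the entry is missing, option 2 above applies: the class simply isn't in that version of the jar.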

Eclipse plugin: Self hosting works but jar gives unexpected behavior

I have finished developing a plugin for Eclipse; along the way I have been using the self-hosting feature of Eclipse to test and debug my plugin. However, after exporting the plugin and installing it into my own Eclipse host, hardly anything works.
I have JavaFX UIs which won't appear anymore, files cannot be read due to URIs not being hierarchical, and other parts work very strangely.
I came here to ask: why does the plugin work on a self-hosted Eclipse application, but not when exported and installed on my current host?
Could it be something to do with other plugins causing conflicts?
Does self-hosting work differently than installing jars?
The main difference is that your code is packed into a jar. If you are trying to access files in your plug-in using things like File or FileInputStream, or anything else that expects a file, these will not work. There are specific Eclipse APIs you must use to access resources in plug-in jars (mainly FileLocator).
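For instance, reading a resource through FileLocator works both ways. A minimal sketch, where the plug-in id com.example.myplugin and the path data/template.fxml are made-up names for illustration:

import java.io.IOException;
import java.io.InputStream;

import org.eclipse.core.runtime.FileLocator;
import org.eclipse.core.runtime.Path;
import org.eclipse.core.runtime.Platform;
import org.osgi.framework.Bundle;

public class BundleResources {
    public static InputStream open() throws IOException {
        // Placeholder plug-in id; use your bundle's symbolic name.
        Bundle bundle = Platform.getBundle("com.example.myplugin");
        // FileLocator resolves the entry whether the plug-in runs exploded
        // from the workspace (self-hosting) or packed in a jar (installed),
        // so no File or hierarchical URI is ever assumed.
        return FileLocator.openStream(bundle, new Path("data/template.fxml"), false);
    }
}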
Another common mistake is not including everything the plug-in requires in the build.properties file. The plug-in jar only includes files listed in this file; when you test locally this requirement is not checked.
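For completeness, a build.properties along these lines includes extra folders in the jar; the icons/ and data/ entries are only examples of folders a plug-in might ship:

bin.includes = META-INF/,\
               plugin.xml,\
               .,\
               icons/,\
               data/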

Build vs Deploy vs Publish (Eclipse IDE)

I'm a newbie to J2EE, though not a complete newbie. I'm unable to find a good resource (book or video) that could help me understand what exactly happens when we build, deploy and publish. I have a fair idea, though. So my questions are:
Is there a good resource out there that can help me understand these concepts? I've read some books on Struts and servlets/JSP, but they don't delve into Eclipse and how/what it does. The Eclipse documentation has been helpful, but only slightly.
When we build an application, the .java files are compiled into .class files and stored in the output folder of the Java build path. What else happens during a build? Many people use the term 'library dependencies'; what does this mean? Also, when people refer to dependencies, do they mean files like XML and TLD files?
At what stage (build, or run on server) does the container check whether the dependencies are alright? Say, for instance, the servlet class/name in the web.xml file.
Is it appropriate to say that building is basically compilation, while deploying the project and running it is the same as executing it?
Familiarity with the servlet specification would help you (perhaps some older version would be quicker to read, like 2.4), but a general understanding of what you build and how you do it in Eclipse is what you are after.
The way I see it is that during the build Eclipse creates an almost complete version of the WAR (or some other archive, if you use EJBs for instance), and by publishing you deploy it to some server. These are practically the same thing, although Eclipse might just configure the server to use the exploded WAR it has prepared, instead of copying it to some "deploy" directory as you would do when working without an IDE.
If you configure your project well, the build can mean compilation only, but if there is more ceremony in it, then some source generation and moving files around might happen too.
To address your second question: library dependencies can be files that reside in WEB-INF/lib, for instance. Read the spec to learn what should be there and what should not. Eclipse tries to copy all the defined dependencies of your project there.
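To see where those dependencies end up at runtime, a throwaway servlet (hypothetical, purely for inspection) can ask the container what it sees under WEB-INF/lib of the deployed, possibly exploded, WAR:

import java.io.IOException;
import java.util.Set;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class LibListingServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // Paths of the jars the container actually deployed with the app;
        // these should match what Eclipse copied from your build path.
        Set<String> jars = getServletContext().getResourcePaths("/WEB-INF/lib");
        resp.setContentType("text/plain");
        resp.getWriter().println(jars);
    }
}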