Adding Dependency Management to an Existing Java Project - eclipse

I'm working on upgrading a legacy Java project to be compatible with JBoss WildFly. As part of that process, I'm replacing our old system of managing dependencies (manually scanning for jars in a folder) with an automated system.
My first thought was to use Maven, which worked well initially. The Maven plugin for Eclipse was able to scan my project and create a POM with most of the required dependencies. That works fine for compiling and running within Eclipse, but production deployment uses an Ant build script. I looked into the Maven Resolver Ant Tasks ( https://maven.apache.org/resolver-ant-tasks/index.html ), but as far as I can tell that project doesn't have a way to add dependencies to the classpath; the best it can do is bundle them into a jar.
The other option I looked at was Ivy. It seems better suited to integration with Ant. Unfortunately, the tooling for Ivy seems primitive compared to Maven's. From what I can tell, there is no option to generate the dependency file (ivy.xml) from an existing project. With the number of dependencies I'm dealing with, especially from JBoss, creating the dependency XML from scratch is not a realistic option.
What are my options for solving this problem? Is there a way to do what I want with maven or ivy that I'm not seeing? Is there another dependency management tool out there that offers all the features I need?

The maven-assembly-plugin is what I can recommend for likely use cases. Not sure if it suits you, though.
In a nutshell:
You can pack folders, jars, resources, dependencies, whatever into a jar for production deployment. The jar is packaged via maven-archiver, which the maven-assembly-plugin uses internally (so you don't need to reference it explicitly) and which can also write a MANIFEST.MF with the classpath in it (not by default, but with a few lines of tweaking).
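A minimal sketch of that tweaking, assuming the plugin's archive configuration (the main class is made up and the assembly descriptor itself is left out):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
            <!-- assembly descriptor configuration omitted here -->
            <archive>
                <manifest>
                    <!-- illustrative main class -->
                    <mainClass>com.example.Main</mainClass>
                    <!-- ask maven-archiver to write a Class-Path entry into MANIFEST.MF -->
                    <addClasspath>true</addClasspath>
                </manifest>
            </archive>
        </configuration>
    </plugin>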
Useful to know, though: Maven makes it quite easy to create your own plugins that do exactly what you want. If all you need is a file with the stored classpath, this could be a clean solution.
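Before writing your own plugin, it may be worth checking whether the existing maven-dependency-plugin already covers this: its build-classpath goal can write the resolved classpath to a text file that an Ant script could then read. A rough sketch (the output path is illustrative):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <executions>
            <execution>
                <id>write-classpath</id>
                <phase>generate-sources</phase>
                <goals>
                    <goal>build-classpath</goal>
                </goals>
                <configuration>
                    <!-- file the Ant build could load as a path/property -->
                    <outputFile>${project.build.directory}/classpath.txt</outputFile>
                </configuration>
            </execution>
        </executions>
    </plugin>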

Related

How to Create a Spring+Primefaces+Hibernate (no maven) project in eclipse?

I am new to J2EE. I would like to create a Spring+Primefaces+Hibernate project.
I googled for it.
But all the example projects I found on the internet use Maven. My questions are:
1. Is it possible to create a Spring+PrimeFaces+Hibernate project in Eclipse without Maven? If not, what is the need for Maven?
2. How do I add the jar files for PrimeFaces, Spring and Hibernate in Eclipse?
3. Will the Spring controller XML file (Spring context or dispatcher servlet) be created automatically or manually? I mean Spring MVC.
4. Will the Hibernate file (mapping file) also be created automatically or manually?
5. If possible, can anyone guide me to a tutorial (preferably video) that implements the same?
I am using Tomcat 7 and Eclipse Kepler.
Any help is appreciated.
If this is downvoted, please specify the reason as well.
1. Although it's not a must to use Maven or any other build tool, you should strongly consider using one. Eclipse Kepler has Maven support by default, but feel free to use another build tool (Gradle, Ant) or none (see 2.). Maven and the other build tools remove the headache of scaffolding, searching for dependencies (external jars like spring-mvc, hibernate, some DB drivers), and even deploying applications to a server.
2. If you choose not to use a build tool, you have to get your project dependencies manually and enter them into your project's build path (Right Click -> Build Path, then enter their location). As you have noticed, this step can be really, really time consuming...
3. No, you have to create the configuration manually, unless you use another project that already has what you need; again, this might get easier with a build tool (Maven archetypes, for example).
4. The same as 3.
5. You won't have a hard time finding resources about these technologies; they are used practically everywhere, and I think the Spring team has some videos on their YouTube channel.
Hope that helps a little!
1. The fundamental difference between Maven and Ant is that Maven's design regards all projects as having a certain structure and a set of supported task work-flows (e.g., getting resources from source control, compiling the project, unit testing, etc.). While most software projects in effect support these operations and actually do have a well-defined structure, Maven requires that this structure and the operation implementation details be defined in the POM file. Thus, Maven relies on a convention on how to define projects and on the list of work-flows that are generally supported in all projects.
This design constraint resembles the way that an IDE handles a project, and it provides many benefits, such as a succinct project definition, and the possibility of automatic integration of a Maven project with other development tools such as IDEs, build servers, etc.
But one drawback to this approach is that Maven requires a user to first understand what a project is from the Maven point of view, and how Maven works with projects, because what happens when one executes a phase in Maven is not immediately obvious just from examining the Maven project file. In many cases, this required structure is also a significant hurdle in migrating a mature project to Maven, because it is usually hard to adapt from other approaches.
In Ant, projects do not really exist from the tool's technical perspective. Ant works with XML build scripts defined in one or more files. It processes targets from these files and each target executes tasks. Each task performs a technical operation such as running a compiler or copying files around. Targets are executed primarily in the order given by their defined dependency on other targets. Thus, Ant is a tool that chains together targets and executes them based on inter-dependencies and other Boolean conditions.
The benefits provided by Ant are also numerous. It has an XML language optimized for clearer definition of what each task does and on what it depends. Also, all the information about what will be executed by an Ant target can be found in the Ant script.
A developer not familiar with Ant would normally be able to determine what a simple Ant script does just by examining the script. This is not usually true for Maven.
However, even an experienced developer who is new to a project using Ant cannot infer what the higher level structure of an Ant script is and what it does without examining the script in detail. Depending on the script's complexity, this can quickly become a daunting challenge. With Maven, a developer who previously worked with other Maven projects can quickly examine the structure of a never-before-seen Maven project and execute the standard Maven work-flows against it while already knowing what to expect as an outcome.
It is possible to use Ant scripts that are defined and behave in a uniform manner for all projects in a working group or an organization. However, when the number and complexity of projects rises, it is also very easy to stray from the initially desired uniformity. With Maven this is less of a problem because the tool always imposes a certain way of doing things.
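To make the contrast concrete, here is a rough sketch (artifact, version and paths are made up for illustration): in Maven you declare what the project depends on and rely on the standard lifecycle, whereas in Ant you spell out the compile step and its classpath yourself.

    <!-- Maven: fragment of a pom.xml -->
    <dependencies>
        <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-core</artifactId>
            <version>4.2.0.Final</version>
        </dependency>
    </dependencies>

    <!-- Ant: fragment of a build.xml -->
    <target name="compile">
        <!-- classpath is whatever jars happen to be in lib/ -->
        <javac srcdir="src" destdir="build/classes" includeantruntime="false">
            <classpath>
                <fileset dir="lib" includes="*.jar"/>
            </classpath>
        </javac>
    </target>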
2. You have to download all the required jar files for Hibernate/Spring/PrimeFaces from the internet and place them in your project's build path or in a lib folder.
3. Spring configuration files need to be created by you, so that you learn the concepts.
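For orientation only, a minimal Spring MVC context file might look roughly like this (the package name is made up):

    <?xml version="1.0" encoding="UTF-8"?>
    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xmlns:context="http://www.springframework.org/schema/context"
           xmlns:mvc="http://www.springframework.org/schema/mvc"
           xsi:schemaLocation="http://www.springframework.org/schema/beans
               http://www.springframework.org/schema/beans/spring-beans.xsd
               http://www.springframework.org/schema/context
               http://www.springframework.org/schema/context/spring-context.xsd
               http://www.springframework.org/schema/mvc
               http://www.springframework.org/schema/mvc/spring-mvc.xsd">

        <!-- scan a (made-up) package for @Controller classes -->
        <context:component-scan base-package="com.example.web"/>
        <!-- enable Spring MVC annotation processing -->
        <mvc:annotation-driven/>
    </beans>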
4. Hibernate mapping files can be created using Hibernate's reverse engineering tools, which can generate the hbm files for you, or you can use annotations if you don't want XML.
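A generated or hand-written hbm file is small; here is a sketch with a made-up User class:

    <?xml version="1.0"?>
    <!DOCTYPE hibernate-mapping PUBLIC
        "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
        "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
    <hibernate-mapping>
        <!-- maps a made-up User class onto a USERS table -->
        <class name="com.example.model.User" table="USERS">
            <id name="id" column="USER_ID">
                <generator class="native"/>
            </id>
            <property name="username" column="USERNAME"/>
        </class>
    </hibernate-mapping>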
I suggest you first create a sample Java project in Eclipse, then download all the required jars and place them in a lib folder. Then configure Hibernate in the project, followed by the Spring integration.

Best practice for using Tycho/Maven to remove jar dependencies in Eclipse RCP?

I'm working on an Eclipse RCP project. Currently we create a dependencies plug-in project, put all the jar libraries into that project, and export all packages. This method results in a huge repo, so we want to use Tycho/Maven and let it figure out the dependencies for us.
The first approach is removing the dependencies project and using the p2-maven-plugin to transform the existing jar libraries into a p2-format repo, installing all libraries from that p2 repo, and adding the required bundles to the Require-Bundle section of each MANIFEST.MF. This is a little tedious, since in every project that has dependencies in Require-Bundle I have to manually replace them with the corresponding bundle names. And in the end, the project built with Tycho runs successfully, but in Eclipse it gives me java.lang.NoClassDefFoundError: Could not initialize class X.
I think there are a few configuration files, with Tycho depending on some of them and Eclipse on the rest, but I'm not sure which.
The second approach is removing all the jars from the dependencies project but adding them in Require-Bundle or Import-Package. However, that won't work either, since Eclipse will complain in the Export-Package section that those packages do not exist. Then other projects that depend on this dependencies project won't find the packages they need, which causes more errors in Eclipse.
Does anyone know the best practice to deal with this issue?
Update:
I'm basically using the first approach, but I add dependencies in Import-Package in each project instead of Require-Bundle. This eliminates the need to specify a particular bundle version; as long as the bundles provide the same API and are compatible, the application works. So every time I update the private p2 repository, I don't need to change the MANIFEST.MF in each project.
The only MANIFEST.MF where I still need to manually add dependencies in Require-Bundle is for a library developed by ourselves. Without it, Tycho won't fetch the required dependencies from the private p2 repository. If you still get NoClassDefFoundError, try adding all plug-ins in Run -> Run Configurations... -> Plug-ins; it may help.
I would definitely not apply your approach 1, with the mega-plugin of exports. There's a related discussion here: Handling non-OSGi dependencies when integrating Maven, Tycho and Eclipse
As a rule, use Import-Package instead of Require-Bundle.
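For illustration (the bundle and package names are invented), the two styles look like this in a MANIFEST.MF; Import-Package ties you to a package and a version range rather than to a specific bundle:

    Require-Bundle: com.example.thirdparty.lib;bundle-version="1.2.0"

versus

    Import-Package: com.example.thirdparty.api;version="[1.2,2.0)"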
To get bundles to appear in the Export-Package section in Eclipse:
if they are non-Eclipse (Maven libraries), build the project and reference the libraries in the Eclipse runtime section.
if they are Eclipse dependencies, they should be in your workspace or Target Platform.
More generally, it may help to define a Target Platform. You can build/deploy all of your locally created plug-ins into a local p2 repository (see http://www.sonatype.org/nexus/), then add that p2 site to your Target Platform.

Is there a way to generate a pom.xml with dependencies from an Eclipse project?

I have inherited a big project with several subprojects.
All of them use several jar files, located under each project's lib directory. I want to take all the projects and migrate them to Maven, but the dependencies are a problem (there are too many of them); some are commonly used libraries (Apache projects, Xerces, JMS, etc.) and others are not.
Is there a way to auto-generate Maven dependencies for those jars that can be found in public Maven repositories? For example, to see that my project uses the spice-jndikit-1.2.jar file and automatically get the appropriate dependency with group, artifact and (if possible) version?
thank you
I wrote a Groovy script to generate a starting set of Apache Ivy files.
https://github.com/myspotontheweb/ant2ivy
In my case, I wanted to "Maven-ize" my ANT builds without switching completely away from ANT.
It is feasible to extend this code to generate a Maven POM, if people were interested in this feature.
You can convert a project to Maven using the m2e plugin, but this erases your jar references, and should not be used.
I doubt that such a thing exists since typical jars (unless themselves built with Maven) don't have the necessary information to correlate the groupId, artifactId and version back to a repository to get the proper path.
You might be able to write something that parses the file name for the name and version, but you still have the package-based path to figure out.
If you're building with Ant, you might also consider using Apache Ivy and its file-system based resolution (very fast and easy to configure) to get started, and slowly roll over to the Maven repos for the artifacts; this way you're not spending a lot of time up front finding Maven dependencies.
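A rough ivysettings.xml sketch of that idea (the resolver names and lib path are illustrative): a chain that looks at the jars already in lib/ first and falls back to Maven Central as artifacts are migrated.

    <ivysettings>
        <settings defaultResolver="migration-chain"/>
        <resolvers>
            <chain name="migration-chain">
                <!-- resolve against the jars already checked in under lib/ -->
                <filesystem name="local-lib">
                    <artifact pattern="${basedir}/lib/[artifact]-[revision].[ext]"/>
                </filesystem>
                <!-- fall back to Maven Central for anything migrated over -->
                <ibiblio name="central" m2compatible="true"/>
            </chain>
        </resolvers>
    </ivysettings>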

Extract Eclipse project .classpath dependencies into an Ant script

I have a list of Eclipse projects that I would like to compile based on the existing project configuration.
As far as I can tell, if an Ant script could read the .classpath files, it would pretty much be able to infer the project dependencies and perform a "javac" compilation in the right order. This would save describing the same dependencies again in the Ant script or a Makefile.
The dependencies I am interested in are JAR dependencies, JRE dependencies, and inter-project dependencies. These are, as far as I can tell, part of the .classpath XML file.
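For reference, those three kinds of entries look roughly like this in a .classpath file (the paths and project name are illustrative):

    <?xml version="1.0" encoding="UTF-8"?>
    <classpath>
        <classpathentry kind="src" path="src"/>
        <!-- JRE dependency (classpath container) -->
        <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
        <!-- JAR dependency -->
        <classpathentry kind="lib" path="lib/commons-io-2.4.jar"/>
        <!-- inter-project dependency -->
        <classpathentry kind="src" path="/OtherProject" combineaccessrules="false"/>
        <classpathentry kind="output" path="bin"/>
    </classpath>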
Any ideas on how Eclipse project dependencies could be used in an Ant script?
Right click on your Project -> Export
"General/Ant Buildfiles".
Choose the projects and there you go.
Otherwise...
I have some experience with ant4eclipse and it is a hassle to get it stable.
Go check Buckminster or Maven Tycho for a good solution.
I'm currently using Ivy along with Ant, Eclipse and Maven.
I just love the way Ivy works.
Currently, we have a workspace with many projects using Liferay (with Tomcat) for the front-end and Glassfish for the back-end.
We were looking for a way to manage our dependencies a lot better than how we were doing it.
So I took Ivy, replaced all of the classpaths and deployment dependencies in Eclipse, and was able to build my application with one Ivy file per project, using either Eclipse or Ant.
Ivy integrates like a charm with Ant, and builds can be done either from the workspace or from the command line.
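To give an idea of the scale involved, an ivy.xml per project stays small; a sketch with made-up coordinates:

    <ivy-module version="2.0">
        <info organisation="com.example" module="frontend"/>
        <dependencies>
            <!-- each entry replaces a jar that used to be checked into the project -->
            <dependency org="org.slf4j" name="slf4j-api" rev="1.7.7"/>
            <dependency org="commons-io" name="commons-io" rev="2.4"/>
        </dependencies>
    </ivy-module>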
I strongly suggest you look at this avenue. Additionally, by adding Artifactory, we have a local repository in which the Ivy files look for dependencies. This helps us maintain and control which jars are to be used by developers. Once everything is set up, we will build our application nightly using Jenkins, and these builds will use our Artifactory repository to resolve dependencies, since our build servers do not have access to the internet.
Hope this helped
If you are running the Ant script only from Eclipse using the "External Tools Configurations", you can add the variable ${project_classpath} to the classpath.
Depending on whether you are in a plug-in project, and on its dependencies, you might also need to add ${eclipse_home}.
In case you get the error "Variable references empty selection: ${project_classpath}" when launching, make sure the Ant XML file, or at least the project, is selected. This is important.
I believe the ant4eclipse project provides support for executing Ant builds based on Eclipse metadata files.
However, in my opinion that is doing things back to front. You shouldn't have your build (Ant) depend on your IDE (Eclipse) environment. But it is useful if you can derive your Eclipse environment from your Ant build.
This is an approach used successfully in a team I worked in. We had a helper Ant target which applied XSLT to each project's build.xml file to transform it into an Eclipse .classpath file. Thus the Ant build.xml files were the single point of configuration for our projects.

Eclipse: one classpath for compiling, another for launching

Example:
For logging, my code uses log4j, but other jars my code depends on use slf4j instead, so both jars must be on the build path. Unfortunately, it's now possible for my code to use (depend on) slf4j directly, either through content assist or some other developer's changes. I would like any use of slf4j to show up as an error, but my application (and tests) will still need it on the classpath when running.
Explanation:
I'd like to find out whether this is possible in Eclipse. This scenario happens often for me: I'll have a large project that uses a lot of third-party libraries, and of course those third-party jars have their own dependencies as well. So I have to include all the dependencies in the classpath ("build path" in Eclipse) for the application and its tests to compile and run (from within Eclipse).
But I don't want my code to use all of those jars, just the few direct dependencies I've decided on myself. So if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally as class not found, but any error would do.
I know I can manually configure the classpath when running outside of Eclipse, and even within Eclipse I can modify the classpath for a specific class I'm running (in its run configuration), but that's not manageable if you run a lot of individual test cases or have a lot of main() classes.
It sounds like your project has enough dependency relationships that you might consider structuring it with OSGi bundles (plug-ins). Each bundle gets its own classloader and gets to specify what bundles (and optionally what version ranges, etc.) it depends on, what packages it exports, whether it re-exports stuff from its dependencies, etc.
Eclipse itself is structured out of Eclipse plug-ins and fragments, which are just OSGi bundles with an optional tiny bit of additional Eclipse wiring (plugin.xml, which is used to declare Eclipse "extension points" and "extensions") attached. Eclipse thus has fairly good tooling for creating and managing bundles built-in (via the Plug-in Development Environment). Much of what you find out there may lead you to conflate "OSGi bundle" with "plug-in that extends the Eclipse IDE", but the two concepts are quite separable.
The Eclipse tooling does distinguish rather clearly (and sometimes annoyingly, but in the "helpful medicine" way) between the bundles in your build environment vs. the bundles that a particular run configuration includes.
After a few years of living in OSGi land, the default Java "flat classpath" feels weird and even kind of broken to me, largely because (as you've experienced) it throws all JARs into one giant arena and hopes they can sort of work things out. The OSGi environment gives me a lot more control over dependency relationships, and as a "side effect" also naturally demands clarification of those relationships. Between these clear declarations and the tooling's enforcement of them, the project's structure is more obvious to everyone on the team.
if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally, as class not found, but any error would do.
Put your code in one plug-in, your direct dependencies in other plug-ins, their dependencies in other plug-ins, etc. and declare each plug-in's dependencies. Eclipse will immediately do exactly what you want. You won't be offered dependencies' dependencies' contents in autocompletes; you'll get red squiggles and build errors; etc.
Why not use access rules to keep your code clean?
It looks like this would be better managed with Maven, integrated into Eclipse with m2eclipse.
That way, you can execute only part of the Maven build lifecycle, and you can manage a separate set of dependencies per build step.
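For instance (the versions are illustrative), dependency scopes give exactly this separation: log4j with the default compile scope stays visible to your code, while the slf4j pieces with runtime scope are kept off the compile classpath, so direct use of slf4j fails to compile but is still available when the application or tests run.

    <dependencies>
        <!-- the logging API the code is allowed to use directly -->
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
        </dependency>
        <!-- needed by third-party jars at runtime, but invisible at compile time -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.7</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.7</version>
            <scope>runtime</scope>
        </dependency>
    </dependencies>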
In my experience it helps to be more restrictive. I made the team fill out (paper) forms stating why each jar is needed and under what license...
and they would rather type in a few lines of code than drag along 20 jars just to open a file with a single line of code, or some other fancy 'feature'.
Using Maven can help for a while, but once you spot jars with names like nightly-build or snapshot, you will know you're in jar hell.
Conclusion: choose dependencies well.
Would using the slf4j-log4j12 binding jar be useful? That allows using slf4j with the actual logging going to log4j.