Get Jenkins plugin dependencies auto-installed

I'm developing a Jenkins plugin that depends on another plugin (specifically the MultiJob plugin, but it could be any other one, of course).
Obviously, the dependency is declared in the POM, so I can actually use its classes.
The problem: if I try to install my plugin on a Jenkins instance where the dependency is not present, Jenkins doesn't install it automatically, and on first use my plugin throws a NoClassDefFoundError, of course.
Question: can I make Jenkins install my dependencies as prerequisites, and if so, how?
Note: I do see that other plugins somehow cause their dependencies to be installed (the Git plugin, for instance, gets GitClient installed during its own installation).
Thanks in advance.

It's been a while since I raised the question, but if anybody is looking for something similar, here is what I finally came up with:
Since the dependency classes are only needed when they are actually present, I decided to use Java's lazy linkage behavior and refer to the relevant classes only on demand.
Practically, I made a factory that holds a list of the class names of interest, and every time I need to process some object I check its class against this list. If it matches, the class is loaded, so it's safe to run the linking/initialization logic.
One last thing: if you plan to use such a pattern, don't forget to mark those dependency plugins as optional in your pom.xml.
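A minimal sketch of the factory idea, with hypothetical names (the MultiJob class name and the processing step are illustrative, not taken from the real plugin):

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.Set;

    public class OptionalDependencyFactory {

        // Class names that live in the optional plugin; kept as strings so that
        // merely loading this factory never links against the optional classes.
        private static final Set<String> OPTIONAL_CLASSES = new HashSet<>(
                Arrays.asList("com.tikal.jenkins.plugins.multijob.MultiJobProject"));

        public static boolean canHandle(Object item) {
            // Comparing by name does not trigger class loading of the dependency.
            return OPTIONAL_CLASSES.contains(item.getClass().getName());
        }

        public static void handle(Object item) {
            if (!canHandle(item)) {
                return;
            }
            // Only reached when the optional class is demonstrably present (it is
            // the object's own class), so it is now safe to reference it directly,
            // e.g.: process((MultiJobProject) item);
        }
    }

And the corresponding entry in pom.xml, marking the plugin dependency as optional (the version number is just an example):

    <dependency>
      <groupId>org.jenkins-ci.plugins</groupId>
      <artifactId>jenkins-multijob-plugin</artifactId>
      <version>1.32</version>
      <optional>true</optional>
    </dependency>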

Related

NoClassDefFoundError: org/eclipse/search/ui/text/TextSearchQueryProvider

I am writing a plugin for Eclipse. When calling TextSearchQueryProvider sqProvider = TextSearchQueryProvider.getPreferred();, I get a NoClassDefFoundError.
The funny thing is, I only get this with the exported jar plugin, not while debugging the plugin. I figured it might be related to exporting the org.eclipse.search plugin, but that plugin is so basic that Eclipse doesn't run without it anyway. So I guess the plugin should be there.
I am running Eclipse Photon (4.8.0).
Some more clarifications:
I have specified org.eclipse.search as dependency in MANIFEST.MF:
Require-Bundle: javax.inject,
org.eclipse.search;bundle-version="3.0.0";visibility:=reexport,
....
I have imported org.eclipse.search.ui.text.TextSearchQueryProvider
Do I need to do anything else, that I am not aware of?
Addition:
The plug-in-related views did not show any obvious problems.
Specifically, the 'org.eclipse.search' dependency is satisfied by version '3.11.200.v20180503-1856', which to me implies that the plugin has been successfully linked?
The problem vanished after I exported the plug-in with another version postfix.
I had originally called the postfix "beta". After giving it a new postfix with a date,
the dependency tree seems to work correctly. It might be that the original package was bad and misconfigured in the MANIFEST, and only after re-exporting with a different name were the package dependencies re-evaluated correctly.
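For illustration only (the version values are made up), the fix amounts to changing the qualifier segment of the exported bundle's version in MANIFEST.MF:

    Bundle-Version: 1.0.0.beta

becoming something like:

    Bundle-Version: 1.0.0.201807121200

which may be why re-exporting under a new qualifier forced the dependencies to be re-evaluated instead of resolving against a stale copy with the same version.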

How to resolve classpath incompatibilities between plugin code and IntelliJ SDK

I am currently trying to develop a plugin for IntelliJ that will use a "core" library. The core library already has its own dependencies (JAR files) and is used in other, non-IntelliJ projects. Unfortunately, some of the dependencies of the IntelliJ SDK are the same as those of the plugin core, but with conflicting versions. So far this has been manageable because we remove the dependencies from the SDK and provide the core's dependencies instead, and running the plugin through IntelliJ works fine. However, I really want to be able to write automated unit tests for the plugin, and this causes problems.
Following the instructions from here, I set up my first unit test to extend LightCodeInsightFixtureTestCase. However, this fails to get past the setUp method, throwing NoClassDefFoundErrors. See the gist of the error here (picocontainer is the conflicting dependency).
By inspecting the classes loaded while running the plugin, I can see that the same class from a conflicting dependency is loaded in two different classloaders: a URLClassLoader for the com.intellij dependency, and a PluginClassLoader for my plugin's dependency. This explains why the plugin can be executed successfully but the test fails.
A small, self contained example of a project that fails in this way is available here: https://github.com/holger-s/libraryconflict
My question is, what is the recommended way to resolve these conflicts that allows unit testing with the IntelliJ test fixtures?
Full disclosure: I have also sought an answer on the IntelliJ Plugin Development forum.
So this is late, but I was running into the same problem, and I fixed it by adding the library under Project Structure/Libraries, then going to Modules/Dependencies and changing the scope to Provided.
https://confluence.jetbrains.com/display/PhpStorm/Setting-up+environment+for+PhpStorm+plugin+development
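If the plugin build itself is Maven-based rather than configured through the IDE dialogs, the equivalent is to give the conflicting library provided scope, so it is visible at compile time but not packaged with (or loaded by) the plugin. A sketch with placeholder coordinates:

    <dependency>
      <groupId>some.groupid</groupId>
      <artifactId>some.artifactid</artifactId>
      <version>1.0.0</version>
      <scope>provided</scope>
    </dependency>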

How to configure maven or eclipse in order to use the RELEASE constant within versions?

All our projects are built using Maven.
We have centralized some of our main configuration within a super POM.
In order to always have an up-to-date version of this super POM (without having to modify the version), we have used the following syntax:
<parent>
<groupId>my.organization</groupId>
<artifactId>superPom</artifactId>
<version>RELEASE</version>
</parent>
The problem is that the Maven Eclipse plugin (m2e) doesn't understand this syntax (the RELEASE constant is not resolved).
So our Eclipse users can't use the built-in compilation.
What do you suggest to overcome this problem?
By the way, we have tried several options from a Maven point of view (especially those described here), but the RELEASE version is the easiest for everybody (except those who are using Eclipse).
EDIT:
Our project sources are split across multiple SVN repositories.
This super POM is an independent project. It is retrieved through our Nexus server.
You are trying to go in the wrong direction. A release in Maven is a particular version like 1.0.0, and it indicates that you have a defined state of that artifact. In your case the super POM has a particular state. If you define the version as "RELEASE" you are saying your release is always the same, but in reality that's not true.
Usually such a super POM will change over time: let's say today you have defined some particular dependency versions in it (dependencyManagement), and tomorrow you change those definitions. Now the million-dollar question: which state of the super POM is used in a build that was done today? OK, in that simple scenario you can answer the question, but if you changed the super POM sometime yesterday you can't answer it accurately.
Furthermore, if you try to recreate an artifact from, say, last week, you can't tell which exact state of the super POM was used at that particular time, because you have no indicator that gives you the chance to see it.
And that's the reason why you need real versions like 1.0.0 or 1.1.0 etc.
I can strongly recommend using real versions like 1.0.0, but NOT things like "RELEASE", which will creep into the Maven system with its coordinates of group, artifact and version.
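Concretely, the parent declaration from the question would pin an explicit, real version (the number here is just an example):

    <parent>
      <groupId>my.organization</groupId>
      <artifactId>superPom</artifactId>
      <version>1.0.0</version>
    </parent>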
Version ranges and expansion indeed do not work for parent artifacts.
Someone advised invoking the versions plugin instead:
mvn versions:update-parent
which does not cover exactly your need, but I am afraid there is no better workaround. Other ideas: using a SNAPSHOT parent POM (not very satisfactory, I admit). See also Maven2 cannot find parent from relative path.

maven: automatic version upgrade in all dependent projects

Before the question I'd like to describe the methodology I use.
I have a lot of projects under a version-control folder; some of them are multi-module Maven projects, some of them standalone bundles, some of them Maven plugins or archetypes. All jars are snapshots (currently we cannot use release artifacts). So, for example, application A1 depends on bundle B, which depends on utility C; another application A2 depends directly on utility C. When I change code in C I need to update its version and then update B and A2, then A1. It is really annoying to update all those POMs once a week. So I'm looking for some automatic solution that can handle it for me (i.e., if C gets a new version, all dependent modules get updated).
Does anybody have an idea?
Thanks in advance
P.S. I thought about writing a MOJO to handle this, but I ran into some difficulty since not all projects have a common parent project ...
Sounds like something the versions plugin can handle... http://mojohaus.org/versions-maven-plugin/
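For example, the following invocation (a sketch; check the plugin documentation for the exact goal you need) rewrites the dependency versions in a project's POM to the newest available, including snapshots:

    mvn versions:use-latest-versions -DallowSnapshots=true

Run against B, A1 and A2, it would pick up the new version of C without manual POM editing.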
This is considered a bad practice, but if you deploy your projects using -DupdateReleaseInfo=true (or with the release plugin), then you can set the dependency version to RELEASE
<dependency>
<groupId>some.groupid</groupId>
<artifactId>some.artifactid</artifactId>
<version>RELEASE</version>
</dependency>
and you will always get the latest release version
If you're using SNAPSHOTs only, you could always go for parent-based projects. Define the versions in a parent and make the children extend it, as sketched below. You can also choose to use versions such as RELEASE or LATEST.
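A sketch of that layout, with placeholder coordinates: the parent pins each version once in dependencyManagement, and the children declare the dependency without a version:

    <!-- in the parent POM -->
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>some.groupid</groupId>
          <artifactId>utility-c</artifactId>
          <version>1.0.0-SNAPSHOT</version>
        </dependency>
      </dependencies>
    </dependencyManagement>

    <!-- in each child POM -->
    <dependencies>
      <dependency>
        <groupId>some.groupid</groupId>
        <artifactId>utility-c</artifactId>
      </dependency>
    </dependencies>

Bumping the version then touches only the parent.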
Consider using a continuous integration engine to watch all projects and build them when they change.
If you use Jenkins you can set it up to provide the built maven artifacts as a Maven repository, which you can then use in your own Maven configuration.
This should be enough - the snapshot mechanism handles the rest.

eclipse, one classpath for compiling, another for launching

example:
For logging, my code uses log4j, but other jars my code depends upon use slf4j instead. So both jars must be on the build path. Unfortunately, it's possible for my code to directly use (depend on) slf4j now, either via content assist or some other developer's changes. I would like any use of slf4j to show up as an error, but my application (and tests) will still need it on the classpath when running.
explanation:
I'd like to find out if this is possible in Eclipse. This scenario happens often for me. I'll have a large project that uses a lot of third-party libraries, and of course those third-party jars have their own dependencies as well. So I have to include all dependencies in the classpath ("build path" in Eclipse) for the application and its tests to compile and run (from within Eclipse).
But I don't want my code to use all of those jars, just the few direct dependencies I've decided upon myself. So if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally as class not found, but any error would do.
I know I can manually configure the classpath when running outside of Eclipse, and even within Eclipse I can modify the classpath for a specific class I'm running (in the run configurations), but that's not manageable if you run a lot of individual test cases or have a lot of main() classes.
It sounds like your project has enough dependency relationships that you might consider structuring it with OSGi bundles (plug-ins). Each bundle gets its own classloader and gets to specify what bundles (and optionally what version ranges, etc.) it depends on, what packages it exports, whether it re-exports stuff from its dependencies, etc.
Eclipse itself is structured out of Eclipse plug-ins and fragments, which are just OSGi bundles with an optional tiny bit of additional Eclipse wiring (plugin.xml, which is used to declare Eclipse "extension points" and "extensions") attached. Eclipse thus has fairly good tooling for creating and managing bundles built-in (via the Plug-in Development Environment). Much of what you find out there may lead you to conflate "OSGi bundle" with "plug-in that extends the Eclipse IDE", but the two concepts are quite separable.
The Eclipse tooling does distinguish rather clearly (and sometimes annoyingly, but in the "helpful medicine" way) between the bundles in your build environment vs. the bundles that a particular run configuration includes.
After a few years of living in OSGi land, the default Java "flat classpath" feels weird and even kind of broken to me, largely because (as you've experienced) it throws all JARs into one giant arena and hopes they can sort of work things out. The OSGi environment gives me a lot more control over dependency relationships, and as a "side effect" also naturally demands clarification of those relationships. Between these clear declarations and the tooling's enforcement of them, the project's structure is more obvious to everyone on the team.
if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally, as class not found, but any error would do.
Put your code in one plug-in, your direct dependencies in other plug-ins, their dependencies in other plug-ins, etc. and declare each plug-in's dependencies. Eclipse will immediately do exactly what you want. You won't be offered dependencies' dependencies' contents in autocompletes; you'll get red squiggles and build errors; etc.
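A sketch of what your own plug-in's MANIFEST.MF might declare (all bundle names here are invented):

    Manifest-Version: 1.0
    Bundle-ManifestVersion: 2
    Bundle-SymbolicName: com.example.myapp
    Bundle-Version: 1.0.0
    Require-Bundle: com.example.wrapper.log4j;bundle-version="1.0.0"

Because com.example.myapp only requires its direct dependency, and that bundle does not re-export what it depends on in turn, any reference from your code to packages that only arrive transitively (slf4j, say) fails to compile, which is exactly the error you asked for.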
Why not use access rules to keep your code clean?
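Access rules live on individual build-path entries (Java Build Path > Libraries > Access rules, stored in the project's .classpath file). A hypothetical entry that keeps slf4j on the classpath but turns any direct reference into a "forbidden reference" problem (jar path and version are made up):

    <classpathentry kind="lib" path="lib/slf4j-api-1.7.25.jar">
      <accessrules>
        <accessrule kind="nonaccessible" pattern="org/slf4j/**"/>
      </accessrules>
    </classpathentry>

The severity of forbidden references is configurable under Java > Compiler > Errors/Warnings and defaults to an error.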
It looks like this would be better managed with Maven, integrated into Eclipse with m2eclipse.
That way, you can execute only part of the Maven build lifecycle, and you can manage a separate set of dependencies per build step.
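For the log4j/slf4j example, one Maven-based sketch: giving slf4j runtime scope keeps it off the compile classpath, so accidental direct use is a compile error, while it is still present when the application and its tests run (the version is just an example):

    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.7.25</version>
      <scope>runtime</scope>
    </dependency>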
In my experience it helps to be more restrictive. I made the team fill out (paper) forms explaining why each jar was needed and under what license...
and they would rather type in a few lines of code than drag along 20 jars just to open a file using one line of code, or for another fancy 'feature'.
Using Maven could help for a while, but when you first spot jars with names like nightly-build or snapshot, you will know you're in jar hell.
Conclusion: choose dependencies well.
Would using the slf4j-log4j12 binding jar be useful? That allows code written against the slf4j API to run with the actual logging going to log4j.
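A sketch of that dependency, again with runtime scope so your own code still cannot compile against slf4j directly (the version is just an example):

    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <version>1.7.25</version>
      <scope>runtime</scope>
    </dependency>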