Target Platform for PDE Headless build does not work

I am currently trying to get my headless PDE build working, but I am stuck at a point where I do not know how to continue.
The problem is how to define the target platform to compile the plugins against.
I have a build.bat with the following call (all in one line!):
java -jar D:\target\eclipse\plugins\org.eclipse.equinox.launcher_1.0.201.R35x_v20090715.jar
-application org.eclipse.ant.core.antRunner
-f D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
-Dbuilder=c:\pde-build\scripts %*
I tried to create the target Eclipse platform from different parts: the Eclipse SDK, RCP SDK, Delta Pack, and PDE SDK, in all combinations, but none of them worked.
I got the following error:
BUILD FAILED
D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml:18: Cannot find ${eclipse.pdebuild.scripts}/build.xml imported from D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
where the variable ${eclipse.pdebuild.scripts} does not get resolved. I also tried to pass this parameter via the command line, but then I got another error about a missing svn task, which is absolutely confusing, as the same build works when my local Eclipse installation is referenced.
When I replace the path d:/target/eclipse with my local Eclipse installation, the PDE build works as expected!
This leads me to the conclusion that the configuration of the target Eclipse is not correct, but at the moment I have no idea how to configure it!
My goal is to automate the PDE build, first on my local machine without referencing my local Eclipse installation, and later to integrate the build process into our running CruiseControl instance.
As I have already seen another question about defining the target Eclipse, I would be happy if anyone could contribute hints or facts regarding this problem.
Regards,
Andreas

When performing a headless build, the target can be separate from the Eclipse that is actually running the build itself. The problem you had here is that the Eclipse you were using to run the build did not have PDE/Build properly installed.
This is why ${eclipse.pdebuild.scripts} was not set: because PDE/Build was not installed into that Eclipse instance, the org.eclipse.pde.build bundle was not resolved, and the code that sets this property never ran. Similarly, the Ant classpath entries needed for the PDE/Build tasks would not have been set up properly either.
You need an Eclipse with PDE installed to run the build, but the target for the build can be separate from it.
In the build.properties file found under -Dbuilder=c:\pde-build\scripts you can set several properties:
baseLocation: a path to an Eclipse install that serves as your target.
buildDirectory: where the build actually takes place; source is fetched to plugins/ and features/ subfolders, but if binary plugins are already located there, those become part of the target as well.
pluginPath: a list of paths (separated with ';' on Windows or ':' on Linux) containing other locations that should be considered part of your target. These locations can be several things:
- the root of an eclipse-like install with plugins/ and features/ subfolders; this is a good way to provide the delta pack instead of just unzipping it on top of an Eclipse install;
- the root of a workspace-like folder, where all subfolders are treated as plug-ins or features depending on the presence of a MANIFEST.MF or feature.xml;
- the root of a bundle or feature, or the JAR for a bundle.
If you are doing a p2 build (p2.gathering=true), you can also provide p2 repositories under ${repoBaseLocation}; they will be transformed and placed under ${transformedRepoLocation}, become part of your target, and their p2 metadata will be reused during the build.
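A minimal sketch of such a build.properties (all paths are hypothetical and need to be adapted to your layout):

baseLocation=D:/target/eclipse
buildDirectory=C:/pde-build/build
pluginPath=D:/deltapack/eclipse;D:/extra-bundles
# only relevant for p2 builds:
# p2.gathering=true
# repoBaseLocation=C:/pde-build/repos
# transformedRepoLocation=C:/pde-build/transformedRepo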

After some more time investigating I found out what I did wrong so far. As I mentioned above, defining the target platform is not as easy as copying the SDK and plugins into one location (as it was in the early days of Eclipse development).
The working solution is now the following: copy the Eclipse SDK into the target location and run that version. Inside it, install the necessary PDE tools to enable plug-in development. After that, close the IDE and copy the delta pack plus the respective SVN plugin (I used org.eclipse.pde.build.svn-1.0.1RC2 from SourceForge) into the target platform, and you're done.
Now my automated PDE build is running as expected.
Only one minor issue remains: the resulting product contains Eclipse-specific menu entries which are not there when I run it from inside my development Eclipse.
Any hints on that?

I just posted an answer to my own question on this kind of topic; maybe it can help you:
Plugin product VS Feature product

Related

Define a Java 9 multi-module project in Eclipse

I'm trying out the Java 9 Jigsaw module system (no module experience yet) and would like to use it to encapsulate the classes within my project, but it's confusing.
According to this article it should be possible to have multiple modules within ONE project. I made a new project in Eclipse Oxygen (Java 9 is supported) with the same structure as shown in the article. But Eclipse keeps telling me that I must not have more than one module-info.java in a project.
I really don't know how to tell Eclipse that it should use the "multi-module-mode". And I really would appreciate not having to create a new project for every single module.
This works: [screenshot]
This does not: [screenshot]
But according to this article, something like that should work: [screenshot]
And what about deployment of a modularized project with Eclipse? There is nothing to be seen of the new jmod extension. Do I still export it as a runnable JAR file like before?
Note that my questions refer to working with the IDE (no command line; with an IDE that should be possible, right?). Thank you for enlightening me.
Currently, Eclipse requires you to create a separate project for each module (e. g. because each module has its own Java Build Path).
To understand this design decision, consider that Java modules correspond to OSGi bundles / Eclipse plug-ins, and it has always been the convention to have a separate project for each bundle/plug-in. If you come from the Maven world, you would probably expect a deeper folder structure instead. But modules are self-contained, and combining several modules into one project would only add an additional folder level without meaning. However, Eclipse supports nested projects and so-called working sets if you need an additional folder level.
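Concretely, each of those projects then contains exactly one module declaration at the root of its source folder; a minimal sketch (module and package names are made up):

// src/module-info.java, one per Eclipse project
module com.example.greeter {
    exports com.example.greeter.api; // packages visible to other modules
    requires java.logging;           // platform module this module depends on
}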
Exporting modules as images is planned for Eclipse 2019-03 (4.11), due March 20, 2019 (see Eclipse bug 518445). Exporting modules as JARs that can be used on the module path (-p) already works (see my video).
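For what it's worth, a modular JAR exported that way can already be run from a terminal, even though the question asked about IDE-only usage (module and class names hypothetical):

java -p dist -m com.example.greeter/com.example.greeter.Main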
I don't know if this question is still open for an answer, but you can solve this problem by simply removing all source folders from the build path. At least this works in the Eclipse 2021-12 version.
As you can see this is a demo project from the Official Gradle Guide Book and it has multiple modules. Each module has its own module-info.java.
[screenshot: project structure in IntelliJ IDEA]
If I open this project in Eclipse it will give me the 'duplicated entries on module-info.java' error.
[screenshot: Eclipse shows the error]
But if I delete all the source folders on the build path, the error is gone and the project can be built and run without problems.
[screenshot: project properties, Java build path]
The only problem is that you have to build the project with Gradle so that it produces the .jar of each module, and you have to include those JARs in the libraries afterwards.
[screenshot: include all the .jar files in libraries]
I think this is probably the same solution mentioned by howlger above.
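For reference, the Gradle step mentioned above is just the standard build task run from the project root; each module's JAR then ends up in its build/libs folder:

gradlew build (./gradlew build on Linux/macOS)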

Import multi module project in Eclipse

I am trying to get started with Eclipse SCADA and import the projects from their git repository.
I have cloned the following projects:
org.eclipse.scada.external
org.eclipse.scada.utils
org.eclipse.scada.base
org.eclipse.scada.protocols
org.eclipse.scada.core
org.eclipse.scada.releng
For each project I ran mvn verify in the parent folder and imported the projects into Eclipse. I also changed the target platform. However, I still seem to have problems with their dependencies.
Any help would really be appreciated.
Actually, the Eclipse SCADA Java projects are not developed "Maven first", so you should disregard Maven completely while in the IDE. The Maven build is basically only used to build the project unattended.
The issue with the target platform is more complex. We were a bit sloppy about providing an always-working target platform (and it is actually difficult to keep them up to date, since the versions of the bundles are fixed).
I made a target platform file for the current version, you can find it here: https://gist.github.com/CptMauli/ec6eda37734f0108510f
To make it work properly, please download a classic Eclipse, put it somewhere, and create an environment variable ECLIPSE_432_HOME which points to it. Alternatively you can just change the first entry in the target file and point it directly to your install.
The reason behind this: if you used your own Eclipse installation, bundles installed there could conflict with bundles provided by the target platform or from your workspace. This is mostly not even a problem when compiling, but as soon as you start a client or a server, Eclipse will complain about duplicated bundles.
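Setting the variable is the usual platform-specific step; Eclipse has to be started from an environment where the variable is visible (install paths hypothetical):

set ECLIPSE_432_HOME=C:\eclipse-classic-4.3.2 (Windows, then launch Eclipse from the same console)
export ECLIPSE_432_HOME=$HOME/eclipse-classic-4.3.2 (Linux, e.g. in ~/.profile)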
If you have any more questions please go to our mailing list: https://dev.eclipse.org/mailman/listinfo/scada-dev
or our google group: https://groups.google.com/forum/#!forum/openscada
or write to me directly at juergen dot rose at ibh-systems dot com

How to build RCP application based on Product Configuration and Target Platform Definition?

I'm about to set up an automatic (command-line) build for my Eclipse RCP application.
I have found out the following ways to do it:
Buckminster
Using Maven with the pde-maven-plugin
Headless PDE Build
The problem with all these options is that they require me to create what is essentially a new representation of the information already contained in my target platform definition. In Buckminster, for example, this would be the .rmap file.
In my thinking, all the information needed to build the product is already there once I have the following:
Plugin project with product configuration file (foo.product)
Target platform definition file (foo.target)
Therefore I would expect there to be a command like the following:
build-rcp-product foo.product foo.target win32
Is there anything like that which I may have missed?
With Buckminster you don't need to replicate the information in your target definition file. You can simply import the target file using the importtargetdefinition command. If all your dependencies are defined in the target definition file, then in the rmap you only define where to materialize your plugins from (SVN, Git, Maven, the file system, etc.).
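If I remember the headless syntax correctly, the import looks roughly like this, where -A marks the imported definition as the active target platform (this is from memory; do check the Buckminster command reference):

buckminster importtargetdefinition -A /path/to/foo.target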
With PDE Build there is a filed request (Bug 266311), and it seems it is still not possible to use the target file directly, but some workarounds are suggested there (which I didn't try; I am using Buckminster).
I use PDE Build and it's pretty simple. It essentially gets what it needs from the MANIFEST.MF file and the build.properties file.
The command to run it is more complicated, as you have to start Eclipse and point it at a few things, but it's very well integrated with the IDE. It does everything by generating Ant scripts.
You can try Tycho.
Here's a good start:
Tycho tutorial
Reference card
With Tycho, all you need is a POM; it can be generated via Maven, and you usually will not have to change this information afterwards.
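For illustration, a Tycho build can reference an existing .target file through the target-platform-configuration plugin; a minimal sketch, assuming the .target file is wrapped in its own Maven artifact (all coordinates hypothetical):

<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>target-platform-configuration</artifactId>
  <version>${tycho-version}</version>
  <configuration>
    <target>
      <!-- artifact containing the foo.target file -->
      <artifact>
        <groupId>com.example</groupId>
        <artifactId>com.example.foo.target</artifactId>
        <version>1.0.0-SNAPSHOT</version>
      </artifact>
    </target>
    <environments>
      <environment>
        <os>win32</os><ws>win32</ws><arch>x86_64</arch>
      </environment>
    </environments>
  </configuration>
</plugin>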

Netbeans project to scripted build

I'm trying to convert a NetBeans 6.9.1 project into a scripted build (without NetBeans). Of course it fails (or I wouldn't be asking for help).
The failure says that the org.apache.commons.httpclient package does not exist. (Of course, it worked when we ran the build in NetBeans.)
Now I know exactly where the commons-httpclient.jar file is located in my project structure, but I can't seem to tell the compiler about it via the Ant build files and the NetBeans property files.
Perhaps related to this: when I ran "ant -v" to build my software, it said "Property lib.mystuff.classpath has not been set". This variable is important, I guess, because the file nbproject/project.properties uses lib.mystuff.classpath in its definition of javac.classpath, which of course tells the Java compiler where to find the JARs.
So... when moving a NetBeans project to a NetBeans-independent scripted build, how can the build script set these properties? And how can I ensure that the JAR file gets included in the Ant build?
I appreciate any help I can get, as I am a Java newbie.
UPDATE AFTER ACCEPTING ANSWER FROM vkraemer:
There are a few best practices for build scripts for production software:
Put everything needed for a build under a single directory tree. (Netbeans = fail)
Put everything in source code control. (I did that)
The first line of the build script should clear all environment variables.
The next section of the build script should explicitly set all environment variables to values which are known to work.
The next part of the build should be able to execute using command-line programs such as javac, ant, cc, etc, and must not depend on firing up an IDE such as Eclipse or Netbeans.
It is a shame that Netbeans makes this hard.
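As a rough illustration of the last three points, a self-contained build script might look like this (a Windows sketch; all paths hypothetical):

@echo off
rem use a minimal, explicitly defined environment instead of whatever the shell provides
setlocal
set JAVA_HOME=C:\tools\jdk1.6.0_45
set ANT_HOME=C:\tools\apache-ant-1.8.4
set PATH=%JAVA_HOME%\bin;%ANT_HOME%\bin
rem plain command-line build, no IDE involved
ant -f build.xml
endlocal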
I did a quick look in a Java Application project and found the following...
javac.classpath = ${libs.MyStuff.classpath}
libs.MyStuff.classpath is defined in %HOME%/.netbeans/6.9.1/build.properties.
You may be able to get by doing the following...
ant -Dlibs.MyStuff.classpath=c:\a\b\c.jar
You would need to do more if you have multiple jar files in the MyStuff library that you created in NetBeans.
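For example, with several JARs the property value is simply a path list; the quotes matter because of the ';' separator on Windows (paths hypothetical):

ant "-Dlibs.MyStuff.classpath=lib\commons-httpclient.jar;lib\commons-codec.jar"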

Modifying Existing Eclipse Plugin and Correctly Installing it

I downloaded the source code for the EMF-based UML2 plugin and changed a class in the org.eclipse.uml2.uml.edit project to remove special characters when returning string representations. Now when I export the projects and place the JAR files either in the dropins directory or replace my current UML2 plugin JARs in the plugins directory, the UML files are no longer recognized; in short, my modified plugin does not install correctly (no error is thrown, and I can see the files being picked up under Plugins -> Target Platform).
However, when I run the plugin as an Eclipse application (from the workspace), I can see the changes I made reflected in the new instance of Eclipse.
What can I do to ensure that the plugin installs correctly?
Is there a documented procedure of how to build the uml2 plugin (or any comparable plugin) after modification?
Select the project and open the context menu. There is a PDE entry near the bottom of the menu. In there you can find an entry to build the plug-in for deployment. This gives you the features and plugins directories with the fixed files. Copy both into your Eclipse install.
Unless the UML2 plugins require some kind of magic build script, exporting the one plugin you changed and overwriting the original in your Eclipse installation should be the easiest solution. One potential problem that comes to mind is conflicting plugin version numbers: make sure you don't end up with two identical versions of your modified plugin in your Eclipse installation.
When debugging plugins that apparently don't work properly at runtime, I always look at Help > About Eclipse Platform > Configuration Details. This lists all the plugins found by Equinox during startup, along with their status (see the Javadoc of the org.osgi.framework.Bundle interface for an explanation).
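If you prefer a live view, the OSGi console gives the same information interactively; a sketch (the exact filtering syntax of ss varies between Equinox versions):

eclipse -console
osgi> ss uml2 (short status of bundles whose name contains "uml2")
osgi> diag 123 (explains why bundle 123 is unresolved)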
I faced the exact same problem as you describe here. I don't have an answer to your problem, but I am sharing what worked for me.
I created a local update site for the plugin on my system. The article "Create update site for your plug-in" explains very nicely the steps needed to accomplish this.