Deployable JAR file from JB Plugin Repo does not contain my files, but the plugin runs correctly locally

Background
I am working on a simple plugin, and have already deployed to the Plugin Repository once before (successfully).
Since my last successful deployment, I ran into a lot of issues with the IDE. After completely upgrading it and modifying my plugin's directory structure, I have been able to get the plugin to run again.
Issue
tl;dr - I have an updated plugin in the JetBrains Plugin Repository that does not work as intended, and I cannot update it correctly!
When I run the plugin, a second instance of the IDE comes up with my plugin working correctly. I edit my code and run the plugin again - the plugin runs smoothly and the updates are applied!!
With all of this working, I decided to deploy my updated plugin to the Repository again. Once that was done, I downloaded the plugin and tried it out myself, just to make sure things worked.
The issue is that almost nothing can be found in the plugin file! It contains just the updated plugin.xml file and the Manifest.mf file, and the total size of the archive is around 500 bytes. I know a correct archive would have more files in it; in my case, the file size should be around 6 KB (based on my first successful archive).
So how can my local IDE instance find the files correctly, but the deployment feature cannot? How does the deployment feature actually work? I get the feeling I have the structure wrong, even though the new IDE instance works perfectly.
Plugin
GitHub
JetBrains Plugin Repository
When you install the plugin, the version is shown as v1.1; in reality, however, that is not the case. One of the easiest ways to determine the actual version of the plugin is the Folded Text foreground color:
v1.0 - RED
v1.1 - YELLOW
Deployment
Preparing Plugin Module for Deployment + resulting plugin.jar file
Contents of plugin.jar

It seems possible that, because of the restructuring, an old ChroMATERIAL.xml file was left somewhere in the build output and somehow ended up in the plugin jar. An invocation of Build > Rebuild Project should fix this problem.
There could also be problems in the project or module configuration, but the project files are not included in the GitHub repository, so that cannot be checked.
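After rebuilding, it is worth inspecting the deployable archive before uploading it again. A quick check, assuming the prepared jar is named ChroMATERIAL.jar (adjust the name and path to whatever "Prepare Plugin Module for Deployment" actually produces):
jar tf ChroMATERIAL.jar
# expected output: META-INF/MANIFEST.MF, META-INF/plugin.xml and the plugin's compiled .class files;
# if only the manifest and plugin.xml are listed, the classes were not packaged and the module's
# compiler output path or source folders are likely misconfigured.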

Related

Minecraft Server Paused after my Custom Plugin Loading

This problem started after I tried to fix my plugin by erasing this line from the Maven file:
<outputDirectory>D:\My New Plugins\Server\plugins</outputDirectory>
This broke my first test server, erasing all the files in the plugins folder and damaging the code. Then, after reopening the server, it always stops at this part:
This worries me, since my plugin appears to be broken. Do I have to make a new plugin and transfer most of the files?
Note: the last yellow lines in the picture indicate the use of the JScoreboard dependency in my plugin. Thanks.
Edit: There is also a warning in the build log:
maven-shade-plugin has detected that some class files are
present in two or more JARs. When this happens, only one
single version of the class is copied to the uber jar.
Usually this is not harmful and you can skip these warnings,
otherwise try to manually exclude artifacts based on
mvn dependency:tree -Ddetail=true and the above output.
See http://maven.apache.org/plugins/maven-shade-plugin/
This might be related.
The problem appears to be JScoreBoards. It does not support Minecraft version 1.18, and therefore is unable to load up properly.
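If the duplicate-class warning itself needs to be addressed, the shade plugin can be told to leave an artifact out of the uber jar. A rough sketch, with placeholder coordinates - the real ones come from the mvn dependency:tree output mentioned in the warning:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <!-- placeholder coordinates: exclude whichever artifact duplicates the classes -->
        <exclude>com.example:duplicated-library</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>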

Google App Engine, Maven and Eclipse development setup

I'll try to keep this short. I have Eclipse with the M2E (Maven to Eclipse) plugin installed. I have a GAE (Google App Engine) project I'm working on. Everything is working OK apart from one really annoying thing: I have to stop/start the devserver every time I make a change.
If you have any experience with this setup, you might be able to answer this simple question.
I start the development server with "mvn appengine:devserver" on the command line. I would expect that if I made changes to a *.jsp, for example, those changes would automatically be picked up by the devserver. Is this what happens for you?
I have noticed that if I make changes to *.jsp files under my target folder, the devserver does see those changes and update as I would expect. I think my problem lies with Eclipse not copying changes to the target folder, but I'm not sure if it is even supposed to.
Does anyone have any suggestions on how I should investigate this further? I've run out of ideas :-/
I thank you in advance for any comments you may have.
P.S. I know I can run "mvn package" to update the files, but this is slow and the devserver runs out of memory after I do it twice.
This can be a little painful, depending on how you want to work and which version of Eclipse you're using.
1. Install the m2e-wtp plugin if you haven't. It's the secret sauce that makes appengine projects work in Eclipse. Note this isn't m2e - it is another plugin.
2. Install the GPE - the Google Plugin for Eclipse - if you haven't.
3. Make sure your project is being managed by m2e as a Maven project.
4. Go into your project properties - enable it as an appengine project using the GPE (listed under 'Google'). Don't forget to tick HRD while you're here.
5. Go to your project build path (Properties -> Java Build Path).
   - Ensure on the source tab that your src/main/resources doesn't have an ** exclusion.
   - Ensure on the libraries tab you have the three libraries 'JDK', 'Google Appengine' and 'Maven Dependencies' and nothing else.
   - Ensure on the order and export tab that the appengine dependencies are above the maven dependencies.
It sounds pretty ridiculous - I'm not really sure why it's still so painful, but that is a good recipe for success. Once that's done, this should allow you to run in debug from Eclipse itself, with hot-loading of code, JSPs, CSS, scripts etc. I've had this work in Helios, Indigo and Juno.
You can read more about the m2e-wtp setup instructions here. They refer to GWT, but it's the same for appengine (I'm not sure why the emphasis on using GWT on GAE) because it's actually about the correct setup of GPE and Maven.
You will also find that you may need to repeat some parts of step 5 pretty frequently - if your app isn't loading properly take a quick look to ensure that your resources haven't been excluded. This happens when you update your project configuration using the m2e plugin.
The m2e-wtp plugin updates the target folder as resources are modified - so this should also resolve your issues running from the command line, but I can't vouch for that - I prefer to run straight out of Eclipse.
I had the same problem as you; however, I resolved it another way. I use the FileSync plugin (which can be found in the marketplace).
With this plugin you configure an input directory (webapp) and an output directory (target).
Any change made to the webapp will be copied to the target.
Hope I have helped too.
You can use rsync like this:
rsync -r --existing src/main/webapp/ target/ROOT
where "ROOT" is the project build finalName.
The point below worked for me:
Ensure on the order and export tab that the appengine dependencies are above the maven dependencies.

eclipse not updating output folder after compile

I have a Maven project in Eclipse with the JRebel plugin installed. Hot deploy worked perfectly last week, but now only XHTML pages are hot-deployed. When a Java class is changed, it doesn't hot deploy.
What I noticed is that when I change a file and save, Eclipse automatically builds it, but the file in the output folder is not updated (going by its timestamp), which is why JRebel doesn't pick it up. When I run Maven install it compiles everything and all the Java classes are reloaded, which is not efficient.
So the main problem is that Eclipse's newly compiled classes don't go to the output folder (project/target/classes), even though it is set in the Build Path.
Any idea?
By default, the content of your local Maven repository is cached for a day. This can happen even for bad downloads (as I experienced). See https://stackoverflow.com/a/7421893/44089 for a short description of how to work around that.
After several minutes of testing, I found a warning about a jar file, specifically Guava, that had been downloaded as a dependency. I deleted it from the local repository so it would be re-downloaded, and after that JRebel worked again.
So the problem was a corrupted jar that caused everything to be rebuilt even if only a single file was changed. The weird part is that there was no corrupted-file error.
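For reference, a corrupted artifact can be purged from the local repository and re-downloaded from the command line. A rough sketch, assuming the suspect artifact is Guava (the repository path is an example and may differ on your machine):
# remove the cached copy of the suspect artifact
rm -rf ~/.m2/repository/com/google/guava/guava
# force Maven to re-check and re-download dependencies
mvn -U dependency:resolve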

Target Platform for PDE Headless build does not work

I am currently trying to get my headless PDE build working, but I am stuck at a point where I do not know how to continue.
The problem is how to define the related target platform to compile the plugins against.
I have a build.bat with the following call (all in one line!):
java -jar D:\target\eclipse\plugins\org.eclipse.equinox.launcher_1.0.201.R35x_v20090715.jar
-application org.eclipse.ant.core.antRunner
-f D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
-Dbuilder=c:\pde-build\scripts %*
I tried to create the target Eclipse platform from different parts: the Eclipse SDK, RCP SDK, Delta Pack and PDE SDK, in all combinations, but none of them worked well.
I got the following error:
BUILD FAILED
D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml:18: Cannot find ${eclipse.pdebuild.scripts}/build.xml imported from D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
where the variable ${eclipse.pdebuild.scripts} does not get resolved. I also tried to pass this parameter via the command line, but then I got another error about a missing svn task, which is absolutely confusing, as this works when my local Eclipse installation is referenced.
When I replace the d:/target/eclipse path with my local Eclipse installation, the PDE build works as expected!
This leads me to the conclusion that the configuration of the target Eclipse is not correct, but at the moment I have no idea how to configure it.
My goal is to automate the PDE build, first on my local machine without referencing my local Eclipse installation, and later to integrate this build process into our running CruiseControl instance.
As I have already seen another question about defining the target Eclipse, I would be happy if anyone could contribute hints or facts regarding this problem.
Regards,
Andreas
When performing a headless build, the target can be separate from the Eclipse that is actually running the build itself. The problem you had here is that the Eclipse you were using to run the build did not have PDE/Build properly installed.
This is why ${eclipse.pdebuild.scripts} was not set: because PDE/Build was not installed in that Eclipse instance, the org.eclipse.pde.build bundle was not resolved, and the code that sets this property never got called. Similarly, the necessary Ant classpath entries for the PDE/Build tasks would not have been set up properly either.
You need an Eclipse with PDE installed in it to run the build, but the target of the build can be separate from this.
In the build.properties file found under -Dbuilder=c:\pde-build\scripts you can set several properties (a sample sketch follows after this list):
baseLocation This is a path to an eclipse that is your target.
buildDirectory This is where the build will actually take place, source is fetched to plugins/ and features/ subfolders, but if there are already binary plugins located here then those become part of the target as well.
pluginPath This is a list of paths (separated with ';' on Windows or ':' on Linux) containing other locations that should be considered part of your target. These locations can be several things:
The root of an eclipse-like install with plugins/ and features/ subfolders. This is a good way to provide the delta-pack instead of just unzipping it on top of an eclipse install.
The root of a workspace-like folder, where all subfolders are treated as plugins or features depending on the presence of a manifest or feature.xml.
The root of a bundle or feature, or the jar for a bundle.
If you are doing a p2 build (p2.gathering = true) you can also provide p2 repositories under a ${repoBaseLocation} which will be transformed and placed under ${transformedRepoLocation} and will become part of your target, and the p2 metadata there will get reused during the build.
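Putting the properties above together, a minimal build.properties sketch (all paths are illustrative and need to match your own layout):
# Eclipse installation used as the compile target
baseLocation=D:/target/eclipse
# where source is fetched and the build takes place
buildDirectory=D:/build/workdir
# additional target locations, ';'-separated on Windows (e.g. the unzipped delta pack)
pluginPath=D:/deltapack/eclipse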
After some more time investigating, I found out what I had been doing wrong. As mentioned above, defining the target platform is not as easy as copying the SDK and plugins into one location (as it was in the early days of Eclipse development).
The working solution is the following: copy the Eclipse SDK into the target location and run that version. Inside it, install the necessary PDE tools to enable plugin development. After that, close the IDE, copy the Delta Pack plus the respective SVN plugin (I used org.eclipse.pde.build.svn-1.0.1RC2 from SourceForge) into the target platform, and you're done.
Now my automated PDE build is running as expected.
The only minor issue now is the following: the resulting product contains Eclipse-specific menu entries which are not there when I run it from inside my dev Eclipse.
Any hints on that?
I just posted an answer to my own question on this kind of topic; maybe it can help you:
Plugin product VS Feature product

Modifying Existing Eclipse Plugin and Correctly Installing it

I downloaded the source code for the EMF-based UML2 plugin and changed a class in the org.eclipse.uml2.uml.edit project to remove special characters when returning string representations. Now when I export the projects and place the jar files either in the dropins directory or in place of my current UML2 plugin jar files in the plugins directory, the UML files are no longer recognized; in short, my modified plugin does not install correctly (no error is thrown and I can see the files being picked up under Plugins -> Target Platform).
However, when I run the plugin as an Eclipse application (from the workspace) I can see the changes I made being reflected in the new instance of Eclipse.
What can I do to ensure that the plugin installs correctly?
Is there a documented procedure of how to build the uml2 plugin (or any comparable plugin) after modification?
Select the project and open the context menu. There is a PDE entry near the bottom of the menu. In there, you can find an entry to build the plugin for deployment. This gives you the features and plugins directories with the fixed files. Copy both into your Eclipse installation.
Unless the UML2 plugins require some kind of magic build script, exporting the one plugin you changed and overwriting the original in your Eclipse installation should be the easiest solution. One potential problem which comes to mind is conflicting plugin version numbers: make sure you don't have two identical versions of your modified plugin in your Eclipse installation.
When debugging plugins which apparently don't work properly at runtime, I always look at Help > About Eclipse Platform > Configuration Details. This lists all the plugins found by Equinox during startup, along with their status (see the Javadoc of the org.osgi.framework.Bundle interface for explanation).
I faced the exact same problem as you describe here. I don't have an answer to your problem, but I am sharing what worked for me.
I created a local update site for the plugin on my system. The article "Create update site for your plug-in" explains very nicely the steps needed to accomplish this.