I have taken the org.eclipse.equinox.p2.examples.rcp.prestartupdate project and adapted it for use in my RCP application. I then set up an update repository that gets updated as part of my nightly build.
When I open my application it goes through the motions as if it were updating: it finds the update site, correctly generates an uninstall and install operand for each bundle, and reports that it finished with no errors. The problem is that the plugins never actually get installed in the plugins folder, even though the profile gets updated (a subsequent run states there are no updates). The next time my build runs it correctly identifies that there are updates, but the same thing happens again.
I have spent days debugging, and the only thing that looks out of the ordinary (not that I fully understand what is going on) is that during the final configure phase none of the TouchpointData objects have any instructions, so it doesn't look like configure is doing what it should.
I really have no clue where to look next and would like to see if anyone else has any ideas.
Update:
I finally figured out what was going on.
The problem started when I built my product without generating the metadata repository. When building through Eclipse I didn't check "Generate metadata repository" in the Export Product wizard because I didn't need a p2 repository, just the product. The problem is that without that option the product is not installed as p2-enabled, which causes side effects such as no profile being generated, among other things.
I tried to compensate for this by manually creating a profile in code, which I have since found out is a really bad idea. My original problems were caused by my profile not being set up correctly.
Once I started exporting the product with "Generate metadata repository" checked, the update started correctly installing the new plugins.
The problem I have now is that although the plugins are being installed correctly, the executable is getting trashed and I cannot launch my application any more. I am building my update site through Hudson, and the binary folder, which is present when I use the Eclipse Export Product wizard, is missing. I am assuming that is what is going wrong now.
Any ideas why the binaries would not be building in my headless PDE build?
Figured this out as well. I had assumed that all I needed was the individual launcher plugins for the platforms I wanted to build for. Since I was trying to understand the process, I was copying plugins over to the build server one by one. It turns out that to include the platform-specific binaries in the build you need the org.eclipse.equinox.executable feature from the delta pack. Once I added that to the build, the binaries started showing up in the output. With the binaries in place, the update mechanism works exactly as intended.
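For anyone hitting the same wall, this is roughly how the delta pack gets wired into a headless PDE build: extract it somewhere the build can see, point the builder at it, and include the executable feature in the product's top-level feature. A rough sketch only; the paths below are illustrative, not the ones from my setup:

    # build.properties of the headless PDE build
    pluginPath=${base}/deltapack/eclipse

    <!-- in the product's top-level feature.xml -->
    <includes id="org.eclipse.equinox.executable" version="0.0.0"/>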
I have a component library [design-system]. I want to use that library in another project.
I ran npm link in 'design-system' and npm link design-system in the project.
At first the link didn't work; after I deleted package-lock.json and ran npm i, I could see that my component refers to the design system's dist index path. If I change anything in the design system and build it, the change shows up immediately in the project, and I can see the design-system's code in the VS Code IDE.
But when I run the program, the changes are not reflected in the UI.
For example,
I commented out a console.log call in design-system, and I can even see this change in the package by clicking 'Go to Definition'. But in the browser, the code is not updated.
I cleared the application cache and tried again. I don't know what I am missing. Please help me with this.
Thanks!
You need to run the build in watch mode for your design-system library. Then, for each change, the library will rebuild and the bundle will update, so you will see the changes reflected.
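A minimal sketch of what that looks like, assuming the design-system's build script can forward a watch flag to its bundler (the script name and flag are assumptions about your setup):

    # run inside the design-system folder; rebuilds the dist bundle on every change
    npm run build -- --watch

With the link in place, the consuming project then picks up each rebuilt bundle from the linked dist folder.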
Alternatively, for development purposes, you can do something like
"design-system": "file:../relative-path-from-pkg-json"
instead of npm link. You can also try targeting the non-built version of your library; it may or may not work depending on your build tooling.
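As a side note, with a reasonably recent npm you don't even have to edit package.json by hand for the file: approach; installing from the folder writes the entry for you (the relative path here is illustrative):

    # run in the consuming project; adds "design-system": "file:../design-system" to package.json
    npm install ../design-system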
Or you can get really fancy and try the Nx monorepo approach, which takes care of all the tooling while you focus on your development.
I have an Ionic app built with the basics of Ionic, and I run it in the browser with ionic serve. But I want some new things: running it through the grunt serve command, JSLint checking, and SCSS, which I am already using. This generator, https://github.com/diegonetto/generator-ionic/, seems to have everything I want. How do I install it in my project?
Take into account that my project is almost done; about 85% of it is already finished.
Is this the part I need to follow:
Upgrading
Make sure you've committed (or backed up) your local changes and install the latest version of the generator via npm install -g generator-ionic, then go ahead and re-run yo ionic inside your project's directory.
The handsome devil is smart enough to figure out what files he is attempting to overwrite and prompts you to choose how you would like to proceed. Select Y for overwriting your Gruntfile.js and bower.json to stay up-to-date with the latest workflow goodies and front-end packages.
Will this bring any complications? Is there something else I need to know?
I use the same generator and enjoy using it. With that said, I would not recommend starting to use a generator until you've made a complete backup of your project.
Even then, I'd recommend creating a brand new project using the generator then migrating your existing code into the newly generated project. While migrating, you should be modifying your code to match the generator conventions as you go. This gives you the most control and will make sure that you learn the conventions of the new project structure. Upgrading instructions are really meant for people who already use the generator and are just upgrading to a new version of the generator. They are not applicable to you.
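If you go the migration route, the flow is roughly the one sketched below. The generator commands come straight from its instructions; the directory name is just a placeholder:

    # 1. back up / commit the existing project first
    # 2. scaffold a fresh project with the generator
    npm install -g generator-ionic
    mkdir my-app-generated && cd my-app-generated
    yo ionic
    # 3. copy your existing source files into the generated structure by hand,
    #    adjusting them to the generator's conventions as you go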
I'm using the latest Jenkins (v1.590), but the official Jenkins site says 1.588. I'm 200% sure I saw 1.589 and 1.590 a few days back on the official Jenkins download site (when I wanted to upgrade Jenkins to a newer version).
This is what I see at the bottom of my Jenkins instance page.
Page generated: Nov 19, 2014 12:07:51 PM | REST API | Jenkins ver. 1.590
Now, the issue I'm facing is this: since I upgraded a few of the plugins and Jenkins itself recently, some of the jobs are missing (I see this can happen during upgrades, but upgrading to the latest Jenkins should fix it, and I'm two steps ahead of what Jenkins has on their official site, right?):
I go to Manage Jenkins > Manage Plugins, open the Available tab, check a bunch of plugins to install (Artifactory, Maven Project Plugin, etc.) and restart Jenkins from the Jenkins GUI (which happens automatically once the plugins are downloaded/installed). After the restart, I check whether the plugins now show under the "Installed" tab, but to my luck they are still listed under the "Available" tab and NOT under "Installed". If I open an existing job's configuration or create a new job, the features provided by the installed plugins are NOT visible; for example, having installed the Maven Project Plugin, I don't see an option to create a Maven-style (2/3) project when creating a new job.
I see valid .jpi files for the respective plugins in the plugins folder in JENKINS_HOME, and there are some .pinned files as well. I have tried this a couple of times, but the plugins are not visible once installed. The installation doesn't give any error during the whole operation.
Jenkins system log file (upon Jenkins restart) is attached (NOTE: Use slow download button to see/download this log file).
Download at SpeedyShare or http://speedy.sh/x6vd8/Jenkins.System.Log
The issue was with the plugin permissions and the expanded plugin folders.
If you look under the plugins folder, you'll see .jpi or .hpi files (.jpi from Jenkins, .hpi from Hudson).
If I have awesomeplugin.jpi, then there will also be an expanded folder called awesomeplugin.
Using Slav's hint, I ran a bunch of checks and found that out of the 70+ plugins I had installed, a few had somehow ended up with "root" as both the owner and group of their .jpi files and corresponding folders.
Now, the best solution one can try (the safest approach) is to run chown -R yourvalidjenkinsuser:yourvalidgroup * and chmod -R 755 * as root. Before doing this, stop/shut down Jenkins.
I went even a step further: I first took a backup of the config files / the whole JENKINS_HOME folder. Then I went to the plugins folder and removed all the folders corresponding to the .jpi files, using the root account or as the owner of those folders (NOTE: I didn't delete the .jpi files themselves). Then I ran the above two commands (chown/chmod) and started Jenkins.
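Concretely, the sequence described above looks something like this; the jenkins user/group and the service command are assumptions, so adjust them for your system:

    # stop Jenkins first
    sudo service jenkins stop
    cd $JENKINS_HOME/plugins
    # fix ownership and permissions of all plugin files and expanded folders
    sudo chown -R jenkins:jenkins *
    sudo chmod -R 755 *
    sudo service jenkins start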
OUTCOME:
When I go to Jenkins > New Item (to create a new job), all the different job type options are showing up again, including the Maven 2/3 job type that had gone missing, and a few others like "Multi-configuration project" and the Multijob Project type. All of these were missing, and now they are back.
I also checked one of the old jobs and went to its configuration, and I now see all the features there, e.g. the Promoted Job plugin's "Promote builds when..." checkbox. This feature, which I had configured some time back, had gone missing but is now showing up again.
A few of the Maven jobs that I had created in the past for Maven Release Plugin and Release Plugin POC work had a bunch of steps in them. After this whole mess there was nothing in the Build section, but after the above solution everything is back. I can see the configurations and build steps populated as they were originally set.
I hope this can help someone facing a similar issue.
Still, I don't know why my Jenkins version is 1.590 (which Jenkins updated to recently in automatic fashion) while the Jenkins site today says their latest artifact is version 1.588 (seems like a mystery).
When you say "valid .hpi files", did you actually test that they are valid? You should be able to rename them to .zip and extract as a valid archive. An issue I face a lot is the network layer filtering system that we have in the office. It intercepts Jenkins's calls sometimes with the filtering system's login page, instead of whatever internet resource was being loaded.
If your .hpi files are not valid zip archives, open them in a text editor and see whether they are actually an HTML page/response of some kind.
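A quick way to check this without renaming anything (the file name below is just an example):

    cd $JENKINS_HOME/plugins
    # a healthy .jpi/.hpi is a zip archive and starts with the bytes "PK"
    head -c 2 maven-plugin.jpi
    # or let unzip verify the whole archive
    unzip -t maven-plugin.jpi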
We have an Eclipse feature that is licensed, and the license is handled by our own code. The user can go to our update site and upgrade the feature. The problem we face is when the user's license needs to be updated before he can use the new upgrade.
What I want to do is validate the feature version against the user's license and warn the user that his license needs to be updated before he installs.
I thought I would do this using a custom Eclipse p2 touchpoint action, validateLicense.
Example:
My code is called and validates the feature version against the user's license. If validation fails, I warn the user, and he can then cancel the installation.
So my first question is:
Do I get this right, or is it some other way to do this?
My second question is pretty basic:
Where do I tell eclipse to run my code?
I have looked at the Eclipse help, where they explain what it is, but I don't understand where to put the information to run my code. Is it in the feature.xml?
Lastly:
Is there an example of how to create and use p2 touchpoints?
I implemented a custom action as shown here, and I have a system that seems to work. I left out the "touchpoint" extension as it's unnecessary in my case, but the rest is the same.
My action is executed during the install phase of my feature (instructions.install), but maybe the configure phase could work too. The collect phase did not work.
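For reference, the wiring lives in a p2.inf file next to the feature.xml rather than in feature.xml itself; roughly like this (the IU name, version range and qualified action name below are illustrative, not my real ones):

    # p2.inf placed alongside feature.xml
    # make sure the plug-in that contributes the action is fetched first
    metaRequirements.0.namespace = org.eclipse.equinox.p2.iu
    metaRequirements.0.name = com.example.licensing.actions
    metaRequirements.0.range = [1.0.0,2.0.0)
    # run the custom action during the install phase
    instructions.install = com.example.licensing.actions.validateLicense();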
The action is executed during the installation process, after the download has already been performed. Ideally it would run before the download, but that's not a big issue for me. Returning an error status from the action cancels the install. It leaves some downloaded files around, but they do not get activated and are probably removed later by p2's garbage collector.
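The action class itself is small. A stripped-down sketch of what such an action looks like against the 3.6-style API; the package name, plug-in id and the license check are placeholders, not the real code:

    package com.example.licensing.actions;

    import java.util.Map;

    import org.eclipse.core.runtime.IStatus;
    import org.eclipse.core.runtime.Status;
    import org.eclipse.equinox.p2.engine.spi.ProvisioningAction;

    // Contributed through the org.eclipse.equinox.p2.engine "actions" extension point;
    // the name declared there is what the p2.inf instruction refers to.
    public class ValidateLicenseAction extends ProvisioningAction {

        @Override
        public IStatus execute(Map<String, Object> parameters) {
            // placeholder for the real check against the installed license
            boolean licensed = isLicenseValidForNewVersion();
            if (!licensed) {
                // an ERROR status makes p2 cancel and roll back the install
                return new Status(IStatus.ERROR, "com.example.licensing.actions",
                        "Your license must be updated before this version can be installed.");
            }
            return Status.OK_STATUS;
        }

        @Override
        public IStatus undo(Map<String, Object> parameters) {
            // pure validation, nothing to undo
            return Status.OK_STATUS;
        }

        private boolean isLicenseValidForNewVersion() {
            return false; // replace with the product-specific licensing logic
        }
    }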
I also managed to do some more interesting things. My actions plugin has a dependency (optional and non-greedy) on my main plugin. So the install works like this:
Actions plugin is downloaded
Custom action is executed
The action detects whether my main plugin is already installed, and if so, it calls into it to retrieve licensing info. The main plugin has to expose an API for the action. The action also checks the main plugin's version to detect whether the API is there or not.
The action can now decide whether to proceed with or cancel the install. It can even interact with the user using Display#syncExec (this is what the code in the checkTrust phase does, so I think it's safe). If needed, the action could also detect whether the install is headless.
Some gotchas:
The action itself must be versioned. It's the version you declare in the plugin.xml and p2.inf files, and it's different from the plugin's version. I just replace 1.0.0 with the same version my plugin has. This way the latest version of the actions plugin is always downloaded before being executed. This is great because any future changes to the licensing rules can be implemented in the actions plugin.
Actions API changed between Eclipse 3.5 and 3.6. I will probably drop support for 3.5 as it's pretty old anyway.
The actions plugin should probably be signed (mine is). The system seems almost too powerful to me, as just pointing Eclipse to an update site gets it to execute downloaded code.
I still need to test how this works with different versions of Eclipse and other IDEs. I saw a strange (non-blocking) error with 3.6. However the results are promising and it looks like the system might actually work.
Touchpoints are executed at installation time, which means that the resolution (validation) has already happened, so I'm not sure they would help. What about creating an Installable Unit (IU) (or Eclipse feature) that represents the license the user has installed? Then you would put a dependency from your product on that license.
For example, create an IU called com.mycompany.license (1.0.0). You would create another one called com.mycompany.license (2.0.0). When you installed a license, the appropriate IU would be added to the profile.
Now, when you go to install your product, the new version of the product would require license version 2.0.0. If this license was not installed, the resolution would fail.
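In p2.inf terms, the product (or its feature) would carry a requirement roughly like the following, reusing the IU name from the example above; the exact version range is up to you:

    # p2.inf of the product/feature that needs the new license
    requires.1.namespace = org.eclipse.equinox.p2.iu
    requires.1.name = com.mycompany.license
    requires.1.range = [2.0.0,3.0.0)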
Does this make sense? Do you think this would help?
We use Eclipse (Indigo, with STS). Certain of our projects take an inordinately long time to build. Often the progress indicator sticks at, say, 87% for 30 seconds.
I'm trying to find out what Eclipse is spending its time on during the build cycle. I hope to be able to optimize the build or disable the components that are causing it to be so slow. I'd like to see a log file saying things like "compiling java code", "processing resources", etc.
I've poked around the log files in the .metadata directory. I've looked on the Eclipse site for tips. I've tried using "-debug" when starting Eclipse. I still can't find the information I'm looking for.
Is there any way to get Eclipse to spit out a log of what activities it is spending its time on when it builds a project?
What kind of projects are these? Java? Dynamic Web? Two things to look at for hints about what's going on are in the project Properties dialog; look at the Builders section and the Validation section. Try disabling the validations to see if that makes a difference in your build times.
To get some insight into what's happening at the times when the build seems to hang, try setting the -debug and -consoleLog options, as described here.
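In practice that just means launching Eclipse from a console with those flags, e.g. (the executable name and workspace path are whatever applies on your machine):

    # prints the debug/trace output to the launching console instead of hiding it
    eclipse -debug -consoleLog -data /path/to/your/workspace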
Disable your virus scanner software for your workspace and project directories. I increased the speed of my build in that way.
You can go to Window -> Preferences -> General -> Workspace -> Build Order to edit the defaults according to your project's needs.
Also check the maximum number of iterations when building with cycles.
I hope it works.
Since Eclipse is a Java application, the usual debugging tools are at your disposal. In particular, you might try connecting to Eclipse with JConsole and inspecting a thread dump taken while the build "hangs", or running Eclipse within a profiler.
You might find things like a validator trying to download an XML schema and waiting for the timeout because Eclipse is not configured to use the corporate proxy server - something that is very hard to find out by other means ;-)
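If you don't want to attach JConsole, a couple of thread dumps taken from the command line while the progress bar is stuck usually show the culprit just as well (this assumes a JDK's jps/jstack on the PATH):

    jps -l                                   # find the PID of the Eclipse/STS process
    jstack <pid> > eclipse-build-hang.txt    # capture a thread dump; repeat a few times while it hangs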
Look into Apache Ant build scripts. Eclipse can auto-generate them as a starting point instead of you coding the whole thing by hand. The shop I worked in used tuned Ant scripts to optimize and control build order. We then piped the output to log files using shell scripts.
You can try replacing it with this aapt. My build for a particular project went from 3 minutes to 41 seconds...
This is an old post, but I thought I'd share my solution. I was using Eclipse Luna and noticed that when you keep working on a Git branch without committing, over time the build becomes very slow. In my case I just deleted the .git folder and the .gitignore file, and the build became very fast. Please note that this will disconnect Eclipse from Git, so use this approach only if you know how to reconnect to the Git branch using Git commands.