How to deal with local packages freezing my IDE during workspace build - eclipse

I have a question about local packages and gulp using them. The problem is, I have a package.json in the working directory of my IDE with a lot of dependencies, which in turn have their own dependencies. That wouldn't be much of a problem on its own, but when I install them through npm I get a node_modules directory in that same folder.
The problem is that I have to build often, and since the modules get installed into my local directory, my IDE starts to scan them when it builds the workspace. It hangs every time, probably due to the complex dependency tree. I do need those packages locally, because the gulpfile require()s them.
I wonder if anyone else has stumbled upon this problem and what their strategy was for excluding the dependencies from the workspace build. Eclipse (or rather Zend Studio, which I use and which is derived from it) doesn't seem to have a built-in exclude for this.
PS: I tried copying everything to my global directory (where packages installed with the --global flag go), but there are so many dependencies within dependencies that the paths become too long for Windows to handle (yeah, I know, Windows). So that didn't turn out to be a suitable solution either.

This sounds similar to the "big Node.js projects in Eclipse" issue: https://github.com/Nodeclipse/nodeclipse-1/issues/159
The solution would be disabling the JSDT nature for the project (see the "Nodeclipse not recognizing generator functions" answer).
See http://www.nodeclipse.org/history
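For reference, disabling the JSDT nature typically comes down to removing its entry from the <natures> section of the project's .project file; a minimal sketch, assuming the standard JSDT nature ID (the rest of the file stays as it is):

<natures>
    <!-- removing this entry disables JSDT (JavaScript) tooling for the project -->
    <nature>org.eclipse.wst.jsdt.core.jsNature</nature>
</natures>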

Related

Import multi module project in Eclipse

I am trying to get started with Eclipse SCADA and import the projects from their git repository.
I have cloned the following projects:
org.eclipse.scada.external
org.eclipse.scada.utils
org.eclipse.scada.base
org.eclipse.scada.protocols
org.eclipse.scada.core
org.eclipse.scada.releng
For each project I ran mvn verify in the parent folder and imported the projects into Eclipse. I also changed the target platform. However, I still seem to have problems with their dependencies.
Any help would really be appreciated.
Actually the Eclipse SCADA Java projects are not developed "Maven first", so you should disregard Maven completely while in the IDE. The Maven build is basically only used to build the project unattended.
The issue with the target platform is more complex. We were a bit sloppy in providing an always-working target platform (and it is actually difficult to keep them up to date, since the versions of the bundles are fixed).
I made a target platform file for the current version, you can find it here: https://gist.github.com/CptMauli/ec6eda37734f0108510f
To make it work properly, please download a classic Eclipse, put it somewhere, and create an environment variable ECLIPSE_432_HOME which points to it. Alternatively you can just change the first entry in the target file and point it directly to that installation.
The reason behind this is that if you used your own Eclipse installation, bundles installed there could conflict with bundles provided by the target platform or from your workspace. This is mostly not even a problem when compiling, but as soon as you start a client or a server, Eclipse will complain about duplicated bundles.
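For reference, setting the variable might look like this (the install path is just a placeholder):

rem on Windows
setx ECLIPSE_432_HOME "C:\eclipse-4.3.2-classic"
# on Linux/macOS, e.g. in ~/.profile
export ECLIPSE_432_HOME=/opt/eclipse-4.3.2-classic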
If you have any more questions please go to our mailing list: https://dev.eclipse.org/mailman/listinfo/scada-dev
or our google group: https://groups.google.com/forum/#!forum/openscada
or write to me directly at juergen dot rose at ibh-systems dot com

Google App Engine, Maven and Eclipse development setup

I'll try to keep this short. I have Eclipse with the m2e (Maven integration for Eclipse) plugin installed, and a GAE (Google App Engine) project I'm working on. Everything is working OK apart from one really annoying thing: I have to stop/start the devserver every time I make a change.
If you have any experience with this setup, you might be able to answer this simple question.
I start the development server with "mvn appengine:devserver" on the command line. I would expect that if I made changes to a *.jsp, for example, those changes would automatically show up on the devserver. Is this what happens for you?
I have noticed that if I make changes to *.jsp files under my target folder, then the devserver picks those changes up and updates as I would expect. I think my problem lies with Eclipse not copying changes to the target folder, but I'm not sure if it is even supposed to.
Does anyone have any suggestions on how I should investigate this further? I've run out of ideas :-/
I thank you in advance for any comments you may have.
P.S. I know I can run "mvn package" to update the files, but this is slow and the devserver runs out of memory after I do it twice.
This can be a little painful, depending on how you want to work and which version of Eclipse you're using.
1. Install the m2e-wtp plugin if you haven't. It's the secret sauce that makes App Engine projects work in Eclipse. Note this isn't m2e, but a separate plugin.
2. Install GPE (the Google Plugin for Eclipse) if you haven't.
3. Make sure your project is being managed by m2e as a Maven project.
4. Go into your project properties and enable it as an App Engine project using GPE (listed under 'Google'). Don't forget to tick HRD while you're here.
5. Go to your project build path (Properties -> Java Build Path):
- On the Source tab, ensure that src/main/resources doesn't have an ** exclusion.
- On the Libraries tab, ensure you have the three libraries 'JDK', 'Google Appengine' and 'Maven Dependencies', and nothing else.
- On the Order and Export tab, ensure the App Engine dependencies are above the Maven dependencies.
It sounds pretty ridiculous; I'm not really sure why it's still so painful, but that is a good recipe for success. Once that's done, you should be able to run in debug mode from Eclipse itself, with hot-reloading of code, JSPs, CSS, scripts etc. I've had this work in Helios, Indigo and Juno.
You can read more about the m2e-wtp setup instructions here. They refer to GWT, but it's the same for App Engine (I'm not sure why the emphasis on using GWT on GAE), because it's actually about the correct setup of GPE and Maven.
You will also find that you may need to repeat parts of step 5 pretty frequently: if your app isn't loading properly, take a quick look to make sure your resources haven't been excluded again. This happens when you update your project configuration using the m2e plugin.
The m2e-wtp plugin updates the target folder as resources are modified, so this should also resolve your issue when running from the command line, but I can't vouch for that; I prefer to run straight out of Eclipse.
I had the same problem as you; however, I resolved it another way: I use the FileSync plugin (which can be found in the Marketplace).
With this plugin you configure an input directory (webapp) and an output directory (target).
Any change made in the webapp directory is copied over to the target.
Hope this helps.
You can use rsync like this:
rsync -r --existing src/main/webapp/ target/ROOT
where "ROOT" is the project build finalName.
The point below worked for me:
Ensure on the Order and Export tab that the App Engine dependencies are above the Maven dependencies.

Remote Play framework and Eclipse

I have a Play framework project which runs on a remote server.
I'm trying to configure Eclipse to work on the project remotely.
Since no build is required, my requirements are to be able to edit the project files from Eclipse and save them automatically to the server, plus auto-completion and debugging.
I've installed Remote System Explorer in Eclipse and set up a remote FTP connection to my server.
The play environment on my server is under
/play-2.0.2/
My project path is
/play-2.0.2/test
In RSE I clicked on /play-2.0.2/test and 'Create Remote Project'
Now in the Java perspective I can browse through the project, change files, and have them saved automatically to the server.
My problem is that auto-completion for the Play framework libraries doesn't work well, since all the references point to /play-2.0.2/repository/...
Any idea how to solve this? I tried playing with the build path, but with no success.
Thanks!
So, I was looking into achieving the same thing myself.
The problem you are experiencing is due to the fact that the .classpath file contains absolute paths. Besides a symlink, which doesn't work between two different kinds of OS, I thought of two other solutions:
1. Use sed to rewrite the classpath entries in the .classpath file after it is generated (see the sketch after this list).
2. Use a "classpathTransformerFactory" for the sbt eclipse command.
I haven't had to deal with it myself yet (it's more of a "want to" than a need); as soon as I do, I will explore the two options and post details. I'll just leave the answer here in case someone wants to pick up where you left off.
Another thing:
Given that sbt picks up the libs referenced in build.sbt, downloads the jars and puts them in the ~/.ivy2 directory, if you use either of those methods to change the references from the remote machine to the local one, you need to make sure that the same libs are in the local ivy cache. And just as I wrote this, another idea came to mind:
Run sbt eclipse or play eclipse or activator eclipse (it should be the same) in a local environment and in the remote one, then transfer the .project and .classpath files from the local machine to the remote one and see what happens (if it doesn't work, scan them for absolute or incorrect paths that might need to be changed).
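A rough sketch of that workflow, assuming sbt is available and you can reach the server over SSH (the host name is a placeholder, the remote path is the one from the question):

# locally, in a checkout of the same project
sbt eclipse
# copy the generated Eclipse metadata to the remote project
scp .project .classpath user@server:/play-2.0.2/test/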
Sorry for the untested answer; still, I think it's better than no answer.
Cheers

Target Platform for PDE Headless build does not work

I am currently trying to get my headless PDE build working, but I am stuck at a point where I do not know how to continue.
The problem is how to define the related target platform to compile the plugins against.
I have a build.bat with the following call (all in one line!):
java -jar D:\target\eclipse\plugins\org.eclipse.equinox.launcher_1.0.201.R35x_v20090715.jar
-application org.eclipse.ant.core.antRunner
-f D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
-Dbuilder=c:\pde-build\scripts %*
I tried to create the target Eclipse platform from different parts (the Eclipse SDK, RCP SDK, Delta Pack and PDE SDK) in all combinations, but none of them worked well.
I got the following error:
BUILD FAILED
D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml:18: Cannot find ${eclipse.pdebuild.scripts}/build.xml imported from D:\target\eclipse\plugins\org.eclipse.pde.build_3.5.2.R35x_20100114\scripts\productBuild\productBuild.xml
where the variable ${eclipse.pdebuild.scripts} does not get resolved. I also tried to pass this value on the command line, but then I got another error about a missing svn task, which is absolutely confusing since everything works when my local Eclipse installation is referenced.
When I replace the d:/target/eclipse path with my local Eclipse installation, the PDE build works as expected!
This leads me to believe that the configuration of the target Eclipse is not correct, but at the moment I have no idea how to configure it properly!
My goal is to automate the PDE build, first on my local machine without referencing my local Eclipse, and later to integrate this build process into our running CruiseControl instance.
As I have already seen another question about defining the target Eclipse, I would be happy if anyone could contribute hints or facts regarding the problem.
Regards,
Andreas
When performing a headless build, the target can be separate from the Eclipse that is actually running the build itself. The problem you had here is that the Eclipse you were using to run the build did not have PDE/Build properly installed.
This is why ${eclipse.pdebuild.scripts} was not set: because PDE/Build was not installed into that Eclipse instance, the org.eclipse.pde.build bundle was not resolved and the code that sets this property never got called. Similarly, the necessary Ant classpath entries for the PDE/Build tasks would not have been set up properly either.
You need an Eclipse with PDE installed to run the build, but the target for the build can be separate from it.
In the build.properties file found under -Dbuilder=c:\pde-build\scripts you can set several properties:
baseLocation: a path to an Eclipse installation that is your target.
buildDirectory: where the build will actually take place. Source is fetched into the plugins/ and features/ subfolders, but if there are already binary plugins located here, those become part of the target as well.
pluginPath: a list of paths (separated with ';' on Windows or ':' on Linux) containing other locations that should be considered part of your target. These locations can be several things:
- The root of an Eclipse-like install with plugins/ and features/ subfolders. This is a good way to provide the delta pack instead of just unzipping it on top of an Eclipse install.
- The root of a workspace-like folder, where all subfolders are treated as plugins or features depending on the presence of a manifest or feature.xml.
- The root of a bundle or feature, or the jar for a bundle.
If you are doing a p2 build (p2.gathering = true), you can also provide p2 repositories under ${repoBaseLocation}; they will be transformed and placed under ${transformedRepoLocation}, become part of your target, and the p2 metadata there will get reused during the build.
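A minimal sketch of how this could look in the build.properties under -Dbuilder (all paths are placeholders, not your actual layout):

# Eclipse install that serves as the compile target
baseLocation=D:/target/eclipse
# working area for the build; sources land in plugins/ and features/
buildDirectory=D:/build/workdir
# extra locations treated as part of the target, e.g. an unzipped delta pack
pluginPath=D:/delta-pack/eclipse;D:/extra-bundles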
After some more time investigating, I found out what I had been doing wrong. As I mentioned above, defining the target platform is not as easy as copying the SDK and plugins into one location (as it was in the early days of Eclipse development).
The working solution for now is the following: copy the Eclipse SDK into the target location and run that version. Inside it, install the necessary PDE tools to enable plugin development. After that, close the IDE and copy the delta pack plus the respective svn plugin (I used org.eclipse.pde.build.svn-1.0.1RC2 from SourceForge) into the target platform, and you're done.
Now my automated PDE build is running as expected.
The only minor issue now is the following: the resulting product contains Eclipse-specific menu entries which are not there when I run it from inside my development Eclipse.
Any hints on that?
I just posted an answer to my own question on this kind of topic; maybe it can help you:
Plugin product VS Feature product

Eclipse won't delete files

I've got an Ant project with libs managed by Ivy (they are under lib_managed). Eclipse is using the jars too. The problem is: if I try to update the directory, Ant refuses to delete it because Eclipse holds on to the jars in its classpath. Even if I update (empty) Eclipse's classpath, I can't delete the files. If anyone has had the same problem and found a solution, I would be thankful for an answer.
Regards, Jan
Not a solution, but a workaround. I experience Eclipse keeping locks on files quite often in different contexts. I suggest using Unlocker.
I guess this is on Windows. Use Process Explorer to figure out what is locking the files. Eclipse shouldn't keep a lock; maybe you have the code running in the debugger (paused at a breakpoint). Use the list of open files and the process properties to figure out which Java program is holding the lock on the files.
If it's really Eclipse, try to upgrade to a newer version of Eclipse or close the project when you need to update the dependencies with ivy.
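If you prefer the command line, the Sysinternals Handle tool (the CLI sibling of Process Explorer) can do the same lookup; a sketch, assuming handle.exe is on your PATH and the jars live under lib_managed:

rem list processes holding open handles to anything under lib_managed
handle.exe lib_managed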
Cleaning the workspace and restarting Eclipse may solve the problem, but in a real development environment I don't think it's a good idea to restart Eclipse whenever you need to build a jar.