Running a GWT project takes a lot of space

I have been observing this for a few days now: every time I run a GWT project on my web page, a lot of space on my local C drive gets used, about 200 MB per run. A few days ago the drive had about 50 GB free; now only 4 GB is left. How do I free up the space on my C drive without removing the projects I made?

GWT generates a lot of temporary files, some of them relatively big, but cleans up after itself.
That is, unless you run DevMode from within Eclipse, in which case Eclipse kills DevMode so abruptly that it cannot do its cleanup. This is a known issue with the Google Plugin for Eclipse: https://code.google.com/p/google-plugin-for-eclipse/issues/detail?id=74
You'll find various workarounds in the issue originally reported against GWT: https://code.google.com/p/google-web-toolkit/issues/detail?id=5261
The files all end up in the temporary folder anyway, which you should clean regularly in any case (for example, set up a scheduled task to do it when you start or shut down your computer, as is done on most other OSes).
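On Windows, where the asker's C: drive lives, such a cleanup could look something like the sketch below. It assumes GWT's leftovers sit in %TEMP% and follow the usual gwt* naming; adjust the mask for your GWT version, and the script path is a placeholder.
rem clean-gwt-temp.cmd (hypothetical): delete gwt* temp entries older than a day
forfiles /p "%TEMP%" /m gwt* /d -1 /c "cmd /c if @isdir==TRUE (rmdir /s /q @path) else (del /q @path)"
rem register it to run at logon
schtasks /create /tn CleanGwtTemp /tr C:\scripts\clean-gwt-temp.cmd /sc onlogon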

Related

How to speed up edit-compile-test cycle in AOSP?

I followed https://source.android.com/setup/start. It mostly worked, except that I needed to modify acloud to work on my Arch Linux system and pass it --launch-args='-vm_manager=qemu_cli' because the default VM manager of Cuttlefish (crosvm) crashes on my system.
I make changes inside frameworks/opt/vcard. Running atest AndroidVCardTests inside that directory takes 3:17 if a file was changed since the last run and 1:53 if no file was changed. This makes edit-compile-test cycles very slow. Is there a way to speed this up?
When running the command while no emulator is running, it aborts after 0:47. It seems like most of the time is spent installing the tests on the device. The tests themselves seem to be fast (it reports a few ms for each test).
Because it's slow even if no file was changed, I think most of the remaining time is spent finding out which files need to be recompiled. However, I know that I only changed files inside frameworks/opt/vcard.

sqldeveloper taking too long to load the content of a DB

I am working with Ubuntu 18.04. After installing sqldeveloper I log in correctly and get a list of databases. Upon double-clicking one database to show its content, sqldeveloper becomes extremely slow. I receive the following message right after double-clicking a DB:
UsersCache.fillIn() time = 4 ret==null?: true
And then it takes approximately 10-15 minutes to load the database I clicked. After that time I can interact with the DB, but if I want to open another one I have to wait about the same time again. The DB is big, but on my colleagues' machines it is a matter of seconds. I tried uninstalling and reinstalling, but that did not speed things up. Running it in verbose mode doesn't give more info than the one-liner I pasted above.
EDIT: top shows a CPU usage of approximately 180% on the sqldeveloper process.
Inspecting with top shows I'm using java-1.8.0-openjdk-amd64 to run sqldeveloper
That is likely your problem. We do not support OpenJDK (or IBM's either for that matter.)
For the best experience we recommend and ONLY support the Oracle JDK - specifically, version 8.
I noticed on our download pages we do not say this specifically, but do point folks to the Oracle downloads for Java. I'll add a note/disclaimer so it is more obvious.
You can control the Java home used for SQL Developer via the .sqldeveloper directory in your $HOME. There is a product.conf file in there; put the path to the Oracle JDK 8 in it.
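For example (the version directory and JDK path below are placeholders; point SetJavaHome at wherever your Oracle JDK 8 actually lives):
# $HOME/.sqldeveloper/<version>/product.conf
SetJavaHome /usr/lib/jvm/jdk1.8.0_201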

Will running everything from RAM disk speed up scala compile time?

Scenario:
The machine I use for development has 32 GB of DDR3 RAM, an i7 3770 and an SSD. The project is large; Scala compiles fast most of the time during incremental compilation, but sometimes a single change leads to recompilation of hundreds of files, which then takes some time to compile, plus a good while for jrebel to reload all the changed files.
Question:
Will putting everything on a RAMFS (Mac) make compilation and jrebel reloads significantly faster?
My plan is to put everything directly related to the project in a RAMFS partition (.ivy2, project sources, .sbt, maybe even a copy of the JDK, etc.). I would create a script to do all that at boot, or run it manually; that won't be a problem. I would also set up file-sync tasks, so losing a change won't be a concern in case of an OS failure.
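For reference, the usual macOS incantation for creating such a RAM disk is below; the size is illustrative (ram:// counts 512-byte sectors, so 8388608 sectors = 4 GB).
# create and mount a 4 GB RAM disk at /Volumes/RamDisk
diskutil erasevolume HFS+ "RamDisk" $(hdiutil attach -nomount ram://8388608)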
Updates:
1. The log says around 400 Java and Scala sources are compiled after a clean.
2. After changing a file in a core module, it recompiles 130 files in 50s.
3. jrebel takes 72s to reload after #1 and 50s after #2.
4. Adding -Drebel.check_class_hash=true made the jrebel reload after #2 instantaneous.
I am quite happy with these results, but I'm still interested in how to make Scala compilation even faster: CPU usage reaches at most 70%, and only for about 5 seconds, in a compilation that takes 170s; overall CPU usage during the compilation is 20%.
UPDATE:
After putting the JVM, sources, .ivy2 and .sbt folders on the RAM disk, I noticed only a small improvement in compile time: from 132s to 122s (after a clean). So, not worth the trouble.
NOTE:
That excludes dependency resolution, since I use this approach to avoid losing dependency resolution after a clean.
I have no idea what speedup you can expect on a Mac, but I have seen speedups on Linux, compiling the Scala compiler itself, that are encouraging enough to try. My report (warning: quite Linux-specific) is there.
You can try setting a VM argument -Drebel.check_class_hash=true which will check the checksum before reloading the classes.
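If you launch through sbt, one place to put that flag, assuming your sbt launcher honours the SBT_OPTS environment variable, is:
export SBT_OPTS="$SBT_OPTS -Drebel.check_class_hash=true"
Otherwise, add it to the VM arguments of whatever run configuration starts your app.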
There's often very little point in a RAM disk if you're working on Linux or OS X: those OSes cache files in RAM anyway.
https://unix.stackexchange.com/a/66402/141286

STS slow build when loading xyz-context.xml files

I recently started using STS on a 64-bit Windows machine. Often when I "clean" my project, STS becomes unresponsive or takes minutes to build while loading the context.xml files.
How can I fix this? Is it looking for resources on the web and waiting for timeouts?
EDIT: I noticed that during the build process my network usage goes up. Not sure yet what is going on there...
EDIT: Possibly STS is loading all of the referenced springsource XSD files for XML validation? If so, how can I disable this validation (apart from copying the files and referencing them locally, of course)? I've already tried disabling all of the Preferences related to "Validation" in STS - to no avail.
Often it is because Java is running out of free memory and needs to run the garbage collector very often.
You can see the free memory in the bottom right corner of Eclipse if you enable Window/Preferences/General/"Show heap status".
If you can confirm that it is a memory problem, then you can increase the memory in sts.ini (-Xmx).
It is said that the 64-bit Java version needs up to 1/3 more memory than the 32-bit version, but I don't know whether that rumour is right or not.
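A sketch of the relevant part of sts.ini, assuming a stock layout: everything after -vmargs is passed to the JVM, and 2048m is just an illustrative starting point (ini files of this kind take no comments):
-vmargs
-Xms512m
-Xmx2048m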

Eclipse getting too slow - workspace recreation helped

My Eclipse was getting slower and slower over time. Tips I found on the Internet did not help.
What I did was completely delete my workspace, create a new one and reimport all my projects into it, and this really made the difference.
So my question is whether it's possible to perform this workspace clean-up without deleting and recreating the workspace...
Maybe there is some cache in workspace which is getting big? Any ideas?
Thank you!
Eclipse keeps track of all changes in local history. That might introduce slowdown over time.
Local history is located at .metadata/.plugins/org.eclipse.core.resources/.history.
Not sure about newer versions of Eclipse, but in 3.1 the settings in
Preferences->General->Workspace->Local history
did not work for me. I had it set to the default of 7 days, but files were kept for 4 years. And I guess other people here had the same issue.
For me it helped to remove history files manually from
.metadata/.plugins/org.eclipse.core.resources/.history.
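On Linux or a Mac, that manual removal boils down to something like the following (close Eclipse first; <workspace> is a placeholder for your workspace path):
rm -rf <workspace>/.metadata/.plugins/org.eclipse.core.resources/.history/*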
I use RAD 7.5 (which is based on Eclipse 3.4). I found several of my projects had millions of history files, all older than 7 days, and mostly the same dummy MANIFEST.MF file (39 bytes).
I discovered this when I tried to delete an old project with Windows Explorer. After 12 hours, Windows Explorer reported that it had recycled 3.5 million files, and was still working.
I found the only way to remove the workspace was to open a Command Window, CD to
<project>\.metadata\.plugins\org.eclipse.core.resources\.history
then type
DEL *.* /s/q
Even this took the better part of an hour.
Try running eclipse from command prompt with
eclipse.exe -clean
More info: http://www.myeclipseide.com/PNphpBB2-viewtopic-t-10280.html
Sometimes the workspace cannot be built because the JVM runs out of physical memory. To remove such memory issues, update the
eclipse.ini
file as below:
-Xms512m
-Xmx1024m
-XX:MaxPermSize=1024m
--launcher.XXMaxPermSize 1024m
I just solved the problem by deleting all the stuff inside Eclipse's OPTReplica directory. After that, restart Eclipse; for me it helped.
Eclipse is programmed as a filebomb, and it causes a large variety of problems, even on modern, robust filesystems. These range from a large waste of disk space for nothing, to preventing your OS from booting if your workspace is on your OS partition.
The cleanup mechanism in Eclipse doesn't work, so the only viable option is to clean up your workspace by hand at regular intervals, or to add your cleanup code to a sh file that runs before launching Eclipse, as in the sketch below.
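A minimal sketch of such a wrapper, assuming the local-history purge from the answers above is the cleanup you want (the workspace path and the eclipse binary location are placeholders):
#!/bin/sh
# purge Eclipse local history, then start Eclipse
WORKSPACE="$HOME/workspace"
rm -rf "$WORKSPACE/.metadata/.plugins/org.eclipse.core.resources/.history"/*
exec /opt/eclipse/eclipse -data "$WORKSPACE"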
Another option would be to introduce the Eclipse developers to the fabulous world of databases, which produce code that is faster to run and easier to write. Sadly, rumour has it that they shoot on sight anyone who pronounces the words "sqlite" or "jdbc", and sacrifice virgins every Sunday to the almighty god of filebombs.