How to increase memory for embedded jvm for a deployed javafx application? - deployment

I am using a .fxbuild script to build a JavaFX application. I used the packaging format all so that it includes its own runtime. Now I am wondering how I can define any runtime parameters.
We noticed that we had far more OutOfMemory issues with the deployed version than with the local development version. Monitoring it with VisualVM, we saw that the embedded JVM is (by default?) configured to use only 256 MB of RAM. How can I increase the maximum RAM available to the included JVM?
The application is launched by an .exe file after being installed on the system.
Update:
Roland's answer is correct. My mistake was that I added the <fx:platform> tag at the bottom of the Ant script and not within the appropriate <fx:deploy> tag. As a result, the <fx:platform> tag is ignored and the JVM falls back to its defaults: 256 MB max RAM on 32-bit and 1/4 of the available RAM on 64-bit.

Please read the Packaging Basics, especially chapter "5.8.2 Customizing JVM Setup".
Excerpt of what you need:
<fx:platform javafx="2.1+">
    <fx:jvmarg value="-Xmx400m"/>
    ...
</fx:platform>
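To illustrate the placement issue mentioned in the update above, here is a minimal sketch of the <fx:platform> tag nested inside <fx:deploy>; the application name, class, and paths are placeholders, not taken from the question:

```xml
<fx:deploy outdir="dist" outfile="MyApp" nativeBundles="all">
    <fx:application name="MyApp" mainClass="com.example.Main"/>
    <fx:resources>
        <fx:fileset dir="dist" includes="MyApp.jar"/>
    </fx:resources>
    <!-- Must be nested inside <fx:deploy>; a top-level <fx:platform> is silently ignored -->
    <fx:platform javafx="2.1+">
        <fx:jvmarg value="-Xmx400m"/>
    </fx:platform>
</fx:deploy>
```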

Related

Eclipse always exit when used in a virtual machine

I use Ubuntu 16.04 in VMware for learning Hadoop. My Eclipse is Eclipse IDE for Java Developers 2020-03 for linux_64.
When I use Eclipse to write some Java code, the IDE usually exits by itself without any error, and it is also very slow. I guessed that there might not be enough memory for the IDE, but when I checked, memory was free. I allocated 2 GB of memory for Ubuntu 16.04.
I searched for this problem on the web. Many people believe the problem is caused by Eclipse, so they suggest editing eclipse.ini:
-Dorg.eclipse.swt.browser.DefaultType=mozilla
Add this line to the end of eclipse.ini.
Unfortunately, it doesn't work. Do you know why? Should I allocate more memory in VMware?
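For reference, JVM system properties like the one above belong after the -vmargs marker in eclipse.ini, and the -Xmx value in the same section caps the IDE's heap. A sketch of that section (the surrounding values are illustrative, not taken from the question):

```
-vmargs
-Dosgi.requiredJavaVersion=1.8
-Xms256m
-Xmx1024m
-Dorg.eclipse.swt.browser.DefaultType=mozilla
```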

Is there an equivalent of ${LIB} for dyld?

I'm working on a Mac launcher for a trace library - the tracing works by adding the library to DYLD_INSERT_LIBRARIES (the Mac equivalent of LD_PRELOAD). The DYLD_INSERT_LIBRARIES variable is then propagated by the trace library as further processes are spawned.
The trouble is that I need the 32-bit version of the trace library to be used for 32-bit tracee processes and the 64-bit version for 64-bit tracee processes. In the Linux launcher I have, this is achieved by using ${LIB} in LD_PRELOAD - the dynamic loader (ld.so) then replaces this with "the right thing" when loading a process.
Is there an equivalent of ld.so's ${LIB} variable for dyld on Mac? I couldn't immediately see one when I looked through the man page (https://developer.apple.com/library/mac/#documentation/Darwin/Reference/ManPages/man1/dyld.1.html), but I may just be reading it wrong. If not, is there another way of achieving the same effect please?
I think what you want is to compile your inserted library as a fat binary (e.g., multiple architectures in the same binary). This should allow a single value of DYLD_INSERT_LIBRARIES to work for subprocesses of various architectures.
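A sketch of building such a fat (universal) library with the Apple toolchain; the source and output file names are placeholders, and on modern macOS the 32-bit slice requires an older SDK:

```
# Build one slice per architecture, then merge them with lipo.
clang -arch i386   -dynamiclib -o libtrace32.dylib trace.c
clang -arch x86_64 -dynamiclib -o libtrace64.dylib trace.c
lipo -create libtrace32.dylib libtrace64.dylib -o libtrace.dylib

# Verify that both architectures are present in the merged binary.
lipo -info libtrace.dylib
```

Passing several -arch flags to a single clang invocation achieves the same result in one step.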

Memory allocation for external process in JBoss

We are using ImageMagick, launched from a JBoss EJB via im4java, to produce image files. im4java is a Java interface to ImageMagick that launches an external process with java.lang.ProcessBuilder. After a few successful runs, we are blocked by:
java.io.IOException: Cannot run program "/usr/local/bin/convert": java.io.IOException: error=12, Cannot allocate memory
It does not seem to be an ImageMagick issue, because if we launch the same process from the command line it runs perfectly. It looks more like a JBoss memory allocation problem for an external process.
Any ideas?
Finally solved, though maybe not in the best way. We used the overcommit_memory setting on Linux, as indicated by Ivan, and have had no problems since. But we are not sure that changing this global memory setting won't affect the overall behavior of the system later, as it seems to allow a lot more memory to be allocated. Fortunately we do not run ImageMagick from Java code very often, so the memory is released once ImageMagick has done its job.
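For reference, the overcommit setting mentioned above is usually applied like this (requires root; the value 1 tells Linux to always grant allocation requests, which lets fork()'s copy-on-write duplication of the large JVM address space succeed):

```
# Temporary, until the next reboot:
sysctl -w vm.overcommit_memory=1

# Persistent across reboots: add the setting to /etc/sysctl.conf
echo "vm.overcommit_memory = 1" >> /etc/sysctl.conf
```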

Eclipse memory restricting

I am using 32-bit Windows XP with no upgrade in sight. Is there a way to limit how much memory Eclipse allocates throughout the day? I am also running a WebLogic 10 server in debug mode inside Eclipse. After a few hours I have a 700 MB STS.exe (Eclipse) and a 400 MB java.exe (server). Is there at least a way to force a GC on Eclipse?
Here are the settings I currently use, which as far as I can tell are not being observed.
-vm
C:\bea\jdk160_05\bin\javaw.exe
-showsplash
--launcher.XXMaxPermSize
128M
-vmargs
-Dosgi.requiredJavaVersion=1.5
-Xms40m -Xmx512m
-Dsun.lang.ClassLoader.allowArraySyntax=true
EDIT: here's the monster of a project: Eclipse and Firefox 4.
is there a way to limit how much memory Eclipse allocates throughout the day?
The values of -Xmx and -XX:MaxPermSize place an upper bound on the memory that the JVM will use.
Is there at least a way to force a GC on eclipse?
AFAIK, no. Even if there was, it probably wouldn't help. The JVM is unlikely to give the memory back to the operating system.
Here are the settings i currently use, which seem to me are not being observed.
Based on what you've said (memory usage 700Mb for eclipse.exe), I'd say the settings ARE being observed.
What can you do to get Eclipse to use less memory?
Trim the values of -Xmx and -XX:MaxPermSize. However, if you trim them too much you are liable to make Eclipse sluggish (because it has to GC more frequently) and ultimately flaky (because it will run out of memory and things will start failing with OOMEs).
Get rid of superfluous plugins.
Switch to a "smaller" Eclipse (e.g. the "Classic" distro) ... though you'll lose some of the J2EE support that you are probably using.
Close projects.
Close files.
Restart Eclipse more often.
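As a sketch, the trimming advice above applied to the eclipse.ini from the question might look like this; the exact values are illustrative, not recommendations:

```
-vm
C:\bea\jdk160_05\bin\javaw.exe
-showsplash
--launcher.XXMaxPermSize
96M
-vmargs
-Dosgi.requiredJavaVersion=1.5
-Xms40m -Xmx384m
-Dsun.lang.ClassLoader.allowArraySyntax=true
```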
But the best solution is to upgrade your platform:
Buy some more memory for your PC / laptop. You can probably max it out for a couple of hundred dollars. It is worth it.
Switch the OS to Linux. In my experience, Linux is a much better platform for doing Java software development than Windows XP. It seems to do a much better job in terms of both memory and file system management. The performance difference on identical hardware is significant.
You can always set up your machine to dual boot, so that you can still run XP for other things.
The 700 MB of OS-level process usage is consistent with your memory settings: take 512 MB for heap + 128 MB for permgen space + a bit of overhead for the JVM itself.
You cannot force GC. If memory were releasable, the JVM would release it. Note that you cannot simply look at OS-level process usage, as that represents the point where the JVM's memory usage peaked: the JVM never releases heap or permgen space once it has expanded to a certain size, because doing so is too expensive and not really necessary. The allocated space lives in the virtual address space of the JVM process and does not represent actual physical memory usage. As physical memory gets tight, the OS will aggressively swap out to disk any memory pages that haven't been used recently.
So... You need specialized memory analysis tools to get accurate representation of Eclipse memory usage.
If you are seeing a lot of disk activity when there shouldn't be any, the OS may be indeed out of physical memory and doing a lot of swapping. Try closing a few projects or restarting Eclipse. It only takes one plugin with a memory leak to consume all of your available memory.
I'm assuming that you are running BEA JRockit, judging by the path to javaw.exe. Note that the -X... options are JVM-specific, so they may not work the same way in JRockit as in Sun's JVM. This link seems to indicate that you should be using
... -Xms:40m -Xmx:512m ...
But setting a heap size limit will only cause Eclipse to fail once it reaches the limit, so you won't solve the real problem. Forcing a GC usually doesn't help; any sane VM will GC periodically or when needed.
Having a 700Mb Eclipse sounds like your Eclipse is either processing large amounts of data, or it is leaking memory. JRockit seems to have a memory leak detector which may be able to give you a hint of where the problem lies.
The ini file memory settings won't help for the Virtual machine. Change VM arguments, as here
Eclipse is very thirsty. Limiting it is most likely to crash it. You really need a new computer.

free memory netbeans plugin

Is there any plugin for NetBeans to free memory manually?
NetBeans' core has this feature but it is not complete.
You can't free memory whenever you want in a Java application! The only way to request it is to call System.gc(), but this merely suggests to the Java Virtual Machine that it run the garbage collector (the JVM is free to do the GC or not).
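A minimal sketch illustrating the point above: System.gc() is only a hint, so the measured usage after the call may or may not drop, and no output is guaranteed to change. The class name is arbitrary:

```java
public class GcHint {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();

        // Approximate heap usage before the hint.
        long before = rt.totalMemory() - rt.freeMemory();

        // This only *requests* a collection; the JVM may ignore it entirely.
        System.gc();

        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("used before gc hint: " + before + " bytes");
        System.out.println("used after gc hint:  " + after + " bytes");
    }
}
```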
If you're running out of memory while running NetBeans you may want to modify the [netbeans]/etc/netbeans.conf and modify the line that starts with "netbeans_default_options". For instance:
netbeans_default_options="-J-Xms384m -J-Xmx512m -J-XX:PermSize=32m -J-XX:MaxPermSize=96m -J-Xverify:none"
Anyway remember that modifying the memory for NetBeans does not affect the applications you run with NetBeans, because those run in another Java Virtual Machine.
You don't need any plugin; just click on the memory bar in the toolbar.
This may be relevant and helpful...
I had a similar memory problem on my Debian Linux machine.
Here is how to fix it:
run a terminal
log in as root
type crontab -e
scroll to the bottom of the file and type * * * * * sync; echo 3 > /proc/sys/vm/drop_caches
This magic line cleared all unused RAM every minute. It removed the unused memory NetBeans was producing (along with that of any other memory-consuming programs).
This should work on most UNIX-like OSes.
Please tell me if it works...