Does the latest version of Drools have a higher memory limit than previous versions? I cannot find this memory limit documented anywhere. Is there any version that increases memory dynamically?
I have tried looking into increasing the memory.
I was wondering if there is an 'easy' way to see the memory footprint of the objects created by JSF. For instance, I have some @SessionScoped objects and some @ViewScoped objects when going to a certain page.
I would like to know how many KB (or MB) they are using. That way, we can estimate the memory footprint of JSF per user.
I am using Eclipse and EAP 7 together with JSF 2.3. I tried using jvisualvm, but no per-class information or in-memory size was available. I remember that a long time ago we had some tool to visualize this kind of information.
Any ideas on how to find out? I guess some Eclipse plugins could work, but I am totally new to this area and have no clue which are the better ones...
I personally use JVisualVM, which is found in the /bin folder of every JDK installation. You can attach it to your running process and watch the memory, the objects, and their sizes.
There is no magic bullet to profiling. If you have a @SessionScoped bean named FooBean, open VisualVM, go to the Classes tab, and filter by FooBean. Then use a load-testing tool to simulate real-world use and monitor: how many instances of FooBean are there? How much heap are those instances taking? Are they being garbage collected? And so on.
It takes a little bit of learning but you will get better at it the more you use it.
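If you only need a ballpark number rather than a full profiler session, a heap-delta measurement from plain Java code can complement VisualVM. This is a rough sketch, not a substitute for a profiler: it assumes that a forced GC settles the heap between the two samples, which the JVM does not guarantee, and `FootprintEstimate`/`estimateBytes` are made-up names for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Supplier;

public class FootprintEstimate {

    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    /**
     * Roughly estimates the per-instance heap cost of objects produced by
     * the given factory, by sampling used heap before and after allocating
     * {@code count} of them.
     */
    public static long estimateBytes(int count, Supplier<Object> factory) {
        // Keep strong references so the sampled objects survive the second GC.
        List<Object> keepAlive = new ArrayList<>(count);
        System.gc();
        long before = usedHeap();
        for (int i = 0; i < count; i++) {
            keepAlive.add(factory.get());
        }
        System.gc();
        long after = usedHeap();
        return Math.max(0, (after - before) / count);
    }

    public static void main(String[] args) {
        // Hypothetical stand-in for a session bean: a 1 KB payload per instance.
        long perInstance = estimateBytes(10_000, () -> new byte[1024]);
        System.out.println("~" + perInstance + " bytes per instance");
    }
}
```

Allocating many instances amortizes the measurement noise; measuring a single object this way is meaningless.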
OK, this is just a simple question, but I would really like answers from people who create (Linux) distributions, or from people involved with OS X or Windows.
The size after installation seems to keep increasing; Windows 10 requires 20 GB of disk space (64-bit). I suppose the kernel is not the problem, so the problem must be in the applications (i.e., user space). But I cannot see a big increase in the number of applications packaged with the OS, so the problem must be in how they are written, the runtime support, and so on.
Could someone comment on this?
While I don't think this question is a good fit here, I'll point out that programmers tend to rely too heavily on increasing memory capacity and processor speed.
Another thing to consider is that newer versions of software usually need to keep backwards compatibility with older versions. This multiplies memory requirements, and possibly processor requirements as well.
Newer versions of software also introduce new features (whether substantial ones or simply eye candy does not matter in this context). The result is the same as with backwards compatibility.
Many people may disagree, but others will argue that there is such a thing as "planned obsolescence": working hardware becomes obsolete simply because software requirements keep increasing.
I am trying to figure out whether the IBM Persistent Reusable JVM is still relevant and supported with the latest Java versions. I can find very little when searching the internet. There is just this document that talks about it, and it looks very old: it refers to JDK 1.4.2 and mentions that the -Xresettable feature is deprecated. I tried using -Xresettable with JDK 8 and it failed, as expected; that, however, doesn't mean the entire concept no longer works, hence this question. We have a requirement to invoke a JVM from C code and keep that JVM around, rather than destroying it, for further request processing. Since the OS is z/OS, which runs the IBM JVM, I am trying to understand whether the Persistent Reusable JVM is an option, though my hopes are dim. If anyone knows about it, please let me know.
If it turns out to be outdated, I will evaluate other options of keeping a JVM alive, but that's secondary right now.
I believe it is as you fear: the Persistent Reusable JVM feature never made it past Java 1.4.2.
If you haven't checked it out yet, look at the -Xshareclasses JVM option available in more recent IBM Java releases. It creates a shared memory cache of user and system classes, mostly with no impact on applications. The result is somewhat similar to what the Persistent Reusable JVM provided, but without the side effects.
I am exploring using soft permutations in my GWT build because the total file-system size of the compiled app (read: the sum of all permutations) matters to me. Aside from increasing the file size the user has to download and a potential decrease in runtime performance, are there any other drawbacks to using soft permutations? Any loss of localization functionality (number formatting and the like)?
For clarification, this is what I am calling soft permutations.
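For concreteness, this is the kind of module configuration I mean; the sketch below assumes the standard collapse-property syntax, and the property values are only examples:

```xml
<!-- Module XML sketch: collapsing a deferred-binding property makes the
     listed values share one compiled permutation, with the choice made at
     runtime instead of at compile time. -->
<module>
  <inherits name="com.google.gwt.user.User"/>
  <collapse-property name="user.agent" values="safari, gecko1_8"/>
  <!-- or collapse everything into a single permutation: -->
  <!-- <collapse-all-properties/> -->
</module>
```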
Thanks in advance.
I don't think there are others, except that there might be incompatibilities with existing generators/linkers (I recently proposed a patch to GWT that used soft permutations, and it was rolled back at least twice before being revisited with a runtime check and no soft permutation).
See commits r9970 through r10257.
I'm working on a small GWT project in IntelliJ using its built-in support. Dev mode works, but performance is really spotty, and I can only reload the app a handful of times before getting an OutOfMemoryError (using -Xmx512M).
What should I be able to expect out of dev mode? Do others experience consistent reload times and long running processes?
I'm running GWT 2.2 with IDEA 10.0.3. My app is small, but I do include several other modules like Activity, Place, Resources, Guava Collect + Base, UiBinder, Gin Inject, etc. I believe the performance problems started before many of these dependencies were added, though.
Thanks!
You can try increasing the PermGen size via -XX:MaxPermSize=256m; it should help. I had the same problem, analyzed what was being exhausted with VisualVM, and it turned out that PermGen was the culprit. Of course, -Xmx helps as well.
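If you would rather confirm from code which pool is filling up, the java.lang.management API reports usage per memory pool (PermGen on Java 7 and earlier, Metaspace on Java 8+). A minimal sketch; `PoolUsage` is just an illustrative name:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryUsage;
import java.util.LinkedHashMap;
import java.util.Map;

public class PoolUsage {

    /** Returns the used bytes of every memory pool the JVM exposes. */
    public static Map<String, Long> usedByPool() {
        Map<String, Long> result = new LinkedHashMap<>();
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            MemoryUsage usage = pool.getUsage();
            if (usage != null) {
                result.put(pool.getName(), usage.getUsed());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // Prints one line per pool, e.g. the permanent generation or Metaspace.
        for (Map.Entry<String, Long> e : usedByPool().entrySet()) {
            System.out.printf("%-30s %,d bytes used%n", e.getKey(), e.getValue());
        }
    }
}
```

Logging this periodically while reloading the app shows which pool grows toward its limit before the OutOfMemoryError.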