CRX2Oak executing very slowly - AEM

After unpacking the AEM 6.3 jar, I find that the command below takes too much time to execute.
java -Xmx4096m -jar AEM_6.3-author-p4502.jar -v -x crx2oak -xargs -- --load-profile segment-no-ds --disable-mmap --exclude-paths /content/dam
Any idea where I am failing? Moreover, I will also need to remove the /content/dam exclusion so that assets are migrated as well. It's an upgrade from 6.0 to 6.3.
Thanks

You could try increasing the maximum heap via the -Xmx parameter, if more memory is available.
Maybe opt for an offline compaction before the promote (which might take even more time though)?
I would really look at error.log and upgrade.log, because if I'm not mistaken this should take about 20 minutes according to Adobe Engineering. Maybe also raise the logging level of crx2oak.
Other than that, I'm guessing you're on Windows (because of --disable-mmap), so it's going to be slow.
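For example, if the box has 8 GB to spare, only the -Xmx value changes (a sketch; drop --exclude-paths entirely if assets should be migrated too, as you noted):
java -Xmx8192m -jar AEM_6.3-author-p4502.jar -v -x crx2oak -xargs -- --load-profile segment-no-ds --disable-mmap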

Related

Is there a way to make system responsive while NetBeans is unpacking the index for Sonatype repository?

All the developers on my team here have the same problem: when NetBeans is unpacking the index for the Sonatype repository, the system becomes very, very slow. I hope there is a parameter somewhere so we can reduce the priority of that process and make it "behave"?
UPDATE: Thanks to @sashoalm for reminding me about disk I/O. I have noticed that the process does lots of disk I/O, and that probably makes the system unresponsive.
It has a nasty habit of starting at the worst possible time, so we had to turn it off…
OK, I have found a solution. I am not particularly happy with it, but anything is better than nothing.
Luckily, I use Linux exclusively, so I simply reniced the java process that was taking all the system resources. A simple renice -p <java PID here> -n 10 did the job, and the whole system is now as responsive as before.
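For reference, the whole sequence looks roughly like this (a sketch; ionice goes beyond the original fix, but may help since the update above points at disk I/O):
top                          # or ps/pgrep: find the PID of the java process doing the indexing
renice -n 10 -p <java PID>   # lower its CPU priority
ionice -c3 -p <java PID>     # optionally also drop its disk I/O to the idle class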

Achieve SBT run startup speed while executing from the command line

I've been working on a small set of command-line programs in Scala. While developing I used SBT, and tested the programs with run within the console. At this point the programs had a fast startup time (when re-run after initial compilation); nearly instant, even with additional dependencies.
Now that I'm trying to actually use them on my system outside of SBT, startup has noticeable lag. I'm looking for ways to reduce this, since the nature of these utilities requires little to no delay.
The best speeds I've achieved so far have been through Drip. I include all dependencies in a lib directory using Pack, and then run by executing a shell script like this:
#!/bin/sh
SCRIPT=$(readlink -f "$0")
SCRIPT_PATH=$(dirname "$SCRIPT")
PROG_HOME=$(cd "$SCRIPT_PATH/.." && pwd)
CLASSPATH_SUFFIX=""
# Path separator used in EXTRA_CLASSPATH
PSEP=":"
# Add the lib directory to the classpath; TagWorkspace is the main class
exec drip \
  -cp "${PROG_HOME}/lib/*${CLASSPATH_SUFFIX}" \
  TagWorkspace "$@"
This is still noticeably slower than invoking run from within SBT. I'm curious as to why SBT is able to start the application so much faster, and whether there is some way for me to leverage its strategy, or SBT itself, even if that means keeping a long-lived process around to actually run commands through.
Unless you have forking turned on for your run task, this is likely due to VM startup time. When you run from inside an active SBT session, you have an already initialized VM pointing at your classes - all SBT needs to do is create a new ClassLoader and point it at your build output directory. This bypasses all of the other (not insignificant) stuff that happens when you fire up a new VM.
Have you tried using the client VM to start your utility from the command line? Sadly, this isn't an option with 64-bit Java, since Oracle apparently doesn't want to support it, but if you're using a 32-bit VM, try adding the -client argument to the list that you give the VM from the command line.
If you are using a 64-bit VM, some googling will find you some unofficial forks of OpenJDK that have the client VM re-enabled. It's really just a #define in the JVM build itself - it works fine once it's been compiled in.
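Either way, a quick check of how much the flag buys you might look like this (a sketch; the classpath layout and TagWorkspace come from the script in the question, and -client only has an effect where a client VM exists):
time java -client -cp "${PROG_HOME}/lib/*" TagWorkspace
time java -server -cp "${PROG_HOME}/lib/*" TagWorkspace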
The only slowness I have is launching SBT. Running a hello-world Scala app with java (no Drip) version 1.8 on a 7381-bogomips CPU takes only 0.2 seconds.
If you're not in that magnitude, I suspect your application startup requires loading thousands of classes, and creating instances of them.
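One cheap way to test that suspicion (a sketch; the classpath and main class are the ones from the question, and the grep pattern matches HotSpot's Java 8 loaded-class lines):
java -verbose:class -cp "${PROG_HOME}/lib/*" TagWorkspace 2>&1 | grep -c '\[Loaded'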

Restricting Eclipse memory usage

I am using 32-bit Windows XP with no upgrade in sight; is there a way to limit how much memory Eclipse allocates throughout the day? I am also running a WebLogic 10 server in debug mode inside Eclipse. After a few hours I have a 700MB STS.exe (Eclipse) and a 400MB java.exe (server). Is there at least a way to force a GC on Eclipse?
Here are the settings I currently use, which seem to me are not being observed.
-vm
C:\bea\jdk160_05\bin\javaw.exe
-showsplash
--launcher.XXMaxPermSize
128M
-vmargs
-Dosgi.requiredJavaVersion=1.5
-Xms40m -Xmx512m
-Dsun.lang.ClassLoader.allowArraySyntax=true
EDIT: here's the monster of a project: Eclipse and Firefox 4.
is there a way to limit how much memory Eclipse allocates throughout the day?
The values of -Xmx and -XX:MaxPermSize place an upper bound on the memory that the JVM will use.
Is there at least a way to force a GC on eclipse?
AFAIK, no. Even if there was, it probably wouldn't help. The JVM is unlikely to give the memory back to the operating system.
Here are the settings I currently use, which seem to me are not being observed.
Based on what you've said (memory usage 700Mb for eclipse.exe), I'd say the settings ARE being observed.
What can you do to get Eclipse to use less memory?
Trim the values of -Xmx and -XX:MaxPermSize (see the trimmed .ini sketch at the end of this answer). However, if you do this too much you are liable to make Eclipse sluggish ('cos it has to GC more frequently) and ultimately flaky ('cos it will run out of memory and things will start failing with OOMEs).
Get rid of superfluous plugins.
Switch to a "smaller" Eclipse (e.g. the "Classic" distro) ... though you'll lose some of the J2EE support that you are probably using.
Close projects.
Close files.
Restart Eclipse more often.
But the best solution is to upgrade your platform:
Buy some more memory for your PC / laptop. You can probably max it out for a couple of hundred dollars. It is worth it.
Switch the OS to Linux. In my experience, Linux is a much better platform for doing Java software development than Windows XP. It seems to do a much better job in terms of both memory and file system management. The performance difference on identical hardware is significant.
You can always set up your machine to dual boot, so that you can still run XP for other things.
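As a concrete example of the trimming suggested above, here is a sketch of the same .ini with tighter values (the numbers are illustrative, not tuned for your workload):
-vm
C:\bea\jdk160_05\bin\javaw.exe
-showsplash
--launcher.XXMaxPermSize
96M
-vmargs
-Dosgi.requiredJavaVersion=1.5
-Xms40m -Xmx384m
-Dsun.lang.ClassLoader.allowArraySyntax=true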
The 700MB of OS-level process usage is consistent with your memory settings: take 512MB for the heap + 128MB for permgen space + a bit of overhead for the JVM itself.
You cannot force GC. If memory were releasable, the JVM would release it. Note that you cannot simply look at OS-level process usage, as that represents the point where the JVM's memory usage peaked. The JVM never releases heap or permgen space once it has expanded to a certain size; it is simply too expensive and not really necessary. The allocated space is in the virtual address space of the JVM process and doesn't represent actual physical memory usage. As physical memory gets tight, the OS will aggressively swap out to disk memory pages that haven't been used recently.
So... you need specialized memory analysis tools to get an accurate representation of Eclipse memory usage.
If you are seeing a lot of disk activity when there shouldn't be any, the OS may indeed be out of physical memory and doing a lot of swapping. Try closing a few projects or restarting Eclipse. It only takes one plugin with a memory leak to consume all of your available memory.
I'm assuming that you are running BEA JRockit judging by the path to javaw.exe. Note that the -X... options are JVM-specific, so they may not work the same way for JRockit as in Sun's JVM. This link seems to indicate that you should be using
... -Xms:40m -Xmx:512m ...
But setting a heap size limit will only cause Eclipse to fail once it reaches the limit, so you won't solve the real problem. Forcing a GC usually doesn't help; any sane VM will GC periodically or when needed.
Having a 700MB Eclipse sounds like your Eclipse is either processing large amounts of data or leaking memory. JRockit seems to have a memory leak detector, which may be able to give you a hint of where the problem lies.
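Its command-line companion jrcmd can give a first overview before reaching for the full leak detector (a sketch; these commands are JRockit-specific and the PID is hypothetical):
jrcmd                         # with no arguments: list running JRockit JVMs and their PIDs
jrcmd <pid> print_memusage    # breakdown of the JVM's actual memory usage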
The .ini file memory settings won't help for the virtual machine. Change the VM arguments instead, as described here.
Eclipse is very thirsty. Limiting it is most likely to crash it. You really need a new computer.

Why does NetBeans freeze when I'm trying to type?

I'm using NetBeans 6.7 on Windows XP. I'm not really sure what the pattern is, but lately performance has gotten really bad, to the point where it's almost unusable. Any ideas for where to look for slowdowns?
Intel Core Duo 2.2 GHz, 3.5 GB of RAM, according to the system properties panel. 90 GB of free hard disk space.
NetBeans 6.5 "leaks" temporary files. It creates temporary files in %TEMP% (typically C:\Documents and Settings\<username>\Local Settings\Temp) and does not delete them. When enough files accumulate, access to the temporary directory slows to a crawl. That in turn drags NB down to a crawl.
To clean it up:
Shut down NetBeans
Open a command prompt and type:
cd %TEMP%
del *.java
del *.form
del output*
del *vcs*
Important:
Do not try to do this with Windows Explorer. It won't work.
The deletes can take several minutes each. Be patient.
This is much better in 6.7 and I have not seen it at all in 6.8.
If you're running on Java 6 you can use the jconsole app to connect to your running NetBeans instance and see, among other things, what the threads are doing, memory usage, and whether you're in a race condition.
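For example (a sketch; the PID is whatever jps reports for the NetBeans process):
jps -l            # find the NetBeans process id
jconsole <pid>    # attach: the Threads tab shows blocked or deadlocked threads, the Memory tab shows heap usage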

Free memory NetBeans plugin

Is there any plugin for NetBeans to free memory manually?
NetBeans' core has this feature but it is not complete.
You can't free memory whenever you want in a Java application! The only way to request it is to call "System.gc()", but this merely suggests that the Java Virtual Machine run the garbage collector (the JVM is free to do the GC or not).
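If you really want to nudge the collector from outside the IDE, JDK 7 and later ship jcmd (a sketch going beyond the original answer; the PID is whatever jps reports for NetBeans):
jps -l              # find the NetBeans JVM's process id
jcmd <pid> GC.run   # ask that JVM to run a full GC (still only a request)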
If you're running out of memory while running NetBeans, you may want to open [netbeans]/etc/netbeans.conf and edit the line that starts with "netbeans_default_options". For instance:
netbeans_default_options="-J-Xms384m -J-Xmx512m -J-XX:PermSize=32m -J-XX:MaxPermSize=96m -J-Xverify:none"
Anyway, remember that modifying the memory settings for NetBeans does not affect the applications you run from NetBeans, because those run in another Java Virtual Machine.
You don't need any plugin; just click on the memory bar in the toolbar.
This may be relevant and helpful...
I had a similar memory problem on my Debian Linux box.
Here is how to fix it:
run a terminal
log in as root
type crontab -e
scroll to the bottom of the file and type * * * * * sync; echo 3 > /proc/sys/vm/drop_caches
This magic line cleared all unused RAM every minute. It removed the unused memory NetBeans was producing (along with that of any other memory-consuming programs).
This should work on most UNIX-like OSes.
Please tell me if it works...