Specify memory for the Ant Maven deploy task

I am using the Ant Maven deploy task to upload the zip file created by our Ant script to our repository, but the file is too big and the upload fails with
java.lang.OutOfMemoryError: Java heap space. Here is the task:
<deploy uniqueVersion="false">
    <remoteRepository url="${repository}" id="${repositoryId}"/>
    <remoteSnapshotRepository url="${snapshotRepository}" id="${snapshotRepositoryId}"/>
    <attach file="target/${qname}-dist.zip" type="zip"/>
    <pom file="pom.xml" groupId="com.my.company" artifactId="test" packaging="zip" version="${version}"/>
</deploy>
How do I specify the heap size here? I can't find anything for it on the deploy task or any of its nested elements.

Maven doesn't fork for the deploy task, so to increase the memory you have to increase the heap size of the Maven process itself. Set your MAVEN_OPTS environment variable to include the -Xmx setting: MAVEN_OPTS="-Xmx512m"
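A minimal sketch, assuming a Unix-like shell (on Windows, use set instead of export). Note that when the deploy task runs inside Ant's own JVM, as in the question, the analogous variable for the Ant launcher is ANT_OPTS; the target name here is a placeholder:

export MAVEN_OPTS="-Xmx512m"   # picked up by the mvn launcher script
export ANT_OPTS="-Xmx512m"     # picked up by the ant launcher script
ant deploy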

Related

How to construct an Eclipse instance for standalone (CI) execution?

I need to use the p2 publishing tools from a headless build, and hence need, AFAICT, to assemble an Eclipse execution environment for that. I'm basing this on Running p2.process.artifacts in Jenkins, Running P2 Ant tasks outside Eclipse, and the related documentation.
But I can't find how to assemble the directory structure necessary to start Eclipse from. For instance, if I stupidly just run the launcher in any old directory as in:
java -jar <targetProductFolder>/plugins/org.eclipse.equinox.launcher_*.jar
-application org.eclipse.equinox.p2.publisher.UpdateSitePublisher ...
it won't run without access to all the required plugins, probably some config files, etc. How should these be assembled?
The context is a build environment, where all source and tooling (including whatever is needed for headless Eclipse) is checked out of repositories prior to the build.
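For reference, once a complete Eclipse installation (including its configuration/ directory, not just a bare plugins folder) is available, a typical headless publisher invocation looks roughly like the sketch below; the repository and source paths are placeholders:

java -jar eclipse/plugins/org.eclipse.equinox.launcher_*.jar \
  -application org.eclipse.equinox.p2.publisher.UpdateSitePublisher \
  -metadataRepository file:/build/repo \
  -artifactRepository file:/build/repo \
  -source /build/update-site \
  -publishArtifacts -compress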

Project Deployment Error - GC Overhead Limit Exceeded

I am using Tomcat as the server and Eclipse as my IDE, and I am using Maven.
I am getting "GC Overhead Limit Exceeded" when I do a project clean on my Spring project.
The reason I am getting the above error is low memory allocation for the VM.
The solution for this is:
1. Go to the bin folder of Tomcat.
2. Increase the size of the permgen in the catalina.sh file, e.g.:
CATALINA_OPTS="$CATALINA_OPTS -Xms1024m -Xmx1024m -XX:NewSize=256m
-XX:MaxNewSize=356m -XX:PermSize=256m -XX:MaxPermSize=356m"
Add the above line at the top of the catalina.sh file and restart Tomcat (if it still doesn't work, restart your Eclipse as well). It worked for me.
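If editing catalina.sh directly feels intrusive, a common alternative (assuming a standard Tomcat layout) is to put the options in bin/setenv.sh, which catalina.sh sources automatically when the file exists:

# bin/setenv.sh -- create the file if it does not exist yet
CATALINA_OPTS="$CATALINA_OPTS -Xms1024m -Xmx1024m -XX:PermSize=256m -XX:MaxPermSize=356m"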

Start a Hadoop Map Reduce Job on a remote cluster in Eclipse with the run dialog (F11)

Is it possible to start a Map Reduce job on a remote cluster with the Eclipse Run Dialog (F11)?
Currently I have to run it with the External Tool Chain Dialog and Maven.
Note: running it on a local cluster is no big deal with the Run Dialog, but for a remote connection a compiled JAR is mandatory; otherwise you get a ClassNotFoundException (even if Jar-By-Class is set).
Our current Setup is:
Spring-Data-Hadoop 1.0.0
STS (SpringSource Tool Suite)
Maven
CDH4
This is what we set in our applicationContext.xml (it is what you would specify in the *-site.xml files on vanilla Hadoop):
<hdp:configuration id="hadoopConfiguration">
fs.defaultFS=hdfs://carolin.ixcloud.net:8020
mapred.job.tracker=michaela.ixcloud.net:8021
</hdp:configuration>
Is there a way to tell Eclipse that it should build a JAR when the Run Dialog is executed?
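For reference, the vanilla-Hadoop equivalent of those two properties would live in core-site.xml and mapred-site.xml respectively:

<!-- core-site.xml -->
<property>
    <name>fs.defaultFS</name>
    <value>hdfs://carolin.ixcloud.net:8020</value>
</property>
<!-- mapred-site.xml -->
<property>
    <name>mapred.job.tracker</name>
    <value>michaela.ixcloud.net:8021</value>
</property>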
I do not know whether it builds a new JAR for you (you may have to build the JAR yourself first), but adding your JAR under Run Configurations -> Classpath clears up the ClassNotFoundException problem.
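One way to make the remote submission explicit is to point the job configuration at the compiled JAR so Hadoop ships it to the cluster. A sketch, assuming the JAR has already been built (e.g. by mvn package); the path and class names are hypothetical:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class RemoteJobLauncher {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same endpoints as in the applicationContext.xml above
        conf.set("fs.defaultFS", "hdfs://carolin.ixcloud.net:8020");
        conf.set("mapred.job.tracker", "michaela.ixcloud.net:8021");
        // Ship the prebuilt JAR to the cluster; setting Jar-By-Class alone
        // does not help when no compiled JAR exists.
        conf.set("mapred.jar", "target/myjob.jar"); // hypothetical path
        Job job = Job.getInstance(conf, "remote-job");
        // ...configure mapper, reducer, input and output paths here...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}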

IntelliJ increase Scalatest Heap space

I'm using IntelliJ to run ScalaTest tests. The problem I'm having is that the tests are running out of heap space (likely because my tests use Selenium and start up Jetty instances to hit my API).
I know how to increase the heap space for IntelliJ itself, but after increasing it the tests still run out of heap.
Is there a different place to increase heap space for tests, rather than the usual IntelliJ info.plist? (Oh, btw, I'm on a Mac.)
Go to Run > Edit Configurations, choose the test on the left, and tweak its VM options (e.g. -Xmx2048m).
In case you are using a ScalaTest Ant task, there is a jvmarg that you can set (use line= rather than value= so that the two flags are passed as separate JVM arguments):
<jvmarg line="-Xmx2048m -XX:MaxPermSize=128m"/>
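A slightly fuller sketch of that Ant task, assuming ScalaTest is available on a test.classpath path reference; the suite name and paths are placeholders (jvmarg only takes effect in a forked JVM):

<taskdef name="scalatest" classname="org.scalatest.tools.ScalaTestAntTask" classpathref="test.classpath"/>
<scalatest fork="true" haltonfailure="true">
    <runpath>
        <pathelement location="target/test-classes"/>
    </runpath>
    <!-- line= splits on whitespace; value= would pass both flags as a single token -->
    <jvmarg line="-Xmx2048m -XX:MaxPermSize=128m"/>
    <suite classname="com.example.MySeleniumSuite"/>
    <reporter type="stdout"/>
</scalatest>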
As you rightly observed, it is not IDEA's heap size that needs to be increased but rather that of the ScalaTest run configuration accessible from the Run > Edit Configurations... dialog. There, you should be able to set the VM Options for the tests you are trying to run.

How do I increase memory for the Ant 'javadoc' task?

When running an Ant build script to generate Javadoc, Eclipse hits an OutOfMemoryError.
The Ant build has the -Xmx512m and -Xms512m settings under the JRE tab in the run configuration.
This works great for compiling the application.
The only trouble is with the Javadoc portion of the build. Here is the build.xml file:
<target name="JavaDoc" description="Create Javadocs">
    <javadoc destdir="c:/javadoc" windowtitle="My API">
        <classpath refid="application.classpath" />
        <packageset dir="Source">
            <include name="**" />
        </packageset>
    </javadoc>
</target>
When the build script runs, I see a two-step process:
Eclipse launches
org.eclipse.ant.internal.ui.antsupport.InternalAntRunner
Visual VM shows that this process launches with the Heap memory arguments listed above.
This process then spawns a second process, "JavaDoc", and the VM arguments are not passed along to it.
In VisualVM it can be confirmed that the JavaDoc process has the default -Xms8m value and around a 64m -Xmx value before the OOM error is thrown.
Under the Ant preferences in Eclipse I have attempted to add an 'ANT_OPTS' variable to pass the JVM args to JavaDoc.
The change did not work.
The build does work if I create a batch file and set the ANT_OPTS values.
set ANT_OPTS=-Xms512m -Xmx512m
ant -file C:\myApp\build.xml JavaDoc
But creating the batch file is defeating the purpose of allowing me to build everything directly in Eclipse.
I have also tried adding an <arg> element to the build file, which would hardcode a heap size:
<arg value="ANT_OPTS=-Xms512m -Xmx512m" />
Any idea how to set the value so my javadoc will spawn with more heap size?
The javadoc task has a maxmemory attribute for specifying this. It allows you to tune the memory for this task separately.
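Applied to the target from the question, that looks like this (512m simply mirrors the heap size already used for the rest of the build):

<target name="JavaDoc" description="Create Javadocs">
    <javadoc destdir="c:/javadoc" windowtitle="My API" maxmemory="512m">
        <classpath refid="application.classpath" />
        <packageset dir="Source">
            <include name="**" />
        </packageset>
    </javadoc>
</target>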
It is worth noting that apparently you can get an OOME from javadoc for another reason; see this other question: Out of memory error in ant.