Max value of Xmx and Xms in SpringToolSuite4 - spring-tool-suite

What is the maximum memory size that can be set in SpringToolSuite4.ini for STS for the following parameters?
-Xms
-Xmx
Can the above parameters be customized, and if so, by how much can the size be increased for each?
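Both -Xms and -Xmx can be set in SpringToolSuite4.ini after the -vmargs marker. On a 64-bit JVM there is no fixed product limit; the practical ceiling is the machine's physical RAM (plus swap), while a 32-bit JVM typically tops out at roughly 1.5–2 GB on Windows. A sketch of what the relevant part of the file might look like (the values are illustrative, not a recommendation):

```ini
-vmargs
-Xms512m
-Xmx2048m
```

Note that -Xms must not exceed -Xmx, and STS will fail to start if the JVM cannot allocate the requested heap.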

Related

Spark buffer holder size limit issue

I am applying an aggregate function at column level, like
df.groupby("a").agg(collect_set("b"))
The aggregated column value grows beyond the default size of 2 GB.
Error details:
Spark job fails with an IllegalArgumentException: Cannot grow
BufferHolder error. java.lang.IllegalArgumentException: Cannot grow
BufferHolder by size 95969 because the size after growing exceeds size
limitation 2147483632
As is already known,
BufferHolder has a maximum size of 2147483632 bytes (approximately 2 GB).
If a column value exceeds this size, Spark throws this exception.
I have removed all duplicate records, called repartition(), increased the default number of partitions, and increased all the memory parameters as well, but it still fails with the above error.
We have a huge volume of data in the column after applying the collect_set aggregation.
Is there any way to increase the BufferHolder maximum size of 2 GB while processing?
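There is no supported way to raise the BufferHolder limit; it is tied to the maximum JVM array size (Integer.MAX_VALUE - 15 bytes). A common workaround is to salt the grouping key so that each aggregated set stays well under the limit, keeping the sub-groups as separate rows instead of one giant row. A minimal plain-Python sketch of the salting idea (not actual PySpark; `NUM_SALTS` and the toy rows are illustrative assumptions):

```python
from collections import defaultdict

rows = [("a", f"v{i}") for i in range(10)]  # toy (key, value) pairs
NUM_SALTS = 3  # number of sub-groups per original key

# Stage 1: group by (key, salt) so each partial set stays small.
partial = defaultdict(set)
for key, value in rows:
    salt = hash(value) % NUM_SALTS       # deterministic salt per value
    partial[(key, salt)].add(value)

# Stage 2: merge the partial sets back per original key.
# (In Spark you would usually keep the (key, salt) rows separate, or
# combine them downstream, precisely to avoid one oversized row.)
merged = defaultdict(set)
for (key, _salt), values in partial.items():
    merged[key] |= values

print(sorted(merged["a"]))  # all 10 distinct values are recovered
```

In Spark, the equivalent would be grouping by the original key plus a derived salt column, so that no single row's collect_set ever approaches the 2 GB buffer limit.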

How is the heapsize setting for a CF Liberty instance with a particular memory limit

For a CF app (Liberty runtime based) running on Bluemix, I have set a MEMORY_LIMIT of, say, 2G. I don't have any JAVA_OPTS set for the -Xms and -Xmx values. How is the heap size allotted by Bluemix? Is there a default range it uses?
Please advise.
The Liberty buildpack uses a ratio to calculate the heap size according to your memory limit.
heap_size_ratio The ratio that is used to calculate the maximum heap
size. The default heap size ratio is 0.75 (75% of the total available
memory).
https://github.com/cloudfoundry/ibm-websphere-liberty-buildpack/blob/master/docs/ibm-jdk.md
new_heap_size = mem * heap_size_ratio
https://github.com/cloudfoundry/ibm-websphere-liberty-buildpack/blob/master/lib/liberty_buildpack/jre/ibmjdk.rb#L175
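Applying that formula to the 2G limit from the question (a quick sketch; the function name is my own, not part of the buildpack API):

```python
# Heap size the Liberty buildpack would compute for a given memory limit,
# using the default heap_size_ratio of 0.75 from the buildpack docs.
def liberty_heap_size_mb(memory_limit_mb, heap_size_ratio=0.75):
    # new_heap_size = mem * heap_size_ratio
    return int(memory_limit_mb * heap_size_ratio)

# A 2G (2048 MB) MEMORY_LIMIT yields a 1536 MB maximum heap,
# i.e. the buildpack would pass roughly -Xmx1536m to the JVM.
print(liberty_heap_size_mb(2048))  # 1536
```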

WinDbg: range limit for dd <address> L <length>

WinDbg has a range limit applied for the d-command series. According to the documentation, the limit is at 256 MB. This limit can be bypassed using the L? syntax.
L? Size (with a question mark) means the same as LSize, except that L?
Size removes the debugger's automatic range limit. Typically, there is
a range limit of 256 MB, because larger ranges are typographic errors.
If you want to specify a range that is larger than 256 MB, you must
use the L? Size syntax.
However, I tried
du 3ddabac0+8 L 0n6518040
which is only about 6.5 MB, and it fails with
Range error in 'du 3ddabac0+8 l 0n6518040'.
The real limit in WinDbg 6.3 is 512 kB. Starting from 0x80001 (0n524289) you need to use L? to bypass the limit.
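The 512 kB boundary can be checked with a little arithmetic (plain Python here, just to show the numbers):

```python
# Observed d-command range limit in this WinDbg version: 512 kB.
LIMIT = 512 * 1024          # 524288 bytes = 0x80000

print(hex(LIMIT))           # 0x80000
print(LIMIT + 1)            # 524289 -> first length that needs L? (0n524289)

# The failing request from the question is well under the documented
# 256 MB limit, but well over the observed 512 kB one:
request = 6518040           # 0n6518040 from `du 3ddabac0+8 L 0n6518040`
print(request > LIMIT)      # True -> L? syntax is required
```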

How the size of stack and heap memory bound is determined in iPhone OS?

How is the maximum size of the stack and the heap set internally? How can we determine the maximum size? I am not using this in any of my projects; it is just out of curiosity.
iPhone/iOS has support for virtual memory (just no backing store in normal use), and a virtual address space much larger than physical RAM. So the maximum size for either stack or heap is reached when the sum of all (possibly dirty) memory use, in allocated pages, exhausts what is available to the current app process/sandbox, which will vary depending on what else is running on the system.

How can I resolve out of memory error in MATLAB?

I want to calculate two covariance matrices of size (10304,1034). MATLAB creates the first one, but when it runs the second command, this error occurs:
>> j=ones(10000,10000);
>> jj=ones(10000,10000);
??? Out of memory. Type HELP MEMORY for your options.
My laptop's RAM is 2GB, but it still has 1 GB free. I am using Windows 7 and 32-bit MATLAB 2009b.
How can I resolve this error?
A 10k-by-10k array of doubles uses 1e8*8 bytes, which corresponds to 800 MB. MATLAB needs these 800 MB to be contiguous. Most likely, your 1 GB of free memory is a little fragmented, so MATLAB cannot fit the new array into RAM.
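The arithmetic behind the 800 MB figure, as a quick check (Python used here only for the calculation):

```python
# Memory needed for a 10,000 x 10,000 array of doubles (8 bytes each).
n = 10_000
bytes_needed = n * n * 8

print(bytes_needed)              # 800000000 bytes
print(bytes_needed / 1e6, "MB")  # 800.0 MB (decimal megabytes)
```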
Use the command MEMORY to find out the maximum variable size that MATLAB can handle at a given moment.
Try to use sparse matrices, in that case MATLAB doesn't allocate the entire space.
Try either of these two options to slightly increase the memory available to the MATLAB process.
1- Give higher priority to the MATLAB.exe task. You can do that by going to Task Manager, Processes tab, right-clicking the MATLAB.exe task, selecting Priority, and setting it to a higher priority (say, Real Time); this tells Windows to allocate more resources to this process.
2- Increase the page file size for your applications in general. You can do this by right-clicking My Computer ->Properties->Advanced System Settings->Advanced->Performance->Virtual Memory (Change...). Then remove the tick from the Automatic .... option and set the initial and maximum page size to, say, 10000 MB.
Go to MATLAB-->File-->Preferences-->General-->Java Heap Memory and increase the level. This solved my problem.