WinDbg: range limit for dd <address> L <length>

WinDbg applies a range limit to the d-command series. According to the documentation, the limit is 256 MB and can be bypassed using the L? syntax.
L? Size (with a question mark) means the same as L Size, except that L? Size removes the debugger's automatic range limit. Typically, there is a range limit of 256 MB, because larger ranges are typographic errors. If you want to specify a range that is larger than 256 MB, you must use the L? Size syntax.
However, I tried
du 3ddabac0+8 L 0n6518040
which is only 6.5 MB, and it says
Range error in 'du 3ddabac0+8 l 0n6518040'.

The real limit in WinDbg 6.3 is 512 KB. Starting from 0x80001 (0n524289) you need to use L? to bypass the limit.
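For example, the command from the question should go through once the question mark is added (same address and element count):
du 3ddabac0+8 L? 0n6518040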

Related

Character limit of property value in Google Apps Script API

I am testing a G Suite add-on and, for this purpose, I need to know the maximum character limit of a property value.
As the documentation here says, the quota limit for property value size is 9 KB. Since character size varies between languages, what character size should I use to calculate the character limit?
You can estimate that 1 KB is around 1024 characters.
This means that 9 KB corresponds to around 9216 characters.
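One caveat: the 9 KB quota is a size in bytes, so the number of characters that fit depends on how many bytes each character takes. A rough sketch in Python, assuming the stored value is encoded as UTF-8 (an assumption, not something the quota documentation states):

# Bytes per character under UTF-8 (the encoding is an assumption here).
for ch in ["a", "é", "अ", "😀"]:
    print(ch, len(ch.encode("utf-8")), "bytes")   # 1, 2, 3 and 4 bytes respectively

ASCII characters take 1 byte each, so roughly 9 * 1024 = 9216 of them fit in 9 KB; many non-Latin characters take 2-4 bytes, so correspondingly fewer fit.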

What is the maximum size of symbol data type in KDB+?

I cannot find the maximum size of the symbol data type in KDB+.
Does anyone know what it is?
If you are talking about the physical length of a symbol: symbols exist as interned strings in kdb, so the maximum string length limit would apply. As strings are just a list of characters in kdb, the maximum size of a string would be the maximum length of a list. In 3.x this is 2^64 - 1; in previous versions of kdb the limit was 2,000,000,000.
However, there is a 2 TB maximum serialized-size limit that would likely kick in first. You can roughly work out the size of a sym by serializing it:
q)count -8!`
10
q)count -8!`a
11
q)count -8!`abc
13
So each character adds a single byte, which would give a length limit of roughly 10^12 characters.
If you mean the maximum number of symbols that can exist in memory, then the limit is 1.4B.

Perl module to convert between MB/GB/TB without converting to bytes first?

I'm trying to calculate the free space on an LVM physical volume by multiplying the number of free physical extents by the extent size, for example:
3623365 free extents * 4.00 MB each = 13.8 TB
I was using Number::Format to convert the extent size to bytes and convert the results of the multiplication back to a human-readable string, but TB and higher are not supported, so I end up with the longer, less readable 14,153.8 GB.
According to the docs, the reason TB and up are not supported is because of integer overflows on 32-bit systems, which made me wonder if I should even be multiplying arbitrary large numbers without using something like Math::BigInt. I see that Number::Bytes::Human supports numbers up to YB (yottabytes), but it's still in alpha so I hesitate to use it in production code.
My next thought was, why even convert to bytes in the first place when I could calculate the free space in MB and then convert to TB? Unfortunately, it seems like neither Number::Format nor Number::Bytes::Human supports conversions from one "suffix" to another, e.g. MB -> TB. Is there an existing module that does this? I really like how Number::Format and Number::Bytes::Human handle both SI/non-SI units (MB vs. MiB), allow you to set the precision, etc. and so hesitate to roll my own solution if a similarly full-featured module already does it.
Edit: The extent size is not always in MB, nor is the free space always in TB, so I am not asking how to convert from MB to TB (that would be trivial). I am asking if there are any existing modules that can convert from one [arbitrary] suffix to another without converting to bytes first.
To convert from MB to TB without going through bytes:
Number of TB = Number of MB * (Bytes in 1 MB / Bytes in 1 TB)
UPDATE:
To generalize:
Number of new units = Number of old units * (Bytes in 1 old unit / Bytes in 1 new unit)
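Not an existing Perl module, but a minimal sketch of that formula (shown in Python purely to illustrate the arithmetic, assuming binary 1024-based units to match the 13.8 TB figure above):

# Generic suffix-to-suffix conversion without computing a byte count first.
# Swap 1024 for 1000 if you want SI (decimal) units instead.
UNITS = {"B": 0, "KB": 1, "MB": 2, "GB": 3, "TB": 4, "PB": 5}

def convert(value, old_unit, new_unit):
    # value * (bytes in 1 old unit / bytes in 1 new unit)
    return value * 1024 ** (UNITS[old_unit] - UNITS[new_unit])

print(convert(3623365 * 4.00, "MB", "TB"))   # ~13.8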

Is there a limit on the message size for SHA-256?

When hashing a string, like a password, with SHA-256, is there a limit to the length of the string I am hashing? For example, is it only "safe" to hash strings that are smaller than 64 characters?
There is technically a limit, but it's quite large. The padding scheme used for SHA-256 requires that the size of the input (in bits) be expressed as a 64-bit number. Therefore, the maximum size is (2^64 - 1)/8 bytes, roughly 2 million terabytes (about 2.3 exabytes).
That renders the limit almost entirely theoretical, not practical.
Most people don't have the storage for nearly that much data anyway, but even if they did, processing it all serially to produce a single hash would take an amount of time most would consider prohibitive.
A quick back-of-the-envelope kind of calculation indicates that even with the fastest enterprise SSDs currently¹ listed on Tom's Hardware, and striping them 16 wide to improve bandwidth, just reading that quantity of data would still take about 220 years.
1. As of April 2016.
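To double-check the arithmetic on that limit (a quick Python sketch):

# The length field in SHA-256's padding is 64 bits, so the longest
# allowed message is 2^64 - 1 bits.
max_bits = 2**64 - 1
max_bytes = max_bits // 8
print(max_bytes)            # 2305843009213693951 bytes
print(max_bytes / 10**18)   # ~2.3 exabytes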
There is no such limit, other than the maximum message size of 2^64 - 1 bits. SHA2 is frequently used to generate hashes for executables, which tend to be much larger than a few dozen bytes.
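In practice such large inputs are hashed incrementally rather than loaded into memory; a minimal Python sketch (the filename is just a placeholder):

import hashlib

# Hash a large file in 1 MiB chunks; only one chunk is in memory at a time.
h = hashlib.sha256()
with open("big.iso", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)
print(h.hexdigest())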
The upper limit is given in the NIST standard FIPS 180-4. The reason for the upper limit is the padding scheme, which serves as a countermeasure against attacks that exploit an artifact of the Merkle-Damgard construction; the message length l is appended to the message as the last step of padding.
Then append the 64-bit block that is equal to the number l expressed using a binary representation
Therefore, by the NIST standard, the maximum size that can be hashed with SHA-256 is 2^64 - 1 bits (approximately 2.305 exabytes; that is close to the lower range of estimates for the NSA's data center in Utah, so you don't need to worry).
NIST allows hashing the zero-length message, so the message length ranges from 0 to 2^64 - 1 bits.
If you need to hash files larger than 2^64 - 1 bits, then either use SHA-512, which has a 2^128 - 1 bit limit, or SHA-3, which has no length limit.

Maximum array size in MATLAB?

I'm writing a MATLAB program that will generate a matrix with 1 million rows and an unknown number of columns (at most 1 million).
I tried pre-allocating this matrix:
a=zeros(1000000,1000000)
but I received the error:
"Maximum variable size allowed by the program is exceeded."
I have a feeling that not pre-allocating this matrix will seriously slow the code down.
This made me curious: What is the maximum array size in MATLAB?
Update: I'm going to look into sparse matrices, because the result I am aiming for in this particular problem will be a matrix consisting largely of zeros.
Take a look at this page; it lists the maximum sizes: Max sizes
It looks to be on the order of a few hundred million. Note that the matrix you're trying to create here has 1e6 * 1e6 = 1e12 elements. This is many orders of magnitude greater than the max sizes provided, and you also do not have that much RAM on your system.
My suggestion is to look into a different algorithm for what you are trying to accomplish.
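For perspective on why preallocating the full matrix fails, a quick back-of-the-envelope calculation (plain Python, purely for the arithmetic):

# Memory needed for a dense 1e6 x 1e6 matrix of doubles.
elements = 10**6 * 10**6        # 1e12 elements
bytes_needed = elements * 8     # 8 bytes per double
print(bytes_needed)             # 8e12 bytes
print(bytes_needed / 2**40)     # ~7.3 TiB of RAM for the full matrix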
To find out the real maximum array size (Windows only), use the command user = memory. user.MaxPossibleArrayBytes shows how many bytes of contiguous RAM are free. Divide that by the number of bytes per element of your array (8 for doubles) and you know the maximum number of elements you can preallocate.
Note that, as woodchips said, Matlab may have to copy your array (if you pass by value to a subfunction, for example). In my experience, about 75% of the max possible array size is usually available multiple times.
The Limits
There are two different limits to be aware of:
1. The maximum array size (in terms of number of elements) allowed by MATLAB, regardless of current memory availability.
2. The bytes currently available for a single array -- the (current) maximum possible array size in bytes.
The first limit is what causes "Maximum variable size allowed by the program is exceeded", not the second limit. However, the second one is also a practical limit of which you must be aware!
Checking the Limits
The maximum number of elements allowed for an array is checked as follows:
>> [~,maxsize] = computer
maxsize =
2.8147e+14
According to the documentation for the computer command, this returns:
maximum number of elements allowed in a matrix on this version of MATLAB
This is a static MATLAB limit on number of elements, not affected by the state of the computer (hardware specs and current memory usage). And at over 2 petabytes for a double array of that length, it's also way higher than any computer of which I am aware!
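As a sanity check on that "2 petabytes" figure (again just arithmetic, in Python):

print(2.8147e14 * 8)           # ~2.25e15 bytes for a full double array of that length
print(2.8147e14 * 8 / 2**50)   # ~2 PiB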
On the other hand, the largest practical array size that you can create at any given moment can be checked by the memory command:
>> memory
Maximum possible array: 35237 MB (3.695e+10 bytes) *
Memory available for all arrays: 35237 MB (3.695e+10 bytes) *
Memory used by MATLAB: 9545 MB (1.001e+10 bytes)
Physical Memory (RAM): 24574 MB (2.577e+10 bytes)
* Limited by System Memory (physical + swap file) available.
As the message says, these values are based on actual current memory availability, taking into account both physical memory and the swap file (collectively, virtual memory).
If needed, these values can be accessed programmatically with m = memory;.
Adjusting the Limits
The first limit (the hard limit) was fixed up until R2015a; since then it can be changed (but only reduced to a fraction of system memory) through a setting in MATLAB's Workspace preferences.
You can't increase it beyond your system limits.
The second limit obviously has no "setting" in MATLAB, since it's based on available memory and computer configuration. Aside from adding RAM, there's not a lot you can do: (1) use pack to consolidate workspace memory and perform "garbage collection", though this may only help on certain platforms, and (2) increase the page file size so that other things can swap out and give MATLAB more physical memory. But be cautious when relying on your page file, as your computer may become unresponsive if page-file thrashing occurs.
In older versions of Matlab that don't include the memory command, you can use:
feature memstats
Physical Memory (RAM):
In Use: 738 MB (2e2c3000)
Free: 273 MB (11102000)
Total: 1011 MB (3f3c5000)
Page File (Swap space):
In Use: 1321 MB (529a4000)
Free: 1105 MB (45169000)
Total: 2427 MB (97b0d000)
Virtual Memory (Address Space):
In Use: 887 MB (37723000)
Free: 1160 MB (488bd000)
Total: 2047 MB (7ffe0000)
Largest Contiguous Free Blocks:
1. [at 4986b000] 197 MB ( c585000)
2. [at 3e1b9000] 178 MB ( b2a7000)
3. [at 1f5a0000] 104 MB ( 6800000)
4. [at 56032000] 77 MB ( 4d3e000)
5. [at 68b40000] 70 MB ( 4660000)
6. [at 3a320000] 54 MB ( 3610000)
7. [at 63568000] 45 MB ( 2d48000)
8. [at 35aff000] 40 MB ( 2821000)
9. [at 60f86000] 37 MB ( 25ca000)
10. [at 6f49d000] 37 MB ( 25b3000)
======= ==========
842 MB (34ac0000)
ans =
207114240
You can't suppress the output, but it returns the largest memory block available (207,114,240 bytes / 8 = 25,889,280 doubles).