I know that there are already answers on SO, but after trying a few of them (ex1, ex2) I still cannot produce the correct public key from a modulus and an exponent.
This is my Python 3 code:
from Crypto.PublicKey.RSA import construct
import urllib.parse
import base64
import re
def decode_base64(data, altchars=b'+/'):
    """Decode base64, padding being optional.

    :param data: Base64 data as an ASCII byte string
    :returns: The decoded byte string.
    """
    data = re.sub(rb'[^a-zA-Z0-9%s]+' % altchars, b'', data)  # normalize
    missing_padding = len(data) % 4
    if missing_padding:
        data += b'=' * (4 - missing_padding)
    return base64.b64decode(data, altchars)
e = int.from_bytes(decode_base64(b'AQAB'), 'big', signed=False)
decoded = decode_base64(b'tVKUtcx_n9rt5afY_2WFNvU6PlFMggCatsZ3l4RjKxH0jgdLq6CScb0P3ZGXYbPzXvmmLiWZizpb-h0qup5jznOvOr-Dhw9908584BSgC83YacjWNqEK3urxhyE2jWjwRm2N95WGgb5mzE5XmZIvkvyXnn7X8dvgFPF5QwIngGsDG8LyHuJWlaDhr_EPLMW4wHvH0zZCuRMARIJmmqiMy3VD4ftq4nS5s8vJL0pVSrkuNojtokp84AtkADCDU_BUhrc2sIgfnvZ03koCQRoZmWiHu86SuJZYkDFstVTVSR0hiXudFlfQ2rOhPlpObmku68lXw-7V-P7jwrQRFfQVXw')
n = int.from_bytes(decoded, 'big', signed=False)
rsaKey = construct((n, e))
pubKey = rsaKey.exportKey()
print(pubKey.decode('ascii'))
But whenever I try to verify a jwt token I get a "signature_invalid" error.
Am I not decoding the binary encoded bytes correctly?
--- UPDATE ---
As suggested in the comments, I've updated my code to URL-decode the bytes first, but I still get the same signature-invalid error as before.
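For reference, a minimal sketch of base64url decoding with the standard library (the decode_base64url helper below is hypothetical; it assumes the JWK 'n' and 'e' values are unpadded base64url, which urlsafe_b64decode handles once the padding is added back). This is only an assumption about where the decoding can go wrong, not a confirmed fix:
import base64

def decode_base64url(data: bytes) -> bytes:
    """Decode unpadded base64url data, as used for JWK 'n' and 'e' values."""
    return base64.urlsafe_b64decode(data + b'=' * (-len(data) % 4))

e = int.from_bytes(decode_base64url(b'AQAB'), 'big')  # 65537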
I understand that you have a hex string and perform SHA256 on it twice and then byte-swap the final hex string. The goal of this code is to find a Merkle Root by concatenating two transactions. I would like to understand what's going on in the background a bit more. What exactly are you decoding and encoding?
import hashlib
transaction_hex = "93a05cac6ae03dd55172534c53be0738a50257bb3be69fff2c7595d677ad53666e344634584d07b8d8bc017680f342bc6aad523da31bc2b19e1ec0921078e872"
transaction_bin = transaction_hex.decode('hex')
hash = hashlib.sha256(hashlib.sha256(transaction_bin).digest()).digest()
hash.encode('hex_codec')
'38805219c8ac7e9a96416d706dc1d8f638b12f46b94dfd1362b5d16cf62e68ff'
hash[::-1].encode('hex_codec')
'ff682ef66cd1b56213fd4db9462fb138f6d8c16d706d41969a7eacc819528038'
transaction_hex is a regular string of lowercase ASCII characters, and the decode() method with the 'hex' argument (Python 2) changes it to a binary string with bytes 0x93 0xa0 etc. In C it would be an array of unsigned char, of length 64 in this case.
This byte string of length 64 is then hashed with SHA256, and the result (another binary string, of size 32) is hashed again. So hash is a string of length 32. Then encode('hex_codec') is a synonym for encode('hex') in Python 2; in Python 3 neither exists on bytes, and you would use bytes.hex() or codecs.encode(data, 'hex') instead. It outputs an ASCII (lower hex) string again, replacing each raw byte (which is just a small integer) with the two-character string that is its hexadecimal representation. So the final bit reverses the byte order of the double hash and outputs it as hexadecimal, in a form I usually call "lowercase hex ASCII".
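For readers on Python 3, where str.decode('hex') and bytes.encode('hex_codec') no longer exist, a roughly equivalent sketch of the same double hash and byte reversal would be:
import hashlib

transaction_hex = "93a05cac6ae03dd55172534c53be0738a50257bb3be69fff2c7595d677ad53666e344634584d07b8d8bc017680f342bc6aad523da31bc2b19e1ec0921078e872"

transaction_bin = bytes.fromhex(transaction_hex)  # 64 raw bytes
digest = hashlib.sha256(hashlib.sha256(transaction_bin).digest()).digest()

print(digest.hex())        # double SHA-256, as a lowercase hex string
print(digest[::-1].hex())  # byte-reversed, the way Bitcoin displays hashes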
How can I convert a message into a hash value using SHA/MD5 hashing in MATLAB? Is there any built-in function or ready-made code for this?
There are no built-in functions in MATLAB to calculate hashes. However, you can call Java (any OS) or .NET (Windows only) functions directly from MATLAB, and either of these implements what you want.
Note that you haven't specified the encoding of the string. The hash is different if you consider the string in ASCII, UTF-8, UTF-16, etc.
Also note that MATLAB does not have 160-bit or 256-bit integer types, so the hash obviously can't be a single integer.
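As a side illustration of why the encoding matters (shown in Python rather than MATLAB, purely because it is quick to run), the same text produces different digests depending on how it is turned into bytes:
import hashlib

text = 'héllo'  # any non-ASCII character makes the difference obvious
for encoding in ('utf-8', 'utf-16-le', 'latin-1'):
    print(encoding, hashlib.sha256(text.encode(encoding)).hexdigest())
# Three different byte sequences, therefore three different SHA-256 digests.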
Anyway, using .NET:
SHA256
string = 'some string';
sha256hasher = System.Security.Cryptography.SHA256Managed;
sha256 = uint8(sha256hasher.ComputeHash(uint8(string)));
dec2hex(sha256)
SHA1
sha1hasher = System.Security.Cryptography.SHA1Managed;
sha1= uint8(sha1hasher.ComputeHash(uint8(string)));
dec2hex(sha1)
A Java-based solution can be found at the following link:
https://www.mathworks.com/matlabcentral/answers/45323-how-to-calculate-hash-sum-of-a-string-using-java
MATLAB's .NET classes appear to be a more recent addition than the Java hashing approach.
However, these classes don't have much (if any) public documentation available. After playing with them a bit, I found a way to specify one of several hash algorithms, as desired.
The System.Security.Cryptography.HashAlgorithm.Create factory method accepts a hash algorithm name (string). Based on the name you pass in, it returns a different hasher class (.SHA256Managed is only one type). See the example below for a complete string input ==> hash string output generation.
% Available options are 'SHA1', 'SHA256', 'SHA384', 'SHA512', 'MD5'
algorithm = 'SHA1';
% SHA1 category
%   hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA1');   % DEFAULT
% SHA2 category
%   hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA256');
%   hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA384');
%   hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA512');
% SHA3 category: does not appear to be supported
% MD5 category
%   hasher = System.Security.Cryptography.HashAlgorithm.Create('MD5');
% Create the hasher from the selected name so the label printed below stays accurate
hasher = System.Security.Cryptography.HashAlgorithm.Create(algorithm);
% GENERATING THE HASH:
str = 'Now is the time for all good men to come to the aid of their country';
hash_byte = hasher.ComputeHash( uint8(str) ); % System.Byte class
hash_uint8 = uint8( hash_byte ); % Array of uint8
hash_hex = dec2hex(hash_uint8); % Array of 2-char hex codes
% Generate the hex codes as 1 long series of characters
hashStr = '';
nBytes = size(hash_hex, 1);
for k = 1:nBytes
    hashStr(end+1:end+2) = hash_hex(k,:);
end
fprintf(1, '\n\tThe %s hash is: "%s" [%d bytes]\n\n', algorithm, hashStr, nBytes);
% SIZE OF THE DIFFERENT HASHES:
% SHA1: 20 bytes = 20 hex codes = 40 char hash string
% SHA256: 32 bytes = 32 hex codes = 64 char hash string
% SHA384: 48 bytes = 48 hex codes = 96 char hash string
% SHA512: 64 bytes = 64 hex codes = 128 char hash string
% MD5: 16 bytes = 16 hex codes = 32 char hash string
REFERENCES:
1) https://en.wikipedia.org/wiki/SHA-1
2) https://defuse.ca/checksums.htm#checksums
I just used this and it works well.
It works on strings, files, and different data types.
For a file, I compared against CRC SHA in File Explorer and got the same answer.
https://www.mathworks.com/matlabcentral/fileexchange/31272-datahash
I am using Heimdall
https://github.com/henrinormak/Heimdall
for generating my 1024-bit RSA keys and encrypting messages:
let heimdall = Heimdall(publicTag: publicTag, publicKeyData: data)
When I UTF-8 encode and Base64 encode the string, I pass it to the encrypt method:
let utf8Encoded = self.mystring.data(using: String.Encoding.utf8)!
let base64Encoded = utf8Encoded.base64EncodedData()
let encrypted = heimdall.encrypt(base64Encoded)
print("encrypted \(encrypted!)") // -> 160 bytes !! why not 128
The encrypted output should be 128 bytes, not 160.
Can anybody help me get there?
How can I keep generating 1024-bit RSA keys and encrypting messages with those keys so that I end up with 128-byte arrays?
Thanks and Greetings !!
From the Heimdall docs: Note on encryption/decryption:
The payload is built, containing the encrypted key, followed by the encrypted message.
Then this is Base64 encoded, increasing the length by about one third.
Thus the output is (the encrypted AES key + the encrypted data + padding), Base64 encoded.
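To see why the Base64 step alone pushes the size past 128 bytes, here is a minimal sketch (Python, nothing Heimdall-specific; b64_len is a hypothetical helper) of the length expansion: every 3 raw bytes become 4 output characters.
import base64

def b64_len(n_raw_bytes):
    """Length of the Base64 encoding (with '=' padding) of n_raw_bytes bytes."""
    return 4 * ((n_raw_bytes + 2) // 3)

# A 1024-bit RSA block is 128 raw bytes; Base64 alone already makes it longer,
# before the encrypted message part is even appended.
print(b64_len(128))                          # 172
print(len(base64.b64encode(b'\x00' * 128)))  # 172, measured directly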
I am using ColdFusion 9
Referencing Ben Nadel's good works on his blog, I tried
ucase(digestUtils.sha512(imageBinary))
For SHA-512 hashing I get that dreaded:
The sha512 method was not found. Either there are no methods with the specified method name and argument types or the sha512 method is overloaded with argument types that ColdFusion cannot decipher reliably. ColdFusion found 0 methods that match the provided arguments. If this is a Java object and you verified that the method exists, use the javacast function to reduce ambiguity.
Now I know that sha512 does indeed exist as a method, because I saw it here, but when I perform a
cfdump var="#digestUtils#"
I only get:
md5(byte[]) byte[]
md5(java.lang.String) byte[]
md5Hex(byte[]) java.lang.String
md5Hex(java.lang.String) java.lang.String
sha(java.lang.String) byte[]
sha(byte[]) byte[]
shaHex(java.lang.String) java.lang.String
shaHex(byte[]) java.lang.String
What happened to the other methods? I guess I have to try something else.
Please advise with a ColdFusion solution. A ColdFusion/Java solution would be ok too.
I'm trying to write an SSO application where the third-party guys feed me URL parameters. I have successfully decoded the 1st parameter to get my XML Post. I now need to take the 2nd parameter, which is the hash payload, and go through the algorithm to ensure my 1st parameter hasn't been tampered with.
=========
Editing begins here: Okay, I tried writing the code again, to no avail.
The algorithm sounds simple enough. But trying to implement it is killing me.
1. compute the hash string value of the XMLPost string above (a worked sketch of these steps appears after this list):
a. convert the base64 salt string to a UTF-8 byte array.
b. convert the base64 XML payload string to a UTF-8 byte array.
c. create a new byte array consisting of the XML payload bytes from step b, appended with the salt bytes from step a.
d. perform a SHA512 hash on the concatenated byte array from step c, which results in a hashed byte array.
e. create a new byte array consisting of the hashed bytes from step d, appended with the salt bytes from step a.
f. convert the result of step e to a base64-encoded string; this should be the value of the query string parameter "h", the payload hash.
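For reference, a minimal sketch of steps a-f in Python (not ColdFusion), assuming the salt is a plain UTF-8 string and the payload arrives URL-encoded as in the snippet below; compute_payload_hash is a hypothetical helper, just to make the byte-level operations concrete:
import base64
import hashlib
import urllib.parse

def compute_payload_hash(xml_post_param, salt):
    salt_bytes = salt.encode('utf-8')                                    # step a
    xml_bytes = base64.b64decode(urllib.parse.unquote(xml_post_param))   # step b
    hashed = hashlib.sha512(xml_bytes + salt_bytes).digest()             # steps c and d
    return base64.b64encode(hashed + salt_bytes).decode('ascii')         # steps e and f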
xmlPost was created by my third party guys as such:
This XML payload string was converted to a UTF-8 byte array, which was then converted to a base-64 string. The resulting base-64 string is the value of my xmlPost below.
So I do this:
<cfset xmlPost = urlDecode("PD94bWwgdmVyc2lvbj0iMS4wIj8%2bPEVzdG9yZVNzb0N1c3RvbWVyIHhtbG5zOnhzaT0iaHR0cDovL3d3dy53My5vcmcvMjAwMS9YTUxTY2hlbWEtaW5zdGFuY2UiIHhtbG5zOnhzZD0iaHR0cDovL3d3dy53My5vcmcvMjAwMS9YTUxTY2hlbWEiPjxDdXN0b21lcklkPjExMjk0MDwvQ3VzdG9tZXJJZD48RGVhbGVyQ29kZT5OODg4ODg8L0RlYWxlckNvZGU%2bPFBvaW50QmFsYW5jZT4yODA8L1BvaW50QmFsYW5jZT48Rmlyc3ROYW1lPkZhaXRoPC9GaXJzdE5hbWU%2bPExhc3ROYW1lPkh1dHVsYTwvTGFzdE5hbWU%2bPC9Fc3RvcmVTc29DdXN0b21lcj4%3d") />
<cfset salt = "3dfjh674!MujErf98344#090" />
<cfset payload_hash = urlDecode("EtLDRJfcRESFKpY4OGZZnRSN2THqT%2bEelzOuXVU06jotd2kE4yKnlYay7BqyAdcUSATRgSMaHxZa6uBqKKd9rjNkZmpoNjc0IU11akVyZjk4MzQ0QDA5MA%3d%3d") />
<cfset strXML = ToString( ToBinary( xmlpost ) ) /> <!--- to get actual XML --->
<!--- base64 encoding returns a byte array --->
<cfset saltByteArray = toBase64( salt, "utf-8" ) />
<cfset xmlpostByteArray = toBase64( xmlPost, "utf-8" ) />
<!--- append salt to xmlpost --->
<cfset xmlpostsaltByteArray = xmlpostByteArray & saltByteArray />
<!--- now let us perform a sha512 hash on this concatenated byte array --->
<cfscript>
    // Create an instance of our DigestUtils class
    digestUtils = createObject("java", "org.apache.commons.codec.digest.DigestUtils");

    // I hash a byte array using the given algorithm and return a
    // hexadecimal string (128 characters for SHA-512). Home-made hash function for CF9 and earlier.
    function hashBytes( bytes, algorithm = "SHA-512" ){
        // Get our instance of the digest algorithm that we'll use
        // to hash the byte array.
        var messageDigest = createObject( "java", "java.security.MessageDigest" ).getInstance( javaCast( "string", algorithm ) );
        // Get the digest for the given byte array. This returns the
        // digest (i.e., hash) in byte-array format.
        var digest = messageDigest.digest( bytes );
        // Now that we have our digested byte array (i.e., our hash as another byte
        // array), we have to convert that into a HEX string. So, we'll need a HEX buffer.
        var hexBuffer = [];
        // Each byte in the digest needs to be converted into
        // a two-character HEX string (with a possible leading zero).
        for ( var i = 1; i LTE arrayLen( digest ); i = i + 1 ) {
            // Get the hex value for this byte. When converting the
            // byte, only use the right-most 8 bits (last 8 bits of the integer),
            // otherwise the sign of the byte can create oddities.
            var tail = bitAnd( 255, digest[ i ] );
            // Get the hex-encoding of the byte.
            var hex = ucase( formatBaseN( tail, 16 ) );
            // To make sure that all of the HEX values are two digits,
            // prepend a zero for any value below 16 (i.e. a single hex digit).
            arrayAppend( hexBuffer, (tail LT 16 ? ("0" & hex) : hex) );
        }
        // Return the flattened character buffer.
        return( arrayToList( hexBuffer, "" ) );
    }

    // Get the hash of the byte array using our hashBytes() function
    hashByteArray = hashBytes( xmlpostsaltByteArray );
</cfscript>
<!--- The hashByteArray is in HEX format now. Convert to binary --->
<!--- You must binary-decode the hex string before converting it to Base64 --->
<cfset hashByteArray = toBase64( BinaryDecode( hashByteArray, 'HEX' ) ) />
<!--- The final step is to append this new hashbytearray with the salt byte array --->
<cfset hashByteArray = hashByteArray & saltByteArray />
<!--- now convert this value to a base64 encoded string --->
<cfset hashByteArray2 = toBase64( hashByteArray )/>
Here is what I get for my strXML variable:
The actual XML structure, converted from Base64 to a string:
<?xml version="1.0"?><EstoreSsoCustomer xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><CustomerId>112940</CustomerId><DealerCode>N88888</DealerCode><PointBalance>280</PointBalance><FirstName>Faith</FirstName><LastName>Hutula</LastName></EstoreSsoCustomer>
The final value, hashByteArray2, is not even remotely similar to payload_hash.
This is my first time doing this and my understanding of hashing, byte arrays and character conversions flew out of the window decades ago.
What am I doing wrong?
Thank you
Faith Sloan
DigestUtils.sha512 was added in commons-codec version 1.4. ColdFusion 9 bundles an older version, 1.3. That is why the method is not found.
Use the other function based on MessageDigest. Just be sure to pass in the correct algorithm, i.e.:
imageHash = hashBytes( imageBinary, "SHA-512" );
UPDATE: Based on the updated code, some of the instructions may be a bit misleading. I believe they just mean decode the xml and salt strings from their given encoding (base64 and utf-8) into byte arrays, not strings:
// note: salt value has invalid characters for base64
// assuming it is a plain utf-8 string
saltArray = charsetDecode(salt, "utf-8");
xmlByteArray = binaryDecode(xmlPost, "base64");
Then merge the two binary arrays (see the custom function below):
mergedBytes = mergeArrays( xmlByteArray, saltArray );
Calculate the hash of the new byte array:
messageDigest = createObject( "java", "java.security.MessageDigest" );
messageDigest = messageDigest.getInstance( javaCast( "string", "SHA-512") );
hashedByteArray = messageDigest.digest( javacast("byte[]", mergedBytes) );
Merge the arrays again:
mergedBytes = mergeArrays( hashedByteArray, saltArray);
Finally convert the binary to base64 and compare:
calculatedPayload = binaryEncode( javacast("byte[]", mergedBytes), "base64");
// check results
arePayloadsEqual = compare(calculatedPayload, payload_hash) eq 0;
WriteDump("arePayloadsEqual="& arePayloadsEqual);
WriteDump("calculatedPayload="& calculatedPayload);
WriteDump("payload_hash="& payload_hash);
Note: BinaryDecode/CharsetDecode return Java arrays. Unlike CF arrays, they are immutable (i.e. cannot be changed), so the handy addAll(..) trick will not work here.
// merge immutable arrays the long way
function mergeArrays( array1, array2 ){
    var i = 0;
    var newArray = [];
    for (i = 1; i <= arrayLen(arguments.array1); i++) {
        arrayAppend(newArray, arguments.array1[i]);
    }
    for (i = 1; i <= arrayLen(arguments.array2); i++) {
        arrayAppend(newArray, arguments.array2[i]);
    }
    return newArray;
}
As a self-study exercise, I'm trying to learn how to use some of the pycrypto library. I need to decrypt a ciphertext string in CBC mode using AES. The ciphertext, key, and IV are all given. Here is the code that I have written:
from Crypto.Cipher import AES
mode = AES.MODE_CBC
key = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"
ciphertext = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1";
iv = ciphertext[:32]
ciphertext = ciphertext[32:]
decryptor = AES.new(key, mode, iv)
plaintext = decryptor.decrypt(ciphertext)
print plaintext
When I run this, I get the following error:
ValueError: IV must be 16 bytes long
I know that the IV string is 32 hex characters, and therefore 16 bytes. I think that this might be a typing problem, but I don't know how to correct it. Can anyone help?
Thank you!
Your strings contain only hex characters, but they are still plain strings, so every character counts.
So your IV string is 32 bytes long, because you sliced 32 characters out of ciphertext.
I suspect you're right and it is down to typing. Try:
iv = binascii.unhexlify(ciphertext[:32])
(after import binascii). PyCrypto expects the IV as a 16-byte string, so converting the hex to an integer with long(ciphertext[:32], 16) won't do; you need the raw bytes.
Tell the computer you are dealing with hex; it is currently treating the value as a plain string. In Python 2:
iv = iv.decode('hex')
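Putting the suggestions together, a minimal Python 3 sketch of the whole decryption (using binascii, since str.decode('hex') is Python 2 only, and assuming the key string is also hex-encoded, giving a 16-byte AES-128 key; the hex values are just the placeholders from the question):
import binascii
from Crypto.Cipher import AES

key_hex = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"  # 32 hex chars -> 16 raw bytes
ciphertext_hex = "a1" * 64                    # placeholder, 128 hex chars as in the question

key = binascii.unhexlify(key_hex)
iv = binascii.unhexlify(ciphertext_hex[:32])          # first 16 raw bytes are the IV
ciphertext = binascii.unhexlify(ciphertext_hex[32:])  # the rest is the actual ciphertext

decryptor = AES.new(key, AES.MODE_CBC, iv)
plaintext = decryptor.decrypt(ciphertext)
print(plaintext)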