As a self-study exercise, I'm trying to learn how to use some of the pycrypto library. I need to decrypt a ciphertext string in CBC mode using AES. The ciphertext, key, and IV are all given. Here is the code that I have written:
from Crypto.Cipher import AES
mode = AES.MODE_CBC
key = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"
ciphertext = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1";
iv = ciphertext[:32]
ciphertext = ciphertext[32:]
decryptor = AES.new(key, mode, iv)
plaintext = decryptor.decrypt(ciphertext)
print plaintext
When I run this, I get the following error:
ValueError: IV must be 16 bytes long
I know that the IV string is 32 hex characters, and therefore 16 bytes. I think that this might be a typing problem, but I don't know how to correct it. Can anyone help?
Thank you!
Your strings contain only hex characters, but they are still plain strings, so every character counts.
Your IV string is therefore 32 bytes long as you sliced it out of ciphertext.
I suspect you're right and it is down to typing. Convert the hex text to raw bytes before handing it to AES.new, which expects a 16-byte string for the IV (an integer from long() will not do):
import binascii
iv = binascii.unhexlify(ciphertext[:32])
Tell the computer you are dealing with hex; right now it is treating the IV as a plain string.
iv = iv.decode('hex')
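Putting the pieces together, here is a minimal sketch of the corrected script. It assumes the key is hex-encoded too (unhexlified it is 16 raw bytes, i.e. AES-128); if it is meant to be used as a raw 32-character string, pass it to AES.new unchanged instead:
from Crypto.Cipher import AES
import binascii

mode = AES.MODE_CBC
key = binascii.unhexlify("a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1")  # 16 raw bytes
ciphertext = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"
iv = binascii.unhexlify(ciphertext[:32])    # first 16 raw bytes are the IV
body = binascii.unhexlify(ciphertext[32:])  # remaining 48 raw bytes (3 AES blocks)
decryptor = AES.new(key, mode, iv)
plaintext = decryptor.decrypt(body)
print(repr(plaintext))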
I used the following command on my Ubuntu machine, "openssl passwd -crypt -salt pass book", to generate a salted password.
What hash is the output made up of, e.g. SHA-512, MD5, etc.? Also, I'm wondering how it's constructed. For example, is it made by hashing "passbook" together?
I need more information on what hashing algorithm is being used to generate the output I see.
Thanks
The result provided by the openssl passwd app when using the -crypt algorithm seems to be the same as the result provided by the Linux/Unix crypt() function. You can verify this with the following (quick'n'dirty) code snippet:
#include <crypt.h>  /* crypt(); link with -lcrypt */
#include <stdio.h>

int main(int argc, char **argv)
{
    char *key = argv[1];           /* the password to hash */
    char *salt = argv[2];          /* the 2-character salt */
    char *enc = crypt(key, salt);  /* same call openssl appears to make */
    printf("key = \"%s\", salt = \"%s\", enc = \"%s\"\n",
           key ? key : "NULL", salt ? salt : "NULL", enc ? enc : "NULL");
    return 0;
}
Result:
$ ./main book pass
key = "book", salt = "pass", enc = "pahzZkfwawIXw"
$ openssl passwd -crypt -salt pass book
pahzZkfwawIXw
The exact details of how the crypt() function works seem to be explained most clearly in its OSX man page, in particular:
Traditional crypt:
The first 8 bytes of the key are null-padded, and the low-order 7 bits of each character is
used to form the 56-bit DES key.
The salt is a 2-character array of the ASCII-encoded salt. Thus, only 12 bits of salt are
used. count is set to 25.
Algorithm:
The salt introduces disorder in the DES algorithm in one of 16777216 or 4096 possible ways
(ie. with 24 or 12 bits: if bit i of the salt is set, then bits i and i+24 are swapped in
the DES E-box output).
The DES key is used to encrypt a 64-bit constant, using count iterations of DES. The value
returned is a null-terminated string, 20 or 13 bytes (plus null) in length, consisting of
the salt, followed by the encoded 64-bit encryption.
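You can reproduce this from Python as well: the standard crypt module (Unix only) wraps the same libc crypt() function, so it should print the same hash as above:
import crypt
# Traditional DES crypt: only the first 2 salt characters ("pa") and the
# first 8 key characters count ("book" is shorter, so all of it is used).
print(crypt.crypt("book", "pass"))  # -> pahzZkfwawIXw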
I understand that you have a hex string and perform SHA256 on it twice and then byte-swap the final hex string. The goal of this code is to find a Merkle Root by concatenating two transactions. I would like to understand what's going on in the background a bit more. What exactly are you decoding and encoding?
import hashlib
transaction_hex = "93a05cac6ae03dd55172534c53be0738a50257bb3be69fff2c7595d677ad53666e344634584d07b8d8bc017680f342bc6aad523da31bc2b19e1ec0921078e872"
transaction_bin = transaction_hex.decode('hex')  # Python 2: hex text -> raw bytes
hash = hashlib.sha256(hashlib.sha256(transaction_bin).digest()).digest()
hash.encode('hex_codec')
# '38805219c8ac7e9a96416d706dc1d8f638b12f46b94dfd1362b5d16cf62e68ff'
hash[::-1].encode('hex_codec')
# 'ff682ef66cd1b56213fd4db9462fb138f6d8c16d706d41969a7eacc819528038'
transaction_hex is a regular string of lowercase ASCII characters, and the decode() method with the 'hex' argument changes it to a (binary) string (a bytes object in Python 3) with bytes 0x93, 0xa0, etc. In C it would be an array of unsigned char, of length 64 in this case.
This array/byte string of length 64 is then hashed with SHA256, and its result (another binary string, of size 32) is hashed again. So hash is a string of length 32, or a bytes object of that length in Python 3. encode('hex_codec') is a synonym for encode('hex') in Python 2; in Python 3 the str-based hex codec is gone, and bytes objects have no encode() method, so there you would use binascii.hexlify or codecs.encode(hash, 'hex_codec') instead. Either way it outputs an ASCII (lowercase hex) string again, replacing each raw byte (which is just a small integer) with the two-character string that is its hexadecimal representation. So the final line reverses the double hash byte by byte and outputs it as hexadecimal, in a form I usually call lowercase hex ASCII.
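For completeness, here is a sketch of the same computation in Python 3, using bytes.fromhex and bytes.hex in place of the removed hex codec:
import hashlib

transaction_hex = "93a05cac6ae03dd55172534c53be0738a50257bb3be69fff2c7595d677ad53666e344634584d07b8d8bc017680f342bc6aad523da31bc2b19e1ec0921078e872"
transaction_bin = bytes.fromhex(transaction_hex)  # 64 raw bytes
h = hashlib.sha256(hashlib.sha256(transaction_bin).digest()).digest()
print(h.hex())        # '38805219...f62e68ff', as above
print(h[::-1].hex())  # byte-reversed: 'ff682ef6...19528038'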
How can I convert a message into a hash value using SHA/MD5 hashing in MATLAB? Is there any built-in function or any ready-made code?
There are no built-in functions in MATLAB to calculate hashes. However, you can call Java (any OS) or .NET (Windows only) functions directly from MATLAB, and both of these implement what you want.
Note that you haven't specified the encoding of the string. The hash is different if you consider the string in ASCII, UTF-8, UTF-16, etc.
Also note that MATLAB does not have a 160-bit or 256-bit integer type, so the hash obviously can't be a single integer.
Anyway, using .NET:
SHA256
string = 'some string';
sha256hasher = System.Security.Cryptography.SHA256Managed;
sha256 = uint8(sha256hasher.ComputeHash(uint8(string)));
dec2hex(sha256)
SHA1
sha1hasher = System.Security.Cryptography.SHA1Managed;
sha1= uint8(sha1hasher.ComputeHash(uint8(string)));
dec2hex(sha1)
A Java-based solution can be found at the following link:
https://www.mathworks.com/matlabcentral/answers/45323-how-to-calculate-hash-sum-of-a-string-using-java
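As a sanity check against an independent implementation, Python's hashlib should produce the same digests byte for byte, as long as the string is encoded the same way (the encoding caveat above; 'some string' is pure ASCII, so ASCII and UTF-8 agree here):
import hashlib

data = 'some string'.encode('utf-8')
print(hashlib.sha256(data).hexdigest())  # should match the .NET SHA256 result
print(hashlib.sha1(data).hexdigest())    # should match the .NET SHA1 result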
MATLAB's .NET classes appear to be a more recent addition than the Java hashing.
However, these classes don't have much/any public documentation available. After playing with it a bit, I found a way to specify one of several hash algorithms, as desired.
The "System.Security.Cryptography.HashAlgorithm" constructor accepts a hash algorithm name (string). Based on the string name you pass in, it returns different hasher classes (.SHA256Managed is only one type). See the example below for a complete string input ==> hash string output generation.
% Available options are 'SHA1', 'SHA256', 'SHA384', 'SHA512', 'MD5'
algorithm = 'SHA1';
% SHA1 category
hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA1'); % DEFAULT
% SHA2 category
hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA256');
hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA384');
hasher = System.Security.Cryptography.HashAlgorithm.Create('SHA512');
% SHA3 category: Does not appear to be supported
% MD5 category
hasher = System.Security.Cryptography.HashAlgorithm.Create('MD5');
% GENERATING THE HASH:
str = 'Now is the time for all good men to come to the aid of their country';
hash_byte = hasher.ComputeHash( uint8(str) ); % System.Byte class
hash_uint8 = uint8( hash_byte ); % Array of uint8
hash_hex = dec2hex(hash_uint8); % Array of 2-char hex codes
% Generate the hex codes as 1 long series of characters
hashStr = '';
nBytes = length(hash_hex);
for k=1:nBytes
hashStr(end+1:end+2) = hash_hex(k,:);
end
fprintf(1, '\n\tThe %s hash is: "%s" [%d bytes]\n\n', algorithm, hashStr, nBytes);
% SIZE OF THE DIFFERENT HASHES:
% SHA1: 20 bytes = 20 hex codes = 40 char hash string
% SHA256: 32 bytes = 32 hex codes = 64 char hash string
% SHA384: 48 bytes = 48 hex codes = 96 char hash string
% SHA512: 64 bytes = 64 hex codes = 128 char hash string
% MD5: 16 bytes = 16 hex codes = 32 char hash string
REFERENCES:
1) https://en.wikipedia.org/wiki/SHA-1
2) https://defuse.ca/checksums.htm#checksums
I just used this and it works well.
It works on strings, files, and different data types.
For a file I compared against CRC SHA in File Explorer and got the same answer.
https://www.mathworks.com/matlabcentral/fileexchange/31272-datahash
I am using Heimdall
https://github.com/henrinormak/Heimdall
for generating my 1024-bit RSA keys and encrypting messages:
let heimdall = Heimdall(publicTag: publicTag, publicKeyData: data)
I UTF-8-encode and Base64-encode my string, then pass it to the encrypt method:
let utf8Encoded = self.mystring.data(using: String.Encoding.utf8)!
let base64Encoded = utf8Encoded.base64EncodedData()
let encrypted = heimdall.encrypt(base64Encoded)
print("encrypted \(encrypted!)") // -> 160 bytes !! why not 128
The encrypted part should be 128 bytes, not 160.
Can anybody help me get there?
How can I keep generating 1024-bit RSA keys and encrypting messages with those keys so that I end up with 128-byte arrays?
Thanks and greetings!
From the Heimdall docs, note on encryption/decryption:
The payload is built containing the encrypted (AES) key, followed by the encrypted message.
Then this is Base64 encoded, increasing the length by about 1/3.
Thus the output is (the encrypted AES key + the encrypted data + padding), Base64 encoded.
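The Base64 step alone accounts for much of the size: Base64 emits 4 output bytes for every 3 input bytes. Purely as an illustration (the 120-byte figure below is a hypothetical raw payload size, not necessarily Heimdall's exact layout), a 120-byte raw blob encodes to exactly 160 bytes:
import base64
import os

raw = os.urandom(120)        # hypothetical raw payload: encrypted key + message
encoded = base64.b64encode(raw)
print(len(encoded))          # 160, since 120 * 4 / 3 == 160 (no padding needed)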
I am working on a small task which requires some Base64 encoding. I am trying to do it in my head but am getting lost.
I have a 13-digit number in Java long format, say 1294705313608, 1294705313594, or 1294705313573.
I do some processing with it; basically I take this number, append some stuff to it, put it in a byte array, and then convert it to Base64 using:
String b64String = new sun.misc.BASE64Encoder().encodeBuffer(bArray);
Now, I know that the first 3 digits of my original number will never change; 129 is constant in the numbers above. I want to find out how many characters corresponding to those digits will not change in the resulting Base64 string.
Code to serialize long to the byte array. I ignore the first 2 bytes since they are always 0:
bArray[0] = (byte) (time >>> 40);
bArray[1] = (byte) (time >>> 32);
bArray[2] = (byte) (time >>> 24);
bArray[3] = (byte) (time >>> 16);
bArray[4] = (byte) (time >>> 8);
bArray[5] = (byte) (time >>> 0);
Thanks.
Notes:
I know that Base64 takes 6 bits and makes one character out of them. So if the first 3 digits of the long do not change, how many characters will not change in the Base64 output?
This is NOT a homework assignment, but I am not very familiar with encoding.
1290000000000 is 10010110001011001111110111100100000000000 in binary.
1299999999999 is 10010111010101110000010011100011111111111 in binary.
Both are 41 bits long, and they differ after the first 7 bits. Your shift places bits 41-48 in the first byte, which will always be 00000001. The following byte will always be 00101100, 00101101, or 00101110 (0x2C, 0x2D, or 0x2E). So you've got the leading 14 bits in common across all your possible array values, which (at 6 bits per encoded Base64 character) means 2 characters in common in the encoded string.
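You can sketch the same check in Python rather than Java; the packing below mirrors the question's 6-byte array by dropping the two always-zero high bytes of the big-endian long:
import base64
import struct

def encode13(n):
    # Pack as 8 big-endian bytes, then drop the 2 always-zero leading bytes.
    return base64.b64encode(struct.pack('>Q', n)[2:])

print(encode13(1290000000000))  # b'ASxZ/eQA' -- lowest  129... value
print(encode13(1299999999999))  # b'AS6uCcf/' -- highest 129... value
# Both start with "AS"; the third character is the first that can differ.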
It appears you're on the right track. I think what you want to do is convert the long to a byte array, then convert the byte array to Base64.
The question "How do I convert Long to byte[] and back in java" shows you how to convert it to bytes.