I am trying to encrypt plain text using the Perl module Crypt::Blowfish.
My code is:
#!/usr/bin/perl
use Crypt::Blowfish;
my $key = pack("H16", "0123456789ABCDEF");
my $cipher = Crypt::Blowfish->new($key);
my $cipher_text = $cipher->encrypt($plain_text);
But it returns the error "input must be 8 bytes long at Crypt/Blowfish.pm".
Can anyone explain this to me?
Blowfish, like similar encryption algorithms, encrypts fixed-size blocks (8 bytes in Blowfish's case) rather than arbitrary-length strings. You need to use something like Crypt::CBC to provide padding.
Crypt::CBC also provides two other very important functions: salting and chaining. Without these, the encryption is severely weakened.
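For example, a minimal sketch using Crypt::CBC with Blowfish (the passphrase and plaintext are just placeholders):
#!/usr/bin/perl
use strict;
use warnings;
use Crypt::CBC;

# Minimal sketch: Crypt::CBC pads the plaintext to the 8-byte Blowfish block
# size and chains the blocks. With the default settings, -key is treated as a
# passphrase from which the real key and IV are derived (with a random salt),
# so the same code works for any plaintext length.
my $cipher = Crypt::CBC->new(
    -key    => 'my secret passphrase',   # placeholder
    -cipher => 'Blowfish',
);

my $plain_text  = 'any length of text, not just 8 bytes';
my $cipher_text = $cipher->encrypt($plain_text);
my $decrypted   = $cipher->decrypt($cipher_text);
print "round trip OK\n" if $decrypted eq $plain_text;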
My question is about this:
https://metacpan.org/pod/Crypt::OpenSSL::RSA
If the verify() method described there fails, I do error handling like this:
my $rsa_pub = Crypt::OpenSSL::RSA->new_public_key($x509PubKey);
logm("exception: my err msg...") unless $rsa_pub->verify($text, $signature);
But is it possible to get the exact reason why verification failed?
I'm not sure that getting "the exact reason why verification failed" makes sense as a question. To verify a signature you specify:
the signature algorithm
the padding algorithm
the hashing function
Ultimately the signature is just a number that was computed by padding the plaintext input, hashing the resulting bytes, and performing a mathematical calculation using the private key.
Verifying the signature involves taking the plaintext, padding it, hashing it, and performing a mathematical calculation using the public key to produce another number, which is then compared to the number from the signature (using modular arithmetic?). If the numbers are the same, the signature is valid; if they're different, it's not.
All of which is a roundabout way of saying that if the verify method returns false then, assuming you're using the correct public key, one of these things must be different:
the plaintext
the signature algorithm
the padding algorithm
the hashing function
But there's really no way of knowing which. It's like saying "I'm trying to multiply two numbers to get 42, but I don't get 42, which of the numbers is wrong?".
Here are a couple of signature verification functions for common combinations of algorithms (which I wrote for Authen::NZRealMe::XMLSig):
sub _verify_signature_rsa_sha1 {
    my ($self, $plaintext, $bin_sig) = @_;

    my $rsa_pub_key = Crypt::OpenSSL::RSA->new_public_key($self->pub_key_text);
    $rsa_pub_key->use_pkcs1_padding();
    $rsa_pub_key->use_sha1_hash();
    return $rsa_pub_key->verify($plaintext, $bin_sig);
}

sub _verify_signature_rsa_sha256 {
    my ($self, $plaintext, $bin_sig) = @_;

    my $rsa_pub_key = Crypt::OpenSSL::RSA->new_public_key($self->pub_key_text);
    $rsa_pub_key->use_pkcs1_oaep_padding();
    $rsa_pub_key->use_sha256_hash();
    return $rsa_pub_key->verify($plaintext, $bin_sig);
}
The context for the above code is signed sections of XML documents, which adds the complexity of needing to use the right canonicalization and encoding; the signature data is also Base64 encoded, so it needs to be decoded into bytes first.
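As an illustration of that last point, here's roughly what the decoding step looks like before calling one of the helpers above (the variable names are hypothetical):
use MIME::Base64 qw(decode_base64);

# $sig_base64 is the Base64 signature text extracted from the XML,
# $canonical_xml is the canonicalized plaintext being verified.
my $bin_sig  = decode_base64($sig_base64);
my $is_valid = $self->_verify_signature_rsa_sha256($canonical_xml, $bin_sig);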
The information about which padding and hashing algorithms to use should be available from the spec for the source data you're working with, but if not, I guess you could try random combinations.
I'm trying to sign a transaction skeleton Blockcypher returns, in order to send it along, following https://www.blockcypher.com/dev/bitcoin/#creating-transactions.
For this example, I'll use the completely-unsafe 'raw raw raw raw raw raw raw raw raw raw raw raw' mnemonic, which, using the Dart bip32 package, yields a BIP32 node with private key 0x05a2716a8eb37eb2aaa72594573165349498aa6ca20c71346fb15d82c0cbbf7c and address mpQfiFFq7SHvzS9ebxMRGVohwHTRJJf9ra for BTC testnet.
Blockcypher Tx Skeleton tosign is 1cbbb4d229dcafe6dc3363daab8de99d6d38b043ce62b7129a8236e40053383e.
Using Blockcypher signer tool:
$ ./signer 1cbbb4d229dcafe6dc3363daab8de99d6d38b043ce62b7129a8236e40053383e 05a2716a8eb37eb2aaa72594573165349498aa6ca20c71346fb15d82c0cbbf7c
304402202711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4022058f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab
On the other hand, using bip32 I get:
String toSign = txSkel['tosign'][0];
var uToSign = crypto.hexToBytes(toSign);
var signed = fromNode.sign(uToSign);
var signedHex = bufferToHex(signed);
var signedHexNo0x = signedHex.substring(2);
where fromNode is the bip32.BIP32 node. Output is signedHexNo0x = 2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc458f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab.
At first sight, they seem to be completely different buffers, but after a more detailed look, the Blockcypher signer output merely has some extra characters compared to that of bip32:
Blockcypher signer output (I split it into several lines for you to see it clearly):
30440220
2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4
0220
58f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab
bip32 output (also intentionally split):
2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4
58f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab
I'd expect two 64-character numbers to give a 128-character signature, which the bip32 output matches. The Blockcypher signer output, however, has 140 characters, i.e. 12 more than the former, which is clear when seen split into lines as above.
I'd be really thankful to anyone throwing some light on this issue, which I need to understand and correct. I need to implement the solution in Dart; I cannot use the signer script other than for testing.
The Dart bip32 package doesn't seem to encode the signature in DER format, but rather in a simple (r, s) encoding. However, DER is required for Bitcoin. For more information see:
https://bitcoin.stackexchange.com/questions/92680/what-are-the-der-signature-and-sec-format
You can either add the extra DER bytes yourself, based on your r and s values, or check whether there's a DER encoding in the Dart bip32 library.
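For the example above, the extra bytes are just the DER framing around r and s. A rough sketch of that wrapping (shown in Perl purely to illustrate the byte layout; the helper names are made up, and a 64-byte r||s signature is assumed):
use strict;
use warnings;

sub der_int {
    my ($bytes) = @_;
    $bytes =~ s/^\x00+//;                               # strip leading zero bytes
    $bytes = "\x00" . $bytes if ord($bytes) & 0x80;     # keep the integer positive
    return "\x02" . chr(length $bytes) . $bytes;        # INTEGER tag + length + value
}

sub raw_sig_to_der {
    my ($raw_sig) = @_;                                 # 64 bytes: r (32) followed by s (32)
    my $body = der_int(substr($raw_sig, 0, 32))
             . der_int(substr($raw_sig, 32, 32));
    return "\x30" . chr(length $body) . $body;          # SEQUENCE tag + length + body
}

my $raw_sig = pack 'H*',
    '2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4'
  . '58f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab';

# Prints 30440220...0220..., i.e. the Blockcypher signer output shown above.
print unpack('H*', raw_sig_to_der($raw_sig)), "\n";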
I am trying to implement RSA using big integers. I can currently encrypt and decrypt fine, but I need to be able to take the two pairs of BigIntegers (n, e) and (n, d), of any bit length up to 2048, and somehow save them to files named publicKey.txt and privateKey.txt, then read them back in later. Does anyone have any ideas for this?
I would like to save them like this, so I can split each file into its two parts when reading it back in, using the comma as the separator:
publicKey.txt
n,e
privateKey.txt
n,d
The RSA standard defines an ASN.1 encoding for serializing private and public keys for transfer and storage. This format is the standard used by cryptography software.
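For instance, in Perl, Crypt::OpenSSL::RSA will read and write keys in that encoding (as PEM text) for you. A minimal sketch, assuming you let the module generate the key pair:
use strict;
use warnings;
use Crypt::OpenSSL::RSA;

# Generate a 2048-bit key pair and store it in the standard PEM
# (ASN.1/DER, Base64-wrapped) encodings rather than "n,e" / "n,d".
my $rsa = Crypt::OpenSSL::RSA->generate_key(2048);

open my $priv_fh, '>', 'privateKey.txt' or die $!;
print {$priv_fh} $rsa->get_private_key_string();      # PKCS#1 PEM private key
close $priv_fh;

open my $pub_fh, '>', 'publicKey.txt' or die $!;
print {$pub_fh} $rsa->get_public_key_x509_string();   # X.509 SubjectPublicKeyInfo PEM
close $pub_fh;

# Reading the private key back in later:
my $pem      = do { local $/; open my $fh, '<', 'privateKey.txt' or die $!; <$fh> };
my $reloaded = Crypt::OpenSSL::RSA->new_private_key($pem);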
Recent CryptoJS versions support SHA3 hashing.
SHA3 can output different hash sizes, including 512-bit (default) and 256-bit. These two work fine:
var sha3_512_hash = CryptoJS.SHA3( 'test' );
var sha3_256_hash = CryptoJS.SHA3( 'test' , { outputLength:256 } );
Similarly, CryptoJS can also calculate HMAC values. However, I can't figure out how to change the default output size there:
var sha3_512_hmac = CryptoJS.HmacSHA3( 'test' , 'key' );
var sha3_256_hmac = CryptoJS.HmacSHA3( 'test' , 'key' , { outputLength:256 } );
The first works OK (the result is a 512-bit HMAC value), but the second produces the same 512-bit output, as if it ignores the {outputLength:256} parameter!
Here's a live example: http://jsfiddle.net/M8xf3/ (using hmac-sha3.js from CryptoJS 3.1.2)
Does anyone know how to create 256-bit SHA3-based HMAC hashes?
P.S. For the SHA2 family of functions, CryptoJS has separate Hmac functions for each output size (that's HmacSHA256 and HmacSHA512). But this doesn't seem to be the case for SHA3?
This doesn't answer the actual question, but note that with SHA3 you don't really need HMAC. Unlike MD5, SHA1, and SHA2, SHA3 is not vulnerable to length-extension attacks.
Therefore with SHA3 it would suffice to just prepend or append the secret key to your input.
Or, if you're paranoid about a single hash step becoming compromised (not to be expected in the foreseeable future, especially not with SHA3, but still) you could do something like SHA3(key+SHA3(key+data)) or SHA3(key+SHA3(key+data)+data) (obviously with "+" denoting binary concatenation).
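A minimal sketch of that nested construction (shown here with Perl's Digest::SHA3 just to illustrate the idea; the key and message are placeholders):
use Digest::SHA3 qw(sha3_256 sha3_256_hex);

my $key  = 'key';    # placeholder secret
my $data = 'test';   # placeholder message

# SHA3(key + SHA3(key + data)), with "+" meaning concatenation
my $inner = sha3_256($key . $data);        # raw inner digest bytes
my $mac   = sha3_256_hex($key . $inner);   # outer hash, hex-encoded
print "$mac\n";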
You can just edit hmac-sha3.js and change the outputLength to 256-bit instead of 512-bit.
Open the hmac-sha3.js file in your text editor.
Find "{outputLength:512}" and replace it with "{outputLength:256}".
Then the hash output will be 256-bit in length.
To be sure that you did not mess anything up, double-check your 256-bit HMAC-SHA3 output against some test cases available on the internet, for example: http://www.di-mgt.com.au/hmac_sha3_testvectors.html
I'm considering using the Data::UUID Perl module to generate a 256-bit symmetric key for use with the HMAC_SHA256 algorithm. Each call should give me a unique string of 128 bits, so I'm thinking of doing something like the following:
use Data::UUID;
my $ug = Data::UUID->new;
my $uuid1 = $ug->to_hexstring($ug->create());
my $uuid2 = $ug->to_hexstring($ug->create());
my $key_256_bit = $uuid1 . $uuid2;
Is this key cryptographically strong?
No.
Use Crypt::OpenSSL::Random or another crypto-strong random number generator.
To be more precise, you can get some bytes from the CRNG, convert them into an ASCII string, and then use that string as the key for the HMAC.
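A minimal sketch of that approach (the 32-byte length is chosen to give the 256 bits needed for HMAC_SHA256):
use Crypt::OpenSSL::Random;

# 32 random bytes (256 bits) from OpenSSL's cryptographic RNG,
# hex-encoded so it can be handled as a plain ASCII string.
my $key_bytes = Crypt::OpenSSL::Random::random_bytes(32);
my $key_hex   = unpack 'H*', $key_bytes;
print "$key_hex\n";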