Why does the Blockcypher signer tool return extra characters compared to the bip32 dart package? - flutter

I'm trying to sign a transaction skeleton Blockcypher returns, in order to send it along, following https://www.blockcypher.com/dev/bitcoin/#creating-transactions.
For this example, I'll use the completely-unsafe 'raw raw raw raw raw raw raw raw raw raw raw raw' mnemonic, which, using the dart bip32 package, yields a BIP32 node with private key 0x05a2716a8eb37eb2aaa72594573165349498aa6ca20c71346fb15d82c0cbbf7c and address mpQfiFFq7SHvzS9ebxMRGVohwHTRJJf9ra for BTC testnet.
Blockcypher Tx Skeleton tosign is 1cbbb4d229dcafe6dc3363daab8de99d6d38b043ce62b7129a8236e40053383e.
Using Blockcypher signer tool:
$ ./signer 1cbbb4d229dcafe6dc3363daab8de99d6d38b043ce62b7129a8236e40053383e 05a2716a8eb37eb2aaa72594573165349498aa6ca20c71346fb15d82c0cbbf7c
304402202711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4022058f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab
On the other hand, using bip32 I get:
String toSign = txSkel['tosign'][0];
var uToSign = crypto.hexToBytes(toSign);
var signed = fromNode.sign(uToSign);
var signedHex = bufferToHex(signed);
var signedHexNo0x = signedHex.substring(2);
where fromNode is the bip32.BIP32 node. Output is signedHexNo0x = 2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc458f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab.
At first sight, they seem to be completely different buffers, but after a closer look, the Blockcypher signer output only has a few extra characters compared with that of bip32:
Blockcypher signer output (I split it into several lines for you to see it clearly):
30440220
2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4
0220
58f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab
bip32 output (also intentionally split):
2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4
58f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab
I'd expect two 64-character numbers to give a 128-character signature, which the bip32 output matches. The Blockcypher signer output, however, has 140 characters, i.e. 12 more than the former, which is clear when it is split into lines as above.
I'd be really thankful to anyone shedding some light on this issue, which I need to understand and correct. I need to implement the solution in dart; I cannot use the signer script other than for testing.

The dart bip32 package doesn't seem to encode the signature in DER format, but rather in a simple (r, s) encoding. However, DER encoding is required for Bitcoin. For more information see:
https://bitcoin.stackexchange.com/questions/92680/what-are-the-der-signature-and-sec-format
You can either add the DER extra bytes yourself, according to your r and s values, or check whether the dart bip32 library offers DER encoding.
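For reference, that is all the extra 30440220 / 0220 bytes in the signer output are: DER framing (SEQUENCE and INTEGER tags plus lengths) around the raw r and s values. A minimal sketch of that framing in Python, for illustration only (the solution itself needs to be in dart, and the helper names here are made up):
def _der_int(x):
    # Strip leading zero bytes, then prepend 0x00 if the top bit is set,
    # so the value is not interpreted as negative.
    x = x.lstrip(b"\x00") or b"\x00"
    if x[0] & 0x80:
        x = b"\x00" + x
    return b"\x02" + bytes([len(x)]) + x          # INTEGER: tag, length, value
def raw_to_der(raw64):
    r, s = raw64[:32], raw64[32:]
    body = _der_int(r) + _der_int(s)
    return b"\x30" + bytes([len(body)]) + body    # SEQUENCE: tag, length, body
raw = bytes.fromhex(
    "2711792b72547d2a1730a319bd219854f0892451b8bc2ab8c17ec0c6cba4ecc4"
    "58f675ca0af3db455913e59dadc7c5e0bd0bf1b8ef8c13e830a627a18ac375ab")
print(raw_to_der(raw).hex())   # 30440220...0220..., the signer tool's output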

Related

Question about Crypt::OpenSSL::RSA->verify method

My question is about this:
https://metacpan.org/pod/Crypt::OpenSSL::RSA
If the verify() method described there fails, I do error handling like this:
my $rsa_pub = Crypt::OpenSSL::RSA->new_public_key($x509PubKey);
logm("exception: my err msg...") unless $rsa_pub->verify($text, $signature);
But is it possible to get the exact reason why verification failed?
I'm not sure that getting "the exact reason why verification failed" makes sense as a question. To verify a signature you specify:
the signature algorithm
the padding algorithm
the hashing function
Ultimately the signature is just a number that was computed by padding the plaintext input, hashing the resulting bytes, and performing a mathematical calculation using the private key.
Verifying the signature involves taking the plaintext, padding it, hashing it, and performing a mathematical calculation using the public key to produce another number, which is then compared to the number from the signature (using modular arithmetic?). If the numbers are the same, the signature is valid; if they're different, it's not.
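As a toy illustration of that calculation (a Python sketch; the numbers are the classic textbook example, not a real key, and real schemes pad and hash before exponentiating):
n, e, d = 3233, 17, 2753                # toy modulus and exponents (p=61, q=53)
padded_hash = 1234                      # stand-in for pad(hash(plaintext))
signature = pow(padded_hash, d, n)      # signing uses the private exponent
recovered = pow(signature, e, n)        # verifying uses the public exponent
print(recovered == padded_hash)         # True -> valid; anything else -> invalid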
All of which is a roundabout way of saying that if the verify method returns false then, assuming you're using the correct public key, one of these things must be different:
the plaintext
the signature algorithm
the padding algorithm
the hashing function
But there's really no way of knowing which. It's like saying "I'm trying to multiply two numbers to get 42, but I don't get 42, which of the numbers is wrong?".
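The same all-or-nothing behaviour shows up in other libraries too. For example, a sketch with Python's pyca/cryptography package (an illustration only, not the Perl module from the question; the file names and inputs are hypothetical):
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
pub_key = serialization.load_pem_public_key(open("pub.pem", "rb").read())
plaintext = b"the signed message"
signature = open("message.sig", "rb").read()
try:
    # The padding and hash passed here must match whatever the signer used.
    pub_key.verify(signature, plaintext, padding.PKCS1v15(), hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    # No further detail is available: a changed plaintext, a different padding,
    # a different hash or the wrong key all land here.
    print("signature invalid")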
Here are a couple of signature verification functions for common combinations of algorithms (which I wrote for Authen::NZRealMe::XMLSig):
sub _verify_signature_rsa_sha1 {
my($self, $plaintext, $bin_sig) = @_;
my $rsa_pub_key = Crypt::OpenSSL::RSA->new_public_key($self->pub_key_text);
$rsa_pub_key->use_pkcs1_padding();
$rsa_pub_key->use_sha1_hash();
return $rsa_pub_key->verify($plaintext, $bin_sig);
}
sub _verify_signature_rsa_sha256 {
my($self, $plaintext, $bin_sig) = @_;
my $rsa_pub_key = Crypt::OpenSSL::RSA->new_public_key($self->pub_key_text);
$rsa_pub_key->use_pkcs1_oaep_padding();
$rsa_pub_key->use_sha256_hash();
return $rsa_pub_key->verify($plaintext, $bin_sig);
}
The context for the above code is signed sections of XML documents, which adds the complexity of using the right canonicalization and encoding; the signature data is also Base64 encoded, so it needs to be decoded into bytes first.
The information about which padding and hashing algorithms to use should be available from the spec for the source data you're working with, but if not I guess you could try random combinations.

Why is MD5 hashing so hard in Swift 3?

Ok, so every now and then you come across problems that you've solved before using various frameworks and libraries and whatnot found on the internet; your problem is solved relatively quickly and easily, and you also learn why it was a problem in the first place.
However, sometimes you come across problems that make absolutely zero sense, and it's even worse when the solutions make negative sense.
My problem is that I want to take Data and make an MD5 hash out of it.
I find all kinds of solutions but none of them work.
What's really bugging me is how unnecessarily complicated the solutions seem to be for a task as trivial as getting an MD5 hash out of anything.
I am trying to use the Crypto and CommonCrypto frameworks by Soffes and they seem fairly easy, right? Right?
Yes!
But why am I still getting the error fatal error: unexpectedly found nil while unwrapping an Optional value?
From what I understand, the data served by myData.md5 in the Crypto extension by Soffes seems to be "optional". But why?
The code I am trying to execute is:
print(" md5 result: " + String(data: myData.md5, encoding: .utf8)!)
where myData definitely has data in it, because after the above line of code I send that data to a server, and the data exists.
On top of that, printing myData.md5.count with print(String(myData.md5.count)) works perfectly.
So my question is basically: How do I MD5 hash a Data and print it as a string?
Edit:
What I have tried
That works
MD5:ing the string test in a PHP script gives me 098f6bcd4621d373cade4e832627b4f6
and the Swift code "test".md5() also gives me 098f6bcd4621d373cade4e832627b4f6
That doesn't work
Converting the UInt8 byte array from Data.md5() to a string that represents the correct MD5 value.
The different tests I've done are the following:
var hash = ""
for byte in myData.data.md5() {
hash += String(format: "%02x", byte)
}
print("loop = " + hash) //test 1
print("myData.md5().toHexString() = " + myData.md5().toHexString()) //test 2
print("CryptoSwift.Digest.md5([UInt8](myData)) = " + CryptoSwift.Digest.md5([UInt8](myData)).toHexString()) //test 3
All three tests with the 500 byte test data give me the MD5 value 56f6955d148ad6b6abbc9088b4ae334d
while my PHP script gives me 6081d190b3ec6de47a74d34f6316ac6b
Test Sample (64 bytes):
Raw data:
FFD8FFE0 00104A46 49460001 01010048 00480000 FFE13572 45786966 00004D4D
002A0000 0008000B 01060003 00000001 00020000 010F0002 00000012 00000092
Test 1, 2 and 3 MD5: 7f0a012239d9fde5a46071640d2d8c83
PHP MD5: 06eb0c71d8839a4ac91ee42c129b8ba3
PHP Code: echo md5($_FILES["file"]["tmp_name"])
The simple answer to your question is:
String(data: someData, encoding: .utf8)
returns nil if someData is not properly UTF8 encoded data. If you try to unwrap nil like this:
String(data: someData, encoding: .utf8)!
you get:
fatal error: unexpectedly found nil while unwrapping an Optional value
So at its core, it's got nothing to do with hashing or crypto.
Both the input and the output of MD5 (or any hash algorithm for that matter) are binary data (and not text or strings). So the output of MD5 is not UTF8 encoded data. Thus why the above String initializer always failed.
If you want to display binary data in your console, you need to convert it to a readable representation. The most common ones are hexadecimal digits or Base 64 encoding.
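For example, in Python (an illustration only; the question is about Swift, but the principle is identical) the digest is raw bytes, and only a hex rendering of it is printable text:
import hashlib
digest = hashlib.md5(b"test").digest()   # 16 raw bytes
print(digest.hex())                      # 098f6bcd4621d373cade4e832627b4f6
# digest.decode("utf-8")                 # would raise UnicodeDecodeError -- the same
#                                        # failure the Swift String initializer
#                                        # signals by returning nil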
Note: some crypto libraries allow you to feed strings into their hash functions. They will silently convert the string to a binary representation using some character encoding. If the encodings do not match, the hash values will not match across systems and programming languages, so you'd better understand what they really do in the background.
I use a library called 'CryptoSwift' for creating hashes, as well as encrypting data before I send it/store it. It's very easy to use.
It can be found here https://github.com/krzyzanowskim/CryptoSwift and you can even install it with CocoaPods by adding pod 'CryptoSwift' to your podfile.
Once installed, hashing a Data object is as simple as calling Data.md5()! It really is that easy. It also supports other hashing algorithms such as SHA.
You can then just print the MD5 object and CryptoSwift will convert it to a String for you.
The full docs on creating digests can be found here: https://github.com/krzyzanowskim/CryptoSwift#calculate-digest
Thanks to Jacob King I tried a much simpler MD5 framework called CryptoSwift.
The user Codo inspired me to look deeper in to my PHP script as he suggested that I am not in fact hashing the content of my data, but instead the filename, which is correct.
The original question however was not about which framework to use or suggestions to as why my app and my PHP script return different MD5 values.
The question was originally about why I get the error
fatal error: unexpectedly found nil while unwrapping an Optional value
at the line of code saying
print(" md5 result: " + String(data: myData.md5, encoding: .utf8)!)
So the answer to that is that I should not try to convert the 16-byte data output of the md5() function into a UTF-8 string, but instead call toHexString() on its result.
So the proper line of code should look like the following:
print("md5 result: " + myData.md5().toHexString())
BONUS
My PHP script now contains the following code:
move_uploaded_file($_FILES["file"]["tmp_name"], $target_dir); //save data to disk
$md5_of_data = md5_file ($target_dir); //get MD5 of saved data
BONUS-BONUS
The problem and solution is part of a small framework called AssetManager that I'm working on, which can be found here: https://github.com/aidv/AssetManager

Saving/Reading BigInteger RSA keys to/from file

I am trying to implement RSA using BigIntegers. I can currently encrypt and decrypt fine, but I need to be able to take the two pairs of BigIntegers, (n, e) and (n, d), of any bit length up to 2048, save them to files named publicKey.txt and privateKey.txt, and then read them back in later. Does anyone have any ideas for this?
I would like to save them like this, so that on reading them back in I can split them into their two parts using the comma as the separator:
publicKey.txt
n,e
privateKey.txt
n,d
The RSA standard defines an ASN.1 encoding for serializing private and public keys for transfer and storage. This format is the standard used by cryptography software.
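If the simple comma-separated text files from the question are enough for your purposes, the idea is just to write the two numbers with a comma between them and split on the comma when reading them back. A sketch in Python for illustration (the question appears to use Java's BigInteger, and the key values below are toy placeholders); for anything that has to interoperate with other software, prefer the ASN.1 encodings mentioned above:
def save_key(path, first, second):
    with open(path, "w") as f:
        f.write(f"{first},{second}")
def load_key(path):
    with open(path) as f:
        first, second = f.read().split(",")
    return int(first), int(second)
n, e, d = 3233, 17, 2753              # toy textbook values, not a real 2048-bit key
save_key("publicKey.txt", n, e)       # n,e
save_key("privateKey.txt", n, d)      # n,d
print(load_key("publicKey.txt"))      # (3233, 17)
print(load_key("privateKey.txt"))     # (3233, 2753)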

How to calculate a 256-bit HMAC_SHA3 hash with CryptoJS?

Recent CryptoJS versions support SHA3 hashing.
SHA3 can output different hash sizes, including 512-bit (default) and 256-bit. These two work fine:
var sha3_512_hash = CryptoJS.SHA3( 'test' );
var sha3_256_hash = CryptoJS.SHA3( 'test' , { outputLength:256 } );
Similarly, CryptoJS can also calculate HMAC values. However, I can't figure out how to change the default output size there:
var sha3_512_hmac = CryptoJS.HmacSHA3( 'test' , 'key' );
var sha3_256_hmac = CryptoJS.HmacSHA3( 'test' , 'key' , { outputLength:256 } );
The first works OK (the result is a 512-bit hmac value) but the second is the same (i.e. also 512-bit), as if it ignores the {outputLength:256} parameter!
Here's a live example: http://jsfiddle.net/M8xf3/ (using hmac-sha3.js from CryptoJS 3.1.2)
Does anyone know how to create 256-bit SHA3-based HMAC hashes?
P.S. For the SHA2 family of functions, CryptoJS has separate Hmac functions for each output size (that's HmacSHA256 and HmacSHA512). But this doesn't seem to be the case for SHA3?
This doesn't answer the actual question, but note that with SHA3 you don't really need HMAC. Unlike MD5, SHA1, and SHA2, SHA3 is not vulnerable to length-extension attacks.
Therefore with SHA3 it would suffice to just prepend or append the secret key to your input.
Or, if you're paranoid of a single hash step becoming compromised (not to be expected in the foreseeable future, especially not with SHA3, but still) you could do something like SHA3(key+SHA3(key+data)) or SHA3(key+SHA3(key+data)+data) (obviously with "+" denoting binary concatenation).
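As a concrete illustration of that construction (a Python sketch; hashlib includes SHA3 since Python 3.6, and the key/data values are placeholders):
import hashlib
def sha3_256(b):
    return hashlib.sha3_256(b).digest()
key, data = b"key", b"test"
keyed = sha3_256(key + sha3_256(key + data) + data)   # SHA3(key+SHA3(key+data)+data)
print(keyed.hex())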
You can just edit hmac-sha3.js and change the outputLength to 256-bit instead of 512-bit.
Open hmac-sha3.js file, using your text editor.
Find "{outputLength:512}" and replace it with "{outputLength:256}"
Then the hash output will be 256-bit in length.
To be sure that you did not mess anything up, double-check your 256-bit HMAC-SHA3 output against test cases available on the internet, for example: http://www.di-mgt.com.au/hmac_sha3_testvectors.html
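An easy way to produce a reference value to compare against is to compute the same HMAC in an independent implementation, e.g. Python's hmac module with a SHA3-256 digest (a sketch; the key and message are placeholders):
import hashlib
import hmac
mac = hmac.new(b"key", b"test", hashlib.sha3_256)
print(mac.hexdigest())   # 64 hex characters = a 256-bit HMAC-SHA3-256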

Sending hex characters over a socket

I am trying to send a hex character to a socket to indicate a new message. This code works:
$socket->send("\x{0B}");
$socket->send($contents);
$socket->send("\x{1C}");
$socket->send("\x{0D}");
However, since this happens in a loop, I need variable hex characters, and I have not figured out how to get it to work. This is what I have tried:
my $start_char = get(); # returns, for example 0B
my $end_char = get(); # 1C
my $end_seg = get(); #0D
$socket->send("\x{$start_char}");
$socket->send($contents);
$socket->send("\x{$end_char}");
$socket->send("\x{$end_seg}");
I can verify that the variables returned by the function are correct on the perl side, but the server does not accept them as valid characters. Any input regarding how to do this?
Try ...send( chr($start_char) );, etc. (guessing that get() is actually returning integers).
If it really gives you strings like "0B", then ...send( chr(hex($start_char)) );
If you have a small amount of data, ysth's answer makes sense.
If you have a larger amount of data, you may want to look at pack.
pack ("H*", "0B") and pack ("C*", 0x0B) both give "\x0B".