How do I extract the private key components $N$ and $D$ from a private RSA key?

I have a private RSA key like – for example – this one:
-----BEGIN RSA PRIVATE KEY-----
MIIBOgIBAAJBAMPMNNpbZZddeT/GTjU0PWuuN9VEGpxXJTAkmZY02o8238fQ2ynt
N40FVl08YksWBO/74XEjU30mAjuaz/FB2kkCAwEAAQJBALoMlsROSLCWD5q8EqCX
rS1e9IrgFfEtFZczkAWc33lo3FnFeFTXSMVCloNCBWU35od4zTOhdRPAWpQ1Mzxi
aCkCIQD9qjKjNvbDXjUcCNqdiJxPDlPGpa78yzyCCUA/+TNwVwIhAMWZoqZO3eWq
SCBTLelVQsg6CwJh9W7vlezvWxUni+ZfAiAopBAg3jmC66EOsMx12OFSOTVq6jiy
/8zd+KV2mnKHWQIgVpZiLZo1piQeAvwwDCUuZGr61Ap08C3QdsjUEssHhOUCIBee
72JZuJeABcv7lHhAWzsiCddVAkdnZKUo6ubaxw3u
-----END RSA PRIVATE KEY-----
This private RSA key was generated using OpenSSL using the following command:
openssl genrsa
Now, how do I get the value of $N$ and $D$ used for decryption using this key and what format is the key in?

what format is the key in?
That is an RSA private key with a PEM encoding. I believe the PEM encoding comes from RFC 1421. Once the PEM encoding is peeled off, there's an ASN.1/DER-encoded RSA private key. The ASN.1 encoding is binary, so it's not human readable. The format of the ASN.1 key can be found in PKCS #1 or RFC 3447.
According to RFC 3447, Section A.1.2 RSA Private Key Syntax, here's what you can expect:
RSAPrivateKey ::= SEQUENCE {
    version           Version,
    modulus           INTEGER,  -- n
    publicExponent    INTEGER,  -- e
    privateExponent   INTEGER,  -- d
    prime1            INTEGER,  -- p
    prime2            INTEGER,  -- q
    exponent1         INTEGER,  -- d mod (p-1)
    exponent2         INTEGER,  -- d mod (q-1)
    coefficient       INTEGER,  -- (inverse of q) mod p
    otherPrimeInfos   OtherPrimeInfos OPTIONAL
}
Your key is on my pasteboard (the macOS clipboard), so:
$ pbpaste | openssl rsa -text -noout
Private-Key: (512 bit)
modulus:
00:c3:cc:34:da:5b:65:97:5d:79:3f:c6:4e:35:34:
3d:6b:ae:37:d5:44:1a:9c:57:25:30:24:99:96:34:
da:8f:36:df:c7:d0:db:29:ed:37:8d:05:56:5d:3c:
62:4b:16:04:ef:fb:e1:71:23:53:7d:26:02:3b:9a:
cf:f1:41:da:49
publicExponent: 65537 (0x10001)
privateExponent:
00:ba:0c:96:c4:4e:48:b0:96:0f:9a:bc:12:a0:97:
ad:2d:5e:f4:8a:e0:15:f1:2d:15:97:33:90:05:9c:
df:79:68:dc:59:c5:78:54:d7:48:c5:42:96:83:42:
05:65:37:e6:87:78:cd:33:a1:75:13:c0:5a:94:35:
33:3c:62:68:29
prime1:
00:fd:aa:32:a3:36:f6:c3:5e:35:1c:08:da:9d:88:
9c:4f:0e:53:c6:a5:ae:fc:cb:3c:82:09:40:3f:f9:
33:70:57
prime2:
00:c5:99:a2:a6:4e:dd:e5:aa:48:20:53:2d:e9:55:
42:c8:3a:0b:02:61:f5:6e:ef:95:ec:ef:5b:15:27:
8b:e6:5f
exponent1:
28:a4:10:20:de:39:82:eb:a1:0e:b0:cc:75:d8:e1:
52:39:35:6a:ea:38:b2:ff:cc:dd:f8:a5:76:9a:72:
87:59
exponent2:
56:96:62:2d:9a:35:a6:24:1e:02:fc:30:0c:25:2e:
64:6a:fa:d4:0a:74:f0:2d:d0:76:c8:d4:12:cb:07:
84:e5
coefficient:
17:9e:ef:62:59:b8:97:80:05:cb:fb:94:78:40:5b:
3b:22:09:d7:55:02:47:67:64:a5:28:ea:e6:da:c7:
0d:ee
... how do I get the value of $N$ and $D$ used for decryption using this key
This should do it for you:
$ pbpaste | /usr/local/ssl/macosx-x64/bin/openssl rsa -noout -modulus
Modulus=C3CC34DA5B65975D793FC64E35343D6BAE37D5441A9C57253024999634DA8F36DFC7D0DB
29ED378D05565D3C624B1604EFFBE17123537D26023B9ACFF141DA49
Unfortunately, there's no -d or -privateExponent switch. You'll have to parse that using some other method.
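If scripting is an option, one way is to let a library parse the PEM and read the components directly. This is a minimal Python sketch, assuming the pycryptodome package is installed and the key above has been saved to a file; the file name private.pem is just for illustration:
# Sketch: read n and d from the PEM-encoded RSA private key.
# Assumes pycryptodome (pip install pycryptodome) and that the key
# from the question is saved as private.pem.
from Crypto.PublicKey import RSA

with open('private.pem', 'rb') as f:
    key = RSA.import_key(f.read())

print('n = %x' % key.n)   # modulus
print('d = %x' % key.d)   # private exponent
The same object also exposes e, p and q if you need the remaining fields.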

Related

Manually calculating JWT signature never outputs the real signature

I've been reading a lot of questions on stackOverflow and jwt's docs.
Right now from what I understand this is what I should do to calculate a token:
header =
{
"alg": "HS256",
"typ": "JWT"
}
payload =
{
"sub": "1234567890",
"name": "JohnDoe",
"iat": 1516239022
}
secret = "test123"
Remove unnecessary spaces and line breaks from the header and payload, then encode both with base64url.
base64urlEncode(header)
// output: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9
base64urlEncode(payload)
// output: eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG5Eb2UiLCJpYXQiOjE1MTYyMzkwMjJ9
Same output as on jwt.io, perfect.
Calculate the SHA-256 HMAC using "test123" as the secret.
sha256_hmac("eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG5Eb2UiLCJpYXQiOjE1MTYyMzkwMjJ9", "test123")
// output: 3b59324118bcd59a5435194120c2cfcb7cf295f25a79149b79145696329ffb95
Convert the hash to a string and then base64url encode it.
I use a hex-to-string converter for this part, then I encode it using base64urlEncode and I get the following output:
O1kyQRjCvMOVwppUNRlBIMOCw4_Di3zDssKVw7JaeRTCm3kUVsKWMsKfw7vClQ
Output from jwt.io
O1kyQRi81ZpUNRlBIMLPy3zylfJaeRSbeRRWljKf-5U
But if I go to this page From Hex, to Base64 I get the correct output:
O1kyQRi81ZpUNRlBIMLPy3zylfJaeRSbeRRWljKf-5U
So what am I doing wrong? Why does converting the hex to a string and then encoding it produce a different result?
In case the online hex-to-string conversion is wrong, how can I convert this hex to a string (so I can then encode it) in C++ without using any library? Am I correct that I should convert each byte (2 hex characters, since each hex digit is 4 bits) to an ASCII character and then encode?
Thanks in advance.
Your HMAC step is correct and does produce the right output bytes (as noted in the comments). The conversion problem is caused by non-printable characters in the intermediate string (the raw bytes were not copied and pasted intact from the first web page to the second).
To reproduce the exact output at each stage, you can use the commands below.
In terms of C++, you should operate on the raw bytes rather than on the hex string. Take the raw bytes and run them through a base64 URL-safe encoder, or, as in the example below, run them through a plain base64 encoder and then fix up the resulting string to be URL safe.
Construct the header
jwt_header=$(echo -n '{"alg":"HS256","typ":"JWT"}' | base64 | sed s/\+/-/g | sed 's/\//_/g' | sed -E s/=+$//)
# ans: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9
Construct the payload
payload=$(echo -n '{"sub":"1234567890","name":"JohnDoe","iat":1516239022}' | base64 | sed s/\+/-/g |sed 's/\//_/g' | sed -E s/=+$//)
# ans: eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG5Eb2UiLCJpYXQiOjE1MTYyMzkwMjJ9
Raw password
secret="test123"
Convert secret to hex (not base64)
hexsecret=$(echo -n "$secret" | xxd -p | tr -d '\n')
# ans: 74657374313233
Perform hmac, and capture the raw bytes (caution, this is a non printable string)
hmac_signature_rawbytes=$(echo -n "${jwt_header}.${payload}" | openssl dgst -sha256 -mac HMAC -macopt hexkey:$hexsecret -binary)
Dump the raw bytes as hex, for illustration only (matches OP output)
echo -n ${hmac_signature_rawbytes} | xxd -p | tr -d '\n'
#ans: 3b59324118bcd59a5435194120c2cfcb7cf295f25a79149b79145696329ffb95
For the JWT signature, convert the raw bytes to base64url encoding
hmac_signature=$(echo -n ${hmac_signature_rawbytes} | base64 | sed s/\+/-/g | sed 's/\//_/g' | sed -E s/=+$//)
#ans: O1kyQRi81ZpUNRlBIMLPy3zylfJaeRSbeRRWljKf-5U
Create the full token
jwt="${jwt_header}.${payload}.${hmac_signature}"
# ans: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG5Eb2UiLCJpYXQiOjE1MTYyMzkwMjJ9.O1kyQRi81ZpUNRlBIMLPy3zylfJaeRSbeRRWljKf-5U
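For a programmatic version of the same flow, here is a minimal Python sketch (not C++, but it shows the byte-level steps; it assumes Python 3.7+ so the dict keys keep their order, and uses only the standard library):
# Minimal sketch: HS256 JWT signature computed over raw bytes.
import base64
import hashlib
import hmac
import json

def b64url(data):
    # base64url without the '=' padding, as JWT requires
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode('ascii')

header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"},
                           separators=(',', ':')).encode())
payload = b64url(json.dumps({"sub": "1234567890", "name": "JohnDoe",
                             "iat": 1516239022},
                            separators=(',', ':')).encode())

signing_input = (header + '.' + payload).encode('ascii')
digest = hmac.new(b'test123', signing_input, hashlib.sha256).digest()

# Encode the raw digest bytes, never the hex string
signature = b64url(digest)
print(header + '.' + payload + '.' + signature)
The printed token should match the shell result above; the key point is that b64url receives the raw digest, not its hex representation.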

How do I get to the same results as the Linux crypt and salt output?

I used the following command on my Ubuntu machine, "openssl passwd -crypt -salt pass book", to generate a salted password.
What hash is the output made up of, e.g. SHA-512, MD5, etc.? Also, I'm wondering how it's constructed. For example, is it made by hashing "passbook" together?
I need more information on what hashing algorithm is being used to generate the output I see.
Thanks
The result provided by the openssl passwd app when using the -crypt algorithm seems to be the same as the result provided by the Linux/Unix crypt() function. You can verify this with the following (quick'n'dirty) code snippet:
#include <crypt.h>
#include <stdio.h>

/* Build with e.g.: gcc main.c -o main -lcrypt */
int main(int argc, char **argv)
{
    char *key = argv[1];
    char *salt = argv[2];
    char *enc = crypt(key, salt);

    printf("key = \"%s\", salt = \"%s\", enc = \"%s\"\n",
           key ? key : "NULL", salt ? salt : "NULL", enc ? enc : "NULL");

    return 0;
}
Result:
$ ./main book pass
key = "book", salt = "pass", enc = "pahzZkfwawIXw"
$ openssl passwd -crypt -salt pass book
pahzZkfwawIXw
The exact details of how the crypt() function works seem to be explained most clearly in its OS X man page, in particular:
Traditional crypt:
The first 8 bytes of the key are null-padded, and the low-order 7 bits of each character is
used to form the 56-bit DES key.
The salt is a 2-character array of the ASCII-encoded salt. Thus, only 12 bits of salt are
used. count is set to 25.
Algorithm:
The salt introduces disorder in the DES algorithm in one of 16777216 or 4096 possible ways
(ie. with 24 or 12 bits: if bit i of the salt is set, then bits i and i+24 are swapped in
the DES E-box output).
The DES key is used to encrypt a 64-bit constant, using count iterations of DES. The value
returned is a null-terminated string, 20 or 13 bytes (plus null) in length, consisting of
the salt, followed by the encoded 64-bit encryption.
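For another cross-check, Python's standard crypt module wraps the same libc crypt(), so the following minimal sketch should print the same string (POSIX only; the module was deprecated in Python 3.11 and removed in 3.13, and the result assumes the system libc still supports traditional DES crypt):
# Sketch: reproduce the traditional DES-based crypt output via libc.
import crypt

print(crypt.crypt('book', 'pass'))   # expected: pahzZkfwawIXw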

Swift 3 - AES Encryption Heimdall - Zero Padding

I am using Heimdall
https://github.com/henrinormak/Heimdall
to generate my 1024-bit RSA keys and encrypt messages:
let heimdall = Heimdall(publicTag: publicTag, publicKeyData: data)
When I utf8-encode and base64-encode the string, I pass it to the encrypt method:
let utf8Encoded = self.mystring.data(using: String.Encoding.utf8)!
let base64Encoded = utf8Encoded.base64EncodedData()
let encrypted = heimdall.encrypt(base64Encoded)
print("encrypted \(encrypted!)") // -> 160 bytes !! why not 128
The encrypted part should be 128 bytes, not 160.
Can anybody help me to get there?
How can I keep generating 1024-bit RSA keys and encrypting messages with those keys so that I end up with 128-byte arrays?
Thanks and Greetings !!
From the Heimdall docs, note on encryption/decryption:
The payload is built containing the encrypted AES key, followed by the encrypted message.
This is then Base64 encoded, which increases the length by about 1/3.
Thus the output is (the encrypted AES key + the encrypted data + padding), Base64 encoded.
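As a rough size check (a sketch only, not a claim about Heimdall's exact internal layout), base64 turns n input bytes into 4*ceil(n/3) characters, so any encrypted key or padding the library prepends grows by the same factor:
# Sketch: how base64 inflates a payload of encrypted key + encrypted message.
# The sizes below are illustrative, not Heimdall's exact layout.
import base64, math, os

def b64_len(n):
    return 4 * math.ceil(n / 3)   # base64 output length for n input bytes

payload = os.urandom(128) + os.urandom(16)   # e.g. one RSA-1024 block + one AES block
print(len(payload), b64_len(len(payload)), len(base64.b64encode(payload)))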

How do I extract the n and e values from an RSA256 public key in Perl?

I use this Python code
from Crypto.PublicKey import RSA
key = RSA.importKey( open('public.key').read() )
But I have no idea how to extract n and e in Perl.
Please explain how to extract n and e.
You would want
my $key = Crypt::RSA::Key::Public->new( Filename => 'public.key' );
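The Perl key object should expose the components through accessor methods; check the Crypt::RSA::Key::Public documentation for the exact names. For comparison, in the Python snippet from the question the components are simply attributes of the imported key object; a minimal sketch, assuming PyCrypto or its pycryptodome fork:
# Sketch: with the import from the question, n and e are plain attributes.
from Crypto.PublicKey import RSA

key = RSA.importKey(open('public.key').read())
print(key.n)   # modulus
print(key.e)   # public exponent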

AES decryption using pycrypto

As a self-study exercise, I'm trying to learn how to use some of the pycrypto library. I need to decrypt a ciphertext string in CBC mode using AES. The ciphertext, key, and IV are all given. Here is the code that I have written:
from Crypto.Cipher import AES
mode = AES.MODE_CBC
key = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"
ciphertext = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1";
iv = ciphertext[:32]
ciphertext = ciphertext[32:]
decryptor = AES.new(key, mode, iv)
plaintext = decryptor.decrypt(ciphertext)
print plaintext
When I run this, I get the following error:
ValueError: IV must be 16 bytes long
I know that the IV string is 32 hex characters, and therefore 16 bytes. I think that this might be a typing problem, but I don't know how to correct it. Can anyone help?
Thank you!
Your strings contain only hex characters, but they are still plain strings, so every character counts. So your IV string is 32 bytes long as you sliced it out of the ciphertext.
I suspect you're right and it is down to typing. Try:
import binascii
iv = binascii.unhexlify(ciphertext[:32])
or, if you only want the numeric value, long(ciphertext[:32], 16); note that pycrypto expects the IV as a 16-byte string, so use the unhexlify form for decryption. The remaining ciphertext (and possibly the key, depending on whether it is meant to be hex) needs the same conversion.
Tell the computer you are dealing with hex; it is currently treating it as an ordinary string. In Python 2 you can do:
iv = iv.decode('hex')
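Putting it together, a minimal sketch (assuming the key, IV and ciphertext are all meant to be hex; if the 32-character key string is intended literally, skip the unhexlify on it and it becomes a 32-byte AES-256 key). The hex data below is a stand-in for the values from the question:
# Sketch: CBC decryption with PyCrypto/pycryptodome, converting the hex
# strings into the raw bytes the cipher expects.
import binascii
from Crypto.Cipher import AES

key_hex = "a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1a1"   # 32 hex chars -> 16 raw bytes
data_hex = "a1" * 64                            # stand-in for the question's hex ciphertext

key = binascii.unhexlify(key_hex)               # 16-byte AES-128 key
iv = binascii.unhexlify(data_hex[:32])          # first 16 bytes are the IV
ciphertext = binascii.unhexlify(data_hex[32:])  # the rest is the actual ciphertext

plaintext = AES.new(key, AES.MODE_CBC, iv).decrypt(ciphertext)
print(repr(plaintext))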