RAW RSA encryption and decryption with Crypto++

I need to establish a secure communication between a PC and a device which supports RSA encryption and signature with SHA1. As I have already used Crypto++ in other parts of my application, I would like to utilize Crypto++ for this as well.
The device is very primitive but allows executing a program I write on it. It has raw RSA and SHA1 functions built in; however, it has very little memory to work with, 2K bytes to be precise.
I have to encrypt and sign a message from a PC. Then the device decrypts and verifies the message. The device will then reply with an encrypted and signed message, which the PC decrypts and verifies. I have implemented the raw RSA encryption, signature and verification with SHA1 inside the device using the built-in functions. The messages are short enough to be handled in a single round.
However, I don't know how to encrypt a message with raw RSA using Crypto++ without involving OAEP or PKCS#1. Could somebody be kind enough to show me some sample code? Thanks a ton!

I don't know how to encrypt a message with raw RSA using Crypto++ without
involving OAEP or PKCS#1. Could somebody be kind enough to show me some sample code?
That's easy enough when you know where to look: Raw RSA from the Crypto++ wiki. The code below was taken from the page.
Encryption
Integer n("0xbeaadb3d839f3b5f"), e("0x11"), d("0x21a5ae37b9959db9");
RSA::PublicKey pubKey;
pubKey.Initialize(n, e);
///////////////////////////////////////////////////////////////
Integer m, c;
string message = "secret";
cout << "message: " << message << endl;
// Treat the message as a big endian byte array
m = Integer((const byte *)message.data(), message.size());
cout << "m: " << hex << m << endl;
// Encrypt
c = pubKey.ApplyFunction(m);
cout << "c: " << hex << c << endl;
Decryption
Integer n("0xbeaadb3d839f3b5f"), e("0x11"), d("0x21a5ae37b9959db9");
AutoSeededRandomPool prng;
RSA::PrivateKey privKey;
privKey.Initialize(n, e, d);
///////////////////////////////////////////////////////////////
Integer c("0x3f47c32e8e17e291"), r;
string recovered;
// Decrypt
r = privKey.CalculateInverse(prng, c);
cout << "r: " << hex << r << endl;
// Round trip the message
size_t req = r.MinEncodedSize();
recovered.resize(req);
r.Encode((byte *)recovered.data(), recovered.size());
cout << "recovered: " << recovered << endl;
Here's a sample output:
$ ./cryptopp-raw-rsa.exe
message: secret
m: 736563726574h
c: 3f47c32e8e17e291h
r: 736563726574h
recovered: secret
There is one caveat: c = m ^ e mod n, so there are some limits on plain text size and cipher text size. Essentially, m and c must be smaller than n. In this example, replacing the string secret with now is the time for all good men to come to the aid of their country would fail because it is larger than n when converted to an Integer.
You can get the maximum plain text value with the function MaxPreimage(), and the maximum cipher text value with MaxImage().
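For example, here is a minimal sketch (reusing the pubKey, message and m variables from the encryption snippet above) of checking that bound before encrypting; MaxPreimage() returns the largest Integer the function accepts:
// Sketch: make sure the message, interpreted as an Integer, fits under the
// modulus before applying the raw RSA function.
m = Integer((const byte *)message.data(), message.size());
if (m > pubKey.MaxPreimage())
    cerr << "message too large for this modulus" << endl;
else
    cout << "c: " << hex << pubKey.ApplyFunction(m) << endl;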
I have to encrypt and sign a message from a PC. Then the device decrypts
and verifies the message. The device will then reply an encrypted message
and sign on it. The PC will decrypt the message and verify it afterwards.
On the surface, this looks like it will be vulnerable to replay attacks. You might need a protocol that protects against them.
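For illustration only, one common mitigation is to bind a monotonically increasing counter (or a random nonce) into the signed payload and have each side reject anything it has already seen. A rough sketch, with made-up names:
// Hypothetical sketch: frame the payload with a 32-bit big-endian counter
// before signing/encrypting; the receiver rejects counters it has already seen.
#include <cstdint>
#include <string>

std::string frameWithCounter(uint32_t counter, const std::string &payload)
{
    std::string framed;
    framed.push_back(static_cast<char>((counter >> 24) & 0xFF));
    framed.push_back(static_cast<char>((counter >> 16) & 0xFF));
    framed.push_back(static_cast<char>((counter >> 8) & 0xFF));
    framed.push_back(static_cast<char>(counter & 0xFF));
    return framed + payload;
}

bool counterIsFresh(uint32_t counter, uint32_t &lastAccepted)
{
    if (counter <= lastAccepted)
        return false;          // replayed or out-of-order message
    lastAccepted = counter;
    return true;
}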

Here is a demo function I wrote when I first did RSA encryption and decryption with Crypto++. I wrote it just to understand the basics. I hope it helps:
#include <cryptopp/files.h>
#include <cryptopp/filters.h>
#include <cryptopp/modes.h>
#include <cryptopp/osrng.h>
#include <cryptopp/rsa.h>
#include <cryptopp/sha.h>
#include <cassert>
#include <fstream>
#include <string>
using namespace CryptoPP;
void rsa_examples()
{
// Keys created here may be used by OpenSSL.
//
// openssl pkcs8 -in key.der -inform DER -out key.pem -nocrypt
// openssl rsa -in key.pem -check
AutoSeededRandomPool rng;
// Create a private RSA key and write it to a file using DER.
RSAES_OAEP_SHA_Decryptor priv( rng, 4096 );
TransparentFilter privFile( new FileSink("rsakey.der") );
priv.DEREncode( privFile );
privFile.MessageEnd();
// Create a private RSA key and write it to a string using DER (also write to a file to check it with OpenSSL).
std::string the_key;
RSAES_OAEP_SHA_Decryptor pri( rng, 2048 );
TransparentFilter privSink( new StringSink(the_key) );
pri.DEREncode( privSink );
privSink.MessageEnd();
std::ofstream file ( "key.der", std::ios::out | std::ios::binary );
file.write( the_key.data(), the_key.size() );
file.close();
// Example Encryption & Decryption
InvertibleRSAFunction params;
params.GenerateRandomWithKeySize( rng, 1536 );
std::string plain = "RSA Encryption", cipher, decrypted_data;
RSA::PrivateKey privateKey( params );
RSA::PublicKey publicKey( params );
RSAES_OAEP_SHA_Encryptor e( publicKey );
StringSource( plain, true, new PK_EncryptorFilter( rng, e, new StringSink( cipher )));
RSAES_OAEP_SHA_Decryptor d( privateKey );
StringSource( cipher, true, new PK_DecryptorFilter( rng, d, new StringSink( decrypted_data )));
assert( plain == decrypted_data );
}

Related

STM32 SPI data is sent the reverse way

I've been experimenting with writing to an external EEPROM using SPI and I've had mixed success. The data does get shifted out, but in the opposite order. The EEPROM requires a start bit and then an opcode, which is essentially a 2-bit code for read, write and erase. Essentially the start bit and the opcode are combined into one byte. I'm creating a 32-bit unsigned int and then bit-shifting the values into it. When I transmit this I see the actual data first, then the SB+opcode, and then the memory address. How do I reverse this so that the opcode is sent first, then the memory address and then the actual data? As seen in the image below, the data is BCDE, the SB+opcode is 07 and the memory address is 3F. The correct sequence should be 07, 3F and then BCDE (I think!).
Here is the code:
uint8_t mem_addr = 0x3F;
uint16_t data = 0xBCDE;
uint32_t write_package = (ERASE << 24 | mem_addr << 16 | data);
while (1)
{
/* USER CODE END WHILE */
/* USER CODE BEGIN 3 */
HAL_SPI_Transmit(&hspi1, &write_package, 2, HAL_MAX_DELAY);
HAL_Delay(10);
}
/* USER CODE END 3 */
It looks like your SPI interface is set up to process 16 bit halfwords at a time. Therefore it would make sense to break up the data to be sent into 16 bit halfwords too. That would take care of the ordering.
uint8_t mem_addr = 0x3F;
uint16_t data = 0xBCDE;
uint16_t write_package[2] = {
(ERASE << 8) | mem_addr,
data
};
HAL_SPI_Transmit(&hspi1, (uint8_t *)write_package, 2, HAL_MAX_DELAY);
EDIT
Added an explicit cast. As noted in the comments, without the explicit cast it wouldn't compile as C++ code and would cause some warnings as C code.
You're packing your information into a 32 bit integer; on line 3 of your code you decide which bits of data are placed where in the word. To change the order you can replace that line with:
uint32_t write_package = ((data << 16) | (mem_addr << 8) | (ERASE));
That is shifting data 16 bits left into the most significant 16 bits of the word, shifting mem_addr up by 8 bits and OR-ing it in, and then OR-ing ERASE into the least significant bits.
Your problem is the endianness.
By default the STM32 uses little endian, so the lowest byte of the uint32_t is stored at the first address.
If I'm right, this is the declaration of the transmit function you are using:
HAL_StatusTypeDef HAL_SPI_Transmit(SPI_HandleTypeDef *hspi, uint8_t *pData, uint16_t Size, uint32_t Timeout)
It requires a pointer to uint8_t as data (and not a uint32_t), so you should get at least a warning if you compile your code.
If you want to write code that is independent of the endianness used, you should store your data in an array instead of one "big" variable.
uint8_t write_package[4];
write_package[0] = ERASE;
write_package[1] = mem_addr;
write_package[2] = (data >> 8) & 0xFF;
write_package[3] = (data & 0xFF);
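With the data laid out as bytes, the transmit call can then send them in exactly that order. A sketch, assuming the SPI peripheral is configured for 8-bit data frames so that Size counts bytes:
// Sketch: with 8-bit SPI frames, the four bytes leave in array order:
// ERASE, mem_addr, data high byte, data low byte.
HAL_SPI_Transmit(&hspi1, write_package, sizeof(write_package), HAL_MAX_DELAY);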

Transmitting SD Card Communication sequence on STM32f103 with Hal Driver

I am writing an SD card low-level driver to implement Chan's FATFS system on an Olimex MOD-MP3-X board with an STM32F103RB processor. I'm pretty new to this topic and not a native English speaker, but I hope I can point out my problem.
First I want to write a function to send the CMD commands via SPI.
So I have this prototype for the function:
static void sd_cmd(uint8_t cmd, uint32_t arg);
// I already found the HAL command to transmit the data:
HAL_SPI_Transmit_IT(SPI_HandleTypeDef *hspi, uint8_t *pData, uint16_t Size);
// But I don't know exactly how to put the argument and the cmd into the data buffer.
// I already tried creating an array, but this didn't work:
uint8_t buffer[5];
buffer[0] = 0x40 | cmd;
buffer[1] = arg >> 24;
buffer[2] = arg >> 16;
buffer[3] = arg >> 8;
buffer[4] = arg;
...
HAL_SPI_Transmit_IT(&hspi2, buffer, 5);
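For what it's worth, SD commands in SPI mode are normally sent as 6-byte frames: the command byte (0x40 | cmd), four big-endian argument bytes, and a CRC7 byte with the stop bit set. A rough sketch along those lines (the hspi2 handle and the fixed CRC values for CMD0/CMD8 are assumptions, not a tested driver; with the interrupt-driven API the buffer must stay valid until the transfer completes):
// Illustrative sketch only: build the 6-byte SD command frame and send it.
// In SPI mode the card only checks the CRC for CMD0 and CMD8 by default,
// so fixed CRC bytes are commonly used for those and a dummy value elsewhere.
static void sd_cmd(uint8_t cmd, uint32_t arg)
{
    static uint8_t frame[6];            // static so it outlives the IT transfer
    frame[0] = 0x40 | cmd;
    frame[1] = (uint8_t)(arg >> 24);
    frame[2] = (uint8_t)(arg >> 16);
    frame[3] = (uint8_t)(arg >> 8);
    frame[4] = (uint8_t)(arg);
    frame[5] = (cmd == 0) ? 0x95 : (cmd == 8) ? 0x87 : 0x01;
    HAL_SPI_Transmit_IT(&hspi2, frame, sizeof(frame));
}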

Invalid CRC32 Hash Generation

I'm creating SHA1 and CRC32 hashes from plain text using the Crypto++ library as follows:
#include <cryptopp/filters.h>
#include <cryptopp/hex.h>
#include <cryptopp/sha.h>
#include <cryptopp/crc.h>
#include <string>
#include <iostream>
int main()
{
// Calculate SHA1
std::string data = "Hello World";
std::string base_encoded_string;
byte sha_hash[CryptoPP::SHA::DIGESTSIZE];
CryptoPP::SHA().CalculateDigest(sha_hash, (byte*)data.data(), data.size());
CryptoPP::StringSource ss1( std::string(sha_hash, sha_hash+CryptoPP::SHA::DIGESTSIZE), true,
new CryptoPP::HexEncoder( new CryptoPP::StringSink( base_encoded_string ) ));
std::cout << base_encoded_string << std::endl;
base_encoded_string.clear();
// Calculate CRC32
byte crc32_hash[CryptoPP::CRC32::DIGESTSIZE];
CryptoPP::CRC32().CalculateDigest(crc32_hash, (byte*)data.data(), data.size());
CryptoPP::StringSource ss2( std::string(crc32_hash, crc32_hash+CryptoPP::CRC32::DIGESTSIZE), true,
new CryptoPP::HexEncoder( new CryptoPP::StringSink( base_encoded_string ) ));
std::cout << base_encoded_string << std::endl;
base_encoded_string.clear();
}
The output I get is:
0A4D55A8D778E5022FAB701977C5D840BBC486D0
56B1174A
Press any key to continue . . .
And out of these, I confirmed that the CRC32 is incorrect according to various online resources such as this one: http://www.fileformat.info/tool/hash.htm?text=Hello+World
I have no idea why, because I'm creating the CRC32 hash by following the same procedure as I followed for SHA1. Is there really a different way, or am I doing something wrong here?
byte crc32_hash[CryptoPP::CRC32::DIGESTSIZE];
I believe you have a bad endian interaction. Treat the CRC32 value as an integer, not a byte array.
So try this:
int32_t crc = (crc32_hash[0] << 0) | (crc32_hash[1] << 8) |
(crc32_hash[2] << 16) | (crc32_hash[3] << 24);
If crc32_hash is integer aligned, then you can:
int32_t crc = ntohl(*(int32_t*)crc32_hash);
Or, this might be easier:
int32_t crc32_hash;
CryptoPP::CRC32().CalculateDigest((byte*)&crc32_hash, (byte*)data.data(), data.size());
I might be wrong about int32_t, it might be uint32_t (I did not look at the standard).
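To make that concrete, here is a small sketch that writes the digest straight into a 32-bit integer and prints it. On a little-endian machine this should line up with the online calculators (the byte-reversed form of the hex-encoded digest shown above):
// Sketch: let CalculateDigest() fill a 32-bit integer directly and print it.
#include <cryptopp/crc.h>
#include <iostream>
#include <string>

int main()
{
    std::string data = "Hello World";
    CryptoPP::word32 crc = 0;
    CryptoPP::CRC32().CalculateDigest((unsigned char*)&crc,
                                      (const unsigned char*)data.data(), data.size());
    std::cout << std::hex << crc << std::endl;   // 4a17b156 on a little-endian host
    return 0;
}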

Length of data to hash for PGP

I have finally managed to verify some simple PGP signed message blocks. However, I discovered that for some reason, my implementation limits me to verifying data that is 9-16 bytes long, no less, no more.
Is there some instruction somewhere (RFC 4880 or elsewhere) that specifies how to deal with plaintext data of any length? Maybe there is some sort of padding I missed? PKCS#1?
I am pretty sure I formatted the data to hash properly, since the instructions in RFC 4880 sec 5.2.4 say that for text documents you just replace all \n with \r\n and add a trailer. Since my test values were single lines of data, nothing had to be replaced.
All of these values are in base 10 unless otherwise noted:
// DSA public key values
p = 175466718616740411615640156350265486163809613514213656685227237159351776260193236923030228927905671867677337184318134702903960237546408302010360724274436019639502405323187799029742776686067449287558904042137172927936686590837020160292525250748155580652384740664931255981772117478967314777932252547256795892071
q = 809260232002608708872165272150356204306578772713
g = 127751900783328740354741342100721884490035793278553520238434722215554870393020469115393573782393994875216405838455564598493958342322790638050051759023658096740912555025710033120777570527002197424160086000659457154926758682221072408093235236853997248304424303705425567765059722098677806247252106481642577996274
y = 172935968966072909036304664996424500241381878537444332146572958203083745609400290814117451480512268901233962890933482206538294509037615827035398352528065134903071886710296983781453184598843331365336270501467458073523376152406987560592548479865116940266729198119357206749848310472131186772143408998928864559411
not working:
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
abcd
-----BEGIN PGP SIGNATURE-----
Version: BCPG v1.39
iFsEARECABsFAk/tB28UHGFiYyA8bWFrY21AYWFhLmNvbT4ACgkQMFIlRc933Ya2
RwCfdMyI08Iz0rDXVHOPlGA3s5Y9j/8An2He7+hHjWfGJNoOJT7gAxqJaoLo
=I2rT
-----END PGP SIGNATURE-----
data hashed (in hex): 6162636404011102001b05024fed076f141c616263203c6d616b636d406161612e636f6d3e04ff00000021
r = 666804200764671083282351405489424949903645052927
s = 558743769080942454889260816818443017172325925608
w = 702955297882281869313155599553522395227576660460 // s^-1 mod q
u1 = 190417717173929082607343542521304347388874234334
u2 = 306786785479358548892951170619047936651163362761
v = g^u1 * y^u2 % p % q = 737052148656331043521702886300418501784667890334
v != r
working:
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
0123456789abcdef
-----BEGIN PGP SIGNATURE-----
Version: BCPG v1.39
iFsEARECABsFAk/tCE0UHGFiYyA8bWFrY21AYWFhLmNvbT4ACgkQMFIlRc933YYG
IQCfercgPsXFnah6otgQdEMbv9OeCgIAnRIyOLirbqSlBugBT6Ex/Adz4+7L
=bzab
-----END PGP SIGNATURE-----
data hashed (in hex): 3031323334353637383961626364656604011102001b05024fed084d141c616263203c6d616b636d406161612e636f6d3e04ff00000021
r = 700580719365380086754774917458461236187098909186
s = 103881812262595813943381509986903840453887782603
w = 178510125628083028184051840492924307896586330444 // s^-1 mod q
u1 = 78831508775508876446567239486098677466912246622
u2 = 572875590470993668032596348682349224460207395691
v = g^u1 * y^u2 % p % q = 700580719365380086754774917458461236187098909186
v == r
What data did I not include in the hash / what did I do wrong?
EDIT: here is the public key as requested (even though the relevant values have been posted already):
-----BEGIN PGP PUBLIC KEY BLOCK-----
Version: BCPG v1.39
mQGiBE5B0h8RBAD533Z5bK1IpBx02QyQL0QoJE4uFRIMGDiwXuwmZzVl+R7Vlurd
GRLsCCbE6vOOh7XQVZGzLEBy9WNzZ9m+EbCfSVAYkjS6FhLws6hG6irrnS+b3JBf
gFJ8vNGF9Z7bhx+7y7NBk0IMyWkGnUkcnav73t5FQUI2faEBN4c/yAGJZwCgjcB7
3akWk9XVWvTCsiMXxpyvkukEALXsvB6cOoFEtQq9cQHjP63fBlvD94dhhMiM0cH6
hW9JotxdK+cxFGG9ZIWgoN2PWbMJka/H4W5EL6tS+YiNAR7I1Ozkt6X16GjnQUzZ
MlSpleK+KiKVN2anRaPEoOIinHrE3ZXd6QlJ/4+OJn4IVWmSEaJpFf4QNgvEu4rh
xinyBAD2RNzREOA+wpnFZ4lDt9NZXmXdxQME/l0J9XcvWhpGsxA/MATQKImy7N49
7GT/M38F+TrpBobag1O3buE99fOLyws4Tbc+sZMdHxoiGZDAIRNQS2rv475E6ktj
7vd5CYvOkA6+8sX1+hPcNlkHtHB1OFkJRsYp6k0zkyC9adjBM7QTYWJjIDxtYWtj
bUBhYWEuY29tPohGBBMRAgAGBQJOQdIfAAoJEDBSJUXPd92GRSQAoItbtbToOg7a
/hcg2sA/aBEQNwuxAKCGR69vmSoCWoBP5waPk0UsjM3BSbjMBE5B0h8QAgCUlP7A
lfO4XuKGVCs4NvyBpd0KA0m0wjndOHRNSIz44x24vLfTO0GrueWjPMqRRLHO8zLJ
S/BXO/BHo6ypjN87Af0VPV1hcq20MEW2iujh3hBwthNwBWhtKdPXOndJGZaB7lsh
LJuWv9z6WyDNXj/SBEiV1gnPm0ELeg8Syhy5pCjMAf9QHehP2eCFqfEwTAnaOlA6
CU+rYHKPZaI9NUwCA7qD2d93/l08/+ZtFvejZW1RWrJ8qfLDRtlPgRzigoF/CXbR
iEYEGBECAAYFAk5B0h8ACgkQMFIlRc933YZRrACfUnWTjHHN+QsEEoJrwRvFmvzj
bR4An24pTpeeN+I6R59O/sdmYsAhjULX
=sStS
-----END PGP PUBLIC KEY BLOCK-----
Haven't got enough time to look up the details, but I would guess that you're applying (or not applying) padding incorrectly. That would cause the right result to come up for some input lengths, but not for others.
I guess I'll look into this more, but I wanted to get something in under the bounty wire :)
Edit: Ok, found an error. Not sure why you're getting it, but if it's fixed, then the right answer comes out. In your not-working example, you calculate w (s^-1 mod q) as
w = 702955297882281869313155599553522395227576660460 // s^-1 mod q
but I get
w = 702955297882281869313155599553522395227576660458
off by 2! Really, really close values though. And it can be shown that mine is right:
s * your_w mod q = 308227306159276200906356361486529830038073078504
s * my_w mod q = 1
If you plug in this w value, you then get
u1 = 536931432138658080437983667536052790245747416035
u2 = 591698847955233800072578903940910445457030802333
v = (g^u1 * y^u2) % p % q = 666804200764671083282351405489424949903645052927
r == v
Hope that helps.
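If it helps to cross-check the arithmetic, the same computation can be done with Crypto++'s Integer class. A rough sketch (the dsa_verify helper name is mine; InverseMod() computes s^-1 mod q, and m is the SHA-1 digest of the hashed data interpreted as an Integer):
// Rough sketch of the DSA check using Crypto++ Integer arithmetic.
// p, q, g, y, r, s and the digest m are the values from the question.
#include <cryptopp/integer.h>
#include <cryptopp/nbtheory.h>
using CryptoPP::Integer;

bool dsa_verify(const Integer &p, const Integer &q, const Integer &g,
                const Integer &y, const Integer &r, const Integer &s,
                const Integer &m /* SHA-1 digest as an Integer */)
{
    Integer w  = s.InverseMod(q);                              // s^-1 mod q
    Integer u1 = (m * w) % q;
    Integer u2 = (r * w) % q;
    Integer v  = (CryptoPP::a_exp_b_mod_c(g, u1, p) *
                  CryptoPP::a_exp_b_mod_c(y, u2, p)) % p % q;  // (g^u1 * y^u2 mod p) mod q
    return v == r;
}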

Implementing HMAC encryption algorithm in iPhone application

I want to implement the HMAC encryption algorithm for my iPhone application. Any sample code would really help. Also, please guide me with a brief implementation of the same.
Use the Common Crypto functions. The documentation is in man pages, so you'll need to hunt for it a bit. They're in libSystem on iOS and Mac OS X, so no need to add another library or framework to your project. As you can see from the example below, the API is very similar to OpenSSL's.
If you are actually interested in encrypting, as opposed to authenticating data, Common Crypto has functions to perform AES and 3DES (and DES, but don't use it, it's far too weak for modern needs). Take a look at the CCCryptor man page for details.
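For completeness, a one-shot AES encryption with CCCrypt looks roughly like this. This is only a sketch: the key, IV and buffer sizes are placeholders, and real code needs proper key handling and a random IV.
#include <CommonCrypto/CommonCryptor.h>
#include <string.h>

/* Sketch of one-shot AES-128-CBC encryption with Common Crypto.
   Key and IV handling are deliberately simplified for illustration. */
int aes_encrypt_example(void)
{
    const char key[kCCKeySizeAES128 + 1] = "0123456789abcdef"; /* 16-byte example key */
    unsigned char iv[kCCBlockSizeAES128] = { 0 };              /* don't use a zero IV in real code */
    const char *plain = "attack at dawn";
    unsigned char out[64];
    size_t moved = 0;

    CCCryptorStatus st = CCCrypt(kCCEncrypt, kCCAlgorithmAES128,
                                 kCCOptionPKCS7Padding,
                                 key, kCCKeySizeAES128, iv,
                                 plain, strlen(plain),
                                 out, sizeof(out), &moved);
    return (st == kCCSuccess) ? 0 : -1;
}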
The example below is equivalent to running openssl dgst -md5 -hmac secret < myfile.txt. Start by initializing the CCHmacContext, and then call CCHmacUpdate as long as you have data to authenticate. When you've read all the bytes, call CCHmacFinal to get the HMAC into a buffer. I've provided a crude method to convert the HMAC bytes into printable hex.
#include <CommonCrypto/CommonHMAC.h>
#include <sys/types.h>
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
extern int errno;
int
main( int ac, char *av[] )
{
CCHmacContext ctx;
char *key = "secret";
char buf[ 8192 ];
unsigned char mac[ CC_MD5_DIGEST_LENGTH ];
char hexmac[ 2 * CC_MD5_DIGEST_LENGTH + 1 ];
char *p;
int fd;
int rr, i;
if ( ac != 2 ) {
fprintf( stderr, "usage: %s path\n", av[ 0 ] );
exit( 1 );
}
if (( fd = open( av[ 1 ], O_RDONLY )) < 0 ) {
fprintf( stderr, "open %s: %s\n", av[ 1 ], strerror( errno ));
exit( 2 );
}
CCHmacInit( &ctx, kCCHmacAlgMD5, key, strlen( key ));
while (( rr = read( fd, buf, sizeof( buf ))) > 0 ) {
CCHmacUpdate( &ctx, buf, rr );
}
if ( rr < 0 ) {
perror( "read" );
exit( 2 );
}
CCHmacFinal( &ctx, mac );
(void)close( fd );
p = hexmac;
for ( i = 0; i < CC_MD5_DIGEST_LENGTH; i++ ) {
snprintf( p, 3, "%02x", mac[ i ] );
p += 2;
}
printf( "%s\n", hexmac );
return( 0 );
}
HMAC is not an encryption mechanism, but an authentication digest. It uses an underlying message digest function such as SHA-1, SHA-256, MD5 etc, with a secret key to generate a code that can be used to authenticate data.
Generating an HMAC digest is extremely simple. Here is the description from RFC2104 (via Wikipedia)
Let:
H(·) be a cryptographic hash function (e.g. SHA-1, SHA-256, MD5)
K be a secret key padded to the right with extra zeros to the input block size of the hash function, or the hash of the original key if it's longer than that block size
m be the message to be authenticated
| denote concatenation
⊕ denote exclusive or (XOR)
opad be the outer padding (0x5c5c5c…5c5c, one-block-long hexadecimal constant)
ipad be the inner padding (0x363636…3636, one-block-long hexadecimal constant)
Then HMAC(K,m) is mathematically defined by:
HMAC(K,m) = H((K ⊕ opad) | H((K ⊕ ipad) | m)).
For the underlying digest function you can help yourself to one of the C implementations from OpenSSL. In fact, OpenSSL also has a C implementation of HMAC that you can probably just use as is.
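To make the construction above concrete, here is a small sketch that computes HMAC-SHA1 exactly as defined in RFC 2104, using Crypto++'s SHA1 as H (since that library appears earlier on this page). It is for illustration only; in practice you would use CryptoPP::HMAC<SHA1>, OpenSSL's HMAC(), or CCHmac directly.
#include <cryptopp/sha.h>
#include <string>

// Sketch: HMAC-SHA1 built directly from the definition above, with SHA1 as H.
std::string hmac_sha1(std::string key, const std::string &message)
{
    using CryptoPP::SHA1;
    const size_t B = SHA1::BLOCKSIZE;    // 64-byte input block size

    // If the key is longer than the block size, hash it first.
    if (key.size() > B) {
        std::string digest(SHA1::DIGESTSIZE, '\0');
        SHA1().CalculateDigest((unsigned char*)&digest[0],
                               (const unsigned char*)key.data(), key.size());
        key = digest;
    }
    key.resize(B, '\0');                 // pad with zeros to the block size

    std::string ipad(B, '\x36'), opad(B, '\x5c');
    for (size_t i = 0; i < B; ++i) {
        ipad[i] ^= key[i];
        opad[i] ^= key[i];
    }

    // inner = H((K xor ipad) | m)
    std::string inner(SHA1::DIGESTSIZE, '\0');
    SHA1 h1;
    h1.Update((const unsigned char*)ipad.data(), ipad.size());
    h1.Update((const unsigned char*)message.data(), message.size());
    h1.Final((unsigned char*)&inner[0]);

    // HMAC(K,m) = H((K xor opad) | inner)
    std::string mac(SHA1::DIGESTSIZE, '\0');
    SHA1 h2;
    h2.Update((const unsigned char*)opad.data(), opad.size());
    h2.Update((const unsigned char*)inner.data(), inner.size());
    h2.Final((unsigned char*)&mac[0]);
    return mac;                          // raw 20-byte digest
}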