AES128 decryption problems

I have web services whose responses are AES128-encrypted.
I use AFHTTPRequestOperationManager and I receive the response correctly (the response is an encrypted string).
When I try to decrypt the string, I get the wrong result.
This is my code:
NSString *string = [[NSString alloc] initWithData:response encoding:NSUTF8StringEncoding];
NSLog(@"%@", string);
NSString *decodedString = [AES128Util AES128Decrypt:string key:Key];
NSLog(@"%@", decodedString);
The key is correct, because it works on Android.
In the first NSLog I see the encoded response, as expected.
In the second NSLog I get nil.
Thanks
Edit: this is the encrypted string as received:
LR1JZEOE8MgbEgyZtbqSAbO5ZL5wYBCpLX0KE4PynsFZiRBJe3lvRRr0CPbf0ufuSga8dG5j6IeDBvbn1iNeLUb7cYIb+caSXZw7t8TgrYA=

A lot can go wrong with encryption, and it's hard to debug (or I should say it behaves in a binary way: it either works or it fails).
I would first make absolutely sure that the key is correct.
After that you can look at the AES128Util code: https://github.com/GreenWangcl/AES128-for-iOS/blob/master/ASE128Demo/ASE128/AES128Util.m
The string encoding seems fine; otherwise I would have recommended trying Latin-1 or others.
Since you have a working implementation on Android, the interesting line in AES128Util is this call:
CCCryptorStatus cryptStatus = CCCrypt(kCCDecrypt,
                                      kCCAlgorithmAES128,
                                      kCCOptionPKCS7Padding,
                                      keyPtr,
                                      kCCBlockSizeAES128,
                                      NULL,            // IV: none
                                      [data bytes],
                                      dataLength,
                                      buffer,
                                      bufferSize,
                                      &numBytesCrypted);
Check the padding option in your Android implementation; maybe it isn't PKCS7.
If it is, the parameters should be pretty much the same on Android, and you can compare them one by one to find what differs.
PS: personally I'm using https://github.com/RNCryptor/RNCryptor and it works fine for our AES 128 decryption (like you, I also have an Android app and a server doing the encryption).
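One more sanity check before digging into keys and padding: the string posted in the edit is Base64, and a valid AES ciphertext with PKCS7 padding must decode to a whole number of 16-byte blocks. If it doesn't, the string is being truncated or double-encoded in transit. A minimal check, sketched in Python for brevity (whether AES128Util does this Base64 step internally depends on its implementation):

```python
import base64

# The ciphertext string exactly as posted in the question's edit.
ciphertext_b64 = ("LR1JZEOE8MgbEgyZtbqSAbO5ZL5wYBCpLX0KE4PynsFZiRBJe3lvRRr0"
                  "CPbf0ufuSga8dG5j6IeDBvbn1iNeLUb7cYIb+caSXZw7t8TgrYA=")

raw = base64.b64decode(ciphertext_b64)

# A valid AES ciphertext (ECB or CBC with PKCS7 padding) is always a
# whole number of 16-byte blocks.
print(len(raw))            # 80
print(len(raw) % 16 == 0)  # True: the payload at least has a sane length
```

Here the decoded payload is 80 bytes, a clean multiple of the AES block size, so transport is probably fine and the key, IV, or padding option is the more likely culprit.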

Related

NSData getBytes provides different results in simulator than on device

I am sending data from a self-written server backend to my iOS device.
The application works fine in the iOS Simulator but not on the device.
For example: I'm sending the integer value 4; on the Simulator I get 4, but on the device it's 1024. I searched the whole internet but didn't find any solution, so maybe someone can help me here. I don't think it's an endianness issue, because I already paid attention to that.
This is the code:
int someintvalue = 0;
uint8_t firstBuffer[4];
[inputStream read:firstBuffer maxLength:sizeof(firstBuffer)];
NSMutableData *data1 = [[NSMutableData alloc] initWithBytes:firstBuffer length:sizeof(firstBuffer)];
NSRange rng = NSMakeRange(0, 4);
[data1 getBytes:&someintvalue range:rng];
someintvalue = CFSwapInt32HostToBig(someintvalue);
NSLog(@"Got %i Value", someintvalue);
You're byte-swapping incorrectly. CFSwapInt32HostToBig() swaps from the "host" endianness to big-endian. This is backwards. You probably meant to use CFSwapInt32BigToHost().
I ran into the same issue: the length was incorrect in my case, and iOS kept reading and rewriting the last byte.

Data encoding in Objective-C and Ruby

I'm working on an iOS app using Parse, and I need to simulate their encryption flow using Ruby for the website and browser extensions I'm building for it.
For generating the salt for AES, they have the following code in the iOS app:
NSString *str = user.email;
NSData *strData = [str dataUsingEncoding: NSUTF8StringEncoding];
NSData *encryptedData = [strData AESEncryptWithPassphrase:str]; // using the same key
What's puzzling me is dataUsingEncoding: NSUTF8StringEncoding -- it's obviously encoding the string in UTF-8, but when I see it in NSLog it looks quite different. So if str was "randomemail@gmail.com", doing NSLog on strData outputs:
<72616e64 6f6d656d 61696c40 676d6169 6c2e636f 6d>
Now, how do I get that same output in Ruby? I can't get the right salt because of this ("randomemail@gmail.com".encode("UTF-8") simply returns the email as expected). How can I simulate dataUsingEncoding in Ruby to get exactly the same salt? Is there something I'm missing?
Thanks
Try this:
"randomemail@gmail.com".encode("UTF-8").bytes.map { |b| format("%02x", b) }.join
and you will "see" what you want. (format("%02x", b) is used instead of to_s(16) so that bytes below 0x10 keep their leading zero.)
The hexadecimal representation of "randomemail@gmail.com" in UTF-8 is
<72616e64 6f6d656d 61696c40 676d6169 6c2e636f 6d>
but Ruby shows you the string representation, which is "randomemail@gmail.com".
They are the same bytes displayed in different ways.
Check here for more information.
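The key point is that NSLog prints NSData as the hex of its raw bytes, so hex-encoding the UTF-8 bytes of the string reproduces it in any language. A quick check (Python used here for illustration; the same one-liner above does it in Ruby):

```python
email = "randomemail@gmail.com"

# NSLog prints NSData as the hex of its raw bytes; hex-encoding the UTF-8
# bytes reproduces it, minus NSLog's spacing and angle brackets.
hex_bytes = email.encode("utf-8").hex()
print(hex_bytes)
# 72616e646f6d656d61696c40676d61696c2e636f6d
```

The salt should be derived from those raw bytes, not from their hex rendering, so both platforms agree as long as they hash the same UTF-8 byte sequence.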

Problem converting NSDATA

I'm a young iPhone developer, and I'm trying to write an iPhone app that syncs data (basically text data) from a server-side app on my Mac.
I'm having trouble reading data on the iPhone side with the following:
- (void)connectionReceived:(NSNotification *)aNotification {
    NSFileHandle *incomingConnection = [[aNotification userInfo] objectForKey:NSFileHandleNotificationFileHandleItem];
    [[aNotification object] acceptConnectionInBackgroundAndNotify];
    NSData *receivedData = [incomingConnection availableData];
    NSString *theString = [[[NSString alloc] initWithData:receivedData encoding:NSUTF8StringEncoding] autorelease];
}
I need to print out theString to a text label on the iPhone, but what I get is the exact "hex code translation" of the text entered on the server side app and I don't seem to be able to convert it to char.
Can anyone help ?
I don't know the answer, but there are a few things to look at or try:
Did you try converting with an encoding other than UTF-8 (Unicode? ASCII?)?
In the hex output, I assume it is giving a UTF-8 representation, i.e. one 8-bit (two hex nibble) code for each character in your string. Is this the case? Are the hex codes correct for a UTF-8 or ASCII representation of your string?
Are there any "bad" codes in the result?
I am wondering if this is happening because there are characters (maybe even invisible ones: control characters, nulls, whatever) in your string which prevent iOS from doing a normal UTF-8 conversion.
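That failure mode is easy to demonstrate (Python used for illustration): bytes that are not valid UTF-8 make a strict decode fail outright, while a single-byte encoding such as Latin-1 never fails but may produce garbage characters. This mirrors -initWithData:encoding: returning nil on invalid input instead of raising.

```python
data = b"caf\xe9"  # "café" encoded in Latin-1; NOT valid UTF-8 (0xE9 alone)

try:
    data.decode("utf-8")
except UnicodeDecodeError as e:
    print("strict UTF-8 decode failed:", e.reason)

# Latin-1 maps every byte to a character, so it always "succeeds" --
# which is why trying other encodings, as suggested above, helps diagnose.
print(data.decode("latin-1"))  # café
```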

NSData to NSString using initWithBytes:length:encoding

I have some image data (jpeg) I want to send from my iPhone app to my webservice. In order to do this, I'm using the NSData from the image and converting it into a string which will be placed in my JSON.
Currently, I'm doing this:
NSString *secondString = [[NSString alloc] initWithBytes:[result bytes]
length:[result length]
encoding:NSUTF8StringEncoding];
Where result is of type NSData. However, secondString comes out nil even though [result length] returns a real value (like 14189). I used this method because result is raw data and not null-terminated.
Am I doing something wrong? I've used this code in other areas and it seems to work fine (but those areas I'm currently using it involve text not image data).
TIA.
For binary data, it's better to encode it using Base64 and then decode it in your webservice. I use the NSData+Base64 class downloaded from here; this reference was also taken from Stack Overflow, from an answer made by @Ken (thanks Ken!).
You are not converting the data to a string. You are attempting to interpret it as a UTF-8 encoded string, which will fail unless the data really is a UTF-8 encoded string. Your best bet is to encode it somehow, perhaps with Base64 as Manny suggests, and then decode it again on the server.
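The Base64 route both answers recommend looks like this end to end (sketched in Python; the variable names are placeholders, and on iOS the NSData+Base64 category mentioned above plays the encoder role):

```python
import base64

jpeg_bytes = bytes(range(256))  # stand-in for the raw JPEG NSData

# Client side: binary -> ASCII-safe string that can be placed in JSON.
payload = base64.b64encode(jpeg_bytes).decode("ascii")

# Server side: recover the exact original bytes.
recovered = base64.b64decode(payload)

print(recovered == jpeg_bytes)  # True: lossless round trip
print(len(payload))             # 344: roughly 4/3 size overhead
```

Unlike the UTF-8 attempt, this works for arbitrary bytes, at the cost of about 33% more payload.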

A good strategy to ensure the integrity of a file

I have some code that downloads a plist from a web server and stores it in the documents directory of the phone. My concern is that if the file becomes corrupt, it will affect the stability and user experience of the app.
I am coding defensively in the parts of the app that read the data, but I wondered what advice is out there for checking the integrity of the file in the first place, before the old one is overwritten. I am thinking of implementing some sort of computed value which is also stored as a key in the plist, for example.
Any thoughts about making this as robust as possible would be greatly appreciated.
Best regards
Dave
Have a look at CommonCrypto/CommonDigest.h.
The CC_MD5(const void *data, CC_LONG len, unsigned char *md); function computes an MD5 hash.
@implementation NSData (MD5)

- (NSString *)md5
{
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5([self bytes], (CC_LONG)[self length], digest);
    NSString *s = [NSString stringWithFormat:@"%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x%02x",
                   digest[0], digest[1],
                   digest[2], digest[3],
                   digest[4], digest[5],
                   digest[6], digest[7],
                   digest[8], digest[9],
                   digest[10], digest[11],
                   digest[12], digest[13],
                   digest[14], digest[15]];
    return s;
}
@end
As part of deploying the files on the server, you can use OpenSSL to compute the hashes. The openssl md5 filename command computes an MD5 hash for a file, and this can be integrated in a script.
Then after your application has downloaded a file, it computes the hash of what's been downloaded and compares it to the hash stored on the server.
Obviously, if you want to ensure the integrity of a plist file, this plist cannot contain its own hash.
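The overall scheme (hash on the server, re-hash after download, compare before overwriting) can be sketched like this in Python; hashlib's MD5 produces the same digest as CC_MD5 and `openssl md5` for the same bytes:

```python
import hashlib

def md5_hex(data: bytes) -> str:
    """Hex MD5 digest, matching CC_MD5 / `openssl md5` output for these bytes."""
    return hashlib.md5(data).hexdigest()

downloaded = b"hello"  # stand-in for the downloaded plist bytes
local_md5 = md5_hex(downloaded)
print(local_md5)  # 5d41402abc4b2a76b9719d911017c592

# server_md5 would be fetched alongside the file (published by the deploy script).
# Only overwrite the previously stored plist when the digests match:
#   if local_md5 == server_md5: replace the stored file, else keep the old one.
```

Note that MD5 is fine for detecting accidental corruption, which is the concern here, but it is not a defence against deliberate tampering.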