How to pass a byte array as a parameter in a URL on iPhone?

I am using the following code to get a byte array, thanks to this post.
NSData *data = [NSData dataWithContentsOfFile:filePath];
NSUInteger len = [data length];
Byte *byteData = (Byte*)malloc(len);
memcpy(byteData, [data bytes], len);
Is this the right way to do it?
Now, how can I pass the byte array in a URL?
Thank you for your help.

You're using the right method to extract the raw bytes from the data. To get those into a URL, you'll need to convert them to a string. Exactly what string depends on the format you're submitting (i.e. it could be just a list of 1s and 0s, or YES and NO, or any other characters required by the server you're talking to).
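As a rough sketch (the URL, the data parameter name, and the choice of hex encoding here are just assumptions, not something your server necessarily expects), you could hex-encode the bytes and drop them into a query string:
NSData *data = [NSData dataWithContentsOfFile:filePath];
const unsigned char *bytes = [data bytes];
// Hex-encode the raw bytes so the result only contains URL-safe characters.
NSMutableString *hex = [NSMutableString stringWithCapacity:[data length] * 2];
for (NSUInteger i = 0; i < [data length]; i++) {
    [hex appendFormat:@"%02x", bytes[i]];
}
// "data" is a made-up parameter name; use whatever your server expects.
NSString *urlString = [NSString stringWithFormat:@"http://example.com/upload?data=%@", hex];
NSURL *url = [NSURL URLWithString:urlString];
For anything but very small payloads, though, sending the bytes in the body of a POST request is usually a better fit than a URL parameter.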

Related

NSString into NSMutableData without conversion to NSData

My problem is the following. I would like to encapsulate an NSString in an NSMutableData object, but I would like to do it together with other items and without first encapsulating it into an NSData. It's only bytes after all, isn't it?
My final NSMutableData object would look something like
[header | stringLength | NSString]
where header is a char and stringLength is an unsigned short.
I build my packet like this
unsigned short stringLength = myString.length;
NSMutableData* nData = [NSMutableData dataWithBytes:(const void*)&header length:sizeof(char)];
[nData appendBytes:(const void*)&stringLength length:sizeof(unsigned short)];
[nData appendBytes:(const void*)myString length:stringLength];
I would then send this over a GKSession, and at the other end I would extract the string length and then the string itself:
NSString* temp = [NSString alloc];
[data getBytes:(void*)&temp range:NSMakeRange(sizeof(char)+sizeof(unsigned short), stringLength)];
For some reason this is giving me a bad memory access. I suspect that myString.length is not doing exactly what I would expect. Do you have any hints? Thanks in advance.
This line is incorrect:
[nData appendBytes:(const void*)myString length:stringLength];
This is encoding the first part of the underlying NSString structure (which is larger than stringLength).
What you mean is:
[nData appendBytes:[myString UTF8String]
            length:[myString lengthOfBytesUsingEncoding:NSUTF8StringEncoding]];
length is the number of characters. This can be substantially smaller than the number of bytes for multibyte characters.
Just as a side note: if you can shorten your length to 1 byte (0-255), then that encoding is called a Pascal String, and CFString can handle that encoding natively (see CFStringGetPascalString()). Not something you generally want to do, but interesting. It's particularly nice with CFStringCreateWithPascalStringNoCopy(), since you can avoid the memory copy operation entirely. This is mostly for legacy support, and I wouldn't jump through any hoops to use it, but sometimes it's handy.
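Putting the pieces together, here is a minimal sketch of packing and unpacking the [header | stringLength | NSString] packet (the header value is made up, and the byte order of the length field is not handled, which matters if the two ends differ in endianness):
// Packing
char header = 0x01;                                  // example header value
NSData *stringData = [myString dataUsingEncoding:NSUTF8StringEncoding];
unsigned short stringLength = (unsigned short)[stringData length];  // byte count, not character count

NSMutableData *packet = [NSMutableData dataWithBytes:&header length:sizeof(char)];
[packet appendBytes:&stringLength length:sizeof(unsigned short)];
[packet appendData:stringData];

// Unpacking on the receiving side
unsigned short receivedLength = 0;
[packet getBytes:&receivedLength range:NSMakeRange(sizeof(char), sizeof(unsigned short))];
NSData *receivedStringData = [packet subdataWithRange:NSMakeRange(sizeof(char) + sizeof(unsigned short), receivedLength)];
NSString *receivedString = [[NSString alloc] initWithData:receivedStringData encoding:NSUTF8StringEncoding];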

Convert NSData to double array in Objective C

I have an audio file in my iPhone app that I convert to an NSData object. Ideally, I would like to get an array of doubles from the audio file. Is there a way to convert NSData to an array of doubles?
Here is the current output from the line NSLog(@"%@\n", data), where data is an NSData object of the audio file:
<0000001c 66747970 6d703432 00000001 6d703431 6d703432 69736f6d 00000008 77696465 004d956e 6d646174 21000340 681c210c 53ed990c 1f33e94d ab588b95 55a61078 08799c67 f1f706cc 595b4eb6 08cfb807 ea0e3c40 03c13303 e674e05a...
The -bytes method returns the raw bytes of the NSData. You can then cast it to a pointer to double (assuming correct endianness).
const double* array_of_doubles = (const double*)[data bytes];
NSUInteger size_of_array = [data length] / sizeof(double);
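For example, assuming the data really does contain raw native-endian doubles, you could then walk the array like this:
// Iterate over the buffer as doubles (only meaningful for raw double data).
for (NSUInteger i = 0; i < size_of_array; i++) {
    NSLog(@"value[%lu] = %f", (unsigned long)i, array_of_doubles[i]);
}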
Edit: The data is an MP4 file. You cannot convert MP4 to a meaningful double array directly. I don't know what you want to do, but maybe AVAssetReader would help.

HmacSHA256 objective-c encryptation

I want to encrypt a string with a key, using HmacSHA256. The code everyone uses is the one below, but there is one thing that doesn't make sense.
Why would we use base64 at the end if all we want is the HmacSHA256 hash?
I tried seeing the hash generated after the method CCHmac is called with
NSString *str = [[NSString alloc] initWithData:HMAC encoding:NSASCIIStringEncoding];
NSLog(#"%#", str);
But I don't get the generated hash; I get null, or garbage, like this:
2011-10-11 09:38:05.082 Hash_HmacSHA256[368:207] (null)
2011-10-11 09:38:05.085 Hash_HmacSHA256[368:207] Rwªb7iså{yyþ§Ù(&oá÷ÛËÚ¥M`f
#import <CommonCrypto/CommonHMAC.h>
NSString *key;
NSString *data;
const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC
length:sizeof(cHMAC)];
NSString *hash = [HMAC base64Encoding]; // This line doesn't make sense
[key release];
[data release];
First of all, for those wondering, this is in reference to my answer to this question: Objective-C sample code for HMAC-SHA1
The HMAC you generate is a 256-bit binary value that may or may not start with a 0 byte.
To be able to print it, you need a string representation (binary, hex, decimal, base64, etc.). Base64 is one of the most efficient among these, that's why I used a Base64 encoding there.
The reason you get garbage is that most (if not all) of the octets in the HMAC value are outside the range of printable ASCII characters. If the first octet is 0 (0x00), you get nil. This is why you need an encoding that supports arbitrary values. ASCII doesn't.
Of course, if you don't want to print the HMAC value, then you may not need such an encoding, and can keep the HMAC as is (binary NSData).
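As a side note, on newer SDKs (iOS 7 / OS X 10.9 and later) you no longer need a third-party category for the Base64 step; a minimal sketch:
// Built-in Base64 support on NSData, so no NSData additions category is required.
NSString *hash = [HMAC base64EncodedStringWithOptions:0];
NSLog(@"%@", hash);   // printable Base64 representation of the 32-byte digest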
I spent a whole day trying to convert the generated hash (bytes) into readable data. I used the base64 encoding you suggested and it didn't work at all for me.
So what I did was this:
CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
// Now convert to NSData structure to make it usable again
NSData *out = [NSData dataWithBytes:cHMAC length:CC_SHA256_DIGEST_LENGTH];
// description converts to hex but puts <> around it and spaces every 4 bytes
NSString *hash = [out description];
hash = [hash stringByReplacingOccurrencesOfString:@" " withString:@""];
hash = [hash stringByReplacingOccurrencesOfString:@"<" withString:@""];
hash = [hash stringByReplacingOccurrencesOfString:@">" withString:@""];
// hash is now a string with just the 64-char hex hash value in it
NSLog(@"%@", hash);
Don't do "[out description]" to get the hash as a string.
Do [out base64Encoding] to get the base64 encoding of it. Use http://cybersam.com/ios-dev/http-basic-access-authentication-with-objective-c-and-ios/attachment/nsdataadditions to get the base64Encoding function. The additions class is a category that adds the base64Encoding method to NSData.
Or you can do [[NSString alloc] initWithData:out encoding:NSUTF8StringEncoding] (though this will usually return nil, since the raw digest bytes are rarely valid UTF-8).
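If what you actually want is the hex string, building it from the digest bytes directly is more robust than stripping characters out of -description; a sketch based on the cHMAC buffer from the code above:
// Hex-encode the raw digest bytes instead of parsing -description output.
NSMutableString *hexHash = [NSMutableString stringWithCapacity:CC_SHA256_DIGEST_LENGTH * 2];
for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++) {
    [hexHash appendFormat:@"%02x", cHMAC[i]];
}
NSLog(@"%@", hexHash);   // 64 hex characters for a 256-bit HMAC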

Problem with hash256 in Objective C

When I use this code to generate a SHA-256 hash in my iPhone app:
unsigned char hashedChars[32];
NSString *inputString;
inputString = [NSString stringWithFormat:@"hello"];
CC_SHA256([inputString UTF8String],
[inputString lengthOfBytesUsingEncoding:NSASCIIStringEncoding ],
hashedChars);
NSData * hashedData = [NSData dataWithBytes:hashedChars length:32];
The SHA-256 of inputString is created correctly, but if I use a string like @"\x00\x25\x53\b4", the hash is different from that of the real string with the "\x" characters.
I think the problem is the UTF-8 encoding instead of ASCII.
Thanks!
I would be suspicious of the first character, "\x00" - that's going to terminate anything that thinks it's dealing with "regular C strings".
Not sure whether lengthOfBytesUsingEncoding: takes that into account, but it's something I'd experiment with.
You're getting the bytes with [inputString UTF8String] but the length with [inputString lengthOfBytesUsingEncoding:NSASCIIStringEncoding]. This is obviously wrong. Moreover (assuming you mean "\xB4" and that it turns into something not in ASCII), "\xB4" is not likely to be in ASCII. The docs for NSString say
Returns 0 if the specified encoding cannot be used to convert the receiver
So you're calculating the hash of the empty string. Of course it's wrong.
You're less likely to have problems if you only generate the data once:
NSData * inputData = [inputString dataUsingEncoding:NSUTF8StringEncoding];
CC_SHA256(inputData.bytes, (CC_LONG)inputData.length, hashedChars);
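To illustrate, a sketch that hashes the exact bytes 0x00 0x25 0x53 0xB4 (assuming that is what the escape sequence was meant to produce), building the input as NSData so the embedded zero byte cannot truncate anything:
#import <CommonCrypto/CommonDigest.h>

// Build the input from raw bytes; no C-string handling, so 0x00 is harmless.
const unsigned char rawBytes[] = { 0x00, 0x25, 0x53, 0xB4 };
NSData *inputData = [NSData dataWithBytes:rawBytes length:sizeof(rawBytes)];

unsigned char hashedChars[CC_SHA256_DIGEST_LENGTH];
CC_SHA256(inputData.bytes, (CC_LONG)inputData.length, hashedChars);
NSData *hashedData = [NSData dataWithBytes:hashedChars length:CC_SHA256_DIGEST_LENGTH];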

iPhone: 3DES Encryption returns "wrong" results?

I have some serious trouble with a CommonCrypto function. There are two existing applications for BlackBerry and Windows Mobile; both use Triple-DES encryption with ECB mode for data exchange. On both, the encrypted results are the same.
Now I want to implement the 3DES encryption in our iPhone application, so I went straight for CommonCrypto:
http://www.opensource.apple.com/source/CommonCrypto/CommonCrypto-32207/CommonCrypto/CommonCryptor.h
I get some results if I use CBC mode, but they do not correspond with the results of Java or C#. Anyway, I want to use ECB mode, but I don't get this working at all - there is a parameter error showing up...
This is my call for the ECB mode... I stripped it a little bit:
const void *vplainText;
plainTextBufferSize = [@"Hello World!" length];
bufferPtrSize = (plainTextBufferSize + kCCBlockSize3DES) & ~(kCCBlockSize3DES - 1);
plainText = (const void *) [@"Hello World!" UTF8String];
NSString *key = @"abcdeabcdeabcdeabcdeabcd";
ccStatus = CCCrypt(kCCEncrypt,
kCCAlgorithm3DES,
kCCOptionECBMode,
key,
kCCKeySize3DES,
nil, // iv, not used with ECB
plainText,
plainTextBufferSize,
(void *)bufferPtr, // output
bufferPtrSize,
&movedBytes);
It is more or less the code from here: http://discussions.apple.com/thread.jspa?messageID=9017515
But as already mentioned, I get a parameter error each time...
When I use kCCOptionPKCS7Padding instead of kCCOptionECBMode and set the same initialization vector in C# and my iPhone code, the iPhone gives me different results. Is there a mistake in how I get my output from bufferPtr? Currently I get the encrypted data this way:
NSData *myData = [NSData dataWithBytes:(const void *)bufferPtr length:(NSUInteger)movedBytes];
result = [[NSString alloc] initWithData:myData encoding:NSISOLatin1StringEncoding];
It seems I've tried almost every setting twice, different encodings and so on... where is my error?
Can you post the error message?
One of the best ways to troubleshoot this stuff, I've found, is to take known input, known key and known output ("test vectors") and compare the bytes of the expected output with the observed output.
What you're doing here is probably not a good way to test the output:
NSData *myData = [NSData dataWithBytes:(const void *)bufferPtr length:(NSUInteger)movedBytes];
result = [[NSString alloc] initWithData:myData encoding:NSISOLatin1StringEncoding];
How do you know the encrypted binary data can be interpreted with the NSISOLatin1StringEncoding encoding?
Instead, compare the bytes directly (via [myData description] or the like) or translate the output with hexadecimal or base64 encoding.
I believe the problem is that kCCOptionECBMode alone is not enough. You also need padding (since it is a block cipher). If you pass both (i.e. kCCOptionPKCS7Padding | kCCOptionECBMode) it will work.
I realise this is an old question, but for reference, I think that your key should not be passed in as an NSString. The key should instead be converted from hexadecimal to a byte array. This hexToBytes NSString extension should provide what you need by doing the following:
[[key hexToBytes] bytes]
The key should also be twice as long as the one given (48 characters of hex, i.e. 24 bytes).
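Pulling the two answers together, here is a hedged sketch of the corrected call: both kCCOptionECBMode and kCCOptionPKCS7Padding are passed, and the key goes in as 24 raw bytes rather than an NSString pointer (whether those bytes should be the ASCII characters, as assumed here, or a hex-decoded value depends on what the BlackBerry/C# sides use):
#import <CommonCrypto/CommonCryptor.h>

// Example only: a 3DES key must be kCCKeySize3DES (24) raw bytes, not an NSString object.
const char *keyBytes = "abcdeabcdeabcdeabcdeabcd";   // 24 ASCII bytes

NSData *plainData = [@"Hello World!" dataUsingEncoding:NSUTF8StringEncoding];
size_t bufferSize = [plainData length] + kCCBlockSize3DES;     // room for PKCS7 padding
NSMutableData *cipherData = [NSMutableData dataWithLength:bufferSize];
size_t movedBytes = 0;

CCCryptorStatus status = CCCrypt(kCCEncrypt,
                                 kCCAlgorithm3DES,
                                 kCCOptionECBMode | kCCOptionPKCS7Padding,  // both flags, per the answer above
                                 keyBytes,
                                 kCCKeySize3DES,
                                 NULL,                 // IV is unused in ECB mode
                                 [plainData bytes],
                                 [plainData length],
                                 [cipherData mutableBytes],
                                 bufferSize,
                                 &movedBytes);
if (status == kCCSuccess) {
    [cipherData setLength:movedBytes];
    NSLog(@"%@", cipherData);   // compare these bytes against the BlackBerry / C# output
}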