Convert Hexadecimal to NSData [duplicate] - iphone

This question already has answers here:
Converting HEX NSString To NSData
(4 answers)
Closed 9 years ago.
I'm getting this hexadecimal string from the web service:
http://www.moodpin.it/cms/listar_avatar?id=9
and I'm trying to convert it to NSData and set it on a UIImageView, but it doesn't work.
Here is my code:
NSData *dataImage = [appDelegate hexStringToData:avatar.avatar];
[imageView setImage:[UIImage imageWithData:dataImage]];
- (NSData *) hexStringToData:(NSString *) aString
{
NSString *command = aString;
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend= [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i=0; i < [command length]/2; i++) {
byte_chars[0] = [command characterAtIndex:i*2];
byte_chars[1] = [command characterAtIndex:i*2+1];
whole_byte = strtol(byte_chars, NULL, 16);
[commandToSend appendBytes:&whole_byte length:1];
}
NSLog(@"%@", commandToSend);
return commandToSend;
}

for (i=0; i < 8; i++) {
"doesn't work" is pretty vague. My first impression is that you're only looking at the first 16 bytes (two bytes per iteration), so you're obviously not going to get a chunk of data that corresponds to the input string. Seems like you should be using the length of the string to manage the for loop. Also, if the string has an odd number of characters, your code will effectively append a 0 character to the hex string.
Aside from whatever is making your code not work, you might also consider a different approach for converting. For example, you could create a 256-character lookup table that's filled with some invalid value (-1) except for the characters that correspond to hex digits, i.e. 0..9, a..f, and A..F. The values in the table at positions corresponding to those characters would have the hex digit value, so 10 at the indices of a and A, for example. That'll only work for specific single-byte character encodings like ASCII, but that may not be a serious restriction in this case. Use the lookup table to quickly find the value of each hex digit, and then use those values as the nibbles to build your bytes.
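A minimal sketch of the lookup-table idea described above (the function name and structure are illustrative, not from the original answer):

#import <Foundation/Foundation.h>
#include <string.h>

// Sketch of the lookup-table approach: table[c] holds the hex value of character c,
// or -1 if c is not a hex digit. ASCII-only, as noted above.
static NSData *dataFromHexString(NSString *hex) {
    static signed char table[256];
    static dispatch_once_t once;
    dispatch_once(&once, ^{
        memset(table, -1, sizeof(table));
        for (int c = '0'; c <= '9'; c++) table[c] = c - '0';
        for (int c = 'a'; c <= 'f'; c++) table[c] = c - 'a' + 10;
        for (int c = 'A'; c <= 'F'; c++) table[c] = c - 'A' + 10;
    });

    const char *chars = [hex cStringUsingEncoding:NSASCIIStringEncoding];
    if (chars == NULL) return nil;              // input wasn't plain ASCII
    size_t len = strlen(chars);
    if (len % 2 != 0) return nil;               // reject odd-length input instead of padding it

    NSMutableData *data = [NSMutableData dataWithCapacity:len / 2];
    for (size_t i = 0; i < len; i += 2) {
        signed char hi = table[(unsigned char)chars[i]];
        signed char lo = table[(unsigned char)chars[i + 1]];
        if (hi < 0 || lo < 0) return nil;       // not a hex digit
        unsigned char byte = (unsigned char)((hi << 4) | lo);
        [data appendBytes:&byte length:1];
    }
    return data;
}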

Related

get ascii code from string in xcode for iphone app

Hey, just a couple of quick noob questions about writing my first iOS app. I've been searching through the questions here, but they all seem to address questions more advanced than mine, so I'm getting confused.
(1) All I want to do is turn a string into an array of integers representing the ASCII code. In other words, I want to convert:
"This is some string. It has spaces, punctuation, AND capitals."
into an array with 62 integers.
(2) How do I get back from the NSArray to a string?
(3) Also, are these expensive operations in terms of memory or computation time? It seems like it might be if we have to create a new variable at every iteration or something.
I know how to declare all the variables, and I'm assuming I run a loop through the length of the string and at each iteration I somehow get the character and convert it into a number with some call to a built-in command.
Thanks for any help you can offer or links to posts that might help!
If you want to store the ASCII values in an NSArray, it is going to be expensive. NSArray can only hold objects, so you're going to have to create an NSNumber for each ASCII value:
unsigned len = [string length];
NSMutableArray *arr = [NSMutableArray arrayWithCapacity:len];
for (unsigned i = 0; i < len; ++i) {
[arr addObject:[NSNumber numberWithUnsignedShort:[string characterAtIndex:i]]];
}
2) To go back to an NSString you'll need to use an NSMutableString and append each character to it.
After saying that I'd suggest you don't use this method if you can avoid it.
A better approach would be to use @EmilioPelaez's answer. To go back from a memory buffer to an NSString is simple and inexpensive compared to iterating and concatting strings.
NSString * stringFromMemory = [[NSString alloc] initWithBytes:buffer length:len encoding: NSASCIIStringEncoding];
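And, for completeness, a rough sketch of the NSArray round trip mentioned in (2), assuming arr holds NSNumbers with the character values (illustrative only):

NSMutableString *rebuilt = [NSMutableString stringWithCapacity:[arr count]];
for (NSNumber *code in arr) {
    unichar c = [code unsignedShortValue];     // the value stored with numberWithUnsignedShort:
    [rebuilt appendFormat:@"%C", c];           // %C formats a single unichar
}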
I ended up using the syntax I found here. Thanks for the help
How to convert ASCII value to a character in Objective-C?
NSString has a method to get the characters in an array:
NSString *string = "This is some string. It has spaces, punctuation, AND capitals.";
unichar *buffer = malloc(sizeof(unichar) * [string lenght]);
[string getCharacters:buffer range:NSMakeRange(0, [string length])];
If you check the definition of unichar, it's an unsigned short.
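One small usage note on the snippet above: the buffer comes from malloc, so remember to free it when you're done reading the values; a sketch:

for (NSUInteger i = 0; i < [string length]; i++) {
    NSLog(@"code at %lu = %u", (unsigned long)i, (unsigned)buffer[i]);  // each unichar as an integer
}
free(buffer);   // balance the earlier malloc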

Objective-C character encoding - Change char to int, and back

Simple task: I need to convert two characters to two numbers, add them together, and change that back to a character.
What I have got (works perfectly in Java, where encoding is handled for you, I guess):
int myChar1 = (int)([myText1 characterAtIndex:i]);
int myChar2 = (int)([myText2 characterAtIndex:keyCurrent]);
int newChar = (myChar1 + myChar2);
//NSLog(@"Int's %d, %d, %d", textChar, keyChar, newChar);
char newC = ((char) newChar);
NSString *tmp1 = [NSString stringWithFormat:@"%c", newC];
NSString *tmp2 = [NSString stringWithFormat:@"%@", newString];
newString = [NSString stringWithFormat:@"%@%@", tmp2, tmp1]; //Adding these chars to a string
The algorithm is perfect, but now I can't figure out how to implement encoding properties. I would like to do everything in UTF-8 but have no idea how to get a char's UTF-8 value, for instance. And if I've got it, how to change that value back to an char.
The NSLog in the code outputs the correct values. But when I try to do the opposite with the algorithm (I.e. - the values) then it goes wrong. It gets the wrong character value for weird/odd characters.
NSString works with unichar characters that are 2 bytes long (16 bits). Char is one byte long so you can only store code point from U+0000 to U+00FF (i.e. Basic Latin and Latin-1 Supplement).
You should do your math on unichar values, then use +[NSString stringWithCharacters:length:] to create the string representation.
But there is still an issue with that solution. Your code may generate code points between U+D800 and U+DFFF that aren't valid Unicode characters. The standard reserves them to encode code points from U+10000 to U+10FFFF in UTF-16 as pairs of 16-bit code units. In such a case, your string would be ill-formed and could neither be displayed nor converted to UTF-8.
Also, the temporary variable tmp2 is useless, and you should not create a new newString as you concatenate the string, but rather use an NSMutableString.
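A rough sketch of that suggestion, assuming both strings are plain BMP text and that you want to index the key the same way as in the question (names here are illustrative):

NSMutableString *result = [NSMutableString string];
for (NSUInteger i = 0; i < [myText1 length]; i++) {
    unichar c1 = [myText1 characterAtIndex:i];
    unichar c2 = [myText2 characterAtIndex:(i % [myText2 length])];   // stand-in for keyCurrent
    unichar sum = c1 + c2;   // caller must keep this out of the U+D800-U+DFFF surrogate range
    [result appendString:[NSString stringWithCharacters:&sum length:1]];
}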
I am assuming that your strings are NSStrings consisting of numerals which represent a number. If that is the case, you could try the following:
Include the following headers:
#include <inttypes.h>
#include <stdlib.h>
#include <stdio.h>
Then use the following code:
// convert NSString to UTF8 string
const char * utf8String1 = [myText1 UTF8String];
const char * utf8String2 = [myText2 UTF8String];
// convert UTF8 string into long integers
long num1 = strtol(utf8String1, NULL, 0);
long num2 = strtol(utf8String2, NULL, 0);
// perform calculations
long calc = num1 - num2;
// convert calculated value back into NSString
NSString * calcText = [[NSString alloc] initWithFormat:@"%li", calc];
// convert calculated value back into UTF8 string
char calcUTF8[64];
snprintf(calcUTF8, 64, "%li", calc);
// log results
NSLog(#"calcText: %#", calcText);
NSLog(#"calcUTF8: %s", calcUTF8);
Not sure if this is what you meant, but from what I understood, you want to create an NSString with the UTF-8 string encoding from a char?
If that's what you want, maybe you can use the initWithCString:encoding: method in NSString.
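For example, something along these lines (a sketch; pick the encoding that the byte actually uses, since a lone byte above 0x7F is not valid UTF-8 on its own):

char bytes[2] = { newC, '\0' };                 // single char as a NUL-terminated C string
NSString *s = [[NSString alloc] initWithCString:bytes
                                       encoding:NSISOLatin1StringEncoding];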

How to get the decimal ascii code of NSString?

With the method:
UniChar ch = (UniChar) [aInput characterAtIndex: i];
The ch value I get is shown as a hexadecimal code; how do I get the decimal ASCII code? Is there a built-in approach?
char ch = [aInput characterAtIndex:i];
NSLog(@"%d", ch);
This gives you the ASCII value.
No, I want to get the 'ch' value, not only to print it out. Do you have some ideas?
You don't need the cast to UniChar; characterAtIndex: already returns the decimal value.
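In other words, something like this is already enough (sketch):

unichar ch = [aInput characterAtIndex:i];   // a plain unsigned 16-bit number
int decimal = (int)ch;                      // same value, just viewed in base 10
NSLog(@"decimal value = %d", decimal);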

HmacSHA256 Objective-C encryption

I want to encrypt a string with a key, using HmacSHA256. The code everyone uses is the one below, but there is one thing that doesn't make sense.
Why would we use Base64 at the end if all we want is the HmacSHA256 hash?
I tried seeing the hash generated after the method CCHmac is called with
NSString *str = [[NSString alloc] initWithData:HMAC encoding:NSASCIIStringEncoding];
NSLog(@"%@", str);
But I don't get the hash generated; I get null, or garbage, like this:
2011-10-11 09:38:05.082 Hash_HmacSHA256[368:207] (null)
2011-10-11 09:38:05.085 Hash_HmacSHA256[368:207] Rwªb7iså{yyþ§Ù(&oá÷ÛËÚ¥M`f
#import <CommonCrypto/CommonHMAC.h>
NSString *key;
NSString *data;
const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC
length:sizeof(cHMAC)];
NSString *hash = [HMAC base64Encoding]; //This line doesn't make sense
[key release];
[data release];
First of all, for those wondering, this is in reference to my answer to this question: Objective-C sample code for HMAC-SHA1
The HMAC you generate is a 256-bit binary value that may or may not start with a 0 byte.
To be able to print it, you need a string representation (binary, hex, decimal, base64, etc.). Base64 is one of the most efficient among these, that's why I used a Base64 encoding there.
The reason you get garbage is that most (if not all) of the octets in the HMAC value are outside the range of printable ASCII characters. If the first octet is 0 (0x00), you get nil. This is why you need an encoding that supports arbitrary values. ASCII doesn't.
Of course, if you don't want to print the HMAC value, then may not need such an encoding, and can keep the HMAC as is (binary NSData).
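If what you actually want is a hex string rather than Base64, a hand-rolled sketch (not part of the original answer) would be:

const unsigned char *hmacBytes = [HMAC bytes];
NSMutableString *hexString = [NSMutableString stringWithCapacity:[HMAC length] * 2];
for (NSUInteger i = 0; i < [HMAC length]; i++) {
    [hexString appendFormat:@"%02x", hmacBytes[i]];   // two lowercase hex digits per byte
}
NSLog(@"%@", hexString);   // 64 hex characters for a SHA-256 HMAC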
I spent a whole day trying to convert the generated hash (bytes) into readable data. I used the Base64 encoding you mentioned and it didn't work at all for me.
So what i did was this:
CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
// Now convert to NSData structure to make it usable again
NSData *out = [NSData dataWithBytes:cHMAC length:CC_SHA256_DIGEST_LENGTH];
// description converts to hex but puts <> around it and spaces every 4 bytes
NSString *hash = [out description];
hash = [hash stringByReplacingOccurrencesOfString:@" " withString:@""];
hash = [hash stringByReplacingOccurrencesOfString:@"<" withString:@""];
hash = [hash stringByReplacingOccurrencesOfString:@">" withString:@""];
// hash is now a string with just the 64-character hex hash value in it
NSLog(@"%@", hash);
Don't do "[out description]" to get the hash as a string.
Do [hash base64Encoding] to get the base64 encoding of it. Use http://cybersam.com/ios-dev/http-basic-access-authentication-with-objective-c-and-ios/attachment/nsdataadditions to get the base64Encoding function. The additions class is a category that will add the function base64Encoding to NSData's implementation.
Or you can do [[NSString alloc]initWithData:out encoding:NSUTF8StringEncoding].

Converting an NSString to and from UTF32

I'm working with a database that includes hex codes for UTF32 characters. I would like to take these characters and store them in an NSString. I need to have routines to convert in both ways.
To convert the first character of an NSString to a unicode value, this routine seems to work:
const unsigned char *cs = (const unsigned char *)
[s cStringUsingEncoding:NSUTF32StringEncoding];
uint32_t code = 0;
for ( int i = 3 ; i >= 0 ; i-- ) {
code <<= 8;
code += cs[i];
}
return code;
However, I am unable to do the reverse (i.e. take a single code and convert it into an NSString). I thought I could just do the reverse of what I do above by simply creating a c-string with the UTF32 character in it with the bytes in the correct order, and then create an NSString from that using the correct encoding.
However, converting to / from cstrings does not seem to be reversible for me.
For example, I've tried this code, and the "tmp" string is not equal to the original string "s".
const char *cs = [s cStringUsingEncoding:NSUTF32StringEncoding];
NSString *tmp = [NSString stringWithCString:cs encoding:NSUTF32StringEncoding];
Does anyone know what I am doing wrong? Should I be using "wchar_t" for the cstring instead of char *?
Any help is greatly appreciated!
Thanks,
Ron
You have a couple of reasonable options.
1. Conversion
The first is to convert your UTF32 to UTF16 and use those with NSString, as UTF16 is the "native" encoding of NSString. It's not actually all that hard. If the UTF32 character is in the BMP (i.e. its high two bytes are 0), you can just cast it to unichar directly. If it's in any other plane, you can convert it to a surrogate pair of UTF16 characters. You can find the rules on the Wikipedia page. But a quick (untested) conversion would look like
UTF32Char inputChar = // my UTF-32 character
inputChar -= 0x10000;
unichar highSurrogate = inputChar >> 10; // leave the top 10 bits
highSurrogate += 0xD800;
unichar lowSurrogate = inputChar & 0x3FF; // leave the low 10 bits
lowSurrogate += 0xDC00;
Now you can create an NSString using both characters at the same time:
NSString *str = [NSString stringWithCharacters:(unichar[]){highSurrogate, lowSurrogate} length:2];
To go backwards, you can use [NSString getCharacters:range:] to get the unichar's back and then reverse the surrogate pair algorithm to get your UTF32 character back (any characters which aren't in the range 0xD800-0xDFFF should just be cast to UTF32 directly).
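A rough, untested sketch of that reverse step, assuming str holds exactly one surrogate pair:

unichar units[2];
[str getCharacters:units range:NSMakeRange(0, 2)];
UTF32Char decoded = ((UTF32Char)(units[0] - 0xD800) << 10)   // high surrogate -> top 10 bits
                  + (units[1] - 0xDC00)                       // low surrogate -> bottom 10 bits
                  + 0x10000;                                  // undo the offset applied earlier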
2. Byte buffers
Your other option is to let NSString do the conversion directly without using cStrings. To convert a UTF32 value into an NSString you can use something like the following:
UTF32Char inputChar = // input UTF32 value
inputChar = NSSwapHostIntToLittle(inputChar); // swap to little-endian if necessary
NSString *str = [[[NSString alloc] initWithBytes:&inputChar length:4 encoding:NSUTF32LittleEndianStringEncoding] autorelease];
To get it back out again, you can use
UTF32Char outputChar;
if ([str getBytes:&outputChar maxLength:4 usedLength:NULL encoding:NSUTF32LittleEndianStringEncoding options:0 range:NSMakeRange(0, 1) remainingRange:NULL]) {
outputChar = NSSwapLittleIntToHost(outputChar); // swap back to host endian
// outputChar now has the first UTF32 character
}
There are two problems here:
1:
The first one is that both [NSString cStringUsingEncoding:] and [NSString getCString:maxLength:encoding:] return the C-string in native-endianness (little) without adding a BOM to it when using NSUTF32StringEncoding and NSUTF16StringEncoding.
The Unicode standard states that: (see, "How I should deal with BOMs")
"If there is no BOM, the text should be interpreted as big-endian."
This is also stated in NSString's documentation: (see, "Interpreting UTF-16-Encoded Data")
"... if the byte order is not otherwise specified, NSString assumes that the UTF-16 characters are big-endian, unless there is a BOM (byte-order mark), in which case the BOM dictates the byte order."
Although they're referring to UTF-16, the same applies to UTF-32.
2:
The second one is that [NSString stringWithCString:encoding:] internally uses CFStringCreateWithCString to create the string from the C-string. The problem with this is that CFStringCreateWithCString only accepts strings using 8-bit encodings. From the documentation: (see, "Parameters" section)
The string must use an 8-bit encoding.
To solve this issue:
Explicitly state the encoding endianness you want to use both ways (NSString -> C-string and C-string -> NSString)
Use [NSString initWithBytes:length:encoding:] when trying to create an NSString from a C-string encoded in UTF-32 or UTF-16.
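Putting those two points together, a sketch of a symmetric round trip (using the little-endian variant on both sides; switch to the big-endian encoding if that's what your data actually uses):

// NSString -> UTF-32 bytes with an explicit byte order (no BOM involved)
NSData *utf32 = [s dataUsingEncoding:NSUTF32LittleEndianStringEncoding];

// UTF-32 bytes -> NSString, specifying the same explicit byte order
NSString *roundTrip = [[NSString alloc] initWithBytes:[utf32 bytes]
                                               length:[utf32 length]
                                             encoding:NSUTF32LittleEndianStringEncoding];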
Hope this helps!