Converting HEX NSString To NSData - iPhone

I'm trying to convert a hex NSString to NSData using the code below. The output I get is:
<00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000>
which is clearly wrong. Any ideas or suggestions on where it's going wrong?
NSString *strData = @"72ff63cea198b3edba8f7e0c23acc345050187a0cde5a9872cbab091ab73e553";
NSLog(@"string Data length is %d", [strData length]);
NSMutableData *commandToSend = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[2];
int i;
for (i = 0; i < [strData length] / 2; i++) {
    byte_chars[0] = [strData characterAtIndex:i*2];
    byte_chars[1] = [strData characterAtIndex:i*2 + 1];
    whole_byte = strtol(byte_chars, NULL, [strData length]);
    [commandToSend appendBytes:&whole_byte length:1];
}
NSLog(@"%@", commandToSend);

Two things are wrong in your code: the radix argument to strtol() must be 16 (you are passing the string length, 64, which is out of range, so strtol returns 0 every time), and byte_chars needs a terminating NUL, so it has to be three chars long. With those fixes:
NSString *command = @"72ff63cea198b3edba8f7e0c23acc345050187a0cde5a9872cbab091ab73e553";
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i = 0; i < [command length] / 2; i++) {
    byte_chars[0] = [command characterAtIndex:i*2];
    byte_chars[1] = [command characterAtIndex:i*2 + 1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [commandToSend appendBytes:&whole_byte length:1];
}
NSLog(@"%@", commandToSend);
This logs the expected bytes: <72ff63ce a198b3ed ba8f7e0c 23acc345 050187a0 cde5a987 2cbab091 ab73e553>.

Here is another method that also handles leading <, trailing > and embedded spaces such as
<9dc69faf a7434ba9 aef57f5c 365d571f 4c3753c4 ae13db42 57d184ca e00246c5>
Code:
+ (NSData *)dataFromHexString:(NSString *)string
{
    string = [string lowercaseString];
    NSMutableData *data = [NSMutableData new];
    unsigned char whole_byte;
    char byte_chars[3] = {'\0','\0','\0'};
    int i = 0;
    int length = (int)string.length;
    while (i < length - 1) {
        char c = [string characterAtIndex:i++];
        if (c < '0' || (c > '9' && c < 'a') || c > 'f')
            continue;
        byte_chars[0] = c;
        byte_chars[1] = [string characterAtIndex:i++];
        whole_byte = strtol(byte_chars, NULL, 16);
        [data appendBytes:&whole_byte length:1];
    }
    return data;
}
This is based on the answer by @Nikunj R. Jadav.
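For reference, a quick usage sketch; HexUtil is just a placeholder name for whatever class declares dataFromHexString::
NSData *data = [HexUtil dataFromHexString:@"<9dc69faf a7434ba9 aef57f5c 365d571f>"];
NSLog(@"%@", data); // the brackets and spaces are ignored, only the hex pairs are decoded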

This might be more useful: Apple has shared an NSData category, NSData+HexString.m. The code is:
@implementation NSData (HexString)

// Not efficient
+ (id)dataWithHexString:(NSString *)hex
{
    char buf[3];
    buf[2] = '\0';
    NSAssert(0 == [hex length] % 2, @"Hex strings should have an even number of digits (%@)", hex);
    unsigned char *bytes = malloc([hex length] / 2);
    unsigned char *bp = bytes;
    for (CFIndex i = 0; i < [hex length]; i += 2) {
        buf[0] = [hex characterAtIndex:i];
        buf[1] = [hex characterAtIndex:i + 1];
        char *b2 = NULL;
        *bp++ = strtol(buf, &b2, 16);
        NSAssert(b2 == buf + 2, @"String should be all hex digits: %@ (bad digit around %ld)", hex, (long)i);
    }
    return [NSData dataWithBytesNoCopy:bytes length:[hex length] / 2 freeWhenDone:YES];
}

@end

I see several solutions have been posted that only handle strings of even length.
Here is a version that also returns correct data for odd-length input, so "DBA" is padded and decoded as "\x0D\xBA":
+ (NSData *)dataFromHexString:(NSString *)string {
    if ([string length] % 2 == 1) {
        string = [@"0" stringByAppendingString:string];
    }
    const char *chars = [string UTF8String];
    int i = 0, len = (int)[string length];
    NSMutableData *data = [NSMutableData dataWithCapacity:len / 2];
    char byteChars[3] = {'\0','\0','\0'};
    unsigned char wholeByte;
    while (i < len) {
        byteChars[0] = chars[i++];
        byteChars[1] = chars[i++];
        // use unsigned char so exactly one byte is appended, independent of endianness
        wholeByte = (unsigned char)strtoul(byteChars, NULL, 16);
        [data appendBytes:&wholeByte length:1];
    }
    return data;
}
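A quick check of the odd-length case (again assuming the method lives in a placeholder utility class named HexUtil):
NSData *padded = [HexUtil dataFromHexString:@"DBA"];
NSLog(@"%@", padded); // <0dba> -- the string is left-padded with '0' before decoding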

Related

How to convert text to MD5 64 bytes format in iPhone

I'm using the following method to convert text to MD5 format.
// requires #import <CommonCrypto/CommonDigest.h>
- (NSString *)MD5
{
    const char *ptr = [txt_Password.text UTF8String];
    unsigned char md5Buffer[CC_MD5_DIGEST_LENGTH];
    CC_MD5(ptr, (CC_LONG)strlen(ptr), md5Buffer);
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
        [output appendFormat:@"%02x", md5Buffer[i]];
    return output;
}
However, it returns the MD5 string as 16 bytes, and I need it in 64 bytes. Any help is appreciated. Thank you!
For MD5 (add this in a category on NSString as well):
+ (NSString *)hashForString:(NSString *)aString {
    NSData *data = [aString dataUsingEncoding:NSUTF8StringEncoding];
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    CC_MD5([data bytes], (CC_LONG)[data length], digest);
    // Build a hex string of the digest; trying to interpret the raw digest bytes
    // as UTF-8 (as in the original snippet) will usually return nil.
    NSMutableString *md5String = [NSMutableString stringWithCapacity:CC_MD5_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_MD5_DIGEST_LENGTH; i++)
        [md5String appendFormat:@"%02x", digest[i]];
    return md5String;
}
For encoding your hashed password to Base64:
- (NSString *)base64MD5HashForString:(NSString *)string {
    NSString *md5Hash = [[[NSString hashForString:string] dataUsingEncoding:NSUTF8StringEncoding] encodeBase64];
    return md5Hash;
}
The method below is a category on NSData:
static const char kBase64Alphabet[64] = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
                                        "abcdefghijklmnopqrstuvwxyz"
                                        "0123456789+/";

- (NSString *)encodeBase64 {
    NSMutableString *encodedData = [NSMutableString string];
    int i = 0, j = 0;
    unsigned char char_array_3[3];
    unsigned char char_array_4[5];
    memset(char_array_3, 0, 3 * sizeof(char));
    memset(char_array_4, 0, 5 * sizeof(char));
    int length = (int)[self length];
    char *bytes = (char *)[self bytes];
    while (length--) {
        char_array_3[i++] = *(bytes++);
        if (i == 3) {
            char_array_4[0] = kBase64Alphabet[(char_array_3[0] & 0xfc) >> 2];
            char_array_4[1] = kBase64Alphabet[((char_array_3[0] & 0x03) << 4) + ((char_array_3[1] & 0xf0) >> 4)];
            char_array_4[2] = kBase64Alphabet[((char_array_3[1] & 0x0f) << 2) + ((char_array_3[2] & 0xc0) >> 6)];
            char_array_4[3] = kBase64Alphabet[char_array_3[2] & 0x3f];
            [encodedData appendString:[NSString stringWithUTF8String:(const char *)char_array_4]];
            i = 0;
        }
    }
    if (i) {
        for (j = i; j < 3; j++)
            char_array_3[j] = '\0';
        char_array_4[0] = kBase64Alphabet[(char_array_3[0] & 0xfc) >> 2];
        char_array_4[1] = kBase64Alphabet[((char_array_3[0] & 0x03) << 4) + ((char_array_3[1] & 0xf0) >> 4)];
        char_array_4[2] = kBase64Alphabet[((char_array_3[1] & 0x0f) << 2) + ((char_array_3[2] & 0xc0) >> 6)];
        char_array_4[3] = kBase64Alphabet[char_array_3[2] & 0x3f];
        char_array_4[i + 1] = 0;
        [encodedData appendString:[NSString stringWithUTF8String:(const char *)char_array_4]];
        while (i++ < 3)
            [encodedData appendString:[NSString stringWithUTF8String:"="]];
    }
    return encodedData;
}
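On iOS 7 and later you don't need a hand-rolled encoder: NSData has Base64 support built in. A minimal sketch of the same hash-then-Base64 flow, assuming the hashForString: category method above (the variable names are mine):
NSString *hex = [NSString hashForString:@"secret"];
NSData *hexData = [hex dataUsingEncoding:NSUTF8StringEncoding];
NSString *base64 = [hexData base64EncodedStringWithOptions:0];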

How to show image from the Hex string [iphone]?

I have such Hex code 89504e470d0a1a0a0000000d49484452000001000000010008060000005c72a8
This hex string is, I know, the data of an image, and I want to show that image in a UIImageView. When I convert the hex into NSData and try to turn that data into an image, nothing is displayed.
Can anybody tell me how to do this?
// This is the first part, i.e. image to hex conversion
UIImage *img = [UIImage imageNamed:@"image.png"];
NSData *data1 = UIImagePNGRepresentation(img);
// NSLog(@"data is %@", data1);
const unsigned *tokenBytes = [data1 bytes];
NSString *hexToken = [NSString stringWithFormat:@"%08x%08x%08x%08x%08x%08x%08x%08x",
                      ntohl(tokenBytes[0]), ntohl(tokenBytes[1]), ntohl(tokenBytes[2]),
                      ntohl(tokenBytes[3]), ntohl(tokenBytes[4]), ntohl(tokenBytes[5]),
                      ntohl(tokenBytes[6]), ntohl(tokenBytes[7])];
NSLog(@"hexToken is %@", hexToken);
If I understand your question correctly, UIImage *image = [UIImage imageWithData:data]; is what you need.
Your NSData-to-hex conversion is also incorrect: you're only taking the first 32 bytes (eight 32-bit words) as the hex token, not the whole image. Use the following code instead:
- (NSString *)stringWithHexBytes:(NSData *)data {
    static const char hexdigits[] = "0123456789ABCDEF";
    const size_t numBytes = [data length];
    const unsigned char *bytes = [data bytes];
    char *strbuf = (char *)malloc(numBytes * 2 + 1);
    char *hex = strbuf;
    NSString *hexBytes = nil;
    for (int i = 0; i < numBytes; ++i) {
        const unsigned char c = *bytes++;
        *hex++ = hexdigits[(c >> 4) & 0xF];
        *hex++ = hexdigits[c & 0xF];
    }
    *hex = 0;
    hexBytes = [NSString stringWithUTF8String:strbuf];
    free(strbuf);
    return hexBytes;
}
Converting the hex string back to NSData:
- (NSData *)hexStringToData:(NSString *)hexString
{
    unsigned char whole_byte;
    NSMutableData *returnData = [[NSMutableData alloc] init];
    char byte_chars[3] = {'\0','\0','\0'};
    int i;
    // iterate over the whole string, not just the first 8 bytes
    for (i = 0; i < [hexString length] / 2; i++) {
        byte_chars[0] = [hexString characterAtIndex:i*2];
        byte_chars[1] = [hexString characterAtIndex:i*2 + 1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [returnData appendBytes:&whole_byte length:1];
    }
    return [returnData autorelease];
}
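Putting it together for the original question, a minimal sketch; hexString is assumed to hold the complete hex dump of the PNG, imageView is a placeholder for your UIImageView outlet, and both helper methods above are assumed to be available on self:
NSData *imageData = [self hexStringToData:hexString];
UIImage *image = [UIImage imageWithData:imageData];
imageView.image = image;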

-[NSCFDictionary length]: unrecognized selector

I have the following problem with this code:
NSDictionary *imagen = [[NSDictionary alloc] initWithDictionary:[envio resultValue]];
NSString *imagenS = [imagen valueForKey:@"/Result"];
[Base64 initialize];
NSData *imagenDecode = [Base64 decode:imagenS];
NSLog(@"%@", [imagenS length]);
// SAVE IMAGE
NSArray *sysPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDirectory = [sysPaths objectAtIndex:0];
NSString *filePath = [NSString stringWithFormat:@"%@david.png", docDirectory];
[imagenDecode writeToFile:filePath atomically:YES];
[envio resultValue] returns an NSDictionary containing one image in Base64 encoding.
I want to decode and save this image, but my console shows this message:
2011-08-23 20:15:36.539 WSStub[39226:a0f] -[NSCFDictionary length]: unrecognized selector sent to instance 0xd00ee0
The Base64 code is:
//
//  Base64.m
//  CryptTest
//
//  Created by Kiichi Takeuchi on 4/20/10.
//  Copyright 2010 ObjectGraph LLC. All rights reserved.
//

#import "Base64.h"

@implementation Base64

#define ArrayLength(x) (sizeof(x)/sizeof(*(x)))

static char encodingTable[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
static char decodingTable[128];

+ (void)initialize
{
    if (self == [Base64 class])
    {
        memset(decodingTable, 0, ArrayLength(decodingTable));
        for (NSInteger i = 0; i < ArrayLength(encodingTable); i++) {
            decodingTable[encodingTable[i]] = i;
        }
    }
}

+ (NSData *)decode:(const char *)string length:(NSInteger)inputLength
{
    if ((string == NULL) || (inputLength % 4 != 0))
    {
        return nil;
    }
    while (inputLength > 0 && string[inputLength - 1] == '=')
    {
        inputLength--;
    }
    NSInteger outputLength = inputLength * 3 / 4;
    NSMutableData *data = [NSMutableData dataWithLength:outputLength];
    uint8_t *output = data.mutableBytes;
    NSInteger inputPoint = 0;
    NSInteger outputPoint = 0;
    while (inputPoint < inputLength)
    {
        char i0 = string[inputPoint++];
        char i1 = string[inputPoint++];
        char i2 = inputPoint < inputLength ? string[inputPoint++] : 'A'; /* 'A' will decode to \0 */
        char i3 = inputPoint < inputLength ? string[inputPoint++] : 'A';
        output[outputPoint++] = (decodingTable[i0] << 2) | (decodingTable[i1] >> 4);
        if (outputPoint < outputLength)
        {
            output[outputPoint++] = ((decodingTable[i1] & 0xf) << 4) | (decodingTable[i2] >> 2);
        }
        if (outputPoint < outputLength)
        {
            output[outputPoint++] = ((decodingTable[i2] & 0x3) << 6) | decodingTable[i3];
        }
    }
    return data;
}

+ (NSData *)decode:(NSString *)string
{
    return [self decode:[string cStringUsingEncoding:NSASCIIStringEncoding] length:[string length]];
}

@end
The line
NSString *imagenS = [imagen valueForKey:@"/Result"];
is returning a dictionary, not a string. You'll have to inspect your data source to determine whether this is correct or not.
Your NSLog() call is wrong. To show a length, it should be:
NSLog(@"%lu", (unsigned long)[imagenS length]);
But that is probably not the problem.
You seem to be invoking length on an NSDictionary; in the code shown, that happens in the NSLog call. It could be that imagenS is not an NSString, but an NSDictionary instead.
Try:
NSLog(@"%@", [imagenS class]);
and see what is displayed. It will probably tell you that imagenS is not a string, but an NSCFDictionary or similar instead.
FWIW, NSCFDictionary is one of the private concrete subclasses of NSDictionary that implement the behavior of the public class. This is called a class cluster.
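A defensive sketch of that check, under the assumption that the value for "/Result" may come back as either a string or a nested dictionary:
id result = [imagen valueForKey:@"/Result"];
if ([result isKindOfClass:[NSString class]]) {
    NSData *imagenDecode = [Base64 decode:(NSString *)result];
    NSLog(@"decoded %lu bytes", (unsigned long)[imagenDecode length]);
} else {
    NSLog(@"unexpected type: %@", [result class]); // dig one level deeper into the dictionary here
}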

iPhone SDK: How can I convert a string to SHA-1, and SHA-1 to Base64? (for WS-Security)

The conversion code below is all I have, but it converts a string to SHA-1 in hex format. How can I convert the SHA-1 digest to Base64?
- (NSString *)digest:(NSString *)input
{
    int i;
    const char *cstr = [input cStringUsingEncoding:NSUTF8StringEncoding];
    NSData *data = [NSData dataWithBytes:cstr length:strlen(cstr)];
    uint8_t digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *output = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (i = 0; i < CC_SHA1_DIGEST_LENGTH; i++)
    {
        [output appendFormat:@"%02x", digest[i]];
    }
    return output;
}
There's no built-in Base64 in the older iOS SDKs, so you'll have to roll that part yourself; hashing itself is covered by CommonCrypto, which you're already using. As for Base64, I've got an encoder for you:
NSString *ToBase64(NSData *d)
{
    static const char ALPHABET[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    static NSString * const PAD1 = @"=", * const PAD2 = @"==";
    int l = (int)[d length];
    unsigned char *p = (unsigned char *)[d bytes];
    unichar Chunk[4];
    NSMutableString *s = [NSMutableString stringWithCapacity:(l + 3) * 4 / 3];
    int i;
    int mod = l % 3;
    int ll = l - mod;
    unsigned int triad;
    NSString *sChunk;
    for (i = 0; i < ll; i += 3)
    {
        triad = (p[i] << 16) | (p[i+1] << 8) | p[i+2];
        Chunk[0] = ALPHABET[(triad >> 18) & 0x3f];
        Chunk[1] = ALPHABET[(triad >> 12) & 0x3f];
        Chunk[2] = ALPHABET[(triad >> 6) & 0x3f];
        Chunk[3] = ALPHABET[triad & 0x3f];
        sChunk = [[NSString alloc] initWithCharacters:Chunk length:4];
        [s appendString:sChunk];
        [sChunk release];
    }
    if (mod == 1)
    {
        Chunk[0] = ALPHABET[(p[i] >> 2) & 0x3f];
        Chunk[1] = ALPHABET[(p[i] << 4) & 0x3f];
        sChunk = [[NSString alloc] initWithCharacters:Chunk length:2];
        [s appendString:sChunk];
        [sChunk release];
        [s appendString:PAD2];
    }
    if (mod == 2)
    {
        triad = (p[i] << 8) | p[i+1];
        Chunk[0] = ALPHABET[(triad >> 10) & 0x3f];
        Chunk[1] = ALPHABET[(triad >> 4) & 0x3f];
        Chunk[2] = ALPHABET[(triad << 2) & 0x3f];
        sChunk = [[NSString alloc] initWithCharacters:Chunk length:3];
        [s appendString:sChunk];
        [sChunk release];
        [s appendString:PAD1];
    }
    return s;
}
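To answer the original question end to end, a minimal sketch that hashes a string with CommonCrypto (already imported above) and Base64-encodes the raw digest using the ToBase64() helper; the variable names here are mine:
const char *cstr = [input UTF8String];   // input is assumed to be the NSString to hash
uint8_t digest[CC_SHA1_DIGEST_LENGTH];
CC_SHA1(cstr, (CC_LONG)strlen(cstr), digest);
NSData *digestData = [NSData dataWithBytes:digest length:CC_SHA1_DIGEST_LENGTH];
NSString *base64Digest = ToBase64(digestData);   // 28-character Base64 string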

How to convert an NSString to hex values

I'd like to convert a regular NSString into an NSString with the (what I assume are) ASCII hex values and back.
I need to produce the same output that the Java methods below do, but I can't seem to find a way to do it in Objective-C. I've found some examples in C and C++ but I've had a hard time working them into my code.
Here are the Java methods I'm trying to reproduce:
/**
 * Encodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to encode.
 * @return The encoded string.
 */
public static String utf8HexEncode(String s) {
    if (s == null) {
        return null;
    }
    byte[] utf8;
    try {
        utf8 = s.getBytes(ENCODING_UTF8);
    } catch (UnsupportedEncodingException x) {
        throw new RuntimeException(x);
    }
    return String.valueOf(Hex.encodeHex(utf8));
}

/**
 * Decodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to decode.
 * @return The decoded string.
 * @throws Exception If an error occurs.
 */
public static String utf8HexDecode(String s) throws Exception {
    if (s == null) {
        return null;
    }
    return new String(Hex.decodeHex(s.toCharArray()), ENCODING_UTF8);
}
Update: Thanks to drawnonward's answer, here's the method I wrote to create the hex NSStrings. It gives me an "Initialization discards qualifiers from pointer target type" warning on the char declaration line, but it works.
- (NSString *)stringToHex:(NSString *)string
{
    char *utf8 = [string UTF8String];
    NSMutableString *hex = [NSMutableString string];
    while ( *utf8 ) [hex appendFormat:@"%02X", *utf8++ & 0x00FF];
    return [NSString stringWithFormat:@"%@", hex];
}
Haven't had time to write the decoding method yet. When I do, I'll edit this to post it for anyone else interested.
Update 2: The method I posted above doesn't actually output what I'm looking for: instead of producing hex values in 0-f format, it was outputting all numbers. I finally got back to working on this problem and was able to write a category on NSString that exactly duplicates the Java methods I posted. Here it is:
//
//  NSString+hex.h
//  Created by Ben Baron on 10/20/10.
//

@interface NSString (hex)
+ (NSString *)stringFromHex:(NSString *)str;
+ (NSString *)stringToHex:(NSString *)str;
@end

//
//  NSString+hex.m
//  Created by Ben Baron on 10/20/10.
//

#import "NSString+hex.h"

@implementation NSString (hex)

+ (NSString *)stringFromHex:(NSString *)str
{
    NSMutableData *stringData = [[[NSMutableData alloc] init] autorelease];
    unsigned char whole_byte;
    char byte_chars[3] = {'\0','\0','\0'};
    int i;
    for (i = 0; i < [str length] / 2; i++) {
        byte_chars[0] = [str characterAtIndex:i*2];
        byte_chars[1] = [str characterAtIndex:i*2 + 1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [stringData appendBytes:&whole_byte length:1];
    }
    return [[[NSString alloc] initWithData:stringData encoding:NSASCIIStringEncoding] autorelease];
}

+ (NSString *)stringToHex:(NSString *)str
{
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++)
    {
        [hexString appendString:[NSString stringWithFormat:@"%x", chars[i]]];
    }
    free(chars);
    return [hexString autorelease];
}

@end
A short way to scan an NSString of hex digits into an integer value (here used for a color):
NSMutableString *tempHex = [[NSMutableString alloc] init];
[tempHex appendString:@"0xD2D2D2"];
unsigned colorInt = 0;
[[NSScanner scannerWithString:tempHex] scanHexInt:&colorInt];
lblAttString.backgroundColor = UIColorFromRGB(colorInt);
The macro used for this code is:
#define UIColorFromRGB(rgbValue) \
    [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
                    green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
                     blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]
For these lines of Java
utf8 = s.getBytes(ENCODING_UTF8);
new String(decodedHexString, ENCODING_UTF8);
the Objective-C equivalents would be
const char *utf8 = [s UTF8String];
[NSString stringWithUTF8String:decodedHexString];
To make an NSString with the hexadecimal representation of a character string:
NSMutableString *hex = [NSMutableString string];
while ( *utf8 ) [hex appendFormat:@"%02X", *utf8++ & 0x00FF];
You will have to make your own decodeHex function. Just pull two characters out of the string and, if they are valid hex digits, add a byte to the result.
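A minimal sketch of such a decodeHex function; the method name and its placement in an NSString category are my own choices, not part of the original answer:
- (NSData *)decodeHex
{
    NSMutableData *result = [NSMutableData data];
    char byte_chars[3] = {'\0', '\0', '\0'};
    for (NSUInteger i = 0; i + 1 < [self length]; i += 2) {
        byte_chars[0] = (char)[self characterAtIndex:i];
        byte_chars[1] = (char)[self characterAtIndex:i + 1];
        // skip pairs that are not valid hex digits (isxdigit comes from <ctype.h>)
        if (!isxdigit(byte_chars[0]) || !isxdigit(byte_chars[1]))
            continue;
        unsigned char b = (unsigned char)strtol(byte_chars, NULL, 16);
        [result appendBytes:&b length:1];
    }
    return result;
}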
There is a problem with your stringToHex method - it drops leading 0s, and ignores 00s. Just as a quick fix, I made the below:
+ (NSString *)stringToHex:(NSString *)str
{
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++)
    {
        // [hexString appendString:[NSString stringWithFormat:@"%02x", chars[i]]]; /* previous version */
        [hexString appendFormat:@"%02x", chars[i]]; /* EDITED PER COMMENT BELOW */
    }
    free(chars);
    return [hexString autorelease];
}
Thanks to all who contributed on this thread. It was a great help to me. Since things have moved on a little since the original post, here's my updated implementation for iOS 6. I went with the categories approach, but chose to split the load between NSData and NSString. Comments welcomed.
First, the NSString half, which handles decoding a hex encoded string into an NSData object.
@implementation NSString (StringToHexData)

//
// Decodes an NSString containing hex encoded bytes into an NSData object
//
- (NSData *)stringToHexData
{
    int len = (int)[self length] / 2;    // Target length
    unsigned char *buf = malloc(len);
    unsigned char *whole_byte = buf;
    char byte_chars[3] = {'\0','\0','\0'};
    int i;
    for (i = 0; i < [self length] / 2; i++) {
        byte_chars[0] = [self characterAtIndex:i*2];
        byte_chars[1] = [self characterAtIndex:i*2 + 1];
        *whole_byte = strtol(byte_chars, NULL, 16);
        whole_byte++;
    }
    NSData *data = [NSData dataWithBytes:buf length:len];
    free(buf);
    return data;
}

@end
The changes were mostly for efficiency's sake: some simple old-fashioned pointer arithmetic means I could allocate the whole buffer in one go, and populate it byte by byte. Then the whole thing is passed to NSData in one go.
The encoding part, in NSData, looks like this:
@implementation NSData (DataToHexString)

- (NSString *)dataToHexString
{
    NSUInteger len = [self length];
    char *chars = (char *)[self bytes];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++)
        [hexString appendString:[NSString stringWithFormat:@"%0.2hhx", chars[i]]];
    return hexString;
}

@end
Again, some minor changes, though I suspect no efficiency gains here. The use of "%0.2hhx" solved all the problems of missing leading zeros and ensures that only a single byte is output at a time.
Hope this helps the next person taking this on!
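A quick round-trip sketch using both categories above (the string and variable names are mine):
NSString *hex = @"72ff63ce";
NSData *decoded = [hex stringToHexData];          // logs as <72ff63ce>
NSString *reencoded = [decoded dataToHexString];  // @"72ff63ce"
NSLog(@"%@ -> %@ -> %@", hex, decoded, reencoded);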
One possible solution:
+ (NSString *)hexFromStr:(NSString *)str
{
    NSData *nsData = [str dataUsingEncoding:NSUTF8StringEncoding];
    // use unsigned char so bytes >= 0x80 are not sign-extended by %02X
    const unsigned char *data = [nsData bytes];
    NSUInteger len = nsData.length;
    NSMutableString *hex = [NSMutableString string];
    for (int i = 0; i < len; ++i) [hex appendFormat:@"%02X", data[i]];
    return hex;
}
So, first off, I would like to thank drawnonward for his answer. This gave me the first function, mean and clean. In the same spirit, I wrote the other one. Hope you like it.
// _value is assumed to be an unsigned char buffer of size CONSTANT + 1
@synthesize value = _value;

- (NSString *)hexString
{
    _value[CONSTANT] = '\0';
    unsigned char *ptr = _value;
    NSMutableString *hex = [[NSMutableString alloc] init];
    while ( *ptr ) [hex appendFormat:@"%02x", *ptr++ & 0x00FF];
    return [hex autorelease];
}

- (void)setHexString:(NSString *)hexString
{
    _value[CONSTANT] = '\0';
    unsigned char *ptr = _value;
    for (const char *src = [hexString cStringUsingEncoding:NSASCIIStringEncoding];
         *src;
         src += 2)
    {
        unsigned int hexByte;
        /* int res = */ sscanf(src, "%02x", &hexByte);
        *ptr++ = (unsigned char)(hexByte & 0x00FF);
    }
    *ptr = '\0';
}
My input was a base-10 digit string, and the output should be the hex representation as a string. Examples:
@"10" -> @"A"
@"1128" -> @"468"
@"1833828235" -> @"6D4DFF8B"
Implementation:
+ (NSString *)stringToHex:(NSString *)str {
    NSInteger result = [str integerValue];
    NSString *hexStr = (result) ? @"" : @"0";
    while (result != 0) {
        NSInteger remainder = result % 16;
        if (remainder >= 0 && remainder <= 9) {
            hexStr = [[NSString stringWithFormat:@"%ld", (long)remainder] stringByAppendingString:hexStr];
        } else if (remainder == 10) {
            hexStr = [@"A" stringByAppendingString:hexStr];
        } else if (remainder == 11) {
            hexStr = [@"B" stringByAppendingString:hexStr];
        } else if (remainder == 12) {
            hexStr = [@"C" stringByAppendingString:hexStr];
        } else if (remainder == 13) {
            hexStr = [@"D" stringByAppendingString:hexStr];
        } else if (remainder == 14) {
            hexStr = [@"E" stringByAppendingString:hexStr];
        } else {
            hexStr = [@"F" stringByAppendingString:hexStr];
        }
        result /= 16;
    }
    return hexStr;
}
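For non-negative values that fit in a long long, a format specifier gives the same result in one line (a sketch, not part of the answer above):
NSString *hexStr = [NSString stringWithFormat:@"%llX", (unsigned long long)[str longLongValue]];
// @"1833828235" -> @"6D4DFF8B"; note that @"0" still produces @"0"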
Perhaps you should use -dataUsingEncoding: on NSString to encode and -initWithData:encoding: to decode. It depends on where you are getting the data from.