Unrecognized selector woes - iPhone

I'm puzzled... I have this function "colorWithHexString". When I include it in the view controller that's calling it, it works fine. But when I move it to a separate "BSjax" class and call it with the same input parameter, it throws an unrecognized selector error. Here's the call:
BSjax *bsjax = [BSjax new];
NSString *hexString = [NSString stringWithString:@"CCCCFF"];
[self.view setBackgroundColor:[bsjax colorWithHexString:hexString]];
I'm pretty sure there's something about the way I'm calling the function that prevents it from working as a bsjax method. Any feedback will be appreciated.
BSjax.h includes:
+ (UIColor *)colorWithHexString:(NSString *)stringToConvert;
... and BSjax.m includes:
+ (UIColor *)colorWithHexString:(NSString *)stringToConvert
{
    NSString *cString = [[stringToConvert stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]] uppercaseString];

    // String should be 6 or 8 characters
    if ([cString length] < 6) NSLog(@"colorWithHexString called with parameter < 6 characters in length");

    // Strip 0X if it appears
    if ([cString hasPrefix:@"0X"]) cString = [cString substringFromIndex:2];
    if ([cString length] != 6) NSLog(@"colorWithHexString called with parameter != 6 characters in length");

    // Separate into r, g, b substrings
    NSRange range;
    range.location = 0;
    range.length = 2;
    NSString *rString = [cString substringWithRange:range];
    range.location = 2;
    NSString *gString = [cString substringWithRange:range];
    range.location = 4;
    NSString *bString = [cString substringWithRange:range];

    // Scan values
    unsigned int r, g, b;
    [[NSScanner scannerWithString:rString] scanHexInt:&r];
    [[NSScanner scannerWithString:gString] scanHexInt:&g];
    [[NSScanner scannerWithString:bString] scanHexInt:&b];

    return [UIColor colorWithRed:((float) r / 255.0f)
                           green:((float) g / 255.0f)
                            blue:((float) b / 255.0f)
                           alpha:1.0f];
}

You are trying to call a class method on an instance.
Notice the +:
+ (UIColor *)colorWithHexString:(NSString *)stringToConvert;
It means you can only call the method as [ClassName classMethod].
But here you are calling the method on an instance, [instanceObject classMethod]:
BSjax *bsjax = [BSjax new];
[self.view setBackgroundColor:[bsjax colorWithHexString:hexString]];
Try changing it to:
[self.view setBackgroundColor:[BSjax colorWithHexString:hexString]];
And that should set you straight.
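Alternatively, if you do want to call it on an instance, a sketch of the change (you would update both BSjax.h and BSjax.m to use - instead of +):

- (UIColor *)colorWithHexString:(NSString *)stringToConvert;

With the instance-method declaration, your original [bsjax colorWithHexString:hexString] call works as written.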

Is colorWithHexString declared in @interface BSjax in a header, and did you #import that header into the source file where the error is reported?
Edit:
+ (UIColor *)colorWithHexString:(NSString *)stringToConvert;
The above code (the +) declares a class method, meaning it should be called with the class name. You are calling it with an instance of the class, for which it is not defined. Try:
[self.view setBackgroundColor:[BSjax colorWithHexString:hexString]];


How do I convert a hex color in an NSString to three separate rgb ints in Objective C?

I may be making something incredibly simple incredibly complicated, but nothing I've tried so far seems to work.
I have NSStrings like @"BD8F60" and I would like to turn them into ints like: r = 189, g = 143, b = 96.
I've found ways to convert hex values that are already ints into RGB ints, but I'm stuck on how to turn an NSString containing letters into an int where the letters have been converted to their numerical counterparts. I apologize in advance if this is incredibly basic; I'm still learning this stuff at an incredibly basic level.
You need to parse the NSString and interpret the hex values.
You can do this in multiple ways, one of which is using an NSScanner:
NSScanner *scanner = [NSScanner scannerWithString:@"BD8F60"];
unsigned int hex;
if ([scanner scanHexInt:&hex]) {
    // Parsing successful. We have an int representing the 0xBD8F60 value
    int r = (hex >> 16) & 0xFF; // get the first byte
    int g = (hex >> 8)  & 0xFF; // get the middle byte
    int b =  hex        & 0xFF; // get the last byte
} else {
    NSLog(@"Parsing error: no hex value found in string");
}
There are some other possibilities, like splitting the string into three parts and scanning the values separately (instead of doing bitwise shifting and masking), but the idea remains the same.
Note: as the scanHexInt: documentation explains, this also works if your string is prefixed with 0x, like @"0xBD8F60". It does not automatically work with strings prefixed by a hash, like @"#BD8F60"; use a substring in this case.
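For example, a minimal sketch of stripping a hash prefix before scanning:

NSString *hexString = @"#BD8F60";
if ([hexString hasPrefix:@"#"])
    hexString = [hexString substringFromIndex:1]; // NSScanner does not skip the hash itself
unsigned int hex = 0;
[[NSScanner scannerWithString:hexString] scanHexInt:&hex];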
This method turns the given hex-string into a UIColor:
- (UIColor *)colorWithHexString:(NSString *)stringToConvert {
    NSScanner *scanner = [NSScanner scannerWithString:stringToConvert];
    unsigned hex;
    if (![scanner scanHexInt:&hex]) return nil;
    int r = (hex >> 16) & 0xFF;
    int g = (hex >> 8)  & 0xFF;
    int b =  hex        & 0xFF;
    return [UIColor colorWithRed:r / 255.0f
                           green:g / 255.0f
                            blue:b / 255.0f
                           alpha:1.0f];
}
Here is a category on UIColor that also deals with alpha values (rrggbbaa) and short forms such as rgb or rgba.
Use it like:
UIColor *color = [UIColor colorFromHexString:@"#998997FF"]; // #RRGGBBAA
or
UIColor *color = [UIColor colorFromHexString:@"998997FF"];  // RRGGBBAA
or
UIColor *color = [UIColor colorFromHexString:@"0x998997FF"]; // 0xRRGGBBAA
or
UIColor *color = [UIColor colorFromHexString:@"#999"];  // #RGB  -> #RRGGBB
or
UIColor *color = [UIColor colorFromHexString:@"#9acd"]; // #RGBA -> #RRGGBBAA
@implementation UIColor (Creation)

+ (UIColor *)_colorFromHex:(NSUInteger)hexInt
{
    int r, g, b, a;
    r = (hexInt >> 24) & 0xFF;
    g = (hexInt >> 16) & 0xFF;
    b = (hexInt >> 8)  & 0xFF;
    a =  hexInt        & 0xFF;
    return [UIColor colorWithRed:r / 255.0f
                           green:g / 255.0f
                            blue:b / 255.0f
                           alpha:a / 255.0f];
}

+ (UIColor *)colorFromHexString:(NSString *)hexString
{
    hexString = [hexString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceCharacterSet]];
    if ([hexString hasPrefix:@"#"])
        hexString = [hexString substringFromIndex:1];
    else if ([hexString hasPrefix:@"0x"])
        hexString = [hexString substringFromIndex:2];

    NSUInteger l = [hexString length];
    if ((l != 3) && (l != 4) && (l != 6) && (l != 8))
        return nil;

    // Expand the short forms (rgb, rgba) by doubling each character
    if ([hexString length] > 2 && [hexString length] < 5) {
        NSMutableString *newHexString = [[NSMutableString alloc] initWithCapacity:[hexString length] * 2];
        [hexString enumerateSubstringsInRange:NSMakeRange(0, [hexString length])
                                      options:NSStringEnumerationByComposedCharacterSequences
                                   usingBlock:^(NSString *substring,
                                                NSRange substringRange,
                                                NSRange enclosingRange,
                                                BOOL *stop)
        {
            [newHexString appendFormat:@"%@%@", substring, substring];
        }];
        hexString = newHexString;
    }

    // No alpha given: default to ff (fully opaque)
    if ([hexString length] == 6)
        hexString = [hexString stringByAppendingString:@"ff"];

    NSScanner *scanner = [NSScanner scannerWithString:hexString];
    unsigned hexNum;
    if (![scanner scanHexInt:&hexNum])
        return nil;
    return [self _colorFromHex:hexNum];
}

@end
To use it in Swift, add #include "UIColor+Creation.h" to the bridging header, then call:
UIColor(fromHexString: "62AF3C")

-[NSCFDictionary length]: unrecognized selector

I have a problem with this code:
NSDictionary *imagen = [[NSDictionary alloc] initWithDictionary:[envio resultValue]];
NSString *imagenS = [imagen valueForKey:@"/Result"];

[Base64 initialize];
NSData *imagenDecode = [Base64 decode:imagenS];

NSLog(@"%@", [imagenS length]);

// SAVE IMAGE
NSArray *sysPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docDirectory = [sysPaths objectAtIndex:0];
NSString *filePath = [NSString stringWithFormat:@"%@david.png", docDirectory];
[imagenDecode writeToFile:filePath atomically:YES];
[envio resultValue] --> returns an NSDictionary with one image in Base64 encoding.
I want to decode and save this image, but my console shows this message:
2011-08-23 20:15:36.539 WSStub[39226:a0f] -[NSCFDictionary length]: unrecognized selector sent to instance 0xd00ee0
The Base 64 code is:
//
//  Base64.m
//  CryptTest
//
//  Created by Kiichi Takeuchi on 4/20/10.
//  Copyright 2010 ObjectGraph LLC. All rights reserved.
//
#import "Base64.h"

@implementation Base64

#define ArrayLength(x) (sizeof(x)/sizeof(*(x)))

static char encodingTable[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
static char decodingTable[128];

+ (void)initialize
{
    if (self == [Base64 class])
    {
        memset(decodingTable, 0, ArrayLength(decodingTable));
        for (NSInteger i = 0; i < ArrayLength(encodingTable); i++) {
            decodingTable[encodingTable[i]] = i;
        }
    }
}

+ (NSData *)decode:(const char *)string length:(NSInteger)inputLength
{
    if ((string == NULL) || (inputLength % 4 != 0))
    {
        return nil;
    }

    while (inputLength > 0 && string[inputLength - 1] == '=')
    {
        inputLength--;
    }

    NSInteger outputLength = inputLength * 3 / 4;
    NSMutableData *data = [NSMutableData dataWithLength:outputLength];
    uint8_t *output = data.mutableBytes;

    NSInteger inputPoint = 0;
    NSInteger outputPoint = 0;
    while (inputPoint < inputLength)
    {
        char i0 = string[inputPoint++];
        char i1 = string[inputPoint++];
        char i2 = inputPoint < inputLength ? string[inputPoint++] : 'A'; /* 'A' will decode to \0 */
        char i3 = inputPoint < inputLength ? string[inputPoint++] : 'A';

        output[outputPoint++] = (decodingTable[i0] << 2) | (decodingTable[i1] >> 4);
        if (outputPoint < outputLength)
        {
            output[outputPoint++] = ((decodingTable[i1] & 0xf) << 4) | (decodingTable[i2] >> 2);
        }
        if (outputPoint < outputLength)
        {
            output[outputPoint++] = ((decodingTable[i2] & 0x3) << 6) | decodingTable[i3];
        }
    }

    return data;
}

+ (NSData *)decode:(NSString *)string
{
    return [self decode:[string cStringUsingEncoding:NSASCIIStringEncoding] length:[string length]];
}

@end
The line
NSString *imagenS = [imagen valueForKey:@"/Result"];
is returning a dictionary, not a string. You'll have to inspect your data source to determine whether this is correct or not.
Your NSLog() call is wrong. To show a length, it should be:
NSLog(#"%lu", [imagenS length]);
But that is probably not the problem.
You seem to be invoking length on an NSDictionary. It's hard to tell where you do that, since you don't show the piece of code where it happens. It could be that imagenS is not an NSString but an NSDictionary instead.
Try:
NSLog(@"%@", [imagenS class]);
and see what is displayed. It will probably tell you that imagenS is not a string but an NSCFDictionary or similar.
FWIW, NSCFDictionary is one of the private subclasses of NSDictionary that actually implement the main class's behavior. This pattern is called a class cluster.
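A defensive sketch of that check, assuming the value under @"/Result" may be either type:

id result = [imagen valueForKey:@"/Result"];
if ([result isKindOfClass:[NSString class]]) {
    NSData *imagenDecode = [Base64 decode:(NSString *)result];
    // ... proceed with saving the image
} else {
    NSLog(@"Expected an NSString but got %@", [result class]);
}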

Truncate a string and add ellipsis at the end in Objective-c

How do you truncate a string in Objective-C and add an ellipsis at the end?
NSString *origString = @"A very long string blah blah blah";
const int clipLength = 18;
if ([origString length] > clipLength)
{
    origString = [NSString stringWithFormat:@"%@...", [origString substringToIndex:clipLength]];
}
Use one of these NSString methods to truncate, probably the last:
– substringFromIndex:
– substringWithRange:
– substringToIndex:
and then use the NSString method
– stringByAppendingString:
to add @"..." or whatever ellipsis you like.
For example:
NSString *newString = [[string substringToIndex:12] stringByAppendingString:@"..."];
For your reading pleasure, I recommend the NSString Class Reference.
In case you wish to truncate and add ellipsis to a string with the maximum being a specific width, here is an implementation that takes into account font and size:
+ (NSString *)stringByTruncatingString:(NSString *)string toWidth:(CGFloat)width withFont:(UIFont *)font
{
    #define ellipsis @"..."
    NSMutableString *truncatedString = [string mutableCopy];
    if ([string sizeWithAttributes:@{NSFontAttributeName: font}].width > width) {
        width -= [ellipsis sizeWithAttributes:@{NSFontAttributeName: font}].width;
        NSRange range = {truncatedString.length - 1, 1};
        while ([truncatedString sizeWithAttributes:@{NSFontAttributeName: font}].width > width) {
            [truncatedString deleteCharactersInRange:range];
            range.location--;
        }
        [truncatedString replaceCharactersInRange:range withString:ellipsis];
    }
    return truncatedString;
}
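Usage might look like this; the class name MyStringUtils is a placeholder for wherever you put the method:

NSString *fitted = [MyStringUtils stringByTruncatingString:label.text
                                                   toWidth:120.0f
                                                  withFont:label.font];
label.text = fitted;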
You don't need a chunk of code to do this; the easiest way is a truncating line break mode.
For drawRect:
- (void)drawRect:(NSRect)dirtyRect {
    NSString *theText = @"bla blah bla bhla bla bla";
    NSMutableParagraphStyle *style = [[NSParagraphStyle defaultParagraphStyle] mutableCopy];
    [style setLineBreakMode:NSLineBreakByTruncatingTail];
    [theText drawInRect:dirtyRect withAttributes:[NSDictionary dictionaryWithObjectsAndKeys:style, NSParagraphStyleAttributeName, nil]];
}
Here I use dirtyRect as the string's drawing area; you can change it as you wish.
For NSTextField:
NSTextField *_warningTF = [[NSTextField alloc] init];
[_warningTF setStringValue:@"sfdsf sdfdsfdsfdsfdsfdsfdsf 1234566789123456789sfdsf dsf dsfdsf"];
[_warningTF.cell setLineBreakMode:NSLineBreakByTruncatingTail];
I wrote a simple category to truncate an NSString by words:
@interface NSString (TFDString)
- (NSString *)truncateByWordWithLimit:(NSInteger)limit;
@end

@implementation NSString (TFDString)

- (NSString *)truncateByWordWithLimit:(NSInteger)limit {
    NSRange r = NSMakeRange(0, self.length);
    while (r.length > limit) {
        NSRange r0 = [self rangeOfString:@" " options:NSBackwardsSearch range:r];
        if (!r0.length) break;
        r = NSMakeRange(0, r0.location);
    }
    if (r.length == self.length) return self;
    return [[self substringWithRange:r] stringByAppendingString:@"..."];
}

@end
Usage:
NSString *xx = @"This string is too long, somebody just need to take and truncate it, but by word, please.";
xx = [xx truncateByWordWithLimit:50];
Result:
This string is too long, somebody just need to...
Hope it helps somebody.
The drawWithRect:options:attributes:context: method helps. You can try this:
[_text drawWithRect:_textRect options:NSStringDrawingUsesLineFragmentOrigin | NSStringDrawingTruncatesLastVisibleLine attributes:attributes context:nil];
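For context, a sketch of the attributes dictionary this call expects; the specific font and the truncating paragraph style are assumptions:

NSMutableParagraphStyle *style = [[NSParagraphStyle defaultParagraphStyle] mutableCopy];
style.lineBreakMode = NSLineBreakByTruncatingTail;
NSDictionary *attributes = @{ NSFontAttributeName: [UIFont systemFontOfSize:14.0f],
                              NSParagraphStyleAttributeName: style };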

HEX to UIColorfromRGB

I'm following a tutorial here
It's fairly straightforward and simple, only two steps. But on the last step, I have the hex code in a UITextField as hexText.text; how do I put that into UIColorFromRGB?
Here is a solution that avoids the macro stuff. You can add it as a category on UIColor and use it more cleanly.
Sorin has the right idea here, I think. It's much more Cocoa-like and will result in fewer headaches. To answer your question at a high level: you'd need to convert your string to a hexadecimal number and then pass the resulting value into the macro. I think you'd be better served just passing the string value into the category listed in Sorin's link.
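Putting that together, a minimal sketch, assuming hexText.text holds something like @"CCCCFF" and UIColorFromRGB is the macro from the tutorial:

unsigned int hexInt = 0;
[[NSScanner scannerWithString:hexText.text] scanHexInt:&hexInt];
self.view.backgroundColor = UIColorFromRGB(hexInt);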
This handles every case:
+ (UIColor *)colorWithHexString:(NSString *)stringToConvert
{
    NSString *cString = [[stringToConvert stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]] uppercaseString];
    unsigned int r, g, b, alpha = 255; // default to fully opaque when no alpha digits are given
    NSRange range;
    range.location = 0;
    range.length = 2;

    // String should be 6 or 8 characters
    if ([cString length] < 6) return [UIColor blackColor];

    // Strip 0X or # if it appears
    if ([cString hasPrefix:@"0X"]) cString = [cString substringFromIndex:2];
    if ([cString hasPrefix:@"#"]) cString = [cString substringFromIndex:1];

    // Leading alpha pair for 8-character strings (AARRGGBB)
    if ([cString length] == 8) {
        [[NSScanner scannerWithString:[cString substringWithRange:range]] scanHexInt:&alpha];
        cString = [cString substringFromIndex:2];
    }
    if ([cString length] != 6) return [UIColor blackColor];

    // Separate into r, g, b substrings
    NSString *rString = [cString substringWithRange:range];
    range.location = 2;
    NSString *gString = [cString substringWithRange:range];
    range.location = 4;
    NSString *bString = [cString substringWithRange:range];

    // Scan values
    [[NSScanner scannerWithString:rString] scanHexInt:&r];
    [[NSScanner scannerWithString:gString] scanHexInt:&g];
    [[NSScanner scannerWithString:bString] scanHexInt:&b];

    return [UIColor colorWithRed:((float) r / 255.0f)
                           green:((float) g / 255.0f)
                            blue:((float) b / 255.0f)
                           alpha:((float) alpha / 255.0f)];
}

How to convert an NSString to hex values

I'd like to convert a regular NSString into an NSString with the (what I assume are) ASCII hex values and back.
I need to produce the same output that the Java methods below do, but I can't seem to find a way to do it in Objective-C. I've found some examples in C and C++ but I've had a hard time working them into my code.
Here are the Java methods I'm trying to reproduce:
/**
 * Encodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to encode.
 * @return The encoded string.
 */
public static String utf8HexEncode(String s) {
    if (s == null) {
        return null;
    }
    byte[] utf8;
    try {
        utf8 = s.getBytes(ENCODING_UTF8);
    } catch (UnsupportedEncodingException x) {
        throw new RuntimeException(x);
    }
    return String.valueOf(Hex.encodeHex(utf8));
}

/**
 * Decodes the given string by using the hexadecimal representation of its UTF-8 bytes.
 *
 * @param s The string to decode.
 * @return The decoded string.
 * @throws Exception If an error occurs.
 */
public static String utf8HexDecode(String s) throws Exception {
    if (s == null) {
        return null;
    }
    return new String(Hex.decodeHex(s.toCharArray()), ENCODING_UTF8);
}
Update: Thanks to drawnonward's answer, here's the method I wrote to create the hex NSStrings. It gives me an "Initialization discards qualifiers from pointer target type" warning on the char declaration line, but it works.
- (NSString *)stringToHex:(NSString *)string
{
    char *utf8 = [string UTF8String];
    NSMutableString *hex = [NSMutableString string];
    while (*utf8) [hex appendFormat:@"%02X", *utf8++ & 0x00FF];
    return [NSString stringWithFormat:@"%@", hex];
}
Haven't had time to write the decoding method yet. When I do, I'll edit this to post it for anyone else interested.
Update 2: The method I posted above actually doesn't output what I'm looking for: instead of outputting hex values in 0-f format, it was outputting all numbers. I finally got back to working on this problem and was able to write a category for NSString that exactly duplicates the Java methods I posted. Here it is:
//
//  NSString+hex.h
//  Created by Ben Baron on 10/20/10.
//
@interface NSString (hex)
+ (NSString *)stringFromHex:(NSString *)str;
+ (NSString *)stringToHex:(NSString *)str;
@end

//
//  NSString+hex.m
//  Created by Ben Baron on 10/20/10.
//
#import "NSString+hex.h"

@implementation NSString (hex)

+ (NSString *)stringFromHex:(NSString *)str
{
    NSMutableData *stringData = [[[NSMutableData alloc] init] autorelease];
    unsigned char whole_byte;
    char byte_chars[3] = {'\0','\0','\0'};
    int i;
    for (i = 0; i < [str length] / 2; i++) {
        byte_chars[0] = [str characterAtIndex:i*2];
        byte_chars[1] = [str characterAtIndex:i*2+1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [stringData appendBytes:&whole_byte length:1];
    }
    return [[[NSString alloc] initWithData:stringData encoding:NSASCIIStringEncoding] autorelease];
}

+ (NSString *)stringToHex:(NSString *)str
{
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];

    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++)
    {
        [hexString appendString:[NSString stringWithFormat:@"%x", chars[i]]];
    }
    free(chars);

    return [hexString autorelease];
}

@end
A short and simple way to convert an NSString to a hexadecimal value:
NSMutableString *tempHex = [[NSMutableString alloc] init];
[tempHex appendString:@"0xD2D2D2"];
unsigned colorInt = 0;
[[NSScanner scannerWithString:tempHex] scanHexInt:&colorInt];
lblAttString.backgroundColor = UIColorFromRGB(colorInt);
The macro used for this code is:
#define UIColorFromRGB(rgbValue) \
    [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
                    green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
                     blue:((float)(rgbValue & 0xFF))/255.0 \
                    alpha:1.0]
For these lines of Java
utf8 = s.getBytes(ENCODING_UTF8);
new String(decodedHexString, ENCODING_UTF8);
Objective-C equivalents would be
utf8 = [s UTF8String];
[[NSString alloc] initWithUTF8String:decodedHexString];
To make an NSString with the hexadecimal representation of a character string:
NSMutableString *hex = [NSMutableString string];
while (*utf8) [hex appendFormat:@"%02X", *utf8++ & 0x00FF];
You will have to make your own decodeHex function. Just pull two characters out of the string and, if they are valid, add a byte to the result.
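A minimal sketch of such a decode, assuming hex is the encoded NSString and contains only valid hex pairs:

NSMutableData *decoded = [NSMutableData data];
for (NSUInteger i = 0; i + 1 < [hex length]; i += 2) {
    char pair[3] = { (char)[hex characterAtIndex:i], (char)[hex characterAtIndex:i + 1], '\0' };
    unsigned char byte = (unsigned char)strtol(pair, NULL, 16);
    [decoded appendBytes:&byte length:1];
}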
There is a problem with your stringToHex method - it drops leading 0s, and ignores 00s. Just as a quick fix, I made the below:
+ (NSString *)stringToHex:(NSString *)str
{
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];

    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++)
    {
        // [hexString appendString:[NSString stringWithFormat:@"%02x", chars[i]]]; /* previous input */
        [hexString appendFormat:@"%02x", chars[i]]; /* EDITED PER COMMENT BELOW */
    }
    free(chars);

    return [hexString autorelease];
}
Thanks to all who contributed on this thread. It was a great help to me. Since things have moved on a little since the original post, here's my updated implementation for iOS 6. I went with the categories approach, but chose to split the load between NSData and NSString. Comments welcomed.
First, the NSString half, which handles decoding a hex encoded string into an NSData object.
@implementation NSString (StringToHexData)

//
// Decodes an NSString containing hex encoded bytes into an NSData object
//
- (NSData *)stringToHexData
{
    int len = [self length] / 2; // Target length
    unsigned char *buf = malloc(len);
    unsigned char *whole_byte = buf;
    char byte_chars[3] = {'\0','\0','\0'};

    int i;
    for (i = 0; i < [self length] / 2; i++) {
        byte_chars[0] = [self characterAtIndex:i*2];
        byte_chars[1] = [self characterAtIndex:i*2+1];
        *whole_byte = strtol(byte_chars, NULL, 16);
        whole_byte++;
    }

    NSData *data = [NSData dataWithBytes:buf length:len];
    free(buf);
    return data;
}

@end
The changes were mostly for efficiency's sake: some simple old-fashioned pointer arithmetic means I could allocate the whole buffer in one go, and populate it byte by byte. Then the whole thing is passed to NSData in one go.
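For example, assuming the hex string decodes to UTF-8 text:

NSData *bytes = [@"48656c6c6f" stringToHexData];
NSString *text = [[NSString alloc] initWithData:bytes encoding:NSUTF8StringEncoding];
NSLog(@"%@", text); // Hello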
The encoding part, in NSData, looks like this:
@implementation NSData (DataToHexString)

- (NSString *)dataToHexString
{
    NSUInteger len = [self length];
    char *chars = (char *)[self bytes];
    NSMutableString *hexString = [[NSMutableString alloc] init];

    for (NSUInteger i = 0; i < len; i++)
        [hexString appendString:[NSString stringWithFormat:@"%0.2hhx", chars[i]]];

    return hexString;
}

@end
Again, some minor changes, though I suspect no efficiency gains here. The use of "%0.2hhx" solved all the problems of missing leading zeros and ensures that only a single byte is output at a time.
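A quick round-trip check of the pair, assuming ARC:

NSData *data = [@"Hello" dataUsingEncoding:NSUTF8StringEncoding];
NSLog(@"%@", [data dataToHexString]); // 48656c6c6f
NSLog(@"%@", [[NSString alloc] initWithData:[@"48656c6c6f" stringToHexData]
                                   encoding:NSUTF8StringEncoding]); // Hello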
Hope this helps the next person taking this on!
One possible solution:
+ (NSString *)hexFromStr:(NSString *)str
{
    NSData *nsData = [str dataUsingEncoding:NSUTF8StringEncoding];
    const unsigned char *data = [nsData bytes]; // unsigned, so bytes above 0x7F don't sign-extend
    NSUInteger len = nsData.length;
    NSMutableString *hex = [NSMutableString string];
    for (NSUInteger i = 0; i < len; ++i) [hex appendFormat:@"%02X", data[i]];
    return hex;
}
So, first off, I would like to thank drawnonward for his answer. This gave me the first function, mean and clean. In the same spirit, I wrote the other one. Hope you like it.
// Assumes a backing buffer declared elsewhere, e.g.: unsigned char _value[CONSTANT + 1];

- (NSString *)hexString
{
    _value[CONSTANT] = '\0';
    unsigned char *ptr = _value;
    NSMutableString *hex = [[NSMutableString alloc] init];
    while (*ptr) [hex appendFormat:@"%02x", *ptr++ & 0x00FF];
    return [hex autorelease];
}

- (void)setHexString:(NSString *)hexString
{
    _value[CONSTANT] = '\0';
    unsigned char *ptr = _value;
    for (const char *src = [hexString cStringUsingEncoding:NSASCIIStringEncoding];
         *src;
         src += 2)
    {
        unsigned int hexByte;
        /* int res = */ sscanf(src, "%02x", &hexByte);
        *ptr++ = (unsigned char)(hexByte & 0x00FF);
    }
    *ptr = '\0';
}
My input was a base-10 digit string, and the output should be the hex representation in string format. Examples:
#"10" -> #"A"
#"1128" -> #"468"
#"1833828235" -> #"6D4DFF8B"
Implementation:
+ (NSString *)stringToHex:(NSString *)str {
    NSInteger result = [str integerValue];
    NSString *hexStr = (result) ? @"" : @"0";
    while (result != 0) {
        NSInteger remainder = result % 16;
        if (remainder >= 0 && remainder <= 9) {
            hexStr = [[NSString stringWithFormat:@"%ld", (long)remainder] stringByAppendingString:hexStr];
        } else if (remainder == 10) {
            hexStr = [@"A" stringByAppendingString:hexStr];
        } else if (remainder == 11) {
            hexStr = [@"B" stringByAppendingString:hexStr];
        } else if (remainder == 12) {
            hexStr = [@"C" stringByAppendingString:hexStr];
        } else if (remainder == 13) {
            hexStr = [@"D" stringByAppendingString:hexStr];
        } else if (remainder == 14) {
            hexStr = [@"E" stringByAppendingString:hexStr];
        } else {
            hexStr = [@"F" stringByAppendingString:hexStr];
        }
        result /= 16;
    }
    return hexStr;
}
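For non-negative inputs, the same conversion can be done with a format specifier; a one-line sketch:

NSString *hexStr = [NSString stringWithFormat:@"%lX", (long)[str integerValue]];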
Perhaps you should use NSString's dataUsingEncoding: to encode and initWithData:encoding: to decode. It depends on where you are getting the data from.
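A sketch of that round trip, where s stands in for the original string and decodedData for the NSData recovered from the hex (both placeholders):

NSData *utf8Data = [s dataUsingEncoding:NSUTF8StringEncoding]; // bytes to hex-encode
// ... hex-encode utf8Data, store or transmit it, then hex-decode back into decodedData ...
NSString *restored = [[NSString alloc] initWithData:decodedData
                                           encoding:NSUTF8StringEncoding];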