I am looking for a nice Cocoa way to serialize an NSData object into a hexadecimal string. The idea is to serialize the deviceToken used for notifications before sending it to my server.
I have the following implementation, but I am thinking there must be some shorter and nicer way to do it.
+ (NSString*) serializeDeviceToken:(NSData*) deviceToken
{
NSMutableString *str = [NSMutableString stringWithCapacity:64];
int length = [deviceToken length];
char *bytes = malloc(sizeof(char) * length);
[deviceToken getBytes:bytes length:length];
for (int i = 0; i < length; i++)
{
[str appendFormat:@"%02.2hhX", bytes[i]];
}
free(bytes);
return str;
}
This is a category applied to NSData that I wrote. It returns a hexadecimal NSString representing the NSData, where the data can be any length. Returns an empty string if NSData is empty.
NSData+Conversion.h
#import <Foundation/Foundation.h>
@interface NSData (NSData_Conversion)
#pragma mark - String Conversion
- (NSString *)hexadecimalString;
@end
NSData+Conversion.m
#import "NSData+Conversion.h"
@implementation NSData (NSData_Conversion)
#pragma mark - String Conversion
- (NSString *)hexadecimalString {
/* Returns hexadecimal string of NSData. Empty string if data is empty. */
const unsigned char *dataBuffer = (const unsigned char *)[self bytes];
if (!dataBuffer)
return [NSString string];
NSUInteger dataLength = [self length];
NSMutableString *hexString = [NSMutableString stringWithCapacity:(dataLength * 2)];
for (NSUInteger i = 0; i < dataLength; ++i)
[hexString appendString:[NSString stringWithFormat:@"%02lx", (unsigned long)dataBuffer[i]]];
return [NSString stringWithString:hexString];
}
@end
Usage:
NSData *someData = ...;
NSString *someDataHexadecimalString = [someData hexadecimalString];
This is "probably" better than calling [someData description] and then stripping the spaces, <'s, and >'s. Stripping characters just feels too "hacky". Plus you never know if Apple will change the formatting of NSData's -description in the future.
NOTE: I have had people reach out to me about licensing for the code in this answer. I hereby dedicate my copyright in the code I posted in this answer to the public domain.
Here's a highly optimized NSData category method for generating a hex string. While @Dave Gallagher's answer is sufficient for relatively small data, memory and CPU performance deteriorate for large amounts of data. I profiled this with a 2MB file on my iPhone 5: the time comparison was 0.05 vs 12 seconds, and the memory footprint is negligible with this method, while the other method grew the heap to 70MB!
- (NSString *) hexString
{
NSUInteger bytesCount = self.length;
if (bytesCount) {
const char *hexChars = "0123456789ABCDEF";
const unsigned char *dataBuffer = self.bytes;
char *chars = malloc(sizeof(char) * (bytesCount * 2 + 1));
if (chars == NULL) {
// malloc returns null if attempting to allocate more memory than the system can provide. Thanks Cœur
[NSException raise:NSInternalInconsistencyException format:@"Failed to allocate more memory"];
return nil;
}
char *s = chars;
for (unsigned i = 0; i < bytesCount; ++i) {
*s++ = hexChars[((*dataBuffer & 0xF0) >> 4)];
*s++ = hexChars[(*dataBuffer & 0x0F)];
dataBuffer++;
}
*s = '\0';
NSString *hexString = [NSString stringWithUTF8String:chars];
free(chars);
return hexString;
}
return @"";
}
Using the description property of NSData should not be considered an acceptable mechanism for hex-encoding the data. That property is for description only and can change at any time. As a note, pre-iOS, the NSData description property didn't even return its data in hex form.
Sorry for harping on this, but it is worth taking the time to serialize the data properly instead of piggybacking on an API that is meant for something other than data serialization.
@implementation NSData (Hex)
- (NSString*)hexString
{
NSUInteger length = self.length;
unichar* hexChars = (unichar*)malloc(sizeof(unichar) * (length*2));
unsigned char* bytes = (unsigned char*)self.bytes;
for (NSUInteger i = 0; i < length; i++) {
unichar c = bytes[i] / 16;
if (c < 10) {
c += '0';
} else {
c += 'A' - 10;
}
hexChars[i*2] = c;
c = bytes[i] % 16;
if (c < 10) {
c += '0';
} else {
c += 'A' - 10;
}
hexChars[i*2+1] = c;
}
NSString* retVal = [[NSString alloc] initWithCharactersNoCopy:hexChars length:length*2 freeWhenDone:YES];
return [retVal autorelease];
}
@end
Here is a faster way to do the conversion:
Benchmark (mean time for a 1024-byte data conversion, repeated 100 times):
Dave Gallagher : ~8.070 ms
NSProgrammer : ~0.077 ms
Peter : ~0.031 ms
This One : ~0.017 ms
@implementation NSData (BytesExtras)
static char _NSData_BytesConversionString_[512] = "000102030405060708090a0b0c0d0e0f101112131415161718191a1b1c1d1e1f202122232425262728292a2b2c2d2e2f303132333435363738393a3b3c3d3e3f404142434445464748494a4b4c4d4e4f505152535455565758595a5b5c5d5e5f606162636465666768696a6b6c6d6e6f707172737475767778797a7b7c7d7e7f808182838485868788898a8b8c8d8e8f909192939495969798999a9b9c9d9e9fa0a1a2a3a4a5a6a7a8a9aaabacadaeafb0b1b2b3b4b5b6b7b8b9babbbcbdbebfc0c1c2c3c4c5c6c7c8c9cacbcccdcecfd0d1d2d3d4d5d6d7d8d9dadbdcdddedfe0e1e2e3e4e5e6e7e8e9eaebecedeeeff0f1f2f3f4f5f6f7f8f9fafbfcfdfeff";
-(NSString*)bytesString
{
UInt16* mapping = (UInt16*)_NSData_BytesConversionString_;
register UInt16 len = self.length;
char* hexChars = (char*)malloc( sizeof(char) * (len*2) );
// --- Coeur's contribution - a safe way to check the allocation
if (hexChars == NULL) {
// we directly raise an exception instead of using NSAssert to make sure assertion is not disabled as this is irrecoverable
[NSException raise:@"NSInternalInconsistencyException" format:@"failed malloc"];
return nil;
}
// ---
register UInt16* dst = ((UInt16*)hexChars) + len-1;
register unsigned char* src = (unsigned char*)self.bytes + len-1;
while (len--) *dst-- = mapping[*src--];
NSString* retVal = [[NSString alloc] initWithBytesNoCopy:hexChars length:self.length*2 encoding:NSASCIIStringEncoding freeWhenDone:YES];
#if (!__has_feature(objc_arc))
return [retVal autorelease];
#else
return retVal;
#endif
}
@end
Functional Swift version
One liner:
let hexString = UnsafeBufferPointer<UInt8>(start: UnsafePointer(data.bytes),
count: data.length).map { String(format: "%02x", $0) }.joinWithSeparator("")
Here's in a reusable and self documenting extension form:
extension NSData {
func base16EncodedString(uppercase uppercase: Bool = false) -> String {
let buffer = UnsafeBufferPointer<UInt8>(start: UnsafePointer(self.bytes),
count: self.length)
let hexFormat = uppercase ? "X" : "x"
let formatString = "%02\(hexFormat)"
let bytesAsHexStrings = buffer.map {
String(format: formatString, $0)
}
return bytesAsHexStrings.joinWithSeparator("")
}
}
Alternatively, use reduce("", combine: +) instead of joinWithSeparator("") to be seen as a functional master by your peers.
Edit: I changed String($0, radix: 16) to String(format: "%02x", $0), because single-digit numbers need a leading padding zero.
Peter's answer ported to Swift
func hexString(data:NSData)->String{
if data.length > 0 {
let hexChars = Array("0123456789abcdef".utf8) as [UInt8];
let buf = UnsafeBufferPointer<UInt8>(start: UnsafePointer(data.bytes), count: data.length);
var output = [UInt8](count: data.length*2 + 1, repeatedValue: 0);
var ix:Int = 0;
for b in buf {
let hi = Int((b & 0xf0) >> 4);
let low = Int(b & 0x0f);
output[ix++] = hexChars[ hi];
output[ix++] = hexChars[low];
}
let result = String.fromCString(UnsafePointer(output))!;
return result;
}
return "";
}
Swift 3
func hexString()->String{
if count > 0 {
let hexChars = Array("0123456789abcdef".utf8) as [UInt8];
return withUnsafeBytes({ (bytes:UnsafePointer<UInt8>) -> String in
let buf = UnsafeBufferPointer<UInt8>(start: bytes, count: self.count);
var output = [UInt8](repeating: 0, count: self.count*2 + 1);
var ix:Int = 0;
for b in buf {
let hi = Int((b & 0xf0) >> 4);
let low = Int(b & 0x0f);
output[ix] = hexChars[ hi];
ix += 1;
output[ix] = hexChars[low];
ix += 1;
}
return String(cString: UnsafePointer(output));
})
}
return "";
}
Swift 5
func hexString()->String{
if count > 0 {
let hexChars = Array("0123456789abcdef".utf8) as [UInt8];
return withUnsafeBytes{ bytes->String in
var output = [UInt8](repeating: 0, count: bytes.count*2 + 1);
var ix:Int = 0;
for b in bytes {
let hi = Int((b & 0xf0) >> 4);
let low = Int(b & 0x0f);
output[ix] = hexChars[ hi];
ix += 1;
output[ix] = hexChars[low];
ix += 1;
}
return String(cString: UnsafePointer(output));
}
}
return "";
}
I needed to solve this problem and found the answers here very useful, but I worry about performance. Most of these answers involve copying the data in bulk out of NSData so I wrote the following to do the conversion with low overhead:
@interface NSData (HexString)
@end
@implementation NSData (HexString)
- (NSString *)hexString {
NSMutableString *string = [NSMutableString stringWithCapacity:self.length * 3];
[self enumerateByteRangesUsingBlock:^(const void *bytes, NSRange byteRange, BOOL *stop){
for (NSUInteger offset = 0; offset < byteRange.length; ++offset) {
uint8_t byte = ((const uint8_t *)bytes)[offset];
if (string.length == 0)
[string appendFormat:@"%02X", byte];
else
[string appendFormat:@" %02X", byte];
}
}];
return string;
}
@end
This pre-allocates space in the string for the entire result and avoids ever copying the NSData contents out by using enumerateByteRangesUsingBlock. Changing the X to an x in the format string will use lowercase hex digits. If you don't want a separator between the bytes you can reduce the statement
if (string.length == 0)
[string appendFormat:@"%02X", byte];
else
[string appendFormat:@" %02X", byte];
down to just
[string appendFormat:@"%02X", byte];
I needed an answer that would work for variable length strings, so here's what I did:
+ (NSString *)stringWithHexFromData:(NSData *)data
{
NSString *result = [[data description] stringByReplacingOccurrencesOfString:@" " withString:@""];
result = [result substringWithRange:NSMakeRange(1, [result length] - 2)];
return result;
}
Works great as an extension for the NSString class.
You can always use [yourString uppercaseString] to capitalize letters in data description
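For example (someData is a placeholder):
NSData *someData = ...;
NSString *upperHex = [[someData description] uppercaseString];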
A better way to serialize/deserialize NSData into NSString is to use the Google Toolbox for Mac Base64 encoder/decoder. Just drag the files GTMBase64.m, GTMBase64.h and GTMDefines.h from the package Foundation into your app project and then do something like:
/**
* Serialize NSData to Base64 encoded NSString
*/
-(void) serialize:(NSData*)data {
self.encodedData = [GTMBase64 stringByEncodingData:data];
}
/**
* Deserialize Base64 NSString to NSData
*/
-(NSData*) deserialize {
return [GTMBase64 decodeString:self.encodedData];
}
Here is a solution using Swift 3
extension Data {
public var hexadecimalString : String {
var str = ""
enumerateBytes { buffer, index, stop in
for byte in buffer {
str.append(String(format:"%02x",byte))
}
}
return str
}
}
extension NSData {
public var hexadecimalString : String {
return (self as Data).hexadecimalString
}
}
@implementation NSData (Extn)
- (NSString *)description
{
NSMutableString *str = [[NSMutableString alloc] init];
const char *bytes = self.bytes;
for (int i = 0; i < [self length]; i++) {
[str appendFormat:@"%02hhX ", bytes[i]];
}
return [str autorelease];
}
@end
Now you can call NSLog(@"hex value: %@", data)
Change %02hhX to %02hhx in the format string if you want lowercase characters.
Swift + Property.
I prefer to have the hex representation as a property (the same as the bytes and description properties):
extension NSData {
var hexString: String {
let buffer = UnsafeBufferPointer<UInt8>(start: UnsafePointer(self.bytes), count: self.length)
return buffer.map { String(format: "%02x", $0) }.joinWithSeparator("")
}
var heXString: String {
let buffer = UnsafeBufferPointer<UInt8>(start: UnsafePointer(self.bytes), count: self.length)
return buffer.map { String(format: "%02X", $0) }.joinWithSeparator("")
}
}
Idea is borrowed from this answer
[deviceToken description]
You'll need to remove the spaces.
Personally I base64 encode the deviceToken, but it's a matter of taste.
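If you go the base64 route, here is a minimal sketch using the NSData base64 API available since iOS 7 (deviceToken stands for the token NSData handed to the registration callback):
NSString *encodedToken = [deviceToken base64EncodedStringWithOptions:0];
// and back again on the receiving side:
NSData *decodedToken = [[NSData alloc] initWithBase64EncodedString:encodedToken options:0];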
Related
My problem is pretty simple. I assign a value to a string variable in Xcode which looks like this:
ARAMAUBEBABRBGCNDKDEEEFO
and I need it like this:
AR,AM,AU,BE,BA,BR,BG,CN,DK,DE,EE,FO
The length is different in each variable.
Thanks in advance.
This function is useful for numbers that need a comma every three digits, which is what I wanted. Hope it helps.
// add commas to a string...
// example1: @"5123" = @"5,123"
// example2: @"123" = @"123"
// example3: @"123123123" = @"123,123,123"
-(NSString*) addComasToStringEvery3chrsFromRightToLeft:(NSString*) myString{
NSMutableString *stringFormatted = [NSMutableString stringWithFormat:@"%@",myString];
for(NSInteger i=[stringFormatted length]-3;i>0;i=i-3) {
if (i>0) {
[stringFormatted insertString: @"," atIndex: i];
}
}
return stringFormatted;
}
Try this:
int num;
NSMutableString *string1 = [NSMutableString stringWithString: @"1234567890"];
num = [string1 length];
for(int i=3;i<=num+1;i++) {
[string1 insertString: @"," atIndex: i];
i+=3;
}
NSString *yourString; // the string you want to process
int len = 2; // the length
NSMutableString *str = [NSMutableString string];
int i = 0;
for (; i + len <= [yourString length]; i += len) {
NSRange range = NSMakeRange(i, len);
[str appendString:[yourString substringWithRange:range]];
[str appendString:@","];
}
if (i < [yourString length]) { // add remaining part
[str appendString:[yourString substringFromIndex:i]];
}
// str now is what your want
This would work well when your string is not very large:
NSString * StringByInsertingStringEveryNCharacters(NSString * const pString,
NSString * const pStringToInsert,
const size_t n) {
NSMutableString * const s = pString.mutableCopy;
for (NSUInteger pos = n, advance = n + pStringToInsert.length; pos < s.length; pos += advance) {
[s insertString:pStringToInsert atIndex:pos];
}
return s.copy;
}
If the string is very large, you should favor composing it without insertion (append-only).
(define your own error detection)
I have the following problem with this code:
NSDictionary * imagen = [[NSDictionary alloc] initWithDictionary:[envio resultValue]];
NSString *imagenS = [imagen valueForKey:@"/Result"];
[Base64 initialize];
NSData * imagenDecode = [Base64 decode:imagenS];
NSLog(@"%@", [imagenS length]);
//SAVE IMAGE
NSArray *sysPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES);
NSString *docDirectory = [sysPaths objectAtIndex:0];
NSString *filePath = [NSString stringWithFormat:@"%@david.png",docDirectory];
[imagenDecode writeToFile:filePath atomically:YES];
[envio resultValue] --> returns an NSDictionary with one image in Base64 encoding.
I want to decode and save this image, but my console shows this message:
2011-08-23 20:15:36.539 WSStub[39226:a0f] -[NSCFDictionary length]: unrecognized selector sent to instance 0xd00ee0
The Base64 code is:
//
// Base64.m
// CryptTest
//
// Created by Kiichi Takeuchi on 4/20/10.
// Copyright 2010 ObjectGraph LLC. All rights reserved.
//
#import "Base64.h"
@implementation Base64
#define ArrayLength(x) (sizeof(x)/sizeof(*(x)))
static char encodingTable[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
static char decodingTable[128];
+ (void) initialize
{
if (self == [Base64 class])
{
memset(decodingTable, 0, ArrayLength(decodingTable));
for (NSInteger i = 0; i < ArrayLength(encodingTable); i++) {
decodingTable[encodingTable[i]] = i;
}
}
}
+ (NSData*) decode:(const char*) string length:(NSInteger) inputLength
{
if ((string == NULL) || (inputLength % 4 != 0))
{
return nil;
}
while (inputLength > 0 && string[inputLength - 1] == '=')
{
inputLength--;
}
NSInteger outputLength = inputLength * 3 / 4;
NSMutableData* data = [NSMutableData dataWithLength:outputLength];
uint8_t* output = data.mutableBytes;
NSInteger inputPoint = 0;
NSInteger outputPoint = 0;
while (inputPoint < inputLength)
{
char i0 = string[inputPoint++];
char i1 = string[inputPoint++];
char i2 = inputPoint < inputLength ? string[inputPoint++] : 'A'; /* 'A' will decode to \0 */
char i3 = inputPoint < inputLength ? string[inputPoint++] : 'A';
output[outputPoint++] = (decodingTable[i0] << 2) | (decodingTable[i1] >> 4);
if (outputPoint < outputLength)
{
output[outputPoint++] = ((decodingTable[i1] & 0xf) << 4) | (decodingTable[i2] >> 2);
}
if (outputPoint < outputLength)
{
output[outputPoint++] = ((decodingTable[i2] & 0x3) << 6) | decodingTable[i3];
}
}
return data;
}
+ (NSData*) decode:(NSString*) string
{
return [self decode:[string cStringUsingEncoding:NSASCIIStringEncoding] length:[string length]];
}
@end
The line
NSString *imagenS = [imagen valueForKey:@"/Result"];
is returning a dictionary, not a string. You'll have to inspect your data source to determine whether this is correct or not.
Your NSLog() call is wrong. To show a length, it should be:
NSLog(@"%lu", [imagenS length]);
But that is probably not the problem.
You seem to be invoking length on an NSDictionary. Hard to tell where you do that, since you don't show the piece of code where that happens. It could be that imagenS is not an NSString, but an NSDictionary instead.
Try to do:
NSLog(@"%@", [imagenS class]);
and see what is displayed. If probably tells you that imagenS is not a string, but an NSCFDictionary or similar instead.
FWIW, NSCFDictionary is one of the subclasses of NSDictionary that actually implement the different versions of the main class. This is called a class cluster.
When I use this code to create a sha256 of a string
unsigned char hashedChars[32];
NSString *inputString;
inputString = [NSString stringWithFormat:@"hello"];
NSData * inputData = [inputString dataUsingEncoding:NSUTF8StringEncoding];
CC_SHA256(inputData.bytes, inputData.length, hashedChars);
It returns the hash correctly, but I need to pass a string like \x00\x25\x53, and in this case the function returns the SHA-256 of an empty string because the specified encoding cannot be used to convert the receiver.
Now, my question is: how do I pass these special characters to generate a correct hash? Thanks.
Try this, it worked for me
1) To get a hash for plain text input
-(NSString*)sha256HashFor:(NSString*)input
{
const char* str = [input UTF8String];
unsigned char result[CC_SHA256_DIGEST_LENGTH];
CC_SHA256(str, strlen(str), result);
NSMutableString *ret = [NSMutableString stringWithCapacity:CC_SHA256_DIGEST_LENGTH*2];
for(int i = 0; i<CC_SHA256_DIGEST_LENGTH; i++)
{
[ret appendFormat:@"%02x",result[i]];
}
return ret;
}
2) To get a hash for NSData as input
Note: I have used an NSData category, so the code is as follows.
- (NSString *)SHA256_HASH {
//if (!self) return nil;
unsigned char hash[CC_SHA256_DIGEST_LENGTH];
if ( CC_SHA256([(NSData*)self bytes], [(NSData*)self length], hash) ) {
NSData *sha2 = [NSData dataWithBytes:hash length:CC_SHA256_DIGEST_LENGTH];
// description converts to hex but puts <> around it and spaces every 4 bytes
NSString *hash = [sha2 description];
hash = [hash stringByReplacingOccurrencesOfString:@" " withString:@""];
hash = [hash stringByReplacingOccurrencesOfString:@"<" withString:@""];
hash = [hash stringByReplacingOccurrencesOfString:@">" withString:@""];
// hash is now a string with just the 64-char hash value in it
//NSLog(@"hash = %@",hash);
// Format SHA256 fingerprint like
// 00:00:00:00:00:00:00:00:00
int keyLength=[hash length];
NSString *formattedKey = @"";
for (int i=0; i<keyLength; i+=2) {
NSString *substr=[hash substringWithRange:NSMakeRange(i, 2)];
if (i!=keyLength-2)
substr=[substr stringByAppendingString:@":"];
formattedKey = [formattedKey stringByAppendingString:substr];
}
return formattedKey;
}
return nil;
}
It's important to know that you need to import:
#import <CommonCrypto/CommonDigest.h>
Hope this helps!
You probably should use NSData instead of NSString then. Where do you get that string from?
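To make that concrete, here is a minimal sketch that hashes explicit bytes through NSData so an embedded \x00 cannot truncate the input (the byte values are just the ones from the question):
#import <CommonCrypto/CommonDigest.h>

const unsigned char rawBytes[] = { 0x00, 0x25, 0x53 };
NSData *input = [NSData dataWithBytes:rawBytes length:sizeof(rawBytes)];

unsigned char digest[CC_SHA256_DIGEST_LENGTH];
CC_SHA256(input.bytes, (CC_LONG)input.length, digest);

// hex-encode the digest for display
NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA256_DIGEST_LENGTH * 2];
for (int i = 0; i < CC_SHA256_DIGEST_LENGTH; i++) {
[hex appendFormat:@"%02x", digest[i]];
}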
For anyone looking for the solution in Swift 3.0, here it is:
extension String {
// MARK: - SHA256
func get_sha256_String() -> String {
guard let data = self.data(using: .utf8) else {
print("Data not available")
return ""
}
return getHexString(fromData: digest(input: data as NSData))
}
private func digest(input : NSData) -> NSData {
let digestLength = Int(CC_SHA256_DIGEST_LENGTH)
var hashValue = [UInt8](repeating: 0, count: digestLength)
CC_SHA256(input.bytes, UInt32(input.length), &hashValue)
return NSData(bytes: hashValue, length: digestLength)
}
private func getHexString(fromData data: NSData) -> String {
var bytes = [UInt8](repeating: 0, count: data.length)
data.getBytes(&bytes, length: data.length)
var hexString = ""
for byte in bytes {
hexString += String(format:"%02x", UInt8(byte))
}
return hexString
}}
How to Use it
let signatures = "yourStringToBeConverted".get_sha256_String()
Also, don't forget to add #import <CommonCrypto/CommonHMAC.h> to your Bridging-Header.h.
I'd like to convert a regular NSString into an NSString with the (what I assume are) ASCII hex values and back.
I need to produce the same output that the Java methods below do, but I can't seem to find a way to do it in Objective-C. I've found some examples in C and C++ but I've had a hard time working them into my code.
Here are the Java methods I'm trying to reproduce:
/**
* Encodes the given string by using the hexadecimal representation of its UTF-8 bytes.
*
* @param s The string to encode.
* @return The encoded string.
*/
public static String utf8HexEncode(String s) {
if (s == null) {
return null;
}
byte[] utf8;
try {
utf8 = s.getBytes(ENCODING_UTF8);
} catch (UnsupportedEncodingException x) {
throw new RuntimeException(x);
}
return String.valueOf(Hex.encodeHex(utf8));
}
/**
* Decodes the given string by using the hexadecimal representation of its UTF-8 bytes.
*
* @param s The string to decode.
* @return The decoded string.
* @throws Exception If an error occurs.
*/
public static String utf8HexDecode(String s) throws Exception {
if (s == null) {
return null;
}
return new String(Hex.decodeHex(s.toCharArray()), ENCODING_UTF8);
}
Update: Thanks to drawnonward's answer here's the method I wrote to create the hex NSStrings. It gives me an "Initialization discards qualifiers from pointer target type" warning on the char declaration line, but it works.
- (NSString *)stringToHex:(NSString *)string
{
char *utf8 = [string UTF8String];
NSMutableString *hex = [NSMutableString string];
while ( *utf8 ) [hex appendFormat:@"%02X" , *utf8++ & 0x00FF];
return [NSString stringWithFormat:@"%@", hex];
}
Haven't had time to write the decoding method yet. When I do, I'll edit this to post it for anyone else interested.
Update2: So the method I posted above actually doesn't output what I'm looking for. Instead of outputting hex values in 0-f format, it was instead outputting all numbers. I finally got back to working on this problem and was able to write a category for NSString that exactly duplicates the Java methods I posted. Here it is:
//
// NSString+hex.h
// Created by Ben Baron on 10/20/10.
//
@interface NSString (hex)
+ (NSString *) stringFromHex:(NSString *)str;
+ (NSString *) stringToHex:(NSString *)str;
@end
//
// NSString+hex.m
// Created by Ben Baron on 10/20/10.
//
#import "NSString+hex.h"
@implementation NSString (hex)
+ (NSString *) stringFromHex:(NSString *)str
{
NSMutableData *stringData = [[[NSMutableData alloc] init] autorelease];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i=0; i < [str length] / 2; i++) {
byte_chars[0] = [str characterAtIndex:i*2];
byte_chars[1] = [str characterAtIndex:i*2+1];
whole_byte = strtol(byte_chars, NULL, 16);
[stringData appendBytes:&whole_byte length:1];
}
return [[[NSString alloc] initWithData:stringData encoding:NSASCIIStringEncoding] autorelease];
}
+ (NSString *) stringToHex:(NSString *)str
{
NSUInteger len = [str length];
unichar *chars = malloc(len * sizeof(unichar));
[str getCharacters:chars];
NSMutableString *hexString = [[NSMutableString alloc] init];
for(NSUInteger i = 0; i < len; i++ )
{
[hexString appendString:[NSString stringWithFormat:@"%x", chars[i]]];
}
free(chars);
return [hexString autorelease];
}
@end
A short way to convert an NSString into a hexadecimal value:
NSMutableString *tempHex=[[NSMutableString alloc] init];
[tempHex appendString:@"0xD2D2D2"];
unsigned colorInt = 0;
[[NSScanner scannerWithString:tempHex] scanHexInt:&colorInt];
lblAttString.backgroundColor=UIColorFromRGB(colorInt);
The macro used for this code is:
#define UIColorFromRGB(rgbValue) [UIColor \
colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
green:((float)((rgbValue & 0xFF00) >> 8))/255.0 \
blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]
For these lines of Java
utf8 = s.getBytes(ENCODING_UTF8);
new String(decodedHexString, ENCODING_UTF8);
Objective-C equivalents would be
utf8 = [s UTF8String];
[[NSString alloc] initWithUTF8String:decodedHexString];
To make an NSString with the hexadecimal representation of a character string:
NSMutableString *hex = [NSMutableString string];
while ( *utf8 ) [hex appendFormat:@"%02X" , *utf8++ & 0x00FF];
You will have to make your own decodeHex function. Just pull two characters out of the string and, if they are valid, add a byte to the result.
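For illustration, here is a minimal sketch of such a decode step (hexStringToData is a hypothetical helper name, and the input is assumed to hold plain hex digit pairs):
#import <Foundation/Foundation.h>
#include <ctype.h>
#include <stdlib.h>
static NSData *hexStringToData(NSString *hex)
{
NSMutableData *data = [NSMutableData dataWithCapacity:[hex length] / 2];
char pair[3] = {'\0', '\0', '\0'};
for (NSUInteger i = 0; i + 1 < [hex length]; i += 2) {
pair[0] = (char)[hex characterAtIndex:i];
pair[1] = (char)[hex characterAtIndex:i + 1];
// stop as soon as either character is not a hex digit
if (!isxdigit((unsigned char)pair[0]) || !isxdigit((unsigned char)pair[1])) break;
unsigned char byte = (unsigned char)strtol(pair, NULL, 16);
[data appendBytes:&byte length:1];
}
return data;
}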
There is a problem with your stringToHex method - it drops leading 0s, and ignores 00s. Just as a quick fix, I made the below:
+ (NSString *) stringToHex:(NSString *)str
{
NSUInteger len = [str length];
unichar *chars = malloc(len * sizeof(unichar));
[str getCharacters:chars];
NSMutableString *hexString = [[NSMutableString alloc] init];
for(NSUInteger i = 0; i < len; i++ )
{
// [hexString [NSString stringWithFormat:@"%02x", chars[i]]]; /*previous input*/
[hexString appendFormat:@"%02x", chars[i]]; /*EDITED PER COMMENT BELOW*/
}
free(chars);
return [hexString autorelease];
}
Thanks to all who contributed on this thread. It was a great help to me. Since things have moved on a little since the original post, here's my updated implementation for iOS 6. I went with the categories approach, but chose to split the load between NSData and NSString. Comments welcomed.
First, the NSString half, which handles decoding a hex encoded string into an NSData object.
@implementation NSString (StringToHexData)
//
// Decodes an NSString containing hex encoded bytes into an NSData object
//
- (NSData *) stringToHexData
{
int len = [self length] / 2; // Target length
unsigned char *buf = malloc(len);
unsigned char *whole_byte = buf;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i=0; i < [self length] / 2; i++) {
byte_chars[0] = [self characterAtIndex:i*2];
byte_chars[1] = [self characterAtIndex:i*2+1];
*whole_byte = strtol(byte_chars, NULL, 16);
whole_byte++;
}
NSData *data = [NSData dataWithBytes:buf length:len];
free( buf );
return data;
}
@end
The changes were mostly for efficiency's sake: some simple old-fashioned pointer arithmetic means I could allocate the whole buffer in one go, and populate it byte by byte. Then the whole thing is passed to NSData in one go.
The encoding part, in NSData, looks like this:
@implementation NSData (DataToHexString)
- (NSString *) dataToHexString
{
NSUInteger len = [self length];
char * chars = (char *)[self bytes];
NSMutableString * hexString = [[NSMutableString alloc] init];
for(NSUInteger i = 0; i < len; i++ )
[hexString appendString:[NSString stringWithFormat:@"%0.2hhx", chars[i]]];
return hexString;
}
@end
Again, some minor changes, though I suspect no efficiency gains here. The use of "%0.2hhx" solved all the problems of missing leading zeros and ensures that only a single byte is output at a time.
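As a quick illustration of what the hh length modifier buys you here (values picked arbitrarily):
signed char value = (signed char)0xAB;
NSLog(@"%x vs %02hhx", value, value); // ffffffab vs ab - plain %x sign-extends the promoted char
NSLog(@"%x vs %02hhx", 0x5, 0x5); // 5 vs 05 - %02hhx also pads the leading zero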
Hope this helps the next person taking this on!
One possible solution:
+(NSString*)hexFromStr:(NSString*)str
{
NSData* nsData = [str dataUsingEncoding:NSUTF8StringEncoding];
const unsigned char* data = [nsData bytes];
NSUInteger len = nsData.length;
NSMutableString* hex = [NSMutableString string];
for(int i = 0; i < len; ++i)[hex appendFormat:@"%02X", data[i]];
return hex;
}
So, first off, I would like to thank drawnonward for his answer. This gave me the first function, mean and clean. In the same spirit, I wrote the other one. Hope you like it.
@synthesize unsigned char* value= _value;
- (NSString*) hexString
{
_value[CONSTANT]= '\0';
unsigned char* ptr= _value;
NSMutableString* hex = [[NSMutableString alloc] init];
while ( *ptr ) [hex appendFormat:@"%02x", *ptr++ & 0x00FF];
return [hex autorelease];
}
- (void) setHexString:(NSString*)hexString
{
_value[CONSTANT]= '\0';
unsigned char* ptr= _value;
for (const char* src= [hexString cStringUsingEncoding:NSASCIIStringEncoding];
*src;
src+=2)
{
unsigned int hexByte;
/*int res=*/ sscanf(src,"%02x",&hexByte);
*ptr++= (unsigned char)(hexByte & 0x00FF);
}
*ptr= '\0';
}
My input was a base-10 digit string, and the output should be the hex representation in string format. Examples:
@"10" -> @"A"
@"1128" -> @"468"
@"1833828235" -> @"6D4DFF8B"
Implementation:
+ (NSString *) stringToHex:(NSString *)str{
NSInteger result = [str integerValue];
NSString *hexStr = (result)?@"":@"0";
while (result!=0) {
NSInteger reminder = result % 16;
if(reminder>=0 && reminder<=9){
hexStr = [[NSString stringWithFormat:@"%ld",(long)reminder] stringByAppendingString:hexStr];
}else if(reminder==10){
hexStr = [@"A" stringByAppendingString:hexStr];
}else if(reminder==11){
hexStr = [@"B" stringByAppendingString:hexStr];
}else if(reminder==12){
hexStr = [@"C" stringByAppendingString:hexStr];
}else if(reminder==13){
hexStr = [@"D" stringByAppendingString:hexStr];
}else if(reminder==14){
hexStr = [@"E" stringByAppendingString:hexStr];
}else{
hexStr = [@"F" stringByAppendingString:hexStr];
}
result /=16;
}
return hexStr;
}
Perhaps you should use NSString's dataUsingEncoding: to encode and initWithData:encoding: to decode. It depends on where you are getting the data from.
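For example, a minimal sketch of that round trip (UTF-8 picked arbitrarily as the encoding):
NSString *original = @"hello";
NSData *encoded = [original dataUsingEncoding:NSUTF8StringEncoding];
NSString *decoded = [[NSString alloc] initWithData:encoded encoding:NSUTF8StringEncoding];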
Fairly new iPhone developer here. I'm building an app to send RS232 commands to a device expecting them over a TCP/IP socket connection. I've got the comms part down, and can send ASCII commands fine. It's the hex code commands I'm having trouble with.
So lets say I have the following hex data to send (in this format):
\x1C\x02d\x00\x00\x00\xFF\x7F
How do I convert this into an NSData object, which my send method expects?
Obviously this does not work for this hex data (but does for standard ascii commands):
NSString *commandascii;
NSData *commandToSend;
commandascii = @"\x1C\x02d\x00\x00\x00\xFF\x7F";
commandToSend = [commandascii dataUsingEncoding:NSStringEncoding];
For a start, some of the \x hex codes are escape characters, and I get an "input conversion stopped..." warning when compiling in XCode. And NSStringEncoding obviously isn't right for this hex string either.
So the first problem is how to store this hex string I guess, then how to convert to NSData.
Any ideas?
Code for hex in NSStrings like "00 05 22 1C EA 01 00 FF". 'command' is the hex NSString.
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend= [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
for (int i = 0; i < ([command length] / 2); i++) {
byte_chars[0] = [command characterAtIndex:i*2];
byte_chars[1] = [command characterAtIndex:i*2+1];
whole_byte = strtol(byte_chars, NULL, 16);
[commandToSend appendBytes:&whole_byte length:1];
}
NSLog(@"%@", commandToSend);
Here's an example decoder implemented as a category on NSString.
#import <stdio.h>
#import <stdlib.h>
#import <string.h>
unsigned char strToChar (char a, char b)
{
char encoder[3] = {'\0','\0','\0'};
encoder[0] = a;
encoder[1] = b;
return (char) strtol(encoder,NULL,16);
}
@interface NSString (NSStringExtensions)
- (NSData *) decodeFromHexidecimal;
@end
@implementation NSString (NSStringExtensions)
- (NSData *) decodeFromHexidecimal;
{
const char * bytes = [self cStringUsingEncoding: NSUTF8StringEncoding];
NSUInteger length = strlen(bytes);
unsigned char * r = (unsigned char *) malloc(length / 2 + 1);
unsigned char * index = r;
while ((*bytes) && (*(bytes +1))) {
*index = strToChar(*bytes, *(bytes +1));
index++;
bytes+=2;
}
*index = '\0';
NSData * result = [NSData dataWithBytes: r length: length / 2];
free(r);
return result;
}
@end
If you can hard code the hex data:
const char bytes[] = "\x00\x12\x45\xAB";
size_t length = (sizeof bytes) - 1; //string literals have implicit trailing '\0'
NSData *data = [NSData dataWithBytes:bytes length:length];
If your code must interpret the hex string (assuming the hex string is in a variable called inputData and lengthOfInputData is the length of inputData):
#define HexCharToNybble(x) ((char)((x > '9') ? tolower(x) - 'a' + 10 : x - '0') & 0xF)
int i;
NSMutableData *data = [NSMutableData data];
for (i = 0; i < lengthOfInputData;)
{
char byteToAppend;
if (i < (lengthOfInputData - 3) &&
inputData[i+0] == '\\' &&
inputData[i+1] == 'x' &&
isxdigit(inputData[i+2]) &&
isxdigit(inputData[i+3]))
{
byteToAppend = (HexCharToNybble(inputData[i+2]) << 4) + HexCharToNybble(inputData[i+3]);
i += 4;
}
else
{
byteToAppend = inputData[i];
i += 1;
}
[data appendBytes:&byteToAppend length:1];
}
This is an old topic, but I'd like to add some remarks.
• Scanning a string with [NSString characterAtIndex] is not very efficient. Getting the C string as UTF-8 and then scanning it with a char pointer is much faster.
• It's better to allocate NSMutableData with a capacity, to avoid time-consuming block resizing. I think plain NSData is even better (see the next point).
• Instead of creating NSData with malloc, then [NSData dataWithBytes:] and finally free, use malloc and [NSData dataWithBytesNoCopy:length:freeWhenDone:]. This avoids extra memory operations (reallocate, copy, free). The freeWhenDone boolean tells the NSData to take ownership of the memory block and free it when it is released.
• Here is the function I use to convert hex strings to byte blocks. There is not much error checking on the input string, but the allocation is tested. Formatting of the input string (removing 0x, spaces and punctuation marks) is better done outside the conversion function (see the usage sketch after it); why lose time on extra processing if we are sure the input is OK?
+(NSData*)bytesStringToData:(NSString*)bytesString
{
if (!bytesString || !bytesString.length) return NULL;
// Get the c string
const char *scanner=[bytesString cStringUsingEncoding:NSUTF8StringEncoding];
char twoChars[3]={0,0,0};
long bytesBlockSize = bytesString.length/2;
long counter = bytesBlockSize;
Byte *bytesBlock = malloc(bytesBlockSize);
if (!bytesBlock) return NULL;
Byte *writer = bytesBlock;
while (counter--) {
twoChars[0]=*scanner++;
twoChars[1]=*scanner++;
*writer++ = strtol(twoChars, NULL, 16);
}
return [NSData dataWithBytesNoCopy:bytesBlock length:bytesBlockSize freeWhenDone:YES];
}
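As a usage sketch (the space-stripping line and the NSData category placement are assumptions about your setup, not part of the answer above):
NSString *raw = @"1C 02 64 00 00 00 FF 7F";
NSString *cleaned = [raw stringByReplacingOccurrencesOfString:@" " withString:@""];
NSData *bytes = [NSData bytesStringToData:cleaned]; // assuming bytesStringToData: is declared in an NSData category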
If I want to hard-code the bytes, I do something like this:
enum { numCommandBytes = 8 };
static const unsigned char commandBytes[numCommandBytes] = { 0x1c, 0x02, 'd', 0x0, 0x0, 0x0, 0xff, 0x7f };
If you're obtaining these backslash-escaped bytes at run time, try the strunvis function.
Obviously this does not work for this hex data (but does for standard ascii commands):
NSString *commandascii;
NSData *commandToSend;
commandascii = @"\x1C\x02d\x00\x00\x00\xFF\x7F";
commandToSend = [commandascii dataUsingEncoding:NSStringEncoding];
For a start, some of the \x hex codes are escape characters, and I get an "input conversion stopped..." warning when compiling in XCode. And NSStringEncoding obviously isn't right for this hex string either.
First, it's Xcode, with a lowercase c.
Second, NSStringEncoding is a type, not an encoding identifier. That code shouldn't compile at all.
More to the point, backslash-escaping is not an encoding; in fact, it's largely independent of encoding. The backslash and 'x' are characters, not bytes, which means that they must be encoded to (and decoded from) bytes, which is the job of an encoding.
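A small illustration of that distinction (variable names made up):
NSString *escapedText = @"\\x1C"; // the four characters \ x 1 C
NSData *textAsBytes = [escapedText dataUsingEncoding:NSUTF8StringEncoding]; // 4 bytes: 5C 78 31 43
const uint8_t rawByte = 0x1C;
NSData *byteItself = [NSData dataWithBytes:&rawByte length:1]; // 1 byte: 1C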
Another way to do it.
-(NSData *) dataFromHexString:(NSString *) hexstr
{
NSMutableData *data = [[NSMutableData alloc] init];
NSString *inputStr = [hexstr uppercaseString];
NSString *hexChars = @"0123456789ABCDEF";
Byte b1,b2;
b1 = 255;
b2 = 255;
for (int i=0; i<hexstr.length; i++) {
NSString *subStr = [inputStr substringWithRange:NSMakeRange(i, 1)];
NSRange loc = [hexChars rangeOfString:subStr];
if (loc.location == NSNotFound) continue;
if (255 == b1) {
b1 = (Byte)loc.location;
}else {
b2 = (Byte)loc.location;
//Appending the Byte to NSData
Byte *bytes = malloc(sizeof(Byte) * 1);
bytes[0] = ((b1<<4) & 0xf0) | (b2 & 0x0f);
[data appendBytes:bytes length:1];
free(bytes);
b1 = b2 = 255;
}
}
return data;
}
-(NSData*) convertToByteArray:(NSString*) command {
if (command == nil || command.length == 0) return nil;
NSString *command1 = command;
if(command1.length%2 != 0) {
// to handle odd bytes like 1000 decimal = 3E8 with is of length = 3
command1 = [NSString stringWithFormat:@"0%@",command1];
}
NSUInteger length = command1.length/2 ;
NSMutableData *commandToSend = [[NSMutableData alloc] initWithLength:length];
char byte_chars[3] = {'\0','\0','\0'};
unsigned char whole_byte;
for (int i=0; i<length; i++) {
byte_chars[0] = [command1 characterAtIndex:i*2];
byte_chars[1] = [command1 characterAtIndex:i*2+1];
whole_byte = strtol(byte_chars, NULL, 16);
[commandToSend appendBytes:&whole_byte length:1];
}
NSRange commandRange = NSMakeRange(commandToSend.length - length, length);
NSData *result = [commandToSend subdataWithRange:commandRange];
return result;
}
I know this is a very old thread, but there is an encoding scheme in Objective-C that can easily convert your string of hex codes into ASCII characters.
1) Remove the \x from the string and, without keeping spaces in the string, just convert the string to NSData using:
[[NSData alloc] initWithData:[stringToBeConverted dataUsingEncoding:NSASCIIStringEncoding]];
Hex data is just bytes in memory; you think of it as a string because that's how you see it, but the bytes could represent anything.
Try: (typed in the browser, may contain errors)
NSMutableData *hexData = [[NSMutableData alloc] init];
const uint8_t firstByte = 0x1C;
[hexData appendBytes:&firstByte length:1];
const uint8_t secondByte = 0x2D;
[hexData appendBytes:&secondByte length:1];
etc...