Reading binary image data from a web service into UIImage - iphone

I'm consuming a web service in my iPhone app. The web service method returns a response with several fields (e.g. ID, Description, etc.). One of these fields contains binary image data which I need to convert to a UIImage in my iPhone application.
I'm using an NSXMLParser successfully to extract data from the XML response. The parser:foundCharacters: delegate method gives me an NSString * containing the text of each field. Since this is a string, here is what I do to read the image data when I encounter the image field:
UIImage *img = [[UIImage alloc] initWithData:[string dataUsingEncoding:NSUTF8StringEncoding]];
But the img variable is still nil after this line; it seems the data from the XML string is not directly usable for the conversion. What am I doing wrong here? (I can read the other fields into my variables fine, just not this image data field.)
Thanks in advance.

Following fbrereto's answer, I managed to convert the string into NSData using a code sample at this link: http://www.cocoadev.com/index.pl?BaseSixtyFour (scroll to the code sample posted by the user MiloBird). Now I can construct the image successfully.
It uses an Objective-C category to add the class method + (id)dataWithBase64EncodedString:(NSString *)string; to NSData. Here's the code I extracted from the link above:
//MBBase64.h
@interface NSData (MBBase64)

+ (id)dataWithBase64EncodedString:(NSString *)string;  // Padding '=' characters are optional. Whitespace is ignored.

@end

//MBBase64.m
static const char encodingTable[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

@implementation NSData (MBBase64)

+ (id)dataWithBase64EncodedString:(NSString *)string;
{
    if (string == nil)
        [NSException raise:NSInvalidArgumentException format:@"string argument is nil"];
    if ([string length] == 0)
        return [NSData data];

    static char *decodingTable = NULL;
    if (decodingTable == NULL)
    {
        decodingTable = malloc(256);
        if (decodingTable == NULL)
            return nil;
        memset(decodingTable, CHAR_MAX, 256);
        NSUInteger i;
        for (i = 0; i < 64; i++)
            decodingTable[(short)encodingTable[i]] = i;
    }

    const char *characters = [string cStringUsingEncoding:NSASCIIStringEncoding];
    if (characters == NULL)  // Not an ASCII string!
        return nil;
    char *bytes = malloc((([string length] + 3) / 4) * 3);
    if (bytes == NULL)
        return nil;

    NSUInteger length = 0;
    NSUInteger i = 0;
    while (YES)
    {
        char buffer[4];
        short bufferLength;
        for (bufferLength = 0; bufferLength < 4; i++)
        {
            if (characters[i] == '\0')
                break;
            if (isspace(characters[i]) || characters[i] == '=')
                continue;
            buffer[bufferLength] = decodingTable[(short)characters[i]];
            if (buffer[bufferLength++] == CHAR_MAX)  // Illegal character!
            {
                free(bytes);
                return nil;
            }
        }

        if (bufferLength == 0)
            break;
        if (bufferLength == 1)  // At least two characters are needed to produce one byte!
        {
            free(bytes);
            return nil;
        }

        // Decode the characters in the buffer to bytes.
        bytes[length++] = (buffer[0] << 2) | (buffer[1] >> 4);
        if (bufferLength > 2)
            bytes[length++] = (buffer[1] << 4) | (buffer[2] >> 2);
        if (bufferLength > 3)
            bytes[length++] = (buffer[2] << 6) | buffer[3];
    }

    bytes = realloc(bytes, length);
    return [NSData dataWithBytesNoCopy:bytes length:length];
}

@end
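For completeness, a minimal usage sketch (the variable name imageFieldString is illustrative) showing how the decoded data then becomes a UIImage:
NSData *imageData = [NSData dataWithBase64EncodedString:imageFieldString];
UIImage *img = imageData ? [[UIImage alloc] initWithData:imageData] : nil;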
Thank you all!

The trick with NSString is that there is an implicit encoding associated with the data it contains. The image you are receiving, however, is likely in a format that will not convert properly, as it is either binary data or some lossless encoding of binary data (e.g., base64). The trick, then, is to make sure you don't let NSString perform any encoding conversions at all; otherwise your image data will be corrupted. Instead of using dataUsingEncoding: I would try an API more like getBytes:maxLength:usedLength:encoding:options:range:remainingRange:. While more complex than dataUsingEncoding:, I think it'll give you the flexibility you need to get just the data from the NSString and nothing more.
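A rough sketch of that approach (the Latin-1 encoding here is only an assumption, chosen because it maps characters to bytes 1:1; and as the question edit above shows, the field turned out to be base64, so a decode step is still needed afterwards):
NSUInteger maxLength = [string maximumLengthOfBytesUsingEncoding:NSISOLatin1StringEncoding];
void *buffer = malloc(maxLength);
NSUInteger usedLength = 0;
// Copy the raw characters out of the string without any lossy re-encoding.
BOOL ok = [string getBytes:buffer
                 maxLength:maxLength
                usedLength:&usedLength
                  encoding:NSISOLatin1StringEncoding
                   options:0
                     range:NSMakeRange(0, [string length])
            remainingRange:NULL];
NSData *rawData = ok ? [NSData dataWithBytesNoCopy:buffer length:usedLength freeWhenDone:YES] : nil;
if (!ok) free(buffer);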

Related

How to retrieve an image in iPhone which is in binary format on the server [closed]

On the server, the image is stored in binary format. I have to retrieve the image on the iPhone using JSON. How can I do that? Is it possible to do this using NSData as well?
Yes, you need to convert the binary data to NSData like this:
NSData *imgData = [NSData dataWithBase64EncodedString:yourelement];
UIImage *theImg = [UIImage imageWithData:imgData];
You would need MBBase64 class which is available here: https://github.com/jerrykrinock/CategoriesObjC
You have to get the binary value from the server using JSON parsing and then convert that string to NSData.
This is the standard code used for converting the base64 string to NSData.
//MBBase64.h
@interface NSData (MBBase64)

+ (id)dataWithBase64EncodedString:(NSString *)string;  // Padding '=' characters are optional. Whitespace is ignored.

@end

//MBBase64.m
static const char encodingTable[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

@implementation NSData (MBBase64)

+ (id)dataWithBase64EncodedString:(NSString *)string;
{
    if (string == nil)
        [NSException raise:NSInvalidArgumentException format:@"string argument is nil"];
    if ([string length] == 0)
        return [NSData data];

    static char *decodingTable = NULL;
    if (decodingTable == NULL)
    {
        decodingTable = malloc(256);
        if (decodingTable == NULL)
            return nil;
        memset(decodingTable, CHAR_MAX, 256);
        NSUInteger i;
        for (i = 0; i < 64; i++)
            decodingTable[(short)encodingTable[i]] = i;
    }

    const char *characters = [string cStringUsingEncoding:NSASCIIStringEncoding];
    if (characters == NULL)  // Not an ASCII string!
        return nil;
    char *bytes = malloc((([string length] + 3) / 4) * 3);
    if (bytes == NULL)
        return nil;

    NSUInteger length = 0;
    NSUInteger i = 0;
    while (YES)
    {
        char buffer[4];
        short bufferLength;
        for (bufferLength = 0; bufferLength < 4; i++)
        {
            if (characters[i] == '\0')
                break;
            if (isspace(characters[i]) || characters[i] == '=')
                continue;
            buffer[bufferLength] = decodingTable[(short)characters[i]];
            if (buffer[bufferLength++] == CHAR_MAX)  // Illegal character!
            {
                free(bytes);
                return nil;
            }
        }

        if (bufferLength == 0)
            break;
        if (bufferLength == 1)  // At least two characters are needed to produce one byte!
        {
            free(bytes);
            return nil;
        }

        // Decode the characters in the buffer to bytes.
        bytes[length++] = (buffer[0] << 2) | (buffer[1] >> 4);
        if (bufferLength > 2)
            bytes[length++] = (buffer[1] << 4) | (buffer[2] >> 2);
        if (bufferLength > 3)
            bytes[length++] = (buffer[2] << 6) | buffer[3];
    }

    bytes = realloc(bytes, length);
    return [NSData dataWithBytesNoCopy:bytes length:length];
}

@end
Then load the resulting NSData into your UIImageView:
yourimageview.image = [[UIImage alloc] initWithData:resultdata];

How to check if downloaded PNG image is corrupt?

I am downloading multiple images with the code below and saving them to the DB, but for some images I am getting the errors below.
Error: ImageIO: PNG invalid distance too far back
Error: ImageIO: PNG incorrect data check
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0ul);
dispatch_async(queue, ^{
    NSString *imgStr = [dict objectForKey:@"image"];
    imgStr = [imgStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSData *imgData = [NSData dataWithContentsOfURL:[NSURL URLWithString:imgStr]];
    UIImage *image = [UIImage imageWithData:imgData];
    dispatch_sync(dispatch_get_main_queue(), ^{
        mYImageView.image = image;
    });
});
How can I check whether the stored image is valid, so that I can download the image again if needed?
For PNG images, check their first two bytes and last two bytes. Below is the method, hope it helps.
Thanks. :)
- (BOOL)isImageValid:(NSData *)data
{
    if ([data length] < 4)
        return NO;

    const char *bytes = (const char *)[data bytes];

    // A PNG starts with 0x89 0x50 ("\x89PNG"); the last two bytes of the
    // IEND chunk CRC are always 0x60 0x82.
    if (bytes[0] != (char)0x89 || bytes[1] != 0x50)
        return NO;
    if (bytes[[data length] - 2] != 0x60 ||
        bytes[[data length] - 1] != (char)0x82)
        return NO;

    return YES;
}
The accepted answer by swati sharma works great. Here’s a Swift extension for anyone looking to do the same in Swift:
extension Data
{
/// Returns whether or not the data is for a valid PNG file.
var isValidPNG: Bool
{
guard self.count > 4 else { return false }
return self[0] == 0x89 &&
self[1] == 0x50 &&
self[self.count - 2] == 0x60 &&
self[self.count - 1] == 0x82
}
}
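To answer the re-download part of the question, a minimal usage sketch built on the Objective-C method above (the re-download call is a hypothetical placeholder):
if (![self isImageValid:imgData]) {
    // Stored bytes are not a complete PNG; discard them and fetch the image again.
    [self downloadImageFromURLString:imgStr];  // hypothetical re-download method
}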

Saving images in string format (to use in XML) is not working

I am converting an image from the picker into NSData (JPEG representation) and then converting it into an NSString using the following code:
NSData *data = UIImageJPEGRepresentation(image, 1.0);
NSString *imageString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
[[NSUserDefaults standardUserDefaults] setObject:imageString forKey:@"image_name"];
At the other end, where I need to display the image, the UIImage is formed as follows:
NSString *imageString=[[NSString alloc] init];
imageString = [[NSUserDefaults standardUserDefaults] objectForKey:@"image_name"];
UIImage *image=[UIImage imageWithData:[imageString dataUsingEncoding:NSUTF8StringEncoding]];
The image variable used in the code at the top is not nil, but the image formed from the data comes back nil. When I NSLog the user defaults, some string is present for the key mentioned above. Can anyone explain why this is so, and what the right way to do this is?
If it goes through a web server or the like, you could encapsulate it with base64 enc/decoding or some other plain encoder.
It removes "bad" char, ie that screw up the string during transformation, and change them to generic alphabetical chars and then back again.
if this is the reason to your issues, here is a short one i use (which I most probably stole and adapted, but do not remember from whom. Sorry! :-) )
base64helper.h
#import <Foundation/Foundation.h>
@interface NSData (MBBase64)

+ (id)dataWithBase64EncodedString:(NSString *)string;  // Padding '=' characters are optional. Whitespace is ignored.
- (NSString *)base64Encoding;

@end
base64helper.m
#import "base64helper.h"
static const char encodingTable[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
@implementation NSData (MBBase64)

+ (id)dataWithBase64EncodedString:(NSString *)string;
{
    if (string == nil)
        [NSException raise:NSInvalidArgumentException format:@"string argument is nil"];
    if ([string length] == 0)
        return [NSData data];

    static char *decodingTable = NULL;
    if (decodingTable == NULL)
    {
        decodingTable = malloc(256);
        if (decodingTable == NULL)
            return nil;
        memset(decodingTable, CHAR_MAX, 256);
        NSUInteger i;
        for (i = 0; i < 64; i++)
            decodingTable[(short)encodingTable[i]] = i;
    }

    const char *characters = [string cStringUsingEncoding:NSASCIIStringEncoding];
    if (characters == NULL)  // Not an ASCII string!
        return nil;
    char *bytes = malloc((([string length] + 3) / 4) * 3);
    if (bytes == NULL)
        return nil;

    NSUInteger length = 0;
    NSUInteger i = 0;
    while (YES)
    {
        char buffer[4];
        short bufferLength;
        for (bufferLength = 0; bufferLength < 4; i++)
        {
            if (characters[i] == '\0')
                break;
            if (isspace(characters[i]) || characters[i] == '=')
                continue;
            buffer[bufferLength] = decodingTable[(short)characters[i]];
            if (buffer[bufferLength++] == CHAR_MAX)  // Illegal character!
            {
                free(bytes);
                return nil;
            }
        }

        if (bufferLength == 0)
            break;
        if (bufferLength == 1)  // At least two characters are needed to produce one byte!
        {
            free(bytes);
            return nil;
        }

        // Decode the characters in the buffer to bytes.
        bytes[length++] = (buffer[0] << 2) | (buffer[1] >> 4);
        if (bufferLength > 2)
            bytes[length++] = (buffer[1] << 4) | (buffer[2] >> 2);
        if (bufferLength > 3)
            bytes[length++] = (buffer[2] << 6) | buffer[3];
    }

    bytes = realloc(bytes, length);
    return [NSData dataWithBytesNoCopy:bytes length:length];
}

- (NSString *)base64Encoding;
{
    if ([self length] == 0)
        return @"";

    char *characters = malloc((([self length] + 2) / 3) * 4);
    if (characters == NULL)
        return nil;

    NSUInteger length = 0;
    NSUInteger i = 0;
    while (i < [self length])
    {
        char buffer[3] = {0,0,0};
        short bufferLength = 0;
        while (bufferLength < 3 && i < [self length])
            buffer[bufferLength++] = ((char *)[self bytes])[i++];

        // Encode the bytes in the buffer to four characters, including padding "=" characters if necessary.
        characters[length++] = encodingTable[(buffer[0] & 0xFC) >> 2];
        characters[length++] = encodingTable[((buffer[0] & 0x03) << 4) | ((buffer[1] & 0xF0) >> 4)];
        if (bufferLength > 1)
            characters[length++] = encodingTable[((buffer[1] & 0x0F) << 2) | ((buffer[2] & 0xC0) >> 6)];
        else characters[length++] = '=';
        if (bufferLength > 2)
            characters[length++] = encodingTable[buffer[2] & 0x3F];
        else characters[length++] = '=';
    }

    return [[[NSString alloc] initWithBytesNoCopy:characters length:length encoding:NSASCIIStringEncoding freeWhenDone:YES] autorelease];
}

@end
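With this category in place, a sketch of the NSUserDefaults round trip described in the question might look like the following (the key name simply mirrors the question):
// Store: encode the JPEG bytes as a base64 string.
NSData *jpegData = UIImageJPEGRepresentation(image, 1.0);
[[NSUserDefaults standardUserDefaults] setObject:[jpegData base64Encoding] forKey:@"image_name"];

// Load: decode the string back into data and rebuild the image.
NSString *stored = [[NSUserDefaults standardUserDefaults] objectForKey:@"image_name"];
UIImage *restored = [UIImage imageWithData:[NSData dataWithBase64EncodedString:stored]];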

Convert hex data string to NSData in Objective-C (Cocoa)

fairly new iPhone developer here. Building an app to send RS232 commands to a device expecting them over a TCP/IP socket connection. I've got the comms part down, and can send ASCII commands fine. It's the hex code commands I'm having trouble with.
So lets say I have the following hex data to send (in this format):
\x1C\x02d\x00\x00\x00\xFF\x7F
How do I convert this into an NSData object, which my send method expects?
Obviously this does not work for this hex data (but does for standard ascii commands):
NSString *commandascii;
NSData *commandToSend;
commandascii = @"\x1C\x02d\x00\x00\x00\xFF\x7F";
commandToSend = [commandascii dataUsingEncoding:NSStringEncoding];
For a start, some of the \x hex codes are escape characters, and I get an "input conversion stopped..." warning when compiling in XCode. And NSStringEncoding obviously isn't right for this hex string either.
So the first problem is how to store this hex string I guess, then how to convert to NSData.
Any ideas?
Code for hex in NSStrings like "00 05 22 1C EA 01 00 FF". 'command' is the hex NSString.
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
for (int i = 0; i < ([command length] / 2); i++) {
    byte_chars[0] = [command characterAtIndex:i*2];
    byte_chars[1] = [command characterAtIndex:i*2+1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [commandToSend appendBytes:&whole_byte length:1];
}
NSLog(@"%@", commandToSend);
Here's an example decoder implemented as a category on NSString.
#import <stdio.h>
#import <stdlib.h>
#import <string.h>

unsigned char strToChar (char a, char b)
{
    char encoder[3] = {'\0','\0','\0'};
    encoder[0] = a;
    encoder[1] = b;
    return (char) strtol(encoder, NULL, 16);
}

@interface NSString (NSStringExtensions)
- (NSData *) decodeFromHexidecimal;
@end

@implementation NSString (NSStringExtensions)

- (NSData *) decodeFromHexidecimal;
{
    const char * bytes = [self cStringUsingEncoding: NSUTF8StringEncoding];
    NSUInteger length = strlen(bytes);
    unsigned char * r = (unsigned char *) malloc(length / 2 + 1);
    unsigned char * index = r;

    while ((*bytes) && (*(bytes + 1))) {
        *index = strToChar(*bytes, *(bytes + 1));
        index++;
        bytes += 2;
    }
    *index = '\0';

    NSData * result = [NSData dataWithBytes: r length: length / 2];
    free(r);
    return result;
}

@end
If you can hard code the hex data:
const char bytes[] = "\x00\x12\x45\xAB";
size_t length = (sizeof bytes) - 1; //string literals have implicit trailing '\0'
NSData *data = [NSData dataWithBytes:bytes length:length];
If your code must interpret the hex string (assuming the hex string is in a variable called inputData and lengthOfInputData is the length of inputData):
#define HexCharToNybble(x) ((char)((x > '9') ? tolower(x) - 'a' + 10 : x - '0') & 0xF)

int i;
NSMutableData *data = [NSMutableData data];

for (i = 0; i < lengthOfInputData;)
{
    char byteToAppend;

    if (i < (lengthOfInputData - 3) &&
        inputData[i+0] == '\\' &&
        inputData[i+1] == 'x' &&
        isxdigit(inputData[i+2]) &&
        isxdigit(inputData[i+3]))
    {
        byteToAppend = (HexCharToNybble(inputData[i+2]) << 4) + HexCharToNybble(inputData[i+3]);
        i += 4;
    }
    else
    {
        byteToAppend = inputData[i];
        i += 1;
    }

    [data appendBytes:&byteToAppend length:1];
}
This is an old topic, but I'd like to add some remarks.
• Scanning a string with [NSString characterAtIndex:] is not very efficient. Getting the C string in UTF-8 and then scanning it with a char pointer (*ptr++) is much faster.
• It's better to allocate NSMutableData with a capacity, to avoid time-consuming block resizing. I think NSData is even better (see the next point).
• Instead of creating NSData using malloc, then [NSData dataWithBytes:length:] and finally free, use malloc and [NSData dataWithBytesNoCopy:length:freeWhenDone:]. This avoids extra memory operations (reallocate, copy, free). The freeWhenDone boolean tells the NSData to take ownership of the memory block and free it when the data object is released.
• Here is the function I use to convert hex strings to byte blocks. There is not much error checking on the input string, but the allocation is tested. Formatting of the input string (removing 0x, spaces and punctuation marks) is better done outside the conversion function. Why lose time on extra processing if we are sure the input is OK?
+(NSData*)bytesStringToData:(NSString*)bytesString
{
    if (!bytesString || !bytesString.length) return NULL;

    // Get the C string
    const char *scanner = [bytesString cStringUsingEncoding:NSUTF8StringEncoding];
    char twoChars[3] = {0,0,0};
    long bytesBlockSize = bytesString.length / 2;
    long counter = bytesBlockSize;

    Byte *bytesBlock = malloc(bytesBlockSize);
    if (!bytesBlock) return NULL;

    Byte *writer = bytesBlock;
    while (counter--) {
        twoChars[0] = *scanner++;
        twoChars[1] = *scanner++;
        *writer++ = strtol(twoChars, NULL, 16);
    }
    return [NSData dataWithBytesNoCopy:bytesBlock length:bytesBlockSize freeWhenDone:YES];
}
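A minimal usage sketch, assuming the method above is declared on a class called Converter (the class name is illustrative) and the input has already been stripped of 0x, spaces and punctuation:
NSData *command = [Converter bytesStringToData:@"1C0264000000FF7F"];
NSLog(@"%@", command);  // logs the decoded bytes, e.g. <1c026400 0000ff7f>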
If I want to hard-code the bytes, I do something like this:
enum { numCommandBytes = 8 };
static const unsigned char commandBytes[numCommandBytes] = { 0x1c, 0x02, 'd', 0x0, 0x0, 0x0, 0xff, 0x7f };
If you're obtaining these backslash-escaped bytes at run time, try the strunvis function.
Obviously this does not work for this hex data (but does for standard ascii commands):
NSString *commandascii;
NSData *commandToSend;
commandascii = @"\x1C\x02d\x00\x00\x00\xFF\x7F";
commandToSend = [commandascii dataUsingEncoding:NSStringEncoding];
For a start, some of the \x hex codes are escape characters, and I get an "input conversion stopped..." warning when compiling in XCode. And NSStringEncoding obviously isn't right for this hex string either.
First, it's Xcode, with a lowercase c.
Second, NSStringEncoding is a type, not an encoding identifier. That code shouldn't compile at all.
More to the point, backslash-escaping is not an encoding; in fact, it's largely independent of encoding. The backslash and 'x' are characters, not bytes, which means that they must be encoded to (and decoded from) bytes, which is the job of an encoding.
Another way to do it.
-(NSData *) dataFromHexString:(NSString *) hexstr
{
    NSMutableData *data = [[NSMutableData alloc] init];
    NSString *inputStr = [hexstr uppercaseString];
    NSString *hexChars = @"0123456789ABCDEF";

    Byte b1, b2;
    b1 = 255;
    b2 = 255;
    for (int i = 0; i < hexstr.length; i++) {
        NSString *subStr = [inputStr substringWithRange:NSMakeRange(i, 1)];
        NSRange loc = [hexChars rangeOfString:subStr];

        if (loc.location == NSNotFound) continue;

        if (255 == b1) {
            b1 = (Byte)loc.location;
        } else {
            b2 = (Byte)loc.location;

            // Append the assembled byte to the NSData.
            Byte assembled = ((b1 << 4) & 0xf0) | (b2 & 0x0f);
            [data appendBytes:&assembled length:1];
            b1 = b2 = 255;
        }
    }

    return data;
}
-(NSData*) convertToByteArray:(NSString*) command {
    if (command == nil || command.length == 0) return nil;

    NSString *command1 = command;
    if (command1.length % 2 != 0) {
        // Handle an odd number of hex digits, e.g. 1000 decimal = 3E8, which has length 3.
        command1 = [NSString stringWithFormat:@"0%@", command1];
    }

    NSUInteger length = command1.length / 2;
    NSMutableData *commandToSend = [[NSMutableData alloc] initWithLength:length];
    char byte_chars[3] = {'\0','\0','\0'};
    unsigned char whole_byte;
    for (int i = 0; i < length; i++) {
        byte_chars[0] = [command1 characterAtIndex:i*2];
        byte_chars[1] = [command1 characterAtIndex:i*2+1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [commandToSend appendBytes:&whole_byte length:1];
    }

    NSRange commandRange = NSMakeRange(commandToSend.length - length, length);
    NSData *result = [commandToSend subdataWithRange:commandRange];
    return result;
}
I know this is a very old thread, but there is an encoding scheme in Objective C that can easily convert your string of hex codes into ASCII characters.
1) Remove the \x from the string and, without keeping spaces in the string, just convert the string to NSData using:
[[NSData alloc] initWithData:[stringToBeConverted dataUsingEncoding:NSASCIIStringEncoding]];
Hex data is just bytes in memory; you think of it as a string because that's how you see it, but the bytes could represent anything.
Try: (typed in the browser, may contain errors)
NSMutableData *hexData = [[NSMutableData alloc] init];
const unsigned char firstByte = 0x1C;
[hexData appendBytes:&firstByte length:1];
const unsigned char secondByte = 0x02;
[hexData appendBytes:&secondByte length:1];
etc...

Best way to serialize an NSData into a hexadecimal string

I am looking for a nice-cocoa way to serialize an NSData object into a hexadecimal string. The idea is to serialize the deviceToken used for notification before sending it to my server.
I have the following implementation, but I am thinking there must be some shorter and nicer way to do it.
+ (NSString*) serializeDeviceToken:(NSData*) deviceToken
{
    NSMutableString *str = [NSMutableString stringWithCapacity:64];
    int length = [deviceToken length];
    char *bytes = malloc(sizeof(char) * length);

    [deviceToken getBytes:bytes length:length];
    for (int i = 0; i < length; i++)
    {
        [str appendFormat:@"%02.2hhX", bytes[i]];
    }
    free(bytes);

    return str;
}
This is a category applied to NSData that I wrote. It returns a hexadecimal NSString representing the NSData, where the data can be any length. Returns an empty string if NSData is empty.
NSData+Conversion.h
#import <Foundation/Foundation.h>

@interface NSData (NSData_Conversion)

#pragma mark - String Conversion
- (NSString *)hexadecimalString;

@end

NSData+Conversion.m
#import "NSData+Conversion.h"

@implementation NSData (NSData_Conversion)

#pragma mark - String Conversion
- (NSString *)hexadecimalString {
    /* Returns hexadecimal string of NSData. Empty string if data is empty. */
    const unsigned char *dataBuffer = (const unsigned char *)[self bytes];

    if (!dataBuffer)
        return [NSString string];

    NSUInteger dataLength = [self length];
    NSMutableString *hexString = [NSMutableString stringWithCapacity:(dataLength * 2)];

    for (int i = 0; i < dataLength; ++i)
        [hexString appendString:[NSString stringWithFormat:@"%02lx", (unsigned long)dataBuffer[i]]];

    return [NSString stringWithString:hexString];
}

@end
Usage:
NSData *someData = ...;
NSString *someDataHexadecimalString = [someData hexadecimalString];
This is "probably" better than calling [someData description] and then stripping the spaces, <'s, and >'s. Stripping characters just feels too "hacky". Plus you never know if Apple will change the formatting of NSData's -description in the future.
NOTE: I have had people reach out to me about licensing for the code in this answer. I hereby dedicate my copyright in the code I posted in this answer to the public domain.
Here's a highly optimized NSData category method for generating a hex string. While @Dave Gallagher's answer is sufficient for relatively small sizes, memory and CPU performance deteriorate for large amounts of data. I profiled this with a 2 MB file on my iPhone 5: the time comparison was 0.05 vs 12 seconds, and the memory footprint is negligible with this method while the other method grew the heap to 70 MB!
- (NSString *) hexString
{
    NSUInteger bytesCount = self.length;
    if (bytesCount) {
        const char *hexChars = "0123456789ABCDEF";
        const unsigned char *dataBuffer = self.bytes;
        char *chars = malloc(sizeof(char) * (bytesCount * 2 + 1));
        if (chars == NULL) {
            // malloc returns null if attempting to allocate more memory than the system can provide. Thanks Cœur
            [NSException raise:NSInternalInconsistencyException format:@"Failed to allocate more memory"];
            return nil;
        }
        char *s = chars;
        for (unsigned i = 0; i < bytesCount; ++i) {
            *s++ = hexChars[((*dataBuffer & 0xF0) >> 4)];
            *s++ = hexChars[(*dataBuffer & 0x0F)];
            dataBuffer++;
        }
        *s = '\0';
        NSString *hexString = [NSString stringWithUTF8String:chars];
        free(chars);
        return hexString;
    }
    return @"";
}
Using the description property of NSData should not be considered an acceptable mechanism for hex-encoding the data. That property is for debugging descriptions only and can change at any time. As a note, pre-iOS, the NSData description property didn't even return its data in hex form.
Sorry for harping on this solution, but it's important to take the energy to serialize the data without piggy-backing off an API that is meant for something other than data serialization.
@implementation NSData (Hex)

- (NSString*)hexString
{
    NSUInteger length = self.length;
    unichar* hexChars = (unichar*)malloc(sizeof(unichar) * (length*2));
    unsigned char* bytes = (unsigned char*)self.bytes;

    for (NSUInteger i = 0; i < length; i++) {
        unichar c = bytes[i] / 16;
        if (c < 10) {
            c += '0';
        } else {
            c += 'A' - 10;
        }
        hexChars[i*2] = c;

        c = bytes[i] % 16;
        if (c < 10) {
            c += '0';
        } else {
            c += 'A' - 10;
        }
        hexChars[i*2+1] = c;
    }

    NSString* retVal = [[NSString alloc] initWithCharactersNoCopy:hexChars length:length*2 freeWhenDone:YES];
    return [retVal autorelease];
}

@end
Here is a faster way to do the conversion:
Benchmark (mean time for a 1024-byte data conversion, repeated 100 times):
Dave Gallagher : ~8.070 ms
NSProgrammer : ~0.077 ms
Peter : ~0.031 ms
This One : ~0.017 ms
@implementation NSData (BytesExtras)

static char _NSData_BytesConversionString_[512] = "000102030405060708090a0b0c0d0e0f101112131415161718191a1b1c1d1e1f202122232425262728292a2b2c2d2e2f303132333435363738393a3b3c3d3e3f404142434445464748494a4b4c4d4e4f505152535455565758595a5b5c5d5e5f606162636465666768696a6b6c6d6e6f707172737475767778797a7b7c7d7e7f808182838485868788898a8b8c8d8e8f909192939495969798999a9b9c9d9e9fa0a1a2a3a4a5a6a7a8a9aaabacadaeafb0b1b2b3b4b5b6b7b8b9babbbcbdbebfc0c1c2c3c4c5c6c7c8c9cacbcccdcecfd0d1d2d3d4d5d6d7d8d9dadbdcdddedfe0e1e2e3e4e5e6e7e8e9eaebecedeeeff0f1f2f3f4f5f6f7f8f9fafbfcfdfeff";

-(NSString*)bytesString
{
    UInt16* mapping = (UInt16*)_NSData_BytesConversionString_;
    register UInt16 len = self.length;
    char* hexChars = (char*)malloc( sizeof(char) * (len*2) );

    // --- Coeur's contribution - a safe way to check the allocation
    if (hexChars == NULL) {
        // we directly raise an exception instead of using NSAssert to make sure assertion is not disabled as this is irrecoverable
        [NSException raise:@"NSInternalInconsistencyException" format:@"failed malloc"];
        return nil;
    }
    // ---

    register UInt16* dst = ((UInt16*)hexChars) + len-1;
    register unsigned char* src = (unsigned char*)self.bytes + len-1;

    while (len--) *dst-- = mapping[*src--];

    NSString* retVal = [[NSString alloc] initWithBytesNoCopy:hexChars length:self.length*2 encoding:NSASCIIStringEncoding freeWhenDone:YES];
#if (!__has_feature(objc_arc))
    return [retVal autorelease];
#else
    return retVal;
#endif
}

@end
Functional Swift version
One liner:
let hexString = UnsafeBufferPointer<UInt8>(start: UnsafePointer(data.bytes),
count: data.length).map { String(format: "%02x", $0) }.joinWithSeparator("")
Here's in a reusable and self documenting extension form:
extension NSData {
    func base16EncodedString(uppercase uppercase: Bool = false) -> String {
        let buffer = UnsafeBufferPointer<UInt8>(start: UnsafePointer(self.bytes),
                                                count: self.length)
        let hexFormat = uppercase ? "X" : "x"
        let formatString = "%02\(hexFormat)"
        let bytesAsHexStrings = buffer.map {
            String(format: formatString, $0)
        }
        return bytesAsHexStrings.joinWithSeparator("")
    }
}
Alternatively, use reduce("", combine: +) instead of joinWithSeparator("") to be seen as a functional master by your peers.
Edit: I changed String($0, radix: 16) to String(format: "%02x", $0), because one-digit numbers needed a leading padding zero.
Peter's answer ported to Swift
func hexString(data: NSData) -> String {
    if data.length > 0 {
        let hexChars = Array("0123456789abcdef".utf8) as [UInt8];
        let buf = UnsafeBufferPointer<UInt8>(start: UnsafePointer(data.bytes), count: data.length);
        var output = [UInt8](count: data.length*2 + 1, repeatedValue: 0);
        var ix: Int = 0;
        for b in buf {
            let hi  = Int((b & 0xf0) >> 4);
            let low = Int(b & 0x0f);
            output[ix++] = hexChars[hi];
            output[ix++] = hexChars[low];
        }
        let result = String.fromCString(UnsafePointer(output))!;
        return result;
    }
    return "";
}
Swift 3
func hexString() -> String {
    if count > 0 {
        let hexChars = Array("0123456789abcdef".utf8) as [UInt8];
        return withUnsafeBytes({ (bytes: UnsafePointer<UInt8>) -> String in
            let buf = UnsafeBufferPointer<UInt8>(start: bytes, count: self.count);
            var output = [UInt8](repeating: 0, count: self.count*2 + 1);
            var ix: Int = 0;
            for b in buf {
                let hi  = Int((b & 0xf0) >> 4);
                let low = Int(b & 0x0f);
                output[ix] = hexChars[hi];
                ix += 1;
                output[ix] = hexChars[low];
                ix += 1;
            }
            return String(cString: UnsafePointer(output));
        })
    }
    return "";
}
Swift 5
func hexString() -> String {
    if count > 0 {
        let hexChars = Array("0123456789abcdef".utf8) as [UInt8];
        return withUnsafeBytes { bytes -> String in
            var output = [UInt8](repeating: 0, count: bytes.count*2 + 1);
            var ix: Int = 0;
            for b in bytes {
                let hi  = Int((b & 0xf0) >> 4);
                let low = Int(b & 0x0f);
                output[ix] = hexChars[hi];
                ix += 1;
                output[ix] = hexChars[low];
                ix += 1;
            }
            return String(cString: UnsafePointer(output));
        }
    }
    return "";
}
I needed to solve this problem and found the answers here very useful, but I worry about performance. Most of these answers involve copying the data in bulk out of NSData so I wrote the following to do the conversion with low overhead:
@interface NSData (HexString)
- (NSString *)hexString;
@end

@implementation NSData (HexString)

- (NSString *)hexString {
    NSMutableString *string = [NSMutableString stringWithCapacity:self.length * 3];
    [self enumerateByteRangesUsingBlock:^(const void *bytes, NSRange byteRange, BOOL *stop) {
        for (NSUInteger offset = 0; offset < byteRange.length; ++offset) {
            uint8_t byte = ((const uint8_t *)bytes)[offset];
            if (string.length == 0)
                [string appendFormat:@"%02X", byte];
            else
                [string appendFormat:@" %02X", byte];
        }
    }];
    return string;
}

@end
This pre-allocates space in the string for the entire result and avoids ever copying the NSData contents out by using enumerateByteRangesUsingBlock. Changing the X to an x in the format string will use lowercase hex digits. If you don't want a separator between the bytes you can reduce the statement
if (string.length == 0)
    [string appendFormat:@"%02X", byte];
else
    [string appendFormat:@" %02X", byte];
down to just
[string appendFormat:@"%02X", byte];
I needed an answer that would work for variable length strings, so here's what I did:
+ (NSString *)stringWithHexFromData:(NSData *)data
{
    NSString *result = [[data description] stringByReplacingOccurrencesOfString:@" " withString:@""];
    result = [result substringWithRange:NSMakeRange(1, [result length] - 2)];
    return result;
}
Works great as an extension for the NSString class.
You can always use [yourString uppercaseString] to capitalize letters in data description
A better way to serialize/deserialize NSData into an NSString is to use the Google Toolbox for Mac Base64 encoder/decoder. Just drag the files GTMBase64.m, GTMBase64.h and GTMDefines.h from the Foundation package into your app project and then do something like:
/**
 * Serialize NSData to Base64 encoded NSString
 */
-(void) serialize:(NSData*)data {
    self.encodedData = [GTMBase64 stringByEncodingData:data];
}

/**
 * Deserialize Base64 NSString to NSData
 */
-(NSData*) deserialize {
    return [GTMBase64 decodeString:self.encodedData];
}
Here is a solution using Swift 3
extension Data {
    public var hexadecimalString: String {
        var str = ""
        enumerateBytes { buffer, index, stop in
            for byte in buffer {
                str.append(String(format: "%02x", byte))
            }
        }
        return str
    }
}

extension NSData {
    public var hexadecimalString: String {
        return (self as Data).hexadecimalString
    }
}
@implementation NSData (Extn)

- (NSString *)description
{
    NSMutableString *str = [[NSMutableString alloc] init];
    const char *bytes = self.bytes;
    for (int i = 0; i < [self length]; i++) {
        [str appendFormat:@"%02hhX ", bytes[i]];
    }
    return [str autorelease];
}

@end
Now you can call NSLog(@"hex value: %@", data). Change %02hhX to %02hhx in the format string if you want lowercase hex characters.
Swift + Property.
I prefer to have hex representation as property (the same as bytes and description properties):
extension NSData {
    var hexString: String {
        let buffer = UnsafeBufferPointer<UInt8>(start: UnsafePointer(self.bytes), count: self.length)
        return buffer.map { String(format: "%02x", $0) }.joinWithSeparator("")
    }

    var heXString: String {
        let buffer = UnsafeBufferPointer<UInt8>(start: UnsafePointer(self.bytes), count: self.length)
        return buffer.map { String(format: "%02X", $0) }.joinWithSeparator("")
    }
}
Idea is borrowed from this answer
[deviceToken description]
You'll need to remove the spaces.
Personally I base64 encode the deviceToken, but it's a matter of taste.
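For reference, a minimal sketch of that cleanup (stripping the angle brackets and spaces from -description; the earlier caveat that the description format is not guaranteed still applies):
// Trim the surrounding "<" and ">" from the description, then drop the spaces.
NSString *hexToken = [[deviceToken description]
    stringByTrimmingCharactersInSet:[NSCharacterSet characterSetWithCharactersInString:@"<>"]];
hexToken = [hexToken stringByReplacingOccurrencesOfString:@" " withString:@""];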