How to show content of NSData in bits? - iphone

I have an NSData object and I need to view its content as pure bits. I tried NSLog with [NSData description], but that returns an NSString. Any suggestions?

Use this for bytes:
const char *byte = [data bytes];
NSLog(@"%s", byte);
And this is for bits:
const unsigned char *byte = [data bytes];
NSUInteger length = [data length];
for (NSUInteger i = 0; i < length; i++) {
    unsigned char n = byte[i]; // unsigned, so the right shift doesn't drag the sign bit along
    char buffer[9];
    buffer[8] = 0; // for null
    int j = 8;
    while (j > 0)
    {
        if (n & 0x01)
        {
            buffer[--j] = '1';
        }
        else
        {
            buffer[--j] = '0';
        }
        n >>= 1;
    }
    printf("%s ", buffer);
}

You can look at these bytes in the memory browser window:
void* bytes_memory = [yourData bytes]; // set breakpoint after this line
... after stopping on the breakpoint, find bytes_memory in the local variables window, right-click it and choose "View memory of *bytes_memory".
If you want to print the bits to the console (in the format 10011100), you will need to convert the data into a corresponding string representation.
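A rough sketch of such a helper (my own, untested; BitStringFromNSData is just a name I made up):
static NSString *BitStringFromNSData(NSData *data)
{
    const unsigned char *bytes = [data bytes];
    NSMutableString *bits = [NSMutableString stringWithCapacity:[data length] * 9];
    for (NSUInteger i = 0; i < [data length]; i++) {
        // walk each byte from the most significant bit down to the least
        for (int bit = 7; bit >= 0; bit--) {
            [bits appendString:(((bytes[i] >> bit) & 1) ? @"1" : @"0")];
        }
        if (i + 1 < [data length]) [bits appendString:@" "];
    }
    return bits;
}
// NSLog(@"%@", BitStringFromNSData(yourData));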

Related

iPhone - finalizing Apple's vague "VerificationController.m"

I am trying to implement the new VerificationController.m class that Apple released to fix the in-app purchase fraud problem.
Like everything released by Apple, this is one more vague, incomplete and badly explained document with a lot of voids and unknowns that cannot be circumvented/understood by everyone.
I am trying to implement that, but at the end of the code we see these four methods:
- (NSString *)encodeBase64:(const uint8_t *)input length:(NSInteger)length
{
#warning Replace this method.
return nil;
}
- (NSString *)decodeBase64:(NSString *)input length:(NSInteger *)length
{
#warning Replace this method.
return nil;
}
#warning Implement this function.
char* base64_encode(const void* buf, size_t size)
{ return NULL; }
#warning Implement this function.
void * base64_decode(const char* s, size_t * data_len)
{ return NULL; }
You can see that Apple was too lazy to implement the C functions at the end of the code. As my C/C++ abilities stink, I see I need to implement these two functions in C/C++ and that they must return char* and void* (???). Other people have posted routines to do that on SO, but they are either in Objective-C or not returning char* and void* (??).
NOTE: this is another problem I have: how can a function return void* if it is used by Apple in this form?
uint8_t *purchase_info_bytes = base64_decode([purchase_info_string cStringUsingEncoding:NSASCIIStringEncoding], &purchase_info_length);
shouldn't it be returning uint8_t*?
NOTE2: another problem I have is that Apple says base64_encode is required, but it is not being used in the code provided by them. I think they are smoking bad stuff or my C/C++ knowledge really stinks.
So, returning to my first question: can someone post or point to a method that can do the job and follows the requirements of the declared functions base64_encode and base64_decode? Please refrain from posting Objective-C methods that are not compatible with these requirements imposed by Apple.
Thanks.
This solution should be pretty straightforward; it includes all the methods needed to fill in the missing pieces. Tested and functional within the sandbox.
// single base64 character conversion
static int POS(char c)
{
    if (c >= 'A' && c <= 'Z') return c - 'A';
    if (c >= 'a' && c <= 'z') return c - 'a' + 26;
    if (c >= '0' && c <= '9') return c - '0' + 52;
    if (c == '+') return 62;
    if (c == '/') return 63;
    if (c == '=') return -1;
    [NSException raise:@"invalid BASE64 encoding" format:@"Invalid BASE64 encoding"];
    return 0;
}
- (NSString *)encodeBase64:(const uint8_t *)input length:(NSInteger)length
{
    char *encoded = base64_encode(input, (size_t)length);
    NSString *result = [NSString stringWithUTF8String:encoded];
    free(encoded); // base64_encode mallocs its result, so free it once it's wrapped in an NSString
    return result;
}
- (NSString *)decodeBase64:(NSString *)input length:(NSInteger *)length
{
    size_t retLen;
    uint8_t *retStr = base64_decode([input UTF8String], &retLen);
    if (length)
        *length = (NSInteger)retLen;
    NSString *st = [[[NSString alloc] initWithBytes:retStr
                                             length:retLen
                                           encoding:NSUTF8StringEncoding] autorelease];
    free(retStr); // base64_decode returns dynamically allocated memory
    return st;
}
char* base64_encode(const void* buf, size_t size)
{
    static const char base64[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    char* str = (char*) malloc((size+3)*4/3 + 1);
    char* p = str;
    unsigned char* q = (unsigned char*) buf;
    size_t i = 0;
    while (i < size) {
        int c = q[i++];
        c *= 256;
        if (i < size) c += q[i];
        i++;
        c *= 256;
        if (i < size) c += q[i];
        i++;
        *p++ = base64[(c & 0x00fc0000) >> 18];
        *p++ = base64[(c & 0x0003f000) >> 12];
        if (i > size + 1)
            *p++ = '=';
        else
            *p++ = base64[(c & 0x00000fc0) >> 6];
        if (i > size)
            *p++ = '=';
        else
            *p++ = base64[c & 0x0000003f];
    }
    *p = 0;
    return str;
}
void* base64_decode(const char* s, size_t* data_len_ptr)
{
    size_t len = strlen(s);
    if (len % 4)
        [NSException raise:@"Invalid input in base64_decode" format:@"%lu is an invalid length for an input string for BASE64 decoding", (unsigned long)len];
    unsigned char* data = (unsigned char*) malloc(len/4*3);
    int n[4];
    unsigned char* q = (unsigned char*) data;
    for (const char* p = s; *p; )
    {
        n[0] = POS(*p++);
        n[1] = POS(*p++);
        n[2] = POS(*p++);
        n[3] = POS(*p++);
        if (n[0] == -1 || n[1] == -1)
            [NSException raise:@"Invalid input in base64_decode" format:@"Invalid BASE64 encoding"];
        if (n[2] == -1 && n[3] != -1)
            [NSException raise:@"Invalid input in base64_decode" format:@"Invalid BASE64 encoding"];
        q[0] = (n[0] << 2) + (n[1] >> 4);
        if (n[2] != -1) q[1] = ((n[1] & 15) << 4) + (n[2] >> 2);
        if (n[3] != -1) q[2] = ((n[2] & 3) << 6) + n[3];
        q += 3;
    }
    // make sure that data_len_ptr is not null
    if (!data_len_ptr)
        [NSException raise:@"Invalid input in base64_decode" format:@"Invalid destination for output string length"];
    *data_len_ptr = q - data - (n[2] == -1) - (n[3] == -1);
    return data;
}
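As a quick sanity check (my own, hypothetical), you can round-trip a buffer through the two C functions above and compare:
const char original[] = "receipt-data";
char *encoded = base64_encode(original, strlen(original));       // malloc'd, caller frees
size_t decodedLength = 0;
unsigned char *decoded = base64_decode(encoded, &decodedLength); // malloc'd, caller frees
BOOL ok = (decodedLength == strlen(original)) &&
          (memcmp(original, decoded, decodedLength) == 0);
NSLog(@"%s -> %s (round-trip ok: %d)", original, encoded, ok);
free(encoded);
free(decoded);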
Here is a base 64 encode function for NSString to NSString:
+(NSString *) encodeString:(NSString *)inString
{
    NSData *data = [inString dataUsingEncoding:NSUTF8StringEncoding];
    //Point to start of the data and set buffer sizes
    int inLength = [data length];
    int outLength = ((((inLength * 4)/3)/4)*4) + (((inLength * 4)/3)%4 ? 4 : 0);
    const char *inputBuffer = [data bytes];
    char *outputBuffer = malloc(outLength + 1); // +1 for the null terminator
    outputBuffer[outLength] = 0;
    //64 digit code
    static char Encode[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    //start the count
    int cycle = 0;
    int inpos = 0;
    int outpos = 0;
    char temp;
    outputBuffer[outLength-1] = '=';
    outputBuffer[outLength-2] = '=';
    while (inpos < inLength) {
        switch (cycle) {
            case 0:
                outputBuffer[outpos++] = Encode[(inputBuffer[inpos]&0xFC)>>2];
                cycle = 1;
                break;
            case 1:
                temp = (inputBuffer[inpos++]&0x03)<<4;
                outputBuffer[outpos] = Encode[temp];
                cycle = 2;
                break;
            case 2:
                outputBuffer[outpos++] = Encode[temp|(inputBuffer[inpos]&0xF0)>>4];
                temp = (inputBuffer[inpos++]&0x0F)<<2;
                outputBuffer[outpos] = Encode[temp];
                cycle = 3;
                break;
            case 3:
                outputBuffer[outpos++] = Encode[temp|(inputBuffer[inpos]&0xC0)>>6];
                cycle = 4;
                break;
            case 4:
                outputBuffer[outpos++] = Encode[inputBuffer[inpos++]&0x3f];
                cycle = 0;
                break;
            default:
                cycle = 0;
                break;
        }
    }
    NSString *pictemp = [NSString stringWithUTF8String:outputBuffer];
    free(outputBuffer);
    return pictemp;
}
And here is a base 64 decode function for NSString to NSString:
+(NSString *) decodeString:(NSString *)inString
{
    const char* string = [inString cStringUsingEncoding:NSASCIIStringEncoding];
    NSInteger inputLength = inString.length;
    static char decodingTable[128];
    static char encodingTable[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    for (NSInteger i = 0; i < 64; i++) { // only 64 valid characters in the encoding table
        decodingTable[encodingTable[i]] = i;
    }
    if ((string == NULL) || (inputLength % 4 != 0)) {
        return nil;
    }
    while (inputLength > 0 && string[inputLength - 1] == '=') {
        inputLength--;
    }
    NSInteger outputLength = inputLength * 3 / 4;
    NSMutableData* data = [NSMutableData dataWithLength:outputLength];
    uint8_t* output = data.mutableBytes;
    NSInteger inputPoint = 0;
    NSInteger outputPoint = 0;
    while (inputPoint < inputLength) {
        char i0 = string[inputPoint++];
        char i1 = string[inputPoint++];
        char i2 = inputPoint < inputLength ? string[inputPoint++] : 'A'; /* 'A' will decode to \0 */
        char i3 = inputPoint < inputLength ? string[inputPoint++] : 'A';
        output[outputPoint++] = (decodingTable[i0] << 2) | (decodingTable[i1] >> 4);
        if (outputPoint < outputLength) {
            output[outputPoint++] = ((decodingTable[i1] & 0xf) << 4) | (decodingTable[i2] >> 2);
        }
        if (outputPoint < outputLength) {
            output[outputPoint++] = ((decodingTable[i2] & 0x3) << 6) | decodingTable[i3];
        }
    }
    NSLog(@"%@", data);
    NSString *finalString = [[[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding] autorelease];
    return finalString;
}
These were pieced together from examples I found in various places on the internet when I was searching for them a while ago. They may be easier for you to implement. I just created a Base64 class and placed these methods in it.
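If you drop those two methods into a class (the answer calls it Base64; the name is up to you), usage could look like this:
NSString *encoded = [Base64 encodeString:@"hello"];   // "aGVsbG8="
NSString *decoded = [Base64 decodeString:encoded];    // back to "hello"
NSLog(@"%@ -> %@ -> %@", @"hello", encoded, decoded);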
Here are the C wrappers around Justin's answer:
char* base64_encode(const void* buf, size_t size)
{
    // freeWhenDone:NO so NSData doesn't try to free a buffer it doesn't own
    NSData* data = [NSData dataWithBytesNoCopy:(void*)buf length:size freeWhenDone:NO];
    NSString* string = [[[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding] autorelease];
    return strdup([[_Class_ encode:string] UTF8String]); // strdup so the caller gets memory it may free
}
void* base64_decode(const char* s, size_t* data_len)
{
    NSString* result = [_Class_ decode:[NSString stringWithCString:s encoding:NSASCIIStringEncoding]];
    *data_len = result.length;
    return strdup([result UTF8String]); // again, hand back memory the caller owns
}
Where _Class_ is the class that contains Justin's functions.

How to create byte array from NSData

Please can anyone guide me on how to create a byte array from NSData? Here is my code for creating the NSData:
NSData* data = UIImagePNGRepresentation(img);
If you only want to read them, there's a really easy method:
const unsigned char *bytes = [data bytes];
If you want to edit the data, there's a method on NSData that does this.
// Make your array to hold the bytes
NSUInteger length = [data length];
unsigned char *bytes = malloc( length * sizeof(unsigned char) );
// Get the data
[data getBytes:bytes length:length];
NB Don't forget - if you're copying the data, you also have to call free(bytes) at some point ;)
Here is the fastest way (but pretty dangerous) to get the array:
const unsigned char *bytesArray = data.bytes;
NSUInteger lengthOfBytesArray = data.length;
Before trying to get byte #100 you should check lengthOfBytesArray, like:
if (lengthOfBytesArray > 100)
{
    unsigned char byteWithOffset100 = bytesArray[100];
}
And another safe and more objc-like way:
- (NSArray*) arrayOfBytesFromData:(NSData*) data
{
    if (data.length > 0)
    {
        const unsigned char *bytes = data.bytes; // cast once so we can index into the raw buffer
        NSMutableArray *array = [NSMutableArray arrayWithCapacity:data.length];
        NSUInteger i = 0;
        for (i = 0; i < data.length; i++)
        {
            unsigned char byteFromArray = bytes[i];
            [array addObject:[NSValue valueWithBytes:&byteFromArray
                                            objCType:@encode(unsigned char)]];
        }
        return [NSArray arrayWithArray:array];
    }
    return nil;
}
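If you go that route, reading a byte back out of its NSValue wrapper looks like this (hypothetical usage; someData is just a placeholder):
NSArray *byteObjects = [self arrayOfBytesFromData:someData];
unsigned char firstByte = 0;
[[byteObjects objectAtIndex:0] getValue:&firstByte]; // unpack the wrapped unsigned char
NSLog(@"first byte: 0x%02x", firstByte);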

Retrieve NSData to Hex by length

I've got an NSData that contains bytes like <00350029 0033> with length 6. Is there a correct way to split the bytes into an array, somehow like (00, 35, 00, 29, 00, 33)?
NSData *data = ...;
NSMutableArray *bytes = [NSMutableArray array];
for (NSUInteger i = 0; i < [data length]; i++) {
    unsigned char byte;
    [data getBytes:&byte range:NSMakeRange(i, 1)];
    [bytes addObject:[NSString stringWithFormat:@"%02x", byte]];
}
NSLog(@"%@", bytes);
(Assuming you want the bytes as a hex string representation, as in your example. Otherwise, use NSNumber.)
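For completeness, the NSNumber variant mentioned there might look like this (untested sketch):
NSData *data = ...;   // your data
NSMutableArray *byteNumbers = [NSMutableArray array];
const unsigned char *raw = [data bytes];
for (NSUInteger i = 0; i < [data length]; i++) {
    // wrap each raw byte in an NSNumber instead of a hex string
    [byteNumbers addObject:[NSNumber numberWithUnsignedChar:raw[i]]];
}
NSLog(@"%@", byteNumbers);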
You could use the NSData method
- (void)getBytes:(void *)buffer range:(NSRange)range
to get the bytes in a given range (after having allocated the right amount of memory, using malloc), then use
+ (id)dataWithBytes:(const void *)bytes length:(NSUInteger)length
to create new small (1-byte-long) data objects, which you then put into an array. However, if you just retrieve the pointer to the bytes themselves (using [data bytes]), you get a C-style array rather than an NSArray, which can also be used and is far more efficient.
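A rough sketch of that first approach (one-byte NSData objects collected into an NSArray); it trades efficiency for convenience:
NSData *data = ...;   // your data
NSMutableArray *oneByteChunks = [NSMutableArray array];
for (NSUInteger i = 0; i < [data length]; i++) {
    unsigned char byte;
    [data getBytes:&byte range:NSMakeRange(i, 1)];          // copy out one byte
    [oneByteChunks addObject:[NSData dataWithBytes:&byte length:1]];
}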
static NSString* HexStringFromNSData(NSData* data) {
    NSUInteger n = data.length;
    NSMutableString* s = [NSMutableString stringWithCapacity:(2 * n)];
    const unsigned char* ptr = [data bytes];
    for (NSUInteger i = 0; i < n; i++, ptr++) {
        [s appendFormat:@"%02x", (unsigned int)*ptr];
    }
    return [NSString stringWithString:s];
}

How can I convert NSString to uint8_t array

I have an NSString like so: 850210
How can I convert it to uint8_t data[] = {0x85,0x02,0x10}?
Could someone please help me out with this?
BR,
Suppi
#import <Foundation/Foundation.h>
static NSString * const hexString = @"850210";
int main (int argc, const char * argv[])
{
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
    uint8_t data[3] = {0};
    const char * theUniChar = hexString.UTF8String;
    for (int i = 0, c = (int)strlen(theUniChar)/2; i < c && i < sizeof(data)/sizeof(*data); i++)
    {
        int theNum = 0;
        sscanf(theUniChar + 2*i, "%2x", &theNum);
        data[i] = theNum;
    }
    printf("{%x,%x,%x}\n", data[0], data[1], data[2]);
    [pool drain];
    return 0;
}
Loop over the string, and start by checking for strings of length < 3. If that's the case, simply call intValue on the NSString itself and you have your first and only byte.
If it's three or longer, start by reading in three digits and check whether the value is larger than 255. If it is, re-read just two digits and that value is your byte. If it's not, the first three digits are your byte.
Repeat this process on the remaining string (the string created by dropping the digits you have already read).
Good luck :)
I was assuming the string was in decimal format. If it's hex, then simply read in two characters at a time, and use this question to help you: How can I convert hex number to integers and strings in Objective C?
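For reference, a minimal sketch of the decimal approach described above (my own, untested; it assumes the input contains only decimal digits):
NSString *decimalString = @"850210";          // hypothetical decimal input
NSMutableData *parsed = [NSMutableData data];
NSUInteger pos = 0;
while (pos < decimalString.length) {
    NSUInteger take = MIN((NSUInteger)3, decimalString.length - pos);
    NSInteger value = [[decimalString substringWithRange:NSMakeRange(pos, take)] integerValue];
    if (take == 3 && value > 255) {
        // three digits overflow a byte, so fall back to two
        take = 2;
        value = [[decimalString substringWithRange:NSMakeRange(pos, take)] integerValue];
    }
    uint8_t byte = (uint8_t)value;
    [parsed appendBytes:&byte length:1];
    pos += take;
}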
Second edit:
I have not tested this, but if there are any errors they should be easy to fix:
NSString *hexString = @"8510A0";
uint8_t *result = (uint8_t *)malloc(sizeof(uint8_t) * ([hexString length] / 2));
for (int i = 0; i < [hexString length]; i += 2) {
    NSRange range = { i, 2 };
    NSString *subString = [hexString substringWithRange:range];
    unsigned value;
    [[NSScanner scannerWithString:subString] scanHexInt:&value];
    result[i / 2] = (uint8_t)value;
}
// result now contains what you want
free(result);

Convert hex data string to NSData in Objective C (cocoa)

Fairly new iPhone developer here. Building an app to send RS232 commands to a device expecting them over a TCP/IP socket connection. I've got the comms part down, and can send ASCII commands fine. It's the hex code commands I'm having trouble with.
So let's say I have the following hex data to send (in this format):
\x1C\x02d\x00\x00\x00\xFF\x7F
How do I convert this into an NSData object, which my send method expects?
Obviously this does not work for this hex data (but does for standard ASCII commands):
NSString *commandascii;
NSData *commandToSend;
commandascii = @"\x1C\x02d\x00\x00\x00\xFF\x7F";
commandToSend = [commandascii dataUsingEncoding:NSStringEncoding];
For a start, some of the \x hex codes are escape characters, and I get an "input conversion stopped..." warning when compiling in XCode. And NSStringEncoding obviously isn't right for this hex string either.
So the first problem is how to store this hex string I guess, then how to convert to NSData.
Any ideas?
Code for hex in NSStrings like "00 05 22 1C EA 01 00 FF". 'command' is the hex NSString.
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0','\0','\0'};
for (int i = 0; i < ([command length] / 2); i++) {
    byte_chars[0] = [command characterAtIndex:i*2];
    byte_chars[1] = [command characterAtIndex:i*2+1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [commandToSend appendBytes:&whole_byte length:1];
}
NSLog(@"%@", commandToSend);
Here's an example decoder implemented as a category on NSString.
#import <stdio.h>
#import <stdlib.h>
#import <string.h>
unsigned char strToChar (char a, char b)
{
    char encoder[3] = {'\0','\0','\0'};
    encoder[0] = a;
    encoder[1] = b;
    return (char) strtol(encoder, NULL, 16);
}
@interface NSString (NSStringExtensions)
- (NSData *) decodeFromHexidecimal;
@end
@implementation NSString (NSStringExtensions)
- (NSData *) decodeFromHexidecimal;
{
    const char * bytes = [self cStringUsingEncoding: NSUTF8StringEncoding];
    NSUInteger length = strlen(bytes);
    unsigned char * r = (unsigned char *) malloc(length / 2 + 1);
    unsigned char * index = r;
    while ((*bytes) && (*(bytes + 1))) {
        *index = strToChar(*bytes, *(bytes + 1));
        index++;
        bytes += 2;
    }
    *index = '\0';
    NSData * result = [NSData dataWithBytes: r length: length / 2];
    free(r);
    return result;
}
@end
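With that category in place, usage on a plain hex string (no \x escapes, no spaces) could be as simple as:
NSData *commandToSend = [@"1C0264000000FF7F" decodeFromHexidecimal];
NSLog(@"%@", commandToSend);   // prints roughly <1c026400 0000ff7f>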
If you can hard code the hex data:
const char bytes[] = "\x00\x12\x45\xAB";
size_t length = (sizeof bytes) - 1; //string literals have implicit trailing '\0'
NSData *data = [NSData dataWithBytes:bytes length:length];
If your code must interpret the hex string (assuming the hex string is in a variable called inputData and lengthOfInputData is the length of inputData):
#define HexCharToNybble(x) ((char)((x > '9') ? tolower(x) - 'a' + 10 : x - '0') & 0xF)
int i;
NSMutableData *data = [NSMutableData data];
for (i = 0; i < lengthOfInputData;)
{
    char byteToAppend;
    if (i < (lengthOfInputData - 3) &&
        inputData[i+0] == '\\' &&
        inputData[i+1] == 'x' &&
        isxdigit(inputData[i+2]) &&
        isxdigit(inputData[i+3]))
    {
        // parenthesize the shift: '+' binds tighter than '<<'
        byteToAppend = (HexCharToNybble(inputData[i+2]) << 4) + HexCharToNybble(inputData[i+3]);
        i += 4;
    }
    else
    {
        byteToAppend = inputData[i];
        i += 1;
    }
    [data appendBytes:&byteToAppend length:1];
}
This is an old topic, but I'd like to add some remarks.
• Scanning a string with [NSString characterAtIndex:] is not very efficient.
Getting the C string in UTF-8 and then scanning it with a char pointer is much faster.
• It's better to allocate NSMutableData with a capacity, to avoid time-consuming block resizing. I think NSData is even better (see the next point).
• Instead of creating the NSData using malloc, then [NSData dataWithBytes:length:] and finally free, use malloc and [NSData dataWithBytesNoCopy:length:freeWhenDone:].
This avoids extra memory operations (reallocate, copy, free). The freeWhenDone boolean tells the NSData to take ownership of the memory block and free it when the data object is released.
• Here is the function I use to convert hex strings to byte blocks. There is not much error checking on the input string, but the allocation is tested.
Formatting of the input string (removing 0x, spaces and punctuation marks) is better done outside the conversion function.
Why lose time doing extra processing if we are sure the input is OK?
+(NSData*)bytesStringToData:(NSString*)bytesString
{
    if (!bytesString || !bytesString.length) return NULL;
    // Get the C string
    const char *scanner = [bytesString cStringUsingEncoding:NSUTF8StringEncoding];
    char twoChars[3] = {0,0,0};
    long bytesBlockSize = bytesString.length / 2;
    long counter = bytesBlockSize;
    Byte *bytesBlock = malloc(bytesBlockSize);
    if (!bytesBlock) return NULL;
    Byte *writer = bytesBlock;
    while (counter--) {
        twoChars[0] = *scanner++;
        twoChars[1] = *scanner++;
        *writer++ = strtol(twoChars, NULL, 16);
    }
    return [NSData dataWithBytesNoCopy:bytesBlock length:bytesBlockSize freeWhenDone:YES];
}
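Following the remark above about keeping formatting outside the conversion, a caller might clean the string first (a hypothetical example; MyConverter is whatever class you put the method in):
NSString *raw = @"0x00 35 00 29 00 33";
NSString *clean = [[raw stringByReplacingOccurrencesOfString:@"0x" withString:@""]
                        stringByReplacingOccurrencesOfString:@" " withString:@""];
NSData *bytes = [MyConverter bytesStringToData:clean];   // <00350029 0033>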
If I want to hard-code the bytes, I do something like this:
enum { numCommandBytes = 8 };
static const unsigned char commandBytes[numCommandBytes] = { 0x1c, 0x02, 'd', 0x0, 0x0, 0x0, 0xff, 0x7f };
If you're obtaining these backslash-escaped bytes at run time, try the strunvis function.
Obviously this does not work for this hex data (but does for standard ASCII commands):
NSString *commandascii;
NSData *commandToSend;
commandascii = @"\x1C\x02d\x00\x00\x00\xFF\x7F";
commandToSend = [commandascii dataUsingEncoding:NSStringEncoding];
For a start, some of the \x hex codes are escape characters, and I get an "input conversion stopped..." warning when compiling in XCode. And NSStringEncoding obviously isn't right for this hex string either.
First, it's Xcode, with a lowercase c.
Second, NSStringEncoding is a type, not an encoding identifier. That code shouldn't compile at all.
More to the point, backslash-escaping is not an encoding; in fact, it's largely independent of encoding. The backslash and 'x' are characters, not bytes, which means that they must be encoded to (and decoded from) bytes, which is the job of an encoding.
Another way to do it.
-(NSData *) dataFromHexString:(NSString *) hexstr
{
    NSMutableData *data = [[NSMutableData alloc] init];
    NSString *inputStr = [hexstr uppercaseString];
    NSString *hexChars = @"0123456789ABCDEF";
    Byte b1, b2;
    b1 = 255;
    b2 = 255;
    for (int i = 0; i < hexstr.length; i++) {
        NSString *subStr = [inputStr substringWithRange:NSMakeRange(i, 1)];
        NSRange loc = [hexChars rangeOfString:subStr];
        if (loc.location == NSNotFound) continue;
        if (255 == b1) {
            b1 = (Byte)loc.location;
        } else {
            b2 = (Byte)loc.location;
            // Appending the byte to NSData (no need to malloc a one-byte buffer here)
            Byte byte = ((b1 << 4) & 0xf0) | (b2 & 0x0f);
            [data appendBytes:&byte length:1];
            b1 = b2 = 255;
        }
    }
    return data;
}
-(NSData*) convertToByteArray:(NSString*) command {
    if (command == nil || command.length == 0) return nil;
    NSString *command1 = command;
    if (command1.length % 2 != 0) {
        // to handle odd-length input like 1000 decimal = 3E8, which has length 3
        command1 = [NSString stringWithFormat:@"0%@", command1];
    }
    NSUInteger length = command1.length / 2;
    // initWithLength: pre-fills the data with 'length' zero bytes; the real bytes are appended
    // after them, so the zero prefix is trimmed off with subdataWithRange: at the end.
    NSMutableData *commandToSend = [[NSMutableData alloc] initWithLength:length];
    char byte_chars[3] = {'\0','\0','\0'};
    unsigned char whole_byte;
    for (int i = 0; i < length; i++) {
        byte_chars[0] = [command1 characterAtIndex:i*2];
        byte_chars[1] = [command1 characterAtIndex:i*2+1];
        whole_byte = strtol(byte_chars, NULL, 16);
        [commandToSend appendBytes:&whole_byte length:1];
    }
    NSRange commandRange = NSMakeRange(commandToSend.length - length, length);
    NSData *result = [commandToSend subdataWithRange:commandRange];
    return result;
}
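For example (hypothetical usage), an odd-length string such as 3E8 (1000 decimal) is padded to 03E8 before conversion:
NSData *thousand = [self convertToByteArray:@"3E8"];   // same as converting @"03E8"
NSLog(@"%@", thousand);                                // <03e8>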
I know this is a very old thread, but there is an encoding scheme in Objective-C that can easily convert your string of hex codes into ASCII characters.
1) Remove the \x from the string and, without keeping spaces in the string, just convert the string to NSData using:
[[NSData alloc] initWithData:[stringToBeConverted dataUsingEncoding:NSASCIIStringEncoding]];
Hex data is just bytes in memory; you think of it as a string because that's how you see it, but the bytes could represent anything.
Try: (typed in the browser, may contain errors)
NSMutableData *hexData = [[NSMutableData alloc] init];
unsigned char byte = 0x1C;
[hexData appendBytes:&byte length:1]; // appendBytes: needs a pointer and a length
byte = 0x02;
[hexData appendBytes:&byte length:1];
// ...and so on for the rest of the command