I have this method to XOR two NSStrings. I'm printing the result with NSLog, but it isn't what I expected.
Can't figure out what I'm doing wrong.
- (void)XorSecretKeyDeviceId
{
NSString* secretKey = @"123";
NSString* deviceId = @"abcdef";
NSData* stringKey = [secretKey dataUsingEncoding:NSUTF8StringEncoding];
NSData* stringDeviceId = [deviceId dataUsingEncoding:NSUTF8StringEncoding];
unsigned char* pBytesInput = (unsigned char*)[stringKey bytes]; //Bytes
unsigned char* pBytesKey = (unsigned char*)[stringDeviceId bytes];
unsigned int vlen = [secretKey length]; //Keys Length
unsigned int klen = [deviceId length];
unsigned int v;
unsigned int k = vlen % klen;
unsigned char c;
for(v = 0; v < vlen; v++)
{
c = pBytesInput[v] ^ pBytesKey[k];
pBytesInput[v] = c;
NSLog(@"%c", c);
k = (++k < klen ? k : 0);
}
}
Are you setting your pBytesInput and pBytesKey variables correctly? At the moment, you have unsigned char* pBytesInput = (unsigned char*)[stringKey bytes]; (i.e. the input is the "key"), and pBytesKey is the device ID. This seems odd.
Also, be careful using UTF-8 encoding. UTF-8 uses the high bit of a byte to indicate that a multi-byte character continues into the next byte, so your XOR could plausibly generate an invalid UTF-8 sequence by setting the high bit of the final byte.
For more than that, you'll have to say what the "wrong result" is.
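For what it's worth, here is a minimal sketch of repeating-key XOR in plain C (names and buffer sizes are illustrative, not from the original code). It writes into a separate output buffer rather than mutating the bytes returned by the NSData objects, and prints hex instead of %c, since XORed bytes are often unprintable:

#include <stdio.h>
#include <string.h>

/* XOR input against a repeating key, writing to out. */
void xor_with_key(const unsigned char *input, size_t input_len,
                  const unsigned char *key, size_t key_len,
                  unsigned char *out)
{
    for (size_t i = 0; i < input_len; i++) {
        out[i] = input[i] ^ key[i % key_len];
    }
}

int main(void)
{
    const char *secretKey = "123";
    const char *deviceId = "abcdef";
    size_t len = strlen(secretKey);
    unsigned char out[16];

    xor_with_key((const unsigned char *)secretKey, len,
                 (const unsigned char *)deviceId, strlen(deviceId), out);

    /* Print as hex: %c on XORed bytes is misleading. */
    for (size_t i = 0; i < len; i++) {
        printf("%02x ", out[i]);
    }
    putchar('\n');
    return 0;
}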
Related
Consider, for example, this type of hex string:
char hex_str[100] = "0x01 0x03 0x04 0x0A";
How do I get the byte-array representation out of this string in CAPL, like:
byte hex_str_as_byte_arr[4] = {0x01, 0x03, 0x04, 0x0A};
EDIT: Only Vector CANoe supported data types/functions are allowed!
Use strtok to split the character array into separate hex strings, then use long strtol( const char *restrict str, char **restrict str_end, int base ) to convert each hex string to an integral value.
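The question restricts this to CAPL built-ins, but for reference, here is what the strtok/strtol approach looks like in plain C (array sizes are illustrative):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char hex_str[100] = "0x01 0x03 0x04 0x0A"; /* strtok needs a writable string */
    unsigned char bytes[4];
    int n = 0;

    /* Split on spaces, then convert each token; base 0 honours the 0x prefix. */
    for (char *tok = strtok(hex_str, " "); tok != NULL && n < 4; tok = strtok(NULL, " ")) {
        bytes[n++] = (unsigned char)strtol(tok, NULL, 0);
    }

    for (int i = 0; i < n; i++) {
        printf("0x%02X ", bytes[i]);
    }
    putchar('\n');
    return 0;
}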
Thanks to all...
Actually I've found a solution myself:
char hex_str[100] = "0x01 0x03 0x04 0x0A";
long data[4];
dword pos = 0;
// strtol here parses one number starting at pos and returns the position
// just past it, so each call picks up where the previous one stopped.
pos = strtol(hex_str, pos, data[0]);
pos = strtol(hex_str, pos, data[1]);
pos = strtol(hex_str, pos, data[2]);
pos = strtol(hex_str, pos, data[3]);
write("0x%02x,0x%02x,0x%02x, 0x%02x", data[0], data[1], data[2], data[3]);
Now it's a simple cast: (byte) data[0]
We can use sscanf() to convert the numbers to unsigned char. In a loop, we'll also need a %n conversion to determine the read position for the next iteration.
Here's a simple example (in real life, you'll need some range checking to make sure you don't overrun the output buffer):
#include <stdio.h>
int main(void)
{
const char hex_str[100] = "0x01, 0x03, 0x04, 0x0A";
unsigned char bytes[4];
{
int position;
unsigned char *b = bytes;
/* %hhi parses each 0x-prefixed byte; %n records how many characters were consumed */
for (const char *input = hex_str; sscanf(input, "%hhi, %n", b, &position) == 1; ++b) {
input += position;
}
}
/* prove we did it */
for (size_t i = 0; i < sizeof bytes; ++i) {
printf("%hhu ", bytes[i]);
}
puts("");
}
#include <stdio.h>
#include <string.h>

int main()
{
unsigned short crc = 0x00;
unsigned char buffer[4] = {0x01,0x02,0x72,0xAE};
memcpy((void *)&crc, (void *)&buffer[2],2);
printf("crc = 0x%x \n",crc);
return 0;
}
For the above program I was expecting the crc value to be 0x72AE, but it comes out as:
crc = 0xAE72
I am not able to understand why the bytes are swapped, even though I am doing a straight memcpy.
Any kind of help will be appreciated. Thanks in advance.
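What you are seeing is byte order, not a memcpy bug: on a little-endian machine (x86, and ARM as commonly configured) the first byte copied, buffer[2] = 0x72, lands in the low-order byte of crc, so the value reads back as 0xAE72. If you want buffer[2] to be the high byte regardless of host endianness, assemble the value with shifts; a minimal sketch:

#include <stdio.h>

int main(void)
{
    unsigned char buffer[4] = {0x01, 0x02, 0x72, 0xAE};

    /* buffer[2] becomes the high byte no matter the host byte order */
    unsigned short crc = (unsigned short)((buffer[2] << 8) | buffer[3]);

    printf("crc = 0x%x\n", crc); /* prints crc = 0x72ae */
    return 0;
}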
I am implementing an AES decoder. For creating the IV and key, the algorithm is:
IV/Key's 16 bytes: the first 16 bytes of ProductID.getBytes("UTF-8")
(If there are not enough bytes, pad on the right with 0x32 up to 16 bytes.)
and my code for the padding is:
- (char*)paddedStringFromString:(NSString *)string withLength:(NSUInteger)length{
const char *stringC = [string UTF8String];
char * output;
output = malloc(length+1);
for (NSInteger i = 0; i < length; i++) {
if (i < string.length) output[i] = stringC[i];
else output[i] = 0x32;
}
return output;
}
But I am not getting the right result. Is my approach to the padding right? Please help.
I think the length argument and string.length are not the same, right?
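To make that concrete: string.length counts UTF-16 code units, while the buffer returned by UTF8String is a byte sequence that can be longer for non-ASCII text. A sketch of the padding in plain C that measures the byte string directly (illustrative only; the caller must free the result):

#include <stdlib.h>
#include <string.h>

/* Pad input on the right with 0x32 up to `length` bytes, per the quoted spec.
   Hypothetical helper; caller owns and must free the returned buffer. */
char *padded_copy(const char *input, size_t length)
{
    size_t in_len = strlen(input); /* bytes, not UTF-16 code units */
    char *output = malloc(length + 1);
    if (output == NULL) return NULL;

    for (size_t i = 0; i < length; i++) {
        output[i] = (i < in_len) ? input[i] : 0x32;
    }
    output[length] = '\0'; /* terminate in case it is used as a C string */
    return output;
}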
Can anyone help with converting an int to a char array?
I have a buffer declared as:
char *buffer = NULL;
int lengthOfComponent = -1;
char *obj;
buffer[index]= (char *)&lengthOfComponent;
If I do this it throws EXC_BAD_ACCESS after execution. How do I store the value of obj in the buffer using memcpy?
Of course you cannot write to buffer[index]; it is not allocated!
buffer = malloc(sizeof(char) * lengthOfBuffer);
should do it. After that you can write to the buffer with memcpy or with an assignment, like you are doing.
buffer[index] = (char *)&lengthOfComponent;
buffer[index] is like dereferencing the pointer. But buffer is not pointing to any valid location. Hence the runtime error.
The C solution is to use snprintf. Try:
int i = 11;
char buffer[10];
snprintf(buffer, sizeof(buffer), "%d", i);
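If the goal really is to store the raw bytes of the int (rather than its textual form), a minimal memcpy sketch, assuming the buffer has already been allocated large enough (names are illustrative):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    int lengthOfComponent = -1;
    size_t index = 0;

    char *buffer = malloc(sizeof lengthOfComponent); /* room for the raw bytes */
    if (buffer == NULL) return 1;

    /* Copy the int's bytes (in host byte order) into the buffer. */
    memcpy(&buffer[index], &lengthOfComponent, sizeof lengthOfComponent);

    for (size_t i = 0; i < sizeof lengthOfComponent; i++) {
        printf("%02x ", (unsigned char)buffer[i]);
    }
    putchar('\n');
    free(buffer);
    return 0;
}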
I'm looking for an Objective-C way of sorting characters in a string, as per the answer to this question.
Ideally a function that takes an NSString and returns the sorted equivalent.
Additionally I'd like to run length encode sequences of 3 or more repeats. So, for example "mississippi" first becomes "iiiimppssss", and then could be shortened by encoding as "4impp4s".
I'm no expert in Objective-C (my background is more Java and C++), so I'd also like some clue as to best practice for the memory management (retain counts etc.; no GC on the iPhone) of the return value of such a function. My source string is in an iPhone search bar control and so is an NSString *.
/* qsort comparators take const void * arguments */
int char_compare(const void *a, const void *b) {
const char ca = *(const char *)a;
const char cb = *(const char *)b;
if(ca < cb) {
return -1;
} else if(ca > cb) {
return 1;
} else {
return 0;
}
}
NSString *sort_str(NSString *unsorted) {
int len = [unsorted length] + 1;
char *cstr = malloc(len);
[unsorted getCString:cstr maxLength:len encoding:NSISOLatin1StringEncoding];
qsort(cstr, len - 1, sizeof(char), char_compare);
NSString *sorted = [NSString stringWithCString:cstr encoding:NSISOLatin1StringEncoding];
free(cstr);
return sorted;
}
The return value is autoreleased so if you want to hold on to it in the caller you'll need to retain it. Not Unicode safe.
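Neither answer covers the run-length-encoding half of the question, so here is a minimal sketch in plain C that encodes runs of 3 or more repeats as "<count><char>" (assumes the input is already sorted; buffer sizing is illustrative):

#include <stdio.h>

/* "iiiimppssss" -> "4impp4s": runs of 3+ become <count><char>, shorter runs stay literal. */
void rle_encode(const char *sorted, char *out, size_t out_size)
{
    size_t o = 0;
    out[0] = '\0';
    for (size_t i = 0; sorted[i] != '\0' && o + 1 < out_size; ) {
        size_t run = 1;
        while (sorted[i + run] == sorted[i]) run++;
        if (run >= 3) {
            int n = snprintf(out + o, out_size - o, "%zu%c", run, sorted[i]);
            if (n < 0 || (size_t)n >= out_size - o) break; /* out of space */
            o += (size_t)n;
        } else {
            for (size_t r = 0; r < run && o + 1 < out_size; r++) out[o++] = sorted[i];
            out[o] = '\0';
        }
        i += run;
    }
}

int main(void)
{
    char out[32];
    rle_encode("iiiimppssss", out, sizeof out);
    printf("%s\n", out); /* prints 4impp4s */
    return 0;
}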
With a bounded code-set, radix sort is best:
NSString * sortString(NSString* word) {
int rads[128];
const char *cstr = [word UTF8String];
char *buff = calloc([word length]+1, sizeof(char));
int p = 0;
for(int c = 'a'; c <= 'z'; c++) {
rads[c] = 0;
}
for(int k = 0; k < [word length]; k++) {
int c = cstr[k];
rads[c]++;
}
for(int c = 'a'; c <= 'z'; c++) {
int n = rads[c];
while (n > 0) {
buff[p++] = c;
n--;
}
}
buff[p] = '\0';
NSString *sorted = [NSString stringWithUTF8String: buff];
free(buff); /* stringWithUTF8String copies, so the buffer can be released */
return sorted;
}
Note that the example above only works for lowercase letters (it was copied from an app that needed to sort lowercase strings). To expand it to handle all 128 ASCII characters, change both character loops to for (int c = 0; c <= 127; c++).