Encrypted on iPhone, but absolutely will not decrypt in PHP

OK, I'm using this to encrypt my data on iPhone:
- (NSData *)AES128EncryptWithKey:(NSString *)key {
    char keyPtr[kCCKeySizeAES128 + 1]; // room for terminator (unused)
    bzero(keyPtr, sizeof(keyPtr));     // fill with zeroes (for padding)

    // fetch key data
    [key getCString:keyPtr maxLength:sizeof(keyPtr) encoding:NSUTF8StringEncoding];

    NSUInteger dataLength = [self length];

    // See the doc: For block ciphers, the output size will always be less than or
    // equal to the input size plus the size of one block.
    // That's why we need to add the size of one block here
    size_t bufferSize = dataLength + kCCBlockSizeAES128;
    void *buffer = malloc(bufferSize);

    size_t numBytesEncrypted = 0;
    CCCryptorStatus cryptStatus = CCCrypt(kCCEncrypt, kCCAlgorithmAES128, kCCOptionECBMode /*| kCCOptionPKCS7Padding*/,
                                          keyPtr, kCCKeySizeAES128,
                                          NULL /* initialization vector (optional) */,
                                          [self bytes], dataLength, /* input */
                                          buffer, bufferSize,       /* output */
                                          &numBytesEncrypted);
    if (cryptStatus == kCCSuccess) {
        // the returned NSData takes ownership of the buffer and will free it on deallocation
        return [NSData dataWithBytesNoCopy:buffer length:numBytesEncrypted];
    }

    free(buffer); // free the buffer
    return nil;
}
On my server, my PHP script uses:
$base64encoded_ciphertext = $pass;
$res_non = mcrypt_decrypt(MCRYPT_RIJNDAEL_128, $key, base64_decode($base64encoded_ciphertext), 'ecb');
$decrypted = $res_non;
$dec_s2 = strlen($decrypted);
$padding = ord($decrypted[$dec_s2 - 1]);
$decrypted = substr($decrypted, 0, -$padding);
return $decrypted;
However, no matter what the key is, it fails.
The keys are always 10 characters long. I build a password using the system clock to get values.
On the PHP side I duplicate the key building, and according to my script the key ALWAYS matches what the iPhone used to encrypt.
This code worked in a different script, from a different app… and still works.
I've done a straight cut and paste of all the related code and still nothing.
I just don't know what I'm doing wrong… beyond the possibility that what I'm trying to do is simply impossible.

For AES-128 you should use a 16-byte encryption key; for AES-192 it is 24 bytes, and for AES-256 it is 32 bytes. In your case you should use an encryption key like @"1234567890123456".
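The Objective-C method above silently zero-pads a short passphrase out to kCCKeySizeAES128 bytes (the bzero/getCString pair), so whatever key reaches PHP has to be padded the same way before decrypting. A minimal plain-C sketch of that normalization, assuming a 10-character key as the question describes (the function name normalize_aes128_key is mine, not from the post):

#include <stdio.h>
#include <string.h>

#define AES128_KEY_SIZE 16   /* same value as kCCKeySizeAES128 */

/* Copy a passphrase into a fixed 16-byte key, zero-padding the remainder.
   This mirrors what the bzero()/getCString() pair does in the iPhone code. */
static void normalize_aes128_key(const char *passphrase, unsigned char key[AES128_KEY_SIZE])
{
    memset(key, 0, AES128_KEY_SIZE);          /* zero padding */
    size_t len = strlen(passphrase);
    if (len > AES128_KEY_SIZE)
        len = AES128_KEY_SIZE;                /* truncate overly long keys */
    memcpy(key, passphrase, len);
}

int main(void)
{
    unsigned char key[AES128_KEY_SIZE];
    normalize_aes128_key("0123456789", key);  /* a 10-character key, as in the question */

    for (int i = 0; i < AES128_KEY_SIZE; i++)
        printf("%02x", key[i]);               /* prints the 10 key bytes followed by six 00 bytes */
    printf("\n");
    return 0;
}

On the PHP side the matching step would be something like str_pad($key, 16, "\0") before the mcrypt_decrypt call, so both sides feed the cipher an identical 16-byte key.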

Related

WinSock: How to send() a PByte type?

Firstly, I want to know if the PByte type is equivalent to a BYTE* (byte pointer) in C++. If not, what in Delphi comes closest to C++'s BYTE*?
Well, suppose I'm right that PByte is BYTE* (C++); then, based on the following C++ code, how do I send() this data type (PByte) correctly using native WinSock?
See:
C++:
SOCKET sock;
BITMAPINFO bmpInfo;
BYTE *bytes = NULL;
BYTE *temp_bytes = NULL;
DWORD workSpaceSize, fragmntWorkSpaceSize, size;
RtlGetCompressionWorkSpaceSize(COMPRESSION_FORMAT_LZNT1, &workSpaceSize, &fragmntWorkSpaceSize);
bytes = (BYTE *) Alloc(bmpInfo.bmiHeader.biSizeImage);
temp_bytes = (BYTE *) Alloc(bmpInfo.bmiHeader.biSizeImage);
BYTE *memory = (BYTE *) Alloc(workSpaceSize);
RtlCompressBuffer(COMPRESSION_FORMAT_LZNT1,
bytes,
bmpInfo.bmiHeader.biSizeImage,
temp_bytes,
bmpInfo.bmiHeader.biSizeImage,
2048,
&size,
memory);
free(bytes);
free(memory);
if(Send(sock, (char *) temp_bytes, size, 0) <= 0) return;
free(temp_bytes);
Delphi:
var
Sock: TSocket;
bmpInfo: TBitMapInfo;
bytes: PByte = nil;
temp_bytes: PByte = nil;
memory: PByte;
workSpaceSize, fragmntWorkSpaceSize, Size: Cardinal;
//...
RtlGetCompressionWorkSpaceSize(COMPRESSION_FORMAT_LZNT1, @workSpaceSize, @fragmntWorkSpaceSize);
bytes := AllocMem(bmpInfo.bmiHeader.biSizeImage);
temp_bytes := AllocMem(bmpInfo.bmiHeader.biSizeImage);
memory := AllocMem(workSpaceSize);
RtlCompressBuffer(COMPRESSION_FORMAT_LZNT1, bytes, bmpInfo.bmiHeader.biSizeImage,
temp_bytes, bmpInfo.bmiHeader.biSizeImage, 2048, @Size, memory);
FreeMem(bytes);
FreeMem(memory);
if send(Sock, temp_bytes^, Size, 0) <= 0 then Exit;
FreeMem(temp_bytes);
Reference to RtlGetCompressionWorkSpaceSize() and RtlCompressBuffer() functions in C++.
Reference to RtlGetCompressionWorkSpaceSize() and RtlCompressBuffer() functions in Delphi.
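Neither snippet above handles the case where send() transmits fewer bytes than requested, so as a general reference (plain C against the WinSock API; the helper name send_all is mine, not from the question) a complete send of a raw byte buffer looks roughly like this:

#include <winsock2.h>

/* Send an entire byte buffer, looping because send() may transmit fewer
   bytes than requested in a single call. */
static int send_all(SOCKET sock, const BYTE *data, int len)
{
    int total = 0;
    while (total < len) {
        int sent = send(sock, (const char *)data + total, len - total, 0);
        if (sent == SOCKET_ERROR)
            return SOCKET_ERROR;   /* caller can inspect WSAGetLastError() */
        total += sent;
    }
    return total;
}

The Delphi call in the question, send(Sock, temp_bytes^, Size, 0), hands the first byte to WinSock's untyped var Buf parameter, which is how the Delphi WinSock import typically declares it, so dereferencing the PByte as shown is the usual pattern; the same loop-until-done idea applies there as well.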

Objective-C: Converting NSString to UTF-8 with padding

I am implementing an AES decoder. For creating the IV and key, the algorithm is such that:
IV/Key's 16 bytes: the first 16 bytes of ProductID.getBytes("UTF-8")
(If there are not enough bytes,
make up to 16 bytes at the right with 0x32)
and my code for padding
- (char *)paddedStringFromString:(NSString *)string withLength:(NSUInteger)length {
    const char *stringC = [string UTF8String];
    char *output = malloc(length + 1);
    for (NSInteger i = 0; i < length; i++) {
        if (i < string.length) output[i] = stringC[i];
        else output[i] = 0x32;
    }
    return output;
}
But I am not getting the right result. Is my approach to padding right? Please help.
I think the args length and string.length are not the same, right?
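If that mismatch is the problem, a sketch along these lines (plain C; the helper name padded_key_bytes is mine, not from the question) pads based on the UTF-8 byte length rather than the NSString character count:

#include <stdlib.h>
#include <string.h>

/* Pad (or truncate) a UTF-8 byte string to exactly `length` bytes,
   filling the remainder with 0x32 as the spec in the question requires.
   The caller owns the returned buffer. */
static char *padded_key_bytes(const char *utf8, size_t length)
{
    char *output = malloc(length);
    if (!output) return NULL;

    size_t byteLen = strlen(utf8);   /* number of UTF-8 *bytes*, not characters */
    for (size_t i = 0; i < length; i++)
        output[i] = (i < byteLen) ? utf8[i] : 0x32;
    return output;
}

In the Objective-C method above, i < string.length compares against the number of UTF-16 code units in the NSString, which can differ from the number of bytes in stringC whenever the product ID contains non-ASCII characters; that is exactly what the comment is pointing at.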

How to store an int in a char * on iPhone

Can anyone help with converting the int to a char array?
I have a buffer as
char *buffer = NULL;
int lengthOfComponent = -1;
char *obj;
buffer[index]= (char *)&lengthOfComponent;
If I do this it is throwing EXC_BAD_ACCESS after execution. How do I store the value of obj into buffer using memcpy?
Of course you cannot write to buffer[index], it is not allocated!
buffer = malloc(sizeof(char) * lengthOfBuffer);
should do it. After that you can write to the buffer with memcpy or with an assignment, like you are doing.
buffer[index] = (char *)&lengthOfComponent;
buffer[index] is like dereferencing the pointer. But buffer is not pointing to any valid location. Hence the runtime error.
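For the memcpy route the question asks about, here is a minimal self-contained C sketch (the buffer size and index are assumptions for illustration, not values from the question) that stores the raw bytes of an int in a char buffer and reads them back:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    int lengthOfComponent = -1;
    size_t lengthOfBuffer = 64;        /* assumed size, for illustration only */
    char *buffer = malloc(lengthOfBuffer);
    size_t index = 0;                  /* offset at which to store the int */

    /* copy the sizeof(int) raw bytes of the int into the buffer */
    memcpy(buffer + index, &lengthOfComponent, sizeof(lengthOfComponent));

    /* read it back the same way */
    int readBack;
    memcpy(&readBack, buffer + index, sizeof(readBack));
    printf("%d\n", readBack);          /* prints -1 */

    free(buffer);
    return 0;
}

Storing the pointer value (char *)&lengthOfComponent into a single char slot, as the original line does, is a different operation entirely; memcpy copies the integer's own bytes.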
The C solution is to use snprintf. Try:
int i = 11;
char buffer[10];
snprintf(buffer, sizeof(buffer), "%d", i);

XOR between 2 NSStrings gives wrong result

I have this method to XOR two NSStrings. I'm printing the result with NSLog, but it isn't what I expect.
Can't figure out what I'm doing wrong.
- (void)XorSecretKeyDeviceId
{
    NSString* secretKey = @"123";
    NSString* deviceId = @"abcdef";

    NSData* stringKey = [secretKey dataUsingEncoding:NSUTF8StringEncoding];
    NSData* stringDeviceId = [deviceId dataUsingEncoding:NSUTF8StringEncoding];

    unsigned char* pBytesInput = (unsigned char*)[stringKey bytes]; // Bytes
    unsigned char* pBytesKey = (unsigned char*)[stringDeviceId bytes];

    unsigned int vlen = [secretKey length]; // Keys length
    unsigned int klen = [deviceId length];

    unsigned int v;
    unsigned int k = vlen % klen;
    unsigned char c;

    for (v = 0; v < vlen; v++)
    {
        c = pBytesInput[v] ^ pBytesKey[k];
        pBytesInput[v] = c;
        NSLog(@"%c", c);
        k = (++k < klen ? k : 0);
    }
}
Are you setting your pBytesInput and pBytesKey variables correctly? At the moment, you have unsigned char* pBytesInput = (unsigned char*)[stringKey bytes]; (i.e. the input is the "key"), and pBytesKey is the device ID. This seems odd.
Also, be careful using UTF-8 encoding. UTF-8 uses the high bit of any byte in the string to indicate a "continuation" of a multi-byte character into the next byte. Your encoding could plausibly generate invalid UTF-8 by setting the high bit of the final byte of the encrypted output.
For more than that, you'll have to say what the "wrong result" is.
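For comparison, here is a plain-C sketch of XOR-ing a message against a repeating key, writing into a separate output buffer instead of back into the NSData's internal bytes (the names are mine, not from the question):

#include <stdio.h>
#include <string.h>

/* XOR `input` (len bytes) against a repeating `key` (klen bytes), writing to `out`.
   The key index starts at 0 and wraps, unlike the snippet in the question. */
static void xor_with_key(const unsigned char *input, size_t len,
                         const unsigned char *key, size_t klen,
                         unsigned char *out)
{
    for (size_t i = 0; i < len; i++)
        out[i] = input[i] ^ key[i % klen];
}

int main(void)
{
    const char *message = "abcdef";
    const char *key = "123";
    unsigned char out[16];

    xor_with_key((const unsigned char *)message, strlen(message),
                 (const unsigned char *)key, strlen(key), out);

    for (size_t i = 0; i < strlen(message); i++)
        printf("%02x ", out[i]);   /* XOR output is binary; print as hex, not %c */
    printf("\n");
    return 0;
}

Note also that the raw XOR bytes are binary, so logging them with %c as the question's NSLog does will usually look garbled even when the arithmetic is correct; printing them as hex makes the result checkable.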

Encoding images to video with ffmpeg

I am trying to encode a series of images into one video file. I am using code from api-example.c; it works, but it gives me weird green colors in the video. I know I need to convert my RGB images to YUV. I found some solutions, but they don't work; the colors are not green but still very strange. Here is the code:
// Register all formats and codecs
av_register_all();

AVCodec *codec;
AVCodecContext *c = NULL;
int i, out_size, size, outbuf_size;
FILE *f;
AVFrame *picture;
uint8_t *outbuf;

printf("Video encoding\n");

/* find the mpeg video encoder */
codec = avcodec_find_encoder(CODEC_ID_MPEG2VIDEO);
if (!codec) {
    fprintf(stderr, "codec not found\n");
    exit(1);
}

c = avcodec_alloc_context();
picture = avcodec_alloc_frame();

/* put sample parameters */
c->bit_rate = 400000;
/* resolution must be a multiple of two */
c->width = 352;
c->height = 288;
/* frames per second */
c->time_base = (AVRational){1,25};
c->gop_size = 10; /* emit one intra frame every ten frames */
c->max_b_frames = 1;
c->pix_fmt = PIX_FMT_YUV420P;

/* open it */
if (avcodec_open(c, codec) < 0) {
    fprintf(stderr, "could not open codec\n");
    exit(1);
}

f = fopen(filename, "wb");
if (!f) {
    fprintf(stderr, "could not open %s\n", filename);
    exit(1);
}

/* alloc image and output buffer */
outbuf_size = 100000;
outbuf = malloc(outbuf_size);
size = c->width * c->height;

#pragma mark -
AVFrame* outpic = avcodec_alloc_frame();
int nbytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);

//create buffer for the output image
uint8_t* outbuffer = (uint8_t*)av_malloc(nbytes);
#pragma mark -

for (i = 1; i < 77; i++) {
    fflush(stdout);

    int numBytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);
    uint8_t *buffer = (uint8_t *)av_malloc(numBytes * sizeof(uint8_t));

    UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"10%d", i]];
    CGImageRef newCgImage = [image CGImage];
    CGDataProviderRef dataProvider = CGImageGetDataProvider(newCgImage);
    CFDataRef bitmapData = CGDataProviderCopyData(dataProvider);
    buffer = (uint8_t *)CFDataGetBytePtr(bitmapData);

    avpicture_fill((AVPicture*)picture, buffer, PIX_FMT_RGB8, c->width, c->height);
    avpicture_fill((AVPicture*)outpic, outbuffer, PIX_FMT_YUV420P, c->width, c->height);

    struct SwsContext* fooContext = sws_getContext(c->width, c->height,
                                                   PIX_FMT_RGB8,
                                                   c->width, c->height,
                                                   PIX_FMT_YUV420P,
                                                   SWS_FAST_BILINEAR, NULL, NULL, NULL);

    //perform the conversion
    sws_scale(fooContext, picture->data, picture->linesize, 0, c->height, outpic->data, outpic->linesize);
    // Here is where I try to convert to YUV

    /* encode the image */
    out_size = avcodec_encode_video(c, outbuf, outbuf_size, outpic);
    printf("encoding frame %3d (size=%5d)\n", i, out_size);
    fwrite(outbuf, 1, out_size, f);

    free(buffer);
    buffer = NULL;
}

/* get the delayed frames */
for (; out_size; i++) {
    fflush(stdout);
    out_size = avcodec_encode_video(c, outbuf, outbuf_size, NULL);
    printf("write frame %3d (size=%5d)\n", i, out_size);
    fwrite(outbuf, 1, outbuf_size, f);
}

/* add sequence end code to have a real mpeg file */
outbuf[0] = 0x00;
outbuf[1] = 0x00;
outbuf[2] = 0x01;
outbuf[3] = 0xb7;
fwrite(outbuf, 1, 4, f);
fclose(f);
free(outbuf);

avcodec_close(c);
av_free(c);
av_free(picture);
printf("\n");
Please give me advice on how to fix this problem.
You can look at the article http://unick-soft.ru/Articles.cgi?id=20. It is in Russian, but it includes code samples and a VS example.
Has anyone found a fix for this? I am seeing the green video problem on the decode side, that is, when I decode incoming PIX_FMT_YUV420P packets and then sws_scale them to PIX_FMT_RGBA.
Thanks!
EDIT:
The green images are probably due to an ARM optimization backfiring. I used this to fix the problem in my case:
http://ffmpeg-users.933282.n4.nabble.com/green-distorded-output-image-on-iPhone-td2231805.html
I guess the idea is to not specify any architecture (configure will give you a warning about the architecture being unknown, but you can continue to 'make' anyway). That way, the ARM optimizations are not used. There may be a slight performance hit (if any), but at least it works! :)
I think the problem is most likely that you are using PIX_FMT_RGB8 as your input pixel format. This does not mean 8 bits per channel like the commonly used 24-bit RGB or 32-bit ARGB. It means 8 bits per pixel, meaning that all three color channels are housed in a single byte. I am guessing that this is not the format of your image since it is quite uncommon, so you need to use PIX_FMT_RGB24 or PIX_FMT_RGB32 depending on whether or not your input image has an alpha channel. See this documentation page for info on the pixel formats.
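To make that concrete, here is a minimal sketch of just the conversion step, written against the same legacy ffmpeg API the question uses (avcodec_alloc_frame, avpicture_fill, PIX_FMT_* constants) but with PIX_FMT_RGB24 as the source format; the function name rgb24_to_yuv420p and the assumption that rgb_data holds packed 24-bit RGB are mine, not from the question:

#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

/* Convert one width x height frame of packed 24-bit RGB into YUV420P.
   rgb_data must hold width*height*3 bytes. Returns the filled YUV frame;
   the caller frees yuv->data[0] and the frame itself. */
static AVFrame *rgb24_to_yuv420p(const uint8_t *rgb_data, int width, int height)
{
    AVFrame *rgb = avcodec_alloc_frame();
    AVFrame *yuv = avcodec_alloc_frame();

    /* wrap the caller's RGB bytes; no copy is made */
    avpicture_fill((AVPicture *)rgb, (uint8_t *)rgb_data, PIX_FMT_RGB24, width, height);

    /* allocate the destination YUV420P buffer */
    int yuv_size = avpicture_get_size(PIX_FMT_YUV420P, width, height);
    uint8_t *yuv_buf = av_malloc(yuv_size);
    avpicture_fill((AVPicture *)yuv, yuv_buf, PIX_FMT_YUV420P, width, height);

    struct SwsContext *ctx = sws_getContext(width, height, PIX_FMT_RGB24,
                                            width, height, PIX_FMT_YUV420P,
                                            SWS_FAST_BILINEAR, NULL, NULL, NULL);
    sws_scale(ctx, rgb->data, rgb->linesize, 0, height, yuv->data, yuv->linesize);
    sws_freeContext(ctx);

    av_free(rgb);   /* frame wrapper only; rgb_data stays owned by the caller */
    return yuv;
}

If the CGImage data is actually 32-bit (BGRA/ARGB, which is common for UIImage bitmaps), the source format would be one of the 32-bit PIX_FMT_* variants instead, and the stride reported by CGImageGetBytesPerRow should be respected rather than assumed to be width*3.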