How to create PBKDF2 key on iOS device - iphone

I need to create a PBKDF2 key to use in my AES encryption routine in my iPhone Xcode application. I have seen references to using OpenSSL for this, but have not found a specific reference to which OpenSSL function to call.
I have scanned various OpenSSL .h files looking for a way to make this call, but have so far been unsuccessful.
The password I will be using is 5 digits, the salt is 12 characters, the iteration count is 1000, and I need a 128-bit derived key.

You can use the PKCS5_PBKDF2_HMAC_SHA1() function in openssl/evp.h. Divining how to use the function is pretty easy from the declaration:
int PKCS5_PBKDF2_HMAC_SHA1(const char *pass, int passlen,
                           const unsigned char *salt, int saltlen, int iter,
                           int keylen, unsigned char *out);
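For example, a minimal sketch using the parameters from the question (the passcode and salt values here are placeholders):
#include <string.h>
#include <openssl/evp.h>

/* Derives a 128-bit (16-byte) key from a 5-digit passcode and a
   12-character salt with 1000 iterations. Returns 1 on success, 0 on failure. */
int derive_key(unsigned char out[16])
{
    const char *pass = "12345";                   /* placeholder 5-digit passcode */
    const unsigned char salt[] = "saltsaltsalt";  /* placeholder 12-character salt */

    return PKCS5_PBKDF2_HMAC_SHA1(pass, (int)strlen(pass),
                                  salt, 12,
                                  1000,  /* iterations */
                                  16,    /* derived key length in bytes (128 bits) */
                                  out);
}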

I think p5_crpt2.c is what you are looking for.

Related

NSCoder for unsigned integers with swift

In my Swift app, I need to store a UInt32 value and retrieve it using NSCoder.
The issue is that there are methods for signed integers:
coder.decodeInteger()
coder.decodeInt32()
coder.decodeInt64()
but not for unsigned integers.
Also, casting an Int32 with a negative value to a UInt32 does not seem to work.
Am I missing something?
The tool you want is init(bitPattern:). This is the same as C-style casting on integers (which is how you use these methods in ObjC when you need unsigned ints). It just reinterprets the bits.
UInt32(bitPattern: coder.decodeInt32())

Char string encoding differences between native C++ and C++/CLI?

I have a strange problem for which I believe there is a solution but I cannot find it. Your help would be appreciated.
On the one hand, I have a native C++ class named Native which has a static wchar_t array containing accented characters. This array is const and defined at build time.
/// Header file
class Native
{
public:
    static const wchar_t* Array() const { return mArray; }
private:
    static const wchar_t *mArray;
};
//--------------------------------------------------------------
/// .cpp file
const wchar_t* Native::mArray = {L"This is a description éàçï"};
On the other hand, I have a C++/CLI class that uses the array like this:
/// C++/CLI use
System::String^ S1 = gcnew System::String( Native::Array() );
System::String^ S2 = gcnew System::String( L"This is a description éàçï" );
The problem is that while S2 gives This is a description éàçï as expected, S1 gives This is a description éà çï. I do not understand why passing a pointer to the static array does not give the same result as passing the same literal directly.
I guess this is an encoding problem, but I would have expected the same result for both S1 and S2. Do you know how to solve this? The way I must use it in my program is like S1, i.e. by accessing the build-time static array through a static method that returns a const wchar_t*.
Thanks for your help!
EDIT 1
What is the best way to define literals at build time in C++ with Intel C++ 13.0 so that they are directly usable in the C++/CLI System::String constructor? This may be the real question behind my problem.
I don't have enough reputation to add a comment to ask this question, so I apologize for posting this as an answer if that seems inappropriate.
Could the problem be that your compiler defines wchar_t to be 8 bits? I'm basing that possibility on this answer:
Should I use wchar_t when using UTF-8?
To answer your question (in the comments) about building a UTF-16 array at build time, I believe you can force it to be UTF-16 by using u"..." for your literal instead of L"..." (see http://en.cppreference.com/w/cpp/language/string_literal)
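As a small illustration of that suggestion (a sketch; the variable name is made up, and u"..." literals exist in both C11 and C++11):
#include <uchar.h>   /* char16_t in C11; it is a built-in type in C++11 */

/* A u"..." literal is encoded as UTF-16 regardless of how wide wchar_t
   is on the target platform, so the array always holds UTF-16 code units. */
static const char16_t *kDescription = u"This is a description éàçï";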
Edit 1:
For what it's worth, I tried your code (after fixing a couple compile errors) using Microsoft Visual Studio 10 and didn't have the same problem (both strings printed as expected).
I don't know if it will help you, but another possible way to statically initialize this wchar_t array is to use std::wstring to wrap your literal and then set your array to the c-string pointer returned by wstring::c_str(), shown as follows:
std::wstring ws(L"This is a description éàçï");
const wchar_t* Native::mArray = ws.c_str();
This edit was inspired by Dynamic wchar_t array (C++ beginner)

Best practice for typedef of uint32

On a system where both long and int are 4 bytes, which is best and why?
typedef unsigned long u32;
or
typedef unsigned int u32;
note: uint32_t is not an option
Nowadays every platform has stdint.h, or its C++ equivalent cstdint, which defines uint32_t. Please use the standard type rather than creating your own.
http://pubs.opengroup.org/onlinepubs/7999959899/basedefs/stdint.h.html
http://www.cplusplus.com/reference/cstdint/
http://msdn.microsoft.com/en-us/library/hh874765.aspx
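For example (C shown; in C++ include <cstdint> and use std::uint32_t):
#include <stdint.h>

uint32_t value = 0;   /* guaranteed to be exactly 32 bits on any conforming platform */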
The size will be the same for both, so it depends only on your use.
If you need to store decimal values, use long.
A better and more complete answer is here:
https://stackoverflow.com/questions/271076/what-is-the-difference-between-an-int-and-a-long-in-c/271132
Edit: I'm not sure about decimals with long; if someone can confirm, thanks.
Since you said the standard uint32_t is not an option, and long and int are both correct on 32-bit machines, I'll say
typedef unsigned int u32;
is a little better, because on two popular 64-bit machine data models (LLP64 and LP64), int is still 32-bit, while long could be 32-bit or 64-bit. See 64-bit data models
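If you must keep your own alias, a compile-time size check makes the 32-bit assumption explicit (a sketch using C11's _Static_assert; use static_assert in C++11):
typedef unsigned int u32;

/* Fails the build on any platform where int is not exactly 4 bytes,
   so the alias can never silently change width. */
_Static_assert(sizeof(u32) == 4, "u32 must be exactly 4 bytes");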

About using Bob Jenkins' perfect hash library

When I use Bob Jenkins's perfect hash package, after building the "perfect" binary I cannot even get the example to work with "./perfect < sample_input"; it always warns me "fatal error: Cannot perfect hash: cannot build tab[]". Has anyone met this issue before? Are there any other stable perfect hash libraries or packages? Thanks in advance!
Jenkins' perfect hash library is linked below:
http://burtleburtle.net/bob/hash/perfect.html
I faced the same problem with gcc and clang under 64 bit Linux and found the reason:
The type definitions of the 4-byte types ub4 and sb4 must be changed in standard.h from
typedef unsigned long int ub4;
typedef signed long int sb4;
to
typedef unsigned int ub4;
typedef signed int sb4;
or might be defined as aliases to types from stdint.h (uint32_t and int32_t).
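For instance, a sketch of the stdint.h variant (check standard.h for other places the old definitions are used before changing it):
#include <stdint.h>

typedef uint32_t ub4;   /* exactly 4 bytes, unsigned, on every platform */
typedef int32_t  sb4;   /* exactly 4 bytes, signed */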

What encoding do strings found in the Mach-O __DATA segment, __cfstring section use?

I'm wondering how to properly read strings from a specific section of a Mach-O binary. (This is a binary for iOS.)
I'm curious about the strings found in the __DATA segment, __cfstring section. This section appears to contain an array of simple structures:
NSConstantString
{
    Class class;
    const char *string;
    int length;
}
The question comes down to: how do you decide the encoding of the string?
It's described in the source of CFString, available here. It's either ASCII or UTF-16 (in the processor's endianness).
Also see the source code of clang, available here. Look for GenerateConstantString. Constant strings are eventually generated by this piece of code; look for GetAddrOfConstantCFString. The source code shows that the constant CFString has the format
struct __builtin_CFString {
    const int *isa; // point to __CFConstantStringClassReference
    int flags;
    const char *str;
    long length;
};
(at least on OS X; I'm not sure about iOS). flags tells you whether it's ASCII or UTF-16.
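As a rough sketch of how you might interpret one of these entries after reading it out of the section (the 64-bit field layout and the flag values 0x07C8/0x07D0 are my reading of clang's output and should be verified against the clang source):
#include <stdint.h>

/* Assumed layout of one __DATA,__cfstring entry in a 64-bit slice,
   with natural alignment padding after the 32-bit flags field. */
struct cfstring64 {
    uint64_t isa;      /* address of __CFConstantStringClassReference */
    uint32_t flags;    /* 0x07C8 = ASCII payload, 0x07D0 = UTF-16 (per clang; verify) */
    uint32_t padding;  /* alignment before the next pointer field */
    uint64_t str;      /* address of the character data, usually in __TEXT */
    uint64_t length;   /* character count (see the clang source for exact semantics) */
};

/* Nonzero if the payload pointed to by str should be decoded as UTF-16. */
static int cfstring_is_utf16(const struct cfstring64 *entry)
{
    return entry->flags == 0x07D0;   /* otherwise expect 0x07C8 and decode as ASCII */
}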