Compilation error on blocks with return type - iPhone

I have the following block code:
typedef BOOL(^FieldValidationBlock)(NSString *);
FieldValidationBlock aBlock = ^(NSString *input) {
    return ([input length] == 10);
};
which throws a compilation error that states the return type is int and should be BOOL.
When I add a cast it works just fine:
typedef BOOL(^FieldValidationBlock)(NSString *);
FieldValidationBlock aBlock = ^(NSString *input) {
    return (BOOL)([input length] == 10);
};
Why does this happen?

Because BOOL is an Objective-C type, and the comparison operators are standard C. In standard C, the result of a comparison operator has type int. This is important to know: when you negate a value that you assume to be boolean but is in fact an int, the result is not necessarily what you expect.
In your example, casting to a BOOL is fine.
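An alternative to the cast is to give the block literal an explicit return type, which many consider cleaner. A minimal sketch, using the same typedef as above:
typedef BOOL (^FieldValidationBlock)(NSString *);

FieldValidationBlock aBlock = ^BOOL(NSString *input) {
    // the int result of == is converted to BOOL by the declared return type
    return [input length] == 10;
};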

Related

Print UTF-32 string with wprintf

I am migrating some code from using wchar_t to char32_t, and when compiling with the -Werror=pointer-sign flag set, I am getting the following issue:
// main.c
#include <uchar.h>
#include <wchar.h>

int main(void) {
    wprintf(U"some data\n");
}
Compiling: gcc -std=c11 -Werror=pointer-sign main.c
Output:
main.c: In function ‘main’:
main.c:5:10: error: pointer targets in passing argument 1 of ‘wprintf’ differ in signedness [-Werror=pointer-sign]
wprintf(U"some data\n");
^~~~~~~~~~~~~~
In file included from main.c:2:
/usr/include/wchar.h:587:12: note: expected ‘const wchar_t * restrict’ {aka ‘const int * restrict’} but argument is of type ‘unsigned int *’
extern int wprintf (const wchar_t *__restrict __format, ...)
^~~~~~~
To remedy this, I can do:
wprintf((const int *)U"some data\n");
//or
printf("%ls\n", U"some data");
This is quite a pain, though. Is there a nice and easy way to do this? What is the real difference between const unsigned int * and const signed int *, aside from the data type each points to? Is this possibly dangerous, or should I just disable the flag altogether?
char32_t is an unsigned type.
wchar_t is either signed or unsigned, depending on implementation. In your case, it is signed.
You can't pass a pointer-to-unsigned where a pointer-to-signed is expected. So yes, you need a cast; however, you should cast to const wchar_t *, since that is what wprintf() actually expects (wchar_t just happens to be implemented as int on your compiler, but don't cast to that directly):
wprintf((const wchar_t *)U"some data\n");
It doesn't get much cleaner than that, unless you wrap it in your own function, e.g.:
int wprintf32(const char32_t *str, ...)
{
    va_list args;
    va_start(args, str);
    int result = vwprintf((const wchar_t *)str, args);
    va_end(args);
    return result;
}
wprintf32(U"some data\n");
Note that this code will not work properly at all on platforms where sizeof(wchar_t) < sizeof(char32_t), such as Windows, where sizeof(wchar_t) is 2. On those platforms, you will have to actually convert your string data from UTF-32 to UTF-16 instead, e.g.:
int wprintf32(const char32_t *str, ...)
{
    va_list args;
    int result;
    va_start(args, str);
    if (sizeof(wchar_t) < sizeof(char32_t))
    {
        // convert_to_utf16() is left for you to implement; use a separate
        // variable name so the original str parameter is not shadowed
        wchar_t *wstr = convert_to_utf16(str);
        result = vwprintf(wstr, args);
        free(wstr);
    }
    else
        result = vwprintf((const wchar_t *)str, args);
    va_end(args);
    return result;
}
wprintf32(U"some data\n");

String value to UnsafePointer<UInt8> function parameter behavior

I found that the following code compiles and works:
func foo(p: UnsafePointer<UInt8>) {
    var p = p
    for p; p.memory != 0; p++ {
        print(String(format: "%2X", p.memory))
    }
}
let str:String = "今日"
foo(str)
This prints E4BB8AE697A5, which is a valid UTF-8 representation of 今日.
As far as I know, this is undocumented behavior. From the documentation:
When a function is declared as taking a UnsafePointer argument, it can accept any of the following:
nil, which is passed as a null pointer
An UnsafePointer, UnsafeMutablePointer, or AutoreleasingUnsafeMutablePointer value, which is converted to UnsafePointer if necessary
An in-out expression whose operand is an lvalue of type Type, which is passed as the address of the lvalue
A [Type] value, which is passed as a pointer to the start of the array, and lifetime-extended for the duration of the call
In this case, str is none of them.
Am I missing something?
ADDED:
And it doesn't work if the parameter type is UnsafePointer<UInt16>:
func foo(p: UnsafePointer<UInt16>) {
    var p = p
    for p; p.memory != 0; p++ {
        print(String(format: "%4X", p.memory))
    }
}
let str:String = "今日"
foo(str)
// ^ 'String' is not convertible to 'UnsafePointer<UInt16>'
Even though the internal String representation is UTF-16:
let str = "今日"
var p = UnsafePointer<UInt16>(str._core._baseAddress)
for p; p.memory != 0; p++ {
    print(String(format: "%4X", p.memory)) // prints 4ECA65E5, which is UTF-16 今日
}
This works because of one of the interoperability changes the Swift team has made since the initial launch; you're right that it looks like it hasn't made it into the documentation yet. String works where an UnsafePointer<UInt8> is required so that you can call C functions that expect a const char * parameter without a lot of extra work.
Look at the C function strlen, defined in "shims.h":
size_t strlen(const char *s);
In Swift it comes through as this:
func strlen(s: UnsafePointer<Int8>) -> UInt
Which can be called with a String with no additional work:
let str = "Hi."
strlen(str)
// 3
Look at the revisions on this answer to see how C-string interop has changed over time: https://stackoverflow.com/a/24438698/59541

Casting nil in block

I was just quickly playing with blocks today and I came across the error:
NSString *(^testBlock)(int) = ^(int option) {
    if (option == 1) return @"ONE";
    if (option == 2) return @"TWO";
    return nil;
};
NSLog(@"OUTPUT: %@", testBlock(4));
Return type 'void *' must match previous return type 'NSString *' when block literal has unspecified explicit return type
As I really wanted to return nil if neither "1" nor "2" was input, I decided to simply cast the final return back to an NSString using:
NSString *(^testBlock)(int) = ^(int option) {
    if (option == 1) return @"ONE";
    if (option == 2) return @"TWO";
    return (NSString *)nil;
};
This works just fine. I was just curious whether this is the correct solution, or even bad practice, as I have never thought about casting nil before.
It's not the best approach.
You should correct the first line to this:
NSString *(^testBlock)(int) = ^NSString *(int option) {
    if (option == 1) return @"ONE";
    if (option == 2) return @"TWO";
    return nil;
};
This way the block literal has its return type specified and the error goes away, correctly.
EDIT: Adding an explanation of the initial error:
A block without an explicit return type has its return type inferred by the compiler (which doesn't happen with functions). When you have two return statements in the block with different types (note that nil is a void *), the compiler can't infer the return type and reports an error.
To fix that error you have to manually specify a return type to avoid ambiguity for the compiler.
As a good practice, never return different types from the same block unless you are using polymorphism.
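To illustrate the polymorphism caveat, here is a small sketch (names are illustrative, not from the question) where returning two different concrete classes is fine, because both come back as the declared NSString * type:
NSString *(^nameBlock)(BOOL) = ^NSString *(BOOL wantsMutable) {
    if (wantsMutable)
        return [NSMutableString stringWithString:@"mutable"]; // NSMutableString is-a NSString
    return @"immutable";
};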

What does error conflicting types for '' mean?

I got an error that said "error: conflicting types for '____'". What does that mean?
Quickfix:
Make sure that your functions are declared once and only once before they are called. For example, change:
main(){ myfun(3.4); }
double myfun(double x){ return x; }
To:
double myfun(double x){ return x; }
main(){ myfun(3.4); }
Or add a separate function declaration:
double myfun(double x);
main(){ myfun(3.4); }
double myfun(double x){ return x; }
Possible causes for the error
Function was called before being declared
Function defined overrides a function declared in an included header.
Function was defined twice in the same file
Declaration and definition don't match
Declaration conflict in the included headers
What's really going on
error: conflicting types for ‘foo’ means that a function was defined more than once with different type signatures.
A file that includes two functions with the same name but different return types would throw this error, for example:
int foo(){return 1;}
double foo(){return 1.0;}
Indeed, when compiled with GCC we get the following errors:
foo.c:5:8: error: conflicting types for ‘foo’
double foo(){return 1.0;}
^
foo.c:4:5: note: previous definition of ‘foo’ was here
int foo(){return 1;}
^
Now, if instead we had a file with two function definitions with the same name:
double foo(){return 1;}
double foo(){return 1.0;}
We would get a 'redefinition' error instead:
foo.c:5:8: error: redefinition of ‘foo’
double foo(){return 1.0;}
^
foo.c:4:8: note: previous definition of ‘foo’ was here
double foo(){return 1;}
^
Implicit function declaration
So why does the following code throw error: conflicting types for ‘foo’?
main(){ foo(); }
double foo(){ return 1.0; }
The reason is implicit function declaration.
When the compiler first encounters foo() in the main function, it will assume a type signature for the function foo of int foo(). By default, implicit functions are assumed to return integers, and the input argument types are derived from what you're passing into the function (in this case, nothing).
Obviously, the compiler is wrong to make this assumption, but the specs for the C (and thus Objective-C) language are old, cranky, and not very clever. Maybe implicitly declaring functions saved some development time by reducing compiler complexity back in the day, but now we're stuck with a terrible feature that should have never made it into the language. In fact, implicit declarations were made illegal in C99.
That said, once you know what's going on, it should be easy to dig out the root cause of your problem.
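For completeness, a minimal sketch of fixing the implicit-declaration example above by declaring the function before it is called (foo is just the placeholder name used above):
double foo(void);   /* prototype, visible before the call in main */

int main(void)
{
    foo();
    return 0;
}

double foo(void)
{
    return 1.0;
}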
It's probably because your function "_" already exists in your library. It happened to me with this function (I was including stdio.h, which already declares a getline function):
int getline(char s[], int lim)
{
    int c, i;

    for (i = 0; i < lim - 1 && (c = getchar()) != EOF && c != '\n'; ++i)
        s[i] = c;
    if (c == '\n') {
        s[i] = c;
        ++i;
    }
    s[i] = '\0';
    return i;
}
When I changed "getline" to "getlinexxx" and gcc compiled it:
int getlinexxx(char s[], int lim)
{
    int c, i;

    for (i = 0; i < lim - 1 && (c = getchar()) != EOF && c != '\n'; ++i)
        s[i] = c;
    if (c == '\n') {
        s[i] = c;
        ++i;
    }
    s[i] = '\0';
    return i;
}
And the problem was gone.
What datatype is '___'?
My guess is that you're trying to initialize a variable of a type that can't accept the initial value. Like saying int i = "hello";
If you're trying to assign it from a call that returns an NSMutableDictionary, that's probably your trouble. Posting the line of code would definitely help diagnose warnings and errors in it.

How do I test if a primitive in Objective-C is nil?

I'm doing a check in an iPhone application:
int var;
if (var != nil)
It works, but in Xcode this generates the warning "comparison between pointer and integer." How do I fix it?
I come from the Java world, where I'm pretty sure the above statement would fail to compile.
Primitives can't be nil. nil is reserved for pointers to Objective-C objects. nil is technically a pointer type, and mixing pointers and integers without a cast will almost always result in a compiler warning, with one exception: it's perfectly ok to implicitly convert the integer constant 0 to a pointer without a cast.
If you want to distinguish between 0 and "no value", use the NSNumber class:
NSNumber *num = [NSNumber numberWithInt:0];
if(num == nil) // compare against nil
; // do one thing
else if([num intValue] == 0) // compare against 0
; // do another thing
if (var) {
    ...
}
Welcome to the wonderful world of C. Any value not equal to the integer 0 or a null pointer is true.
But you have a bug: ints cannot be null. They're value types just like in Java.
If you want to "box" the integer, then you need to ask it for its address:
int can_never_be_null = 42; // int in Java
int *can_be_null = &can_never_be_null; // Integer in Java
*can_be_null = 0; // Integer.set or whatever
can_be_null = 0; // This is setting "the box" to null,
// NOT setting the integer value
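To round this out, a minimal, self-contained sketch (variable names are illustrative, not from the question) of checking such a "boxed" int before using it:
#import <Foundation/Foundation.h>

int main(void)
{
    int value = 42;        // plain int, can never be "nil"
    int *maybe = &value;   // the pointer acts as the nullable "box"

    maybe = NULL;          // empty the box (analogous to a null Integer in Java)

    if (maybe != NULL)
        NSLog(@"boxed value: %d", *maybe); // safe to dereference
    else
        NSLog(@"no value");                // nothing to read

    return 0;
}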