Can we use a C function-like macro to define variables?

#include <stdio.h>

#define def_1(var) { \
    int s[var]; \
}

int main() {
    def_1(2);
    s[0] = 1; s[1] = 3;
    printf("s[0]=%d\t s[1]=%d\n", s[0], s[1]);
    return 0;
}
The above code fails to compile with an error saying that s is not defined.

The compilation error occurs because you have curly braces around the definition of s: the braces introduce a block scope, so s goes out of scope as soon as the macro's block ends. Remove the curly braces from your macro and it should compile.
i.e. change:
#define def_1(var) { \
int s[var]; \
}
to
#define def_1(var) int s[var];

Yes you can, and yes it does. Your #define wraps the variable in a one-line block, so it goes out of scope immediately. I think you want to change this
#define def_1(var) { \
    int s[var]; \
}
to something like this
#define def_1(var) int s[var];
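
For reference, here is a complete program built around that fixed macro; it compiles and runs as expected. (A minimal sketch: the stdio.h include, the return value, and dropping the macro's trailing semicolon are my additions, not part of the original snippets.)

#include <stdio.h>

#define def_1(var) int s[var]   /* no trailing ';' -- the call site supplies it */

int main(void) {
    def_1(2);                    /* expands to: int s[2]; */
    s[0] = 1; s[1] = 3;          /* s stays in scope for the rest of main */
    printf("s[0]=%d\t s[1]=%d\n", s[0], s[1]);
    return 0;
}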

Error: expected initializer before 'bool'

I have this header file, and the GNU GCC compiler gives me an error on the line declaring the bool function. I cannot find anything wrong.
#ifndef GAMECREATOR_H
#define GAMECREATOR_H
bool isPerfectSquare (int x);
#endif // GAME_CREATOR_H
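
No answer is preserved for this question here. One common culprit, offered only as a guess: if this header is compiled as C, bool is not a built-in type before C23, so the declaration needs stdbool.h. A hedged sketch of that fix:

#ifndef GAMECREATOR_H
#define GAMECREATOR_H

#include <stdbool.h>   /* defines bool for C99/C11; C++ compilers don't need it */

bool isPerfectSquare(int x);

#endif // GAME_CREATOR_H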

Print UTF-32 string with wprintf

I am migrating some code from using wchar_t to char32_t, and when compiling with the -Werror=pointer-sign flag set, I am getting the following issue:
// main.c
#include <uchar.h>
#include <wchar.h>
int main(void) {
    wprintf(U"some data\n");
}
Compiling: gcc -std=c11 -Werror=pointer-sign main.c
Output:
main.c: In function ‘main’:
main.c:5:10: error: pointer targets in passing argument 1 of ‘wprintf’ differ in signedness [-Werror=pointer-sign]
wprintf(U"some data\n");
^~~~~~~~~~~~~~
In file included from main.c:2:
/usr/include/wchar.h:587:12: note: expected ‘const wchar_t * restrict’ {aka ‘const int * restrict’} but argument is of type ‘unsigned int *’
extern int wprintf (const wchar_t *__restrict __format, ...)
^~~~~~~
To remedy this, I can do:
wprintf((const int *)U"some data\n");
//or
printf("%ls\n", U"some data");
This is quite a pain, though. Is there a nice and easy way to do this? What is the real difference between const unsigned int * and const signed int *, aside from the type being pointed to? Is this possibly dangerous, or should I just disable the flag altogether?
char32_t is an unsigned type.
wchar_t is either signed or unsigned, depending on implementation. In your case, it is signed.
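If you want to check what your own implementation does, a quick throwaway test (not part of the original answer) is:

#include <stdio.h>
#include <wchar.h>

int main(void) {
    /* (wchar_t)-1 is negative only when wchar_t is a signed type */
    printf("wchar_t is %s\n", (wchar_t)-1 < 0 ? "signed" : "unsigned");
    return 0;
}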
You can't pass a pointer-to-unsigned where a pointer-to-signed is expected. So yes, you need a type-cast; however, you should cast to const wchar_t *, since that is what wprintf() actually expects (wchar_t just happens to be implemented as int on your compiler, but don't cast to that type directly):
wprintf((const wchar_t *)U"some data\n");
It doesn't get much cleaner than that, unless you wrap it in your own function, eg:
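/* this wrapper needs <stdarg.h> for va_list/va_start and <wchar.h> for vwprintf;
   <uchar.h> provides char32_t */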
int wprintf32(const char32_t *str, ...)
{
    va_list args;
    va_start(args, str);
    int result = vwprintf((const wchar_t *)str, args);
    va_end(args);
    return result;
}
wprintf32(U"some data\n");
Note that this code will not work properly at all on platforms where sizeof(wchar_t) < sizeof(char32_t), such as Windows. On those platforms, where sizeof(wchar_t) is 2, you will have to actually convert your string data from UTF-32 to UTF-16 instead, eg:
int wprintf32(const char32_t *str, ...)
{
    va_list args;
    int result;
    va_start(args, str);
    if (sizeof(wchar_t) < sizeof(char32_t))
    {
        wchar_t *wstr = convert_to_utf16(str); // <-- for you to implement
        result = vwprintf(wstr, args);
        free(wstr);   /* needs <stdlib.h> */
    }
    else
        result = vwprintf((const wchar_t *)str, args);
    va_end(args);
    return result;
}
wprintf32(U"some data\n");
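
One practical note when trying this out: wide-character output of non-ASCII text generally needs the locale set up first. A usage sketch (my addition, not part of the original answer):

#include <locale.h>
#include <wchar.h>

int main(void) {
    setlocale(LC_ALL, "");          /* without this, non-ASCII wide output may be mangled */
    wprintf(L"%ls\n", L"some data");
    return 0;
}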

Swift errors using #if, #endif

Using #if, #endif in Swift (using Xcode) produces errors if it cuts into the flow of an expression. [Screenshot of the resulting compiler errors not included.]
Does anyone know a solution to make this example work without repeating the entire code block twice? There can easily be situations where the entire block is very large.
EDIT: My sample was a bit too simple. Here is a new sample where the "else if" depends on the same define (DEBUG), so the "else if" must also be inside the #if and #endif. Other cases can be much more complex than this.
Ideally, limit the usage of #if as much as possible. Using preprocessor directives is always a bit of a code smell. In this case, you can simply use a boolean variable:
#if DEBUG
let debug = true
#else
let debug = false
#endif
Then simply use the variable:
var a = 0
var b = 0
...
else if debug && a == b {
}
In release mode the code will become unreachable and the optimizer will remove it anyway.
With a bit of imagination we can find other solutions. For example, we can move the check into a function:
func isDebugCheck(a: Int, b: Int) -> Bool {
    #if DEBUG
    return a == b
    #else
    return false
    #endif
}
or we can move the whole code into a separate function and replace the if-else with a return (or continue, depending on your needs), e.g.:
if a == 7 {
...
return
}
#if DEBUG
if a == b {
return
}
#endif
if ...
As #user28434 notes, there is no source-level preprocessor. This has gotten rid of a lot of very tricky preprocessor problems in C (such as the bizarre parenthesization needed to make macros expand correctly).
However, #if is integrated well into the language, and specifically supports switch for exactly these kinds of cases.
var a = 0
#if DEBUG
let b = 0
#endif
switch a {
case 7: a += 1
#if DEBUG
case b: a += 2
#endif
case 5: a += 3
default:
break
}
You can simply achieve this case with the code below:
if a == b {
    #if DEBUG
    a += 2
    #else
    a += 1
    #endif
} else if a == c {
    a += 3
}

iOS/Mac OS fluent matching framework that works with Swift?

Is there a fluent matching API that works for Swift code? The leading Objective-C matcher candidates seem to be OCHamcrest and Expecta, both of which rely on complex macros that (as per the docs) aren't available to Swift code, e.g.
#define HC_assertThat(actual, matcher) \
HC_assertThatWithLocation(self, actual, matcher, __FILE__, __LINE__)
(OCHamcrest)
#define EXP_expect(actual) _EXP_expect(self, __LINE__, __FILE__, ^id{ return EXPObjectify((actual)); })
(Expecta)
Is there another alternative that does work with Swift, or some way to wrap one or the other of these so they can be used with Swift?
ETA: For future readers -- I looked at SwiftHamcrest (per Jon Reid's answer) but for the time being I've settled on Quick/Nimble.
Complex preprocessor macros aren't translated to Swift alternatives. Apple has a blog post discussing Swift in which they describe how they built the assert function.
The first macro is easy to turn into a Swift function:
#define HC_assertThat(actual, matcher) \
HC_assertThatWithLocation(self, actual, matcher, __FILE__, __LINE__)
In Swift:
func HC_assertThat(caller: AnyObject, actual: String, matcher: String, file: String = __FILE__, line: UWord = __LINE__) {
    HC_assertThatWithLocation(caller, actual, matcher, file.fileSystemRepresentation, Int32(line))
}
#define EXP_expect(actual) _EXP_expect(self, __LINE__, __FILE__, ^id{ return EXPObjectify((actual)); })
In Swift:
func EXP_expect(caller: AnyObject, actual: String, line: UWord = __LINE__, file: String = __FILE__) {
    _EXP_expect(caller, Int32(line), file.fileSystemRepresentation, ({
        return EXPObjectify(caller)
    }))
}
Note that I am guessing on what you want passed to the preprocessor functions.
https://github.com/nschum/SwiftHamcrest is a Swift-native implementation of Hamcrest.

What does "error: conflicting types for '____'" mean?

I got an error that said "error: conflicting types for '____'". What does that mean?
Quickfix:
Make sure that your functions are declared once and only once before they are called. For example, change:
main(){ myfun(3.4); }
double myfun(double x){ return x; }
To:
double myfun(double x){ return x; }
main(){ myfun(3.4); }
Or add a separate function declaration:
double myfun(double x);
main(){ myfun(3.4); }
double myfun(double x){ return x; }
Possible causes for the error
Function was called before being declared
Function defined overrides a function declared in an included header.
Function was defined twice in the same file
Declaration and definition don't match (see the sketch after this list)
Declaration conflict in the included headers
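For instance, the "declaration and definition don't match" cause is enough on its own to trigger the error. A minimal sketch (myfun is a made-up name):

/* mismatch.c -- the "declaration and definition don't match" cause */
int myfun(int x);           /* declaration: int in, int out */

double myfun(double x) {    /* definition disagrees: */
    return x;               /* gcc: error: conflicting types for 'myfun' */
}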
What's really going on
error: conflicting types for ‘foo’ means that a function was declared or defined more than once with conflicting type signatures.
A file that includes two functions with the same name but different return types would throw this error, for example:
int foo(){return 1;}
double foo(){return 1.0;}
Indeed, when compiled with GCC we get the following errors:
foo.c:5:8: error: conflicting types for ‘foo’
double foo(){return 1.0;}
^
foo.c:4:5: note: previous definition of ‘foo’ was here
int foo(){return 1;}
^
Now, if instead we had a file with two function definitions with the same name
double foo(){return 1;}
double foo(){return 1.0;}
We would get a 'redefinition' error instead:
foo.c:5:8: error: redefinition of ‘foo’
double foo(){return 1.0;}
^
foo.c:4:8: note: previous definition of ‘foo’ was here
double foo(){return 1;}
^
Implicit function declaration
So why does the following code throw error: conflicting types for ‘foo’?
main(){ foo(); }
double foo(){ return 1.0; }
The reason is implicit function declaration.
When the compiler first encounters foo() in the main function, it will assume a type signature for the function foo of int foo(). By default, implicit functions are assumed to return integers, and the input argument types are derived from what you're passing into the function (in this case, nothing).
Obviously, the compiler is wrong to make this assumption, but the specs for the C (and thus Objective-C) language are old, cranky, and not very clever. Maybe implicitly declaring functions saved some development time by reducing compiler complexity back in the day, but now we're stuck with a terrible feature that should have never made it into the language. In fact, implicit declarations were made illegal in C99.
That said, once you know what's going on, it should be easy to dig out the root cause of your problem.
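For completeness, here is a fixed version of that snippet; the explicit prototype is the only addition:

#include <stdio.h>

double foo(void);           /* explicit prototype: no implicit 'int foo()' is assumed */

int main(void) {
    printf("%f\n", foo());  /* prints 1.000000 */
    return 0;
}

double foo(void) { return 1.0; }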
It's probably because a function with that name already exists in a library you're using. It happened to me with this function, while I was using stdio.h:
int getline(char s[], int lim)
{
    int c, i;
    for (i = 0; i < lim - 1 && (c = getchar()) != EOF && c != '\n'; ++i)
        s[i] = c;
    if (c == '\n') {
        s[i] = c;
        ++i;
    }
    s[i] = '\0';
    return i;
}
When I changed "getline" to "getlinexxx", gcc compiled it:
int getlinexxx(char s[], int lim)
{
    int c, i;
    for (i = 0; i < lim - 1 && (c = getchar()) != EOF && c != '\n'; ++i)
        s[i] = c;
    if (c == '\n') {
        s[i] = c;
        ++i;
    }
    s[i] = '\0';
    return i;
}
And the problem was gone.
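
The clash is easy to reproduce on glibc, where stdio.h exposes the POSIX getline by default (a minimal sketch; the exact behavior depends on your libc and feature-test macros, and compiling with a strict -std=c99 typically hides the library declaration):

#include <stdio.h>   /* on glibc, declares: ssize_t getline(char **, size_t *, FILE *) */

int getline(char s[], int lim);   /* error: conflicting types for 'getline' */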
What datatype is '___'?
My guess is that you're trying to initialize a variable of a type that can't accept the initial value. Like saying int i = "hello";
If you're trying to assign it from a call that returns an NSMutableDictionary, that's probably your trouble. Posting the line of code would definitely help diagnose warnings and errors in it.