Suppose I have the following macro:
#define xxx(x) printf("%s\n",x);
Now in certain files I want to use an "enhanced" version of this macro without changing its name. The new version reuses the functionality of the original version and does some more work.
#define xxx(x) do { xxx(x); yyy(x); } while(0)
This of course gives me a redefinition warning, but why do I get "'xxx' was not declared in this scope"? How should I define it properly?
EDIT: according to this http://gcc.gnu.org/onlinedocs/gcc-3.3.6/cpp/Self_002dReferential-Macros.html it should be possible
Not possible. Macros can use other macros, but they use the definition available at expansion time, not at definition time. And macros in C and C++ can't be recursive, so the xxx inside your new macro isn't expanded and is treated as an ordinary function call.
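For example (a small sketch of my own, not from the question), a macro body is only looked at when the macro is expanded, so it picks up whatever definitions exist at that point:
#include <stdio.h>
#define GREET() printf("%s\n", WHO)   /* WHO does not exist yet - still fine */
#define WHO "world"
int main(void)
{
    GREET();   /* expanded here, using the current definition of WHO: prints "world" */
    return 0;
}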
Self-referential macros do not work at all:
http://gcc.gnu.org/onlinedocs/cpp/Self_002dReferential-Macros.html#Self_002dReferential-Macros
If you're working in C++ you can obtain the same result with template functions and namespaces:
template <typename T> void xxx( T x ) {
printf( "%s\n", x );
}
namespace my_namespace {
template <typename T> void xxx( T x ) {
::xxx(x);
::yyy(x);
}
}
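A standalone usage sketch of this approach (my own example; note that ::yyy has to be declared before the namespace block, because the qualified call ::yyy(x) is looked up at template definition time):
#include <cstdio>
template <typename T> void xxx( T x ) { printf( "%s\n", x ); }
void yyy( const char *x )             { printf( "yyy: %s\n", x ); }
namespace my_namespace {
template <typename T> void xxx( T x ) { ::xxx(x); ::yyy(x); }
}
int main() {
    ::xxx( "plain" );                // original behaviour only
    my_namespace::xxx( "enhanced" ); // original behaviour plus yyy()
}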
You won't be able to reuse the old definition of the macro, but you can undefine it and make the new definition. Hopefully it isn't too complicated to copy and paste.
#ifdef xxx
#undef xxx
#endif
#define xxx(x) do { printf("%s\n", x); yyy(x); } while (0)
My recommendation is to define an xxx2 macro.
#define xxx2(x) do { xxx(x); yyy(x); } while(0)
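Used like this (a sketch; yyy stands for whatever extra work the enhanced version does):
xxx("plain only");    /* original macro, unchanged */
xxx2("plain + more"); /* expands to: do { xxx("plain + more"); yyy("plain + more"); } while(0) */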
If we know the type of the 'x' parameter of the 'xxx' macro, we can use the original macro inside a function and then redefine 'xxx' to call that function.
Original definition for the 'xxx' macro:
#define xxx(x) printf("xxx %s\n",x);
In a certain file, make the enhanced version of the 'xxx' macro:
/* Before redefining the "xxx" macro, use it in a function
 * that implements the enhanced version of "xxx".
 */
static inline void __body_xxx(const char *x)
{
xxx(x);
printf("enhanced version\n");
}
#undef xxx
#define xxx(x) __body_xxx(x)
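After the redefinition, every use of xxx in this file goes through the wrapper (a small sketch):
void demo(void)
{
    xxx("hello");   /* now expands to __body_xxx("hello"):
                       prints "xxx hello" and then "enhanced version" */
}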
From: https://gcc.gnu.org/onlinedocs/gcc/Push_002fPop-Macro-Pragmas.html
#define X 1
#pragma push_macro("X")
#undef X
#define X -1
#pragma pop_macro("X")
int x [X];   /* X is 1 again here: pop_macro restored the saved definition */
It is not exactly what you're asking for but it can help.
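Applied to the xxx macro from the question, it looks roughly like this (a sketch, assuming a yyy(x) is available; note the enhanced body still has to copy the original printf, since the saved definition is only restored later by pop_macro):
#define xxx(x) printf("%s\n", x)
#pragma push_macro("xxx")      /* save the current definition */
#undef xxx
#define xxx(x) do { printf("%s\n", x); yyy(x); } while (0)
/* ... code that should see the enhanced version ... */
#pragma pop_macro("xxx")       /* the original definition is back from here on */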
You can #undef a macro prior to giving it a new definition.
Example:
#ifdef xxx
#undef xxx
#endif
#define xxx(x) whatever
I have never heard of (or seen) a recursive macro, though. I don't think it is possible.
This answer doesn't answer your question exactly, but I feel it does a good job of working the way you would intend to use your method (if it were possible).
The idea is for all of your file-specific macros that do actual work to have a unique name (for example LOG_GENERIC and LOG_SPECIFIC), and to have the common token (in your case xxx) simply point to the currently appropriate macro.
Plus, using the non-standard but widely available #pragma push_macro and #pragma pop_macro, we can both modify the common xxx token and restore it to the previous version.
For example, imagine two header files, generic.hpp and specific.hpp, common token here being LOG:
// generic.hpp
#pragma once
#include <cstdio>
#define LOG_GENERIC(x) printf("INFO: " x "\n")
#define LOG LOG_GENERIC
void generic_fn(){LOG("generic");} // prints "INFO: generic\n"
// specific.hpp
#pragma once
#include "generic.hpp"
#define LOG_SPECIFIC(x) do {printf("<SPECIFIC> "); LOG_GENERIC(x);} while (0)
#pragma push_macro("LOG")
#undef LOG
#define LOG LOG_SPECIFIC
void specific_fn(){LOG("specific");} // prints "<SPECIFIC> INFO: specific\n"
#undef LOG
#pragma pop_macro("LOG")
By doing things this way we get the benefits of:
- an easy mechanism to modify LOG in a restorable way via #pragma push_macro and #pragma pop_macro
- being able to refer to specific LOG_* macros explicitly (LOG_SPECIFIC can use LOG_GENERIC)
- not being able to refer to LOG inside the LOG_SPECIFIC definition; we have to go through LOG_GENERIC. This is different from your question, but personally I am of the opinion that it is the better design; otherwise, macros defined above LOG_SPECIFIC would be able to affect it, which just sounds like the wrong thing to do every time.
Link to a github repository with the example above
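For completeness, a minimal usage sketch of the two headers (my own, not taken from the repository):
// main.cpp
#include "specific.hpp"  // also pulls in generic.hpp
int main() {
    generic_fn();    // INFO: generic
    specific_fn();   // <SPECIFIC> INFO: specific
    LOG("outside");  // LOG has been popped back to LOG_GENERIC: INFO: outside
    return 0;
}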
Related
I am writing a reference-counted linked list of characters data structure in C for practice. I want to try using SAL in it to annotate function parameters for this practice.
I have an input parameter (named This), which I want to annotate to make it clear that the specified parameter's members must be mutable in order for the function to behave as expected.
The situation is analogous to the code below.
#include <Windows.h>
typedef struct Box {
ULONG val;
} Box;
ULONG Box_decrement(_In_ Box *This) {
return InterlockedDecrement(&(This->val));
}
int main(int argc, char **argv) {
Box b = {2};
Box_decrement(&b);
return (BYTE)b.val;
};
Is there an existing SAL annotation that can be used to annotate the This parameter of the Box_decrement function to make it clear from the function signature that the function modifies one or more members of the Box that has been passed to it?
Something like _InternallyMutable_ (but one that actually exists):
#include <Windows.h>
typedef struct Box {
ULONG val;
} Box;
ULONG Box_decrement(_InternallyMutable_ _In_ Box *This) {
return InterlockedDecrement(&(This->val));
}
int main(int argc, char **argv) {
Box b = {2};
Box_decrement(&b);
return (BYTE)b.val;
};
Best solution so far (unfortunately, there does not seem to be any equivalent in SAL to denote _Internally_mutable_; there is _Unchanged_, which is the opposite):
#include <Windows.h>
#define _Internally_mutable_(expr) _At_(expr, _Out_range_(!=, _Old_(expr)))
typedef struct Box {
ULONG val;
} Box;
ULONG Box_decrement(_In_ _Internally_mutable_(This) Box *This) {
return InterlockedDecrement(&(This->val));
}
int main(int argc, char **argv) {
Box b = {2};
Box_decrement(&b);
return (BYTE)b.val;
};
Yes! You can. SAL is a wonderful DSL that lets you do basically anything you want if you're psychic enough to infer it from the little bits in the Windows SDK. I've even in the past been able to write super simple custom annotations to detect invalid HANDLE usage with _Post_satisfies_ and friends.
This code seems to work:
_At_(value, _Out_range_(!=, _Old_(value)))
void change_value_supposed_to(int& value) noexcept {
//value += 1;
}
...Running with all native rules in code analysis, I get a warning like this:
Warning C28196 The requirement that '_Param_(1)!=(("pre"), _Param_(1))' is not satisfied. (The expression does not evaluate to true.)
(there, substitute value with your variable)
For _Internally_mutable_, I can do it in the "above the function" style of SAL:
#define _Internally_mutable_(expr) _At_(expr, _Out_range_(!=, _Old_(expr)))
_Internally_mutable_(value)
void change_value_supposed_to_internally_mutable(int& value) noexcept {
//value += 1;
(void)value;
}
...but not inline WITHOUT being repetitive, as you wanted. Not sure why right now - _Curr_ doesn't seem to be working? - I may need another layer of indirection or something. Here's what it looks like:
#define _Internally_mutable_inline_(value) _Out_range_(!=, _Old_(value))
void change_value_supposed_to_internally_mutable_inline(_Internally_mutable_inline_(value) int& value) noexcept {
//value += 1;
(void)value;
}
How I figured this out:
sal.h defines an _Unchanged_ annotation (despite doing web dev for several years now and little C++, I remembered this when I saw your question in a google alert for SAL!):
// annotation to express that a value (usually a field of a mutable class)
// is not changed by a function call
#define _Unchanged_(e) _SAL2_Source_(_Unchanged_, (e), _At_(e, _Post_equal_to_(_Old_(e)) _Const_))
...if you look at this macro closely, you'll see that it just substitutes as:
_At_(e, _Post_equal_to_(_Old_(e)) _Const_)
...and further unrolling it, you'll see _Post_equal_to_ is:
#define _Post_equal_to_(expr) _SAL2_Source_(_Post_equal_to_, (expr), _Out_range_(==, expr))
Do you see it? All it's doing is saying the _Out_range_ is equal to the expression you specify. _Out_range_ (and all the other range SAL macros) appear to accept all of the standard C operators. That behavior is not documented, but years of reading through the Windows SDK headers shows me it's intentional! Here, all we need to do is use the not equals operator with the _Old_ intrinsic, and the analyzer's solver should be able to figure it out!
_Unchanged_ itself is broken?
To my great confusion, _Unchanged_ itself seems broken:
_Unchanged_(value)
void change_value_not_supposed_to(_Inout_ int& value) noexcept {
value += 1;
}
...that produces NO warning. Without the _Inout_, code analysis is convinced that value is uninitialized on function entry. This makes no sense of course, and I'm calling this directly from main in the same file. Twiddling with inlining or link-time code generation doesn't seem to help.
I've played a lot with it, and various combinations of _Inout_, even _Post_satisfies_. I should file a bug, but I'm already distracted here, I'm supposed to be doing something else right now :)
Link back here if anybody does file a bug. I don't even know what the MSVC/Compiler teams use for bug reporting these days.
Fun facts
5-6 years ago I tried to convince Microsoft to open source the SAL patents! It would have been great, I would have implemented them in Clang, so we'd all be able to use it across platforms! I might have even kicked off a career in static-analysis with it. But alas, they didn't want to do it in the end. Open sourcing them would have meant they might have to support it and/or any extensions the community might have introduced, and I kinda understand why they didn't want that. It's a shame, I love SAL, and so do many others!
I'm writing a tool that uses macros to generate enums. When done, I want to undef all those macros, but I don't know all the names yet. Some will come later in development. So I want to make a generic undef like...
#undef ENUM_*
Is that possible? Simplified, the macros look like this:
First...
#define ENUM_VALUE(VALUE) VALUE,
#define ENUM_STRING(STRING) #STRING,
#define ENUM_GENERATE(NAME)\
namespace NAME {\
enum Enum: int { ENUM_##NAME(ENUM_VALUE) };\
const char *Names[] = { ENUM_##NAME(ENUM_STRING) };\
}
Then for each enum I define...
#define ENUM_MyEnum(VALUE)\
VALUE(Value1)\
VALUE(Value2)\
VALUE(Value3)
ENUM_GENERATE(MyEnum)
It generates a synchronized enum and string table, as if I had declared
namespace MyEnum {
enum Enum: int { Value1, Value2, Value3 };
const char *Names[] = { "Value1", "Value2", "Value3" };
}
The only problem is that I end up with a truckload of macros. Some I don't know yet, because they will be defined later when new enums are created. But they all start with ENUM_.
Is there a simple way to undef them all? Thanks
Here is a detailed proposal for the "base macro" idea from the comments:
It leaves some defined macros behind, but guarantees that any use is blocked by a single macro not being defined.
I.e. you undef that single one, and all attempts to use one of the macros from the group it represents result in an "undefined" error, or maybe in some defined expansion, which is however sure to annoy the compiler.
// Infrastructure
#define BASE_BASE(ParWhich) BASED_##ParWhich
// Secondary infrastructure, needed for supporting parameterised macros.
#define BASE_NOPAR(ParWhich) BASE_BASE(ParWhich)
#define BASE_ONEPAR(ParWhich, ParOne) BASE_BASE(ParWhich)(ParOne)
// Potentially many macros being defined in a slightly "get-used-to" manner.
#define BASED_A Hello
#define BASED_B(ParWhat) ParWhat
#define BASED_C(ParWord) ParWord
// Example line, to be expanded by prepro with defined macro BASE_BASE.
BASE_NOPAR(A) BASE_ONEPAR(B,PreProcessor) BASE_NOPAR(B)(World.)
// The one line which should lead to compiler errors for any use of one of the macros.
#undef BASE_BASE
// Same example again.
BASE_NOPAR(A) BASE_ONEPAR(B,PreProcessor) BASE_NOPAR(B)(World.)
// Not necessary, just to make sure and to make readable prepro-output.
#define BASE_BASE(ParIgnore) Compiler, please fail here!
// Same example again.
BASE_NOPAR(A) BASE_ONEPAR(B,PreProcessor) BASE_NOPAR(B)(World.)
Using the parameter-free base macro with a parameter in a following pair of parentheses, as done in the example BASE_NOPAR(B)(World.), should not be done directly in code anywhere, because in my opinion it would soon turn into a maintenance nightmare.
I.e. it should only be done in a central "core" place, as demonstrated otherwise.
I wouldn't want any code written like that.
Output:
>gcc -E -P BaseMacro.c
Hello PreProcessor World.
BASE_BASE(A) BASE_BASE(B)(PreProcessor) BASE_BASE(B)(World.)
Compiler, please fail here! Compiler, please fail here!(PreProcessor) Compiler, please fail here!(World.) Compiler, please fail here!(World.)
I am trying to properly declare and define global variables in separate files and include them in a third file which deals with class declaration.
The three files are:
1) global.h
#ifndef GLOBAL_H_INCLUDED
#define GLOBAL_H_INCLUDED
extern const int marker_num;
extern const int dim;
using namespace std;
#endif // GLOBAL_H_INCLUDED
2) global.cpp
#include <iostream>
#include <cstdio>
#include <cmath>
#include "global.h"
#include "WorldState.h"
#include "Robot.h"
#include "Sensor.h"
#include "Marker.h"
constexpr const int marker_num = 10;
constexpr const int dim = (2 * marker_num) + 3;
3) WorldState.h
#ifndef WORLDSTATE_H
#define WORLDSTATE_H
#include "global.h"
#include "global.cpp"
class WorldState{
public:
WorldState(float a[], float b[dim][dim]);
get_wstate();
protected:
private:
float w_state[];
float covar_matrix[dim][dim];
};
#endif // WORLDSTATE_H
I am using the global variable dim to declare and define a multidimensional array. I have declared dim inside global.h and defined it inside global.cpp. Now, I have a class called WorldState and inside its header, I am using dim. If I comment out #include "global.cpp", it throws the following error:
C:\Users\syamp\Documents\codeblocks\slam\WorldState.h|10|error: array bound is not an integer constant before ']' token
My understanding is that including the .h file includes the corresponding .cpp as well, and that all declarations should be inside .h and all definitions should be inside .cpp. However, it doesn't seem to work in this case.
1) If I decide to include global.cpp file inside WorldState.h, isn't it bad programming practice? I am trying to write a good code not just a code that works.
2) An alternative is to define values of variable(s) dim (and marker_num) inside global.h. Is that good programming practice?
3) I believe there is something that I am missing. Kindly suggest the best method to resolve this issue. I am using codeblocks and C++11. Thanks in advance.
I am using the global variable dim to declare and define a multidimensional array.
When declaring a fixed-length array at compile-time, the value(s) of its dimension(s) must be known to the compiler, but your separation prevents the value of dim from being known to the compiler at all, so dim cannot be used to specify fixed array dimensions. Any code that uses dim will just compile into a reference to it, and then the linker will resolve the references after compilation is done. Just because dim is declared as const does not make it suitable as a compile-time constant. To do that, you must define its value in its declaration, eg:
#ifndef GLOBAL_H_INCLUDED
#define GLOBAL_H_INCLUDED
static constexpr int marker_num = 10;
static constexpr int dim = (2 * marker_num) + 3;
using namespace std;
#endif // GLOBAL_H_INCLUDED
Otherwise, if you keep dim's declaration and definition in separate files, you will have to dynamically allocate the array at run-time instead of statically at compile-time.
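For example, a minimal sketch of the run-time alternative (my own illustration, using std::vector so the dimension only has to be known when the object is constructed):
#include <vector>
class WorldState {
public:
    explicit WorldState(int dim)
        : covar_matrix(dim, std::vector<float>(dim, 0.0f)) {}  // dim x dim, sized at run-time
private:
    std::vector<std::vector<float>> covar_matrix;
};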
I have declared dim inside global.h and defined it inside global.cpp.
That is fine for values you don't need to use until run-time. That will not work for values you need to use at compile-time.
My understanding is that including the .h file includes the corresponding .cpp as well
That is not even remotely true. The project/makefile brings in the .cpp file when invoking the compiler. The .h file has nothing to do with that.
that all declarations should be inside .h and all definitions should be inside .cpp.
Typically yes, but not always.
If I decide to include global.cpp file inside WorldState.h, isn't it bad programming practice?
Yes.
An alternative is to define values of variable(s) dim (and marker_num) inside global.h. Is that good programming practice?
Yes, if you want to use them where compile-time constants are expected.
I am having difficulty getting the Eclipse Indexer (Codan) to recognize certain data declarations in header files. There is a new preference to Index all header variants, but little explanation as to what this means. Enabling the preference seems to fix the problem. But I still would like to know what the preference does exactly.
Let's say you have header a.h like this:
#pragma once
#ifndef SYMBOL
#define SYMBOL int
#endif
struct S
{
SYMBOL sym;
};
And now if you include your header like this:
struct UserSymbol
{
int i, j, k;
};
#define SYMBOL UserSymbol
#include "a.h"
S var;
int main()
{
var.sym.i = 123;
return 0;
}
then Eclipse CDT may not recognize sym.i.
You may have more complex examples with more deeply nested inclusions and so on.
EDIT:
But if you add a.h to the "Index all variants of specific headers" list or check "Index all header variants", Eclipse will build several variants of the a.h index and will "know" that you have defined your specific SYMBOL.
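For contrast, a different source file that includes a.h without defining SYMBOL first gets the int variant of the struct (a sketch of my own), which is exactly why the indexer has to build more than one variant of the header to resolve both uses:
// other.cpp - SYMBOL not defined before the include
#include "a.h"
int read_sym(const S &s)
{
    return s.sym;   // in this translation unit S::sym is a plain int
}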
Working from:
Is ignoring __attribute__((packed)) always safe in SWIG interfaces?
Visual C++ equivalent of GCC's __attribute__ ((__packed__))
My .i does:
#define __attribute__(x)
then uses %include to include my cross-platform definition of PACK():
#if defined(SWIG)
#define PACK(...) VA_ARGS
#elif defined(_MSC_VER)
#define PACK(__Decl__) __pragma(pack(push, 1)) __Decl__ __pragma(pack(pop))
#else // GCC
#define PACK(__Decl__) __Decl__ __attribute__ ((packed))
#endif
Then I have code like:
PACK(
typedef struct {
uint8_t something;
uint32_t more;
} ) aName;
With earlier versions of the PACK() macro, I got a syntax error from SWIG on the typedef line. Now I get past that, but when compiling the SWIG-generated .c file, I have get and set functions that complain that aName doesn't exist. The messages are like (edited):
libudr_perl_swig.c: In function '_wrap_aName_set':
libudr_perl_swig.c:2367:20: error: expected identifier or '(' before
'=' token libudr_perl_swig.c: In function '_wrap_aName_get':
libudr_perl_swig.c:2377:3: error: expected expression before 'aName'
SWIG sort of seems to know about my struct -- it creates access functions -- but then doesn't expose the type enough for the access functions to find it.
Before I started to make this cross-platform -- when it was still Linux-only with __attribute__ ((packed)) -- it worked in SWIG. And it still works in Linux. So there appears to be something about SWIG's interpretation of PACK() that is flawed.
The old way generated a lot of per-field code like:
XS(_wrap_aName_something_set) {
{
aName *arg1 = (aName *) 0 ;
...
the new way generates a little per-struct code like:
SWIGCLASS_STATIC int _wrap_aName_set(pTHX_ SV* sv, MAGIC * SWIGUNUSEDPARM(mg)) {
MAGIC_PPERL
{
Why should my PACK() (which should be a no-op in SWIG) do that?
Googling "cpp standard variadic macros" leads to http://en.wikipedia.org/wiki/Variadic_macro which notes the expansion of ... is __VA_ARGS__, not VA_ARGS (as I had found somewhere). When I change my macro definition to be:
#if defined(SWIG)
#define PACK(...) __VA_ARGS__
#elif defined(_MSC_VER)
#define PACK(__Decl__) __pragma(pack(push, 1)) __Decl__ __pragma(pack(pop))
#else // GCC
#define PACK(__Decl__) __Decl__ __attribute__ ((packed))
#endif
it works.
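For reference, with the corrected definition the SWIG pass simply sees the plain struct, because PACK(...) expands to its arguments unchanged (a sketch of the expansion; uint8_t/uint32_t come from <stdint.h>):
/* PACK( typedef struct { ... } ) aName;  expands, when SWIG is defined, to: */
typedef struct {
    uint8_t something;
    uint32_t more;
} aName;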