What is the Julia equivalent of the following C code:
#ifdef _USE_NATURAL
const scalar c=1.0;
const scalar e=0.302822;
#else
const scalar c=2.99792458e10;
const scalar e=4.80320425e-10;
#endif
I need c and e to be defined at module level. They are just constants, but I want to give the user the option to choose which set of constants to use (the sets physically correspond to different systems of units; this is a physical simulation).
This is baby-easy in C thanks to the preprocessor, but I can't seem to figure out how to change the behavior of modules on import. Is it possible?
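For reference, the C version is typically selected with a compiler flag; a minimal sketch, assuming scalar is a typedef for double (the file name and compiler invocations are illustrative):

/* constants.h -- hypothetical header; `scalar` is assumed to be typedef'd to double */
typedef double scalar;

#ifdef _USE_NATURAL
static const scalar c = 1.0;            /* natural-unit values */
static const scalar e = 0.302822;
#else
static const scalar c = 2.99792458e10;  /* Gaussian/CGS values */
static const scalar e = 4.80320425e-10;
#endif

/* The unit system is chosen at compile time:
 *   cc -D_USE_NATURAL sim.c    -> natural units
 *   cc sim.c                   -> CGS units
 */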
A macro can get you part of the way
julia> macro use_natural(t)
           if eval(t) == 1
               return esc(quote
                   const c = 1.0
                   const e = 0.302822
               end)
           else
               return esc(quote
                   const c = 2.99793e10
                   const e = 4.803e-10
               end)
           end
       end
julia> userchoice = 0
0
julia> @eval @use_natural $userchoice
4.803e-10
julia> c, e
(2.99793e10,4.803e-10)
However, it sounds like you want userchoice to be defined at import time, based on the global namespace of another module... I'm not sure that can be accomplished.
Related
I am beginning my journey of learning Rust. I came across this line in Rust by Example:
However, unlike macros in C and other languages, Rust macros are expanded into abstract syntax trees, rather than string preprocessing, so you don't get unexpected precedence bugs.
Why is an abstract syntax tree better than string preprocessing?
If you have this in C:
#define X(A,B) A+B
int r = X(1,2) * 3;
The value of r will be 7, because the preprocessor expands it to 1+2 * 3, which is 1+(2*3).
In Rust, you would have:
macro_rules! X { ($a:expr,$b:expr) => { $a+$b } }
let r = X!(1,2) * 3;
This will evaluate to 9, because the compiler will interpret the expansion as (1+2)*3. This is because the compiler knows that the result of the macro is supposed to be a complete, self-contained expression.
That said, the C macro could also be defined like so:
#define X(A,B) ((A)+(B))
This would avoid any non-obvious evaluation problems, including the arguments themselves being reinterpreted due to context. However, when you're using a macro, you can never be sure whether or not the macro has correctly accounted for every possible way it could be used, so it's hard to tell what any given macro expansion will do.
By using AST nodes instead of text, Rust ensures this ambiguity can't happen.
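To make the difference concrete, here is a small self-contained C program (the macro names X_BAD and X_GOOD are ours, not from the answers above):

#include <stdio.h>

#define X_BAD(A, B)  A + B           /* raw textual expansion, no parentheses    */
#define X_GOOD(A, B) ((A) + (B))     /* arguments and whole result parenthesized */

int main(void) {
    int r_bad  = X_BAD(1, 2) * 3;    /* expands to 1 + 2 * 3       -> 7 */
    int r_good = X_GOOD(1, 2) * 3;   /* expands to ((1) + (2)) * 3 -> 9 */
    printf("%d %d\n", r_bad, r_good);  /* prints: 7 9 */
    return 0;
}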
A classic example using the C preprocessor is
#define MUL(a, b) a * b
// ...
int res = MUL(x + y, 5);
The use of the macro will expand to
int res = x + y * 5;
which is very far from the expected
int res = (x + y) * 5;
This happens because the C preprocessor really just does simple text-based substitution; it is not an integral part of the language itself. Preprocessing and parsing are two separate steps.
If the preprocessor instead parsed the macro like the rest of the compiler, which happens for languages where macros are part of the actual language syntax, this is no longer a problem as things like precedence (as mentioned) and associativity are taken into account.
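For completeness, the usual C-side fix is to parenthesize both the arguments and the whole replacement (this still does not protect against arguments with side effects being evaluated more than once by macros that use them more than once):

/* Fully parenthesized variant of the MUL macro above */
#define MUL(a, b) ((a) * (b))

/* MUL(x + y, 5) now expands to ((x + y) * (5)), which matches the intent. */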
Prompted by Lyndon's question earlier today:
a.
julia> function f1(x::Float64)
           const y = x;
           y = "This should throw an error since y is of constant type";
           return y;
       end
f1 (generic function with 1 method)
julia> f1(1.0)
"This should throw an error since y is of constant type"
Why does the const keyword not work as expected here? (i.e., disallow assigning a string to y which has been declared as const).
b.
julia> function f2(x::Float64)
           show(x);
           const x = 1.;
       end
f2 (generic function with 1 method)
julia> f2(1.0)
ERROR: UndefVarError: x not defined
Stacktrace:
[1] f2(::Float64) at ./REPL[1]:2
Why does defining x as const on line 3 affect the value of x on line 2?
c.
In particular, this prevents me from doing:
function f(x::Float64)
    const x = x; # ensure x cannot change type, to simulate "strong typing"
    x = "This should throw an error";
end
I was going to offer this as a way to simulate "strong typing", with regard to Lyndon's "counterexample" comment, but it backfired on me, since this function breaks at line 2, rather than line 3 as I expected it to.
What is causing this behaviour? Would this be considered a bug, or intentional behaviour?
Naming conventions aside, is there a more acceptable way to prevent an argument passed into a function from having its type altered?
(as in, is there a defined and appropriate way to do this: I'm not after workarounds, e.g. creating a wrapper that converts x to an immutable type etc)
EDIT:
So far, this is the only variant that allows me to enforce constness in a function, but it still requires the introduction of a new variable name:
julia> function f(x::Float64)
           const x_const::Float64 = x;
           x_const = "helle"; # this will break, as expected
       end
but even then, the error just complains of an "invalid conversion from string to float" rather than an "invalid redefinition of a constant".
Because const in local scope is not yet implemented:
https://github.com/JuliaLang/julia/issues/5148
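For comparison, the behaviour the question is reaching for is roughly what a const-qualified local already gives in C, where reassignment is rejected at compile time (a minimal sketch, unrelated to Julia's implementation):

void f(double x) {
    const double y = x;  /* y is a read-only local                              */
    y = 2.0 * x;         /* compile error: assignment of read-only variable 'y' */
}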
I have a bunch of signals like this:
logic [7:0] in0;
logic [7:0] in1;
logic [7:0] in2;
logic [7:0] in3;
That I want to assign to an array:
logic [7:0] in_array [4];
assign in_array[0] = in0;
assign in_array[1] = in1;
assign in_array[2] = in2;
assign in_array[3] = in3;
Easy enough, but if instead of 4 items I have 128 this gets annoying. I am sure there is a combination of defines and generates that can do this in a loop. Something like:
`define IN(x) inx
genvar i;
generate
    for (i = 0; i < 4; i++) begin
        assign in_array[i] = `IN(i);
    end
endgenerate
The above code doesn't work, but I think that I have done something like this before.
Simplifying that code is something that cannot be done in SystemVerilog. You can reduce your typing by creating a macro like the one below (note the double backticks ``), but you will still need to write each index out manually. Macros are resolved before generate loops, and the input variable to the macro is treated as a literal.
// short-named macro for reduced typing
// Note: using a short macro name is typically bad practice,
// but it is removed later with an `undef
`define A(idx) assign array_in[idx] = out``idx

// This works
`A(0);
`A(1);
`A(2);
`A(3);

// This doesn't work. For example, when gidx==0 it elaborates to 'assign array_in[0] = outgidx;'
// and there is no outgidx.
genvar gidx;
generate
    for (gidx = 0; gidx < 4; gidx++) begin
        `A(gidx);
    end
endgenerate

`undef A // prevent the macro from being used later on
If it is just a small number of entries, it is best to do it manually. If it is a large number of entries, then you need to consider a way to generate the code for you, such as embedded code.
There are also various embedded-code tools (such as Perl's EP3, Ruby's eRuby/ruby_it, Python's prepro, etc.) that can generate the desired code. Pick your preference. You will need to pre-process these files before giving them to the compiler. Example with EP3 generating the assignments:
#perl_begin
foreach my $idx (0..400) {
    printf "assign array_in[%0d] = out%0d;\n", $idx, $idx;
}
#perl_end
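The same code-generation idea also works with a few lines of plain C if no Perl/Ruby/Python tooling is at hand (the out/array_in names and the count of 128 follow the question; the file name is illustrative):

/* gen_assigns.c -- print the repetitive SystemVerilog assignments to stdout */
#include <stdio.h>

int main(void) {
    for (int idx = 0; idx < 128; ++idx)
        printf("assign array_in[%d] = out%d;\n", idx, idx);
    return 0;
}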
Use `` to separate text from the argument.
`define IN(x) in``x
But there is another issue: the variable i is not declared at the time the macro is expanded, so the whole generate loop just connects to ini, because i is treated as literal text. Because of this, macro arguments cannot take values that are only resolved during elaboration (such as genvars).
The environment of your module already has to connect explicitly to each input (assign in0 = out0; ... assign in127 = out127;). So the simplest solution would be to make in_array an input of your module and let the environment connect to it directly (assign array_in[0] = out0; and so on).
Something like this:
module parent_module();
    /* some other stuff that has outputs out0, out1, etc. */
    logic [7:0] array_in[4];

    assign array_in[0] = out0;
    assign array_in[1] = out1;
    assign array_in[2] = out2;
    assign array_in[3] = out3;

    my_module inst (.array_in(array_in));
endmodule
I need to declare two different constants in my app:
one is a simple string, the other needs to be a uint32.
I know of two different ways to declare constants, as follows:
#define VERSION 1; // I am not sure how this works in regard to uint32, but that's what I need it to be.
and
NSString * const SIGNATURE = @"helloworld";
Is there a way to declare the version, which should be a uint32, like the NSString declaration above?
For instance, something like:
UInt32 * const VERSION 1;
If so, how? If not, how do I make sure the #define version is of type uint32?
Any help would be appreciated.
You're very close. The correct syntax is:
const UInt32 VERSION = 1;
You can also use UInt32 const rather than const UInt32. They're identical for scalars. For pointers such as SIGNATURE, however, the order matters, and your order is correct.
You're confusing macro definitions with constants:
#define VERSION (1)
or
#define SOME_STRING @"Hello there"
The above are macro definitions. This means that during compilation, VERSION & SOME_STRING will be replaced with the defined values everywhere in the code. This is a quicker solution, but it is more difficult to debug.
Examples of constant declarations are:
const NSUInteger VERSION = 1;
NSString * const RKLICURegexException = @"Some string";
Think of constants as simple variables that are immutable and can't change their values.
Also, be careful to distinguish pointers to constant values from constant pointers.
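The distinction that last sentence points at, shown in plain C (the same rules carry over to the NSString * const declarations above):

char buf[] = "abc";

const char *p1 = buf;        /* pointer to const data: writing through p1 is not allowed,
                                but p1 itself can be re-pointed                           */
char *const p2 = buf;        /* const pointer: p2 cannot be re-pointed,
                                but *p2 can still be written                              */
const char *const p3 = buf;  /* neither re-pointing nor writing through p3 is allowed     */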
There are different kinds of macros in the C language; nested macros are one of them.
Considering a program with the following macro
#define HYPE(x,y) (SQUR(x)+SQUR(y))
#define SQUR(x) (x*x)
Using this we can successfully compile to get the result.
As we all know the C preprocessor replaces all the occurrence of the identifiers with the replacement-string. Considering the above example I would like to know how many times the C preprocessor traverses the program to replace the macro with the replacement values. I assume it cannot be done in one go.
The replacement takes place when HYPE is actually used; it is not expanded when the #define statement occurs.
For example:
1 #define FOO 1
2
3 void foo() {
4 printf("%d\n", FOO);
5 }
So the replacement takes place on line 4, not on line 1. Hence the answer to your question is: once.
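One way to see this for yourself is to run only the preprocessor (the -E flag on gcc/clang) and inspect the output:

/* demo.c (the printf declaration is omitted to keep the -E output short) */
#define FOO 1

void foo(void) {
    printf("%d\n", FOO);
}

/* $ cc -E demo.c
 * ...
 * void foo(void) {
 *     printf("%d\n", 1);
 * }
 */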
A #define'd macro invocation is expanded until there are no more terms to expand, except it doesn't recurse. For example:
#define TIMES *
#define factorial(n) ((n) == 0 ? 1 : (n) TIMES factorial((n)-1))
// Doesn't actually work, don't use.
Suppose you say factorial(2). It will expand to ((2) == 0 ? 1 : (2) * factorial((2)-1)). Note that factorial is expanded, then TIMES is also expanded, but factorial isn't expanded again afterwards, as that would be recursion.
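You can check this with the preprocessor alone; the inner factorial survives as plain text because the expander never re-enters a macro it is currently expanding:

#define TIMES *
#define factorial(n) ((n) == 0 ? 1 : (n) TIMES factorial((n)-1))

int demo(void) {
    return factorial(2);
}

/* $ cc -E expand_demo.c   (roughly) yields:
 *
 * int demo(void) {
 *     return ((2) == 0 ? 1 : (2) * factorial((2)-1));
 * }
 *
 * TIMES was expanded and factorial was expanded once, but the inner
 * factorial is left as plain text, so this will not build unless a real
 * function named factorial also exists.
 */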
However, note that nesting (arguably a different type of "recursion") is in fact expanded multiple times in the same expression:
#define ADD(a,b) ((a)+(b))
....
ADD(ADD(1,2),ADD(3,4)) // expands to ((((1)+(2)))+(((3)+(4))))
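Applying the same rescanning logic to the HYPE/SQUR macros from the question: HYPE is replaced first, then the replacement text is rescanned and each SQUR is replaced in turn, so everything settles after a bounded number of rescans of the replacement text rather than extra passes over the whole program. Note that SQUR as written does not parenthesize its argument, which is exactly the pitfall discussed above:

#include <stdio.h>

#define HYPE(x,y) (SQUR(x)+SQUR(y))
#define SQUR(x) (x*x)

int main(void) {
    /* HYPE(1+2, 3) first becomes (SQUR(1+2)+SQUR(3)); rescanning then
     * expands each SQUR, giving ((1+2*1+2)+(3*3)) = 5 + 9 = 14,
     * not the ((1+2)*(1+2))+((3)*(3)) = 18 one might expect.          */
    printf("%d\n", HYPE(1+2, 3));  /* prints 14 */
    return 0;
}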