I want to undefine a variable from a Makefile by passing something on the make command line. Is this possible? man make didn't help that much.
What I want to do is the following:
I want to compile a port on FreeBSD. This port is marked as broken. Since I don't have the permissions to change the Makefile, I am looking for a way to undefine the BROKEN variable.
Edit:
In the Makefile is:
BROKEN= does not link
And I want to unset/undefine BROKEN, because with it set the Makefile refuses to go any further. This is not related to compiler flags.
I don't think you can actually undefine a variable.
However, if 'empty' is good enough for you:
make -e variable=''
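Applied to the question, that would be something like the following (the install target is only an example, and a command-line override works even without -e, since command-line assignments take precedence over assignments made inside the Makefile; whether a defined-but-empty BROKEN is enough depends on how the ports infrastructure tests it):
make BROKEN='' install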
You can do so. Here's how, assuming that undef is the name of the target which you want to use to undefine the variable.
FOO:=foo
ifeq (undef,$(filter undef,$(MAKECMDGOALS)))
override undefine FOO
endif
.PHONY: all
all:
echo $(FOO) '($(origin FOO))'
.PHONY: undef
undef: $(if $(filter-out undef,$(MAKECMDGOALS)),,$(.DEFAULT_GOAL))
Explanation of make elements used:
MAKECMDGOALS is the predefined read-only variable containing the goals with which make was called. We use this to check whether make was called with undef.
.DEFAULT_GOAL is the predefined variable containing the default goal. If it is not set explicitly, it is set to the first goal, which in this case is all.
ifeq ... endif allows conditional parts in a Makefile.
The $(filter pattern,list) function returns all elements from list which are matched by pattern. So, if make was called with undef as one of its goals, $(filter undef,$(MAKECMDGOALS)) will be undef, otherwise it will be the empty string.
The $(filter-out pattern,list) function is the opposite of $(filter pattern,list). It returns all elements from list which are not matched by pattern. So, if make was called with undef only, $(filter-out undef,$(MAKECMDGOALS)) will be the empty string, otherwise it will be the list of all goals given on the command line, excluding undef.
undefine undefines a variable.
override changes a variable even if it was passed on the command line. You need to decide depending on your use case whether or not using override makes sense.
The $(if cond,then[,else]) function returns then if cond is true, otherwise else (or the empty string if there was no else). A non-empty string is true, an empty string is false.
So, the Makefile undefines FOO in case undef was given on the command line.
The "magic" dependency of undef makes sure that if undef was the only goal given on the command line, it will depend on and therefore execute the default goal, which in this example would be all. This is to make sure that make undef behaves like make except that it undefines the variable.
WARNING: The behavior of such Makefiles can be confusing. Imagine a Makefile that uses this mechanism to exclude CPPFLAGS+=-DNDEBUG in case debug was given. It is surprising when make debug all and make all produce different output for the all goal. The expectation is that make debug all creates additional output compared to make all, and the primary expectation is that make all always behaves the same, no matter what other goals might additionally be present on the command line. It is better to implement such mechanisms using a configure-style mechanism.
(Source: From my unpublished book on GNU make.)
http://www.gnu.org/software/make/manual/make.html#Undefine-Directive
Supposedly you can
override undefine VARIABLE
Not sure what compiler you're using (I know recent BSD uses clang by default) but with gcc you can undefine previously defined symbols using -U.
Edit: apparently I misread your question. So what you want is an equivalent to make variable=X, which will undefine rather than set a value? I don't believe that is possible.
If you can't modify the existing makefile, can you make your own copy and call make with that one instead?
There's a -DIGNORE_BROKEN or similar; please check the FreeBSD Porter's Handbook, it should hopefully have info on this.
Why do I get this error from the code below?
foo:=foo1
ifneq ($(foo),)
override undefine foo
endif
Makefile:5: invalid override directive
Background
The perl command has several idiot-proofing command line options described in perldoc perlrun:
-c causes Perl to check the syntax of the program and then exit without executing it.
-w prints warnings about dubious constructs, such as variable names that are mentioned only once and scalar variables that are used before being set, etc.
-T forces "taint" checks to be turned on so you can test them.
After reading through these options, I could not find one that detects undefined functions. For example, I used a function called NFD() that is imported from the Unicode::Normalize module. However, being a Perl novice, I did not know whether it was already part of the standard Perl library or not. Neither perl -c nor any of the other options uncovered this error for me; rather, a coworker noticed that the function was somehow undefined (and not inside the standard libraries). Therefore, I was curious about the following:
Question
Is there an option in the perl command to automatically detect if there is an undefined function not already inside an imported package?
I did not know if this was already under the fold of the standard perl library or not.
It sounds like you want to distinguish imported subs from other subs and builtin functions.
If you always list your imports explicitly instead of accepting the defaults like I do, then you'll not only know which subs are imported, you'll know from which module they were imported.
use Foo::Bar; # Default imports
use Foo::Bar qw( ); # Import nothing ("()" also works)
use Foo::Bar qw( foo bar ); # Import subs foo and bar.
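For example, here is a small sketch (assuming Unicode::Normalize is installed) of why an unimported sub slips past perl -c: the call only fails at run time.
use strict;
use warnings;
use Unicode::Normalize qw( );   # deliberately import nothing

# perl -c reports "syntax OK", but running this dies with
# "Undefined subroutine &main::NFD called".
print NFD("\x{00E9}"), "\n";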
Is there an option in the perl command to check for undefined functions?
On the other hand, if you are trying to identify the subs that you call that don't exist or that aren't defined at compile time, then this question is a duplicate of How can I smoke out undefined subroutines?.
Aside from the particular technical details, you can't know if a function will be defined at some time in the future when you plan to use it. Perl is a dynamic language: things come into and go out of existence, and even change their definitions, while the program is running.
Jeffrey Kegler wrote Perl Cannot Be Parsed: A Formal Proof that relied on this idea. The details of the halting problem aren't as interesting as workings of a dynamic language.
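A small illustration of that dynamic nature (the sub name late and the DEFINE_LATE environment variable are invented for this sketch): whether the call works is only decided while the program runs.
use strict;
use warnings;

# Install a sub into the symbol table only if an environment variable is set.
*late = sub { "defined at run time" } if $ENV{DEFINE_LATE};

# No compile-time check can know which branch this will take.
print defined &late ? late() : "late() does not exist", "\n";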
And, for what it's worth, those command-line options don't make programs idiot-proof. For example, in Mastering Perl I show that merely adding -T to a program doesn't magically make it secure, as many would have you believe.
What were you doing with Unicode::Normalize? It has an NFD already but your question makes it sound like you were wrapping it somehow:
use Unicode::Normalize qw(NFD);
I am writing a perl script that needs to set a number of environment variables before calling an external program. My code has the form
$ENV{'VAR1'} = "value1";
$ENV{'VAR2'} = "value2";
When running this through perlcritic, I get a severity 4 violation for every such assignment:
^Magic variable "$ENV" should be assigned as "local"
Googling that error message didn't give me any good solutions. The perlcritic violation that complains in that case is Variables::RequireLocalizedPunctuationVars, and the example given deals with localizing a file handle. I tried to find the relevant section in Perl Best Practices, but it only talks about localizing package variables.
One solution I tried is to localize %ENV using the following statement before the assignments.
local %ENV = ();
This doesn't resolve the violation.
My question is the following:
Is that Perlcritic violation even relevant for assignments to %ENV, or can I ignore it?
If it's relevant, what's the best way to resolve it?
Perlcritic warnings are not The Word of God. They are simply warnings about situations that, if managed incorrectly, could get you into trouble.
Is that Perlcritic violation even relevant for assignments to %ENV, or
can I ignore it?
This warning tells you that:
Global Variables have the very real possibility of action at a distance.
This possibility is even more dangerous when dealing with those variables that change the operation of built-in functions.
Is that relevant for %ENV? If you spawn more than one child process in your program, Yes. If someone changes your program later to spawn another child, Yes.
If it's relevant, what's the best way to resolve it?
Now here is where being the human being becomes important. You need to make a value judgement.
Possible actions:
Ignore the warning and hope that future maintainers aren't bitten by your usage of this Global Variable.
Change your code to avoid the situation you are being warned about. The syntax suggested by choroba is a fine option.
Now if you have made a change, are still getting the warning, and believe that the warning is now in error, you can do one or more of:
Be a good citizen and Submit a bug report.
Use the comment ## no critic (RequireLocalizedPunctuationVars) on the affected lines or on its own line before the lines in question.
Or, more drastically, disable the rule or just create an exception for %ENV in your .perlcriticrc file.
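For example, an entry in .perlcriticrc along these lines creates such an exception (assuming the Perl::Critic version you have installed supports the allow parameter for this policy):
[Variables::RequireLocalizedPunctuationVars]
allow = %ENV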
You can localize the value for the given environment variable only:
local $ENV{VAR1} = 'value1';
Consider using Env. perlcritic does not complain about the variable:
use warnings;
use strict;
use Env qw(VAR);
$VAR = "value1";
I was looking at a rather inconclusive question about whether it is best to use for(;;) or while(1) when you want to make an infinite loop and I saw an interesting solution in C where you can #define "EVER" as a constant equal to ";;" and literally loop for(EVER).
I know defining an extra constant to do this is probably not the best programming practice but purely for educational purposes I wanted to see if this could be done with Perl as well.
I tried to make the Perl equivalent, but it only loops once and then exits the loop.
#!/usr/bin/perl -w
use strict;
use constant EVER => ';;';
for (EVER) {
print "FOREVER!\n";
}
Output:
FOREVER!
Why doesn't this work in Perl?
C's pre-processor constants are very different from the constants in most languages.
A normal constant acts like a variable which you can only set once; it has a value which can be passed around in most of the places a variable can be, with some benefits from you and the compiler knowing it won't change. This is the type of constant that Perl's constant pragma gives you. When you pass the constant to the for operator, it just sees it as a string value, and behaves accordingly.
C, however, has a step which runs before the compiler even sees the code, called the pre-processor. This actually manipulates the text of your source code without knowing or caring what most of it means, so it can do all sorts of things that you couldn't do in the language itself. In the case of #define EVER ;;, you are telling the pre-processor to replace every occurrence of EVER with ;;, so that when the actual compiler runs, it only sees for(;;). You could go a step further and define the word forever as for(;;), and it would still work.
As mentioned by Andrew Medico in comments, the closest Perl has to a pre-processor is source filters, and indeed one of the examples in the manual is an emulation of #define. These are actually even more powerful than pre-processor macros, allowing people to write modules like Acme::Bleach (replaces your whole program with whitespace while maintaining functionality) and Lingua::Romana::Perligata (interprets programs written in grammatically correct Latin), as well as more sensible features such as adding keywords and syntax for class and method declarations.
It doesn't run forever because ';;' is an ordinary string, not a preprocessor macro (Perl doesn't have an equivalent of the C preprocessor). As such, for (';;') runs a single time, with $_ set to ';;' that one time.
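To make that concrete, the loop in the question behaves exactly like iterating over a one-element list:
use strict;
use warnings;

# One-element list, so the body runs exactly once with $_ set to ';;'.
for (';;') {
    print "got [$_]\n";   # prints: got [;;]
}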
Andrew Medico mentioned in his comment that you could hack it together with a source filter.
I confirmed this, and here's an example.
use Filter::cpp;
#define EVER ;;
for (EVER) {
print "Forever!\n";
}
Output:
Forever!
Forever!
Forever!
... keeps going ...
I don't think I would recommend doing this, but it is possible.
This is not possible in Perl. However, you can define a subroutine named forever which takes a code block as a parameter and runs it again and again:
#!/usr/bin/perl
use warnings;
use strict;
sub forever (&) {
$_[0]->() while 1
}
forever {
print scalar localtime, "\n";
sleep 1;
};
I have some Makefiles that are flexible based on the existence of certain variables by using ifdef to check for them. It is a bit annoying that I have to actually set the variable equal to something on the command line. make all DEBUG does not trigger the ifdef but make all DEBUG=1 does. Perhaps I am just using the C pre-processor approach where it does not belong.
Q1) Is it possible to specify on the command line that a variable is defined but empty, without typing even more characters?
Q2) What is the preferred approach for such boolean parameters to a make?
I assume you mean make all DEBUG= here, right? Without the = make will consider DEBUG to be a target to build, not a variable assignment.
The manual specifies that a variable that has a non-empty value causes ifdef to return true. A variable that does not exist or exists but contains the empty string, causes ifdef to return false. Note ifdef does not expand the variable, it just tests whether the variable has any value.
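A minimal sketch of that behavior (the all target is just a placeholder so make has something to build):
# 'make all'          -> prints: DEBUG is not set
# 'make all DEBUG='   -> prints: DEBUG is not set  (defined but empty)
# 'make all DEBUG=1'  -> prints: DEBUG is set
ifdef DEBUG
$(info DEBUG is set)
else
$(info DEBUG is not set)
endif

all: ; @true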
You can use the $(origin ...) function to test whether a variable is really not defined at all, or is defined but empty, like this:
ifeq ($(origin DEBUG),undefined)
$(info Variable DEBUG is not defined)
else
$(info Variable DEBUG is defined)
endif
As @MadScientist explained a few minutes ago,
make all DEBUG
adds a target DEBUG to your make. Luckily, there is a workaround:
ifneq (,$(filter DEBUG,$(MAKECMDGOALS)))
DEBUG:=1 # or do whatever you want
DEBUG: all; @echo -n
endif
It is essential to supply a dummy rule (e.g. echo nothing, as above) to the dummy target. And either put this statement at the bottom of your makefile, or specify the prerequisite target explicitly as in the example. Otherwise, make may wrongly choose DEBUG target instead of all.
Note that this is not a preferred approach; the convention is like using V=1 to turn echo on.
Another caveat is that make processes the command-line goals sequentially, e.g. make A B will first take care of the A target, then of the B target, whether these targets are independent or depend on one another. Therefore writing make DEBUG PERFECT and make PERFECT DEBUG could produce different results. But the order of variable assignments is irrelevant, therefore make PERFECT=1 DEBUG=1 and make DEBUG=1 PERFECT=1 are equivalent.
It is already clarified why you can't use just DEBUG. But I would like to add something.
You can use a shell script before running make that sets up all the variables you need; for example, in a Linux shell it would look like this:
$source debug_setup.sh
$make all
Make is starting...
Debug is enabled
...
where debug_setup.sh contains all environment variables you need to set up:
export DEBUG=1
export DEBUG_OPTION=some_option
This is nice since you can put comments there, and you can comment out settings you don't need at the moment but would like to keep for the future, etc.
You can also have several setup scripts that must/can be used as part of a standard routine. It all depends on how many variables you need to set up, how many sets of variables you would like to have, etc.
Note that it is a good idea to notify the user somehow of which set of variables is selected.
I am optimizing a very time- and memory-consuming program by running it over a dataset and under multiple parameters. For each "run", I have a CSV file, "setup.csv", set up with "runNumber","Command" for each run. I then import this into a Perl script to read the command for the run number I would like, extrapolate the variables, and then execute it on the system via the system command. Should I be worried about the potential for this to be exploited (I am worried right now)? If so, what can I do to protect our server? My plan now is to change the file permissions of "setup.csv" to read-only and ownership to root, then go in as root whenever I need to append another run to the list.
Thank you very much for your time.
Run your code in taint mode with -T. That will force you to carefully launder your data. Only pass through strings that are ones you are expecting. Do not launder with .*, but rather check against a list of good strings.
Ideally, there is a list of known acceptable values, and you validate against that.
Either way, you want to avoid the shell by using the multi-argument form of system or by using IPC::System::Simple's systemx.
If you can't avoid the shell, you must properly convert the text to pass to the command into shell literals.
Even then, you have to be careful of values that start with -. Lots of tools accept -- to denote the end of options, allowing other values to be passed safely.
Finally, you might want to make sure the args don't contain the NUL character (\0).
systemx('tool', '--', @args)
Note: Passing arbitrary strings is not possible in Windows. Extra validation is required.
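As a rough sketch of those points together (the tool path, the --run option, and the run-number format are made up for this example):
#!/usr/bin/perl -T
use strict;
use warnings;

# Taint mode requires a clean environment before running external programs.
delete @ENV{qw(IFS CDPATH ENV BASH_ENV)};
$ENV{PATH} = '/bin:/usr/bin';

# Validate against a strict whitelist pattern; the regex capture also untaints.
my ($run_number) = @ARGV;
die "bad run number\n"
    unless defined $run_number && $run_number =~ /\A([0-9]{1,6})\z/;
$run_number = $1;

# The multi-argument form of system() bypasses the shell entirely, so shell
# metacharacters in the arguments are never interpreted.
system('/usr/local/bin/mytool', '--run', $run_number) == 0
    or die "mytool failed: $?\n";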