I'm using Getopt::Std to process my command line arguments, which are strings. I have an issue with getopts(), as it works only for single-character options.
As seen below, the "srcdir" and "targetdir" options are mandatory, and the script should error out if either of them is missing. "block" is NOT a mandatory option.
I don't see the %options hash being set with the code below, and all my $options{key} values are undefined. Had I used single-letter options such as "s" instead of "srcdir" and "t" instead of "targetdir", the code below would work. It doesn't work with the "-srcdir" and "-targetdir" options.
What's the best way to address the issue I have?
Usage:
perl test.pl -srcdir foo1 -targetdir hello1
#!/usr/bin/perl -w
use strict;
use Getopt::Std;
# declare the perl command line flags/opt we want to allow
my %options=();
my $optstring = 'srcdir:targetdir:block';
getopts( "$optstring", %options);
# test for the existence of the opt on the command line.
print "-srcdir $options{srcdir}\n" if defined $options{srcdir};
print "-targetdir $options{targetdir}\n" if defined $options{targetdir};
print "-blocks $options{block}\n" if defined $options{block};
# other things found on the command line
print "loop:\n" if ($#ARGV > 0);
foreach (@ARGV)
{
print "$_\n";
}
You really want to use Getopt::Long to handle words like srcdir:
use warnings;
use strict;
use Data::Dumper;
use Getopt::Long;
$Data::Dumper::Sortkeys=1;
my %options;
GetOptions(\%options, qw(srcdir=s targetdir=s block));
print Dumper(\%options);
print Dumper(\@ARGV);
The reason your hash was empty was that you need to pass a reference to a hash, as shown in Getopt::Std:
getopts( "$optstring", \%options);
Also, since Std only handles single letters, it would interpret srcdir as 6 separate options: s, r, etc.
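Since the question also asks for srcdir and targetdir to be mandatory, here is a minimal sketch of how that could be enforced with Getopt::Long; the die-on-missing check and the messages are my own additions, not part of the original answer:
use warnings;
use strict;
use Getopt::Long;

my %options;
GetOptions(\%options, qw(srcdir=s targetdir=s block))
    or die "Error parsing command line arguments\n";

# srcdir and targetdir are mandatory; block remains optional
for my $required (qw(srcdir targetdir)) {
    die "Missing mandatory option -$required\n"
        unless defined $options{$required};
}

print "-srcdir $options{srcdir}\n";
print "-targetdir $options{targetdir}\n";
print "-block set\n" if $options{block};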
I have reviewed many online examples about running another process (either a Perl script, a shell command, or a program), but have not found one that suits my needs.
(From the answers already received I see that my request was not understood, so I will try to state it briefly, leaving everything written earlier as an example of what I already tried...)
I need:
- In a caller script, set parameters for the second script before calling it (thus I could not use do 'script2.pl', as that was executed before the first script even started running);
- In the second script, set some variables that will be used in the caller script (therefore it is not useful to run the second script via system() or backticks);
- and, since I need to use those variables in the first script, return to the first script after the second one completes.
(I hope it is now clearer what I need...)
(I reviewed system(), backticks, exec() and open(), and none of them were useful.)
I would like to run another Perl script from a first one, without exiting (as exec() does), without capturing the called script's STDOUT (as backticks do) but having it printed out as in the initial script (as system() does), while not needing the return status (as system() gives);
but I would like the called script to set some variables that will be accessible in the calling script (set, of course, via our @set_var).
My attempt (which I have not been able to make do what I need) is:
Script1 is something like:
...
if($condition)
{ local $0 = 'script2.pl';
local @ARGV = ('first-arg', 'second_arg');
do 'script2.pl';
}
print "set array is: '@set_var'\n";
...
The 'script2' would have something like:
#!/usr/bin/perl
...
print "having input parameters: '#ARGV'\n";
... # all script activities
our @set_var = ($val1, $val2, $val3);
exit 0;
The problem with my code is that the do ... command is executed at the beginning of the first script's run, and not at the place where things have been prepared for it (by setting some local ... vars)!
I did try to use eval "do script2.pl":
- now it is executed in the proper place, but it does not set @set_var in the first script's process!
Does anyone have an idea how to do it the way I would like?
(I understand that I could rewrite script2.pl, wrapping the whole processing in some function (say, main()), load it with require() and call main(): that would do everything the way I prefer; but I would like to leave the second script as-is, executable from the shell by itself, as it is now.
... and I do not like the idea of passing values through a flat file...)
Does anybody have an idea how to make this whim work?
This works just fine:
script2.pl
use strict;
our #set_var = ("foo","bar");
script1.pl
use strict;
our @set_var;
do './script2.pl';
print "#set_var\n";
$ perl script1.pl
foo bar
But it does not if you use:
script2.pl
use strict;
our #set_var = ("foo","bar");
exit 0;
There is only a single perl process in this example, so calling exit, even from the second script, exits your program.
If you don't want to remove the exit call in the second script, we can work around that with some CORE::GLOBAL namespace hacking. The gist is to redirect the exit function to your own custom function that you can manipulate when the second script runs.
script1.pl
BEGIN { *CORE::GLOBAL::exit = *my_exit };
use strict;
sub my_exit { goto &CORE::exit }
our @set_var;
{
local *my_exit = sub { warn "Not exiting" };
do './script2.pl';
}
print "#set_var\n";
script2.pl
use strict;
our #set_var = ("foo","bar");
exit 0;
$ perl script1.pl
Not exiting at script1.pl line 7.
foo bar
(OK, finally, asked by myself and answered by myself, too!)
(After additional review, I realized that the 'mod' solution does use it, but I did not understand the advice! I am sorry: my fault for overlooking the real solution!)
The solution to my question is simple! It is:
do EXPR;
That way:
- the second script is executed at the place where the do statement appears, so anything defined and set in the first script is usable in the second one;
- the second script prints to STDOUT everything it should print;
- any variables or objects defined in the second script are accessible in the first one after coming back; and
- control returns to the position immediately after the second script's execution, and processing of the first script's commands continues.
Simple! I am just amazed that I forgot about the do ... command; I have used it more than once before!
And I am quite disappointed with this forum! It is poorly designed for displaying a conversation, and the participants, instead of reviewing the Perl issue, seem much more concerned with moderating others and teaching them how to behave on such a nice forum!
I am not really sure exactly what you are trying to do, but something along these lines should be very close.
test.pl
#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';
use IPC::System::Simple qw(system);
say $0;
system($^X, "sample.pl", #ARGV);
$ perl test.pl first-arg second-arg
test.pl
sample.pl
$VAR1 = [
'first-arg',
'second-arg'
];
sample.pl
#!/usr/bin/perl
use strict;
use warnings;
use Data::Dumper;
use feature 'say';
say $0;
print Dumper \@ARGV;
I used the module IPC::System::Simple. You can also capture the output of the script (sample.pl) through IPC::System::Simple::capture.
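For instance, a rough sketch of the capture variant (not part of the original answer; the variable names are just illustrative) could look like this:
use strict;
use warnings;
use feature 'say';
use IPC::System::Simple qw(capture);

# Run sample.pl and collect its STDOUT instead of letting it print directly;
# capture() throws an exception if the command fails.
my $output = capture($^X, "sample.pl", @ARGV);
say "sample.pl printed:";
print $output;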
Update: Maybe you can use Storable. This way you can pass values set in script 2 (sample.pl) back to script 1 (test.pl).
test.pl
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
use Data::Dumper;
use feature 'say';
use IPC::System::Simple qw(system);
say $0;
system($^X, "sample.pl", @ARGV);
my $hashref = retrieve('sample');
print Dumper $hashref;
__END__
$ perl test.pl first-arg second-arg
test.pl
sample.pl
$VAR1 = [
'first-arg',
'second-arg'
];
$VAR1 = {
'arg1' => 'test1',
'arg2' => 'test2'
};
sample.pl
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
use Data::Dumper;
use feature 'say';
say $0;
print Dumper \@ARGV;
my %hashArgs = ( arg1 => 'test1',
arg2 => 'test2', );
store \%hashArgs, 'sample';
I want to know if this is possible. The file a.pm has a few variables that will be required by b.pl, and b.pl is called by a.pm.
a.pm
{
$var1,$var2;
system("perl b.pl");
}
b.pl
{
print "$var1";
}
How do I make $var1 and $var2 available in b.pl? I do not want to pass them as command line arguments to b.pl.
You cannot share the same variables if you do a system call to a new Perl interpreter. To do that, you could pass them as command line arguments.
use strict;
use warnings;
my $var1 = 'foo';
my $var2 = 'bar';
system( 'perl', 'b.pl', $var1, $var2 );
And then in b.pl:
use strict;
use warnings;
print $ARGV[0];
The arguments end up in @ARGV, which can be used as a regular array. If you want to share data structures, you'd have to serialize them first. You could use Storable or JSON to do that. You then need to deserialize them in your second program.
There are more complicated alternatives, but I think going into detail for those (like saving to a file, opening a pipe and reading from STDOUT in the second script) are a bit out of scope.
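As a rough, hedged sketch of that serialization idea (using the core JSON::PP module and passing the encoded text as a single argument; the file names and structure are made up for illustration):
# caller (e.g. a.pl)
use strict;
use warnings;
use JSON::PP qw(encode_json);

my %data = ( var1 => 'foo', var2 => 'bar' );
# Hand the whole structure to the child as one JSON-encoded argument
system( 'perl', 'b.pl', encode_json(\%data) );

# callee (b.pl)
use strict;
use warnings;
use JSON::PP qw(decode_json);

my $data = decode_json( $ARGV[0] );   # rebuild the hash from the JSON text
print "$data->{var1}\n";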
You can pass $var1 and $var2 from a.pm to b.pl as part of the command executed with system. Please check the following:
a.pm
my ($var1,$var2);
$var1="Open";
$var2="Close";
system ("perl b.pl $var1 $var2");
b.pl
my $var1=$ARGV[0];
my $var2=$ARGV[1];
print "$var1";
The output will be:
Open
If you don't want to send the values on the command line, please try the following.
The way to do it is to pass the values by setting them in the %ENV hash, as shown in the following program. Please be careful when choosing the keys so that they don't overwrite existing %ENV entries.
a.pm
my ($var1,$var2);
$var1="Open";
$var2="Close";
$ENV{m_temp_x_var1}=$var1;
$ENV{m_temp_x_var2}=$var2;
system ("perl b.pl");
b.pl
my $var1=$ENV{m_temp_x_var1};
my $var2=$ENV{m_temp_x_var2};
print "$var1";
The output will be:
Open
I have a script which does some basic awk-like filtering using a while (<>) loop. I want the script to be able to display usage and version, but otherwise assume all arguments are files. How do I combine getopt with the <> operator?
Getopt plays nicely with @ARGV. Example:
use strict; use warnings;
use feature 'say';
use Getopt::Long;
GetOptions 'foo=s' => \my $foo;
say "foo=$foo";
say "ARGV:";
say for @ARGV;
Then:
$ perl test.pl --foo fooval --bar
Unknown option: bar
foo=fooval
ARGV:
$ perl test.pl --foo fooval bar
foo=fooval
ARGV:
bar
$ perl test.pl --foo fooval -- --bar
foo=fooval
ARGV:
--bar
Summary:
Any items in @ARGV after the switches are simply left there.
This works as expected for normal filenames (that don't start with a hyphen-minus).
You can always use a -- to abort parsing of switches.
This works as expected for me.
use warnings;
use strict;
use Getopt::Long qw(GetOptions);
my %opt;
GetOptions(\%opt, qw(help)) or die;
die 'usage' if $opt{help};
while (<>) {
print;
}
As others have mentioned, Getopt::Long is the preferred module. It has been around since Perl 3.x.
There are a lot of options, and it can take a while to get used to the syntax, but it does exactly what you want:
use strict;
use warnings;
use Getopt::Long;
use feature qw(say);
use Pod::Usage;
my ( $version, $help );    # under strict, we have to predeclare these
GetOptions(
'help' => \$help,
'version' => \$version,
) or pod2usage ( -message => "Invalid options" );
That's all there is to it. When the GetOptions subroutine runs, it will parse your command line (the @ARGV array) for anything that starts with a - or --. It will process those, and when it comes to either a double dash by itself or an argument that doesn't start with a dash, it will assume the rest are files and stop processing. At that point, all of the option strings (and their parameters) have been shifted out of the @ARGV array, and you're left with your files:
if ( $help ) {
pod2usage;
}
if ( $version ) {
say "Version 1.23.3";
exit;
}
while ( my $file = <>) {
...
}
Getopt::Long is part of the standard Perl installation, so it should always be available to you.
I know many people are wary of using these modules because they think they're not standard Perl, but they are just as much a part of Perl as commands like print and chomp. Perl comes with over 500 of them and they're yours to use.
I have the following Perl problem. Take this piece of code and put it into test.pl:
my $str=shift;
printf "$str", #ARGV;
Then run it like this:
perl test.pl "x\tx%s\n%s" one two three
The expected output for me should be:
x xone
two
Instead I got
x\txone\ntwo
Where am I wrong?
Perl converts escape sequences within strings at compile time, so once your program is running you are too late to have "\t" and "\n" converted to tab and newline.
Using eval would fix this, but it's very insecure. I recommend you use the String::Interpolate module to process strings after compilation. It uses Perl's native interpolation engine so has the exact same effect as if you had coded the string into your program.
Your test.pl becomes
use strict;
use warnings;
use String::Interpolate qw/ interpolate /;
my $str = shift;
printf interpolate($str), @ARGV;
output
E:\Perl\source>perl test.pl "x\tx%s\n%s" one two three
x xone
two
E:\Perl\source>
Update
If you just want to allow for a small subset of the possibilities that String::Interpolate supports then you could write something explicit like, say
use strict;
use warnings;
my $str = shift;
$str =~ s/\\t/\t/g;
$str =~ s/\\n/\n/g;
printf $str, @ARGV;
but a module or eval are the only real ways to support a general Perl string on the command line.
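For completeness, here is a sketch of the eval approach mentioned above; it is shown only to illustrate why it is discouraged, since it evaluates whatever the user puts on the command line:
use strict;
use warnings;

my $str = shift;
# Let Perl re-parse the string so \t and \n become real characters.
# DANGER: this evaluates arbitrary user input as Perl code.
my $interpolated = eval qq{"$str"};
die $@ if $@;
printf $interpolated, @ARGV;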
If I have a command line like:
my_script.pl -foo -WHATEVER
My script knows about --foo, and I want Getopt to set variable $opt_foo, but I don't know anything about -WHATEVER. How can I tell Getopt to parse out the options that I've told it about, and then get the rest of the arguments in a string variable or a list?
An example:
use strict;
use warnings;
use Getopt::Long;
my $foo;
GetOptions('foo' => \$foo);
print 'remaining options: ', @ARGV;
Then, issuing
perl getopttest.pl -foo -WHATEVER
gives
Unknown option: whatever
remaining options:
You need to enable the "pass_through" option via Getopt::Long::Configure("pass_through");
Then it passes through actual options (i.e. arguments starting with "-") without needing the special "--" delimiter to signify the end of the "real" options.
Here's the perldoc quote:
pass_through (default: disabled)
Options that are unknown, ambiguous or supplied with an invalid option value are passed through in @ARGV instead of being flagged as errors. This makes it possible to write wrapper scripts that process only part of the user supplied command line arguments, and pass the remaining options to some other program.
Here's an example
$ cat my_script.pl
#!/usr/local/bin/perl5.8 -w
use Getopt::Long;
Getopt::Long::Configure("pass_through");
use Data::Dumper;
my %args;
GetOptions(\%args, "foo") or die "GetOption returned 0\n";
print Data::Dumper->Dump([\@ARGV],["ARGV"]);
$ ./my_script.pl -foo -WHATEVER
$ARGV = [
'-WHATEVER'
];
Aren't the remaining (unparsed) values simply left behind in @ARGV? If your extra content starts with dashes, you will need to indicate the end of the options list with a --:
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long;
use Data::Dumper;
my $foo;
my $result = GetOptions ("foo" => \$foo);
print Dumper([ $foo, \@ARGV ]);
Then calling:
my_script.pl --foo -- --WHATEVER
gives:
$VAR1 = [
1,
[
'--WHATEVER'
]
];
PS. In MooseX::Getopt, the "remaining" options from the command line are put into the extra_argv attribute as an arrayref -- so I'd recommend converting!
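A rough, untested sketch of that MooseX::Getopt style (the class name and attribute are made up for illustration) might look like this:
package MyApp;
use Moose;
with 'MooseX::Getopt';

# --foo becomes a boolean switch on the object
has 'foo' => ( is => 'ro', isa => 'Bool' );

package main;
my $app = MyApp->new_with_options;   # consumes the recognized switches from @ARGV
print 'remaining options: ', "@{ $app->extra_argv }", "\n";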
I think the answer here, sadly though, is "no, there isn't a way to do it exactly like you ask, using Getopt::Long, without parsing @ARGV on your own." Ether has a decent workaround, though. It's a feature as far as most people are concerned that any option-like argument is captured as an error. Normally, you can do
GetOptions('foo' => \$foo)
or die "Whups, got options we don't recognize!";
to capture/prevent odd options from being passed, and then you can correct the user on usage. Alternatively, you can simply pass through and ignore them.