perl: catching GetOptions when something fails

Using Getopt::Long, I want to capture any failures from GetOptions and, rather than printing them to STDOUT/STDERR, I want to print out a Usage string and exit the script. I tried wrapping the call to GetOptions in a Try::Tiny try/catch block, but it doesn't get caught. I'm not sure what's going on behind the scenes here.
How can I capture a failure in GetOptions, prevent it from printing, and print my own usage statement instead?

GetOptions reports errors using warn, so you can intercept them with a local $SIG{__WARN__} handler. That said, it's usually better to print a usage statement in addition to the specific error GetOptions found.
sub usage {
    my $prog = basename($0);
    if (my ($msg) = @_) {
        chomp($msg);
        warn("$msg\n");
    }
    warn("Try `$prog --help' for more information.\n");
    exit(1);
}
sub help {
    my $prog = basename($0);
    print(<<"__EOS__");
usage: $prog [options] [--] {foo} {bar}
       $prog --help

Options:
   ...
__EOS__
    exit(0);
}
sub parse_args {
    GetOptions(
        'h|?|help' => \&help,
        ...,
    )
        or usage();

    @ARGV == 2
        or usage("Incorrect number of arguments.");
}

Here's what I do:
my %opt;
eval {
    local $SIG{__WARN__} = sub { die $_[0] };
    GetOptions( \%opt, ... );
};
if ( $@ ) {
    ...
}
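Put together, a minimal runnable sketch of this warn-to-die approach (the --verbose and --name options are made up for illustration):

```perl
use strict;
use warnings;
use Getopt::Long qw( GetOptions );

my %opt;
my $ok = eval {
    # GetOptions reports bad options via warn(), so promote its
    # warnings to exceptions for the duration of the call.
    local $SIG{__WARN__} = sub { die $_[0] };
    GetOptions( \%opt, 'verbose', 'name=s' );
};
if ( !$ok ) {
    # $@ holds the message GetOptions tried to print; ignore it
    # and print our own usage statement instead.
    print STDERR "Usage: $0 [--verbose] [--name NAME]\n";
    exit(1);
}
```

Given a bad option such as --bogus, GetOptions' complaint ends up in $@ and only the usage line is printed.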

Related

Parallel::ForkManager doesn't work with Perl 5.36

I have a script that runs well with Perl < 5.36:
#!/usr/bin/env perl
use strict;
use warnings FATAL => 'all';
use feature 'say';
use autodie ':all';
use Parallel::ForkManager;

sub execute {
    my $command = shift;
    print "Executing Command: $command\n";
    if (system($command) != 0) {
        my $fail_filename = "$0.fail";
        print "$command failed.\n";
        die;
    }
}

sub run_parallel {
    my $cmd = shift;
    my $manager = new Parallel::ForkManager(2);
    foreach my $command (@{ $cmd }) {
        $manager->start and next;
        execute( $command );
        $manager->finish;
    }
    $manager->wait_all_children; # necessary after all lists
}

my @commands = ('echo "a"', 'echo "b"', 'echo "c"', 'which ls', 'which rm');
run_parallel(\@commands);
but when I make a minor change to move it to 5.36:
#!/usr/bin/env perl
use 5.036;
use warnings FATAL => 'all';
use autodie ':all';
use Parallel::ForkManager;

sub execute {
    my $command = shift;
    print "Executing Command: $command\n";
    if (system($command) != 0) {
        my $fail_filename = "$0.fail";
        print "$command failed.\n";
        die;
    }
}

sub run_parallel {
    my $cmd = shift;
    my $manager = new Parallel::ForkManager(2);
    foreach my $command (@{ $cmd }) {
        $manager->start and next;
        execute( $command );
        $manager->finish;
    }
    $manager->wait_all_children; # necessary after all lists
}

my @commands = ('echo "a"', 'echo "b"', 'echo "c"', 'which ls', 'which rm');
run_parallel(\@commands);
I get an error:
Bareword found where operator expected at debug.pl line 20, near "new Parallel::ForkManager"
All I changed was the use 5.036 line.
Is Parallel::ForkManager incompatible with perl 5.36 or am I doing something wrong?
Perl v5.36 with use v5.36 turns off indirect object notation, where the method comes before the invocant:
my $p = new Some::Module; # indirect object notation
my $p = Some::Module->new(); # what you should do
If this is inconvenient for you in the short term, you can require the minimum version instead; require asserts the version without loading the feature bundle, so indirect notation stays available:
require v5.36;
If you don't actually use v5.36 features, also consider requiring the minimum version that your code actually needs. In your snippet, I don't immediately see any minimum version requirement (other than just Perl 5).
Loading the 5.36 feature bundle (which you do through use 5.036;) disables the indirect method call syntax as if you had done no feature qw( indirect );.
This is a method call using the indirect syntax:
METHODNAME INVOCANT ARGS
Either re-enable the feature or use the "direct" syntax:
INVOCANT->METHODNAME( ARGS )
In your case,
my $manager = Parallel::ForkManager->new( 2 );
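Alternatively, if you really want to keep the old syntax for now, you can opt back in to the feature after loading the bundle. A small self-contained sketch (the Counter class is made up for illustration):

```perl
use v5.36;                   # disables the 'indirect' feature, among other things
use feature qw( indirect );  # opt back in (fine short-term, discouraged long-term)

package Counter {
    sub new { my ($class, $n) = @_; return bless { n => $n }, $class }
    sub n   { return $_[0]{n} }
}

my $c = new Counter(2);      # indirect notation parses again
say $c->n;                   # prints 2
```

The direct Counter->new(2) form is still the better fix, since indirect notation is ambiguous to the parser and is the reason it was disabled in the v5.36 bundle.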

Getopts to flag bad options without dash

Getopt::Long::Configure("no_pass_through");
my %opts = ();
GetOptions(\%opts,
    'opt1=s',
    'opt2=s',
    'opt3',
);
test.pl bad_option_without_dash
How do I make getopts flag an error when a bad option is passed without a dash? I was expecting that no_pass_through will take care of this. What am I missing?
Getopt::Long just extracts the options. It's up to you to validate the values of those options and the non-option arguments (which are left in @ARGV).
Specifically, if you want to make sure that only options have been passed, then you can use
@ARGV == 0
    or die("usage\n");
What I use:
use Getopt::Long qw( );

my ( $opt_opt1, $opt_opt2, $opt_opt3 );

sub parse_args {
    ( $opt_opt1, $opt_opt2, $opt_opt3 ) = ();

    Getopt::Long::Configure(qw( posix_default ));
    Getopt::Long::GetOptions(
        'help|h|?' => \&help,
        'opt1=s'   => \$opt_opt1,
        'opt2=s'   => \$opt_opt2,
        'opt3'     => \$opt_opt3,
    )
        or usage();

    # Validate $opt_* here if necessary.

    !@ARGV
        or usage("Too many arguments.");

    return @ARGV;
}

sub main {
    # my () = @_;  # This program doesn't accept non-option args.
    ...
}

main(parse_args());
Helpers:
use File::Basename qw( basename );
use feature qw( say );

sub help {
    my $prog = basename($0);
    print
        "Usage:
   $prog [options]
   $prog --help

Options:
   --opt1 FOO
      ...
   --opt2 BAR
      ...
   --opt3
      ...
";
    exit(0);
}

sub usage {
    if (@_) {
        my ($msg) = @_;
        chomp($msg);
        say STDERR $msg;
    }
    my $prog = basename($0);
    say STDERR "Try '$prog --help' for more information.";
    exit(1);
}
Getopt::Long operates only on options: arguments starting with hyphens. Without passthrough (no_pass_through is the default), it removes them from @ARGV, leaving any non-option arguments for you to handle. If you expect no non-option arguments, you can conclude that the options were passed incorrectly whenever any arguments remain after calling GetOptions.
die "Usage: $0 [--opt1=string] [--opt2=string] [--opt3]\n" if @ARGV;
The return value of GetOptions is also useful, as it will indicate whether any unrecognized or invalid options were found.
GetOptions(...) or die "Usage: $0 ...\n";
Options are denoted with a leading dash (hyphen); arguments are not.
You might find Getopt::Long Argument-callback useful:
A special option 'name' <> can be used to designate a subroutine to handle non-option arguments. When GetOptions() encounters an argument that does not look like an option, it will immediately call this subroutine and passes it one parameter: the argument name.
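So, to actively reject stray non-option arguments at parse time rather than checking @ARGV afterwards, a sketch along those lines (option names taken from the question):

```perl
use strict;
use warnings;
use Getopt::Long qw( GetOptions );

my ($opt1, $opt2, $opt3);
my $ok = GetOptions(
    'opt1=s' => \$opt1,
    'opt2=s' => \$opt2,
    'opt3'   => \$opt3,
    # '<>' is called for each argument that doesn't look like an option;
    # die()-ing in the callback makes GetOptions() warn and return false.
    '<>'     => sub { die "Unexpected argument: $_[0]\n" },
);
$ok or die "Usage: $0 [--opt1 STR] [--opt2 STR] [--opt3]\n";
```

With this, test.pl bad_option_without_dash fails at the GetOptions call instead of silently leaving the stray word in @ARGV.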

perl - use command-line argument multiple times

I'm modifying a perl script in which the command line arguments are parsed like this:
if ($arg eq "-var1") {
    $main::variable1 = shift(@arguments)
} elsif ($arg eq "-var2") {
    $main::variable2 = shift(@arguments)
} elsif ($arg eq "var3") {
    $main::variable3 = shift(@arguments)
} ...
So there is a whole bunch of elsif statements to cover all command-line arguments.
I'm now in a situaton where I want to use the argument '-var2' multiple times.
So my main::variable2 should maybe be an array that contains all values that are passed with "-var2".
I found that with Perl::getopt, this can be easily achieved (Perl Getopt Using Same Option Multiple Times).
However the way that my script parses its command-line arguments is different. So I was wondering if it could be achieved, without having to change the way the arguments are parsed.
That's not your actual code, is it? It won't even compile.
I'd be really surprised if Getopt::Long can't solve your problem and it's really a better idea to use a library rather than writing your own code.
But changing your code to store -var2 options in an array is simple enough.
my ($variable1, @variable2, $variable3);

if ($arg eq "-var1") {
    $variable1 = shift(@arguments)
} elsif ($arg eq "-var2") {
    push @variable2, shift(@arguments)
} elsif ($arg eq "-var3") {
    $variable3 = shift(@arguments)
}
(I've also removed the main:: from your variables and added the, presumably missing, $s. It's really unlikely that you want to be using package variables rather than lexical variables.)
This particular wheel already exists. Please don't try to reinvent it. That just makes it a pain for the people trying to use your script. There's no reason to force people to learn a whole new set of rules in order to execute your program.
use File::Basename qw( basename );
use Getopt::Long qw( );
use feature qw( say );

my $foo;
my @bars;
my $baz;

sub help {
    my $prog = basename($0);
    print
        "Usage:
   $prog [options]
   $prog --help

Options:
   --foo foo
      ...
   --bar bar
      May be used multiple times.
      ...
   --baz baz
      ...
";
    exit(0);
}

sub usage {
    if (@_) {
        my ($msg) = @_;
        chomp($msg);
        say STDERR $msg;
    }
    my $prog = basename($0);
    say STDERR "Try '$prog --help' for more information.";
    exit(1);
}

sub parse_args {
    Getopt::Long::Configure(qw( posix_default ));
    Getopt::Long::GetOptions(
        "help"  => \&help,
        "foo=s" => \$foo,
        "bar=s" => \@bars,
        "baz=s" => \$baz,
    )
        or usage();

    !@ARGV
        or usage("Too many arguments");

    return @ARGV;
}

main(parse_args());
Well, it is good practice to document your code -- you will appreciate it as soon as you return to make changes.
NOTE: on Linux, the perl-doc package must be installed to use the --man option to its full extent.
#!/usr/bin/perl
#
# Description:
#   Describe purpose of the program
#
# Parameters:
#   Describe parameters purpose
#
# Date: Tue Nov 29 1:18:00 UTC 2019
#
use warnings;
use strict;

use Getopt::Long qw(GetOptions);
use Pod::Usage;
use Data::Dumper;   # needed for the --debug dump below

my %opt;

GetOptions(
    'input|i=s'  => \$opt{input},
    'output|o=s' => \$opt{output},
    'debug|d'    => \$opt{debug},
    'help|?'     => \$opt{help},
    'man'        => \$opt{man}
) or pod2usage(2);

pod2usage(1) if $opt{help};
pod2usage(-exitval => 0, -verbose => 2) if $opt{man};

print Dumper(\%opt) if $opt{debug};
__END__

=head1 NAME

program - describe program's functionality

=head1 SYNOPSIS

program.pl [options]

 Options:
   -i,--input    input filename
   -o,--output   output filename
   -d,--debug    output debug information
   -?,--help     brief help message
   --man         full documentation

=head1 OPTIONS

=over 4

=item B<-i,--input>

Input filename

=item B<-o,--output>

Output filename

=item B<-d,--debug>

Print debug information.

=item B<-?,--help>

Print a brief help message and exits.

=item B<--man>

Prints the manual page and exits.

=back

=head1 DESCRIPTION

B<This program> accepts B<several parameters> and operates with B<them> to produce some B<result>

=cut

How to pass directory path as arguments from command line using perl?

My question is as follows:
I am stuck on how to pass a directory path as a command-line argument instead of hard-coding the path in Perl.
For example, suppose I execute the script as follows:
./welcome.pl -output_dir "/home/data/output"
My code:
#!/usr/local/bin/perl
use strict;
use warnings 'all';
use Getopt::Long 'GetOptions';

GetOptions(
    'output=s' => \my $output_dir,
);

my $location_dir = "/home/data/output";
print $location_dir;
Code explanation:
The script currently prints the hard-coded $location_dir. I want the path given on the command line (captured in $output_dir) to end up in that variable instead of being hard-coded. How can I do that?
use strict;
use warnings 'all';

use File::Basename qw( basename );
use Getopt::Long qw( GetOptions );

sub usage {
    if (@_) {
        my ($msg) = @_;
        chomp($msg);
        print(STDERR "$msg\n");
    }
    my $prog = basename($0);
    print(STDERR "$prog --help for usage\n");
    exit(1);
}

sub help {
    my $prog = basename($0);
    print(STDERR "$prog [options] --output output_dir\n");
    print(STDERR "$prog --help\n");
    exit(0);
}

Getopt::Long::Configure(qw( posix_default ));  # Optional, but makes the argument handling consistent with other programs.

GetOptions(
    'help|h|?' => \&help,
    'output=s' => \my $location_dir,
)
    or usage();

defined($location_dir)
    or usage("--output option is required\n");

print("$location_dir\n");
Of course, if the argument is actually required, why not just use ./welcome.pl "/home/data/output" instead of a not-really-optional option?
use strict;
use warnings 'all';

use File::Basename qw( basename );
use Getopt::Long qw( GetOptions );

sub usage {
    if (@_) {
        my ($msg) = @_;
        chomp($msg);
        print(STDERR "$msg\n");
    }
    my $prog = basename($0);
    print(STDERR "$prog --help for usage\n");
    exit(1);
}

sub help {
    my $prog = basename($0);
    print(STDERR "$prog [options] [--] output_dir\n");
    print(STDERR "$prog --help\n");
    exit(0);
}

Getopt::Long::Configure(qw( posix_default ));  # Optional, but makes the argument handling consistent with other programs.

GetOptions(
    'help|h|?' => \&help,
)
    or usage();

@ARGV == 1
    or usage("Incorrect number of arguments\n");

my ($location_dir) = @ARGV;

print("$location_dir\n");

How can I pipe two Perl CORE::system commands in a cross-platform way?

I'm writing a System::Wrapper module to abstract away from CORE::system and the qx operator. I have a serial method that attempts to connect command1's output to command2's input. I've made some progress using named pipes, but POSIX::mkfifo is not cross-platform.
Here's part of what I have so far (the run method at the bottom basically calls system):
package main;

my $obj1 = System::Wrapper->new(
    interpreter => 'perl',
    arguments   => [-pe => q{''}],
    input       => ['input.txt'],
    description => 'Concatenate input.txt to STDOUT',
);

my $obj2 = System::Wrapper->new(
    interpreter => 'perl',
    arguments   => [-pe => q{'$_ = reverse $_'}],
    description => 'Reverse lines of input',
    output      => { '>' => 'output' },
);

$obj1->serial( $obj2 );

package System::Wrapper;
#...
sub serial {
    my ($self, @commands) = @_;

    eval {
        require POSIX; POSIX->import();
        require threads;
    };

    my $tmp_dir = File::Spec->tmpdir();
    my $last    = $self;
    my @threads;

    push @commands, $self;

    for my $command (@commands) {
        croak sprintf
            "%s::serial: type of args to serial must be '%s', not '%s'",
            ref $self, ref $self, ref $command || $command
            unless ref $command eq ref $self;

        my $named_pipe = File::Spec->catfile( $tmp_dir, int \$command );

        POSIX::mkfifo( $named_pipe, 0777 )
            or croak sprintf
                "%s::serial: couldn't create named pipe %s: %s",
                ref $self, $named_pipe, $!;

        $last->output( { '>' => $named_pipe } );
        $command->input( $named_pipe );

        push @threads, threads->new( sub { $last->run } );

        $last = $command;
    }

    $_->join for @threads;
}
#...
#...
My specific questions:
1. Is there an alternative to POSIX::mkfifo that is cross-platform? Win32 named pipes don't work, as you can't open them as regular files; neither do sockets, for the same reason.
2. The above doesn't quite work: the two threads are spawned correctly, but nothing flows across the pipe. I suspect pipe deadlock or output buffering. What throws me off is that when I run the two commands in an actual shell, everything works as expected.
Point 2 is solved; a -p fifo file test was not testing the correct file.
Out of interest, why do you need a FIFO at all? Couldn't you just set up a regular pipe (e.g. with pipe)? And why use threads when fork is much more widely supported?
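For instance, a bare-bones pipe()+fork() sketch that connects a child's STDOUT to the parent with no FIFO on disk (the child command here is just a stand-in):

```perl
use strict;
use warnings;

# Anonymous pipe: $writer in the child, $reader in the parent.
pipe(my $reader, my $writer) or die "pipe: $!";

my $pid = fork() // die "fork: $!";
if ($pid == 0) {                       # child: writer end
    close $reader;
    open STDOUT, '>&', $writer or die "dup: $!";
    # $^X is the running perl; any command would do here.
    exec $^X, '-e', 'print "hello\n"' or die "exec: $!";
}

close $writer;                         # parent: reader end
my $line = <$reader>;
close $reader;
waitpid($pid, 0);
print "got: $line";
```

Closing the unused ends in each process matters: if the parent kept $writer open, the read would never see EOF.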
In fact, you could instead use a CPAN module to do most of your work for you. IPC::Run for example:
use IPC::Run qw(run);
run ['perl', '-pe', ''], '<', 'input.txt', '|', ['perl', '-pe', '$_ = reverse $_'], '>', 'output';
...should work as you expect, on Linux or Windows.