Pass a parameter to a Perl script executed through qsub - perl

Hi, I would like to pass a parameter to my Perl script that should be executed through qsub.
So I run:
qsub -l nodes=node01 -v "i=500" Test.pl
while in Test.pl I try to read the i parameter in several ways:
use Getopt::Long;
$result = GetOptions ("i" => \$num);
open(FILE,">/data/home/FILEout.txt");
print FILE "$num\n";
print FILE "$ARGV[0]";
close(FILE);
Unfortunately, the output file of the Perl script is always empty.
Do you have any suggestions? Where am I wrong? Help, please.

According to all documentation I can find, -v sets an environment variable, so you'd use $ENV{i} to get 500. (Check your own documentation to confirm.)
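For example, a minimal sketch of picking the value up inside Test.pl under that assumption (not tested against your scheduler):
my $num = $ENV{i};    # should be "500" if qsub exported i into the job's environment
print defined $num ? "$num\n" : "i was not set in the environment\n";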
If you wanted to actually pass an arg to your script, you could try using
qsub ... Test.pl -i=500
But based on my web search, that might only work for some versions of qsub. Others would require that you make a helper script (say Test.sh)
#!/bin/sh
Test.pl "-i=$i"
along with the command
qsub ... -v 'i=500' Test.sh
If qsub ... Test.pl ...args... is supported and you can change your script, the simplest solution is
qsub ... Test.pl 500
and
my ($i) = @ARGV;

I finally got a solution that works with PBS Professional 10.4.
There are two ways to solve it.
The first one is the following:
echo "perl /path/to/Test.pl -i 500" | qsub -l nodes=node06
The second one is to use
qsub -l nodes=node06 -v i=500 Test.pl
and read the parameter in Test.pl through $ENV{i}.
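For reference, a sketch of how the original Test.pl could be rewritten for that second approach (same output path as in the question; untested against PBS itself):
#!/usr/bin/perl
use strict;
use warnings;

# qsub -v i=500 exports i into the job environment rather than passing it as an argument
my $num = $ENV{i};
die "i was not set in the environment\n" unless defined $num;

open(my $fh, '>', '/data/home/FILEout.txt') or die "Cannot open output file: $!";
print $fh "$num\n";
close($fh);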

Related

Force Perl to stop special processing of command line arguments if -e eval switch is used

perl -e 'print(123, @ARGV);' a b
# 123ab
perl -e 'print(123, @ARGV);' --help
# prints Perl's help instead
This is a toy example demonstrating the problem. In my real use case I'm using -e to execute a large script from an embedded interpreter via the perl_parse(...) function; the script has its own processing of the --help switch, so I'd like to block any special processing of command-line arguments after -e.
Is it possible?
Use a double hyphen to stop argument processing:
$ perl -e'print "[@ARGV]\n"' -- --help
[--help]
$
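To see why this helps a script that does its own option handling, here is a small sketch (the --help handler is made up for the example): after --, the switch reaches @ARGV untouched and can be consumed by Getopt::Long inside the code rather than by perl itself.
perl -e 'use Getopt::Long; my $help; GetOptions("help" => \$help); print $help ? "script-level help\n" : "no help requested\n";' -- --help
# prints "script-level help" rather than Perl's own usage text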

Conditional Perl Shebang?

I have a situation where I need to detect if a particular perl executable, /usr/goofy/bin/perl, exists and, if so, use it to run the Perl script; otherwise use /usr/bin/perl.
I have been struggling with this small POC script, called perlshebang.pl:
#!/bin/sh -e
perls="/usr/goofy/bin/perl /usr/bin/perl"
for pl_exec in $perls
do
    if [ -x $pl_exec ]; then
        exec "$pl_exec -w -S \"$0\" ${1+\"$@\"}"
    fi
done
print "[$^X] Whoop!\n";
When I run this on a system that does not have /usr/goofy/bin/perl I get this error message:
./perlshebang.pl: 6: exec: /usr/bin/perl -w -S "./perlshebang.pl" : not found
And when I run it on a system that does have /usr/goofy/bin/perl I get a similar error message:
./perlshebang.pl: line 6: /usr/goofy/bin/perl -w -S "./perlshebang.pl" : No such file or directory
I think I am close but cannot figure out why I am getting these error messages.
Thanks!
To answer your question, "Why am I getting these error messages?", the problem is your exec line:
exec "/path/to/cmd arg arg"
# This will attempt to execute a file named "cmd arg arg"
# (with spaces in name) in directory /path/to/
Contrast that with
exec /path/to/cmd arg arg
# This will attempt to execute a file named "cmd" in directory
# /path/to/, with arguments "arg" and "arg"
So, that is why the shell complains that it cannot find your executable. You don't have a file named perl -w -S "perlshebang.pl", either under /usr/bin/ or under /usr/goofy/bin/.
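A corrected exec call, splitting the interpreter from its arguments instead of quoting the whole string, might look like this sketch:
# interpreter and arguments are separate words, not one big quoted string
exec "$pl_exec" -w -S "$0" ${1+"$@"}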
This sounds a little ugly to me if you are releasing software that uses this hack.
If you have no other choice, then I suggest you make sure there is always a /usr/goofy/bin/perl, and use the shebang line
#!/usr/goofy/bin/perl
on all your scripts.
For those systems where you want to use the system perl, just make /usr/goofy/bin/perl a symlink to /usr/bin/perl.
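For example, on a host that should fall back to the system perl (assuming you are allowed to create the directory):
mkdir -p /usr/goofy/bin
ln -s /usr/bin/perl /usr/goofy/bin/perl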
A co-worker of mine came up with this. I am not sure I fully understand it but it seems to work fine:
#!/bin/sh
#! -*-perl-*-
eval ' if test -x /usr/goofy/bin/perl ; then
    exec /usr/goofy/bin/perl -x -S $0 ${1+"$@"};
elif test -x /usr/bin/perl ; then
    exec /usr/bin/perl -x -S $0 ${1+"$@"};
fi '
if $running_under_some_shell;
use strict;
use warnings;
print "hello world\n"; # if $foo;
printf("running %s v(%vd)\n", $^X, $^V);
__END__
unpod like docs.
See http://perldoc.perl.org/perlrun.html
You can drive this from a small wrapper Perl script run with /usr/bin/perl. Put the 'goofy perl' in the shebang line of the script that should run, then invoke the following wrapper with the normal invocation of the script (its name and arguments) as its arguments.
#!/usr/bin/perl
exec "#ARGV";
exec "/usr/bin/perl #ARGV";
print STDERR "Couldn't execute either.\n";
Let's call the above pick_perl.pl, and your script is script.pl. Run it all as
pick_perl.pl script.pl args-for-script
The exec replaces the running program altogether with the one it executes, i.e. it loads the new program. Thus your script runs with its own shebang. If that fails, exec returns quietly (with false) and the next statement is executed, so the other Perl runs the script (overriding the shebang). This happens if the script's shebang fails, but also if the first exec fails to execute for any reason.
If you wish/need to run checks then put exec in a full if block. One can also interrogate the 'goofy_perl' file further if -e isn't assuring enough.
#!/usr/bin/perl
$system_perl = "/usr/bin/perl";
$goofy_perl = "/usr/goofy/bin/perl";
# Your 'goofy_perl' script with its arguments
@script_cmd = @ARGV;
if (-x $goofy_perl) { exec "@script_cmd" }
exec "$system_perl @script_cmd";
The @script_cmd array has the full command line for your script (which has the 'goofy_perl' shebang).

Output of perl debugger to file (Windows)

I tried to list all the subroutines of a script with the Perl debugger and put the results in an external file, but it didn't work.
My code:
perl -d -S myscript.pl > results.txt
-S = list all subroutines
-d = debug perl script
Greets,
The -S isn't supposed to be used as a command line switch. Running perl -d will start a debugger process, and one of the commands you can use there is S.
Example:
$ perl -d tmp/splithttpdconf.pl
Loading DB routines from perl5db.pl version 1.28
Editor support available.
Enter h or `h h' for help, or `man perldebug' for more help.
main::(tmp/splithttpdconf.pl:6): my $basedir = shift;
DB<1> S main::
main::BEGIN
main::debug
main::splitconf
DB<2>
In order to get the kind of output you want, you can use the profiler module Devel::DProf instead. It'll output profiler info into a file which can be read by the dprofpp program. Here's an example to get the list of subroutines:
perl -d:DProf perlscript.pl; dprofpp -T
If you only want the subroutines within your own script, and not those loaded from other modules, add a grep to it, e.g.:
perl -d:DProf perlscript.pl; dprofpp -T | grep main::
Though for the particular question of knowing what subroutines exist in a given program, provided you use a consistent coding style, it'd probably be easier to just grep for "sub.*{" to start with.
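For instance, a rough Perl one-liner in that spirit; it only catches subs declared as a literal "sub NAME" at the start of a line, so treat it as a heuristic:
perl -ne 'print "$.: $1\n" if /^\s*sub\s+(\w+)/' myscript.pl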
In your home directory, create a file called .perldb with the following contents:
parse_options("NonStop=1 LineInfo=results.txt AutoTrace=1 frame=2");
And then run the command
perl -d myscript.pl
If you want to scan the script and list all the subroutines as Perl sees them before it runs:
perl -MO=Deparse -f myscript.pl

How to check if a Perl script doesn't have any compilation errors?

I am calling many Perl scripts in my Bash script (sometimes from csh also).
At the start of the Bash script I want to put a test which checks if all the Perl scripts are devoid of any compilation errors.
One way of doing this would be to actually call the Perl script from the Bash script and grep for "compilation error" in the piped log file, but this becomes messy as different Perl scripts are called at different points in the code, so I want to do this at the very start of the Bash script.
Is there a way to check if the Perl script has no compilation error?
Beware!!
Using the below command to check compilation errors in your Perl program can be dangerous.
$ perl -c yourperlprogram
Randal has written a very nice article on this topic which you should check out
Sanity-checking your Perl code (Linux Magazine Column 91, Mar 2007)
Quoting from his article:
Probably the simplest thing we can tell is "is it valid?". For this,
we invoke perl itself, passing the compile-only switch:
perl -c ourprogram
For this operation, perl compiles the program,
but stops just short of the execution phase. This means that every
part of the program text is translated into the internal data
structure that represents the working program, but we haven't actually
executed any code. If there are any syntax errors, we're informed, and
the compilation aborts.
Actually, that's a bit of a lie. Thanks to BEGIN blocks (including
their layered-on cousin, the use directive), some Perl code may have
been executed during this theoretically safe "syntax check". For
example, if your code contains:
BEGIN { warn "Hello, world!\n" }
then you will see that message,
even during perl -c! This is somewhat surprising to people who
consider "compile only" to mean "executes no code". Consider the
code that contains:
BEGIN { system "rm", "-rf", "/" }
and you'll see the problem with
that argument. Oops.
Apart from perl -c program.pl, it's also worth checking for warnings during the same compile-only pass by adding the -w switch:
perl -wc program.pl
For details see: http://www.perl.com/pub/2004/08/09/commandline.html
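Coming back to the original question, a minimal sketch of such a check at the very start of the Bash script could look like this (the script names are placeholders, and the BEGIN-block caveat quoted above still applies):
#!/bin/bash
# abort before doing any real work if a Perl script fails to compile
for script in first.pl second.pl third.pl; do
    if ! perl -c "$script" >/dev/null 2>&1; then
        echo "compilation check failed for $script" >&2
        exit 1
    fi
done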
I use the following part of a bash func for larger perl projects :
# foreach perl app in the src/perl dir
while read -r dir ; do
    echo -e "\n"
    echo "start compiling $dir ..." ;
    cd $product_instance_dir/src/perl/$dir ;
    # run the autoloader utility
    find . -name '*.pm' -exec perl -MAutoSplit -e 'autosplit($ARGV[0], $ARGV[1], 0, 1, 1)' {} \;
    # foreach perl file check the syntax by setting the correct INC dirs
    while read -r file ; do
        perl -MCarp::Always -I `pwd` -I `pwd`/lib -wc "$file"
        # run the perltidy inline
        # perltidy -b "$file"
        # sleep 3
        ret=$? ;
        test $ret -ne 0 && break 2 ;
    done < <(find "." -type f \( -name "*.pl" -or -name "*.pm" \))
    test $ret -ne 0 && break ;
    echo "stop compiling $dir ..." ;
    echo -e "\n\n"
    cd $product_instance_dir ;
done < <(ls -1 "src/perl")
When you need to check errors/warnings before running, but your file depends on multiple other files, you can add the -I option:
perl -I /path/to/dependency/lib -c /path/to/file/to/check
Edit: from man perlrun
Directories specified by -I are prepended to the search path for modules (@INC).

Storing output of a perl command into a variable

This works fine:
system("perl -c C:/Users/mytest/scripts/file_name.pm")
This command gives many lines of output in Cygwin and a single "syntax OK" line in CentOS. Since I'll be using Cygwin, what I'm trying to do is get this output into a variable and use it later in my program. How can I do it?
Thanks in advance for your time.
Instead of system, use backticks:
my $output = `perl -c C:/Users/mytest/scripts/file_name.pm`;
If you want to also include STDERR output, use:
my $output = `perl -c C:/Users/mytest/scripts/file_name.pm 2>&1`;
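If you then want to branch on whether the compile check actually succeeded rather than grepping the captured text, a small sketch (using the same path as above):
my $file   = 'C:/Users/mytest/scripts/file_name.pm';
my $output = `perl -c $file 2>&1`;

# $? holds the exit status of the backticked command; 0 means the syntax check passed
if ($? == 0) {
    print "compile check passed:\n$output";
} else {
    print "compile check failed:\n$output";
}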