Pass variables from shell script to Perl script - perl

I'm trying to send two values from my existing shell script to a Perl script for processing. I'm relatively new to Perl. I tried accepting the values with $ARGV[0] and $ARGV[1], but when I print them they come out blank, so it looks like the Perl script never received them. The variables hold strings, and I verified them by printing them in the shell script.
I tried two syntaxes based on available solutions, but neither seems to work.
There are two variables, BITNUM and BITENV, which I'm passing.
In the shell script I did:
/scriptdirectory/execute.pl $BITNUM $BITENV
To accept them in Perl I did:
my $var1 = $ARGV[0];
my $var2 = $ARGV[1];
print "Var 1 is $var1\n";
print "Var 2 is $var2\n";
This did not work, so I tried using export in the shell script:
export BITNUM
export BITENV
/scriptdirectory/execute.pl $ENV{"BITNUM"} $ENV{"BITENV"}
To accept them in Perl I did:
my $var1 = $ENV{BITNUM};
my $var2 = $ENV{BITENV};
print "Var 1 is $var1\n";
print "Var 2 is $var2\n";
Both of these print blank values. Is this happening because I'm passing multiple values, and is there another way of doing this that I haven't tried?

Related

Is there an interactive command line environment for Perl?

Hi, I'm wondering if there is something for Perl similar to RStudio, that is, the ability to run commands and retain all variables in memory without exiting the script.
For example, say I execute my $temp = 83;. Then, instead of ending the script, I change the value ($temp = 22; print "$temp \n";) and so on, continuing to work without restarting. This would be extremely helpful when dealing with large datasets and for general workflow.
The closest thing I came across is Visual Studio Code with a plugin that lets me execute specific chunks of code in my script. However, I did not find a way to keep the variables persistently in memory.
Thanks!
You want a REPL.
Take a look at Devel::REPL. It provides a script called re.pl that you can run.
$ re.pl
$ my $foo = 123;
123
$ use feature 'say';
$ $foo + 1;
124
$
A newer alternative is Reply with its reply script.
$ reply
0> my $foo = 123;
$res[0] = 123
1> $foo + 2
$res[1] = 125
2>
For a comparison, you can read this blog post by Matt Trout.

Pass arguments from command line in Perl

I have written some Perl code and am trying to pass command-line arguments to it, but I'm not getting the expected output.
Here is my code:
my ($buildno, $appname, $ver) = @ARGV;
print values \@ARGV;
$Artifact_name = "flora-$appname-$ver";
mkdir "$target_dir/$Artifact_name";
When I run the Perl script perl scripts\perl\test.pl "MobileApp", %ver%, I am getting the following output: flora-MobileApp-
And the log message is showing
'Use of uninitialized value $ver in concatenation (.) or string at
Jscripts\perl\test.pl line 31 (#3)'.
%ver% is the environment variable and its value is 1.0.1.23.
I am expecting the output flora-MobileApp-1.0.1.23.
So you run the program like this:
perl scripts\perl\test.pl "MobileApp", %ver%
And then, within the program you are accessing the command line arguments like this:
my ($buildno, $appname, $ver) = @ARGV;
There's an obvious mismatch here. You are passing in two arguments, but expecting three. You will end up with "MobileApp" in $buildno and the contents of %ver% in $appname. Your last variable, $ver, will remain undefined.
If you want three variables to be set, then you need to pass three arguments.
And, if you wanted to investigate this, then surely it would have been simple to print out the values of your three variables?
The "fix" is to pass another value as the first command line option so that the parameters line up with the variables.
perl scripts\perl\test.pl something_else "MobileApp" %ver%
But I'm fascinated to hear what level of confusion makes this problem hard to debug.
Update: Another option (as PerlDuck has reminded me in a comment) would be to fix the code to only expect two arguments.
my ($appname, $ver) = @ARGV;

Does the Perl default variable $_ or @_ behave differently on OS X than on Linux when using split?

I am new to Perl, and have searched long and hard for the answer to this problem, but I am stuck.
Take the following example:
my $filein = $ARGV[0];
open(SNPIN,$filein);
while(<SNPIN>)
{
chomp;
split(/\s+/);
print "$_[0]\n";
}
close(SNPIN);
The test file that I am using has the following lines:
This is a test.
Is a test.
A test.
This script executes fine on our linux servers (with perl 5.10), outputting the first word of each line - although it gives me the following warning:
Use of implicit split to @_ is deprecated at scan_test.pl line 7.
but when I try to execute it on my local machine running OS X (with perl 5.12.3) I get the following error:
Useless use of split in void context at scan_test.pl line 7.
Use of uninitialized value $_[0] in concatenation (.) or string at
scan_test.pl line 8, <SNPIN> line 1.
Obviously this is a dummy script. I have inherited someone else's extremely long and complicated script that works on our servers, but I would like to develop it locally without going through the entire script and reassigning every default-variable call to another variable. No matter what I have tried (including use v5.10;), nothing allows me to use the default variable on my local machine.
Any ideas? Help is most appreciated.
It's not the platform, but the Perl version.
The warning you got on Perl 5.10 should've been a clue: deprecated means "you should not use this, because it's going to stop working in some later version". Obviously, it did stop working somewhere between Perl 5.10 and 5.12.
To fix it, just explicitly assign the list returned by split to an array. If you want to keep using @_, this should work:
chomp;
@_ = split(/\s+/);
print "$_[0]\n";
but I'd really recommend using your own named array, since @_ has a special meaning (it's used for passing parameters to subroutines):
chomp;
my @items = split(/\s+/);
print "$items[0]\n";

Can I obtain values from a perl script using a system call from the middle of another perl script?

I'm trying to modify a script that someone else has written and I wanted to keep my script separate from his.
The script I wrote ends with a print line that outputs all relevant data separated by spaces.
Ex: print "$sap $stuff $more_stuff";
I want to use this data in the middle of another perl script and I'm not sure if it's possible using a system call to the script I wrote.
Ex: system("./sap_calc.pl $id"); #obtain printed data from sap_calc.pl here
Can this be done? If not, how should I go about this?
Somewhat related, but not using system():
How can I get one Perl script to see variables in another Perl script?
How can I pass arguments from one Perl script to another?
You're looking for the "backtick operator."
Have a look at perlop, Section "Quote-like operators".
Generally, capturing a program's output goes like this:
my $output = `/bin/cmd ...`;
Mind that the backtick operator captures STDOUT only, so in order to capture everything (STDERR too) the command needs the usual shell redirection 2>&1 appended.
If you want to use the data printed to stdout from the other script, you'd need to use backticks or qx().
system only returns the command's exit status, not its output.
The cleaner way, though, is to import the actual code into your other script, either by building a module or simply by using do.
As a general rule, it is better to use pure-Perl solutions than to rely on system/shell calls as a way of "simplifying".
myfile.pl:
sub foo {
print "Foo";
}
1;
main.pl:
do 'myfile.pl';
foo();
perldoc perlipc
Backquotes, like in shell, will yield the standard output of the command as a string (or array, depending on context). They can more clearly be written as the quote-like qx operator.
@lines = `./sap_calc.pl $id`;
@lines = qx(./sap_calc.pl $id);
$all = `./sap_calc.pl $id`;
$all = qx(./sap_calc.pl $id);
open can also be used for streaming instead of reading into memory all at once (as qx does). This can also bypass the shell, which avoids all sorts of quoting issues.
open my $fh, '-|', './sap_calc.pl', $id
    or die "Cannot run sap_calc.pl: $!";
while (readline $fh) {
print "read line: $_";
}

Is it possible to send POST parameters to a CGI script without another HTTP request?

I'm attempting to run a CGI script in the current environment from another Perl module. All works well using standard system calls for GET requests. POST is fine too, until the parameter list gets too long; then the parameters get cut off.
Has anyone run into this problem, or does anyone have suggestions for other ways to attempt this?
The following are somewhat simplified for clarity. There is more error checking, etc.
For GET requests and POST requests w/o parameters, I do the following:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
system {$perl} $cgi;
Parameters are passed through the QUERY_STRING environment variable. STDOUT is captured by the calling script, so whatever the CGI script prints behaves as normal. This part works.
For POST requests with parameters the following works, but seemingly limits my available query length:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
# Gather parameters into a URL-escaped string suitable
# to pass to a CGI script ran from the command line.
# Null characters are handled properly.
# e.g., param1=This%20is%20a%20string&param2=42&... etc.
# This works.
my $param_string = $self->get_current_param_string();
# Various ways to do this, but system() doesn't pass any
# parameters (different question).
# Using qx// and printing the return value works as well.
open(my $cgi_pipe, "|$perl $cgi");
print {$cgi_pipe} $param_string;
close($cgi_pipe);
This method works for short parameter lists, but if the entire command gets to be close to 1000 characters, the parameter list is cut short. This is why I attempted to save the parameters to a file; to avoid shell limitations.
If I dump the parameter list from the executed CGI script I get something like the following:
param1=blah
... a bunch of other parameters ...
paramN=whatever
p <-- cut off after 'p'. There are more parameters.
Other things I've done that didn't help or work
Followed the CGI troubleshooting guide
Saved the parameters to a file using CGI->save(), passing that file to the CGI script. Only the first parameter is read using this method.
$ perl index.cgi < temp-param-file
Saved $param_string to a file, passing that file to the CGI script just like above. Same limitations as passing the commands through the command line; still gets cut off.
Made sure $CGI::POST_MAX is acceptably high (it's -1).
Made sure the CGI's command-line processing was working. (:no_debug is not set)
Ran the CGI from the command line with the same parameters. This works.
Leads
Obviously, this seems like a character limit of the shell Perl is using to execute the command, but it wasn't resolved by passing the parameters through a file.
Passing parameters to system as a single string, from HTTP input, is extremely dangerous.
From perldoc -f system,
If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument,..
In other words, if I pass in the arguments -e printf("working..."); rm -rf /;, I can delete information from your disk (everything, if your web server is running as root). If you choose to do this, make sure you call system("perl", @args) with a list instead.
The argument length issue you're running into may be an OS limitation (described at http://www.in-ulm.de/~mascheck/various/argmax/):
There are different ways to learn the upper limit:
command: getconf ARG_MAX
system header: ARG_MAX in e.g. <[sys/]limits.h>
Saving to a temp file is risky: multiple calls to the CGI might save to the same file, creating a race condition where one user's parameters might be used by another user's process.
You might try opening a pipe to the process and passing the parameters on its standard input instead: open my $pipe, '|-', 'perl', $cgi or die $!; print {$pipe} $param_string;
I didn't want to do this, but I've gone with the most direct approach and it works. I'm tricking the environment to think the request method is GET so that the called CGI script will read its input from the QUERY_STRING environment variable it expects. Like so:
$ENV{'QUERY_STRING'} = $long_parameter_string . '&' . $ENV{'QUERY_STRING'};
$ENV{'REQUEST_METHOD'} = 'GET';
system {$perl_exec} $cgi_script;
I'm worried about potential problems this may cause, but I can't think of what it would harm, and it works well so far. Still, because I'm worried, I asked the horde whether they saw any potential problems:
Are there any problems handling a POST request as a GET request on the server?
I'll hold off marking this as the official answer until people have confirmed, or at least debated, it on the above post.
Turns out that the problem is actually related to the difference in Content-Length between the original parameters and the parameter string I cobbled together. I didn't realize that the CGI module was using this value from the original headers as the limit to how much input to read (makes sense!). Apparently the extra escaping I was doing was adding some characters.
My solution's trick is simply to piece together the parameter string I'll be passing, and to set the environment variable the CGI module checks for the content length to the length of that string.
Here's the final working code:
use CGI::Util qw(escape);
my $params = '';   # start empty to avoid uninitialized-value warnings on .=
foreach my $param (sort $query->param) {
my $escaped_param = escape($param);
foreach my $value ($query->param($param)) {
$params .= "$escaped_param=" . escape("$value") . "&";
}
}
foreach (keys %{$query->{'.fieldnames'}}) {
$params .= ".cgifields=" . escape("$_") . "&";
}
# This is the trick.
$ENV{'CONTENT_LENGTH'} = length($params);
open(my $cgi_pipe, "| $perl $cgi_script") || die("Cannot fork CGI: $!");
local $SIG{PIPE} = sub { warn "spooler pipe broke" };
print {$cgi_pipe} $params;
warn("param chars: " . length($params));
close($cgi_pipe) || warn "Error: CGI exited with value $?";
Thanks for all the help!