I want to run a script from my website using CGI.pm. The script is usually run from the command line and requires several command-line ARGV inputs. How do I deal with this using CGI.pm? Can I insert a system($command) into a Perl CGI script? The script can be seen here - http://www.ncbi.nlm.nih.gov/IEB/ToolBox/C_DOC/lxr/source/doc/blast/web_blast.pl
How do I collect ARGV using Perl CGI?
@ARGV didn't go anywhere, but CGI doesn't use command-line arguments, so there are no command-line arguments to collect.
can I insert a system($command) into Perl CGI script?
Yes.
You can dual-purpose the script by checking if you are connected to a terminal:
if (-t STDOUT) {
    # Command-line mode: use @ARGV
}
else {
    # CGI mode: get the @ARGV equivalent from CGI->param
}
You will have to adjust the output to work in CGI mode by printing content headers before you output anything else.
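For example, a minimal dual-mode sketch (the CGI parameter names here are made up; substitute whatever your script actually expects):
use strict;
use warnings;
use CGI;

my @args;
if (-t STDOUT) {
    # Command-line mode: take arguments exactly as before
    @args = @ARGV;
}
else {
    # CGI mode: print a content header first, then build the argument
    # list from form parameters (these parameter names are hypothetical)
    my $q = CGI->new;
    print $q->header('text/plain');
    @args = ($q->param('program'), $q->param('query'));
}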
If you use system($foo) in a web page, make sure the logic controlling what's in $foo is secure; otherwise you might end up hacked.
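One way to keep that logic safe is to avoid the shell entirely by passing system a list, so the arguments are never reparsed for metacharacters (the tool path and parameter name below are only placeholders):
use strict;
use warnings;
use CGI;

my $user_value = CGI->new->param('query');   # untrusted input from the browser
# The list form of system() bypasses /bin/sh, so metacharacters such as
# ; or | in $user_value cannot inject extra commands.
system('/usr/local/bin/some_tool', '-in', $user_value);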
I need to write a perl script that calls a c-shell script that calls yet another perl script. I cannot change the c-shell script or the perl script it calls. One of the args that needs to be passed is a quotes string with spaces. If I use backticks to call the c-shell, and I run the c-shell with tcsh, the quoted string is respected as a single entity. However, if I run the c-shell with source, it is not.
I feel that I need to use 'source' because when the c-shell is called by users from the command line, it is called through an alias that sources the c-shell. E.g.
alias top "source top.csh"
Consider these...
topmost.pl
#!/usr/bin/env perl
use strict;
print "Try with tcsh...\n";
my $msg = `tcsh ./top.csh -arg1 "this line has spaces"`;
print "$msg\n";
print "Try with source...\n";
$msg = `source ./top.csh -arg1 "this line has spaces"`;
print "$msg\n";
exit;
top.csh is simply....
perl ./subperl.pl $*:q
exit
And subperl.pl is...
#!/usr/bin/env perl
use strict;
print "In subperl.pl\n";
foreach my $x (@ARGV) {
print "$x\n";
}
print "The End\n";
exit;
When I run the topmost.pl script, I get...
Try with tcsh...
In subperl.pl
-arg1
this line has spaces
The End
Try with source...
In subperl.pl
-arg1
this
line
has
spaces:q
The End
Why does the "sourced" call to the top.csh script fail to respect the quotes ?
@Kaz has the answer as to why your code isn't working. This answer is about how to avoid this class of problems entirely.
First, if you can, add a #!/bin/tcsh line to top.csh and make it executable (i.e. chmod +x). Now it can be executed as top.csh without needing to know what shell to use.
Then you'll want to avoid using `` for anything but very simple commands. This is because `` is interpreted by the shell and now you need to worry about shell special characters and escapes and spaces... it's a mess. What you need is a way to call external programs without invoking a shell.
You can do this by passing a list to system, but system cannot capture the output.
system "tcsh", "./top.csh", "-arg1", "this line has spaces";
While you can cobble something together with open and pipes, it's better to use a pre-existing library such as IPC::System::Simple.
use IPC::System::Simple qw(capturex);
# Or capturex("./top.csh", ...) if you add a #! to top.csh.
my $msg = capturex("tcsh", "./top.csh", "-arg1", "this line has spaces");
For more involved interactions with executables, look into System::Command or IPC::Run.
Needless to say, Perl scripts which call shell scripts which call Perl scripts is a bit of a nightmare to maintain. Rather than do that, it is better to scoop the guts of subperl.pl out into a Perl library and have both subperl.pl and your code use that library.
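A rough sketch of that refactor, with hypothetical module and sub names:
# lib/SubPerl.pm
package SubPerl;
use strict;
use warnings;
use Exporter 'import';
our @EXPORT_OK = ('do_the_work');

sub do_the_work {
    # whatever subperl.pl currently does with its arguments
    print "$_\n" for @_;
    return;
}

1;
subperl.pl then shrinks to use SubPerl 'do_the_work'; do_the_work(@ARGV); and your own code can call do_the_work() directly, with no shell in between.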
The command in backticks is being interpreted by your system interpreter (invoked via /bin/sh), which I'm guessing might be GNU Bash. Or, in any case, it seems to be some shell which understands the source command, and almost certainly a POSIX-like shell. That source command quite probably tells that shell to read a script written in that shell's own language. So, for instance, if that shell happens to be Bash, it will treat that as a Bash script [1], not as a Tcsh script.
The only way both could work is if the script is a "polyglot": a program which can be interpreted by either tcsh or the system shell that is used by perl to implement backticks.
(An easy example of a C Shell + POSIX shell polyglot is a script that contains nothing but a sequence of trivial commands consisting of space separated words like cp from to.)
Your script isn't a polyglot; only Tcsh understands the :q syntax, not the other shell.
[1] More precisely, if /bin/sh is Bash, the original source ... command as well as the contents of the sourced top.csh script will be treated as a POSIX-mode Bash script, since when Bash is invoked as /bin/sh, it turns off its POSIX-incompatible behaviors. So even if Bash's pathname expansion supported the Tcsh :q mechanism, it would almost certainly be turned off under POSIX mode because $*:q already has a firm meaning in POSIX.
I came across the following example. I tried to Google it but could not find much, so I'm posting this question here.
What is the benefit of executing the perl script like this?
How can we make the shell script work like a "normal" shell script once we are through executing the perl code?
Here's the code:
#!/bin/ksh
#! -*- perl -*-
eval 'exec $PERLLOCATION/bin/perl -x $0 ${1+"$@"} ;'
if 0;
print "hello world\n";
# how can I make it behave like a "normal" shell script from this point onwards? What needs to be done?
# echo "hello world" ### this results in error
This idiom is described in the perlrun documentation.
The -x switch scans the whole file and ignores anything that appears before the first line that begins with #! and also contains the word perl.
It means that your system will run the script with the Perl interpreter whether you invoke the script with perl or with a shell command (sh/bash/ksh/etc.)
That is,
$ perl this_script
and
$ sh this_script
will both run the script with perl.
To address your second question, this idiom has just about nothing to do with combining shell script and Perl script in the same file. There are a few different ways to approach that problem, but maybe the most readable way is to write in shell script, but use the shell's heredoc notation to invoke perl code.
#!/bin/bash
# this is a bash script, but there is some Perl in here too
echo this line is printed from the shell
echo now let\'s run some Perl
perl <<EOF
# this is now perl script until we get to the EOF
print "This line is printed from Perl\n";
EOF
echo now this is from the shell script again
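If the embedded Perl needs values computed by the shell part, one way (a sketch) is to pass them as arguments and have perl read the program from standard input:
#!/bin/bash
name="world"                      # a value computed by the shell part
perl - "$name" <<'EOF'
my ($who) = @ARGV;                # arguments after "-" end up in @ARGV
print "Hello, $who, from Perl\n";
EOF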
1. If you start a Perl script in the usual way:
#!/usr/bin/perl
print "hello world\n";
the #! line will only work if the Perl interpreter is actually installed under /usr/bin. The perl/ksh bilingual script you show is a tricky kluge to make the script work even if perl is installed somewhere else. For more information, see e.g. this.
2. You can't. When the shell process encounters the exec command, it terminates and hands control over to perl. (Technically, it executes perl in place of the shell, without creating a new process.) The only way to run more shell commands after that would be to launch a new shell.
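What you can do is run any remaining shell commands from the Perl side by spawning a fresh shell explicitly (a sketch; the echo is only a placeholder):
# inside the Perl part of the script
system('ksh', '-c', 'echo "hello again from a new ksh"') == 0
    or warn "shell command failed: $?";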
It's way simpler than what's already been posted.
#!$PERLLOCATION/bin/perl
doesn't work because the shebang (#!) line is interpreted by the kernel (not the shell), and the kernel doesn't do variable interpolation.
The code invokes ksh to expand the environment variable and to launch the specified installation of Perl.
Apologies if some of the terminology is slightly off here. Feel free to correct me if I use a wrong term for something.
Is it possible to use Perl as an "advanced shell" for running "batch" scripts? (on Windows)
The problem I face when replacing a .bat/.cmd script that's getting too complicated with a Perl script is that I can't easily run subprocesses the way a shell does.
That is, I would like my Perl script to do the same thing a shell does when invoking a child process: fully "connect" STDIN, STDOUT and STDERR.
Example:
foo.bat -
@echo off
echo Hello, this is a simple script.
set PARAM_X=really-simple
:: The next line will allow me to simply "use" the tool on the shell I have open,
:: that is STDOUT + STDERR of the tool are displayed on my STDOUT + STDERR and if
:: I enter something on the keyboard it is sent to the tools STDIN
interactive_commandline_tool.exe %PARAM_X%
echo The tool returned %ERRORLEVEL%
However, I have no clue how to fully implement this in perl (is it possible at all?):
foo.pl -
print "Hello, this is a not so simple script.\n";
my $param_x = get_more_complicated_parameter();
# Magic: This sub executes the process and hooks up my STDIN/OUT/ERR and
# returns the process error code when done
my $errlvl = run_executable("interactive_commandline_tool.exe", $param_x);
print "The tool returned $errlvl\n";
How can I achieve this in perl? I played around with IPC::Open3 but it seems this doesn't do the trick ...
You'll probably find IPC::Run3 useful. It allows you to capture both STDOUT and STDERR (but not pipe them in real time). The command's exit status will be returned in $?.
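A minimal sketch of how that might look for the question's tool (run3 is the function IPC::Run3 exports; $param_x stands in for the value computed earlier):
use strict;
use warnings;
use IPC::Run3;

my $param_x = 'really-simple';   # stands in for get_more_complicated_parameter()

my ($out, $err);
# Capture STDOUT and STDERR into scalars; they are not streamed live.
run3 [ 'interactive_commandline_tool.exe', $param_x ], undef, \$out, \$err;
my $errlvl = $? >> 8;
print "The tool returned $errlvl\n";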
Why not this way:
print "Hello, this is a not so simple script.\n";
my $param_x = get_more_complicated_parameter();
system('cmd.exe', $param_x);
my $errorlevel = $? >> 8;
print "The tool returned $errorlevel\n";
sub get_more_complicated_parameter { 42 }
I don't have your interactive program, but the shell it executed allowed me to enter commands; it inherited the environment defined in Perl, and so on.
I have been using Perl as a replacement for more complicated shell scripts on Windows for a long time, and so far everything I needed has been possible.
I've got a batch script that does some processing and calls some perl scripts.
My question is whether there is a way to put the Perl code directly into the batch script and have it run both types of scripts.
ActivePerl has been doing this for years!
Below is a skeleton. You can only call perl once, though, because passing it the -x switch says that the Perl code is embedded in this file; perl reads down the file until it finds a Perl shebang (#!...perl) and starts executing there. Perl will ignore everything past the __END__, and because you told DOS to goto endofperl, the batch interpreter won't bother with anything until it gets to that label.
@rem = '--*-Perl-*--
@echo off
perl -x -S %0 %*
goto endofperl
@rem -- BEGIN PERL -- ';
#!d:/Perl/bin/perl.exe -w
#line 10
use strict;
__END__
:endofperl
Yes you can.
In fact this is exactly what the pl2bat tool does: it transforms a Perl program into a batch file which embeds the Perl program. Have a look at pl2bat.bat itself.
So you can take the .pl file, convert it with pl2bat, and then tweak the batch part as you need. Most of the batch code must be put at the end of the file (near the :endofperl label), because in the code at the top you cannot use single quotes.
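For example, assuming pl2bat is on your PATH (it ships with the common Windows Perl distributions) and myscript.pl is your program:
pl2bat myscript.pl
rem myscript.bat now exists, with myscript.pl embedded in the batch wrapper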
However:
this simple approach will not work if you need to embed more than one Perl file
this will be a maintenance nightmare.
So I suggest instead to write the whole process in one Perl program.
Update: if you have one script and some Perl modules that you want to combine in a single batch file, you can combine the Perl file using fatpack, and then apply pl2bat on the result.
There is a way to do this, but it won't be pretty. You can echo your Perl code into a temp .pl file and then run that file from within your .bat.
I am trying to send a GET or a POST through a command-line argument. That is, I want to test the script on the command line before I test it through a browser (the server has issues). I tried searching online, and I suppose I was probably using incorrect terminology because I got nothing. I know this is possible because I saw someone do it. I just don't remember how it was done.
Thanks! :)
To test a CGI program from the command line, you fake the environment that the server creates for the program. CGI.pm has a special offline mode, but I often find it easier not to use it because of the extra setup I need to do for everything else my programs typically expect.
Depending on the implementation of your script, this involves setting many environment variables, which you can do from a wrapper script that pretends to be the server:
#!/bin/bash
export HTTP_COOKIE=...
export HTTP_HOST=test.example.com
export HTTP_REFERER=...
export HTTP_USER_AGENT=...
export PATH_INFO=
export QUERY_STRING=$(cat query_string);
export REQUEST_METHOD=GET
perl program.cgi
If you're doing this for a POST request, the environment is slightly different and you need to supply the POST data on standard input:
#!/bin/bash
export CONTENT_LENGTH=$(perl -e "print -s q/post_data/");
export HTTP_COOKIE=...
export HTTP_HOST=test.example.com
export HTTP_REFERER=...
export HTTP_USER_AGENT=...
export PATH_INFO=...
export QUERY_STRING=$(cat query_string);
export REQUEST_METHOD=POST
perl program.cgi < post_data
You can make this as fancy as you need and each time you want to test the program, you change up the data in the query_string or post_data files. If you don't want to do this in a shell script, it's just as easy to make a wrapper Perl script.
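A Perl version of that wrapper might look like this (a sketch; program.cgi and the query_string file are the same placeholders used above):
#!/usr/bin/perl
use strict;
use warnings;

# Read the test parameters from the query_string file.
open my $fh, '<', 'query_string' or die "query_string: $!";
chomp( my $qs = <$fh> );

# Fake the parts of the CGI environment the program cares about.
$ENV{REQUEST_METHOD} = 'GET';
$ENV{QUERY_STRING}   = $qs;
$ENV{HTTP_HOST}      = 'test.example.com';

exec 'perl', 'program.cgi' or die "exec failed: $!";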
Are you using the standard CGI module?
For example, with the following program (notice -debug in the arguments to use CGI)
#! /usr/bin/perl
use warnings;
use strict;
use CGI qw/ :standard -debug /;
print "Content-type: text/plain\n\n",
map { $_ . " => " . param($_) . "\n" }
param;
you feed it parameters on the command line:
$ ./prog.cgi foo=bar baz=quux
Content-type: text/plain
foo => bar
baz => quux
You can also do so via the standard input:
$ ./prog.cgi
(offline mode: enter name=value pairs on standard input; press ^D or ^Z when done)
foo=bar
baz=quux
^D
Content-type: text/plain
foo => bar
baz => quux
Old discussion, but I was looking for the same answers - so for those who follow - this is what I found out
RTFM! From the CGI man page (and there is more):
DEBUGGING
If you are running the script from the command line or in the perl
debugger, you can pass the script a list of keywords or parameter=value
pairs on the command line or from standard input (you don't have to
worry about tricking your script into reading from environment
variables). You can pass keywords like this:
your_script.pl keyword1 keyword2 keyword3
or this:
your_script.pl keyword1+keyword2+keyword3
or this:
your_script.pl name1=value1 name2=value2
or this:
your_script.pl name1=value1&name2=value2
To turn off this feature, use the -no_debug pragma.
If you don't want to alter the perl script, you can call it with at least two environment variables set, as others mentioned already. To simulate a GET request:
shell$ QUERY_STRING=limit=20 REQUEST_METHOD=GET ./events_html.pl
That's the console shortcut for www.myserver.org/events_html.pl?limit=20
Yes, it's possible to do this from the command line, bypassing your server. This page explains it all: Perl CGI debugging (sitewizard.com), especially item 6 on that page. Here I quote the most important part:
To test the script offline using the GET method, simply set the QUERY_STRING environment variable accordingly. If you are using Windows, you might use the following command line in a DOS window prior to running the script in the same window:
set QUERY_STRING=recipient=John@Doe.com&Fullname=M+Name
To test the script offline using the POST method, put the line below into a text file named, say, testinput.txt.
recipient=John@Doe.com&Fullname=M+Name
Then redirect that file as an input to the script. On Unix systems as well as under Windows' MSDOS prompt, you can do it this way:
perl -w scriptname.pl < testinput.txt
Your script will then receive that input as though it was sent by a form on the website. Check the error messages that perl spouts, if any, to help you track the problem in the script.
To give a cgi script post data:
$ echo -n 'a=b;c=d' | REQUEST_METHOD=POST CONTENT_LENGTH=999 perl index.cgi
To give a cgi script get data:
$ perl index.cgi 'a=b;c=d'
LWP comes with ready made scripts that can be used from the command-line. Check for GET and POST scripts in your system.
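For example, assuming those scripts are installed and your CGI program is already served at some URL (the URL below is made up):
$ GET 'http://localhost/cgi-bin/program.cgi?foo=bar'
$ POST 'http://localhost/cgi-bin/program.cgi' < post_data
Note that unlike the environment-faking approaches above, this actually goes through the web server rather than bypassing it.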
In Windows, you can use VBScript to write a command line util that calls into the MS XML library:
Dim XMLHttp : Set XMLHttp = CreateObject("Microsoft.XMLHTTP")
On Error Resume Next
strIPAddress = WScript.Arguments(0)
strMACAddress = WScript.Arguments(1)
strSubnetMask = WScript.Arguments(2)
On Error Goto 0
WScript.Echo "Attempting to wake host " & strIPAddress & " on NIC " & strMACAddress &
"using netmask " & strSubnetMask
strGetUrl = http://wolService/WolService/WolService.asmx/WakeBroadcast?hostIP=" &
strIPAddress & "&macAddress=" & strMACAddress & "&subnetMask=" & strSubnetMask
XMLHttp.Open "GET", strGetUrl, False
XMLHttp.Send ""
WScript.Echo XMLHttp.ResponseText
Edit: This script sends HTTP requests and can be used from the command line. I got confused by the question 'How can I send POST and GET data to a Perl CGI script via the command line' and thought this was about sending POST and GET data to a Perl CGI script via the command line from an unspecified client OS.