Call one URL from Perl script - perl

I have a Perl script which calls a URL. The URL points to a servlet that stores data in the database, based on the parameters passed through the URL. I was working with the code below, but it fails to call the URL and fails to store the data in the database.
#!/appl/teamsite/iw-perl/bin/iwperl
for (@ARGV) { printf "%d %s\n", $i++, $_ };
my $environment=$ARGV[0];
my $jobid = $ARGV[1];
my $taskID = $ARGV[2];
my $workArea= $ARGV[3];
my $jobDocument = $ARGV[4];
my $url = "http://localhost:7001/JCreationServlet?command=build"."&environment=".$ARGV[0]."&jobID=".$ARGV[1]."&taskID=".$ARGV[2]."&workArea=".$ARGV[3]."&jobDocument=".$ARGV[4];
print "Url is $url\n";
'wget '.$url;
Please help me call the URL correctly.

'wget '.$url;
should be backticks, if you want it to be executed
`wget $url`

In your code:
'wget '.$url;
is simply a string.
Please refer to the documentation on executing external commands.

You could use `wget $url` instead of what you have, and this would fix it. However, I recommend using the system call, as it is better practice:
system("wget",$url);
However, if you for some reason need the output, then it would be best to use the backticks and capture it like this:
my $response = `wget $url`;
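One more point worth flagging: the question builds the URL by concatenating raw @ARGV values, so a space or & in any argument silently corrupts the query string. Below is a minimal sketch of encoding the parameters first (the percent-encoding is hand-rolled here to stay dependency-free; uri_escape from the CPAN URI distribution does the same job, and the sample values are made up):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Percent-encode a value for use in a query string.
# (URI::Escape::uri_escape from CPAN does the same job.)
sub url_encode {
    my ($s) = @_;
    $s =~ s/([^A-Za-z0-9\-._~])/sprintf("%%%02X", ord($1))/ge;
    return $s;
}

# Hypothetical parameter values standing in for @ARGV.
my %params = (
    command     => 'build',
    environment => 'dev',
    workArea    => 'main work area',   # note the spaces
);

my $query = join '&',
    map { "$_=" . url_encode($params{$_}) } sort keys %params;

my $url = "http://localhost:7001/JCreationServlet?$query";
print "$url\n";

# Fetch it without a shell: the list form of system() hands $url
# to wget as a single argument, so shell metacharacters are inert.
# system('wget', '-q', '-O', '-', $url);
```

The commented-out fetch uses the list form of system() for the same reason the answer above recommends it: nothing in $url is reparsed by a shell.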

Related

Changing query params and pass to other .cgi file executed by eval do in perl

I am working on a Perl CGI file which calls another .cgi file with the help of eval do{}.
Now I tried to change the query params and append one more parameter to the list of query params with the help of the code below:
file1.cgi:
use CGI;
my $q = CGI->new();
$q->param(-value=>'new value', -name=>'field1'); #appending query param
my $field1 = $q->param('field1');
print "===> $field1 <==="; #prints the value of field1
open MACROFILE,"<file2.cgi" or print "Could not open file";
my $mstatus = eval do{local $/;<MACROFILE>} or print $@; ## passing content of file to eval
close MACROFILE;
Below is the code in file2.cgi which is getting executed in eval:
file2.cgi:
use CGI;
my $query = new CGI;
my $field1 = $query->param('field1');
print "===> $field1 <==="; #Empty value since "field1" in not found in query params
Why are the appended query params not getting picked up? Is there any other way to do the same?
As file2.cgi is a CGI program, it expects to be called as a CGI program - i.e. using a HTTP request. Of course you can't just eval the file and expect it to work.
Is there a good reason why file2.cgi needs to be a CGI program? Does it live on another server or something like that? If that's the case, then you should call it doing something like this:
use LWP::Simple;
my $response = get("http://your-server.com/file2.cgi?field1=new+value");
But if you're just calling a program on the same server, you can just call it using system().
# Renamed the file as it's no longer a CGI program
system('/path/to/file2.pl', 'new value');
In this case, you'll need to rewrite file2 so that it reads its arguments from the command line.
my $field1 = $ARGV[0];
But perhaps I'm completely misunderstanding and you have a good reason for taking the baroque approach that you're trying here.
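A third possibility, offered as a sketch only: since CGI->new builds its parameter list from QUERY_STRING when REQUEST_METHOD is GET, you can localize those environment variables around the do call, and the do'd code will see the new parameters. The example below demonstrates just the mechanism, substituting a generated stand-in file for file2.cgi so it runs on its own (a real file2.cgi creating a CGI object would read the same variables, though CGI.pm versions that cache their query object may need CGI::initialize_globals() between calls):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Stand-in for file2.cgi; a real one would call CGI->new and
# read param('field1') from these same environment variables.
my ($fh, $file2) = tempfile(SUFFIX => '.pl', UNLINK => 1);
print {$fh} <<'EOF';
# The last expression becomes the return value of do().
"QUERY_STRING was: $ENV{QUERY_STRING}";
EOF
close $fh;

my $result;
{
    # local restores the caller's environment when the block exits.
    local $ENV{REQUEST_METHOD} = 'GET';
    local $ENV{QUERY_STRING}   = 'field1=new+value';
    $result = do $file2;
    die $@ if $@;
}
print "$result\n";   # QUERY_STRING was: field1=new+value
```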

How to run a perl script from another passing parameters?

How to run a perl script from another passing parameters?
I'm trying to use a solution found in an internet post that I can't find anymore.
It was something like:
do 'script.cgi param1 param2';
And in the other script I'm simply using shift to get those parameters:
#Parameters
my $param1= shift;
my $param2= shift;
I saw people using system with args, but is it better for real?
If not, how can I fix the solution with 'do EXPR'?
Thanks in advance.
Oh well, I solved doing:
{ local @ARGV = (@my_args); do $script; }
It works. If anybody has any better suggestions feel free to tell them to me.
Meantime i'm using this solution.
Actually, there are two better ways I can think of:
system($script, @my_args);
and
my $cmd = $script . ' ' . join(' ', @my_args);
my $return = `$cmd`;
Both solutions pass the arguments in @my_args. The system() call returns the exit code of the executed program, while the backticks solution (``) returns the output for later parsing.
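One caveat on the backticks version: join(' ', @my_args) falls apart as soon as an argument contains a space or a shell metacharacter. A sketch of a shell-free alternative that still captures output, using the list form of a read-pipe open (Unix-likes; the command run here is perl itself, via $^X, purely so the example is self-contained):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Run a command and capture its STDOUT without invoking a shell.
sub capture {
    my (@cmd) = @_;
    open(my $pipe, '-|', @cmd) or die "Cannot run $cmd[0]: $!";
    local $/;               # slurp everything at once
    my $out = <$pipe>;
    close $pipe;
    return $out;
}

# Arguments with spaces survive intact: no shell re-splits them.
my @my_args = ('first arg', 'second arg');
my $out = capture($^X, '-e', 'print join "|", @ARGV;', @my_args);
print "$out\n";   # first arg|second arg
```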

In Perl, how do I send CGI parameters on the command line?

Normally I get the data from a webpage, but I want to send it from the command line to facilitate debugging.
To get the data i do something like:
my $query = new CGI;
my $username = $query->param("the_username");
this doesn't seem to work:
$ ./script.pl the_username=user1
EDIT:
Actually the above works. The if statement that checked $username was wrong (using == instead of eq).
As I found out a long time ago, you can indeed pass query string parameters to a script using CGI.pm. I am not recommending this as the preferred debugging method (it is better to have replicable input saved in files, which are then directed to the STDIN of the script); however, it does work:
#!/usr/bin/env perl
use warnings; use strict;
use CGI;
my $cgi = CGI->new;
my $param_name = 'the_username';
printf(
    "The value of '%s' is '%s'.\n",
    $param_name, $cgi->param($param_name)
);
Output:
$ ./t.pl the_username=yadayada
The value of 'the_username' is 'yadayada'.
When run offline without arguments, CGI reads the variables from standard input.
See this part of the CGI.pm documentation:
http://search.cpan.org/dist/CGI/lib/CGI.pod#DEBUGGING

how to use stdin on perl cgi

I am new to Perl and I want to use keyboard input in my script. Here is my script; I want the IOS command to be entered from the keyboard. Can someone show me where I am wrong? The problem I have now is that the script does not read my keyboard input, and I am not sure if this works in my case. Thanks!!
# ### Show #########################################################
$cmd = <STDIN>;
chomp ($cmd);
$show_error = "";
if ($ssh_switch eq 'yes') {
ssh_access();
}
print "\n",h2($host . ' - ' . $cmd);
@output=$cmd;
print hr(),"\n";
}
}
#########################################################################
CGI is designed to take in a complete HTTP request and then spit out a complete HTTP response.
It simply doesn't work interactively.
If you want to write a script that expects keyboard input, then don't use CGI.
In fact, CGI does use STDIN. It is used to pass the body of a POST request. Try this script:
#!/usr/bin/perl
print "Content-Type: text/plain\r\n\r\nYou entered: ";
print while <STDIN>;
and POST some data to it, e.g.
$ echo "Just test" | POST http://localhost/yourscript.pl
You entered: Just test
(POST is a program from the LWP CPAN distribution)
So you can direct your script with commands read from STDIN, although it is very insecure as is!
CGI does allow input through STDIN; try CGI->new(\*STDIN).
Though it may not be how you want to enter things. Can you give an example of what your input looks like?
Ah, it looks to me like you want to either:
run your script from the command line, like: perl scriptname 'Submit=1&Devices=router1&Devices=router2' and then provide your cisco commands on STDIN (and get html output, which may be inconvenient), or
run your script in a browser, in which case you should replace the STDIN usage with an input tag for entering commands and get the commands from that named parameter
Looking again, I see you already have an input tag "search", you just aren't using it.
Try replacing <STDIN> with $cgi->param('search') and adding search to the "make sure data was input" check.

Is it possible to send POST parameters to a CGI script without another HTTP request?

I'm attempting to run a CGI script in the current environment from another Perl module. All works well using standard systems calls for GET requests. POST is fine too, until the parameter list gets too long, then they get cut off.
Has anyone run into this problem, or does anyone have suggestions for other ways to attempt this?
The following are somewhat simplified for clarity. There is more error checking, etc.
For GET requests and POST requests w/o parameters, I do the following:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
system {$perl} $cgi;
Parameters are passed through the QUERY_STRING environment variable. STDOUT is captured by the calling script, so whatever the CGI script prints behaves as normal.
This part works.
For POST requests with parameters the following works, but seemingly limits my available query length:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
# Gather parameters into a URL-escaped string suitable
# to pass to a CGI script ran from the command line.
# Null characters are handled properly.
# e.g., param1=This%20is%20a%20string&param2=42&... etc.
# This works.
my $param_string = $self->get_current_param_string();
# Various ways to do this, but system() doesn't pass any
# parameters (different question).
# Using qx// and printing the return value works as well.
open(my $cgi_pipe, "|$perl $cgi");
print {$cgi_pipe} $param_string;
close($cgi_pipe);
This method works for short parameter lists, but if the entire command gets to be close to 1000 characters, the parameter list is cut short. This is why I attempted to save the parameters to a file; to avoid shell limitations.
If I dump the parameter list from the executed CGI script I get something like the following:
param1=blah
... a bunch of other parameters ...
paramN=whatever
p <-- cut off after 'p'. There are more parameters.
Other things I've done that didn't help or work
Followed the CGI troubleshooting guide
Saved the parameters to a file using CGI->save(), passing that file to the CGI script. Only the first parameter is read using this method.
$> perl index.cgi < temp-param-file
Saved $param_string to a file, passing that file to the CGI script just like above. Same limitations as passing the commands through the command line; still gets cut off.
Made sure $CGI::POST_MAX is acceptably high (it's -1).
Made sure the CGI's command-line processing was working. (:no_debug is not set)
Ran the CGI from the command line with the same parameters. This works.
Leads
Obviously, this seems like a character limit of the shell Perl is using to execute the command, but it wasn't resolved by passing the parameters through a file.
Passing parameters to system as a single string, from HTTP input, is extremely dangerous.
From perldoc -f system,
If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument, ...
In other words, if I pass in arguments like `-e 'printf("working..."); rm -rf /'` I can delete information from your disk (everything, if your web server is running as root). If you choose to do this, make sure you call system("perl", @cgi) instead.
The argument length issue you're running into may be an OS limitation (described at http://www.in-ulm.de/~mascheck/various/argmax/):
There are different ways to learn the upper limit:
command: getconf ARG_MAX
system header: ARG_MAX in e.g. <[sys/]limits.h>
Saving to a temp file is risky: multiple calls to the CGI might save to the same file, creating a race condition where one user's parameters might be used by another user's process.
You might try opening a file handle to the process and passing arguments as standard input instead: `open(my $cgi_pipe, '|-', 'perl', @cgi) or die; print {$cgi_pipe} $param_string;`
I didn't want to do this, but I've gone with the most direct approach and it works. I'm tricking the environment to think the request method is GET so that the called CGI script will read its input from the QUERY_STRING environment variable it expects. Like so:
$ENV{'QUERY_STRING'} = $long_parameter_string . '&' . $ENV{'QUERY_STRING'};
$ENV{'REQUEST_METHOD'} = 'GET';
system {$perl_exec} $cgi_script;
I'm worried about potential problems this may cause, but I can't think of what this would harm, and it works well so far. But, because I'm worried, I thought I'd ask the horde if they saw any potential problems:
Are there any problems handling a POST request as a GET request on the server
I'll save marking this as the official answer until people have confirmed or at least debated it on the above post.
Turns out that the problem is actually related to the difference in Content-Length between the original parameters and the parameter string I cobbled together. I didn't realize that the CGI module was using this value from the original headers as the limit to how much input to read (makes sense!). Apparently the extra escaping I was doing was adding some characters.
My solution's trick is simply to piece together the parameter string I'll be passing, and then set the environment variable the CGI module will check to determine the content length equal to the length of that string.
Here's the final working code:
use CGI::Util qw(escape);
my $params;
foreach my $param (sort $query->param) {
    my $escaped_param = escape($param);
    foreach my $value ($query->param($param)) {
        $params .= "$escaped_param=" . escape("$value") . "&";
    }
}
foreach (keys %{$query->{'.fieldnames'}}) {
    $params .= ".cgifields=" . escape("$_") . "&";
}

# This is the trick.
$ENV{'CONTENT_LENGTH'} = length($params);

open(my $cgi_pipe, "| $perl $cgi_script") || die("Cannot fork CGI: $!");
local $SIG{PIPE} = sub { warn "spooler pipe broke" };
print {$cgi_pipe} $params;
warn("param chars: " . length($params));
close($cgi_pipe) || warn "Error: CGI exited with value $?";
Thanks for all the help!