I am new to Perl and I want to use keyboard input in my script. Here is my script; I want the IOS command to be entered from the keyboard. Can someone show me where I am wrong? The problem I have now is that the script does not read my keyboard input, and I am not sure whether STDIN works in my case. Thanks!
# ### Show #########################################################
$cmd = <STDIN>;
chomp($cmd);
$show_error = "";
if ($ssh_switch eq 'yes') {
    ssh_access();
}
print "\n", h2($host . ' - ' . $cmd);
#output=$cmd;
print hr(), "\n";
}
}
#########################################################################
#########################################################################
CGI is designed to take in a complete HTTP request and then spit out a complete HTTP response.
It simply doesn't work interactively.
If you want to write a script that expects keyboard input, then don't use CGI.
In fact, CGI does use STDIN: it is used to pass the body of a POST request. Try this script
#!/usr/bin/perl
print "Content-Type: text/plain\r\n\r\nYou entered: ";
print while <STDIN>;
and POST some data to it, e.g.
$ echo "Just test" | POST http://localhost/yourscript.pl
You entered: Just test
(POST is a program from the LWP CPAN distribution.)
So you can direct your script with commands read from STDIN, although it is very insecure as is!
CGI does allow input through STDIN; try CGI->new(\*STDIN).
That may not be how you want to enter things, though. Can you give an example of what your input looks like?
Ah, it looks to me like you want to either:
run your script from the command line, like: perl scriptname 'Submit=1&Devices=router1&Devices=router2' and then provide your Cisco commands on STDIN (and get HTML output, which may be inconvenient), or
run your script in a browser, in which case you should replace the STDIN usage with an input tag for entering commands and get the commands from that named parameter
Looking again, I see you already have an input tag "search"; you just aren't using it.
Try replacing <STDIN> with $cgi->param('search') and adding search to the "make sure data was input" check.
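For example, a minimal sketch of that change (hypothetical names, assuming your CGI object is $cgi and the text input is named search):
my $cmd = $cgi->param('search');       # instead of: my $cmd = <STDIN>;
if (defined $cmd and length $cmd) {    # include 'search' in the "make sure data was input" check
    chomp $cmd;
    # ... run the IOS command against the selected devices as before ...
}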
Related
I would like to capture all output (both STDOUT and STDERR) of a command that also requires user interaction from the terminal window, i.e. it reads STDIN and then prints something to STDOUT.
Here is a minimal version of the script I want to capture the output from:
user.pl:
#! /usr/bin/env perl
use feature qw(say);
use strict;
use warnings;
print "Enter URL: ";
my $ans = <STDIN>;
# do something based on $ans
say "Verification code: AIwquj2VVkwlWEBwway";
say "Access Token: bskjZO8iZotv!";
I tried using Capture::Tiny :
p.pl:
#! /usr/bin/env perl
use feature qw(say);
use strict;
use warnings;
use Capture::Tiny qw(tee_merged);
my $output = tee_merged {
    #STDOUT->autoflush(1); # This does not work
    system "./user.pl";
};
if ( $output =~ /Access Token: (.*)$/ ) {
    say $1;
}
but it does not work, since the prompt is not displayed until after the user has entered the input in the terminal.
Edit:
It seems it works fine if I replace user.pl with a python script. For example:
user.py:
#! /usr/bin/env python3
ans = input( 'Enter URL: ' )
# do something based on ans
print( 'Verification code: AIwquj2VVkwlWEBwway' )
print( 'Access Token: bskjZO8iZotv!' )
TL;DR: There is a solution; it's somewhat ugly, but it works. There are some minor caveats.
What's going on? The problem is actually in user.pl. The sample user.pl that you provided works like this: it starts by printing the string Enter URL: to its stdout, it then flushes its stdout, and it then reads a line from its stdin. The flushing of stdout occurs automatically: when you try to read from stdin with <..> (aka readline), perl flushes stdout. It does that precisely to make programs like this behave correctly. Unfortunately, it appears that perl only implements this behavior when stdout is connected to a tty. If it is not, perl does not flush stdout before reading from stdin. This is why the script works when you execute it in an interactive terminal session and doesn't work correctly when you try to capture its output (in that case its stdout is connected to a pipe).
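(If you control user.pl, a much simpler workaround, sketched here, is to flush stdout explicitly so its behavior no longer depends on whether stdout is a tty:
#! /usr/bin/env perl
use strict;
use warnings;
use IO::Handle;
STDOUT->autoflush(1);   # flush after every print, tty or not
print "Enter URL: ";
my $ans = <STDIN>;
The rest of this answer assumes user.pl cannot be changed.)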
How to fix this? Since user.pl misbehaves if its stdout is not a tty, and we assume we cannot change it, we must use a tty. AFAIK, IPC::Run is the only perl module that can capture the output of a subprocess using a tty instead of a plain pipe. Unfortunately, when using a tty, IPC::Run does not allow us to redirect stdout only; it forces us to redirect stdin too. Because of that, we have to handle reading from stdin in the parent process on behalf of the child process (yikes!). Here's an example implementation of p.pl using IPC::Run:
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;
use IPC::Run;

my $complete_output = '';
my $in  = '';
my $out = '';
my $h = IPC::Run::start ['./user.pl'], '<pty<', \$in, '>pty>', \$out;
while ($h->pumpable) {
    $h->pump;
    print $out;
    STDOUT->flush;
    if ($out eq 'Enter URL: ') {
        $in .= <STDIN>;
    }
    $complete_output .= $out;
    $out = '';
}
$h->finish;
# do something with $complete_output here
So this is somewhat ugly. For example, we try to detect when the subprocess is waiting for user input (by looking for the string Enter URL:), and when it is, we read the user input in the parent process and pass it to the child. Also notice that we have to implement the tee functionality ourselves, since IPC::Run doesn't offer it.
There are some caveats. Because we do all the reading in the parent process with a simple <STDIN>, this approach will not work if the subprocess uses something like the readline library to support line editing. Also, because a tty is used behind the scenes instead of a pipe, all user input will be echoed to stdout: whatever the user types at the prompt, we put in $in to send it to the process, and we get it back from the process (via the $out variable). But since our terminal also echoes, the text will appear twice. One solution is to filter $out to remove the user input so we don't print it a second time.
Finally, this will not work on Windows.
Write your input prompt directly to the tty.
open TTY, '>', '/dev/tty' or die "Cannot open tty: $!"; # or 'con' on Windows
print TTY "Enter URL: ";
my $ans = <STDIN>;
...
I'm trying to get user input from a web page written in Perl and send it to a local program (blastp), then display the results.
This is what I have right now:
(input code)
print $q->p, "Your database: $bd",
      $q->p, "Your protein is: $prot",
      $q->p, "Executing...";
print $q->p, system("blastp","-db $bd","-query $prot","-out results.out");
Now, I've done a little research, but I can't quite grasp how you're supposed to do things like this in Perl. I've tried opening a file, writing to it, and sending it over to blastp as an input, but I was unsuccessful.
For reference, this line produces a successful output file:
kold#sazabi ~/BLAST/pataa $ blastp -db pataa -query ../teste.fs -out results.out
I may need to force the bd to load from an absolute path, but that shouldn't be difficult.
edit: Yeah, the DBs are located via an environment variable; that's fixed. OK, all I need is to get the input into a file, pass it to the command, and then print the output file to the CGI page.
edit2: for clarification:
I am receiving user input in $prot, I want to pass it over to blastp in -query, have the program blastp execute, and then print out to the user the results.out file (or just have a link to it, since blastp can output in HTML)
EDIT:
All right, I fixed everything I needed to fix. The big problem was me not seeing what was going wrong: I had to install Capture::Tiny and print out stderr, which was when I realized the environment variable wasn't getting set correctly, so BLAST wasn't finding my databases. Thanks for all the help!
Step 1: Write $prot to a file. Assuming you need to do it as-is, without processing the text to split it or something:
For a fixed file name (may be problematic):
use File::Slurp;
# err_mode => 'quiet' makes write_file return false on failure instead of dying
write_file("../teste.fs", { err_mode => 'quiet' }, $prot, "\n") or print_error_to_web();
# Implement the latter to print the error in a nice HTML format
For a temp file (better):
use File::Temp qw(tempfile);
# $template is something like "protXXXXXX"; UNLINK removes the file on exit
my ($fh, $filename) = tempfile($template, DIR => "..", UNLINK => 1);
# You can also create a temp directory, which is even better, via tempdir()
print $fh "$prot\n";
close $fh;
Step 2: Run your command as you indicated:
my $rc = system("$BLASTP_PATH/blastp", "-db", "pataa",
                "-query", "../teste.fs", "-out", "results.out");
# Process $rc for errors
# Use qx[] instead of system() if you want to capture
# the standard output of the command
Step 3: Read the output file in:
use File::Slurp;
my $out_file_text = read_file("results.out");
Step 4: Send it back to the web server:
print $q->p, $out_file_text;
The above code has multiple issues (e.g. you need better file/directory paths, more error handling, etc.), but it should start you on the right track.
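Putting the steps together, here is a minimal sketch of the whole handler (hypothetical parameter names and paths; $q is your CGI object and blastp is assumed to be on PATH):
use strict;
use warnings;
use CGI;
use File::Temp qw(tempfile);
use File::Slurp qw(read_file);

my $q    = CGI->new;
my $prot = $q->param('prot');   # assumed parameter names
my $bd   = $q->param('bd');     # validate against a whitelist in real code!

# Step 1: write the query sequence to a temp file
my ($fh, $query_file) = tempfile('protXXXXXX', TMPDIR => 1, UNLINK => 1);
print {$fh} "$prot\n";
close $fh;

# Step 2: run blastp; the list form avoids the shell and its quoting issues
my $rc = system('blastp', '-db', $bd, '-query', $query_file, '-out', 'results.out');
die "blastp failed: $?" if $rc != 0;

# Steps 3 and 4: read the result and send it back
print $q->header('text/plain'), read_file('results.out');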
I basically want to reopen STDERR/STDOUT so they write to one logfile with both the stream name and a timestamp included on every line. So print STDERR "Hello World" would print STDERR: 20130215123456: Hello World. I don't want to rewrite all my print statements into function calls, and some of the output will be coming from external processes via system() calls anyway, which I won't be able to rewrite.
I also need for the output to be placed in the file "live", i.e. not only written when the process completes.
(p.s. I'm not asking particularly for details of how to generate timestamps, just how to redirect to a file and prepend a string)
I've worked out the following code, but it's messy:
my $mode = ">>";
my $file = "outerr.txt";
open(STDOUT, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDOUT: \$\_"; }'));
open(STDERR, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDERR: \$\_"; }'));
(The above doesn't add dates, but that should be trivial to add)
I'm looking for a cleaner solution, one that doesn't require quoting perl code and passing it on the command line, or at least module that hides some of the complexity. Looking at the code for Capture::Tiny it doesn't look like it can handle writing a part of output, though I'm not sure about that. annotate-output only works on an external command sadly, I need this to work on both external commands and ordinary perl printing.
The child launched via system doesn't write to your Perl STDOUT file handle; it writes to file descriptor 1 and has no access to the variables in your program. That means any approach that hooks writes to a Perl file handle (e.g. tie) won't work.
Write another script that runs your script with STDOUT and STDERR replaced with pipes, reads from those pipes, and prints out the modified output. I suggest using IPC::Run to do this, because it'll save you from using select. You can get away without it if you combine STDOUT and STDERR into one stream.
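For example, here is a minimal sketch of such a wrapper (untested; outerr.txt and yourscript.pl are placeholder names, and note that IPC::Run delivers output in chunks that are not guaranteed to be line-aligned):
#!/usr/bin/perl
use strict;
use warnings;
use IO::Handle;
use IPC::Run qw(run);
use POSIX qw(strftime);

open my $log, '>>', 'outerr.txt' or die "Cannot open log: $!";
$log->autoflush(1);   # write "live", not only when the process completes

# Prefix every line of a chunk with the stream name and a timestamp.
sub annotate {
    my ($stream, $chunk) = @_;
    my $ts = strftime('%Y%m%d%H%M%S', localtime);
    print {$log} "$stream: $ts: $_" for split /^/, $chunk;
}

# Run the real program with both streams redirected to callbacks.
run ['perl', 'yourscript.pl'],
    '>',  sub { annotate('STDOUT', $_[0]) },
    '2>', sub { annotate('STDERR', $_[0]) };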
I'm trying to modify a script that someone else has written and I wanted to keep my script separate from his.
The script I wrote ends with a print line that outputs all relevant data separated by spaces.
Ex: print "$sap $stuff $more_stuff";
I want to use this data in the middle of another perl script and I'm not sure if it's possible using a system call to the script I wrote.
Ex: system("./sap_calc.pl $id"); #obtain printed data from sap_calc.pl here
Can this be done? If not, how should I go about this?
Somewhat related, but not using system():
How can I get one Perl script to see variables in another Perl script?
How can I pass arguments from one Perl script to another?
You're looking for the "backtick operator."
Have a look at perlop, Section "Quote-like operators".
Generally, capturing a program's output goes like this:
my $output = `/bin/cmd ...`;
Mind that the backtick operator captures STDOUT only, so to capture everything (STDERR too) the command needs the usual shell redirection "2>&1" appended.
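For example, reusing $id from the question:
my $output = `./sap_calc.pl $id 2>&1`;   # STDOUT and STDERR, as one string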
If you want to use the data printed to stdout from the other script, you'd need to use backticks or qx().
system will only return the return value of the shell command, not the actual output.
The proper way to do this, though, would be to import the actual code into your other script, by building a module or simply by using do; see the module sketch after the do example below.
As a general rule, it is better to use all-Perl solutions than to rely on system/the shell as a way of "simplifying".
myfile.pl:
sub foo {
    print "Foo";
}
1;
main.pl:
do 'myfile.pl';
foo();
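To illustrate the module route (a hypothetical SapCalc.pm; the key point is returning the data instead of printing it):
SapCalc.pm:
package SapCalc;
use strict;
use warnings;
use Exporter 'import';
our @EXPORT_OK = qw(compute);

sub compute {
    my ($id) = @_;
    # placeholder values standing in for the real calculation
    my ($sap, $stuff, $more_stuff) = ("sap-$id", "stuff", "more_stuff");
    return ($sap, $stuff, $more_stuff);
}
1;

main.pl:
use SapCalc qw(compute);
my $id = shift @ARGV;
my ($sap, $stuff, $more_stuff) = compute($id);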
perldoc perlipc
Backquotes, like in the shell, will yield the standard output of the command as a string (or a list of lines, depending on context). They can be written more clearly as the quote-like qx operator.
@lines = `./sap_calc.pl $id`;
@lines = qx(./sap_calc.pl $id);
$all = `./sap_calc.pl $id`;
$all = qx(./sap_calc.pl $id);
open can also be used for streaming instead of reading into memory all at once (as qx does). This can also bypass the shell, which avoids all sorts of quoting issues.
open my $fh, '-|', './sap_calc.pl', $id or die "Cannot run sap_calc.pl: $!";
while (<$fh>) {
    print "read line: $_";
}
I'm attempting to run a CGI script in the current environment from another Perl module. All works well using standard systems calls for GET requests. POST is fine too, until the parameter list gets too long, then they get cut off.
Has anyone run into this problem, or do you have any suggestions for other ways to attempt this?
The following are somewhat simplified for clarity. There is more error checking, etc.
For GET requests and POST requests w/o parameters, I do the following:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
system {$perl} $cgi;
Parameters are passed through the QUERY_STRING environment variable. STDOUT is captured by the calling script, so whatever the CGI script prints behaves as normal. This part works.
For POST requests with parameters the following works, but seemingly limits my available query length:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
# Gather parameters into a URL-escaped string suitable
# to pass to a CGI script ran from the command line.
# Null characters are handled properly.
# e.g., param1=This%20is%20a%20string&param2=42&... etc.
# This works.
my $param_string = $self->get_current_param_string();
# Various ways to do this, but system() doesn't pass any
# parameters (different question).
# Using qx// and printing the return value works as well.
open(my $cgi_pipe, "|$perl $cgi");
print {$cgi_pipe} $param_string;
close($cgi_pipe);
This method works for short parameter lists, but if the entire command gets to be close to 1000 characters, the parameter list is cut short. This is why I attempted to save the parameters to a file: to avoid shell limitations.
If I dump the parameter list from the executed CGI script I get something like the following:
param1=blah
... a bunch of other parameters ...
paramN=whatever
p <-- cut off after 'p'. There are more parameters.
Other things I've done that didn't help or work
Followed the CGI troubleshooting guide
Saved the parameters to a file using CGI->save(), passing that file to the CGI script. Only the first parameter is read using this method.
$> perl index.cgi < temp-param-file
Saved $param_string to a file, passing that file to the CGI script just like above. Same limitations as passing the commands through the command line; still gets cut off.
Made sure $CGI::POST_MAX is acceptably high (it's -1).
Made sure the CGI's command-line processing was working. (:no_debug is not set)
Ran the CGI from the command line with the same parameters. This works.
Leads
Obviously, this seems like a character limit of the shell Perl is using to execute the command, but it wasn't resolved by passing the parameters through a file.
Passing parameters to system as a single string, from HTTP input, is extremely dangerous.
From perldoc -f system,
If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument, ...
In other words, if I pass in arguments like -e printf("working..."); rm -rf /; I can delete information from your disk (everything, if your web server is running as root). If you choose to do this, make sure you call the list form, system("perl", $cgi), instead.
The argument length issue you're running into may be an OS limitation (described at http://www.in-ulm.de/~mascheck/various/argmax/):
There are different ways to learn the upper limit:
command: getconf ARG_MAX
system header: ARG_MAX in e.g. <[sys/]limits.h>
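From within Perl, you can query the same limit via the POSIX module:
use POSIX qw(sysconf _SC_ARG_MAX);
print sysconf(_SC_ARG_MAX), "\n";   # maximum argument list length in bytes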
Saving to a temp file is risky: multiple calls to the CGI might save to the same file, creating a race condition where one user's parameters might be used by another user's process.
You might try opening a pipe to the process and passing the parameters on its standard input instead:
open my $cgi_pipe, '|-', 'perl', $cgi or die "Cannot fork CGI: $!";
print {$cgi_pipe} $param_string;
close $cgi_pipe;
I didn't want to do this, but I've gone with the most direct approach and it works. I'm tricking the environment into thinking the request method is GET, so that the called CGI script will read its input from the QUERY_STRING environment variable it expects. Like so:
$ENV{'QUERY_STRING'} = $long_parameter_string . '&' . $ENV{'QUERY_STRING'};
$ENV{'REQUEST_METHOD'} = 'GET';
system {$perl_exec} $cgi_script;
I'm worried about potential problems this may cause, but I can't think of what this would harm, and it works well so far. But, because I'm worried, I thought I'd ask the horde if they saw any potential problems:
Are there any problems handling a POST request as a GET request on the server
I'll save marking this as the official answer until people have confirmed or at least debated it on the above post.
Turns out that the problem is actually related to the difference in Content-Length between the original parameters and the parameter string I cobbled together. I didn't realize that the CGI module was using this value from the original headers as the limit to how much input to read (makes sense!). Apparently the extra escaping I was doing was adding some characters.
My solution's trick is simply to piece together the parameter string I'll be passing and then modify the environment variable the CGI module will check to determine the content length, setting it equal to the length of that string.
Here's the final working code:
use CGI::Util qw(escape);
my $params = '';
foreach my $param (sort $query->param) {
    my $escaped_param = escape($param);
    foreach my $value ($query->param($param)) {
        $params .= "$escaped_param=" . escape("$value") . "&";
    }
}
foreach (keys %{$query->{'.fieldnames'}}) {
    $params .= ".cgifields=" . escape("$_") . "&";
}
# This is the trick.
$ENV{'CONTENT_LENGTH'} = length($params);
open(my $cgi_pipe, "| $perl $cgi_script") || die("Cannot fork CGI: $!");
local $SIG{PIPE} = sub { warn "spooler pipe broke" };
print {$cgi_pipe} $params;
warn("param chars: " . length($params));
close($cgi_pipe) || warn "Error: CGI exited with value $?";
Thanks for all the help!