I am developing an automation script in Perl. For authentication I have written a subroutine that takes a password from the user and returns it to the main program, which in turn passes the password to the tool I need to automate.
The script works in every case unless the character # is part of the password; then it fails to authenticate against the tool.
Below is the subroutine which I used for taking password input.
use Win32::Console;

sub password {
    my $StdIn = Win32::Console->new(STD_INPUT_HANDLE);
    my $Password = "";
    $StdIn->Mode(ENABLE_PROCESSED_INPUT);
    print "Enter Password: ";
    while (ord(my $Data = $StdIn->InputChar(1)) != 10) {
        if ("\r" eq $Data) {
            last;
        }
        elsif ("\ch" eq $Data) {            # backspace
            if ("" ne chop($Password)) {
                print "\ch \ch";            # erase the last asterisk
            }
            next;
        }
        $Password .= $Data;
        print "*";
    }
    return $Password;
}
I am calling the above subroutine as
my $passwd = password();
And then passing the $passwd to the tool that I need to automate as below,
This is the line in which I pass the password to tool,
cc -c URL OF THE TOOL:$server -d $domain -t $appl -u $userid -p $passwd
Can anyone please cross-check the code for calling the password() subroutine, returning $Password, and passing $passwd to the tool? I think the error is in one of those three places. Please help, and if possible provide corrected code too.
You are probably passing the user input to an external tool of some sort that doesn't support literal #, perhaps because it is interpreted as "start of comment". The shell is a likely suspect. But without sample code, it is impossible for us to be sure what your problem is.
It seems to me like an issue with quoting on the shell side. Proper quoting rules are entirely dependent on your system's command shell. In bash, for example, you can often be fairly safe with something like -p '$password', while transliterating ' to \' in the Perl script. Or there may be a module for that which I'm not aware of.
However, it's this "cc" tool you're passing the data into. The tool might support an easier way to safely pass the password, i.e. one that avoids the need to quote arbitrary data. Examples of such facilities are reading it via STDIN, via an external file (which could actually be pretty unsafe :)), or as a null-terminated string. Or passing only a hash of the password. So I'd look into the documentation.
Be aware that by passing user collected data into shell, you're posing a great security risk. Consider using Perl's Taint mode.
Update: You say it's Windows, so I should warn you: quoting for cmd.exe is one of the most complicated, painful, and frustrating things I've ever done. Have a look at this question.
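A common fix, along the lines of the answers above, is to avoid the shell entirely by using the list form of Perl's system. The sketch below is illustrative only: the tool name, URL, and variable values are placeholders standing in for the asker's actual settings.

```perl
# Placeholder values standing in for the asker's real settings.
my ($server, $domain, $appl, $userid, $passwd) =
    ('srv1', 'dom', 'app', 'user', 'p#ssw0rd');

# The list form of system() bypasses the shell, so characters like '#'
# in $passwd are passed to the tool literally instead of being
# interpreted as the start of a shell comment.
my @cmd = ('cc', '-c', "URL:$server",
           '-d', $domain, '-t', $appl,
           '-u', $userid, '-p', $passwd);
system(@cmd) == 0
    or warn "tool exited with status ", $? >> 8, "\n";
```

Note that on Windows the argument list is still joined back into a single command line by the C runtime, so cmd.exe quoting caveats can still bite in edge cases.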
CentOS 6.8, Perl v5.10.1 (*) built for x86_64-linux-thread-multi
This question descends from this one: Where is the shell command called which invokes OpenSSL commands?. Briefly, I am hacking a very old Perl script used to maintain an internal private PKI so that the default signature hashes and key sizes meet current browser requirements.
I have these snippets of code:
. . .
$args->{keypass} = $self->getPassword("Private key password",1)
unless $args->{keypass};
$self->warn("# Password argument: $args->{keypass}\n") if $ENV{CSPDEBUG};
my $cmd = "-out $args->{keyfile} $args->{keysize}";
$cmd = "-des3 -passout pass:$args->{keypass} ".$cmd if defined($args->{keypass});
$self->{openssl}->cmd('genrsa',$cmd,$args);
. . .
$self->{openssl}->cmd('req',"-x509 $common_args -new -out $cacert",$args);
. . .
use IPC::Run qw( start pump finish timeout new_appender new_chunker);
. . .
sub cmd
{
my $self = shift;
my $cmd = shift;
my $cmdline = shift;
my $args = shift;
my $conf;
my $cfgcmd;
. . .
$self->{_handle}->pump while length ${$self->{_in}};
. . .
If the password argument value that the user provides contains no whitespace, this code performs as desired. If it contains embedded whitespace, the code fails silently. If the value passed to keypass is wrapped in starting and ending single quotes, the code likewise fails silently. In both failure cases the script nonetheless reports success.
Why?
What change is necessary to make this code work whether the user input contains spaces or not?
To answer your literal question, let me quote the IPC::Run manual:
"run(), start(), and harness() can all take a harness specification as input. A harness specification is either a single string to be passed to the systems' shell […] or a list of commands, io operations, and/or timers/timeouts to execute."
To prevent the command arguments being parsed by the shell (which is what's causing things to break when the arguments contain spaces), you should not pass them as a single string, but as a reference to an array that contains each individual argument as a single string, something like this:
my @cmd = ("-out", $args->{keyfile}, $args->{keysize});
unshift @cmd, ("-des3", "-passout", "pass:$args->{keypass}") if defined $args->{keypass};
# ...
my $h = start ["openssl", "genrsa", @cmd], \$in, \$out; # or something equivalent
(The code you've posted seems to be using IPC::Run via some custom interface layer; since you haven't shown us exactly what that layer looks like, I've replaced it with a simple call to IPC::Run::start.)
In any case, note that passing passwords on the command line is generally considered insecure: if any untrusted users can run code on the same server (even under an unprivileged account), they can see the password simply by running ps ax. The openssl manual notes this, and warns that pass:password "should only be used where security is not important."
A safer alternative would be to send the password over a separate file descriptor. Conveniently, IPC::Run makes this pretty easy:
my @cmd = ("-out", $args->{keyfile}, $args->{keysize});
unshift @cmd, ("-des3", "-passout", "fd:3") if defined $args->{keypass};
# ...
my $h = start ["openssl", "genrsa", @cmd], '<', \$in, '>', \$out, '3<', \$args->{keypass};
Here, the password is passed over the file descriptor number 3; if you need to pass in multiple passwords, you can use file descriptors 4, 5, etc. for those. (Descriptors 0, 1 and 2 are stdin, stdout and stderr.)
Disclaimer: This is, obviously, all untested code. I'm not an expert on IPC::Run, so I may have made some silly command syntax errors or other mistakes. Please test thoroughly before using!
There are several computers, and I want to use the who command to see who is online. I wrote a script that should let me check all the computers, but it doesn't seem to return the information about who is online.
The script just pauses.
If I type the command ssh f-001 who by hand, it works. But when I put it in the script, it fails.
here is my code
@Hosts = ("f-001","f-002","f-003","f-004","f-005");
for ($i = 0; $i <= $#Hosts; $i++)
{
    `ssh $Hosts[$i] who`;
    getc();
}
thanks
The results aren't displayed because while you're executing the command, you're not actually displaying its output; you'd need to do something like
print `ssh $Hosts[$i] who`;
Assuming you're using ssh-agent, Kerberos, or something else that lets you log in without giving a password, the pause is just the getc().
Use system() instead:
@Hosts = ("f-001","f-002","f-003","f-004","f-005");
foreach $host (@Hosts)
{
    system("ssh $host who");
}
And please do not iterate with $i.
IPC::Run3 is even more convenient.
One thing I would like to add here: if you want to do further processing with the data coming out of the command, remember that you need to capture the output, like so:
my @users = `ssh $Hosts[$i] who`;
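To illustrate, here is a minimal sketch of such post-processing. The host names are made up, and it assumes the usual who output format, where the first whitespace-separated field is the login name.

```perl
my @hosts = ('f-001', 'f-002');          # example host names
foreach my $host (@hosts) {
    my @who = `ssh $host who`;           # one line per login session
    chomp @who;
    # First whitespace-separated field of `who` output is the user name;
    # collect unique names across sessions.
    my %users = map { (split ' ')[0] => 1 } @who;
    print "$host: ", join(', ', sort keys %users) || 'nobody', "\n";
}
```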
I think the answers here cover what you need but would emphasize the value of using foreach e.g.:
foreach my $host ("mail1", "san", "ws100.internal"){ say qx/ping -c1 $host/}
How do you plan to deal with the output? Unless you are watching the terminal you're going to want to log or write the results somewhere. Log::Dispatch is pretty simple but you can make your script log to files, rotate them, send email etc.
If you are going to do a lots of remote execution and monitoring be sure to take a look at Rex https://metacpan.org/pod/Rex (and http://www.rexify.com).
The user is going to enter an input string such as Tom's Toy.
However, the Perl script complains, saying "unmatched '.".
This is my code.
my $commandline = "";
while (@ARGV) {
    $_ = shift @ARGV;
    $commandline .= $_ . ' ';
}
print " Running $commandline\n";
system($commandline);
Now if the user input is Tom's Toy. I just want to print back Tom's Toy.
However perl complains "unmatched '.".
If I don't use a quote, it works fine (e.g. Tom Toy is good).
How do I fix this issue.
Any help is greatly appreciated.
Thanks in advance
If you switch things around a little to use the system $cmd, #args version of the function, no shell will be invoked, so no escaping will be necessary.
my $cmd = shift #ARGV;
my #args = #ARGV;
print " Running $cmd\n";
system $cmd, #args;
I tested with ./test.pl echo Tom\'s Toy and it gives the expected output:
Running echo
Tom's Toy
system(@ARGV) is probably all you need.
If you give system() a single argument, and if that argument contains any shell metacharacters (including spaces, quotation marks, etc), then the argument will be passed to the shell. jwodder is quite correct: the error message is from the shell, not from Perl.
If you pass system() multiple arguments, it's done without invoking a shell -- which is usually better. The approach you're using takes your program's command-line arguments, joins them together into a single string, then passes that string to the shell, which splits it back into multiple arguments for execution.
On the other hand, sometimes you might want to invoke the shell, for example if you're building up a complex command using pipes, I/O redirection, and so forth, and you don't want to set it all up in Perl. But you have to be careful about metacharacters, as you've seen.
"perldoc -f system" explains this more fully.
If all you want to do is print back the user input, use print, not system. system will try to pass the supplied string to the shell for execution as a command, and it's the shell that's complaining about the unmatched quote.
(Also, instead of manually concatenating @ARGV, may I direct your attention to the join function?)
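For example, if the goal is only to echo the input back, join plus print does the job with no shell involved at all:

```perl
# Echo the command-line arguments back without invoking a shell,
# so embedded quotes like the one in Tom's are printed as-is.
my $commandline = join ' ', @ARGV;
print "Running $commandline\n";
```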
I am new to Perl and I want to use screen input in my script. Here is my script; I want the IOS command to be entered from the keyboard. Can someone show me where I am wrong? The problem I have now is that the script does not read my keyboard input, and I am not sure whether that can work in my case. Thanks!!
# ### Show #########################################################
$cmd = <STDIN>;
chomp ($cmd);
$show_error = "";
if ($ssh_switch eq 'yes') {
ssh_access();
}
print "\n",h2($host . ' - ' . $cmd);
@output = $cmd;
print hr(),"\n";
}
}
#########################################################################
CGI is designed to take in a complete HTTP request and then spit out a complete HTTP response.
It simply doesn't work interactively.
If you want to write a script that expects keyboard input, then don't use CGI.
In fact, CGI does use STDIN. It is used to pass the body of a POST request. Try this script
#!/usr/bin/perl
print "Content-Type: text/plain\r\n\r\nYou entered: ";
print while <STDIN>;
and POST some data to it, e.g.
$ echo "Just test" | POST http://localhost/yourscript.pl
You entered: Just test
(POST is a program from the LWP CPAN distribution.)
So you can direct your script with commands read from STDIN, although it is very insecure as is!
CGI does allow input through STDIN; try CGI->new(\*STDIN).
Though it may not be how you want to enter things. Can you give an example of what your input looks like?
Ah, it looks to me like you want to either:
run your script from the command line, like perl scriptname 'Submit=1&Devices=router1&Devices=router2', and then provide your Cisco commands on STDIN (and get HTML output, which may be inconvenient), or
run your script in a browser, in which case you should replace the STDIN usage with an input tag for entering commands and get the commands from that named parameter
Looking again, I see you already have an input tag "search", you just aren't using it.
Try replacing <STDIN> with $cgi->param('search') and adding search to the "make sure data was input" check.
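A minimal sketch of that change, assuming the form field really is named search and the CGI module is available:

```perl
use CGI;

my $cgi = CGI->new;
# Read the command from the form field instead of <STDIN>;
# fall back to an empty string when the field is absent.
my $cmd = $cgi->param('search') // '';
chomp $cmd;
print "Content-Type: text/plain\r\n\r\n";
print "Command: $cmd\n" if length $cmd;
```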
I'm attempting to run a CGI script in the current environment from another Perl module. All works well using standard systems calls for GET requests. POST is fine too, until the parameter list gets too long, then they get cut off.
Has anyone run into this problem, or does anyone have suggestions for other ways to attempt this?
The following are somewhat simplified for clarity. There is more error checking, etc.
For GET requests and POST requests w/o parameters, I do the following:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
system {$perl} $cgi;
Parameters are passed through the
QUERY_STRING environment variable.
STDOUT is captured by the calling
script so whatever the CGI script
prints behaves as normal.
This part works.
For POST requests with parameters the following works, but seemingly limits my available query length:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
# Gather parameters into a URL-escaped string suitable
# to pass to a CGI script ran from the command line.
# Null characters are handled properly.
# e.g., param1=This%20is%20a%20string&param2=42&... etc.
# This works.
my $param_string = $self->get_current_param_string();
# Various ways to do this, but system() doesn't pass any
# parameters (different question).
# Using qx// and printing the return value works as well.
open(my $cgi_pipe, "|$perl $cgi");
print {$cgi_pipe} $param_string;
close($cgi_pipe);
This method works for short parameter lists, but if the entire command gets to be close to 1000 characters, the parameter list is cut short. This is why I attempted to save the parameters to a file; to avoid shell limitations.
If I dump the parameter list from the executed CGI script I get something like the following:
param1=blah
... a bunch of other parameters ...
paramN=whatever
p <-- cut off after 'p'. There are more parameters.
Other things I've done that didn't help or work
Followed the CGI troubleshooting guide
Saved the parameters to a file using CGI->save(), passing that file to the CGI script. Only the first parameter is read using this method.
$> perl index.cgi < temp-param-file
Saved $param_string to a file, passing that file to the CGI script just like above. Same limitations as passing the commands through the command line; still gets cut off.
Made sure $CGI::POST_MAX is acceptably high (it's -1).
Made sure the CGI's command-line processing was working. (:no_debug is not set)
Ran the CGI from the command line with the same parameters. This works.
Leads
Obviously, this seems like a character limit of the shell Perl is using to execute the command, but it wasn't resolved by passing the parameters through a file.
Passing parameters to system as a single string, from HTTP input, is extremely dangerous.
From perldoc -f system,
If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument,..
In other words, if I pass in arguments -e printf("working..."); rm -rf /; I can delete information from your disk (everything, if your web server is running as root). If you choose to do this, make sure you call system("perl", @args) instead.
The argument length issue you're running into may be an OS limitation (described at http://www.in-ulm.de/~mascheck/various/argmax/):
There are different ways to learn the upper limit:
command: getconf ARG_MAX
system header: ARG_MAX in e.g. <[sys/]limits.h>
Saving to a temp file is risky: multiple calls to the CGI might save to the same file, creating a race condition where one user's parameters might be used by another user's process.
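If a temp file is used anyway, File::Temp avoids that race by generating a unique name and opening the file exclusively. A short sketch; the parameter string here is a stand-in for the real one:

```perl
use File::Temp qw(tempfile);

# Example parameter string standing in for the real one.
my $param_string = "param1=foo&param2=bar";

# tempfile() creates a uniquely named file and opens it with O_EXCL,
# so two concurrent CGI invocations can never collide on the same file.
my ($fh, $filename) = tempfile('cgi-params-XXXXXX', TMPDIR => 1, UNLINK => 1);
print {$fh} $param_string;
close $fh or die "close: $!";
```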
You might try opening a file handle to the process and passing the parameters as standard input instead, e.g. open my $cgi_pipe, '|-', 'perl', $cgi_script or die; print {$cgi_pipe} $param_string;
I didn't want to do this, but I've gone with the most direct approach and it works. I'm tricking the environment into thinking the request method is GET, so that the called CGI script will read its input from the QUERY_STRING environment variable it expects. Like so:
$ENV{'QUERY_STRING'} = $long_parameter_string . '&' . $ENV{'QUERY_STRING'};
$ENV{'REQUEST_METHOD'} = 'GET';
system {$perl_exec} $cgi_script;
I'm worried about potential problems this may cause, but I can't think of what this would harm, and it works well so far. But, because I'm worried, I thought I'd ask the horde if they saw any potential problems:
Are there any problems handling a POST request as a GET request on the server
I'll save marking this as the official answer until people have confirmed or at least debated it on the above post.
Turns out that the problem is actually related to the difference in Content-Length between the original parameters and the parameter string I cobbled together. I didn't realize that the CGI module was using this value from the original headers as the limit to how much input to read (makes sense!). Apparently the extra escaping I was doing was adding some characters.
My solution's trick is simply to piece together the parameter string I'll be passing, and then set the environment variable the CGI module checks for the content length equal to the length of that new string.
Here's the final working code:
use CGI::Util qw(escape);
my $params;
foreach my $param (sort $query->param) {
my $escaped_param = escape($param);
foreach my $value ($query->param($param)) {
$params .= "$escaped_param=" . escape("$value") . "&";
}
}
foreach (keys %{$query->{'.fieldnames'}}) {
$params .= ".cgifields=" . escape("$_") . "&";
}
# This is the trick.
$ENV{'CONTENT_LENGTH'} = length($params);
open(my $cgi_pipe, "| $perl $cgi_script") || die("Cannot fork CGI: $!");
local $SIG{PIPE} = sub { warn "spooler pipe broke" };
print {$cgi_pipe} $params;
warn("param chars: " . length($params));
close($cgi_pipe) || warn "Error: CGI exited with value $?";
Thanks for all the help!