I've written a wrapper program for mailx in Perl that lets me easily add attachments and do some other nifty things that were a little frustrating to accomplish with mailx alone.
In the first few lines I have:
use strict;
use warnings;
use Getopt::Long;
my ( $to, $from, $subject, $attachments, $body, $file ) = (undef) x 6;
GetOptions(
    "to=s"          => \$to,
    "from=s"        => \$from,
    "subject=s"     => \$subject,
    "attachments=s" => \$attachments,
    "body=s"        => \$body,
    "file=s"        => \$file,
);
$to = getlogin unless $to;
$from = getlogin unless $from;
$subject = " " unless $subject;
This wrapper has worked fine up until now when called by other scripts. However, now that we have a script being run by cron, some funny things are happening. The cron job calls the wrapper specifying only -t and -su but omitting -fr (yes, abbreviations of the flags are being used). The resulting email correctly sets the To: header, but lists the sender as -s@blah.com and leaves the subject line blank. Given the code above, I can only assume something strange is going on between cron and the Getopt::Long module. Does anyone know why a cron job might cause this odd behavior? If something else is wrong, what would it be?
Perl's getlogin probably doesn't return anything useful from cron, quoting from getlogin(3):
getlogin() returns a pointer to a string containing
the name of the user logged in on the controlling
terminal of the process, or a null pointer if this
information cannot be determined.
I suggest changing your crontab to always include the username explicitly for any options that rely on getlogin. You could also change your wrapper to use getpwuid($<). (See perlvar(1) and perlfunc(1) for details on $< and getpwuid.)
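A minimal sketch of that fallback (getpwuid and $< are core Perl, so no extra modules are needed):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# getlogin() needs a controlling terminal, which cron jobs don't have,
# so fall back to looking up the real user ID in the password database.
my $user = getlogin() || getpwuid($<);
die "cannot determine user\n" unless defined $user;
print "running as $user\n";
```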
Why that screws up your mailx, I don't know, but I'm going to guess you're using backticks, exec or system with a string to start mailx, rather than exec or system with a list.
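For comparison, a hedged sketch of the list form (the flags and address here are placeholders, not your actual wrapper's arguments):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# With a list, perl exec()s the command directly -- no shell ever sees
# the arguments, so an unexpected or undefined value can't be split or
# re-parsed into extra options the way it can in a single-string command.
my @cmd = ('mailx', '-s', 'a subject with spaces', 'user@example.com');
print "would run: @cmd\n";
# system(@cmd) == 0 or die "mailx failed: $?";   # uncomment to really send
```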
I'm new to using modules in Perl. My head is exploding right now and I would like to know what is wrong here:
#!/usr/bin/perl
use strict;
use Mail::Mailer;
my $from_adress = "email\@xxxxx.com";
my $to_adress = "email\@hxxxx.com";
my $subject = "There goes bananas\n";
my $body = "Here is the bananas";
my $server = "smtp.gmail.com";
my $mailer = Mail::Mailer->new("smtp", Server => $server);
$mailer->open({
    From    => $from_adress,
    To      => $to_adress,
    Subject => $subject,
});
print $mailer $body;
$mailer->close();
open(F, '>>', $Mail::Mailer::testfile::config{outfile});
print F @_;
print @_;
close (F);
Sorry to post the whole script, but I'm not sure where it went wrong. I don't get any print from the @_ variable. I would love to receive advice on how to improve my use of modules in Perl and how I can get better at it.
Thanks in advance.
Well done for using strict in your code. For extra credit, add a use warnings line too.
I can't see any obvious problems with the way you're using the module. Do you think there's something wrong? Is the email not being sent?
If you're not getting the email, then I'd suggest that your first step should be to follow the example in the documentation and change the close line to:
$mailer->close
or die "couldn't send whole message: $!\n";
I wonder if the problem (if there is one) is that you're using Google's SMTP server and you don't have permission to do that. Perhaps you need to authenticate first.
A few other points about your code.
There is no need for all of your set-up variables to be initialised with double-quoted strings. And if you switch to single-quoted strings then you no longer need to escape the @s in the data. You would need double quotes to put the newline in $subject, but I've removed that as email subject lines rarely contain newlines.
my $from_adress = 'email@xxxxx.com';
my $to_adress = 'email@hxxxx.com';
my $subject = 'There goes bananas';
my $body = 'Here is the bananas';
my $server = 'smtp.gmail.com';
The last four lines of your code are confusing in many ways. I'm not really sure what you're trying to achieve there. I'll point out two things though. Firstly, we generally use lexical filehandles these days. If you're learning from a source that uses bareword filehandles, then I'd worry slightly about its age. So the file opening line should look like this:
# $f is, of course, a terrible name for a variable
open(my $f, '>>', $Mail::Mailer::testfile::config{outfile});
You are then printing the value of @_. In Perl, @_ contains the arguments to a subroutine. And this code isn't inside a subroutine, so @_ will be empty. So I'm not surprised that you're not getting any output.
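A tiny self-contained illustration of that point:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# @_ holds the arguments of the *current* subroutine call;
# at file scope there is no call, so it is empty.
sub show_args { return scalar @_ }

print show_args(1, 2, 3), "\n";   # prints 3
print scalar(@_), "\n";           # prints 0 -- nothing at file scope
```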
Lastly, I'll point out that I find that I enjoy working with email in Perl a lot more when I'm using tools from the Email::* namespace. In particular, I'd use Email::Sender for sending email.
Update: Ok, I've had a closer look at the Mail::Mailer documentation and I think I understand what you're trying to do in the last four lines. I think you're trying to write the mail message data to the file. Is that right?
If it is, then you're misunderstanding the documentation. The way to do that is to change the type that you pass to new(). It needs to be testfile rather than smtp. So change
my $mailer = Mail::Mailer->new("smtp", Server => $server);
to
my $mailer = Mail::Mailer->new("testfile");
That will write the mail to a file called mailer.testfile and no mail will be sent.
I am developing an automation script in Perl. For authentication, I have written a subroutine that takes password input from the user and returns it to the main program, which in turn passes the password to the tool that I need to automate.
This script works fine in every case unless the character # is part of the password. Then it is not able to drive the tool and authentication fails.
Below is the subroutine which I used for taking password input.
use Win32::Console;
sub password() {
    my $StdIn = new Win32::Console(STD_INPUT_HANDLE);
    my $Password = "";
    $StdIn->Mode(ENABLE_PROCESSED_INPUT);
    print "Enter Password: ";
    while (ord(my $Data = $StdIn->InputChar(1)) != 10) {
        if ("\r" eq $Data) {
            last;
        }
        elsif ("\ch" eq $Data) {
            if ("" ne chop($Password)) {
                print "\ch \ch";
            }
            next;
        }
        $Password .= $Data;
        print "*";
    }
    return $Password;
}
I am calling the above subroutine as
$passwd = &password();
And then passing the $passwd to the tool that I need to automate as below,
This is the line in which I pass the password to tool,
cc -c URL OF THE TOOL:$server -d $domain -t $appl -u $userid -p $passwd; \n";
Can anyone please cross-check the code for calling the password() subroutine, returning $password, and passing $passwd to the tool? I think the error may be at one of those three places. Please help, and if possible provide the code too.
You are probably passing the user input to an external tool of some sort that doesn't support literal #, perhaps because it is interpreted as "start of comment". The shell is a likely suspect. But without sample code, it is impossible for us to be sure what your problem is.
It seems to me like an issue with quoting on the shell side. Proper quoting rules depend entirely on your system's command shell. In bash, for example, you can often be fairly safe with something like -p '$password', provided the Perl script first escapes any embedded single quotes. Or there could be a module for that, of which I'm not aware.
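A sketch of that escaping for a POSIX shell (this is the Unix-shell variant; it does not apply to cmd.exe quoting):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Single-quote a string for a POSIX shell: everything inside '...' is
# literal, so each embedded ' is written as '\'' (close the quotes,
# emit an escaped quote, reopen the quotes).
sub shell_quote {
    my ($s) = @_;
    $s =~ s/'/'\\''/g;
    return "'$s'";
}

print shell_quote("pa#s'wd"), "\n";   # prints 'pa#s'\''wd'
```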
However, it's the "cc" tool you're passing data into. The tool might support an easier way to pass the password safely, i.e. one that avoids the need to quote arbitrary data. Examples of such facilities are passing it via STDIN, via an external file (which could actually be pretty unsafe :)), as a null-terminated string, or passing only a hash of it. So I'd look into the documentation.
Be aware that by passing user-collected data to the shell, you're creating a serious security risk. Consider using Perl's taint mode.
Update: You say it's Windows, so I should warn you: quoting for cmd.exe is one of the most complicated, painful and frustrating things I've ever done. Have a look at this question
I have the following script,
#!/usr/bin/perl
use strict;
use warnings;
use Net::SSH::Perl;
use Expect;
my $logs = "logs";
open(LOG,'>>',"$logs") or die "can't logs $!\n";
my $domain = 'domain.com';
my @host = qw/host/;
foreach my $host (@host) {
    my $cmd = "passwd user1";
    my $sshost = join('.', $host, $domain);
    my $ssh = Net::SSH::Perl->new($sshost);
    $ssh->login('root');
    $ssh->debug();
    my ($stdout, $stderr, $exit) = $ssh->cmd($cmd);
    print LOG $stdout, "\n";
}
Now my problem is I don't know how to use Expect to send the password after $cmd is executed and it's time to key in the password. $stdin won't work in this case since we're using HP-UX.
I'd appreciate any guidance and samples; reading the Expect docs didn't get me anywhere.
I don't think that's possible unfortunately. However, Net::SSH::Expect seems to be able to do what you want.
To summarize: you need Expect, and the SSH module serves no purpose here.
I'll be more precise: if I understand your source code, your requirement, in human terms, is something like this: log in to a collection of Unix hosts and use passwd(1) to update root's password on each. Do I have that right?
I expect there's frustration in all directions, because variations of this question have been answered authoritatively for at least two decades. That's no reflection on you, because I recognize how difficult it is to find the correct answer.
While Net::SSH is a fine and valuable module, it contributes nothing to the solution of what I understand to be your requirements. You need Expect.
As it turns out, the standard distribution of the Tcl-based Expect includes an example which addresses your situation. Look in http://www.ibm.com/developerworks/aix/library/au-expect/ for the description of passmass.
Identical functionality can be coded in Expect.pm, of course. Before I exhibit that, though, I ask that the original questioner, lupin, confirm I'm on track in addressing his true requirements.
I think I had a similar issue getting into privileged exec mode with Cisco routers, which similarly ask for a password when "en" is invoked. I got around it with a special subroutine:
sub enable {
    my ($expect_session, $password) = @_;
    $expect_session->send("en\n");
    $expect_session->expect($timeout,
        [ qr/[Pp]assword:/,
          sub {
              my $expect_session = shift;
              $expect_session->send("$password", "\n");
              exp_continue;
          } ],
        -re => $prompt,
    );
}
But I think the issue is that you're not using Perl's Expect as it's intended to be used. An Expect session should be created to manage the SSH connection, and commands are then sent through it. You don't need Net::SSH::Perl at all. Here's my $expect_session definition:
my $expect_session = new Expect();
$expect_session->log_stdout(0); # let's keep things quiet on screen; we only want command output.
$expect_session->log_file(".expectlog");
$expect_session->spawn("/usr/bin/ssh -o StrictHostKeyChecking=no $username\@$host")
or die ("\nfor some reason we can't establish an SSH session to $host.\n
it's something to do with the spawn process: $!\n");
There might be a few pieces missing, but hopefully this will get you moving in the right direction. It's a complicated module which I don't understand fully. I wish you the best of luck getting it to do what you want.
Net::OpenSSH can be combined with Expect to do that easily.
Actually the module distribution contains a sample script that does just that!
I'm attempting to run a CGI script in the current environment from another Perl module. All works well using standard system calls for GET requests. POST is fine too, until the parameter list gets too long; then the parameters get cut off.
Has anyone run into this problem, or have any suggestions for other ways to attempt this?
The following are somewhat simplified for clarity. There is more error checking, etc.
For GET requests and POST requests w/o parameters, I do the following:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
system {$perl} $cgi;
Parameters are passed through the
QUERY_STRING environment variable.
STDOUT is captured by the calling
script so whatever the CGI script
prints behaves as normal.
This part works.
For POST requests with parameters the following works, but seemingly limits my available query length:
# $query is a CGI object.
my $perl = $^X;
my $cgi = $cgi_script_location; # /path/file.cgi
# Gather parameters into a URL-escaped string suitable
# to pass to a CGI script ran from the command line.
# Null characters are handled properly.
# e.g., param1=This%20is%20a%20string&param2=42&... etc.
# This works.
my $param_string = $self->get_current_param_string();
# Various ways to do this, but system() doesn't pass any
# parameters (different question).
# Using qx// and printing the return value works as well.
open(my $cgi_pipe, "|$perl $cgi");
print {$cgi_pipe} $param_string;
close($cgi_pipe);
This method works for short parameter lists, but if the entire command gets to be close to 1000 characters, the parameter list is cut short. This is why I attempted to save the parameters to a file; to avoid shell limitations.
If I dump the parameter list from the executed CGI script I get something like the following:
param1=blah
... a bunch of other parameters ...
paramN=whatever
p <-- cut off after 'p'. There are more parameters.
Other things I've done that didn't help or work
Followed the CGI troubleshooting guide
Saved the parameters to a file using CGI->save(), passing that file to the CGI script. Only the first parameter is read using this method.
$> perl index.cgi < temp-param-file
Saved $param_string to a file, passing that file to the CGI script just like above. Same limitations as passing the commands through the command line; still gets cut off.
Made sure $CGI::POST_MAX is acceptably high (it's -1).
Made sure the CGI's command-line processing was working. (:no_debug is not set)
Ran the CGI from the command line with the same parameters. This works.
Leads
Obviously, this seems like a character limit of the shell Perl is using to execute the command, but it wasn't resolved by passing the parameters through a file.
Passing parameters to system as a single string, from HTTP input, is extremely dangerous.
From perldoc -f system,
If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument,..
In other words, if I pass in the arguments -e printf("working..."); rm -rf /; I can delete information from your disk (everything, if your web server is running as root). If you choose to do this, make sure you call system("perl", @cgi) instead.
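A quick demonstration of the safe form, using perl itself as a harmless stand-in for the CGI script:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# List form: no shell is involved, so metacharacters in the arguments
# are passed to the child process literally instead of being interpreted.
my @cmd = ($^X, '-e', 'print "args ok\n"');
my $status = system(@cmd);
die "child failed: $?" if $status != 0;
```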
The argument length issue you're running into may be an OS limitation (described at http://www.in-ulm.de/~mascheck/various/argmax/):
There are different ways to learn the upper limit:
command: getconf ARG_MAX
system header: ARG_MAX in e.g. <[sys/]limits.h>
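From Perl itself, the same limit can be queried through the core POSIX module:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(sysconf _SC_ARG_MAX);

# sysconf(_SC_ARG_MAX) reports the byte limit on the combined size of
# argv plus the environment for a newly exec'd process on this system.
my $arg_max = sysconf(_SC_ARG_MAX);
print "ARG_MAX is $arg_max bytes\n";
```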
Saving to a temp file is risky: multiple calls to the CGI might save to the same file, creating a race condition where one user's parameters might be used by another user's process.
You might try opening a file handle to the process and passing arguments as standard input instead: open(my $cgi_pipe, '|-', 'perl', $cgi) or die; print $cgi_pipe @params;
I didn't want to do this, but I've gone with the most direct approach and it works. I'm tricking the environment to think the request method is GET so that the called CGI script will read its input from the QUERY_STRING environment variable it expects. Like so:
$ENV{'QUERY_STRING'} = $long_parameter_string . '&' . $ENV{'QUERY_STRING'};
$ENV{'REQUEST_METHOD'} = 'GET';
system {$perl_exec} $cgi_script;
I'm worried about potential problems this may cause, but I can't think of what this would harm, and it works well so far. But, because I'm worried, I thought I'd ask the horde if they saw any potential problems:
Are there any problems handling a POST request as a GET request on the server
I'll save marking this as the official answer until people have confirmed or at least debated it on the above post.
Turns out that the problem is actually related to the difference in Content-Length between the original parameters and the parameter string I cobbled together. I didn't realize that the CGI module was using this value from the original headers as the limit to how much input to read (makes sense!). Apparently the extra escaping I was doing was adding some characters.
My solution's trick is simply to piece together the parameter string I'll be passing and modify the environment variable the CGI module checks to determine the content length so that it equals the length of that string.
Here's the final working code:
use CGI::Util qw(escape);

my $params = '';
foreach my $param (sort $query->param) {
    my $escaped_param = escape($param);
    foreach my $value ($query->param($param)) {
        $params .= "$escaped_param=" . escape("$value") . "&";
    }
}
foreach (keys %{$query->{'.fieldnames'}}) {
    $params .= ".cgifields=" . escape("$_") . "&";
}
# This is the trick.
$ENV{'CONTENT_LENGTH'} = length($params);
open(my $cgi_pipe, "| $perl $cgi_script") || die("Cannot fork CGI: $!");
local $SIG{PIPE} = sub { warn "spooler pipe broke" };
print {$cgi_pipe} $params;
warn("param chars: " . length($params));
close($cgi_pipe) || warn "Error: CGI exited with value $?";
Thanks for all the help!
I know how to use Perl's Getopt::Long, but I'm not sure how I can configure it to accept any "--key=value" pair that hasn't been explicitly defined and stick it in a hash. In other words, I don't know ahead of time what options the user may want, so there's no way for me to define all of them, yet I want to be able to parse them all.
Suggestions? Thanks ahead of time.
The Getopt::Long documentation suggests a configuration option that might help:
pass_through (default: disabled)
Options that are unknown, ambiguous or supplied
with an invalid option value are passed through
in @ARGV instead of being flagged as errors.
This makes it possible to write wrapper scripts
that process only part of the user supplied
command line arguments, and pass the remaining
options to some other program.
Once the regular options are parsed, you could use code such as that provided by runrig to parse the ad hoc options.
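A sketch putting the two steps together (the option names here are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long;

# pass_through leaves unrecognized options in @ARGV instead of
# treating them as errors; we then sweep those into a hash ourselves.
Getopt::Long::Configure('pass_through');

@ARGV = ('--name=alice', '--color=red', '--size=9');   # simulated input
my %known;
GetOptions('name=s' => \$known{name});

my %extra;
for (@ARGV) {
    $extra{$1} = $2 if /^--(\w+)=(\w+)$/;
}
print "name=$known{name} color=$extra{color} size=$extra{size}\n";
```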
Getopt::Long doesn't do that. You can parse the options yourself...e.g.
my %opt;
my @OPTS = @ARGV;
for ( @OPTS ) {
    if ( /^--(\w+)=(\w+)$/ ) {
        $opt{$1} = $2;
        shift @ARGV;
    } elsif ( /^--$/ ) {
        shift @ARGV;
        last;
    }
}
Or modify Getopt::Long to handle it (or modify the above code to handle more kinds of options if you need that).
I'm a little partial, but I've used Getopt::Whatever in the past to parse unknown arguments.
Potentially, you could use the "Options with hash values" feature.
For example, I wanted to allow users to set arbitrary filters when parsing through an array of objects.
GetOptions(my $options = {}, 'foo=s', 'filter=s%')
my $filters = $options->{filter};
And then call it like
perl ./script.pl --foo bar --filter baz=qux --filter hail=eris
Which would build something like..
$options = {
'filter' => {
'hail' => 'eris',
'baz' => 'qux'
},
'foo' => 'bar'
};
And of course $filters will have the value associated with 'filter'
Good luck! I hope someone found this helpful.
From the documentation:
Argument Callback
A special option 'name' <> can be used to designate a subroutine to handle non-option arguments. When GetOptions() encounters an argument that does not look like an option, it will immediately call this subroutine and passes it one parameter: the argument name.
Well, actually it is an object that stringifies to the argument name.
For example:
my $width = 80;
sub process { ... }
GetOptions ('width=i' => \$width, '<>' => \&process);
When applied to the following command line:
arg1 --width=72 arg2 --width=60 arg3
This will call process("arg1") while $width is 80, process("arg2") while $width is 72, and process("arg3") while $width is 60.
This feature requires configuration option permute, see section
"Configuring Getopt::Long".
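A runnable version of the documented example, collecting the callback's calls so the interleaving is visible:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Getopt::Long qw(:config permute);

my $width = 80;
my @calls;

# The <> callback fires for each non-option argument as it is reached,
# so it sees whatever value --width had at that point on the command line.
@ARGV = qw(arg1 --width=72 arg2 --width=60 arg3);
GetOptions(
    'width=i' => \$width,
    '<>'      => sub { push @calls, "$_[0]:$width" },
);
print "@calls\n";   # arg1:80 arg2:72 arg3:60
```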
This is a good time to roll your own option parser. None of the modules that I've seen on the CPAN provide this type of functionality, and you could always look at their implementations to get a good sense of how to handle the nuts and bolts of parsing.
As an aside, this type of code makes me hate Getopt variants:
use Getopt::Long;
&GetOptions(
'name' => \$value
);
The inconsistent capitalization is maddening, even for people who have seen and used this style of code for a long time.