Capturing a psftp error flag in Perl

The script below functions just as intended. It calls a batch file that establishes a connection using PuTTY, specifically psftp, and transfers a handful of files. Once the transfer is complete, the Perl script continues and moves the files to another directory.
My question is: how can I capture an error from the psftp application? If there is an error transferring files or establishing a connection, I would like to generate a flag that I can then capture in Perl to stop anything else from happening and send myself an email. I just need guidance on generating the flag from psftp on error.
Thank you very much in advance!
My code:
use warnings;
use File::Copy;

my $TheInputDir = "//NT6/InfoSys/PatientSurvey/Invision/CFVMC/";
my $TheMoveDir  = "//NT6/InfoSys/PatientSurvey/Invision/CFVMC/Completed";

system('2166_PG_Upload_Batch.bat');

opendir(THEINPUTDIR, $TheInputDir) or die "Cannot open $TheInputDir: $!";
my @TheFiles = readdir(THEINPUTDIR);
closedir THEINPUTDIR;

# Get all the files that meet the naming convention
foreach my $TheFile (@TheFiles) {
    if ($TheFile =~ /2166/) {
        print "$TheFile\n";
        move("$TheInputDir$TheFile", $TheMoveDir)
            or warn "Could not move $TheFile: $!";
    }
}

You should capture the batch file's exit status and STDERR in Perl. Capture::Tiny makes this straightforward:
use Capture::Tiny qw(capture);

my ($stdout, $stderr, $exit) = capture {
    system('2166_PG_Upload_Batch.bat');
};
# Check $stderr and $exit, then do your stuff, e.g. sending an email.
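For example, a minimal sketch of acting on a failure might look like the following. Here send_alert is a hypothetical stand-in for your mail-sending code, and this assumes the batch file propagates psftp's exit status (cmd reports the exit status of the last command in a .bat, and psftp exits non-zero when a batch run aborts):

# $exit is the raw $? from system(); the real exit code is $exit >> 8.
if ($exit != 0 or length $stderr) {
    send_alert("psftp upload failed (exit " . ($exit >> 8) . "):\n$stderr");  # hypothetical helper
    die "Upload failed; skipping the file moves.\n";
}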
For more information, see perlfaq8.

Related

Read all the files in a remote directory in perl

I'm new to Perl. I'd like to parse through a list of log files, each around 200 MB, on a remote Unix server to create a daily report.
What is the best way to read the log files from a remote server securely (using SSH) and efficiently?
Any pointers would be greatly appreciated.
Good Morning!
As already said in the comments, there is no secret to it. I recommend using Net::SSH::Perl for that kind of job:
#!/usr/bin/perl -w
use strict;
use lib '/path/to/module/';
use Net::SSH::Perl;

my $host     = "hostname";
my $user     = "username";
my $password = "password";
my $command  = 'ls /var/log/'; # your command to read the directory

my $ssh = Net::SSH::Perl->new($host, debug => 0);
$ssh->login($user, $password);

my ($stdout, $stderr, $exit) = $ssh->cmd($command); # fetch the command's output
print $stdout; # print the captured STDOUT
For more information about this module, you can read its documentation on CPAN: Net::SSH::Perl
I hope I could help you!
EDIT:
For sure you can run your script on the server and fetch its output in the same way. You just have to modify $command, for example: $command = "perl /home/$user/my_script.pl";
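One caveat for files around 200 MB each: $ssh->cmd returns the whole output in memory at once. If that becomes a problem, one alternative (a sketch using the stock ssh client, assuming key-based authentication is set up; the log path and filter are hypothetical) is to stream the file and process it line by line:

#!/usr/bin/perl
use strict;
use warnings;

# Pipe the remote file through ssh so only one line is held in memory at a time.
open my $fh, '-|', 'ssh', 'username@hostname', 'cat /var/log/mylog.log'
    or die "cannot spawn ssh: $!";

while (my $line = <$fh>) {
    print $line if $line =~ /ERROR/;  # hypothetical filter for the daily report
}
close $fh or warn "ssh exited with status $?";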

net-sftp-foreign error handling for nagios integration

I am rather new to Perl. I have written a very simple script that copies a file from SFTP and then removes it.
The script should return some kind of output so I can integrate it with Nagios, using NSCA or something similar.
The script runs on Solaris 10. Here is the script:
#!/usr/bin/perl -w
use strict;
use warnings;
use Net::SFTP::Foreign;
use POSIX qw(strftime);

my $datestring = strftime "%d-%m-%Y", localtime;
my $host       = "sftp.mariog.com";
my $username   = "user";
my $local      = "/mnt/mariog";
my $file       = "BookingReport_Daily_$datestring.xls";
print "$file\n";

my $sftp = Net::SFTP::Foreign->new($host,
    user           => $username,
    stderr_discard => 1,
    autodie        => 1,
);
$sftp->die_on_error("unable to establish SFTP connection");

$sftp->get($file, "$local/$file");
$sftp->remove($file);
$sftp->disconnect();
How should I handle the case where the file does not exist on the SFTP server? Maybe it has not arrived yet: there is a daily file uploaded, but at different times, and the script is run by cron every 4 hours, so most of the time it will not find files to transfer.
Where can I get the exit code of the transfer, successful or not, so that I can pass it to a Nagios passive check with NSCA?
Thank you for your help.
Kind regards,
Mario
It looks like one way to check for a successful get, going by the documentation, would be something like this:
$sftp->get($file, "$local/$file") or die "get failed: " . $sftp->error;
(Note that with autodie => 1 in the constructor, get already dies on failure; drop autodie if you prefer to handle errors yourself.)
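To feed a Nagios passive check, a minimal sketch could map outcomes to the conventional Nagios exit codes (0 = OK, 2 = CRITICAL). This assumes Net::SFTP::Foreign::Constants provides SSH2_FX_NO_SUCH_FILE, as its documentation describes:

#!/usr/bin/perl
use strict;
use warnings;
use Net::SFTP::Foreign;
use Net::SFTP::Foreign::Constants qw(SSH2_FX_NO_SUCH_FILE);
use POSIX qw(strftime);

my $datestring = strftime "%d-%m-%Y", localtime;
my $host       = "sftp.mariog.com";
my $username   = "user";
my $local      = "/mnt/mariog";
my $file       = "BookingReport_Daily_$datestring.xls";

# No autodie here: errors come back as return values we map to exit codes.
my $sftp = Net::SFTP::Foreign->new($host,
    user           => $username,
    stderr_discard => 1,
);
if ($sftp->error) {
    print "CRITICAL - cannot connect to $host: " . $sftp->error . "\n";
    exit 2;
}

if ($sftp->get($file, "$local/$file")) {
    $sftp->remove($file);
    print "OK - fetched $file\n";
    exit 0;
}
elsif ($sftp->status == SSH2_FX_NO_SUCH_FILE) {
    # The daily file simply has not arrived yet; most cron runs end here.
    print "OK - $file not present yet\n";
    exit 0;
}
else {
    print "CRITICAL - get failed: " . $sftp->error . "\n";
    exit 2;
}

Treating the missing file as OK fits the every-4-hours cron schedule, where most runs legitimately find nothing to transfer.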

Weird issue with Net::SSH::Expect in Perl script

I am working on putting together a perl script. I have captured it below:
#!/usr/bin/perl
use Tie::File;
use Net::SSH::Expect;
use utf8;
use warnings;
use diagnostics;
# Grab password from hidden file
my $pw = `cat .password`;
chomp $pw;

# Read list of 9200's from hosts.list file into an array
tie my @hosts, 'Tie::File', "hosts.list" or die;

# Loop through hosts, connect via ssh, run commands, and write out log files.
foreach (@hosts) {
    # Create ssh session handle
    my $ssh = Net::SSH::Expect->new(
        host     => $_,
        password => $pw,
        user     => 'user',
        raw_pty  => 1
    );

    my $login_output = $ssh->login();
    if ($login_output !~ /.*sbc.*>/) {
        die "Login failed. Login output was $login_output";
    }

    $ssh->send("show sip errors");
    my $line;
    while ( defined ($line = $ssh->read_line()) ) {
        print $line . "\n";
    }
    $ssh->close();
}
First, I'm not a programmer, so the style is probably very ugly. Sorry about that :) The goal is to run several commands on a remote appliance and capture the results in separate files, which will then be consumed by a third-party parsing engine (Splunk).
The current implementation is able to log in to the remote hosts, run the command, and print to stdout. Not quite there yet, but it still shows a good proof of concept.
The script runs fine for the first 3 hosts in the hosts.list file. However as soon as it gets to the fourth host, I receive this exception:
Uncaught exception from user code:
SSHAuthenticationError Login timed out. The input stream currently has the contents bellow: user@myhost.mydomain's password: at /System/Library/Perl/Extras/5.12/Expect.pm line 828
at /Library/Perl/5.12/Net/SSH/Expect.pm line 209
Net::SSH::Expect::__ANON__('ARRAY(0x7fd718a03008)') called at /System/Library/Perl/Extras/5.12/Expect.pm line 828
Expect::_multi_expect(1, 'ARRAY(0x7fd7189fbce8)', 'ARRAY(0x7fd7189f7460)') called at /System/Library/Perl/Extras/5.12/Expect.pm line 565
Expect::expect('Expect=GLOB(0x7fd7189f1878)', 1, 'ARRAY(0x7fd718a01530)', 'ARRAY(0x7fd7189f15a8)', 'ARRAY(0x7fd71890a3d0)', 'ARRAY(0x7fd718a07470)', 'ARRAY(0x7fd7189d8b18)') called at /Library/Perl/5.12/Net/SSH/Expect.pm line 580
Net::SSH::Expect::_sec_expect('Net::SSH::Expect=HASH(0x7fd718a29828)', 1, 'ARRAY(0x7fd718a01530)', 'ARRAY(0x7fd7189f15a8)', 'ARRAY(0x7fd71890a3d0)', 'ARRAY(0x7fd718a07470)', 'ARRAY(0x7fd7189d8b18)') called at /Library/Perl/5.12/Net/SSH/Expect.pm line 213
Net::SSH::Expect::login('Net::SSH::Expect=HASH(0x7fd718a29828)') called at ./pcscfFetch.pl line 26
Any ideas on what the problem could be? I am able to log in to the host manually via ssh with no issue. The script works fine for our other hosts; it's just this one outlier that I can't seem to figure out. Any advice would be appreciated. Thanks!
I did end up resolving this. In the constructor for $ssh, I set the timeout to 10 seconds instead of the default of 1. The script runs significantly slower, but I don't appear to hit the issue I was running into before. Appreciate the feedback!
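For reference, that is the constructor's timeout option (the Net::SSH::Expect documentation gives the default as 1 second):

my $ssh = Net::SSH::Expect->new(
    host     => $_,
    password => $pw,
    user     => 'user',
    raw_pty  => 1,
    timeout  => 10,   # wait up to 10 seconds instead of the 1-second default
);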
Net::SSH::Expect is not reliable.
Use Net::OpenSSH instead, or, if you want to run the same set of commands on several hosts, Net::OpenSSH::Parallel.
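A rough Net::OpenSSH equivalent of the loop body might look like this (an untested sketch; @hosts and $pw are the variables from the question's script):

use Net::OpenSSH;

foreach my $host (@hosts) {
    my $ssh = Net::OpenSSH->new($host, user => 'user', password => $pw);
    if ($ssh->error) {
        warn "cannot connect to $host: " . $ssh->error;
        next;
    }
    # capture() runs the remote command and returns its output;
    # $ssh->error is set if the command fails.
    my $output = $ssh->capture("show sip errors");
    warn "command failed on $host: " . $ssh->error if $ssh->error;
    print $output;
}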

How do I run shell commands in a CGI program as the nobody user?

I want to run shell commands in a CGI program (written in Perl). My program doesn’t have root permission. It runs as nobody. I want to use this code:
use strict;
system <<'EEE';
awk '{a[$1]+=$2;b[$1]+=$3}END{for(i in a)print i, a[i], b[i]|"sort -nk 3"}' s.txt
EEE
I can run my code successfully with perl from the command line but not as a CGI program.
Based on the code in your question, there are at least four possibilities for failure.
The nobody user does not have permission to execute your program.
The Perl code in your question has no shebang (#!) line. You are trying to run awk, so I assume you are running on some form of Unix. If your code is missing this line, then your operating system does not know how to run your program.
The file s.txt is either not in the executing program’s working directory, or it is not readable by the nobody user.
For whatever reason, awk is not reachable via the PATH of your executing program’s environment.
To quickly diagnose such low-level problems, try to have all error output to show up in the browser. One way to do this is adding the following just after the shebang line in your code.
BEGIN {
print "Content-type: text/plain\n\n";
open STDERR, ">&", \*STDOUT or print "$0: dup: $!";
}
The output will render as plain text rather than HTML, but this is a temporary measure to see your program's output. By wrapping it in a BEGIN block, the code executes as soon as it parses. Redirecting STDERR means your browser also gets anything written to the standard error stream.
Another way to do this is with the CGI::Carp module.
use CGI::Carp 'fatalsToBrowser';
This way, errors go to the browser and also to the web server’s error log.
If you still see 500-series errors from your server, the problem is happening at a lower level: probably some failure to start perl. Go examine your server’s error log. Once your program is executing, you can remove this temporary redirection of error output.
Finally, I recommend changing your program to
#! /usr/bin/perl -T
BEGIN { print "Content-type: text/plain\n\n"; }
use strict;
use warnings;
$ENV{PATH} = "/bin:/usr/bin";
my $input = "/path/to/your/s.txt";
my $buckets = <<'EOProgram';
{ a[$1] += $2; b[$1] += $3 }
END { for (i in a) print i, a[i], b[i] }
EOProgram
open STDIN, "-|", "awk", $buckets, $input or die "$0: open: $!";
exec "sort", "-nk", 3 or die "$0: exec: $!";
The -T switch enables a security dataflow analysis called taint mode that prevents you from using unsanitized input on system operations such as open, exec, and so on that an attacker (or benign user supplying unexpected input) could use to harm your system. You should always add -T to CGI programs and any other code that runs on behalf of another user.
Given the nature of your awk program, a content type of text/plain seems reasonable. Output it as soon as possible.
With taint mode enabled, be explicit about the value of your PATH environment variable. If instead you stick with whatever untrusted PATH your program inherits, attempting to run external programs will fail.
Nail down the full path of your input. This will eliminate surprises.
Using the multi-argument forms of open and exec eliminates the shell and its argument parsing. (For completeness, system also has a similar multi-argument form.) Yes, writing it this way can mean being a little more deliberate (such as breaking out the arguments and setting up the pipeline yourself), but it also avoids nasty surprises.
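For instance, the list form of system, shown here as an illustrative variant of the program above, would be:

# List form: Perl runs awk directly, so the shell never parses $buckets.
system("/usr/bin/awk", $buckets, $input) == 0
    or die "$0: awk failed: $?";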
I'm sure nobody is allowed to run shell commands. The problem is that nobody doesn't have permission to open the file s.txt. Add read permission for everyone on s.txt, and add execute permission for everyone on every directory up to s.txt.
I would suggest finding out the fully qualified path for awk and specifying it directly. The nobody user that launched httpd likely has a very minimal $ENV{PATH}; displaying $ENV{PATH} will, I'm guessing, show this.
That is a good thing. I wouldn't modify the path, but just specify the full path, /usr/bin/awk or whatever it is.
If you have shell access, type 'which awk' to find this out.
"I can run my code successfully in a perl file but not in a cgi file."
What web server are you running under? For instance, Apache requires printing a CGI header, i.e. print "Content-type: text/plain; charset=utf-8\n\n", or
use CGI;
my $q = CGI->new();
print $q->header('text/html');
(See CGI)
Apache will complain in the log (error.log) about "premature end of script headers" if what I said is the case.
You could just do it inline without having to fork out to another process...
if ( open my $fh, '<', 's.txt' ) {
    my %data;
    while (<$fh>) {
        my ($c1, $c2, $c3) = split;
        $data{a}{$c1} += $c2;
        $data{b}{$c1} += $c3;
    }
    foreach ( sort { $data{b}{$a} <=> $data{b}{$b} } keys %{ $data{b} } ) {
        print "$_ $data{a}{$_} $data{b}{$_}\n";
    }
} else {
    warn "Unable to open s.txt: $!\n";
}

A 'do' statement at the end of my perl script never runs

In my main script, I am doing some archive manipulation. Once I have completed that, I want to run a separate script to upload my archives to an FTP server.
Separately, these scripts work well. I want to add the FTP script to the end of my archive script so I only need to worry about scheduling one script to run, and so I can guarantee that the first script completes its work before the FTP script is called.
After looking at all the different methods to call my FTP script, I settled on 'do'. However, when my do statement is at the end of the script, it never runs. When I place it in my main foreach loop, it runs fine, but it runs multiple times, which I want to avoid since the FTP script can handle uploading multiple archives in one run.
Is there something I am missing? Why does it not run?
Here is the relevant code:
chdir $input_dir;
@folder_list = <*>;

foreach $file (@folder_list)
{
    if ($file =~ m/.*zip/)
    {
        print "found $file\n";
        print "Processing Files...\n";
        mkdir 'BuildDir';
        $new_archive = Archive::Zip->new();
        $archive_name = $file;
        $zip = Archive::Zip->new($file);
        $zip->extractTree('', $build_dir);
        &Process_Files;
    }
}
do 'ArchiveToFTPServer.pl';
print "sending files to FTP server";
Thanks
I ended up copying and pasting the FTP code into the main file as a sub. It works fine when I call it at the end of the foreach loop.
Check out the docs for the do function.
In there, you'll find a code sample:
unless ($return = do $file) {
    warn "couldn't parse $file: $@" if $@;
    warn "couldn't do $file: $!" unless defined $return;
    warn "couldn't run $file" unless $return;
}
I suggest putting this code in to find out what's happening with your do call. In addition, try adding warnings and strict to your code to weed out any subtle bugs.
Add these lines to your scripts:
use strict;
use warnings;
You will now get more diagnostic information, which should lead you to the solution. My current bet is that you are not specifying the correct path to the other script, or that it is missing a shebang line.
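One more detail worth checking: with a relative path, do searches the @INC directories (which traditionally include '.', the current directory), and this script calls chdir $input_dir earlier, so '.' no longer points where the script lives by the time do runs. Spelling out the path and checking the result, as the documentation sample above does, sidesteps both issues (the path below is hypothetical):

# Absolute path, so neither @INC searching nor the earlier chdir interferes.
my $ftp_script = '/path/to/ArchiveToFTPServer.pl';
unless (my $return = do $ftp_script) {
    warn "couldn't parse $ftp_script: $@" if $@;
    warn "couldn't do $ftp_script: $!" unless defined $return;
    warn "couldn't run $ftp_script" unless $return;
}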
What's the call to the new script? If using a shell, did you check your environment variables?