Real-time logging to file with SSH? - perl

use Net::SSH::Perl;

my $file = 'log.log';
my $cmd = 'sysstat';
my $ssh = Net::SSH::Perl->new($host);
$ssh->login('admin', 'password');
open my $in, "-|", ($ssh->cmd($cmd))[0];
open my $out_fh, ">", $file;
#print+($ssh->cmd($cmd))[0]."\n";
while (my $line = <$in>) {
    print { $out_fh } $line;
}
Any recommendations for how to log SSH output to a file in real time? $cmd will run forever, and I'd like each line it spits out to be written to the file in real time.

You can't do that with Net::SSH::Perl... well, at least not easily!
Use Net::OpenSSH instead:
my $ssh = Net::OpenSSH->new($host, user => $user, password => $password);
$ssh->system({stdout_file => $file}, $cmd);
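For reference, a minimal self-contained sketch of that approach; the host, credentials and long-running command are placeholders carried over from the question, and Net::OpenSSH reports failures via $ssh->error:

use strict;
use warnings;
use Net::OpenSSH;

my $host = 'somehost';      # placeholder
my $file = 'log.log';
my $cmd  = 'sysstat';

my $ssh = Net::OpenSSH->new($host, user => 'admin', password => 'password');
$ssh->error and die "Unable to connect: " . $ssh->error;

# stdout_file sends the remote command's output straight to the file as it
# is produced, so the log grows in real time while $cmd keeps running.
$ssh->system({stdout_file => $file}, $cmd)
    or die "Remote command failed: " . $ssh->error;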

Related

Net::OpenSSH how to make perl script not exit and move to next IP if authentication fails

I am a beginner to Perl. My objective is to run a loop over a list of router IPs, attempt to log in to each via SSH, execute some commands, and store the output.
When authentication fails for one router IP, the entire script stops and does not move to the next IP. I want to record why authentication failed for that router and then move on to the next IP.
The error I get before the script exits is:
Unable to connect: unable to establish master SSH connection: bad
password or master process exited unexpectedly
#!/usr/bin/perl -s
use Net::OpenSSH;

my $password = "xxxxxxx";
my $s;
$filename1 = "ip_FINAL_sample";
open my $f1, '<', "$filename1.txt";
chomp(my @file1 = <$f1>);
close $f1;

for $loopvariable (@file1)
{
    my @mikparameters = split /\|/, $loopvariable;
    $mikip = $mikip . $mikparameters[0] . "\n";
}

my @routers = split "\n", $mikip;
my $cmd1 = 'system identity print';
my $cmd2 = 'system routerboard print';

foreach $s (@routers)
{
    my $ssh = Net::OpenSSH->new("yyyyy+500w\@$s", password => $password, timeout => 30);
    $ssh->error and die "Unable to connect: " . $ssh->error;
    print "connected to $s\n";

    my $fh1 = $ssh->pipe_out($cmd1) or die "Unable to run command\n";
    my $fh1_1 = $ssh->capture($cmd1) or die "Unable to run command\n";
    my $fh2 = $ssh->pipe_out($cmd2) or die "Unable to run command\n";
    my $fh2_2 = $ssh->capture($cmd2) or die "Unable to run command\n";

    while (<$fh1>)
    {
        print "$_";
    }
    close $fh1;
    print "\n";

    open my $file1, '>>', "output.txt" or die "Can't open output.txt: $!\n";
    print $file1 "$fh1_1";
    close $file1;

    while (<$fh2>)
    {
        print "$_";
    }
    close $fh2;
    print "\n";

    open my $file2, '>>', "output.txt" or die "Can't open output.txt: $!\n";
    print $file2 "$fh2_2";
    close $file2;

    undef $ssh;
}
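No answer is included in this excerpt, but the usual pattern for the behaviour being asked for is to test $ssh->error yourself and skip to the next router instead of dying. A sketch of just the top of the loop, assuming the rest of the script stays as above; the errors.txt file name is made up for illustration:

foreach $s (@routers)
{
    my $ssh = Net::OpenSSH->new("yyyyy+500w\@$s", password => $password, timeout => 30);
    if ($ssh->error) {
        # record why the connection/authentication failed, then move on
        open my $errfh, '>>', "errors.txt" or die "Can't open errors.txt: $!\n";
        print $errfh "$s: " . $ssh->error . "\n";
        close $errfh;
        next;
    }
    print "connected to $s\n";
    # ... rest of the per-router commands as in the original loop ...
}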

why does my perl ftp script get timed out on upload of a small file?

I do not understand why my script below hangs and times out on the transfer of a very small file. Can someone tell me whether it is a configuration problem or something else?
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

my $host = "server";
my $user = "username";
my $pass = "password";
#!!!!!!!!!!!!!!!!!!!!!!!!!!
my $ftp = Net::FTP->new($host, passive => 1, time => 360, Debug => 1) or die "Error connecting to $host: $!\n";
$ftp->login($user, $pass) or die "Cannot login to $user: $!\n";
$ftp->ascii();
#!!!!!!!!!!!!!!!!!!!!!!!!!!
opendir(my $DIR, "./") or die $!;
my @files = readdir($DIR);
foreach my $file (@files) {
    next if -d $file;
    next unless $file =~ /^OBS_ZB/;
    for my $file (glob 'OBS_ZB*') {
        eval { $ftp->put($file, $file) or print("Can't send file $file\n") };
        if ($@ =~ /Timeout/) {
            print "Got a timeout Issue: $@";
        }
    }
}
#!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
print "copying ends\n";
exit 0;
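No answer is included here, but one thing worth double-checking (an observation, not from the original thread): Net::FTP's documented constructor options are capitalized, e.g. Passive and Timeout, and the constructor reports failures in $@ rather than $!. A sketch of the connection line with those names:

my $ftp = Net::FTP->new($host, Passive => 1, Timeout => 360, Debug => 1)
    or die "Error connecting to $host: $@\n";   # $@ holds Net::FTP's constructor error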

Multiple string parse to loop

I currently have some code which returns a site's header content:
#!/usr/bin/perl
use strict;
require IO::Socket;

my @header;
my $host = shift;
my $socket = new IO::Socket::INET(
    PeerAddr => $host,
    PeerPort => 80,
    Proto    => 'tcp') || die "Could not Connect $!\n";
print "Connected.\n";
print "Getting Header\n";
print $socket "GET / HTTP/1.0\n\n";
my $i = 0;
while (<$socket>) {
    $header[$i] = $_;
    $i++;
}
$i = 0;
print "--------------------------------------\n";
while ($i <= 8) {
    print $header[$i++];
}
print "-------------------------------------\n";
print "Finished $host\n";
What I would like to do is read from a file (open (FILE, '<', shift);) and pass every IP in the file through the header-retrieval loop, which saves me from doing them manually one by one.
What I mean by this is to have a file containing example IPs such as 1.1.1.1 and 2.2.2.2, one per line, and then to run all of them through the get-header code.
Replace
my @header;
my $host = shift;
...
with
while (<>) {
    chomp( my $host = $_ );
    my @header;
    ...
}
You would just open your file, read the contents into a list, then iterate over the list:
open my $fh, '<', $file or die "$!";
my @ips = <$fh>;
close $fh;

foreach my $ip ( @ips ) {
    chomp $ip;
    ...
}
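For illustration, a minimal sketch of how the file-driven loop might wrap the original header-fetching code, assuming the IP file is passed as the first command-line argument with one address per line:

#!/usr/bin/perl
# Sketch only: read hosts from the file named on the command line via the
# "magic" <>, then fetch and print the first few header lines from each.
use strict;
use warnings;
use IO::Socket::INET;

while (my $host = <>) {
    chomp $host;
    next unless length $host;

    my $socket = IO::Socket::INET->new(
        PeerAddr => $host,
        PeerPort => 80,
        Proto    => 'tcp',
    );
    unless ($socket) {
        warn "Could not connect to $host: $!\n";
        next;
    }

    print $socket "GET / HTTP/1.0\n\n";

    my @header = <$socket>;
    my $last = $#header < 8 ? $#header : 8;   # same "first 9 lines" cut-off as the question
    print "--------------------------------------\n";
    print @header[0 .. $last];
    print "--------------------------------------\n";
    print "Finished $host\n";
}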

FTP application written in Perl doesn't connect

Why doesn't my program work? It refuses to connect to the host; I've tried two different servers and verified which port is used.
Note that I'm not very experienced when it comes to Perl.
use strict;
use warnings;
use Net::FTP;

my $num_args = $#ARGV + 1;
my $filename;
my $port;
my $host;
my $ftp;

if ($num_args < 2)
{
    print "Usage: ftp.pl host [port] file\n";
    exit();
}
elsif ($num_args == 3)
{
    $port = $ARGV[1];
    $host = $ARGV[0];
    $filename = $ARGV[2];
    print "Connecting to $host on port $port.\n";
    $ftp = Net::FTP->new($host, Port => $port, Timeout => 30, Debug => 1)
        or die "Can't open $host on port $port.\n";
}
else
{
    $host = $ARGV[0];
    $filename = $ARGV[1];
    print "Connecting to $host with the default port.\n";
    $ftp = Net::FTP->new($host, Timeout => 30, Debug => 1)
        or die "Can't open $host on port $port.\n";
}

print "Username: ";
my $username = <>;
print "\nPassword: ";
my $password = <>;

$ftp->login($username, $password);
$ftp->put($filename) or die "Can't upload $filename.\n";
print "Done!\n";
$ftp->quit;
Thanks in advance.
Now that you already have your answer (<> -> <STDIN>), I think I see the problem. When @ARGV contains anything, <> is the 'magic open': Perl interprets the next item in @ARGV as a filename, opens it, and reads it line by line. Therefore, I think you can probably do something like:
use strict;
use warnings;
use Net::FTP;
use Scalar::Util 'looks_like_number';

if (@ARGV < 2)
{
    print "Usage: ftp.pl host [port] file [credentials file]\n";
    exit();
}

my $host = shift; # or equivalently: shift @ARGV;
my $port = (looks_like_number $ARGV[0]) ? shift : 0;
my $filename = shift;

my @ftp_args = (
    $host,
    Timeout => 30,
    Debug   => 1
);

if ($port)
{
    print "Connecting to $host on port $port.\n";
    push @ftp_args, (Port => $port);
}
else
{
    print "Connecting to $host with the default port.\n";
}

my $ftp = Net::FTP->new(@ftp_args)
    or die "Can't open $host on port $port.\n";

# now if @ARGV is empty, <> reads STDIN; if not, it opens the file named in the current $ARGV[0]
print "Username: ";
chomp(my $username = <>); # reads line 1 of the file
print "\nPassword: ";
chomp(my $password = <>); # reads line 2 of the file

$ftp->login($username, $password);
$ftp->put($filename) or die "Can't upload $filename.\n";
print "Done!\n";
$ftp->quit;
Then if you had some connection credentials in a file (say, named cred) like
myname
mypass
then
$ ftp.pl host 8020 file cred
would open host:8020 for file using credentials in cred.
I'm not sure you want to do that; it's just that THAT is how <> works.

perl - help reworking code to include use of a subroutine

My test script simply makes a Perl DBI connection to a MySQL database and, given a list of tables, extracts one record per table.
For every table I list, I also want to print that one record out to its own file. For example, if I have a list of 100 tables, I should expect 100 unique files with one record each.
So far the code works, but I am interested in creating a subroutine, call it create_file, for the pieces of the code that handle the #Create file sections.
I am not familiar with writing subroutines and need help implementing that if possible.
I am not sure how I would handle the part where the data is built ($data = '';).
Can someone show me a good way to do this? Thanks for your help.
code:
# Get list of tables
my @tblist = qx(mysql -u foo-bar -ppassw0rd --database $dbsrc -h $node --port 3306 -ss -e "show tables");

# Data output
foreach my $tblist (@tblist)
{
    my $data = '';
    chomp $tblist;

    #Create file
    my $out_file = "/home/$node-$tblist.$dt.dat";
    open (my $out_fh, '>', $out_file) or die "cannot create $out_file: $!";

    my $dbh = DBI->connect("DBI:mysql:database=$dbsrc;host=$node;port=3306",'foo-bar','passw0rd');
    my $sth = $dbh->prepare("SELECT UUID(), '$node', ab, cd, ef, gh, hi FROM $tblist limit 1");
    $sth->execute();
    while (my($id, $nd,$ab,$cd,$ef,$gh,$hi) = $sth->fetchrow_array() ) {
        $data = $data . "__pk__^A$id^E1^A$nd^E2^A$ab^E3^A$cd^E4^A$ef^E5^A$gh^E6^A$hi^E7^D";
    }
    $sth->finish;
    $dbh->disconnect;

    #Create file
    print $out_fh $data;
    close $out_fh or die "Failed to close file: $!";
};
my $dt = "2011-02-25";
my $dbsrc = "...";
my $node = "...";
# Get list of tables
my #tblist = qx(mysql -u foo-bar -ppassw0rd --database $dbsrc -h $node --port 3306 -ss -e "show tables");
my $dbh = DBI->connect("DBI:mysql:database=$dbsrc;host=$node;port=3306",'foo-bar','passw0rd');
foreach my $tblist (#tblist)
{
# This breaks - chomp is given a list-context
#extract_data($dbh, chomp($tblist));
chomp $tblist;
extract_data($dbh, $tblist);
};
$dbh->disconnect;
sub extract_table
{
my($dbh, $tblist) = #_;
my $out_file = "/home/$node-$tblist.$dt.dat";
open (my $out_fh, '>', $out_file) or die "cannot create $out_file: $!";
my $sth = $dbh->prepare("SELECT UUID(), '$node', ab, cd, ef, gh, hi FROM $tblist limit 1");
$sth->execute();
while (my($id, $nd,$ab,$cd,$ef,$gh,$hi) = $sth->fetchrow_array() ) {
print $out_fh "__pk__^A$id^E1^A$nd^E2^A$ab^E3^A$cd^E4^A$ef^E5^A$gh^E6^A$hi^E7^D";
}
$sth->finish;
close $out_fh or die "Failed to close file: $!";
};
Unless you really need to connect to the database for each statement you execute, keep the database open between operations.
Instead of creating your own create_file, you could use write_file from File::Slurp.
It abstracts away the open/close/die/print.
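For example (a sketch only, assuming File::Slurp is installed), the subroutine above could build the record in a string as the original code did and hand it to write_file in one call:

use File::Slurp qw(write_file);

sub extract_data
{
    my ($dbh, $tblist) = @_;

    my $out_file = "/home/$node-$tblist.$dt.dat";
    my $sth = $dbh->prepare("SELECT UUID(), '$node', ab, cd, ef, gh, hi FROM $tblist limit 1");
    $sth->execute();

    my $data = '';
    while (my($id, $nd,$ab,$cd,$ef,$gh,$hi) = $sth->fetchrow_array() ) {
        $data .= "__pk__^A$id^E1^A$nd^E2^A$ab^E3^A$cd^E4^A$ef^E5^A$gh^E6^A$hi^E7^D";
    }
    $sth->finish;

    # write_file opens, prints, closes and croaks on error in a single call
    write_file($out_file, $data);
}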