Perl SSH Script unable to load math library? - perl

I am trying to run a Perl script that takes user, pass, and ip arguments and uses them to check the version of a network switch over SSH. However, when I run it on our server I keep getting:
Math::BigInt: couldn't load specified math lib(s), fallback to Math::BigInt::FastCalc at /usr/lib/perl5/site_perl/5.10.0/Crypt/DH.pm line 6
It then proceeds to hang for a bit and returns with no output. What is causing this to fail, and how can I get around it? I do not have access to install extra modules on the server.
EDIT: I have checked the currently installed modules and Net::SSH::Perl, Math::BigInt::FastCalc, and Math::Pari are all installed, so I have no idea why it is having problems loading those modules.
Here is my script for reference:
#!/usr/bin/perl
# Outputs the name of the Flash File (which denotes the software version) of the switch
#Input: User Pass Host
open(FH, ">>unparsed.txt");
open (FH2, ">>versions.txt");
use strict;
use Net::SSH::Perl;
my $user = $ARGV[0];
my $pass = $ARGV[1];
my $host = $ARGV[2]; #Hostname given as command line argument
my $cmd = "show version";
my $version;
print("\n");
print($ARGV[2]);
print("\n");
my $ssh = Net::SSH::Perl->new($host);
$ssh->login($user, $pass); # login to switch
my($stdout, $stderr, $exit) = $ssh->cmd($cmd);
print FH $stdout; # output all text to file
close(FH);
open(FH, "unparsed.txt");
while (<FH>) { # look through the file for the flash filename
    if ($_ =~ /System image file is "(.*)"/) {
        $version = $1;
    }
}
print ($version); #output flash filename
print ("\n");
printf FH2 ($ARGV[2]);
printf FH2 ("\n");
printf FH2 ($version);
printf FH2 ("\n");
close(FH2);
close(FH);

Crypt::DH loads Math::BigInt with:
use Math::BigInt lib => "GMP,Pari";
Therefore, you need either GMP or Pari on your system.
Your distribution's package manager may already provide a means to install them.
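If you want to confirm which backends are actually loadable on that server, a small probe script (a sketch, assuming the usual backend module names that lib => "GMP,Pari" maps to) can tell you:
#!/usr/bin/perl
use strict;
use warnings;
# Try each Math::BigInt backend and report whether it can be loaded.
for my $lib (qw(Math::BigInt::GMP Math::BigInt::Pari Math::BigInt::FastCalc)) {
    if (eval "require $lib; 1") {
        print "$lib loads fine\n";
    }
    else {
        print "$lib could not be loaded: $@";
    }
}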
Have you tried using Net::SSH?

I prefer Net::SSH and Net::SFTP::Foreign; math libs usually give me trouble when I try to install them on older systems (especially with the mess that some sysadmins make of paths on Unix systems). Most systems use either OpenSSH or some fork of it (like SunSSH on Solaris), so it's less likely that you'll have any sort of trouble using those distributions.

Try using Net::OpenSSH instead of Net::SSH::Perl.
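For what it's worth, here is a minimal sketch of the same version check using Net::OpenSSH; it drives the system's ssh binary, so the Crypt::DH/Math::BigInt chain never gets loaded. Note that password authentication with Net::OpenSSH needs IO::Pty, so this only helps if that module (or key-based login) is available:
#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;
my ($user, $pass, $host) = @ARGV;
# Net::OpenSSH wraps the system ssh binary, so no pure-Perl crypto is involved.
my $ssh = Net::OpenSSH->new($host, user => $user, password => $pass);
$ssh->error and die "SSH connection failed: " . $ssh->error;
my $output = $ssh->capture('show version');
$ssh->error and die "remote command failed: " . $ssh->error;
# Same pattern the original script looks for.
if ($output =~ /System image file is "(.*)"/) {
    print "$1\n";
}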

Related

create perl script to connect remotely to another server and read File?

I am new to Perl programming and also to networking.
My task is to connect to a remote server and read a file from it using a Perl script. I know how to read a file from the local machine, but not how to read one from a remote server.
Here is the code that reads a file from the local machine; how do I connect to the remote server and read the file there?
#!/usr/bin/perl
use 5.010;
use strict;
use warnings;
open(FH, 'C:\Users\saqib riaz\Desktop\saqi\properties.txt') or die "Cannot open file: $!";
while(<FH>)
{
    print "$_";
}
close(FH);
I am using the Windows operating system and Strawberry Perl with the Padre IDE (latest version).
The following works for me with Strawberry Perl version 5.30. It uses Net::SSH2 and prints a file named file.txt from the server:
use strict;
use warnings;
use Net::SSH2;
my $host = 'my.host.com';
my $user = 'user_name';
my $password = 'xxxxxxx';
my $ssh2 = Net::SSH2->new();
$ssh2->connect($host) or $ssh2->die_with_error;
$ssh2->check_hostkey('ask') or $ssh2->die_with_error;
$ssh2->auth_password($user, $password);
my $chan = $ssh2->channel() or $ssh2->die_with_error;
$chan->exec('cat file.txt') or $ssh2->die_with_error;
print while <$chan>;
$chan->close;
Note: Net::SSH2 comes preinstalled with Strawberry Perl so no need to install it.
Note: I also tried Net::OpenSSH but could not get it to work.
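If you would rather copy the file to the local disk instead of cat-ing it over a channel, the same Net::SSH2 session can use scp_get; here is a sketch with the same placeholder host, credentials, and file name as above:
use strict;
use warnings;
use Net::SSH2;
my $ssh2 = Net::SSH2->new();
$ssh2->connect('my.host.com') or $ssh2->die_with_error;
$ssh2->check_hostkey('ask') or $ssh2->die_with_error;
$ssh2->auth_password('user_name', 'xxxxxxx');
# Copy the remote file locally, then read it like any other local file.
$ssh2->scp_get('file.txt', 'local_copy.txt') or $ssh2->die_with_error;
open my $fh, '<', 'local_copy.txt' or die "Cannot open local copy: $!";
print while <$fh>;
close $fh;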

Can't run Perl script on other computer

When I try to run script on my second computer I get this message:
malformed JSON string, neither array, object, number, string or atom, at character offset 0
(before "LWP will support htt...") at iptest.pl line 21, line 2.
On my first computer, the script works fine.
Line 21:
my $data = decode_json($resp->content);
Does anyone know what the problem can be?
Thanks in advance
I'm a bit surprised that the JSON error is the only error you get. But it does contain a tiny little hint: "LWP will support htt...". I bet that LWP is missing a module it needs to be able to make https connections. You now have two options:
print $response->content to see the full error message.
On the command line, do something like lwp-request https://google.com/. You should see the full error message.
Then install the missing module.
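A small diagnostic sketch along the same lines, checking from inside Perl whether this LWP installation can handle https at all:
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
my $ua = LWP::UserAgent->new;
print 'https supported: ', $ua->is_protocol_supported('https') ? "yes\n" : "no\n";
# LWP delegates https to this module; if it is missing, https requests fail.
print eval { require LWP::Protocol::https; 1 }
    ? "LWP::Protocol::https is installed\n"
    : "LWP::Protocol::https could not be loaded: $@";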
And of course: please, please, please:
use strict and use warnings
Clean that script up and throw away every use-line you don't need: IO::Socket, LWP::Simple, YAML::Tiny.
Read the documentation of the modules that you actually are using. What are you trying to achieve with LWP::UserAgent->new(keep_alive)? Hint: It won't help to quote keep_alive.
Some issues:
Always use use strict; use warnings;.
Never use $response->content. What it returns is useless. Instead, use $response->decoded_content( charset => 'none').
You need to chomp the values you get from STDIN.
You should never use our unless forced to (e.g. our @ISA is OK); my should be used instead.
my $format = '$format'; "$format" is a silly way of writing "\$format".
I applied most of the changes ikegami suggested. Then Perl gave me good error messages to fix the remaining issues. It looks like it works now. Don't ask why it didn't work before; your code is weird enough that it's hard to say what exactly went wrong. With strict and warnings you're forced to write better code. Maybe add some nicely named subroutines for more clarity.
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket;
use LWP::UserAgent;
use open qw(:std :utf8);
use LWP::Simple;
use YAML::Tiny;
use JSON;
use URI;
use List::MoreUtils qw(uniq);
print "Enter Qve:";
my ( $qve, $loc, $key, $href );
chomp( $qve = <STDIN> );
print "Enter Location:";
chomp( $loc = <STDIN> );
$key = '';
my $format = '$format';
$href =
"https://api.datamarket.azure.com/Bing/Search/v1/Web?Query='$qve [loc:$loc]'&Latitude=43&Longitude=19&$format=JSON";
my $ua = LWP::UserAgent->new('keep_alive');
$ua->credentials( "api.datamarket.azure.com" . ':443', '', '', $key );
my $resp = $ua->get($href);
my $data = decode_json( $resp->decoded_content( charset => 'none' ) );
my @urls = map { $_->{'Url'} } @{ $data->{d}->{results} };
my @za;
for my $i ( 0 .. $#urls ) {
    my $trz = "www.";
    my $host = URI->new( $urls[$i] )->host;
    $host =~ s/$trz//g;
    push( @za, $host );
}
For the record, this fixed the issue for me (CentOS):
# yum install perl-Crypt-SSLeay
I got into the same situation. After I used yum to install Perl, I still got the error when running the Perl script.
$ sudo yum install perl
Updating Subscription Management repositories.
Unable to read consumer identity
This system is not registered to Red Hat Subscription Management
...
Eventually I did the following, and it works.
$ lwp-request https://google.com/
It returned an error message (LWP::Protocol::https not installed).
$ sudo rpm -ivh ~/perl-LWP-Protocol-https-6.07-4.el8.noarch.rpm
$ lwp-request https://google.com/
It returned an HTML page,
and my perl script now runs without error.
(my system is:
$ cat /etc/os-release
PRETTY_NAME=Red Hat Enterprise Linux 8.0 )

Display Output In Browser Perl CGI SSH

I'm executing remote commands with Net::OpenSSH from a web frontend. My commands return without failure on the command line, but I get nothing in a web browser. I've done a couple of hours of research to no avail. Any ideas?
Here is some code to give you an example (some removed for obvious reasons).
#!/usr/bin/perl -w
use strict;
use CGI ':standard';
use Net::OpenSSH;
# Here in the code is just the header and standard tags
print "1";
print "2"; # both display
my $ssh = Net::OpenSSH->new($host, user => $uname, key_path => $key); # all works
$ssh->error and die "Can't ssh to host" . $ssh->error;
print "3";
$ssh->system("uname -a") or
die "remote command failed: " . $ssh->error;
my @lsa = $ssh->capture("ls -a");
$ssh->error and
die "remote ls command failed: ". $ssh->error;
print "4";
print "5";
print @lsa; # won't display in browser, just terminal/CLI
Cheers!
I maintain CGI.pm. I recommend these additions to your simple script:
Before you print anything else, print the standard HTTP header: print header();
Add this after the use CGI line: use CGI::Carp qw(fatalsToBrowser); ... that will display any run-time problems in the browser. If you don't get any output after these changes, check that the script compiles with perl -cw script.pl
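Putting those two suggestions together, a minimal sketch of a script that should produce visible output (or a visible error) in the browser:
#!/usr/bin/perl
use strict;
use warnings;
use CGI ':standard';
use CGI::Carp qw(fatalsToBrowser); # show run-time errors in the browser
# The HTTP header must be printed before any other output.
print header('text/plain');
print "If you can read this, the CGI plumbing works.\n";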
Below is about the minimum Perl code that worked for me on a Debian machine. I suggest you go through it and compare it to your actual code.
However, it did not work out of the box on my Debian; I had to make some decisions, most of which probably aren't very safe, but that's more about my specific environment:
make the home directory of the user the server runs as writable (/var/www)
add the host to ~/.ssh/known_hosts beforehand
use strict_mode => 0 to bypass Net::OpenSSH's security checks instead of finding a proper ctl_dir (Net::OpenSSH requires that the folder and all folders above it are 0755 or stricter, so the /tmp I used is normally not good enough)
I believe there are much safer practices than that, but as I said, that's specific to my environment.
So the code:
#!/usr/bin/perl
use strict;
use warnings;
use Net::OpenSSH;
use File::Temp qw/ tempdir /;
# necessary minimum for CGI
print "Content-type: text/plain\n\n";
# prepare temp dir
my $temp = tempdir("/tmp/sshme.pl-XXXXXXXX", CLEANUP => 1);
# open SSH session
my %opts = (
user => "user",
password => "password",
ctl_dir => $temp,
strict_mode => 0 ## NOT recommended - see my comments
);
my $ssh = Net::OpenSSH->new("host", %opts);
$ssh->error
and die "Couldn't establish SSH connection: ". $ssh->error;
# perform command and print output
my @lines = $ssh->capture("ls")
    or die "remote command failed: " . $ssh->error;
print @lines;
Perhaps your errors get directed to standard error, not standard output. In that case, they'll usually end up in the server log, not the browser window. Perhaps you can use POSIX::dup2 to avoid this:
use POSIX;
# Make sure to send HTTP headers before redirecting streams!
POSIX::close(2); # close original stderr stream. No more output to apache logs!!!
POSIX::dup2(1,2); # redirect error stream to standard output, so errors will appear in browser.
Processes launched by perl, like e.g. some ssh binary, will inherit these streams.

Too late for -CSD

Trying to run this little perl program from parsCit:
parsCit-client.pl e1.txt
Too late for -CSD option at [filename] line 1
e1.txt is here: http://dl.dropbox.com/u/10557283/parserProj/e1.txt
I'm running the program from win7 cmd, not Cygwin.
filename is parsCit-client.pl - entire program is here:
#!/usr/bin/perl -CSD
#
# Simple SOAP client for the ParsCit web service.
#
# Isaac Councill, 07/24/07
#
use strict;
use encoding 'utf8';
use utf8;
use SOAP::Lite +trace=>'debug';
use MIME::Base64;
use FindBin;
my $textFile = $ARGV[0];
my $repositoryID = $ARGV[1];
if (!defined $textFile || !defined $repositoryID) {
print "Usage: $0 textFile repositoryID\n".
"Specify \"LOCAL\" as repository if using local file system.\n";
exit;
}
my $wsdl = "$FindBin::Bin/../wsdl/ParsCit.wsdl";
my $parsCitService = SOAP::Lite
->service("file:$wsdl")
->on_fault(
sub {
my($soap, $res) = @_;
die ref $res ? $res->faultstring :
$soap->transport->status;
});
my ($citations, $citeFile, $bodyFile) =
$parsCitService->extractCitations($textFile, $repositoryID);
#print "$citations\n";
#print "CITEFILE: $citeFile\n";
#print "BODYFILE: $bodyFile\n";
From perldoc perlrun, about the -C switch:
Note: Since perl 5.10.1, if the -C option is used on the "#!" line, it
must be specified on the command line as well, since the standard
streams are already set up at this point in the execution of the perl
interpreter. You can also use binmode() to set the encoding of an I/O
stream.
Which is presumably what the compiler means by it being "too late".
In other words:
perl -CSD parsCit-client.pl
Because command-line options on a #! "shebang" line are not passed consistently across all operating systems (see this answer), and because Perl has already opened the standard streams before it parses the script's shebang and so cannot compensate on some older OSs, it was decided in bug 34087 to forbid -C in the shebang. Of course, not everyone was happy with this "fix", particularly those for whom it would otherwise have worked on their OS and who don't want to think about anything other than UTF-8.
If you think binmode() is ugly and unnecessary (and doesn't cover command-line arguments), you might like to consider the utf8::all package which has a similar effect to perl -CSDL.
Or, if you were using *nix, I would suggest export PERL_UNICODE="SDA" in the enclosing script to get Perl to realise it's in a UTF-8 environment.
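For completeness, a sketch of the binmode() route mentioned in the perldoc excerpt, which avoids touching the shebang or the command line at all:
#!/usr/bin/perl
use strict;
use warnings;
use open ':encoding(UTF-8)'; # default layer for handles opened later (roughly the D part)
# Set the standard streams to UTF-8 explicitly (the S part of -CSD).
binmode STDIN,  ':encoding(UTF-8)';
binmode STDOUT, ':encoding(UTF-8)';
binmode STDERR, ':encoding(UTF-8)';
# Or, if the module can be installed:
# use utf8::all; # similar in spirit to perl -CSDL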

Why does my jzip process hang when I call it with Perl's system?

I am definitely new to Perl, so please forgive me if this seems like a stupid question to you.
I am trying to unzip a bunch of .cab file with jzip in Perl (ActivePerl, jzip, Windows XP):
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use IO::File;
use v5.10;
my $prefix = 'myfileprefix';
my $dir = '.';
File::Find::find(
    sub {
        my $file = $_;
        return if -d $file;
        return if $file !~ /^$prefix(.*)\.cab$/;
        my $cmd = 'jzip -eo ' . $file;
        system($cmd);
    }, $dir
);
The code decompresses the first .cab file in the folder and then hangs (without any errors). It stays like that until I press Ctrl+C to stop it. Does anyone know what the problem is?
EDIT: I used Process Explorer to inspect the processes, and I found that the correct number of jzip processes are fired up (one per cab file in the source folder). However, only one of them runs under cmd.exe => perl, and none of these processes gets shut down after being fired. It seems to me I need to shut each process down and execute them one by one, which I have no clue how to do in Perl. Any pointers?
EDIT: I also tried replacing jzip with notepad; it turns out it opens Notepad for one file at a time (in sequential order), and only when I manually close Notepad is another instance fired. Is this common behavior in ActivePerl?
EDIT: I finally solved it, and I am still not entirely sure why. What I did was remove an XML library from the script, which should not be relevant. Sorry, I had purposely removed "use XML::DOM" from the posted code in the beginning, as I thought it was completely irrelevant to this problem.
OLD:
use strict;
use warnings;
use File::Find;
use IO::File;
use File::Copy;
use XML::DOM;
use DBI;
use v5.10;
NEW:
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use IO::File;
use File::Copy;
use DBI;
use v5.10;
my $prefix = 'myfileprefix';
my $dir = '.';
# retrieve xml file within given folder
File::Find::find(
    sub {
        my $file = $_;
        return if -d $file;
        return if $file !~ /^$prefix(.*)\.cab$/;
        say $file;
        #say $file or die $!;
        my $cmd = 'jzip -eo ' . $file;
        say $cmd;
        system($cmd);
    }, $dir
);
This, however, exposes another problem: when the extracted file already exists, the script hangs again. I highly suspect this is a problem with jzip, and an alternative way of solving it is simply replacing jzip with extract, as @ghostdog74 pointed out below.
First off, if you are running commands via a system() call, you should always redirect their output/error to a log, or at least process it within your program.
In this particular case, if you do that, you'd have a log of what every single command is doing and will see if/when any of them are stuck.
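A sketch of what that logging could look like, with a hypothetical cab file name; the point is just that stdout and stderr end up in a log you can inspect when something hangs:
#!/usr/bin/perl
use strict;
use warnings;
my $cmd = 'jzip -eo somefile.cab'; # hypothetical file name
# Merge stderr into stdout and capture everything the command prints.
my $output = `$cmd 2>&1`;
my $status = $? >> 8;
open my $log, '>>', 'unzip.log' or die "Cannot open log: $!";
print {$log} "--- $cmd (exit $status) ---\n$output";
close $log;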
Second, just a general tip: it's a good idea to always use native Perl libraries. In this case that may be impossible, of course (I'm not that experienced with Perl on Windows, so I have no clue whether there's a jzip module for Perl, but search CPAN).
UPDATE: I didn't find a native Perl CAB extractor, but I found a jzip replacement that might work better and is worth a try: http://www.cabextract.org.uk/ - there's a DOS version which will hopefully work on Windows.
Based on your edit, this is what I suggest:
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use IO::File;
use v5.10;
my $prefix = 'myfileprefix';
my $dir = '.';
my @commands;
File::Find::find(
    sub {
        my $file = $_;
        return if -d $file;
        return if $file !~ /^$prefix(.*)\.cab$/;
        my $cmd = "jzip -eo $File::Find::name";
        push @commands, $cmd;
    }, $dir
);
# asynchronously kick off jzips
my $fresult;
for (@commands)
{
    $fresult = fork();
    if (! defined($fresult))
    {
        die("Fork failed");
    }
    elsif ($fresult == 0) # child
    {
        `$_`;
        exit; # the child must not fall back into the loop
    }
    else
    {
        # parent: no-op, just keep moving
    }
}
edit: added asynch. edit2: fixed scope issue.
What happens when you run the jzip command from the DOS window? Does it work correctly? What happens if you add an end-of-line character (\n) to the command in the script? Does that prevent the hang?
Here's an alternative, using extract.exe, which you can download here or here:
use File::Find;
use IO::File;
use v5.10;
my $prefix = 'myfileprefix';
my $dir = '.';
File::Find::find({wanted => \&wanted}, '.');
exit;
sub wanted {
    my $destination = q(c:\test\temp);
    if ( -f $_ && $_ =~ /^$prefix(.*)\.cab$/ ) {
        my $filename = $File::Find::name;
        my $path     = $File::Find::dir;
        my $cmd      = "extract /Y $path\\$filename /E /L $destination";
        print $cmd . "\n";
        system($cmd);
    }
}
Although no one has mentioned it explicitly, system blocks until the process finishes. The real problem, as people have noted, is figuring out why the process doesn't exit. Forking or any other parallelization won't help because you'll be left with a lot of hung processes.
Until you can figure out the issue, start small. Make the smallest Perl script that demonstrates the problem:
#!perl
system( '/path/to/jzip', '-eo', 'literal_file_name' ); # full path, list syntax!
print "I finished!\n";
Now the trick is to figure out why it hangs, and sometimes that means different solutions for different external programs. Sometimes you need to close STDIN before you run the external process or it sits there waiting for it to close, sometimes you do some other thing.
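A sketch of the close-STDIN idea, assuming Windows (where the null device is NUL) and the same literal file name as above:
#!/usr/bin/perl
use strict;
use warnings;
# Reopen STDIN from the null device so the child cannot sit waiting on our input.
close STDIN;
open STDIN, '<', 'NUL' or die "Cannot reopen STDIN: $!"; # use /dev/null on *nix
system('jzip', '-eo', 'literal_file_name') == 0
    or warn "jzip exited with status ", $? >> 8, "\n";
print "I finished!\n";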
Instead of system, you might also try things such as IPC::System::Simple, which handles a lot of platform-specific details for you, or modules like IPC::Run or IPC::Open3.
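As an example of that route, a sketch using IPC::System::Simple's capturex (assuming the module can be installed); it bypasses the shell and dies with a useful message if jzip cannot be run or exits non-zero:
use strict;
use warnings;
use IPC::System::Simple qw(capturex);
# capturex() runs the command without a shell and throws on failure,
# so a bad exit status is much easier to diagnose than a silent hang.
my $output = capturex('jzip', '-eo', 'somefile.cab');
print $output;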
Sometimes it just sucks, and this situation is one of those times.