I am trying to download a few files from specific folders using the sftp utility, driving it through Expect.pm. Below is the code for that:
use strict;
use Expect;
my $userid = `whoami`;
chomp($userid);
my $Password = "<password>";
my $command = "sftp " . "$userid" . "\@<server-name>";
my $spawn_ok = 0;
my $timeout = 10;
print "$command \n";
my $exp = new Expect();
$exp->log_file("FTPLOGFILE.txt");
$exp->spawn("$command") or die "Cannot spawn $command: $!\n";
$exp->log_stdout(0);
$exp->expect($timeout,
[ 'Password:',
sub {
$spawn_ok = 1;
my $fh = shift;
print "Sending password \n";
$fh->send("$Password\r");
exp_continue;
}
],
[ 'sftp> ',
sub {
my $fh = shift;
$spawn_ok = 3;
print "Downloading cfg files \n";
$fh->send("get /home/cfg/*.cfg /tmp/ACC_CCM_CFG\n");
$fh->send("bye\n");
exp_continue;
}
]
);
The problem is that, apart from downloading files into the folder /tmp/ACC_CCM_CFG above, I also want to run the command below inside the same sftp session:
get /home/appl/*.pl /tmp/ACC_CCM_APPL
But I can't do that, since the regular expression for the expect function would be the same (sftp>). How do I run a series of commands inside the same sftp session using Expect if the regex of the prompt does not change?
Please throw some light on this, since I can't find a solution.
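For what it's worth, one pattern that handles an unchanging prompt is to keep a queue of pending commands and send the next one each time `sftp> ` matches, returning `exp_continue` while the queue is non-empty. The sketch below is illustrative, not the asker's exact code: it drives a tiny local prompt emulator (spawned via `$^X`) instead of a real sftp server, and the password and paths are placeholders.

```perl
use strict;
use warnings;
use Expect;

# Stand-in for the real sftp server so the example is self-contained:
# prints "Password:", then a "sftp> " prompt in a loop until it reads "bye".
my $emulator = <<'EOT';
$| = 1;
print "Password:";
my $pw = <STDIN>;
while (1) {
    print "sftp> ";
    my $cmd = <STDIN>;
    last if !defined $cmd or $cmd =~ /^bye/;
    print "ran: $cmd";
}
EOT

my @pending = (
    "get /home/cfg/*.cfg /tmp/ACC_CCM_CFG",
    "get /home/appl/*.pl /tmp/ACC_CCM_APPL",
    "bye",
);

my $exp = Expect->new;
$exp->log_stdout(0);
$exp->spawn($^X, '-e', $emulator) or die "Cannot spawn emulator: $!\n";
$exp->expect(10,
    [ qr/Password:/ => sub { my $fh = shift; $fh->send("secret\n"); exp_continue } ],
    [ qr/sftp> /    => sub {
          my $fh  = shift;
          my $cmd = shift @pending;     # next queued sftp command
          $fh->send("$cmd\n");
          exp_continue if @pending;     # stop once "bye" has been sent
      } ],
);
$exp->soft_close;
print "commands left in queue: ", scalar(@pending), "\n";
```

Against a real server you would spawn `sftp user\@host` instead of the emulator; the prompt-queue callback stays the same.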
Any reason for using Expect and the system sftp command for this? Using the Net::SFTP module would make the job much easier and cleaner.
I'm trying to get the following Perl script to work. First, a fastq file is read; this file is then analysed by a number of programs.
Code:
use warnings;
use PBS::Client;
$directory = $ARGV[0];
opendir(DIR, $directory);
@files = ();
while ($file = readdir(DIR)) {
push(@files, $file);
}
closedir(DIR);
@fastq_files = grep(/fastq/, @files);
$client = PBS::Client->new();
foreach $fastq (@fastq_files){
@commands = ();
$wd = "/store/www/labresults_QC/snRNA_sequence_analyser/".$ARGV[0];
$name = $fastq."_process_map";
$queue = "system";
$wallt = "72:00:00";
chomp($fastq);
$fastq =~ /.+[^\.fastq]/;
push (@commands, "/opt/fastx_toolkit-0.0.13.2/bin/fastq_quality_filter -q 30 -p 80 -i " . $fastq . " -o " . $& . "_qc.fastq\n");
push (@commands, "/opt/fastx_toolkit-0.0.13.2/bin/fastx_clipper -i " . $& . "_qc.fastq -o " . $& . "_qc_clipped.fastq -v -l 15 -a TGGAATTCTCGGGTGCCAAGG -Q33\n");
push (@commands, "/opt/fastx_toolkit-0.0.13.2/bin/fastx_collapser -i " . $& . "_qc_clipped.fastq -o " . $& . "_qc_clipped_collapse.fa -v -Q33\n");
push (@commands, "/opt/bowtie-1.0.0/bowtie -f /opt/genomes/9606/GRCh37/bowtie/GRCh37 " . $& . "_qc_clipped_collapse.fa " . $& . "_mapped.sam -k 100 -n 0 -l 25 --best");
$job = PBS::Client::Job -> new(
wd => $wd,
queue => $queue,
name => $name,
wallt => $wallt,
cmd => [[@commands]]);
$client -> qsub($job);
}
However, when I try to execute it from the Linux command line, it gives this error message:
open3: exec of /store/www/labresults_QC/snRNA_sequence_analyser/data/data_raw/test_run/n8XyeYIkfv failed at /store/bin/perl_libs/lib/perl5//PBS/Client.pm line 150
The error message points to this piece of code in the PBS Client module:
#-------------------------------------------------------------------
# Thanks to Sander Hulst
sub call_qsub
{
my @args = @_;
# If the qsub command fails, for instance, pbs_server is not running,
# PBS::Client's qsub should not silently ignore. Disable any reaper
# functions so the exit code can be captured
use Symbol qw(gensym);
use IPC::Open3;
my $stdout = gensym();
my $stderr = gensym();
{
local $SIG{CHLD} = sub{};
my $pid = open3(gensym, $stdout, $stderr, @args); # This is line 150
waitpid($pid,0);
}
confess <$stderr> if ($?);
return <$stdout>;
}
#-------------------------------------------------------------------
Anyone got a clue what this means?
EDIT
After some investigation it seems that this line is failing: $client -> qsub($job);
but I don't know why. Any ideas what I'm doing wrong?
FINAL EDIT:
So, we finally found the real cause of the problem. It turned out that something went wrong in our latest installation of PBS::Client, so we reverted to an older version and the problem was gone!
The module generates a script then tries to execute it without having made it executable. Workaround:
use PBS::Client qw( );
BEGIN {
my $orig_genScript = \&PBS::Client::genScript;
my $new_genScript = sub {
my $script_qfn = $orig_genScript->(@_);
chmod(0700, $script_qfn) or die $!;
return $script_qfn;
};
no warnings 'redefine';
*PBS::Client::genScript = $new_genScript;
}
I'm a beginner. I have written a Perl script which does the following:
-Create a directory under “/x01/abcd/abc_logs/abcd_Logs” by the current date, in the format of “YYYYMMDD” if it has not already been created.
i.e. if the script is run on the 1st of Jan 2013, the directory "20130101" will be created under the said path. So whenever there is a need to inspect the logs, always look for a directory named for the current date.
-Check if the log file(s) have already been downloaded earlier within the same day, and if not log(s) will be downloaded to the TODAY’s directory.
I'm having a hard time coming up with a way to print a message when there are no files in the share. This happens when the user specifies two or more files that are not in the share. I know this is because of the "die" statement in sub get_LOGS; I just cannot work out how to return a message when all the files I specify are absent from the share.
Usage of this script is as follows:
./abc_logs ....<file(n)>
Following is the script:
use Net::SFTP::Foreign;   # get_LOGS below depends on this

my $LOGS_LOCAL_PATH = "/x02/abc/abcba2/";
chomp $LOGS_LOCAL_PATH;
my $LOGS_REM_PATH = "/x01/INT/abc/vabc2/";
chomp $LOGS_REM_PATH;
my $TODAY = `date +%Y%m%d`;
chomp $TODAY;
my @GETLOOP = @ARGV;
unless ($#ARGV >= 0) {
print "\nUsage: gtp_logs.pl <file1> <file2> <file3>.....<file(n)>\n\n";
exit;
}
system("clear");
unless ( -d "$LOGS_LOCAL_PATH"."$TODAY") {
print "Directory \"$TODAY\" doesn't exist. So creating the directory..!\n";
print "OK..Done.....!\n\n";
system("mkdir $LOGS_LOCAL_PATH/$TODAY");
}
else {
print "Directory already exists. Logs will be downloaded to ==> \"$LOGS_LOCAL_PATH$TODAY\".....!\n\n";
}
# if_DOWNLOADED($LOGS_LOCAL_PATH,$TODAY,#GETLOOP);
chdir("$LOGS_LOCAL_PATH"."$TODAY") || die "cannot cd to ($!)";
foreach my $GETL (@GETLOOP) {
my $is_downloaded = if_DOWNLOADED($LOGS_LOCAL_PATH,$TODAY,$GETL);
if(!$is_downloaded)
{
get_LOGS("172.25.70.221","abc","abc2","/x01/INT/abc",$GETL);
print "File \"$GETL\" downloaded to ==> \"$LOGS_LOCAL_PATH$TODAY\"\n\n";
}
else
{
print "File \"$GETL\" has already been Downloaded to ==> \"$LOGS_LOCAL_PATH$TODAY\"\n\n";
}
}
sub get_LOGS {
my $LOG_HOST = shift;
my $REM_USER = shift;
my $REM_PASSW = shift;
my $REM_PATH = shift;
my $REM_FILE = shift;
print "Connecting to the sftp share! Please wait....!\n";
my $sftp = Net::SFTP::Foreign->new($LOG_HOST, user => $REM_USER, password => $REM_PASSW);
$sftp->setcwd($REM_PATH) or die "unable to change cwd: " . $sftp->error;
print "OK. On the share! Downloading the file \"$REM_FILE\"...................!\n\n\n\n";
$sftp->error and die "Problem connecting to the share...!!!! " . $sftp->error;
$sftp->get($REM_FILE) or die "File does not seem to be present on the remote share. Please re-request..!!!" . $sftp->error;
return $REM_FILE;
}
sub if_DOWNLOADED {
my $DWD_FILE_PATH = shift;
my $DWD_DIR = shift;
my $DWD_FILE = shift;
if (-e "$DWD_FILE_PATH/$DWD_DIR/$DWD_FILE")
{
return 1;
}
else
{
return 0;
}
}
Can someone please help me find a solution to this? Please try to keep the same script and just modify it.
/V
Some comments on your code:
Use strict and warnings in order to catch lots of errors early.
Read a book on style (e.g., Damian Conway's Perl Best Practices). In any case, try to be consistent when naming variables, subroutines, and everything else, including their case.
When you have to use some calculated value in several places, try to calculate it once and save it in a variable.
Don't use subroutines for trivial things.
You don't need to call chomp on variables you have defined yourself that don't have a "\n" character at the end.
Opening a new SFTP connection for every file transfer is very inefficient. You can open just one at the beginning and use it for all the transfers.
And now, a simplified version of your script:
#!/usr/bin/perl
use strict;
use warnings;
use Net::SFTP::Foreign;
my $host = "172.25.70.221";
my $user = "abc";
my $password = "abc1234321";
my $LOGS_LOCAL_PATH = "/x02/ABC/abc2";
my $LOGS_REM_PATH = "/x01/INT/abc/vim";
my $TODAY = `date +%Y%m%d`;
chomp $TODAY;
my $TODAY_LOCAL_PATH = "$LOGS_LOCAL_PATH/$TODAY";
my @files = @ARGV;
@files or die "\nUsage: gtp_logs.pl <file1> <file2> <file3>.....<file(n)>\n\n";
system("clear");
if ( -d $TODAY_LOCAL_PATH) {
print "Directory already exists. Logs will be downloaded to ==> \"$TODAY_LOCAL_PATH\".....!\n\n";
}
else {
print "Directory \"$TODAY\" doesn't exist. So creating the directory..!\n";
mkdir "$TODAY_LOCAL_PATH" or die "unable to create directory: $!\n";
print "OK..Done.....!\n\n";
}
chdir $TODAY_LOCAL_PATH or die "cannot cd to ($!)\n";
my $sftp = Net::SFTP::Foreign->new($host, user => $user, password => $password);
$sftp->error
and die "Problem connecting to the share...!!!! " . $sftp->error;
my $ok = 0;
my $failed = 0;
foreach my $file (@files) {
if (-e "$TODAY_LOCAL_PATH/$file") {
print "File \"$file\" has already been Downloaded to ==> \"$TODAY_LOCAL_PATH\"\n";
}
else {
if ($sftp->get("$LOGS_REM_PATH/$file")) {
print "File \"$file\" downloaded to ==> \"$TODAY_LOCAL_PATH\"\n";
$ok++;
}
else {
print "Unable to download file \"$file\" : " . $sftp->error . "\n";
$failed++;
}
}
}
print "$ok files have been downloaded, $failed files failed!\n\n";
My Perl code does not allow more than 10 forks. For the following code, whenever I use more than 10 machines in the list read into the script, it only forks 10 processes for the first 10 machines, and for the rest it dies with the error:
SSHProcessError The ssh process was terminated. at serverLogin.pl 44
It dies at the line where it says $ssh->waitfor('The authenticity of host*',15);.
PERL SCRIPT:
#!/usr/bin/perl -w
use Net::SSH::Expect;
use Term::ReadKey;
print "please enter filename:\n";
$filename = ReadLine;
chomp $filename;
print "please enter user ID:\n";
$userID = ReadLine;
chomp $userID;
print "please enter password:\n";
ReadMode 'noecho';
$passwordforuser = ReadLine 0;
chomp $passwordforuser;
ReadMode 'normal';
open READFILE,"<","$filename" or die "Could not open file listofmachines\n";
my @listofmachines = <READFILE>;
foreach $machine (@listofmachines)
{
my $pid=fork();
if ($pid){
push(@childprocs,$pid);
}
elsif ( $pid == 0 ) {
my $ssh = Net::SSH::Expect->new (
host => "$machine",
user => "$userID",
password=> "$passwordforuser",
timeout => 25,
raw_pty => 1,
);
my $login_output = $ssh->run_ssh or die "Could not launch SSH\n";
$ssh->waitfor('The authenticity of host*',15);
#print "This output for machine $machine\n";
$ssh->send("yes");
$ssh->waitfor('password: ', 15);
$ssh->send("$passwordforuser");
$ssh->waitfor('$ ', 10);
my @commresult=$ssh->exec("uptime");
print $login_output;
print #commresult;
exit 0;
}
else {
die "Could not Fork()\n";
}
}
foreach(@childprocs){
waitpid($_, 0)
}
Please help. Thanks, nblu.
Here is your script using Net::OpenSSH::Parallel instead of Net::SSH::Expect.
The number of simultaneous connections is limited to 10 to avoid the kind of resource exhaustion (probably of PTYs) happening in your script:
#!/usr/bin/perl -w
use Net::OpenSSH::Parallel;
use Term::ReadKey;
print "please enter filename:\n";
$filename = ReadLine;
chomp $filename;
print "please enter user ID:\n";
$userID = ReadLine;
chomp $userID;
print "please enter password:\n";
ReadMode 'noecho';
$passwordforuser = ReadLine 0;
chomp $passwordforuser;
ReadMode 'normal';
open READFILE,"<","$filename" or die "Could not open file listofmachines\n";
my @listofmachines = <READFILE>;
chomp @listofmachines;
my $pssh = Net::OpenSSH::Parallel->new(connections => 10);
$pssh->add_host($_,
user => $userID, password => $passwordforuser,
master_opts => [-o => 'StrictHostKeyChecking=no'])
for @listofmachines;
sub do_ssh_task {
my ($host, $ssh) = @_;
my $output = $ssh->capture('uptime');
print "$host: $output";
}
$pssh->all(parsub => \&do_ssh_task);
$pssh->run;
for my $host (@listofmachines) {
if (my $error = $pssh->get_error($host)) {
print STDERR "remote task failed for host $host: $error\n";
}
}
By default, the remote ssh daemon limits the number of concurrent ssh connections to something like 10 per userid. If that is a problem for you, you will need to change the server configuration...
Perhaps you have a limit to the number of processes you can create? Can you create 30 or more processes in a loop where the children just sleep(60)?
If in fact you have a limit of how many you can do at once, try using Parallel::ForkManager.
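If it comes to that, Parallel::ForkManager makes the throttling explicit. A sketch, with a made-up host list and the per-host ssh work elided:

```perl
use strict;
use warnings;
use Parallel::ForkManager;

my @machines = map { "host$_" } 1 .. 30;   # placeholder host list
my $pm = Parallel::ForkManager->new(10);   # at most 10 children at once

my $done = 0;
$pm->run_on_finish(sub { $done++ });       # parent-side hook per reaped child

for my $machine (@machines) {
    $pm->start and next;   # parent: blocks here while 10 children are running
    # child: the Net::SSH::Expect session for $machine would go here
    $pm->finish(0);        # child exits
}
$pm->wait_all_children;
print "completed $done of ", scalar(@machines), " hosts\n";
```

The start/finish pair replaces the hand-rolled fork/waitpid bookkeeping, and the cap of 10 mirrors the connections => 10 limit in the Net::OpenSSH::Parallel answer above.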
If this is from hitting a limit on pseudoterminals, how you raise that limit depends on the kernel version; what does uname -a say? It also depends on whether the code is using BSD or SysV/UNIX98 ptys. If you see it opening files like /dev/ptyXY, where X is one of a-e or p-z, it's the former, and you will have a hard limit of 256 system-wide.
You can change passwords without a pseudoterminal using usermod instead of passwd, but this momentarily exposes the crypted password in the process list; that may be acceptable in your case.
I have a password variable $pw and a command variable $cmd.
$pw = the UNIX password of a remote machine.
$cmd = the command to be executed on the remote machine.
Now, if I run the command using backticks, I am able to get the result into an output variable.
If I instead want to run the same command through Expect, how do I achieve the same thing? That is, how do I get the output of a command run through Expect into a variable?
my expect function is like:
sub expt($$){
my $cmd;
my $timeout;
($cmd, $pw)=@_;
$expect = Expect->new;
$expect->raw_pty(1);
printDebug("Running the command under expt");
$expect->spawn($cmd)
or die "Cannot spawn $cmd: $!\n";
$expect->expect($timeout,
[ qr/password:/i, #/
sub {
my $self = shift;
$self->send("$pw\n");
exp_continue;
}
],
[qr/Are you sure you want to continue connecting \(yes\/no\)?/
, sub { my $self = shift;
$self->send("yes\n");
exp_continue; }],
[qr/Unix password \(user\):/
, sub { my $self = shift;
$self->send("$pw\n");
exp_continue; }
],
);
$expect->soft_close();
return 0;
}
And I am calling the function like
expt($cmd,$pw);
By doing this I am able to execute the script in the remote host but my requirement is to store the output of the remote host in a local variable.
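With plain Expect, one option (a sketch, not your exact expt()) is to install a coderef as the log file, so every byte the spawned process writes is appended to a scalar that the sub can return. The demo below spawns a harmless local perl one-liner in place of the real remote command; with ssh you would keep your password and yes/no expect branches.

```perl
use strict;
use warnings;
use Expect;

sub expt_capture {
    my ($cmd, $pw, $timeout) = @_;
    $timeout ||= 10;
    my $output = '';
    my $expect = Expect->new;
    $expect->raw_pty(1);
    $expect->log_stdout(0);
    $expect->log_file(sub { $output .= shift });   # accumulate all output
    $expect->spawn(@$cmd) or die "Cannot spawn @$cmd: $!\n";
    $expect->expect($timeout,
        [ qr/password:/i => sub { my $s = shift; $s->send("$pw\n"); exp_continue } ],
    );
    $expect->soft_close;
    return $output;
}

# demo: a local command instead of the remote ssh invocation
my $out = expt_capture([$^X, '-e', 'print "hello from remote\n"'], 'unused');
print $out;
```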
Why not use Net::SSH::Expect? It would be closer to the first method: you "just" need to do something like this:
my $ssh = Net::SSH::Expect->new (
host => "myserver.com",
user => 'myuser',
raw_pty => 1
);
$ssh->run_ssh() or die "SSH process couldn't start: $!";
($ssh->read_all(2) =~ />\s*\z/) or die "where's the remote prompt?";
$ssh->exec("stty raw -echo");
my $output = $ssh->exec($cmd);
Have a look at the Net::SSH::Expect pod documentation; it is quite extensive.
How can I check if a file exists on a remote server using Perl?
Can I do this while using the Perl module Net::FTP?
CHECK TO SEE IF FILE EXISTS
if (-e $file_check) {
print "File Exists!\n";
}
else {
print "File Doesn't Exist!\n";
}
You might be best served by using SSH to do this:
#!/usr/bin/perl
use strict;
use warnings;
my $ssh = "/usr/bin/ssh";
my $host = "localhost";
my $test = "/usr/bin/test";
my $file = shift;
system $ssh, $host, $test, "-e", $file;
my $rc = $? >> 8;
if ($rc) {
print "file $file doesn't exist on $host\n";
} else {
print "file $file exists on $host\n";
}
You could use a command such as:
use Net::FTP;
my $ftp = Net::FTP->new($url);
$ftp->login($usr, $pass);
my $directoryToCheck = "foo";
unless ($ftp->cwd($directoryToCheck))
{
print "Directory doesn't exist\n";
}
If the file is in the FTP space on the remote server, then use Net::FTP. Get an ls listing of the directory and see if your file is in there.
But you can't just go and see if any arbitrary file is on the server. Think of what a security problem that would be.
Log in to the FTP server, and see if you can get an FTP SIZE on the file you care about:
#!/usr/bin/env perl
use strict;
use warnings;
use Net::FTP;
use URI;
# ftp_file_exists('ftp://host/path')
#
# Return true if FTP URI points to an accessible, plain file.
# (May die on error, return false on inaccessible files, doesn't handle
# directories, and has hardcoded credentials.)
#
sub ftp_file_exists {
my $uri = URI->new(shift); # Parse ftp:// into URI object
my $ftp = Net::FTP->new($uri->host) or die "Connection error($uri): $@";
$ftp->login('anonymous', 'anon@ftp.invalid') or die "Login error", $ftp->message;
my $exists = defined $ftp->size($uri->path);
$ftp->quit;
return $exists;
}
for my $uri (@ARGV) {
print "$uri: ", (ftp_file_exists($uri) ? "yes" : "no"), "\n";
}
You could use an expect script for the same purpose (it requires no extra Perl modules). The expect script executes "ls -l" on the FTP server, and the Perl script parses the output and decides whether the file exists. It's really simple to implement.
Here's the code,
PERL script: (main.pl)
# ftpLog variable stores output of the expect script which logs in to FTP server and runs "ls -l" command
$fileName = "myFile.txt";
$ftpLog = `/usr/local/bin/expect /path/to/expect_script/ftp_chk.exp $ftpIP $ftpUser $ftpPass $ftpPath`;
# verify that file exists on FTP server by looking for filename in "ls -l" output
if(index($ftpLog,$fileName) > -1)
{
print "File exists!";
}
else
{
print "File does not exist.";
}
EXPECT script: (ftp_chk.exp)
#!/usr/bin/expect -f
set force_conservative 0;
set timeout 30
set ftpIP [lindex $argv 0]
set ftpUser [lindex $argv 1]
set ftpPass [lindex $argv 2]
set ftpPath [lindex $argv 3]
spawn ftp $ftpIP
expect "Name ("
send "$ftpUser\r"
sleep 2
expect {
"assword:" {
send "$ftpPass\r"
sleep 2
expect "ftp>"
send "cd $ftpPath\r\n"
sleep 2
expect "ftp>"
send "ls -l\r\n"
sleep 2
exit
}
"yes/no)?" {
send "yes\r"
sleep 2
exp_continue
}
timeout {
puts "\nError: ftp timed out.\n"
exit
}
}
I have used this setup in one of my tools and I can guarantee that it works perfectly :)