Fetching files from remote server - Perl/Unix

In regards to the question I previously posted, I have written the following script to load a specific file:
#!/bin/perl
use strict;
use Net::FTP;
my $ftpOb = Net::FTP->new("X")
or die ("Could not connect to the FTP Server: $@");
print "Connected \n";
$ftpOb->login("A", "B")
or die ("Incorrect server credentials");
print "Logged in \n";
print " Current folder is: " . $ftpOb->pwd();
$ftpOb->cwd("root/Folder") or die ("Cannot connect to the folder on Server");
print "Transferred to folder \n";
$ftpOb->get("621418185-006249189002-5383.txt")
or die ("Error occurred while fetching the file");
$ftpOb->quit;
However, the code seems to fail to change the working directory and I get the following output:
Connected
Logged in
Cannot connect to the folder on Server at ./GetUploadFile.pl line 16.
Current folder is: /
Can anyone please help me debug the issue here?

Include $ftpOb->message() in your error messages (except for the connect, where $@ is the correct error variable).
Also do a dir() and list what files/directories are available.
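A minimal debugging sketch along those lines (the hostname is a placeholder; the credentials and path are the ones from the question):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

# Placeholder host -- substitute your own.
my $ftp = Net::FTP->new("ftp.example.com")
    or die "Could not connect: $@";            # connect failures report via $@
$ftp->login("A", "B")
    or die "Login failed: ", $ftp->message;    # everything else via message()

# List what is actually visible before trying to cwd blindly.
print "$_\n" for $ftp->dir();

$ftp->cwd("root/Folder")
    or die "Cannot change directory: ", $ftp->message;
$ftp->quit;
```

The dir() listing will usually show immediately whether "root/Folder" exists relative to the login directory or whether an absolute path is needed.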

Related

Perl Net::SSH2 scp_put returns unknown error

I am using Net::SSH2 to put a file on a remote server with scp_put.
It returns unknown error:
-43, LIBSSH2_ERROR_UNKNOWN(-43), SCP failure
It seems that the error comes after some timeout/delay, as it takes several minutes to return.
Connection to sftp-server is working. I can get a directory list from the directory.
I have access rights to that directory as I can put files there with SFTP-client.
I am using Strawberry Perl in Windows environment.
use warnings;
use strict;
use Net::SSH2;
my $dir1 = '.';
my $file = 'D:\\test\\test.txt';
my $ssh2 = Net::SSH2->new();
$ssh2->connect('testserver') or die "Unable to connect Host $@ \n";
$ssh2->auth_password('test','test') or die "Unable to login $@ \n";
if($ssh2->scp_put($file, $dir1)) {
print "File $file transferred to $dir1\n";
} else {
print "Couldn't transfer file $file to $dir1\n";
print join ', ', $ssh2->error;
print "\n";
}
SCP support in libssh2 is quite rudimentary and buggy.
Better alternatives are Net::SSH::Any, which has a proper pure-Perl implementation of SCP, or Net::SFTP::Foreign for SFTP. Both can work on top of Net::SSH2.
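As an illustration, here is a sketch of the same upload done over SFTP with Net::SFTP::Foreign instead of scp_put (host name, credentials, and paths are the placeholders from the question; this shows one possible approach, not a drop-in fix):

```perl
use strict;
use warnings;
use Net::SFTP::Foreign;

# Placeholder host and credentials from the question.
my $sftp = Net::SFTP::Foreign->new('testserver',
    user     => 'test',
    password => 'test',
);
$sftp->die_on_error("Unable to establish SFTP connection");

# put(LOCAL, REMOTE) uploads the file over SFTP instead of SCP.
$sftp->put('D:\\test\\test.txt', 'test.txt')
    or die "put failed: " . $sftp->error;
```

Since the asker can already transfer files with an SFTP client, switching the script to SFTP sidesteps the buggy SCP path entirely.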

Perl script can not be opened with a browser but is OK from the Linux shell

I have a few Perl scripts which have been up and running for the past few years. All of a sudden, for the past few days, they have been going up and down repeatedly. There are no syntax errors in them: sometimes they do work, they have been there for quite a while, and I did not change them recently. Plus, I can run them from the Linux shell without any problem, and the file permissions are 755, so everything seems to be set up properly. They are hosted by a web hosting company, and I have no access to the server log file.
The error message in the browser is the generic Internal Server Error page:
"Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator and inform them of the time the error occurred, and the actions you performed just before this error.
More information about this error may be available in the server error log."
Add use CGI::Carp qw( fatalsToBrowser ); early in your program to have the error returned to the browser.
Alternatively, you could use the same technique CGI::Carp uses or a wrapper to your script to save the errors in your own log file.
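The first suggestion amounts to a couple of extra lines at the top of the script:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI::Carp qw( fatalsToBrowser );   # fatal errors become a visible page

# Any die() or runtime failure below this point is now shown in the
# browser instead of the bare "Internal Server Error" page.
```

This is a debugging aid; once the fault is found, it is usually removed again so that internal errors are not exposed to visitors.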
Add the following to the start of a script to have it log errors and warnings to a log file of your choice.
sub self_wrap {
my $log_qfn = "$ENV{HOME}/.web.log"; # Adjust as needed.
open(my $log_fh, '>>', $log_qfn)
or warn("Can't append to log file \"$log_qfn\": $!"), return;
require IPC::Open3;
require POSIX;
my $prefix = sprintf("[%s] [client %s] ",
POSIX::strftime('%a %b %d %H:%M:%S %Y', localtime),
$ENV{REMOTE_ADDR} || '???'
);
my $suffix = $ENV{HTTP_REFERER} ? ", $ENV{HTTP_REFERER}" : '';
my $pid = IPC::Open3::open3(
'<&STDIN',
'>&STDOUT',
local *CHILD_STDERR,
$^X, $0, @ARGV
);
while (<CHILD_STDERR>) {
print(STDERR $_);
chomp;
print($log_fh $prefix, $_, $suffix, "\n");
}
waitpid($pid, 0);
POSIX::_exit(($? & 0x7F) ? ($? & 0x7F) | 0x80 : $? >> 8);
}
BEGIN { self_wrap() if !$ENV{WRAPPED}++; }
If your site has recently been transferred to a different server by your hosting company, or the server settings have recently been changed, try saving the file with HTML-Kit using 'Save as extra' >> 'Save as UNIX format' and then upload it again (Windows line endings can break the shebang line).

How to create a Zip file on a remote server in perl

Actually I'm using Net::FTP::Recursive to download a directory structure, and it works nicely for what is required. But since some folders have more than 100 files, downloading them can take ages. Since a zip file is faster to download, how could I, using Perl, connect to a remote server via FTP and create a zip file from the remote server/folder to download?
use Net::Config;
use Net::FTP::Recursive;
$ftp = Net::FTP::Recursive->new("$hostname:$ftp_port", Debug => 0)
or die "Cannot connect to $hostname: $@";
$ftp->login($iLogin,$iPass)
or die "failed ", $ftp->message;
$ftp->binary()
or die "Cannot set to Binary";
$ftp->cwd("/admin/packages/$fileName")
or die "Cannot change working directory ", $ftp->message;
$ftp->rget( $fileName );
#or die "Download Failed ", $ftp->message;
$ftp->quit;
Thank you all for your time
The site(ARGS) method is designed for that. You can send a server-specific command and have it run on the remote server.
http://perldoc.perl.org/Net/FTP.html#METHODS
However, most of the FTP servers I know have that disabled, so good luck.
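If the server does allow it, the call might look like the sketch below (host, credentials, and the zip command line are hypothetical; whether SITE EXEC is honoured is entirely up to the server's configuration):

```perl
use strict;
use warnings;
use Net::FTP;

# Hypothetical connection details.
my $ftp = Net::FTP->new('ftp.example.com') or die "Cannot connect: $@";
$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;

# site() sends "SITE <args>" verbatim; most servers have EXEC disabled.
$ftp->site('EXEC zip -r packages.zip /admin/packages')
    or warn "Server refused SITE EXEC: ", $ftp->message;
$ftp->quit;
```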
I think you'd need to have SSH access to the system to run a ZIP command. But if that's the case, you could also use SCP to transfer your files more securely. FTP does everything in the open.

Different results for command line and CGI for perl web form script

The code for my sub:
sub put_file
{
my($host, $placement_directory, $tar_directory, $filename, $user, $pass) = @_;
my $ftp = Net::FTP->new($host) or die "cannot connect to localhost";
$ftp->login($user, $pass) or die "cannot log in";
$ftp->cwd($placement_directory);
print $tar_directory."/".$filename;
$ftp->put("$tar_directory/$filename") or die "cannot put file ", $ftp->message;
print "File has been placed \n";
}
So when this sub is called from a test script (run from the command line) that uses the same config file and does all of the same things as the CGI script, no errors occur and the file is placed correctly. When the sub is called from my CGI script, the script outputs $tar_directory."/".$filename but not "File has been placed \n", and $ftp->message outputs "cannot put file Directory successfully changed.", which seems to come from the cwd line before it.
Other info:
I have tried running the test script as multiple users with the same result.
I use strict and warnings.
The tar file that is being moved is created by the script.
I'm new to Perl, so any advice is helpful; I'm stuck on this and can't find any help using the power of The Google.
Just a guess. Your ftp->put is failing, triggering the die. Unless you have:
use CGI::Carp qw(carpout fatalsToBrowser);
you won't see the die message in your browser. Since you died, you don't see the final print statement either.
Check your webserver log for the die output, or just change "die" to "print".
Net::FTP can put() from a filehandle as well as a file name:
open my $fh, '<', $tar_directory . '/' . $filename or die "Could not open file: $!";
$ftp->put($fh, $filename) or die "cannot put file ", $ftp->message;
If the problem is on your side then the open should fail and you should get an error message of some kind that will, hopefully, tell you what is wrong; if the problem is on the remote side then the put should fail and you'll see the same thing you're seeing now.
The fact that $ftp->message only holds the success message from the cwd indicates that everything is fine on the remote side and the put isn't even reaching the remote server.

How can I read a file's contents directly with Perl's Net::FTP?

I want to get a file from one host to another. We can fetch the file using the Net::FTP module's get method, but that stores it as a local file; I want the file contents in a variable instead. I know the module has a read method for reading data, but how do I call it, and how do I get the file contents?
From the Net::FTP documentation:
get ( REMOTE_FILE [, LOCAL_FILE [, WHERE]] )
Get REMOTE_FILE from the server and store locally. LOCAL_FILE may be a filename or a filehandle.
So just store the file directly into a variable attached to a filehandle.
use Net::FTP ();
my $ftp = Net::FTP->new('ftp.kde.org', Debug => 0)
or die "Cannot connect to ftp.kde.org: $@";
$ftp->login('anonymous', '-anonymous@')
or die 'Cannot login ', $ftp->message;
$ftp->cwd('/pub/kde')
or die 'Cannot change working directory ', $ftp->message;
my ($remote_file_content, $remote_file_handle);
open($remote_file_handle, '>', \$remote_file_content)
or die "Cannot open in-memory filehandle: $!";
$ftp->get('README', $remote_file_handle)
or die "get failed ", $ftp->message;
$ftp->quit;
print $remote_file_content;
Use File::Remote to read, write, and edit remote files transparently.
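A hedged sketch of that approach (File::Remote drives rsh/rcp-style transports behind the scenes; the host name and remote path here are placeholders):

```perl
use strict;
use warnings;
use File::Remote;

# Use ssh/scp as the transport instead of the rsh/rcp defaults.
my $remote = File::Remote->new(rsh => '/usr/bin/ssh', rcp => '/usr/bin/scp');

# Placeholder host and path.
$remote->open(*REMOTE, 'host:/etc/motd')
    or die "Cannot open remote file: $!";
print while <REMOTE>;     # the remote file reads like a local one
$remote->close(*REMOTE);
```

Note that this relies on shell access to the remote host, whereas the Net::FTP in-memory filehandle trick above works over plain FTP.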