I have a Perl script that FTPs files, and it seems to fail on every other file.
I determined that for some reason the line
$filename = glob $filename
is setting $filename to empty rather than the full filename.
I printed the filename just before the glob line and it is correct and has no spaces in it.
I also tried commenting out the glob line, but then it returns "Not a GLOB reference", so apparently Net::FTP requires it.
Any idea what might cause glob to return empty? (And why does it work on the first file and every second file after that?)
This is what I'm sending to the sub:
ftpToDevice($ftp,$device_dir,$config,$config,'put');
Here is the actual sub:
sub ftpToDevice {
my ($ftp,$fileDir,$fileName,$fileNameDest,$action) = @_;
print "ftp filename $fileName\n";
$fileName =~ s/\s+//g;
$fileNameDest =~ s/\s+//g;
$fileName = glob($fileName);
my $path = "cf3:\\";
chdir $fileDir if $action eq 'put';
print "file location: $fileDir/$fileName to $fileNameDest\n";
if($ftp->cwd("$path")){
if($action eq 'put'){
print "attempting FTP PUT $fileName $fileNameDest\n";
$ftp->put($fileName,$fileNameDest) or return "Error cannot put - " . $ftp->message;
#$ftp->rename($fileName,$fileNameDest) or return "Error cannot put - " . $ftp->message;
} else {
print "Attempting to delete $path $fileNameDest\n";
#$ftp->delete("$fileName") or return "Error cannot delete $fileName - " . $ftp->message;
$ftp->delete($fileNameDest) or return "Error cannot delete $fileName - " . $ftp->message;
print $ftp->message . "\n";
}
}
chdir $masterDir if $action eq 'put';
return 'success';
}
I found a solution but am not sure why it worked.
I commented out the line:
$fileName = glob($fileName);
I added this instead:
$fileName = "" . $fileName; #this converts it to a string I believe
This allowed all files to work (not just every second one) and did not give the GLOB error.
My theory is that the list of files generated before the foreach loop that calls the FTP sub is somehow causing every second file to be a glob object (an array or something?) rather than a plain text filename.
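For what it's worth, the alternating behaviour matches what glob does in scalar context: it acts as an iterator, returning the next match on each call and undef once the list is exhausted, and it only re-evaluates its argument when it starts a new list. A minimal sketch of that behaviour (the file name here is just an illustration, not from the original script):
use strict;
use warnings;
# Create a file so the pattern matches exactly one name.
open my $fh, '>', 'example.cfg' or die "cannot create example.cfg: $!";
close $fh;
for my $call (1 .. 4) {
    # The same glob op is hit on every pass, so its iterator state is shared:
    # call 1 returns 'example.cfg', call 2 returns undef (list exhausted),
    # call 3 starts a new list and returns the name again, and so on.
    my $name = glob('example.cfg');
    printf "call %d: %s\n", $call, defined $name ? $name : 'undef';
}
That is exactly the "first file and every 2nd file after that" pattern: each call to ftpToDevice hits the same glob op, so every other call just reports the end of the previous one-element list.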
My aim is to search for a particular word or full statement and paste the matching lines into another text file. For example, given a log file, I need to find an exception for a particular date and time and paste those lines into another text file, e.g. result.txt. Below is my script. Please help me in this regard.
chomp(@ARGV);
if(@ARGV!=2) {
print "Please Pass two parameters";
print "Usage: $ <File_name><pattern\n>";
}
$File_name = $ARGV[0];
$res_File_name = $Filename . "\.res";
$Pattern = $ARGV[1];
open(FD,"<File_name>") or die("File $File_name could not be opened ");
open(WFD,"<res_File_name>") or die("File $res_File_name could not be opened ");
while(<FD>) {
print WFD $_ if(/$Pattern/);
}
close(FD);
close(WFD);
This is a log text file
log.txt
2015-1-11 11:21:00 [Exception or Error] System.IO exception. I need to find those exceptions and paste them into result.txt. Not only that: I may also have a document containing some paragraph, e.g. "Hello World", and I need to find that "Hello World" and paste it into the text file.
For this I pass two parameters; if (@ARGV != 2), I need to print a message.
Second, I need to pass those two parameters,
$ARGV[0] and $ARGV[1],
as File_name and res_File_name; I am getting scalar values from my original file and passing them on to the .res file.
O/p: 2015-01-10, or any other argument you pass, will be displayed.
@hari_cheerful: You can try the below Perl code:
use strict;
use warnings;
chomp(@ARGV);
if(@ARGV!=2)
{
print "Please Pass two parameters\n". "Usage: <Filename><pattern>\n";
exit;
}
my $Filename = $ARGV[0];
my ($file,$ext) = $Filename =~ /(.*)(\..*?)/;
my $res_File_name = "$file" . "\.res";
my $Pattern = $ARGV[1];
open(my $ip,'<', $Filename) or die("File $Filename could not be opened ");
open(my $op,'>', $res_File_name) or die("File $res_File_name could not be opened ");
while(<$ip>) {
if($_ =~ /$Pattern/is)
{
print $op $Pattern . "\n";
}
elsif($_ =~ /\d{4}\-\d{1,2}\-\d{1,2}\s*\d{2}:\d{2}:\d{2}\s*\[Exception\s*.*?/is)
{
print $op $_ . "\n";
}
}
close($ip);
close($op);
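For example, assuming the script above is saved as, say, extract.pl, running perl extract.pl log.txt "Exception" should create log.res next to the input file, containing the pattern for lines that match it and the full line for anything that looks like a dated [Exception ...] entry.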
I am using a Perl script that takes a directory name as input from the user and searches for files in it. After finding a file, it reads the file's contents. If the contents contain the word "cricket", then I should be able to delete the file using the unlink function. But with unlink, the file that contains the word "cricket" still exists in the directory after the code runs. Please help. My code is:
use strict;
use warnings;
use File::Basename;
print "enter a directory name\n";
my $dir = <>;
print "you have entered $dir \n";
chomp($dir);
opendir DIR, $dir or die "cannot open directory $!";
while (my $file = readdir(DIR)) {
next if ($file =~ m/^\./);
my $filepath = "${dir}${file}";
print "$filepath\n";
print " $file \n";
open(my $fh, '<', $filepath) or die "unable to open the $file $!";
my $count = 0;
while (my $row = <$fh>) {
chomp $row;
if ($row =~ /cricket/) {
$count++;
}
}
print "$count";
if ($count == 0) {
chomp($filepath);
unlink $filepath;
print " $filepath deleted";
}
}
By your test if($count==0) {...} you'll only delete files if they don't contain "cricket". It should work as you describe if you change it to if($count) {...}.
Additionally you're creating the filepath by concatenating the dir and file names in a manner that will only work if the dir name the user entered includes a trailing slash (${dir}${file}): this would be less error-prone as $dir/$file, or, if you wanted to go to town:
use File::Spec;
File::Spec->catfile($dir, $file);
Additionally, as the comments point out, you're not closing the open file handle, whether or not you try to delete the file. This is bad practice; however, on Linux at least, it should still work. Use close($fh) before your deletion test.
Note also that "cricket" is case-sensitive so files with "Cricket" won't be deleted. Use $row =~ /cricket/i for case-insensitive search.
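Putting those points together, one possible rewrite of the script is sketched below (the prompt and messages follow the question; the exact wording is illustrative):
use strict;
use warnings;
use File::Spec;
print "enter a directory name\n";
chomp( my $dir = <STDIN> );
opendir( my $dh, $dir ) or die "cannot open directory $dir: $!";
while ( my $file = readdir($dh) ) {
    next if $file =~ /^\./;
    my $filepath = File::Spec->catfile( $dir, $file );   # no trailing-slash worries
    next unless -f $filepath;                            # skip subdirectories
    open( my $fh, '<', $filepath ) or die "unable to open $filepath: $!";
    my $count = 0;
    while ( my $row = <$fh> ) {
        $count++ if $row =~ /cricket/i;                  # case-insensitive match
    }
    close($fh);                                          # release the handle before unlinking
    if ($count) {                                        # delete files that DO contain "cricket"
        unlink $filepath or warn "could not delete $filepath: $!";
        print "$filepath deleted\n";
    }
}
closedir($dh);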
I am trying to read multiple .txt files in a folder. Each file should be read line by line; however, I have failed to read multiple .txt files using glob. Any advice on my code?
my %data;
@FILES = glob("*.txt");
$EmailMsg .= "EG. Folder(week) = Folder(CW01) --CW01 = Week 1 -- Number is week\n ";
$EmailMsg .= "=======================================================================================================\n";
# Try to Loop multiple files here
foreach my $file (@FILES) {
local $/ = undef;
open my $fh, '<', $file;
$data{$file} = <$fh>;
# Read the file one line at a time.
while (my $line = <$fh>) {
chomp $line;
$line =~ s/^\s+//;
$line =~ s/\s+$//;
my ($name, $date, $week) = split /\:/, $line;
if ($name eq "NoneFolder") {
$EmailMsg .= "Folder ($week) - No Folder created on the FTP! Failed to open folder!\n";
}
if ($name eq "EmptyFiles") {
$EmailMsg .= "Folder ($week) - No Files insides the folder! Failed download files!\n";
}
}
}
$EmailMsg .= "=======================================================================================================\n";
$EmailMsg .= "Please note that if you receive this email means that the script is running fine just that no folder is created or no files inside the folder for the week on the FTP.\n";
# close the file.
#close <$fh>;
Current output:
EG. Folder(week) = Folder(CW01) --CW01 = Week 1 -- Number is week
=======================================================================================================
=======================================================================================================
Please note that if you receive this email means that the script is running fine just that no folder is created or no files inside the folder for the week on the FTP.
It failed to get any .txt files.
You are trying to read each file twice: firstly into the hash %data and then again line by line.
Once you have reached end of file, you have to either reopen the file or use seek to move the read pointer back to the beginning.
You also need to set $/ back to its original value, otherwise your loop will read the entire file instead of one line at a time.
It's not clear whether you really need the second copy of the file data in the hash, but you can avoid having to reset $/ by putting the change within a block, like this
open my $fh, '<', $file;
$data{$file} = do {
local $/ = undef;
<$fh>;
};
and then reset the file pointer to the start again before the while loop.
seek $fh, 0, 0;
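Put together, the loop might look something like this sketch (it keeps the question's %data hash; the per-line handling is elided):
use strict;
use warnings;
my %data;
my @FILES = glob("*.txt");
foreach my $file (@FILES) {
    open my $fh, '<', $file or die "cannot open $file: $!";
    # Slurp the whole file, keeping the change to $/ confined to this block.
    $data{$file} = do {
        local $/ = undef;
        <$fh>;
    };
    seek $fh, 0, 0;    # rewind so the line-by-line loop starts at the top again
    while ( my $line = <$fh> ) {
        chomp $line;
        # ... per-line processing as in the question ...
    }
    close $fh;
}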
#!/usr/bin/perl
use strict;
use warnings FATAL => 'all';
my @files = ('Read a file.pl', 'Read a single text file.pl', 'Read only one file.pl', 'Read the file using while.pl', 'Reading the file.pl');
foreach my $i (@files) {
open(my $fh, '<', $i) or die "Cannot open $i: $!";
while (my $row = <$fh>) {
chomp $row;
print "$row\n";
}
close $fh;
}
The file globbing works for me. You might want to specify scope for your @FILES variable and check that there actually are files matching the path you have specified:
#!/bin/env perl
use strict;
use warnings;
## glob on all files in home directory
## see: http://perldoc.perl.org/File/Glob.html
use File::Glob ':globally';
my @configs = <~myname/project/etc/*.cfg>;
foreach my $fn (@configs) {
print "file $fn\n";
}
your code,
my %data;
#here are some .c files,
my @FILES = glob("../*.c");
foreach my $fn (@FILES) {
print "file $fn\n";
}
exit;
This way catches more garbage for about the same amount of code.
my $PATH = shift #ARGV ;
chomp $PATH ;
opendir(TXTFILE,$PATH) || die ("failed to opendir: $PATH") ;
my @file = readdir TXTFILE ;
closedir(TXTFILE) ;
foreach (@file) {
next unless ($_ =~ /\.txt$/i) ; # Only get .txt files
$PATH =~ s/\/$//g ; $PATH =~ s/$/\// ; # Uniform trailing slash
my $thisfile = $PATH . $_ ; # now a fully qualified filename
unless (open(THISFILE,$thisfile)) { # Notify on busted files.
warn ("$thisfile failed to open") ;
next ;
}
while(<THISFILE>) {
# etc. etc.
}
close(THISFILE) ;
}
The following is my Perl code:
use strict;
use File::Find;
use MIME::Base64;
use File::Temp qw(tempfile);
sub loadFiles(); #udf
sub mySub(); #udf
my #files = ();
my $dir = shift || die "Argument missing: directory name\n";
my $finalLoc;
my $filePath;
my $fileContents;
my $base64EncFile;
my $domain = "WTX";
my $devFilePath;
my $deviceDir;
my $position;
my $user = "admin";
my $encPwd = "YzNKcGNtRnRZVEF4";
my $decPwd;
my $response;
my $temp;
my $tempFilename;
loadFiles(); #call
foreach (@files) {
#take the file path into a variable
$filePath = $_;
#replace the '/' with '\' in the file path
$filePath =~ s/\//\\/g;
#take the file path into a variable
$devFilePath = $_;
#replace the '\' with '/' in the file path
$devFilePath =~ s/\\/\//g;
#perform string operation to derive a target file path
$position = index( $devFilePath, "RPDM" );
$deviceDir = "local:///" . substr( $devFilePath, $position );
#open handle on file to read the contents
open( FILE, "< $filePath" );
#read the entire file into a variable, 'fileContents'
$fileContents = do { local $/; <FILE> };
#base64 encode the file contents
$base64EncFile = encode_base64($fileContents);
#replace the <CR><LF> characters in the file and flatten the base64 string
$base64EncFile =~ s/[\x0A\x0D]//g;
#printing file path
print "FilePath=$filePath\n";
#creating a temp file with 9 random characters at the end, example 'tempUKv1vqBTp'
$temp = File::Temp->new(
TEMPLATE => "tempXXXXXXXXX",
UNLINK => 0
) or die "Could not make tempfile: $!";
$tempFilename = $temp->filename;
#Printing temp file name
print "TempFileName=$tempFilename\n";
#open the temp file for writing
open(TEMP, ">$tempFilename");
select(TEMP);
while($base64EncFile){
#??? HOW TO PRINT THE VARIABLE $base64EncFile CONTENTS INTO THE TEMP FILE ???
}
#creating a final request for sending to the web service
my $dpString = "<env:Envelope xmlns:env='http://schemas.xmlsoap.org/soap/envelope/' xmlns:dp='http://www.datapower.com/schemas/management'><env:Body><dp:request domain='$domain'><dp:set-file name='$deviceDir'>". $base64EncFile."</dp:set-file></dp:request></env:Body></env:Envelope>";
#decode the encoded password
$decPwd = decode_base64($encPwd);
system('C:\\apps\\curl-7.15.0\\curl.exe', '-#', '-k', '-u', "admin:$decPwd", '--data-binary', "$dpString", 'https://host/service/fileSet');
print "-----------------------------------------------------------\n";
close(TEMP);
close(FILE);
}
sub loadFiles() {
find( \&mySub, "$dir" ); #custom subroutine find, parse $dir
}
# following gets called recursively for each file in $dir, check $_ to see if you want the file!
sub mySub() {
push @files, $File::Find::name
if (/(\.xml|\.xsl|\.xslt|\.ffd|\.dpa|\.wsdl|\.xsd)$/i)
; # modify the regex as per your needs or pass it as another arg
}
The task I am trying to accomplish is this: given a folder argument, the above Perl program should make recursive calls to a given web service endpoint. The problem is that using the system command in Perl I am unable to send files over 32 KB. While trying to use the File::Temp module, I am not sure how to write the contents of a variable into a temp file (this is my first week using Perl).
Any help to achieve this would be appreciated. Thanks!
Are you asking how to write a string to an open file?
print $fh $string;
should do the trick.
In your example, that would translate to replacing the while($base64EncFile){...} loop with something like:
print TEMP $base64EncFile;
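Since $temp is a File::Temp object, which also behaves as a filehandle, you could skip the bareword TEMP handle entirely. A small sketch (the placeholder string stands in for the real base64 data):
use strict;
use warnings;
use File::Temp;
my $base64EncFile = "SGVsbG8sIHdvcmxkIQ==";   # placeholder for the real encoded content
my $temp = File::Temp->new( TEMPLATE => "tempXXXXXXXXX", UNLINK => 0 );
print {$temp} $base64EncFile;                 # File::Temp objects can be printed to directly
close $temp or die "Could not close temp file: $!";
print "wrote ", $temp->filename, "\n";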
I'm trying to use the Tenjin module, but it fails because it can't find the template file even though the file exists. I've added some debug statements to the module, and it's not passing
return $filepath if (-f $filepath);
even when $filepath is correct. I've tried it in a standalone script and it works fine, but when I copy it into the mod_perl script it fails. Any ideas?
$filepath is a full absolute path: /something/another/dir/2/filename.plhtml
This is the function from the module. Notice my "Debug"... it prints the correct path to the file, which has mode 777, but it never prints YES.
sub find_template_file {
my ($this, $filename) = @_;
my $path = $this->{path};
if ($path) {
my $sep = $^O eq 'MSWin32' ? '\\\\' : '/';
foreach my $dirname (@$path) {
my $filepath = $dirname . $sep . $filename;
print STDERR "--$filepath--\n";
if (-f $filepath){
print STDERR "--- YES ---\n\n";
}
return $filepath if (-f $filepath);
}
} else {
return $filename if (-f $filename);
}
my $s = $path ? ("['" . join("','", @$path) . "']") : '[]';
die "Tenjin::Engine: $filename not found (path=$s).";
}
Fails with
Tenjin::Engine: index.plhtml not found (path=['/var/2.0/templates/search']). at /usr/lib/perl5/site_perl/5.8.8/Tenjin/Engine.pm line 56.\n
The Apache process also needs read and execute access on every subdirectory up to the full path. (If symbolic links are involved, it will be trickier to determine what the accesses are).
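One way to check that (a hedged sketch, not part of the original answer; the path is just the one from the error message) is to walk each directory component and report whether the current process can read and traverse it. Run it as the user Apache runs under to see what the server sees:
use strict;
use warnings;
use File::Spec;
my $filepath = '/var/2.0/templates/search/index.plhtml';   # path from the error message
my ( $vol, $dirs, $file ) = File::Spec->splitpath($filepath);
my @parts = grep { length } File::Spec->splitdir($dirs);
my $sofar = File::Spec->rootdir;
for my $part (@parts) {
    $sofar = File::Spec->catdir( $sofar, $part );
    printf "%-40s read=%s exec=%s\n", $sofar,
        ( -r $sofar ? 'yes' : 'no' ), ( -x $sofar ? 'yes' : 'no' );
}
printf "%-40s read=%s\n", $filepath, ( -r $filepath ? 'yes' : 'no' );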
If you can debug the script in place on the web server, you might want to get Perl to deliver you an error message:
if (! -f $filename) {
open(my $ack, '<', $filename)
or print STDERR "Couldn't open $filename because of: $!\n";
}
-f returns undef (rather than a plain false value) if the underlying stat call fails, for example because the file does not exist or a directory in the path is not accessible.
Test whether the return value is defined and, if it is not, show the error that will have been set in $!.
That may give you a clue.
Give -f the full path to the file, and make sure it is readable by Apache.
Are you using absolute or relative pathnames? Your assumptions about the current directory may simply be wrong.
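A quick way to test that assumption under mod_perl (illustrative only) is to log the current directory and what a relative template name would resolve to:
use strict;
use warnings;
use Cwd qw(getcwd);
use File::Spec;
warn "cwd is ", getcwd(), "\n";
warn "index.plhtml resolves to ", File::Spec->rel2abs('index.plhtml'), "\n";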
I'm going to totally ignore what you asked and answer something completely different instead! I'm just that crazy!
Well, not really, I'm leveraging a core perl module, File::Find, instead of writing my own directory parser.
On request, here's the question I'm actually answering:
"How do I find the path to a file that is somewhere in a sub-directory of a specific set of paths?"
use File::Find;
# Other parts of the class here
sub find_template_file {
my ($this, $filename) = @_;
my $file_path;
my $path = $this->{path};
# Note that this inner sub uses variables we defined above
find(sub {
$file_path = $File::Find::name if $_ eq $filename;
}, @$path);
return $file_path if $file_path;
my $s = $path ? ("['" . join("','", #$path) . "']") : '[]';
die "Tenjin::Engine: $filename not found (path=$s).";
}