I want to take all my log files from /var/log and cat them into a master log file, then zip that master file. How exactly would I do that?
I have cat in my code because that's what I know how to do in bash. How would I do it in Perl?
#!/usr/bin/perl
use strict;
use warnings;
use IO::Compress::Zip qw(zip $ZipError);

# cat /var/log/*log > /home/glork/masterlog.log
my @files = </var/log/*.log>;

zip \@files => 'glork.zip'
    or die "zip failed: $ZipError\n";

@files = </var/log/*.log>;
if (@files) {
    unlink @files or warn "Problem unlinking @files: $!";
    print "The job is done\n";
} else {
    warn "No files to unlink!\n";
}
As noted in the comments, there are several less involved ways to do this. If you really need to roll your own, Archive::Zip will do whatever you tell it to.
#!/usr/bin/env perl
use strict;
use Archive::Zip ':ERROR_CODES';
use File::Temp;
use Carp;

# don't remove "temp" files when filehandle is closed
$File::Temp::KEEP_ALL = 1;

# make a temp directory if not already present
my $dir = './tmp';
if (not -d $dir) {
    croak "failed to create directory [$dir]: $!" if not mkdir($dir);
}

my $zip = Archive::Zip->new();

# generate some fake log files to zip up
for my $idx (1 .. 10) {
    my $tmp = File::Temp->new(DIR => $dir, SUFFIX => '.log');
    my $fn  = $tmp->filename();
    print $tmp $fn, "\n";
}

# combine the logs into one big one
my $combined = "$dir/combined.log";
open my $out, '>', $combined or die "couldn't write [$combined]: $!";
for my $fn (<$dir/*.log>) {
    next if $fn eq $combined;    # skip the output file itself, which also matches the glob
    open my $in, '<', $fn or die "couldn't read [$fn]: $!";
    # copy the file line by line so we don't use tons of memory for big files
    print {$out} $_ while <$in>;
    close $in;
}
close $out;

$zip->addFile({ filename => $combined, compressionLevel => 9 });

# write out the zip file we made
my $rc = $zip->writeToFileNamed('tmp.zip');
if ($rc != AZ_OK) {
    croak "failed to write zip file: $rc";
}
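To point this at the original goal instead of the generated temp files, only the input glob and the output names change. A minimal sketch, assuming read permission on /var/log and the output name from the question:

#!/usr/bin/env perl
use strict;
use warnings;
use Archive::Zip ':ERROR_CODES';
use Carp;

# combine the real logs into one big one (output name taken from the question)
my $combined = '/home/glork/masterlog.log';
open my $out, '>', $combined or die "couldn't write [$combined]: $!";
for my $fn (</var/log/*.log>) {
    open my $in, '<', $fn or die "couldn't read [$fn]: $!";
    print {$out} $_ while <$in>;    # copy line by line
    close $in;
}
close $out;

# zip the combined log
my $zip = Archive::Zip->new();
$zip->addFile({ filename => $combined, compressionLevel => 9 });
croak 'failed to write zip file' if $zip->writeToFileNamed('masterlog.zip') != AZ_OK;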
I'm just trying to copy a file to a different directory before I process it. Here is the code:
use File::stat;
use File::Copy;
use LWP::UserAgent;
use strict;
use warnings;
use Data::Dumper;
use Cwd qw(getcwd);

my $dir  = "\\folder\\music";
my $dir1 = "c:\\temp";

opendir(my $dh, $dir) or die "Cant open directory : $!\n";
#my @list = readdir($dh)
my @files = map { [ stat "$dir/$_", $_ ] }
            grep( /Shakira.*.mp3$/, readdir( $dh ) );
closedir($dh);

sub rev_by_date
{
    $b->[0]->ctime <=> $a->[0]->ctime
}

my @sorted_files = sort rev_by_date @files;
my @newest = @{$sorted_files[0]};
my $name = pop(@newest);
print "Name: $name\n";

#**********************
#Upto here is working fine

my $new;
open OLD,"<",$name or die "cannot open $old: $!";
From here the problem starts:
open(NEW, "> $new") or die "can't open $new: $!";
while ()
{
    print NEW $_ or die "can't write $new: $!";
}
close(OLD) or die "can't close $old: $!";
close(NEW) or die "can't close $new: $!";
The error I'm getting is:
cannot open Shakira - Try Everything (Official Video).mp3: No such file or directory at copy.pl line 49.
When I'm chomping the filename, like
my $oldfile = chomp($name);
then the error is:
Name: Shakira - Try Everything (Official Video).mp3
old file is 0
cannot open 0: No such file or directory at copy.pl line 49.
Any idea?
chomp changes its argument in place and returns the number of removed characters. So the correct usage is
chomp(my $oldfile = $name);
Also, you probably wanted
while (<OLD>) {
instead of
while () {
which just loops infinitely.
Moreover, you correctly prepend $dir/ to the filename in the stat call, but you should do so everywhere.
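Putting those fixes together, the copy step could look like this. This is only a sketch that plugs into the script from the question ($dir, $dir1 and $name are the question's own variables), and since File::Copy is already loaded there, copy() is the simplest route:

chomp(my $oldfile = $name);      # chomp a copy; $oldfile holds the name, not the count
my $old = "$dir/$oldfile";       # prepend the directory, just as in the stat call
my $new = "$dir1/$oldfile";      # destination under c:\temp, as declared in the question

# File::Copy is already use'd in the original script
copy($old, $new) or die "can't copy $old to $new: $!";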
I am writing a Perl script to get the name of a file present in a directory on a remote server. I couldn't find any WMIC command to traverse through a directory. Is there any other command to access a remote server, traverse a specific path to find a file, and retrieve the file name?
use strict;
use warnings;
use File::Find::Rule;
use File::Basename qw(basename);

my $path = "\\\\vmw2160\\dir1";

my @full_pathes = File::Find::Rule->file->name('data.html')->in($path);
print ".";

my @files = map { lc basename $_ } @full_pathes;
print foreach (@files);

my %file = map { $_ => 1 } @files;
print foreach (%file);
You would need to use the File::Find::Rule module from CPAN, with Number::Compare as a dependency. See the comments next to some parts of the script.
use strict;
use warnings;
use File::Find::Rule;
use File::Basename qw(basename);

my $path   = "\\\\devicename\\sharename"; # Enter your path here, i.e. a network drive
my $report = 'notfound.txt'; # This is just a log to tell you which of the files you searched for do not exist on the drive

print 'Enter file that contains list of files to search: ';
my $expected = <STDIN>;
chomp $expected;

open(my $fh,  '<', $expected) or die "Could not open '$expected' $!\n";
open(my $out, '>', $report)   or die "Could not open '$report' $!\n";

my @full_pathes = File::Find::Rule->file->name('*')->in($path);
my @files = map { lc basename $_ } @full_pathes;
my %file  = map { $_ => 1 } @files;

while (my $name = <$fh>) {
    chomp $name;
    if ($file{lc $name}) {
        print "$name found\n";
    } else {
        print $out "$name\n";
    }
}

close $out;
close $fh;
Then create a file with a list of files you want to search for. Let's call it myfiles.txt and enter the files in list form:
filename1.txt
filename2.pdf
filename3.bat
Then run the script and, when prompted, enter the filename myfiles.txt and press Enter.
EDIT: modified the code to take UNC paths.
I'm trying to write a script which opens a directory and reads a bunch of log files line by line, searching for information such as, for example,
"Attendance = 0". Previously I used grep "Attendance =" * to find this information, but now I'm trying to write a script to do the search.
I need your help to finish this task.
#!/usr/bin/perl
use strict;
use warnings;
my $dir = '/path/';
opendir (DIR, $dir) or die $!;
while (my $file = readdir(DIR))
{
    print "$file\n";
}
closedir(DIR);
exit 0;
What's your perl experience?
I'm assuming each file is a text file. I'll give you a hint. Try to figure out where to put this code.
# Now to open and read a text file.
my $fn='file.log';
# $! is a variable which holds a possible error msg.
open(my $INFILE, '<', $fn) or die "ERROR: could not open $fn. $!";
my @filearr=<$INFILE>; # Read the whole file into an array.
close($INFILE);
# Now look in @filearr, which has one entry per line of the original file.
exit; # Normal exit
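If you get stuck, here is a minimal sketch of one way to combine the two pieces, assuming every plain file in the directory is a log to be scanned (directory path and search string taken from the question):

#!/usr/bin/perl
use strict;
use warnings;

my $dir = '/path/';
opendir(DIR, $dir) or die $!;
while (my $file = readdir(DIR))
{
    my $fn = "$dir/$file";
    next unless -f $fn;                      # skip . and .. and subdirectories
    open(my $INFILE, '<', $fn) or die "ERROR: could not open $fn. $!";
    my @filearr = <$INFILE>;                 # read the whole file into an array
    close($INFILE);
    # now look in @filearr, e.g. for the attendance lines
    print grep { /Attendance =/ } @filearr;
}
closedir(DIR);
exit 0;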
I prefer to use File::Find::Rule for things like this. It preserves path information, and it's easy to use. Here's an example that does what you want.
use strict;
use warnings;
use File::Find::Rule;
my $dir = '/path/';
my $type = '*';
my @files = File::Find::Rule->file()
                            ->name($type)
                            ->in($dir);

for my $file (@files) {
    print "$file\n\n";
    open my $fh, '<', $file or die "can't open $file: $!";
    while (my $line = <$fh>) {
        if ($line =~ /Attendance =/) {
            print $line;
        }
    }
}
At the moment this code replaces all occurrences of my matching string with my replacement string, but only for the file I specify on the command line. Is there a way to change this so that all .txt files (for example) in the directory I specify are processed, without having to run this hundreds of times on individual files?
#!/usr/bin/perl
use warnings;

my $filename = $ARGV[0];

open(INFILE, "<", $filename) or die "Cannot open $ARGV[0]";
my(@fcont) = <INFILE>;
close INFILE;

open(FOUT,">$filename") || die("Cannot Open File");
foreach $line (@fcont) {
    $line =~ s/\<br\/\>\n([[:space:]][[:space:]][[:space:]][[:space:]][A-Z])/\n$1/gm;
    print FOUT $line;
}
close INFILE;
I have also tried this:
perl -p0007i -e 's/\<br\/\>\n([[:space:]][[:space:]][[:space:]][[:space:]][A-Z])/\n$1/m' *.txt
But I have noticed that it only changes the first occurrence of the matched pattern and ignores all the rest in the file.
I also have tried this, but it doesn't work in the sense that it just creates a blank file:
use v5.14;
use strict;
use warnings;
use DBI;

my $source_dir = "C:/Testing2";

# Store the handle in a variable.
opendir my $dirh, $source_dir or die "Unable to open directory: $!";
my @files = grep /\.txt$/i, readdir $dirh;
closedir $dirh;

# Stop script if there aren't any files in the list
die "No files found in $source_dir" unless @files;

foreach my $file (@files) {
    say "Processing $source_dir/$file";
    open my $in, '<', "$source_dir/$file" or die "Unable to open $source_dir/$file: $!\n";
    open(FOUT,">$source_dir/$file") || die("Cannot Open File");
    foreach my $line (@files) {
        $line =~ s/\<br\/\>\n([[:space:]][[:space:]][[:space:]][[:space:]][A-Z])/\n$1/gm;
        print FOUT $line;
    }
    close $in;
}
say "Status: Processing of complete";
Just wondering what I am missing in my code above. Thanks.
You could try the following:
opendir(DIR,"your_directory");
my @all_files = readdir(DIR);
closedir(DIR);
for (@all_files) .....
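A fuller sketch of that idea, reusing the read/substitute/write loop from the original script unchanged; the directory name here is just a placeholder:

#!/usr/bin/perl
use strict;
use warnings;

my $dir = 'your_directory';                  # placeholder: the directory to process
opendir(DIR, $dir) or die "Cannot open $dir: $!";
my @all_files = grep { /\.txt$/i } readdir(DIR);
closedir(DIR);

for my $filename (@all_files) {
    my $path = "$dir/$filename";
    open(INFILE, "<", $path) or die "Cannot open $path";
    my @fcont = <INFILE>;
    close INFILE;

    open(FOUT, ">$path") or die "Cannot open $path for writing";
    foreach my $line (@fcont) {
        $line =~ s/\<br\/\>\n([[:space:]][[:space:]][[:space:]][[:space:]][A-Z])/\n$1/gm;
        print FOUT $line;
    }
    close FOUT;
}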
I want a Perl script to search in the mentioned directory, find those files
which contain the string ADMITTING DX, and move those files to a new folder.
I am new to Perl and was trying this:
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $dir = '/usr/share/uci_cmc/uci_new_files/';
my $string = 'ADMITTING DX';

open my $results, '>', '/home/debarshi/Desktop/results.txt'
    or die "Unable to open results file: $!";

find(\&printFile, $dir);

sub printFile {
    return unless -f and /\.txt$/;
    open my $fh, '<',, $_ or do {
        warn qq(Unable to open "$File::Find::name" for reading: $!);
        return;
    };
    while ($fh) {
        if (/\Q$string/) {
            print $results "$File::Find::name\n";
            return;
        }
    }
}
You are reading the lines from the file as:
while ($fh)
which should be
while (<$fh>)
You can really do it with Perl, and that's a great way. But there's no complex text processing in your case, so I'd just suggest a bash one-liner:
for f in *.txt; do grep 'ADMITTING DX' $f >/dev/null && mv $f /path/to/destination/; done
And if you still need a Perl solution:
perl -e 'for my $f (glob "*.txt") { open F, $f or die $!; while (<F>) { if (/ADMITTING DX/) { rename $f, "/path/to/destination/$f" or die $!; last } } close F; }'
There are two errors in your code. Firstly you have a superfluous comma in the open call in printFile. It should read
open my $fh, '<', $_ or do { ... };
and secondly you need a call to readline to fetch data from the opened file. You can do this with <$fh>, so the while loop should read
while (<$fh>) { ... }
Apart from that, your code is fine.
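For reference, here is the subroutine from the question with both of those fixes applied (nothing else changed):

sub printFile {
    return unless -f and /\.txt$/;
    open my $fh, '<', $_ or do {
        warn qq(Unable to open "$File::Find::name" for reading: $!);
        return;
    };
    while (<$fh>) {
        if (/\Q$string/) {
            print $results "$File::Find::name\n";
            return;
        }
    }
}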