I'm creating a script where I need to get the last modified date of some files.
I checked this thread: How do I get a file's last modified time in Perl?
So I used the script below to get the last modified time. At first it worked, but when I ran it again, the timestamp came back as 00:00 January 1, 1970.
Why is that happening, and how can I get the correct last modified date and time?
my $dir = '/tmp';
opendir(DIR, $dir) or die $!;
@content = readdir(DIR);
foreach (@content)
{
next unless ($_ =~ m/\bfile.txt|file2.csv\b/);
my $epoch_timestamp = (stat($_))[9];
my $timestamp = localtime($epoch_timestamp);
$f_detail = $_ .' '.$timestamp;
print "$f_detail\n";
}
closedir(DIR);
exit 0;
When I run the script, I get this result:
file.txt Thu Jan 1 00:00:00 1970
file2.csv Thu Jan 1 00:00:00 1970
OK, last update: it is working now. I ran all of the scripts you gave me as standalone scripts and found what was causing the default time; see the code below. I removed it from my program and now it works. I didn't notice this at first, sorry. It still feels weird, because I was sure it was working the first time I ran it, but thank you, everyone!
if (($month = ((localtime)[4] + 1)) < 10)
{
$month = '0' . $month;
}
if (($day = ((localtime)[3])) < 10)
{
$day = '0' . $day;
}
if (($year = ((localtime)[5]+1900)) >= 2000)
{
if (($year = $year - 2000) < 10)
{
$year = '0' . $year;
}
}
else
{
$year = $year - 1900;
}
$date = $month . $day . $year;
readdir returns file names without the full path. You need to prepend the path manually:
for (@content) {
next unless /^(?:file\.txt|file2\.csv)\z/;
my $epoch_timestamp = (stat("$dir/$_"))[9];
# ~~~~~~~~~
Also note how I changed the regex to match the file names.
If you have a directory name, and you want to see if some files whose names you already know exist in that directory, there's really no need for opendir/readdir - that's more helpful if you don't know the filenames ahead of time. When you do, you can just build a path using both parts and use file test operators/stat/etc. on it.
#!/usr/bin/env perl
use strict;
use warnings;
use feature qw/say/;
my $dir = '/tmp';
my @files = qw/file.txt file2.csv/;
for my $file (@files) {
# Better to use File::Spec->catfile($dir, $file), but your question
# title said no modules...
my $name = "$dir/$file";
if (-e $name) { # Does the file exist?
# _ to re-use the results of the above file test operator's stat call
my $epoch_timestamp = (stat _)[9];
my $timestamp = localtime $epoch_timestamp;
say "$file $timestamp";
}
}
Example execution:
$ perl demo.pl
file.txt Tue Feb 8 07:26:07 2022
file2.csv Tue Feb 8 07:26:10 2022
The following demo code uses glob to obtain the modification time of the specified files in a directory.
use strict;
use warnings;
use feature 'say';
my $dir = '/tmp';
my @files = qw(file.txt file2.csv);
my $mask = join ' ', map { "$dir/$_" } @files;
say "$_\t" . localtime((stat($_))[9]) for glob($mask);
Related
I have a script which should create a report on a daily basis from some alarm data. These report files (.csv) are stored at a particular path.
Before generating the report, I need to check for the existence of any old files which are older than x days (passed as an argument to the script, based on requirement).
Below is the script to delete the old logs/reports.
...
use File::Find;
my $days = $ARGV[0]; #pass as a parameter, i.e., number of days
my $PATH = "/path/to/the/logfiles/";
print "backup path -> $PATH\n";
my @file_list;
my @find_dirs = ($PATH); # directories to search
my $now = time(); # get current time
my $seconds_per_day = 60*60*24; # seconds in a day
my $AGE = $days*$seconds_per_day; # age in seconds
find ( sub {
my $file = $File::Find::name;
if ( -f $file ) {
push (@file_list, $file)
}
}, @find_dirs);
for my $file (@file_list) {
my @stats = stat($file);
if ($now-$stats[9] > $AGE) {
unlink $file;
}
}
print "Deleted files older than $days days.\n";
#
# GENERATE REPORT CODE
#
...
...
This script works fine for me. I want to know whether there is a standard Perl module that does the same operation and can act as log rotation.
I'm trying to check if there is a file (I don't care about folders) that is older than X minutes. Unfortunately, I can't tell where the bug in this code is.
I will appreciate any help :)
1. Find the files older than X number of minutes
#!/usr/bin/perl
my $maindir = "C:\\Users\\Dor\\Desktop\\aba";
my $minutesold = 60;
my $now = time;
my $filedisc;
# Declare arrays
my @xmlfiles;
my @qulfiedfiles;
# Declare a Dictionary
my %filedisc;
opendir(my $dh, $maindir) or die "opendir($maindir): $!";
# Read all the files
while (my $de = readdir($dh))
{
# get the Full path of the file
my $f = $maindir . $de;
if ( -f $f )
{
push (@xmlfiles, $f);
}
}
closedir($dh);
# For every file in directory
for my $file (@xmlfiles) {
# Get stats about a file
my @stats = stat($file);
# If time stamp is older than minutes provided
if ($stats[9] >= ($now - (( $minutesold * 60) ))){
# Put the File and Time stamp in the dictionary
print($stats[9] ." .| " .$file ."\n\n");
}
#print($now ."\n")
#print($now - ( $minutesold * 60) ."\n");
}
It's usually best to use glob rather than opendir/readdir, so as to avoid having to “rebuild” the full path to the file for every result
You will probably want to enable the :bsd_glob option on Windows so that paths with spaces, such as C:\Program Files, are handled correctly
use strict;
use warnings 'all';
use File::Glob ':bsd_glob'; # Provide for spaces in path
my $root = 'C:\Users\Dor\Desktop\aba';
my $minutesold = 60;
my @old_files = grep { -f and -M($_) * 24 * 60 > $minutesold } glob "$root\\*.*";
The path and file isn't correct.
my $f = $maindir . $de;
Should be (add slash between path and file)
my $f = "$maindir/$de";
Solving this in kind of a functional programming style is the way to go here I think:
my $dir = shift() || $ENV{HOME}; #command line arg or else home dir
my $minutesold = 60; #1h
opendir my $dh, $dir or die "ERR: opendir($dir) $!\n";
print
map "$$_{timestamp} .| $$_{file}\n",
#sort { $$a{timestamp} <=> $$b{timestamp} } # sort by age
#sort { $$a{file} cmp $$b{file} } # sort by name
grep $^T-$$_{timestamp} >= 60*$minutesold, # $^T is program startup time()
map {{timestamp=>(stat($_))[9], file=>$_}}
grep -f $_,
map "$dir/$_",
readdir $dh;
closedir $dh;
You have missed one simple way to get the modification time of a file in Perl: the -M file test operator.
my $modifiedTimeinDays = -M "$file";
my $modifiedTimeinSec = $modifiedTimeinDays*60*60*24;
if($modifiedTimeinSec > 60)
{
# file older than 60 sec
}
As simple as that.
See perldoc -f -X to learn about all of the file tests.
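One caveat worth adding: -M measures the file's age relative to $^T, the time the script started, not the current time (see perldoc perlvar). The sketch below (the path is hypothetical) shows how -M relates to a manual stat-based calculation:

```perl
use strict;
use warnings;

my $file = '/tmp/file.txt';    # hypothetical path
if (-e $file) {
    my $age_days = -M $file;                # days since modification, relative to $^T
    my $age_sec  = $^T - (stat $file)[9];   # the same age, computed from stat's mtime field
    printf "%.2f days old (%d seconds)\n", $age_days, $age_sec;
}
```

For a short-lived script the difference is negligible, but in a long-running process $^T grows stale, so comparing against time() directly may be safer.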
I am having an issue with getting File::stat to output the last modified date of the file. This is my code so far:
#!/usr/bin/perl
use Time::localtime;
use File::stat;
use warnings;
use File::Find;
my $dirloc = 'E:\tmp\testdir';
sub find_txt {
my $F = $File::Find::name;
if ( ! -d $F && $F =~ /.tar|.exe|.zip/ ) {
my @result = $F;
foreach my $result (@result){
my $timestamp;
$timestamp = (stat("$result"))->[9] or die "No $_: $!";
print "$result : $timestamp\n";
}
}
}
find({wanted => \&find_txt}, $dirloc);
It is outputting something like this:
C:/tmp/testdir/foo/bar/test.tar : 1415305933
I need it to output this instead (the date format doesn't have to be what is listed, I just want to see the date):
C:/tmp/testdir/foo/bar/test.tar : 11/07/2014
I know that the output it is giving me is the time since the epoch, but I thought stat was supposed to give the date. Am I doing something wrong? Thanks!
edit: I have tried localtime, and I get Time::tm=ARRAY(0x245b220). Not sure what is happening there.
You can use the localtime (note: not Time::localtime) function to convert the timestamp into something useful:
my $date = localtime $timestamp
which will make it a human readable string like Fri Nov 7 15:33:00 2014
Or you can use it in list context to split it into individual fields:
my($sec, $min, $hour, $day, $month, $year, $weekday, $yearOfDay, $isDST) = localtime $timestamp
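If you want the mm/dd/yyyy layout shown in the question, one option (a sketch using strftime from the core POSIX module; the literal epoch value here is just the one from your output) is:

```perl
use strict;
use warnings;
use POSIX qw(strftime);

# Epoch value from stat, as in the question's output
my $timestamp = 1415305933;

# Format as mm/dd/yyyy; the exact day depends on your time zone
my $date = strftime "%m/%d/%Y", localtime $timestamp;
print "$date\n";
```

This avoids hand-padding the month and day with leading zeros, which strftime does for you.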
I'm trying to build a Perl script for housekeeping a Windows server, but it's getting really difficult, so I'd appreciate any help.
This program should find PDF files, zip them, and then delete the files (outside the zip file) which are bigger than 1 MB.
I think the problem is in the if-filesize condition. When I put two PDF files into a directory (one smaller than 1 MB and one larger than 1 MB), no matter how I change the code, the result is one of the following:
zip all files, delete the PDF files (not those in the zip file).
zip all files, delete the bigger file (not those in the zip file).
zip the bigger file, delete the PDF files (not those in the zip file).
This is my code:
#!/usr/bin/perl
#1 megabyte = 1000000 bytes
use File::Find;
use lib qw(/st/APPL/PORTABLE/Perl/5.8.8);
use MIME::Lite;
use Strict;
use warnings;
use Win32::DriveInfo;
use Archive::Zip;
use Switch;
use IO::Compress::Zip qw(zip $ZipError);
use File::stat;
#my $backup_root = "/path/to/folder"
my $backup_root = "D:/st/APPL/PORTABLE/Perl/bin/teste";
# purge backups older than AGE in days
my @file_list;
my @find_dirs = ($backup_root); # directories to search
my $now = time(); # get current time
my $days = 31; # how many days old
my $seconds_per_day = 60 * 60 * 24; # seconds in a day
my $AGE = $days * $seconds_per_day; # age in seconds
find(
sub {
my $file = $File::Find::name;
my $filesize = stat($file)->size;
if ( -f $file ) {
push( @file_list, $file );
print "Ficheiro $file encontrado!\n";
print "Size: $filesize\n";
}
if ( $filesize >= 1105593 ) {
#my @files = <*20131221*.pdf>;
my @files = <*.pdf>;
zip \@files => 'output.zip'
or die "zip failed: $ZipError\n";
for my $file (@file_list) {
my @stats = stat($file);
if ( $now - $stats[9] > $AGE ) {
unlink $file;
}
}
print "Deleted files older than $days days.\n";
} elsif ( $filesize <= 1105593 ) {
print "O ficheiro e mais pequeno que 1 mb !";
}
},
@find_dirs
);
It's not clear how output.zip is connected with the files in $backup_root, but I think the conceptual mistake is recreating the zip file inside the find callback (i.e., inside the loop).
Try following code:
#!/usr/bin/perl
# purge backups older than AGE in days
use strict;
use warnings;
use File::Find;
use lib qw(/st/APPL/PORTABLE/Perl/5.8.8);
use IO::Compress::Zip qw(zip $ZipError);
use File::stat;
my $backup_root = "D:/st/APPL/PORTABLE/Perl/bin/teste";
my @file_list;
my @find_dirs = ($backup_root); # directories to search
my $now = time(); # get current time
my $days = 31; # how many days old
my $seconds_per_day = 60 * 60 * 24; # seconds in a day
my $AGE = $days * $seconds_per_day; # age in seconds
# get list of all files and directories in #find_dirs
find( sub { push @file_list, $File::Find::name; }, @find_dirs );
# zip all pdfs found in $backup_root directory
my @found_pdfs = grep {/\.pdf$/} grep {-f} @file_list;
zip \@found_pdfs => 'output.zip' or die "zip failed: $ZipError\n";
# unlink old big files in $backup_root directory
unlink $_ for grep {$now - stat($_)->mtime > $AGE } grep {-s >= 1105593} grep {-f} @file_list;
I am running a Perl script. In it, I check the current date against a folder name (which is also in a date format, for example 11-12-07). The script runs automatically when the folder name matches the current date. The folder is a tar archive loaded from another server.
So, basically, I need to run the script when the folder name matches the current date.
Problem: sometimes I get the folder the next day, and my Perl script checks only for the current date. The folder I get is named with the previous date (not the current date), so I have to process it manually. I need to automate this in my Perl script.
Please suggest some ideas to make this happen.
Thanks!!
Code for Reference:
my $tfilename = 'tarmd5.tar';
my $td = `date '+%y-%m-%d'`; # date in yy-mm-dd format
chomp ($td);
my $td2 = `date '+%Y%m%d'`; # date in yyyymmdd format
chomp ($td2);
#
# get directory from command line
$dir = shift;
leave("'$dir' is not a valid directory") unless (-d $dir);
if ($dir eq '.') {$dir = cwd();}
elsif ($dir !~ /^\//) {$dir = cwd()."/$dir";}
# print out the time
print scalar(localtime()),"\n";
######## This section unpacks transferred data ########
# go to directory for today and find *tar.gz files to copy
my $dday = "$dir/$td";
next unless (-d "$dday");
@gzfiles = glob("$dday/*tar.gz");
foreach $zf (@gzfiles) {
next if (($zf =~ /BMP/) || ($zf =~ /LG/) || ($zf =~ /MAP/) || ($zf =~ /STR/));
print "$zf\n";
($status,$message) = systemcall("/bin/cp $zf $fdir");
}
Maybe use DateTime to do the math. I redid the solution, as the first version was poorly written. I changed DateTime->today to DateTime->now, because one wants the hms portion when converting back to the desired time zone (from 'floating' or UTC).
I also used Perl functions instead of shelling out to the system (date functions, current working directory via cwd, and the copy function).
Update: elsif ($dir != /^\//) is incorrect. Changed to elsif ($dir !~ /^\//).
#!/usr/bin/perl
use strict;
use warnings;
use Cwd;
use DateTime;
use File::Copy;
# set to your desired time zone
my $today = DateTime->now( time_zone => "America/New_York" );
my $td = $today->strftime("%y-%m-%d");
# strongly recommended to do date math in the 'floating'/UTC zone
my $yesterday = $today->set_time_zone('floating')->subtract( days => 1);
my $yd = $yesterday->set_time_zone('America/New_York')->strftime("%y-%m-%d");
my $dir = shift or die "Provide path on command line. $!";
if ($dir eq '.') {
$dir = cwd;
}
elsif ($dir !~ /^\//) {
$dir = cwd() . "/$dir";
}
opendir my $dh, $dir or die $!;
my @dir = sort grep { -d "$dir/$_" and /$td/ || /$yd/ } readdir $dh;
closedir $dh or die $!;
@dir or die "Found no date directories. $!";
my $dday = "$dir/$dir[-1]"; # is today unless today not found, then yesterday
my $fdir = '/some/example/path/';
my @gzfiles = glob("$dday/*tar.gz");
foreach my $zf (@gzfiles) {
next if (($zf =~ /BMP/) || ($zf =~ /LG/) || ($zf =~ /MAP/) || ($zf =~ /STR/));
print "$zf\n";
copy($zf, $fdir) or die "Unable to copy. $!";
}
So you want to get all the directory names that match the current day or any previous days? I presume you move the directories somewhere else when they're done being processed.
A good place to start is the DateTime module. Getting the current date is easy enough:
my $now = DateTime->now();
Then you need to iterate through all the directories and pick out the dates you want. Use perldoc -f to look up opendir(), readdir(), and closedir() for getting the directories. To match them, parse out the day/month/year and create another DateTime object:
my $dir_date = DateTime->new(
day => $dir_day,
month => $dir_month,
year => $dir_year,
);
Once you have all that together, finding if the given directory is a hit is as easy as:
processDir( $dir_name )
if DateTime->compare( $now, $dir_date ) >= 0;
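Putting those pieces together, a minimal sketch (assuming, as in your example, folder names like 11-12-07 in yy-mm-dd format; processing is stubbed out with a print) might look like:

```perl
use strict;
use warnings;
use DateTime;

my $now = DateTime->now();

opendir my $dh, '.' or die $!;
for my $name (readdir $dh) {
    # Only directories named like yy-mm-dd, e.g. 11-12-07
    next unless -d $name && $name =~ /^(\d\d)-(\d\d)-(\d\d)\z/;
    my $dir_date = DateTime->new(
        year  => 2000 + $1,
        month => $2,
        day   => $3,
    );
    # Today's folder, or an older one that arrived late
    print "process $name\n" if DateTime->compare( $now, $dir_date ) >= 0;
}
closedir $dh;
```

Because the comparison accepts any date up to today, a folder that arrives a day late is still picked up on the next run.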
I wonder if it wouldn't be simpler with a bash script. If I understand what you are trying to do, it is
find recent .tar.gz files, with names not containing "BMP", "LG", etc.
copy these files to another dir ($fdir, which is undefined in your example)
Maybe you could just ignore the whole folder-name problem, and search for files not older than 24 hours?
dir=/your/bas/dir
fdir=/your/destination
find $dir -iname "*.tar.gz" -mtime -1 -not \( -name "*BMP*" -o -name "*LG*" -o -name "*MAP*" \) -exec cp "{}" "$fdir" \;