Find files modified within one hour in HP-UX - find

I'm searching through the manual page for find, and I can't see a way to run a command that will find all files modified within the last hour. I can only see a way to do it in days.

Guess this should do
find / -type f -mmin -60
This lists files, starting from the root directory, that were modified within the past 60 minutes.

The best you can do in HP-UX using the find command is to look for everything that was modified in the last 24 hours, because the HP-UX find command only checks modification time in 24-hour increments. This is done with:
find / -type f -mtime 1
This will recursively list all of the files, starting in the root directory, that were modified in the last 24 hours. Here's the entry from the man page for the -mtime option:
-mtime n
True if the file modification time subtracted from the initialization time is n-1 to n multiples of 24 h. The initialization time shall be a time between the invocation of the find utility and the first access by that invocation of the find utility to any file specified in its path operands.

If you have permission to create a file, use this:
touch -t YYYYMMDDHHMM temp
Then use the -newer option
find . -newer temp
This should list the files newer than the temp file, which you create with a timestamp from one hour ago.
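For example, a minimal sketch of that approach; it assumes Perl is available to compute the timestamp for one hour ago, and uses a hypothetical marker file /tmp/onehourago:
# build a YYYYMMDDHHMM timestamp for one hour ago
ts=$(perl -e 'my @t = localtime(time - 3600); printf "%04d%02d%02d%02d%02d", $t[5]+1900, $t[4]+1, $t[3], $t[2], $t[1];')
touch -t "$ts" /tmp/onehourago        # marker file stamped one hour in the past
find / -type f -newer /tmp/onehourago
rm /tmp/onehourago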

Related

Using tac on most recent log file out of several log files in a directory

I have several log files in a directory that we’ll call path/to/directory that are in the following format after long listing in Red Hat Enterprise 6:
-rw-r-----. 1 root root 17096 Sep 30 11:00 logfile_YYYYDDMM_HHMMSS.log
There are several of these log files generated every day. I need to automatically tac the most recently modified file without typing its exact name. For example, I'd like to do:
tac /path/to/directory/logfile*.log | grep -m 1 keyword
And have it automatically tac the most recently modified file and grep the keyword in the reverse direction from the end of the log file so it runs quicker. Is this possible?
The problem I’m running into is that there is always more than one log file in the /path/to/directory and I can’t get Linux to automatically tac the most recently modified file as of yet. Any help would be greatly appreciated.
I’ve tried:
tac /path/to/directory/logfile_$(date +%Y%m%d)*.log
which will tac a file created on the present date, but the part I'm having trouble with is using tac on the newest file (by YYYYMMDD and HHMMSS): multiple files can be generated on the same date, but only one of them can be the most current, and that most current log file is the only one I care about. I can't use a symbolic link either. Limitations, sigh.
The problem you seem to be expressing in your question isn't so much about tac, but rather how to select the most recent of a set of predictably named files in a directory.
If your filenames really are in the format logfile_YYYYDDMM_HHMMSS.log, then they will sort lexically without the need for an innate understanding of dates. Thus, if your shell is bash, you might:
shopt -s nullglob                       # unmatched globs expand to nothing instead of themselves
file=
for x in /path/to/directory/logfile_*.log; do
    [[ "$x" > "$file" ]] && file="$x"   # keep the lexically greatest (newest) name seen so far
done
The nullglob option tells bash to expand a glob that matches no files to nothing rather than to the literal pattern. Following the code above, you might want to test that $file is non-empty before feeding it to tac.
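For example, pulling it together with the grep from the question (a minimal sketch; keyword is whatever term you are searching for):
if [[ -n "$file" ]]; then
    tac "$file" | grep -m 1 keyword     # newest log, searched from the end backwards
fi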

find 30 minutes ago modified file in Solaris

I want to find files modified 30 minutes ago in Solaris. I have written the command below to find files modified 1 day ago.
find . -mtime 1 -exec ls -l {} \;
Please help me find files modified 30 minutes ago. My server is Solaris.
I found a similar question and answer on unix.stackexchange.com.
https://unix.stackexchange.com/questions/72861/delete-n-minutes-old-file-in-solaris
As Solaris' find command has no -mmin option, the answer there suggests using an absolute time instead.
There are also other solutions that use a scripting language such as Perl.
http://www.unix.com/shell-programming-and-scripting/69234-how-delete-files-30-min-older.html
Hope it helps.
You can use "gfind" on later versions of Solaris. It's GNU find and does have the "-mmin" option.
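For example (a minimal sketch, assuming gfind is installed and on your PATH):
gfind . -type f -mmin -30 -exec ls -l {} \;   # files modified within the last 30 minutes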

Stat and file not modified in last minutes

I need to use the stat Unix command (or something similar like find), possibly in a single command line, to get all files in a folder that are NOT changed in the last 5 minutes, for example.
I found a lot of examples of the opposite: searching for files in a directory modified in the last 3 minutes, or similar.
What I need is to find files that are NOT changed (by modification time or size in bytes) in the last x minutes.
Is it possible to do that?
Stefano
find supports a -not operator that can negate any test.
So use the most appropriate find command you've found and put -not in there.
Try this:
find . -maxdepth 1 -not -mmin -5
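Note that this also matches directories, including . itself. If you only want regular files, a small variation on the command above adds -type f:
find . -maxdepth 1 -type f -not -mmin -5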

How can I remove a file based on its creation date time in Perl?

My webapp is hosted on a unix server using MySQL as database.
I wrote a Perl script to run backups of my database. The Perl script is inside the cgi-bin folder and it is working. I only need to set up the cron job to run the Perl script once a day.
The backups are stored in a folder named db_backups. However, I also want to add a command inside my Perl script to remove any files in the db_backups folder that are older than, say, 10 days.
I have searched high and low for Unix commands and cannot find anything that matches what I need.
if (-M $file > 10) { unlink $file }
or, coupled with File::Find::Rule
use File::Find::Rule;

my $ten_days_ago = time() - 10 * 86400;
my @to_delete = File::Find::Rule->file()
    ->mtime("<=$ten_days_ago")       # mtime is compared as epoch seconds
    ->in("/path/to/db_backup");
unlink @to_delete;
On Unix you can't, because the file's creation date is not stored in the filesystem.
You may want to check out stat, and -M (modification time)/-C (inode change time)/-A (access time) if you want a simple expression with relative timestamps (how long ago).
I have searched high and low for Unix commands and cannot find anything that matches what I need.
Check out find(1) and xargs(1). Warning: these commands may change your life at the shell prompt.
$ find /path/to/backup -type f -mtime +10 -print0 | xargs -0 echo rm -f
When you're confident that will Do What You Want (tm), remove the echo. It says, roughly, starting in /path/to/backup, descend looking for plain files whose modification time is more than 10 days old, and print their names to xargs, which will pass those names to rm in batches.
(print0 and its complement -0 are GNU extensions -- you mentioned you were on Linux -- which let you deal with whitespace in filenames safely.)
You should be able to do it without resorting to Unix commands. Loop through the files in your directory, use stat on each file to get its last modification time, then use unlink on the file to delete it if it's older than what you want.

How can I login to an FTP site and remove files that are more than 7 days old?

I need a shell or Perl script which would connect to the FTP server and delete all the files which are more than 7 days old.
cheers
Use the Net::FTP module to connect to the FTP server as outlined in the CPAN documentation. To browse through the site listings you may have to combine cwd/cdup in order to handle directories (unless of course all the files are in the root directory).
To get a file's modification time, use the mdtm(FILE) method; just make sure to check whether this is supported by the current server by calling:
if ( $ftp->feature( 'MDTM' ) ) {
    my $mtime = $ftp->mdtm( $file );   # modification time as epoch seconds
    $ftp->delete( $file ) if $mtime && $mtime < time() - 7 * 86400;
}
If not, then you might try calling the 'dir' method which will get you the listings in the long format, and then extract the date information from the individual file listings in order to compare and delete.
To compare the two dates use the Date::Calc module. The 'Delta_Days' method should give you the number of days between two dates; this can be used just as easily for either of the methods specified above.
In Perl, you'd want to use Net::FTP's ls, mdtm, and delete commands.
If it's a shell script you're after, you might be better off running a script in a crontab.
find /tmp -type f -mtime +7 -exec rm {} \;
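For example, a minimal crontab entry (just a sketch; /tmp is the example path from the command above and the schedule is arbitrary):
# run every night at 02:30
30 2 * * * find /tmp -type f -mtime +7 -exec rm {} \;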