Find files modified in the last 30 minutes in Solaris

I want to find files modified in the last 30 minutes on Solaris. I have written the command below to find files modified a day ago:
find . -mtime 1 -exec ls -l {} \;
Please help me find files modified in the last 30 minutes. My server is Solaris.

I found a similar question and answer on unix.stackexchange.com:
https://unix.stackexchange.com/questions/72861/delete-n-minutes-old-file-in-solaris
Since Solaris' find command has no -mmin option, that answer suggests using an absolute time instead.
There are also solutions that rely on a scripting language such as Perl:
http://www.unix.com/shell-programming-and-scripting/69234-how-delete-files-30-min-older.html
Hope it helps.

You can use gfind on later versions of Solaris. It is GNU find, which does have the -mmin option.

Related

How to rename multiple files in a folder with a specific format? Perl syntax explanation [duplicate]

This question already has answers here:
How to rename multiple files in a folder with a specific format?
(2 answers)
Closed 2 years ago.
I asked a similar question previously, but I need help understanding the Perl commands that achieve the renaming. I have many files in a folder with the format '{galaxyID}-psf-calexp-pdr2_wide-HSC-I-{#}-{#}-{#}.fits'. Here are some examples:
7-psf-calexp-pdr2_wide-HSC-I-9608-7,2-205.41092-0.41487.fits
50-psf-calexp-pdr2_wide-HSC-I-9332-6,8-156.64674--0.03277.fits
124-psf-calexp-pdr2_wide-HSC-I-9323-4,3-143.73514--0.84442.fits
I want to rename all .fits files in the directory to match the following format:
7-HSC-I-psf.fits
50-HSC-I-psf.fits
124-HSC-I-psf.fits
namely, I want to remove "psf-calexp-pdr2_wide" and all of the numbers after "HSC-I", and add "-psf" to the end of each filename. I have tried the following command:
rename -n -e 's/-/-\d+-calexp-/-\d+pdr2_wide; /-/-//' *.fits
which gave me the error message: Argument list too long. You can probably tell I don't understand the Perl syntax. Thanks in advance!
First of all, Argument list too long doesn't come from perl; it comes from the shell, because you have so many files that *.fits expanded to an argument list longer than the system allows.
To fix this, use
# batching via find's POSIX-standard "-exec ... {} +"
find . -maxdepth 1 -name '*.fits' -exec rename ... {} +
# or the GNU/BSD -print0 / xargs -0 pair
find . -maxdepth 1 -name '*.fits' -print0 | xargs -0 rename ...
But your Perl code is also incorrect. All you need is
s/^(\d+).*/$1-HSC-I-psf.fits/
which can also be written as
s/^\d+\K.*/-HSC-I-psf.fits/

Stat and file not modified in last minutes

I need, using the stat Unix command (or something similar, like find), ideally in one command line, all the files in a folder that have NOT changed in the last 5 minutes, for example.
I have found a lot of examples of the opposite: searching for files modified in the last 3 minutes, or similar.
What I need is to find files that have NOT changed (using the modification time or the size in bytes) in the last x minutes.
Is it possible to do that?
Stefano
find supports the -not operator, which negates any test.
So take the most appropriate find command you've found and put -not in front of the time test.
Try this:
find . -maxdepth 1 -not -mmin -5
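A quick way to sanity-check the inversion (GNU find and GNU touch assumed; the file names are illustrative):

```shell
mkdir -p demo_not && cd demo_not
touch -d '10 minutes ago' stale.txt   # GNU touch extension for relative timestamps
touch fresh.txt
find . -maxdepth 1 -type f -not -mmin -5
```

-mmin -5 matches files modified less than 5 minutes ago, so with -not in front only stale.txt is printed.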

Find files modified within one hour in HP-UX

I'm searching through the manual page for find, and I can't see a way to run a command that will find all files modified within an hour. I can only see a way to do it for days.
Guess this should do:
find / -type f -mmin -60
This lists files, starting from the root, modified in the past 60 minutes. Note, though, that -mmin is a GNU find option; HP-UX's stock find does not have it.
The best you can do in HP-UX with the stock find command is to look for everything that was modified in the last 24 hours, since HP-UX find only checks modification time in 24-hour increments. This is done with:
find / -type f -mtime 1
This lists all of the files, recursively from the root directory, that were modified in the last 24 hours. Here's the entry from the man page for the -mtime option:
-mtime n
True if the file modification time subtracted from the initialization time is n-1 to n multiples of 24 h. The initialization time shall be a time between the invocation of the find utility and the first access by that invocation of the find utility to any file specified in its path operands.
If you have permission to create a file, use this:
touch -t YYYYMMDDHHMM temp
Then use the -newer option:
find . -newer temp
This lists the files newer than the temp file, which you can create with a timestamp of one hour ago.
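The touch/-newer recipe end to end, as a sketch (the timestamp and file names are illustrative; in practice stamp the reference file one hour before now):

```shell
mkdir -p demo_newer && cd demo_newer
touch -t 202001010000 temp    # reference file stamped in the past (YYYYMMDDHHMM format)
touch current.log             # a file modified "now"
find . -type f -newer temp    # lists ./current.log but not ./temp
```

Both touch -t and find's -newer are POSIX, so this works even where -mmin is unavailable.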

How can I remove a file based on its creation date time in Perl?

My web app is hosted on a Unix server and uses MySQL as its database.
I wrote a Perl script to back up the database. The script is inside the cgi-bin folder and it works; I only need to set up a cron job to run it once a day.
The backups are stored in a folder named db_backups. However, I also want to add a command to the Perl script that removes any files inside db_backups that are older than, say, 10 days.
I have searched high and low for Unix commands and cannot find anything that matches what I need.
if (-M $file > 10) { unlink $file }
or, coupled with File::Find::Rule
use File::Find::Rule;

my $ten_days_ago = time() - 10 * 86400;
my @to_delete = File::Find::Rule->file()
                                ->mtime("<=$ten_days_ago")
                                ->in("/path/to/db_backup");
unlink @to_delete;
On Unix you can't, because the file's creation date is not stored in the filesystem.
You may want to check out stat, and -M (modification time)/-C (inode change time)/-A (access time) if you want a simple expression with relative timestamps (how long ago).
i have searched high and low for unix commands
and cannot find anything that matches what i needed.
Check out find(1) and xargs(1). Warning: these commands may change your life at the shell prompt.
$ find /path/to/backup -type f -mtime +10 -print0 | xargs -0 echo rm -f
When you're confident that will Do What You Want (tm), remove the echo. It says, roughly, starting in /path/to/backup, descend looking for plain files whose mtime is greater than 10 days, and print their names to xargs, which will pass those names to rm in batches.
(print0 and its complement -0 are GNU extensions -- you mentioned you were on Linux -- which let you deal with whitespace in filenames safely.)
You should be able to do it without resorting to Unix commands. Loop through the files in your directory, use stat on each file to get its last modify time for a file, then use unlink on the file to delete it if it's older than what you want.
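That pure-Perl idea fits in a one-liner (the relative db_backups path is illustrative; -M gives a file's age in days since last modification):

```shell
# delete regular files in db_backups that are more than 10 days old, using only perl
perl -e 'unlink grep { -f $_ && -M $_ > 10 } glob "db_backups/*"'
```

Inside a larger script you would write the same grep/unlink with an explicit directory path instead of the glob shown here.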

How can I login to an FTP site and remove files that are more than 7 days old?

I need a shell or Perl script that connects to the FTP server and deletes all the files that are more than 7 days old.
cheers
Use the Net::FTP module to connect to the FTP server, as outlined in its CPAN documentation. To browse through the site listings you may have to combine cwd/cdup in order to handle directories (unless, of course, all the files are in the root directory).
To get a file's modification time, use the mdtm(FILE) method; just make sure to check whether it is supported on the current server by calling
if ( $ftp->feature('MDTM') ) {
    # ...check modification time for the file...
}
If not, you might try calling the dir method, which gets you the listing in long format, and then extract the date information from the individual file listings in order to compare and delete.
To compare the two dates, use the Date::Calc module. Its Delta_Days function gives you the number of days between two dates; it can be used just as easily with either of the methods described above.
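A hedged sketch pulling these pieces together (the host, credentials, and flat-directory layout are assumptions; it compares mdtm's epoch seconds directly, which sidesteps Date::Calc):

```perl
use strict;
use warnings;
use Net::FTP;

# placeholders: substitute the real host and credentials
my $ftp = Net::FTP->new('ftp.example.com') or die "connect failed: $@";
$ftp->login('user', 'password') or die 'login failed: ', $ftp->message;

my $cutoff = time() - 7 * 86400;      # seven days ago, in epoch seconds
for my $file ($ftp->ls) {
    my $mtime = $ftp->mdtm($file);    # undef for directories or if MDTM is unsupported
    next unless defined $mtime;
    $ftp->delete($file) if $mtime < $cutoff;
}
$ftp->quit;
```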
In Perl, you'd want to use Net::FTP's ls, mdtm, and delete methods.
If it's a shell script you're after, you might be better off running it from a crontab:
find /tmp -type f -mtime +7 -exec rm {} \;
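The corresponding crontab entry might look like this (the schedule is illustrative; this one runs daily at 02:30):

```
30 2 * * * find /tmp -type f -mtime +7 -exec rm {} \;
```

Note that this only cleans a local directory; to delete files on the remote FTP server itself you still need an FTP-aware client such as Net::FTP.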