find -mindepth -prune conflict?

I have a dir structure like:
gameplay/ch1_of_182/ajax.googleapis.com/...
gameplay/ch1_of_182/platform.twitter.com/...
gameplay/ch1_of_182/privacy-policy.truste.com/...
gameplay/ch1_of_182/www.facebook.com/...
gameplay/ch1_of_182/www.gameplay.com/...
gameplay/ch1_of_182/www-mega.gameplay.com/...
gameplay/ch2_of_182/ajax.googleapis.com/...
gameplay/ch2_of_182/platform.twitter.com/...
gameplay/ch2_of_182/privacy-policy.truste.com/...
gameplay/ch2_of_182/www.facebook.com/...
gameplay/ch2_of_182/www.gameplay.com/...
gameplay/ch2_of_182/www-mega.gameplay.com/...
...
gameplay/ch182_of_182/ajax.googleapis.com/...
gameplay/ch182_of_182/platform.twitter.com/...
gameplay/ch182_of_182/privacy-policy.truste.com/...
gameplay/ch182_of_182/www.facebook.com/...
gameplay/ch182_of_182/www.gameplay.com/...
gameplay/ch182_of_182/www-mega.gameplay.com/...
created using wget. Now I want to delete all of the directories and all the files they contain except for the two directories with "gameplay.com" in their name.
I have been trying different variants of:
find . -mindepth 2 ! -path "*gameplay.com" -prune -exec rm -r {} \;
but without luck.
When run from the gameplay parent dir, the 4 of the 6 directories in each of the 182 chapter directories (I'm actually only using 1 for testing) that do not match the "gameplay.com" name pattern are deleted along with all their contents, as desired. However, although the 2 directories that do match the "gameplay.com" pattern are left in place, all the files they contain are deleted, which is not good.
I thought the -prune option was supposed to tell find to basically ignore the specified directory, but I must be specifying it wrong, because that is not happening, and I am left with two empty directories in each of the 182 parent dirs.
I suspect a conflict between the -mindepth and -prune options, but because I am running the command from the gameplay parent directory, dropping -mindepth deletes all 182 child directories, including the two directories in each that I want left intact.
I guess I could write a for loop to cycle through each dir, but it seems to me find should be able to accomplish this in one fell swoop, if someone doesn't mind shedding some light on how.

Logically, this is find . -mindepth 2 \( ! -path "*gameplay.com" \) -a \( -prune \) -a \( -exec rm -r {} \; \)
Since your -prune comes after the path check, it is never reached when the path check is false, so find still descends into the gameplay.com directories and deletes the files inside them. Try switching the order:
find . -mindepth 2 -prune ! -path "*gameplay.com" -exec rm -r {} \;
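If the wget layout is always exactly two levels deep, a roughly equivalent sketch is to cap the depth as well, so find never even looks below the host directories (the -maxdepth 2 and the + batching are my additions, not part of the answer above; swap -exec for -print first as a dry run):
find . -mindepth 2 -maxdepth 2 ! -path "*gameplay.com" -exec rm -r {} +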

Related

excluded directories in find command not properly piped to -exec cp

I am trying to copy a folder containing a subfolder structure, while excluding a specified subfolder, using find with -exec cp. I have managed to get multiple exclusion options working when using the find command alone, but once I add the '-exec cp' part, the exclusions no longer work.
Imagine the directory of interest containing multiple files and subfolders, with one subfolder named "exclusion_string"
This find command works properly when used alone:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*"
... while this command negates the exclusion criterion:
find ~/directory/of/interest/ -maxdepth 2 ! -name "*exclusion_string*" -exec cp -r '{}' . \;
Likewise, when using other criteria or arguments, the exclusion of a subdirectory is lost, e.g.:
find ~/directory/of/interest/ -maxdepth 2 -name "*" -size -100k -exec cp -r '{}' . \;
find ~/directory/of/interest/ -maxdepth 2 -name "*exclusion_string*" | xargs cp -rt .
What am I missing here?
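For comparison, a sketch of the -prune idiom used in the other answers on this page, assuming the excluded folder sits directly under the directory of interest (paths are illustrative): the -mindepth 1 stops find from matching the directory of interest itself, which otherwise gets handed to cp -r whole, excluded subfolder and all.
find ~/directory/of/interest/ -mindepth 1 -maxdepth 1 -name "*exclusion_string*" -prune -o -exec cp -r '{}' . \;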

delete directories with find and exclude other directories

I'm attempting to delete some directories and I want to be able to exclude a directory called 'logs' from being deleted.
This is my basic find operation (without the exclusion):
# find . -type d |tail -10
./d20160124-1120-df8mfb/deployments
./d20160124-1120-df8mfb/releases
./d20160131-16993-vazqg5
./d20160131-16993-vazqg5/metadata
./d20160131-16993-vazqg5/deployments
./d20160131-16993-vazqg5/releases
./logs
./d20160203-27735-1tqbjh6
./d20160125-1120-1yccr9p
./d20160131-16993-1yf9lnc
I'm just tailing the output so that you have an idea of what's going on without taking up the whole page. :)
If I try to exclude the logs directory with -prune I get back no results.
root@ops-manager:/tmp/tmp# find . -type d -prune -o -name 'logs' -print
root@ops-manager:/tmp#
What am I doing wrong?
Once I get this right, I'll tack on an -exec rm -rf {} \; command so I can delete those directories.
Any help here would be appreciated!
-prune always evaluates to true, so for every directory the left-hand side of the -o succeeds and the expression on the other side is never evaluated; and since . itself is a directory, it is pruned straight away, so nothing below it is even visited. You need to change the order:
find . -type d -name 'logs' -prune -o -print
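A sketch of the eventual delete step mentioned above, with two assumptions of mine: -mindepth 1 keeps . itself out of the match, and a second -prune on the delete branch stops find from trying to descend into directories it has just removed. Verify with -print in place of the -exec before running it for real:
find . -mindepth 1 -type d -name 'logs' -prune -o -type d -prune -exec rm -rf {} +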

How to copy a file to several directories with names matching *Co* (where * = wildcard)

How do I copy a file to several directories of the form *Co* or *52?
Apparently, just typing
cp fileA *Co*
won't work.
My other concern is that if a directory already contains fileA, I don't want it to be overwritten. That is, if a *Co* directory already contains fileA, do NOT copy. Is there a one-line solution for this? I think writing a script with if-else is overkill.
Thanks!
If your version of cp supports -n, you can do:
find . -name '*Co*' -exec cp -n fileA {} \;
If not:
find . -name '*Co*' -exec sh -c 'test -f "$0"/fileA || cp fileA "$0"' {} \;
Note that these will each descend recursively; if you don't want that, you can limit the scope of find. To match either *Co* or *52, you can do:
find . \( -name '*Co*' -o -name '*52' \) -exec ...
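Putting the pieces together, a sketch that stays in the current directory only and copies without overwriting (this assumes cp -n is available, as above; drop -maxdepth 1 if you do want the recursion):
find . -maxdepth 1 -type d \( -name '*Co*' -o -name '*52' \) -exec cp -n fileA {} \;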

unix find and multiple commands to find all files, exclude a dir, and base results on time

Given a folder, I want to ignore one folder within it, find everything else (files and directories alike), and eventually delete it via a cron-style action on OS X.
On Mac OS X I will use launchd to run this. So far, I have this:
find /Users/me/Downloads -not \( -path /Users/me/Downloads/In\ Progress -prune \) -name "*" -Btime 1m
With the -Btime 1m or -mtime 1m I get zero results; without it I get results I can -exec to rm:
find /Users/me/Downloads -not \( -path /Users/me/Downloads/In\ Progress -prune \) -name "*"
/Users/me/Downloads
/Users/me/Downloads/.DS_Store
/Users/me/Downloads/test
/Users/me/Downloads/text.txt
Eventually my criterion will be 1 week, but for now I use 1 minute, as that has surely passed.
cd ~/Downloads
$ find . -mtime 1m
$
Or
cd ~/Downloads
$ find . -mtime 1m -print
find: *: unknown primary or operator
I believe I figured it out, but if someone could look over my shoulder, I certainly would appreciate it:
/usr/bin/find /Users/$USER/Downloads -not \( -path /Users/$USER/Downloads/In\ Progress -prune -o -type d \) -mtime +1s
That finds all files in ~/Downloads that are older than 1 second and are not inside the "In Progress" directory. At first, it was also picking up the path /Users/$USER/Downloads itself as the first hit.
/usr/bin/cd ~/Downloads/
echo "You are in: " $(/bin/pwd)
/usr/bin/find /Users/$USER/Downloads -not \( -path /Users/$USER/Downloads/In\ Progress -prune -o -type d \) -mtime +1s
So, can anyone tell me why it is not picking up /Users/$USER/Downloads? I certainly don't mind, but I also don't like not understanding.
Finally, the last step would be to change -mtime to +1w and then append xargs rm -rf {} \; and I should be good to go? If I also wanted to pass a little list of what find has found and >> it to a log file, where would be a good place to shove that?
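Not a definitive answer, but a sketch of how that last step might look with BSD find on macOS. The -o -type d inside the -not is what drops the Downloads directory itself (along with every other directory); -mindepth 1 plus -not -type d is a simpler way to get the same effect. The -print plus the redirect gives the log asked about, and the log path is purely hypothetical; run it with -print alone first to check the list:
find "/Users/$USER/Downloads" -mindepth 1 \
    -not \( -path "/Users/$USER/Downloads/In Progress" -prune \) \
    -not -type d -mtime +1w \
    -print -exec rm -f {} + >> "$HOME/downloads_cleanup.log"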

Deleting empty (zero-byte) files

What's the easiest/best way to find and remove empty (zero-byte) files using only tools native to Mac OS X?
Easy enough:
find . -type f -size 0 -exec rm -f '{}' +
To ignore any file having xattr content (assuming the MacOS find implementation):
find . -type f -size 0 '!' -xattr -exec rm -f '{}' +
That said, note that many xattrs are not particularly useful (for example, com.apple.quarantine exists on all downloaded files).
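If you want to see which xattrs a particular file actually carries before deciding whether the -xattr filter is too aggressive, macOS also ships an xattr utility; -l lists the attribute names and values (the file name here is just an example):
xattr -l some_download.dmg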
You can reduce the potentially huge number of forks needed to run /bin/rm with:
find . -type f -size 0 -print0 | xargs -0 /bin/rm -f
The above command is very portable, running on most versions of Unix rather than just Linux boxes, and on versions of Unix going back for decades. For long file lists, several /bin/rm commands may be executed to keep the list from overrunning the command line length limit.
A similar effect can be achieved with less typing on more recent OSes by using a + in find to replace the most common use of xargs, in a style that still lends itself to other actions besides /bin/rm. In this case, find will handle splitting truly long file lists into separate /bin/rm commands. The {} is customarily quoted to keep the shell from doing anything to it; the quotes aren't always required, but the intricacies of shell quoting are too involved to cover here, so when in doubt, include the apostrophes:
find . -type f -size 0 -exec /bin/rm -f '{}' +
In Linux, briefer approaches are usually available using -delete. Note that recent find's -delete primary is directly implemented with unlink(2) and doesn't spawn a zillion /bin/rm commands, or even the few that xargs and + do. Mac OS find also has the -delete and -empty primaries.
find . -type f -empty -delete
To stomp empty (and newly emptied) files, and empty directories as well, many modern Linux hosts can use this efficient approach:
find . -empty -delete
find /path/to/stuff -empty
If that's the list of files you're looking for, then make the command:
find /path/to/stuff -empty -exec rm {} \;
Be careful! There won't be any way to undo this!
Use:
find . -type f -size 0 -exec rm {} ';'
with all the other possible variations to limit what gets deleted.
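For example, a sketch that restricts the sweep to zero-byte *.log files in the current directory only (the name pattern and the depth cap are illustrative, not part of the answer above):
find . -maxdepth 1 -type f -name '*.log' -size 0 -exec rm {} ';'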
A very simple solution in case you want to do it inside ONE particular folder:
Go inside the folder, then right-click -> View -> as List.
Now you'll see all the files in a list. Click on the "Size" column heading to sort the files by size.
Finally, all the zero-byte files will end up grouped at the end of the list. Just select those and delete them!